Advantages & Disadvantages of Normalizing a Database
Computer databases are everywhere, from those used by banks to track customer accounts to those used by websites to store content. Databases work best when they are designed well. Normalising a database means designing its structure so that data is stored in a logical, related way, typically with each fact held in one place.
Most databases are normalised to some degree, and normalising a database has both advantages and disadvantages.
Reduces Data Duplication
Databases can hold a significant amount of information, perhaps millions or billions of pieces of data. Normalising a database reduces its size and prevents data duplication by ensuring that each piece of data is stored only once.
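As a minimal sketch of this idea, the following uses Python's built-in sqlite3 module (the table and column names are illustrative, not from the article): a customer's details are stored once, and each order refers to the customer by id rather than repeating the name on every order row.

```python
import sqlite3

# Illustrative normalised schema: customer details live in one row of
# `customers`; orders point at that row via customer_id instead of
# repeating the name and email on every order.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        email TEXT NOT NULL
    );
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        item        TEXT NOT NULL
    );
""")
conn.execute("INSERT INTO customers VALUES (1, 'Ada Lovelace', 'ada@example.com')")
conn.executemany("INSERT INTO orders (customer_id, item) VALUES (?, ?)",
                 [(1, 'Keyboard'), (1, 'Monitor'), (1, 'Mouse')])

# The customer's name is stored once, however many orders exist;
# a join reassembles the full picture when it is needed.
rows = conn.execute("""
    SELECT c.name, o.item
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
""").fetchall()
print(rows)
```

Three orders share one customer row, so updating the customer's email touches a single record rather than three.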
Groups Data Logically
Application developers who write applications that "talk" to a database find a normalised database easier to work with. Its data is organised more logically, often mirroring the way the real-world objects the data represents are organised. That makes the developers' applications easier to design, write and change.
Enforces Referential Integrity on Data
Referential integrity is the enforcement of relationships between data in related tables. Without it, a row in one table can lose its link to rows in other tables that hold related data, leaving orphaned and inconsistent records. A normalised database, with foreign-key relationships defined between its tables, can prevent this from happening.
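A short sketch of that enforcement, again using Python's sqlite3 with illustrative table names (note that SQLite only checks foreign keys when the pragma below is switched on): the database rejects an employee row that points at a department which does not exist, so no orphaned row can be created.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.executescript("""
    CREATE TABLE departments (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE employees (
        id      INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        dept_id INTEGER NOT NULL REFERENCES departments(id)
    );
""")
conn.execute("INSERT INTO departments VALUES (10, 'Accounts')")
conn.execute("INSERT INTO employees VALUES (1, 'Bob', 10)")  # valid link

try:
    # Department 99 does not exist, so this row would be orphaned.
    conn.execute("INSERT INTO employees VALUES (2, 'Eve', 99)")
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```

The invalid insert raises an integrity error instead of silently creating a row whose link leads nowhere.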
Slows Database Performance
A highly normalised database, with many tables and joins between them, answers queries more slowly than a database without those attributes. Many people using a normalised database at the same time can also slow it down. In some cases, a certain amount of denormalisation may be required to improve database speed.
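The trade-off can be sketched in a few lines of Python and sqlite3 (the schema and the reporting table are hypothetical examples, not from the article): a denormalised copy of joined data lets a read-heavy report query a single table with no join, at the cost of duplicating the customer name on every row, which must then be kept in sync.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        total       REAL NOT NULL
    );
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Bob');
    INSERT INTO orders VALUES (1, 1, 9.99), (2, 1, 5.00), (3, 2, 20.00);

    -- Denormalised copy for reporting: the customer name is duplicated on
    -- every order row, but reports no longer need a join to read it.
    CREATE TABLE order_report AS
        SELECT o.id AS order_id, c.name AS customer_name, o.total
        FROM orders o
        JOIN customers c ON c.id = o.customer_id;
""")

# Single-table read; the price is redundant data that can drift out of
# date if a customer is renamed and the report table is not refreshed.
for row in conn.execute(
        "SELECT customer_name, total FROM order_report ORDER BY order_id"):
    print(row)
```

Real systems reach for the same idea with materialised views or summary tables; the principle is identical.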
Requires Detailed Analysis and Design
Normalising a database is a complex and difficult task. Large databases with considerable amounts of information, such as ones run by banks, require careful analysis and design before they are normalised. Knowing the intended use of a database, such as whether it should be optimised for reading data, writing data or both, also affects how it is normalised. A poorly normalised database may perform badly and store data inefficiently.
Alan Chester began writing in 2011, drawing on an extensive career in the information technology sector. His specialties include Microsoft Windows support, programming and Web technologies. Chester earned a B.Sc. in information technology from The Open University in the United Kingdom.