

Thursday 14 October 2010


Database Management System (CS403)

Physical Record and Denormalization
Denormalization is the technique of moving from higher to lower normal forms in
database modeling in order to speed up database access. The denormalization process
is applied while deriving a physical data model from the logical one. In logical
database design we group things that are logically related through the same primary
key; in physical database design, fields are grouped as they are stored physically and
accessed by the DBMS. In general this may mean decomposing one logical relation
into separate physical records, combining some relations into one, or doing both. The
one valid reason for denormalization is to enhance performance. However, there are
several indicators which will help to identify systems and tables that are potential
denormalization candidates. These are:
Many critical queries and reports exist which rely upon data from more than one
table; often these requests need to be processed in an on-line environment.
Repeating groups exist which need to be processed as a group instead of individually.
Many calculations need to be applied to one or many columns before queries can be
answered successfully.
Tables need to be accessed in different ways by different users during the same
timeframe.
Certain columns are queried a large percentage of the time; consider 60% or greater
to be a cautionary number flagging denormalization as an option.

We should be aware that each new RDBMS release usually brings enhanced
performance and improved access options that may reduce the need for
denormalization. However, most of the popular RDBMS products will on occasion
require denormalized data structures. There are many different types of denormalized
tables which can resolve the performance problems caused when accessing fully
normalized data. Denormalization must balance the need for good system response
time against the need to maintain data, while avoiding the various anomalies or
problems associated with denormalized table structures. Denormalization goes
hand-in-hand with the detailed analysis of critical transactions through view analysis.
View analysis must include the specification of primary and secondary access paths
for the tables that comprise end-user views of the database. A fully normalized
database schema can fail to provide adequate system response time due to excessive
table join operations.

Denormalization Situation 1:
Merge two entity types with a one-to-one relationship into a single relation. Even if
participation of one of the entity types is optional, so that merging can lead to wastage
of storage on null values, merging may still be a wise decision if the two are accessed
together very frequently. So two relations that have a one-to-one relationship and are
commonly used together should be merged for better performance, as the sketch
below illustrates.
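
As a rough sketch of this situation, assume hypothetical entity types EMPLOYEE and
PARKING_PERMIT in an optional one-to-one relationship; merging folds the permit
columns into the employee relation as nullable fields (all names here are assumptions,
not from the course text):

-- Normalized: every combined access must join the two relations
CREATE TABLE EMPLOYEE (
    empId INT PRIMARY KEY,
    eName VARCHAR(50)
);
CREATE TABLE PARKING_PERMIT (
    empId   INT PRIMARY KEY REFERENCES EMPLOYEE(empId),
    lotNo   INT,
    issueDt DATE
);

-- Denormalized: one relation replaces both; the permit columns stay NULL
-- for employees without a permit (the storage wastage noted above)
CREATE TABLE EMPLOYEE_MERGED (
    empId   INT PRIMARY KEY,
    eName   VARCHAR(50),
    lotNo   INT,
    issueDt DATE
);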

Denormalization Situation 2:
A many-to-many binary relationship is mapped to three relations, so queries needing
data from the two participating entity types require joining all three relations, which
is expensive. Join is an expensive operation from the execution point of view; it takes
time and a lot of resources. Suppose there are two relations STUDENT and COURSE
with a many-to-many relationship between them, giving three relations: STUDENT,
COURSE and ENROLLED. Now if we want to see how many courses a student has
enrolled in, we have to join all three relations, first STUDENT with ENROLLED and
then the result with COURSE, which is quite expensive. In denormalization, the
relation created for the relationship is merged with the relation created for one of the
participating entity types, so the join operation is performed only once, as the query
below shows.
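
As a minimal sketch of the expensive three-relation query, assuming column names
stId, crId and crName (these names and the sample id are assumptions, not from the
course text):

-- Join STUDENT with ENROLLED, then the result with COURSE
SELECT s.stId, c.crId, c.crName
FROM STUDENT s
JOIN ENROLLED e ON e.stId = s.stId
JOIN COURSE c ON c.crId = e.crId
WHERE s.stId = 1001;
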
Consider the following many-to-many relationship between EMP and PROJ, with
WORK as the relation created for the relationship:
EMP (empId, eName, sal)
PROJ (pjId, pjName)
WORK (empId, pjId, dtHired, sal)
If we de-normalize these relations by merging WORK into PROJ, the comparatively
smaller relation, we violate 2NF and the anomalies of 2NF will be there. However,
queries then involve only one join operation, between two tables, which increases
efficiency:
EMP (empId, eName, sal)
PROJ (pjId, pjName, empId, dtHired, sal)
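
With the merged PROJ, a query pairing employees with their project assignments now
needs a single join (column names as assumed above):

-- One join against the merged PROJ replaces the EMP-WORK-PROJ chain
SELECT e.empId, e.eName, p.pjId, p.pjName, p.dtHired
FROM EMP e
JOIN PROJ p ON p.empId = e.empId;
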
It is now up to you to weigh the drawbacks and advantages of denormalization.

Denormalization Situation 3:
Reference data: in a one-to-many situation where the entity type on the one side does
not participate in any other relationship, the many-side entity type is appended with
the reference data itself rather than a foreign key; that is, the reference table is merged
into the main table. We can see this with the STUDENT and HOBBY relations: one
student can have one hobby, while one hobby can be adopted by many students. In
this case HOBBY can be merged into the STUDENT relation. Although this introduces
redundancy of data, no join of two relations is needed, which gives better performance,
as the sketch below shows.
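
A minimal sketch under assumed column names (stId, stName, hobbyId and
hobbyName are assumptions, not from the course text):

-- Normalized: hobby details live in a separate reference table
CREATE TABLE HOBBY (
    hobbyId   INT PRIMARY KEY,
    hobbyName VARCHAR(30)
);
CREATE TABLE STUDENT (
    stId    INT PRIMARY KEY,
    stName  VARCHAR(50),
    hobbyId INT REFERENCES HOBBY(hobbyId)
);

-- Denormalized: the hobby name is stored redundantly in each student row,
-- so SELECT stName, hobbyName FROM STUDENT_MERGED needs no join
CREATE TABLE STUDENT_MERGED (
    stId      INT PRIMARY KEY,
    stName    VARCHAR(50),
    hobbyName VARCHAR(30)
);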
