What is the main difference between a schema in an RDBMS and a schema in a data warehouse? RDBMS Schema
* Used for OLTP systems
* Traditional, older style of schema
* Normalized
* Difficult to understand and navigate
* Not suited to complex extraction and analytical queries
* Poorly modelled for analysis
DWH Schema
* Used for OLAP systems
* Newer generation of schema
* Denormalized
* Easy to understand and navigate
* Complex extraction and analytical queries are easily handled
* Well modelled for analysis
What is a hybrid slowly changing dimension? Hybrid SCDs are a combination of SCD Type 1 and SCD Type 2.
It may happen that in a table some columns are important and we need to track changes to them, i.e. capture their historical data, whereas for other columns we do not care even if the data changes.
For such tables we implement hybrid SCDs, wherein some columns are Type 1 and some are Type 2.
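A minimal Python/pandas sketch of the idea, assuming an illustrative customer dimension where phone is treated as Type 1 (overwrite) and address as Type 2 (keep history); the table and column names are invented for the example.

```python
import pandas as pd

# Illustrative customer dimension: 'phone' is handled as SCD Type 1 (overwrite),
# 'address' as SCD Type 2 (expire the old row and add a new current row).
dim = pd.DataFrame([{"cust_key": 1, "cust_id": "C001", "phone": "555-0100",
                     "address": "12 Oak St", "current_flag": True}])

def apply_hybrid_scd(dim, cust_id, new_phone, new_address):
    current = (dim["cust_id"] == cust_id) & dim["current_flag"]
    old_address = dim.loc[current, "address"].iloc[0]

    # Type 1 column: overwrite the phone on all rows for this customer, no history kept.
    dim.loc[dim["cust_id"] == cust_id, "phone"] = new_phone

    # Type 2 column: if the address changed, expire the current row and insert a new one.
    if old_address != new_address:
        dim.loc[current, "current_flag"] = False
        new_row = {"cust_key": dim["cust_key"].max() + 1, "cust_id": cust_id,
                   "phone": new_phone, "address": new_address, "current_flag": True}
        dim = pd.concat([dim, pd.DataFrame([new_row])], ignore_index=True)
    return dim

dim = apply_hybrid_scd(dim, "C001", "555-0199", "34 Pine Ave")
print(dim)  # two rows: the expired old address and the new current one
```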
What are the different architectures of a data warehouse? There are two main approaches:
1. Top down - (Bill Inmon)
2. Bottom up - (Ralph Kimball)
1. What is incremental loading?
2. What is batch processing?
3. What is a cross-reference table?
4. What is an aggregate fact table?
Incremental loading means loading only the ongoing changes from the OLTP system, rather than reloading everything on each run (a sketch follows below).
Batch processing means running the load as a scheduled job over a group of records at once, rather than processing each change as it arrives.
A cross-reference table maps equivalent keys or codes between systems, so that records from different sources can be matched.
An aggregate fact table contains the measure values aggregated (grouped/summed up) to some level of the hierarchy.
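A minimal Python/pandas sketch of timestamp-based incremental loading, with an aggregate table built from the loaded facts at the end; the source data, the last_updated column and the watermark value are all illustrative assumptions.

```python
import pandas as pd

# Illustrative OLTP extract; last_updated drives the incremental (delta) load.
source = pd.DataFrame([
    {"order_id": 1, "amount": 100.0, "last_updated": "2024-01-01"},
    {"order_id": 2, "amount": 250.0, "last_updated": "2024-01-05"},
    {"order_id": 3, "amount": 75.0,  "last_updated": "2024-01-09"},
])
source["last_updated"] = pd.to_datetime(source["last_updated"])

def incremental_load(source, target, watermark):
    """Load only the rows changed since the last successful load (the watermark)."""
    delta = source[source["last_updated"] > watermark]
    target = pd.concat([target, delta], ignore_index=True)
    return target, source["last_updated"].max()  # new high-water mark for the next run

target = source.iloc[0:0].copy()  # empty target with the same columns and dtypes
target, watermark = incremental_load(source, target, pd.Timestamp("2024-01-03"))
print(target)  # only orders 2 and 3 are picked up

# Aggregate table: the measure summed up to a higher (daily) level of the hierarchy.
daily_sales = target.groupby(target["last_updated"].dt.date)["amount"].sum()
print(daily_sales)
```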
What is a junk dimension? What is the difference between a junk dimension and a degenerate dimension? Junk dimension: a grouping of random flags and text attributes that are moved out into a separate small dimension of their own.
Degenerate dimension: control information kept on the fact table itself. For example, consider a dimension table with fields like order number and order line number that has a 1:1 relationship with the fact table; in this case the dimension is removed and the order information is stored directly in the fact table, in order to eliminate unnecessary joins while retrieving order information.
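A small Python sketch of how a junk dimension can be pre-built: the cartesian product of a few low-cardinality flags gets one surrogate key per combination (the flag names here are just examples).

```python
import itertools
import pandas as pd

# Illustrative low-cardinality flags that would otherwise clutter the fact table.
flags = {
    "is_gift": [True, False],
    "payment_type": ["cash", "card"],
    "order_status": ["new", "shipped"],
}

# Junk dimension: every combination of the flags becomes one row with its own surrogate key.
rows = [dict(zip(flags.keys(), combo)) for combo in itertools.product(*flags.values())]
junk_dim = pd.DataFrame(rows)
junk_dim.insert(0, "junk_key", range(1, len(junk_dim) + 1))
print(junk_dim)  # the fact table then stores only junk_key instead of all the flag columns
```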
What are the possible data marts in retail sales? Product information and sales information.
What is the definition of a normalized and a denormalized view, and what are the differences between them? Normalization is the process of removing redundancies by splitting data into separate related tables.
Denormalization is the process of deliberately allowing redundancies, typically to reduce joins and speed up reads.
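A tiny Python/pandas illustration of the difference, using made-up customer and order tables: the normalized design keeps the city in its own table, and the denormalized view copies it onto every order row.

```python
import pandas as pd

# Normalized: customer attributes live in their own table; orders reference them by key.
customers = pd.DataFrame([{"cust_id": 1, "city": "Pune"},
                          {"cust_id": 2, "city": "Delhi"}])
orders = pd.DataFrame([{"order_id": 10, "cust_id": 1, "amount": 100.0},
                       {"order_id": 11, "cust_id": 1, "amount": 60.0},
                       {"order_id": 12, "cust_id": 2, "amount": 50.0}])

# Denormalized view: the city is repeated on every order row (redundant, but no join at query time).
denormalized = orders.merge(customers, on="cust_id")
print(denormalized)
```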
What is meant by metadata in the context of a data warehouse, and why is it important? Metadata is data about data. A business analyst or data modeler usually captures information about the data - its source (where and how the data originated), its nature (char, varchar, nullable, existence, valid values, etc.) and its behavior (how it is modified or derived, and its life cycle) - in a data dictionary, a.k.a. metadata. Metadata is also maintained at the data mart level, for subsets, facts and dimensions, the ODS, etc. For a DW user, metadata provides vital information for analysis and decision support.
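As a hedged illustration, a single data-dictionary entry might record something like the following; the column and its properties are invented for the example.

```python
# Hypothetical data-dictionary (metadata) entry for one warehouse column.
order_amount_metadata = {
    "column": "order_amount",
    "source": "OLTP orders table, nightly extract",  # where and how the data originates
    "data_type": "decimal(10,2)",
    "nullable": False,
    "valid_values": ">= 0",
    "derivation": "unit_price * quantity",           # how the value is derived on load
}
print(order_amount_metadata)
```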
Differences between star and snowflake schemas? Star schema
A single fact table joined to N dimension tables.
Snowflake schema
A schema in which dimensions have extended (normalized) sub-dimensions is known as a snowflake schema.
Difference between Snowflake and Star Schema. In what situations is the Snowflake Schema better to use than the Star Schema, and when is the opposite true? Star schema contains the dimension tables mapped around one or more fact tables.
It is a denormalized model.
There is no need to use complicated joins.
Queries return results quickly.
Snowflake schema
It is the normalized form of the star schema.
It contains in-depth joins, because the tables are split into many pieces. We can easily make modifications directly in the tables.
We have to use complicated joins, since we have more tables.
There will be some delay in processing a query.
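A small Python/pandas sketch of the same query against both shapes, with made-up product and category tables: the star variant needs one join, the snowflake variant needs two.

```python
import pandas as pd

fact_sales = pd.DataFrame([{"product_key": 1, "amount": 100.0},
                           {"product_key": 2, "amount": 40.0}])

# Star schema: one denormalized product dimension, so a single join answers the query.
dim_product_star = pd.DataFrame([
    {"product_key": 1, "product_name": "Pen", "category_name": "Stationery"},
    {"product_key": 2, "product_name": "Mug", "category_name": "Kitchen"},
])
star = fact_sales.merge(dim_product_star, on="product_key")

# Snowflake schema: the category is normalized into its own table, so an extra join is needed.
dim_product_snow = pd.DataFrame([
    {"product_key": 1, "product_name": "Pen", "category_key": 10},
    {"product_key": 2, "product_name": "Mug", "category_key": 20},
])
dim_category = pd.DataFrame([
    {"category_key": 10, "category_name": "Stationery"},
    {"category_key": 20, "category_name": "Kitchen"},
])
snow = fact_sales.merge(dim_product_snow, on="product_key").merge(dim_category, on="category_key")

# Both give the same answer; the snowflake version simply needs one more join.
print(star.groupby("category_name")["amount"].sum())
print(snow.groupby("category_name")["amount"].sum())
```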
What is VLDB? VLDB stands for Very Large Database. The perception of what constitutes a VLDB continues to grow; a one-terabyte database would normally be considered a VLDB.
What are the data types present in BO, and what happens if we implement a view in the designer and report? There are three different data types: Dimension, Measure and Detail.
A view is nothing but an alias, and it can be used to resolve loops in the universe.