ETL Data Services

A critical component of a BI system is the ability to take your source system data, cleanse it, and then load it into your data warehouse. It is this quality-assured data that forms the backbone of your ‘single version of the truth’ in business intelligence. To achieve the timely loading of quality-assured data, a software program known generically as an ETL (Extract, Transform, Load) tool is used. We’ll take the SAP BusinessObjects suite as an example. Its ETL tool is based on a program originally known as Data Integrator. Data Integrator is often used with another component tool known as Data Quality. As its name suggests, Data Quality provides advanced data cleansing capabilities. Metadata Manager is a third component tool that can be used closely with Data Integrator to provide visibility into data lineage and data dependencies. There are several similar ETL tools on the market, the best known of which are Informatica and Microsoft’s SQL Server Integration Services (SSIS).
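To make the idea concrete, here is a minimal sketch of a single ETL step expressed in SQL. All table and column names (src_customer, dw_customer and so on) are hypothetical, and a real ETL tool such as Data Integrator would generate far richer logic than this:

    -- Extract, transform and load in one pass: copy source rows into the
    -- warehouse, cleansing as we go. All names here are illustrative only.
    INSERT INTO dw_customer (customer_key, customer_name, country_code)
    SELECT s.customer_id,
           TRIM(UPPER(s.customer_name)),        -- standardise the name
           COALESCE(s.country_code, 'UNK')      -- handle missing values
    FROM   src_customer s
    WHERE  s.customer_id IS NOT NULL;           -- reject rows with no key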

Avoid BI Breakpoints With Cornerstone Solutions

BI System Builders, in their End to End BI philosophy, refer to the concept of BI breakpoints. The application of our Cornerstone Solutions® will help you avoid BI breakpoints, but what are they?

BI Breakpoints

Here’s an example of a small breakpoint in a BI system with relational tables that illustrates the escalating effort and associated cost. It starts with a poorly defined report specification. In this example no one identifies the specification as being problematic, so the report is developed according to the specification. However, when the report reaches User Acceptance Testing (UAT) it fails. After consultation with the end user who carried out the UAT it is understood that two important report columns are missing, so the report specification is changed. However, the new data columns are not supported by the dimensional model, so this too must be changed, requiring modelling and DBA work. This in turn means that the universe and the ETL packages must be changed, and so on and so forth. You can see here the interdependence and knock-on effect between the components leading to escalating effort and cost.

However, if you know what to look for you can pre-empt where the BI breakpoints might occur and put quality gates in place. Here are some examples of where BI breakpoints might occur in the dimensional model design of a relational database (a DDL sketch illustrating the first point follows the list):

  • The use of a data warehouse generated artificial primary key rather than a composite primary key on a fact table
  • Snowflaking in a relational model (not SAP OLAP)
  • A table whose nature is not made explicit through a clear naming convention
  • Table joins leading to chasm traps or fan traps
  • Fact to fact joins (there are exceptions)

Now, a dimensional model design can be tricky to fully evaluate while it remains purely on paper or a computer screen, but it is possible at this point to put a quality gate in place to minimise the risk of future BI breakpoints. Firstly, the design can be checked against best practice principles to avoid the types of things listed in the bullet points above. Another quality gate can occur after the physical model has been generated. This quality gate is to execute business questions against the tables using Structured Query Language (SQL) and to validate that they can be answered. To choose the business questions to be executed via SQL, refer to the report specifications or consult the business users.
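For example, a business question such as ‘what were total sales by product category per month?’ might be validated with a query like this, reusing the hypothetical star schema sketched above:

    -- If the model is sound, this business question should resolve to a
    -- simple star join with no convoluted workarounds.
    SELECT d.calendar_month,
           p.product_category,
           SUM(f.sales_amount) AS total_sales
    FROM   fact_sales f
    JOIN   dim_date    d ON d.date_key    = f.date_key
    JOIN   dim_product p ON p.product_key = f.product_key
    GROUP  BY d.calendar_month, p.product_category;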

It is better to write pure SQL statements using a SQL tool than to use SAP BusinessObjects to generate the business questions as queries. It’s a false economy to wait for Web Intelligence reports to be designed before testing the physical model. The reason for this is that the design should be validated as soon as possible, and before the universe is generated. To achieve this, some test data must be loaded into the tables, but it is far more efficient to identify a design weakness at this stage than later on, when changes to the physical model will have a knock-on effect on the universe design and any dependent reports.
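A handful of hand-crafted rows is usually enough for this quality gate. A minimal sketch, again with hypothetical tables and invented values:

    -- Just enough test data to make the quality-gate queries executable
    -- before the universe is built. All values are invented.
    INSERT INTO dim_date (date_key, calendar_month)
    VALUES (20240115, '2024-01');
    INSERT INTO dim_product (product_key, product_category)
    VALUES (1, 'Widgets');
    INSERT INTO fact_sales (date_key, product_key, store_key, sales_amount)
    VALUES (20240115, 1, 10, 99.95);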

If it becomes apparent in the quality gate that overly complex SQL has to be generated to answer a business question, then the design should be revisited and optimised. The business questions should be focussed around any fact to fact joins, snowflaking or bridging tables, as these are potential BI breakpoints. Their existence can lead to very poor performance, miscalculations due to chasm traps and fan traps, and the prevention of drill across.
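As an illustration of why fact to fact joins are on the watch list, joining two fact tables held at different grains typically double counts the higher-grain measure. With hypothetical order tables:

    -- Fan-trap style double counting: each order total repeats once per
    -- order line, so SUM(o.order_total) is silently inflated.
    SELECT c.customer_name,
           SUM(o.order_total) AS inflated_total,  -- wrong: double counted
           SUM(l.line_amount) AS correct_total
    FROM   fact_order      o
    JOIN   fact_order_line l ON l.order_key    = o.order_key
    JOIN   dim_customer    c ON c.customer_key = o.customer_key
    GROUP  BY c.customer_name;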

To minimise the risk of BI breakpoints occurring, the BI Architect should introduce best practice principles along with quality gates early on. Here’s an example of the best practice principle of designing aggregates. The use of aggregates upfront can address three breakpoint areas before they occur:

  • Slow query response times
  • Report rendering times
  • Overly complex reports

It is of course vital to select the correct aggregate type to support the end user requirement. There are several different types of aggregate, for example invisible aggregates (roll-ups), new facts (ETL pre-calculated fields), pre-joined aggregates, and accumulating snapshots. Aggregates allow the processing effort to be pushed away from the report and into the database engine and ETL program, where it can be more efficiently executed. However, the use of specific aggregates should not be decided upon until the detailed report specifications are available. Choosing the most appropriate aggregates to support the reports requires skill and experience and is not a menial task.
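As a sketch of the simplest type, an invisible aggregate (roll-up), a monthly summary could be pre-computed from the hypothetical fact_sales table used earlier (CREATE TABLE ... AS is widely, though not universally, supported):

    -- Roll-up aggregate: pre-summarise to month/category grain so that
    -- queries at this grain never scan the detailed fact table.
    CREATE TABLE agg_sales_month_category AS
    SELECT d.calendar_month,
           p.product_category,
           SUM(f.sales_amount) AS sales_amount
    FROM   fact_sales f
    JOIN   dim_date    d ON d.date_key    = f.date_key
    JOIN   dim_product p ON p.product_key = f.product_key
    GROUP  BY d.calendar_month, p.product_category;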

Aggregate tables can significantly reduce the size of the row set returned to the BI reporting tool. As a rule of thumb, it is always better for BI reporting tools to work with smaller row sets. It is always best to force as much processing effort as possible back to the ‘gorilla’ of the BI system – the database server itself. It can be even better when processing effort and calculations are forced back into the ETL program. However, care should be taken to educate end users when the effort is pushed back into the ETL program for calculated facts, so-called ‘new facts’, because the facts may be semi-additive or not additive at all.
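For instance, a pre-calculated stock balance is a classic semi-additive fact: it can be summed across stores on a single day, but not across days. A hypothetical illustration:

    -- Semi-additive fact: summing stock_balance across days inflates the
    -- figure; an average (or the closing balance) is valid across time.
    SELECT d.calendar_month,
           SUM(f.stock_balance) AS misleading_sum,  -- invalid across time
           AVG(f.stock_balance) AS average_balance  -- a valid alternative
    FROM   fact_stock_snapshot f
    JOIN   dim_date d ON d.date_key = f.date_key
    GROUP  BY d.calendar_month;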

The downside of using aggregate tables is the increased maintenance, e.g. administering ten tables instead of five, and the extra ETL effort required to support them. However, their benefits outweigh their cost.

BI breakpoints can manifest in far more areas than just relational tables; they may be all but invisible, and costly when they occur because of the interdependencies of the BI system. However, applying governance to the BI system around breakpoint areas is not always the most popular notion at implementation. This is because it requires the effort to think cleverly and to apply hard work, rigour, and discipline upfront to pre-empt problems that are almost invisible. We may be tempted to take short cuts, but it’s better to apply the extra early effort. Entering into small skirmishes and battles in the early stages to iron out BI breakpoints is much better than allowing them to go unchecked and develop into big firefights later. Regardless of whether your BI system relies on SAP OLAP cubes and queries or on relational tables, BI breakpoints are a real threat.