Avoid BI Breakpoints With Cornerstone Solutions

BI System Builders, in their End to End BI philosophy, refer to the concept of BI breakpoints. Applying our Cornerstone Solutions® will help you avoid BI breakpoints, but what are they?

BI Breakpoints

Here’s an example of a small breakpoint in a BI system with relational tables that illustrates the escalating effort and associated cost. It starts with a poorly defined report specification. In this example no one identifies the specification as being problematic, so the report is developed according to the specification. However, when the report reaches User Acceptance Testing (UAT) it fails. After consultation with the end user who carried out the UAT, it is understood that two important report columns are missing, so the report specification is changed. However, the new data columns are not supported by the dimensional model, so this must also be changed, requiring modelling and DBA work. This in turn means that the universe and the ETL packages must be changed, and so on. You can see here the interdependence and knock-on effect between the components, leading to escalating effort and cost.

However, if you know what to look for you can pre-empt where the BI breakpoints might occur and put quality gates in place. Here are some examples of where BI breakpoints might occur in the dimensional model design of a relational database:

  • The use of a data warehouse generated artificial primary key rather than a composite primary key on a fact table
  • Snowflaking in a relational model (not SAP OLAP)
  • A table whose nature is not made explicit through a clear naming convention
  • Table joins leading to chasms or fan traps (see the sketch after this list)
  • Fact to fact joins (there are exceptions)
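
To make the fan trap bullet concrete, here is a minimal SQL sketch. The table and column names (orders, order_lines) are hypothetical, purely for illustration:

    -- orders holds one row per order; order_lines holds many rows per order.
    -- Joining them and then summing orders.order_total fans the measure out
    -- across every line, inflating the result.
    SELECT o.customer_id,
           SUM(o.order_total) AS inflated_total  -- counted once per line!
    FROM   orders o
    JOIN   order_lines ol ON ol.order_id = o.order_id
    GROUP  BY o.customer_id;

    -- A safe rewrite aggregates each fact at its own grain before any join.
    SELECT o.customer_id,
           SUM(o.order_total) AS order_total
    FROM   orders o
    GROUP  BY o.customer_id;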

Now a dimensional model design can be tricky to fully evaluate while it remains purely on paper or a computer screen, but it is possible at this point to put a quality gate in place to minimise the risk of future BI breakpoints. Firstly, it can be checked for best practice design principles to avoid the types of things listed in the bullet points above. Another quality gate can occur after the physical model has been generated. This quality gate is to execute business questions against the tables using Structured Query Language (SQL) and to validate that they can be answered. To choose the business questions to be executed via SQL, refer to the report specifications or consult the business users.
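
As an illustration of this second quality gate, here is a hedged sketch. Assume a hypothetical star schema with fact_sales, dim_date, and dim_store tables; a business question such as ‘what is sales revenue by region by month?’ can then be validated directly in a SQL tool:

    -- Business question: what is sales revenue by region by month?
    -- A simple star join proves the physical model can answer it.
    SELECT d.calendar_month,
           s.region,
           SUM(f.sales_revenue) AS total_revenue
    FROM   fact_sales f
    JOIN   dim_date  d ON d.date_key  = f.date_key
    JOIN   dim_store s ON s.store_key = f.store_key
    GROUP  BY d.calendar_month, s.region
    ORDER  BY d.calendar_month, s.region;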

It is better to write pure SQL statements using a SQL tool than to use SAP BusinessObjects to generate the business questions in queries. It’s a false economy to wait for Web Intelligence reports to be designed to test the physical model. The reason for this is that the design should be validated as soon as possible, and before the universe is generated. To achieve this some test data must be loaded into the tables, but it is far more efficient to identify a design weakness at this stage than later on, when changes to the physical model will have a knock-on effect on the universe design and any dependent reports.
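
A handful of hand-crafted rows is usually enough for this purpose. A minimal sketch, reusing the hypothetical tables from the example above:

    -- Just enough illustrative test data to prove the joins resolve and
    -- the measures aggregate correctly (names remain hypothetical).
    INSERT INTO dim_date  (date_key, calendar_month) VALUES (20240101, '2024-01');
    INSERT INTO dim_store (store_key, region)        VALUES (1, 'North');
    INSERT INTO fact_sales (date_key, store_key, sales_revenue)
    VALUES (20240101, 1, 250.00),
           (20240101, 1, 125.50);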

If it becomes apparent in the quality gate that overly complex SQL has to be generated to answer a business question, then the design should be revisited and optimised. The business questions should be focussed around any fact to fact joins, snowflaking, or bridging tables, as these are potential BI breakpoints. Their existence can lead to very poor performance, miscalculations due to chasms and fan traps, and the prevention of drill across.
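
As a hedged sketch of what ‘overly complex’ can look like, consider a hypothetical bridge table resolving a many-to-many relationship between accounts and customers. If a routine business question forces weighted-bridge SQL like this, the design deserves a second look:

    -- Every query through the bridge must remember the weight factor,
    -- or the balances will be overstated (all names are illustrative).
    SELECT c.customer_name,
           SUM(f.balance * b.weight_factor) AS weighted_balance
    FROM   fact_account_balance f
    JOIN   bridge_account_customer b ON b.account_key  = f.account_key
    JOIN   dim_customer            c ON c.customer_key = b.customer_key
    GROUP  BY c.customer_name;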

To minimise the risk of BI breakpoints occurring, the BI Architect should introduce best practice principles along with quality gates early on. Here’s an example of the best practice principle of designing aggregates. The use of aggregates upfront can address three breakpoint areas before they occur:

  • Slow query response times
  • Slow report rendering times
  • Overly complex reports

It is of course vital to select the correct aggregate type to support the end user requirement. There are several different types of aggregates, for example invisible aggregates (roll-ups), new facts (ETL pre-calculated fields), pre-joined aggregates, and accumulating snapshots. Aggregates allow the processing effort to be pushed away from the report and into the database engine and ETL program, where it can be more efficiently executed. However, the use of specific aggregates should not be decided upon until the detailed report specifications are available. Choosing the most appropriate aggregates to support the reports requires skill and experience and is not a menial task.
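
As a hedged sketch of the first type, here is how an invisible aggregate (roll-up) might be built in the ETL, reusing the hypothetical star schema from earlier. The aggregate pre-summarises daily sales to month-by-store grain so that queries read far fewer rows:

    -- Roll-up at month x store grain; names are illustrative only.
    CREATE TABLE agg_sales_month_store AS
    SELECT d.calendar_month,
           f.store_key,
           SUM(f.sales_revenue) AS sales_revenue,
           COUNT(*)             AS source_row_count
    FROM   fact_sales f
    JOIN   dim_date d ON d.date_key = f.date_key
    GROUP  BY d.calendar_month, f.store_key;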

Aggregate tables can significantly reduce the size of the row set returned to the BI reporting tool. As a rule of thumb it is always better for BI reporting tools to work with smaller row sets. It is always best to force as much processing effort as possible back to the ‘gorilla’ of the BI system – the server itself. It can be even better when processing effort and calculations are forced back into the ETL program. However, care should be taken to educate end users when the effort is pushed back into the ETL program for calculated facts, so-called ‘new facts’, because the facts may be semi-additive or not additive at all.
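
To illustrate the semi-additive point, a sketch with a hypothetical fact_monthly_balance table holding a pre-calculated month-end balance:

    -- A month-end balance is semi-additive: it sums safely across
    -- accounts, but summing it across months is meaningless.
    SELECT month_key,
           SUM(month_end_balance) AS total_balance  -- valid: across accounts
    FROM   fact_monthly_balance
    GROUP  BY month_key;
    -- Across time, use an average or the latest value per account instead,
    -- e.g. AVG(month_end_balance) over the year; never SUM across months.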

The downside of using aggregate tables is the increased maintenance, e.g. administering ten tables instead of five, and the ETL effort required to support them. However, their benefits outweigh their cost.

BI breakpoints can manifest in far more areas than just relational tables; they may be seemingly invisible, and costly when they occur because of the interdependencies of the BI system. However, applying governance to the BI system around breakpoint areas is not always the most popular notion at implementation. This is because it requires the effort to think cleverly and to apply hard work, rigour, and discipline upfront to pre-empt problems that are almost invisible. We may be tempted to take short cuts, but it’s better to apply the extra early effort. Entering into small skirmishes and battles in the early stages to iron out BI breakpoints is much better than allowing them to go unchecked and develop into big firefights later. Regardless of whether your BI system relies on SAP OLAP cubes and queries or relational tables, BI breakpoints are a real threat.

Increased Performance Through Optimised Semantic Layer Design

SAP Business Objects Universe

The semantic layer sits between the reporting tool e.g. Web Intelligence and the data warehouse. It is a type of graphical abstraction of how the data warehouse tables and relational joins would look if they could be viewed with the naked eye.

The critical function of the semantic layer is aptly described in its name. All data warehouse platforms are interrogated through some type of code. The universe generates code (hence semantic) so that a query can be issued against the data warehouse. This code will be in the form of Structured Query Language (SQL) for relational platforms and Multidimensional Expressions (MDX) for OLAP based universes.

The powerful feature of the semantic layer is that the business information consumer does not need to understand data warehouses or write code. The business user simply drags and drops objects of interest from a side panel on to their report page, e.g. sales revenue and location objects to generate a table displaying sales revenue by location. The semantic layer generates the code automatically behind the scenes.
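
To illustrate, the code generated behind the scenes for the sales revenue by location example might look something like the following. This is a hedged sketch against hypothetical tables, not the literal output of any particular universe:

    SELECT dim_location.location_name,
           SUM(fact_sales.sales_revenue)
    FROM   fact_sales
    JOIN   dim_location
      ON   dim_location.location_key = fact_sales.location_key
    GROUP  BY dim_location.location_name;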

However, how well optimised the generated code is will depend upon the structure of the data warehouse tables and the semantic layer configuration for relational tables, and upon the cube and query design for OLAP sources.

Whilst it is possible to generate a semantic layer in a few minutes using an automation wizard, the resultant performance of queries may be disappointing. By performance we mean fast queries that return accurate data sets.

With experience it is possible to build up a toolbox of best practice techniques for designing a semantic layer for performance. As a semantic layer designer, a good starting point is to build a query and then ask:

What does the code generated by my semantic layer look like?

If I were to write it freehand in an optimised way, would it be different?

Do I need to change the semantic layer configuration or data warehouse structure to achieve optimised query code?
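
As a hedged illustration of that comparison, suppose the semantic layer generated a correlated sub-select where a simple set-based join would do (table names are hypothetical):

    -- Generated-style SQL: the sub-select runs once per location row.
    SELECT l.location_name,
           (SELECT SUM(f.sales_revenue)
            FROM   fact_sales f
            WHERE  f.location_key = l.location_key) AS revenue
    FROM   dim_location l;

    -- Hand-optimised rewrite: one outer join and one group-by,
    -- still returning locations that have no sales.
    SELECT l.location_name,
           SUM(f.sales_revenue) AS revenue
    FROM   dim_location l
    LEFT JOIN fact_sales f ON f.location_key = l.location_key
    GROUP  BY l.location_name;

When the two differ like this, the answers to the three questions above point to the semantic layer settings or warehouse structures worth revisiting.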

SAP BusinessObjects Sizing, Hardware & Licensing Models (Legacy)

This article concerns the sizing of SAP BusinessObjects software, how it relates to the choice of hardware to support it, and licensing models.

In terms of SAP BusinessObjects Enterprise XI 3.1 it is important to make an informed decision on licensing and hardware based on a SAP BusinessObjects sizing exercise. The sizing exercise involves considerations for the specific Business Intelligence software products chosen to be implemented on BusinessObjects Enterprise and an analysis of the quantity and type of users that will be active on the BI System. A series of calculations is then performed based on the findings.

As an example consider Web Intelligence. A single Web Intelligence Report Server (service) running on BusinessObjects Enterprise can support between twenty-five and forty users making simultaneous requests, depending upon report complexity. The Web Intelligence Report Server requires one CPU to support it. If more than forty users stress the Web Intelligence Report Server simultaneously, there will be a deterioration in performance. This can be rectified by configuring more Web Intelligence Report Servers and adding CPUs, or by limiting the number of users. This principle holds true for several of the SAP BusinessObjects Servers running on SAP BusinessObjects Enterprise.

Whilst we can scale the system retrospectively in this way, it is better to carry out the sizing upfront. The benefits are that it will provide an indication of the number of SAP BusinessObjects Servers to configure, the number of CPUs required to support them, and the disk space that your BI System will require.
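
For a hedged worked example: suppose a system must support 1,000 named Web Intelligence users and, as an assumed rule of thumb, ten per cent of them are active simultaneously at peak. That gives 100 simultaneous requests; at roughly twenty-five to forty users per Web Intelligence Report Server, this suggests three to four server instances, each backed by its own CPU. The figures are illustrative only; the real concurrency ratio should come from the sizing exercise itself.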

So sizing can be a useful guide to hardware procurement. However, it is also a guide to the most effective type of licensing to procure, i.e. CPU-based, Named User, or a combination of both. If sufficient licensing is not procured the BI System performance will suffer; conversely, no one wants to procure more costly software licensing than is necessary.