Steps towards more agility in BI projects

“We now do Agile BI too” – we often hear statements like this at conferences and in discussions with customers and prospects. But can you really just “do” agility in Business Intelligence (BI) and data warehouse (DWH) projects? Is it sufficient to introduce bi-weekly iterations and let your employees read the Agile BI Memorandum [BiM]? In my experience, at least, this doesn’t work in a sustainable way. In this post I’ll try to show the basic cause-and-effect relations which finally lead to the desired agility.


If at the end of the day we want more agility, the first step towards it is “professionalism”. Neither an agile project management model nor an agile BI toolset is a replacement for “the good people” in project and operation teams. “Good” in this context means that the people who work on the development and operation of a BI solution are masters of what they do, review their own work critically and don’t make beginners’ mistakes.

Yet professionalism alone isn’t enough to reach agility in the end. The reason for this is that different experts often apply different standards. Hence the next step is the standardization of design and development procedures. The goal is to use common standards for the design and development of BI solutions – not only within one team, but ideally across team and project boundaries within the same organization. An important aid for this are design patterns, e.g. for data modeling, for the design and development of ETL processes, and for information products (reports, dashboards etc.).

Standardization, in turn, is a prerequisite for the next and, I’d say, most important step towards more agility: the automation of as many process steps as possible in the development and operation of a BI solution. Automation is a key element – “Agile Analytics” author Ken Collier even dedicates multiple chapters to this topic [Col12]. Only if we reach a high degree of automation can we work with short iterations in a sustainable way. Sustainable means that short iterations don’t lead to an increase in technical debt (cf. [War92] and [Fow03]). Without automation, e.g. in the area of testing, this isn’t achievable in practice.

Now we are close to the actual goal: more agility. If one can release new and changed features to UAT e.g. every two weeks, these can be released to production in the same rhythm if needed. And this – the fast and frequent delivery of new features in your BI solution – is what sponsors and end users perceive as “agility”.

(this blog was originally posted in German here)

[BiM] Memorandum for Agile Business Intelligence

[Col12] Collier, Ken: Agile Analytics. Addison-Wesley, 2012

[War92] Cunningham, Ward: The WyCash Portfolio Management System, 1992

[Fow03] Fowler, Martin: Technical Debt, 2003

Testing for BI & DWH – Part 1

Testing has always been part of every IT project plan – that’s true for Business Intelligence (BI) & Data Warehouse (DWH) projects as well. The practical implementation of testing in the BI / DWH environment has confronted me with trouble again and again. Often I’ve had the impression that the BI / DWH world is still in the Stone Age regarding development processes and environments – at least significantly behind the maturity level I know from the software engineering domain. The chart below illustrates this gap:


Cultural differences between the software development and BI community

If anything is tested at all, things in the BI frontend area are typically tested manually. In the DWH backend we see – besides manual tests – self-coded test routines, e.g. in the form of stored procedures or dedicated ETL jobs. However, there is no integration into a test case management tool and no systematic evaluation of the test results. This contrasts heavily with the software engineering domain, where automated regression testing is combined with modern development approaches like test-driven development. For some time now we have seen first publications on BI-specific testing (cf. the (German) TDWI book here). But concepts and paper are patient. Where do we stand with regard to possible tool support, namely for regression tests?
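
To make “self-coded test routine” concrete: such routines often boil down to simple reconciliation checks. A minimal sketch in Python (table names and the connection are hypothetical placeholders, sqlite3 stands in for your DWH driver):

    # Minimal sketch of a typical self-coded DWH test routine:
    # reconcile row counts between staging and core layer.
    # Table names and the connection are hypothetical placeholders.
    import sqlite3  # stand-in for your DWH's DB-API driver

    def count_rows(conn, table):
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    conn = sqlite3.connect("dwh.db")  # replace with your DWH connection
    stg = count_rows(conn, "stg_sales")
    core = count_rows(conn, "fact_sales")
    print("OK" if stg == core else f"MISMATCH: staging {stg} vs core {core}")

The critique above is not that such checks are useless, but that their results typically never make it into a test case management tool.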

Since summer 2014 we at IT-Logix have been actively looking for better (tool-based) solutions for BI-specific testing. We do this together with the Austrian company Tricentis. Tricentis develops the Tosca product suite, one of the world’s leading software solutions for test automation. In a first step we ran a proof of concept (POC) for regression tests of BI frontend artefacts, namely typical reports. One of the architectural decisions was to use Excel and PDF export files as the basis for our tests. With this choice of a generic file interface we avoided the effort of developing BI-product-specific tests, which reduced the implementation effort in the POC to about two days. The goal was to run “before-after” tests in batch mode. We took 20 reports for the POC (these happened to be SAP BusinessObjects Web Intelligence reports, but you can imagine whatever tool you like, as long as it can export to PDF and / or Excel). A current version of the PDF or Excel output of a report is compared with a corresponding reference file. Typical real-life situations where you can use this scenario are listed below (a small scripted sketch of the comparison idea follows after the list):

  • Regularly scheduled regression tests to monitor side effects of ongoing DWH changes: The reference files are created at some point, e.g. after a successful release of the DWH. Imagine there are ongoing change requests on the level of your DWH. Then you want to make sure these changes only impact the reports where a change is expected. To make sure none of your other reports are affected by side effects, you run your regression tests e.g. every weekend and compare the files produced with the reference files.
  • BI platform migration projects: If you run a migration project, for example migrating your SAP BusinessObjects XI 3.1 installation to 4.1, you’ll want to make sure reports still work and look the same in 4.1 as they did in XI 3.1. In this case you create the reference files in XI 3.1 and compare them with the ones from 4.1. (As the export drivers differ between the two versions, the Excel exports in particular are not very useful for this use case. Still, PDF worked pretty well in my experience.)
  • Database migration projects: If you run a database migration project, for example migrating all your Oracle databases to Teradata or SAP HANA, then you want to make sure all of your reports still show the correct data (or at least the same data as was shown with the original data source…)
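
The POC itself was implemented with Tosca, but the basic before-after idea can be sketched in a few lines of Python (folder names are made-up placeholders): compare each current export file with its reference counterpart and flag any difference.

    # Sketch of the before-after batch comparison (not the actual Tosca
    # implementation). Folder names are made-up placeholders.
    import filecmp
    from pathlib import Path

    reference_dir = Path("exports/reference")
    current_dir = Path("exports/current")

    for ref in sorted(reference_dir.glob("*.pdf")):
        cur = current_dir / ref.name
        if not cur.exists():
            print(f"{ref.name}: missing in current export")
        elif filecmp.cmp(ref, cur, shallow=False):
            print(f"{ref.name}: unchanged")
        else:
            print(f"{ref.name}: DIFFERS -> inspect this report")

Note that a byte-wise comparison like this is usually too strict for PDF, as export metadata (e.g. creation timestamps) differs on every run – which is why the POC used content-based comparison modes, as described next.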


Sample configuration of a test case template using the GUI of Tosca (Source: IT-Logix POC)

Tosca searches for the differences between the two files. For Excel this happens on a cell-by-cell basis; for PDF we used both a text-based and an image-compare approach.
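
As an illustration of the cell-by-cell idea (a sketch assuming the openpyxl library and equal sheet dimensions; file names are placeholders):

    # Sketch of a cell-by-cell Excel comparison, assuming openpyxl.
    # File names are placeholders; assumes both sheets have equal dimensions.
    from openpyxl import load_workbook

    ref_ws = load_workbook("report_reference.xlsx").active
    cur_ws = load_workbook("report_current.xlsx").active

    for ref_row, cur_row in zip(ref_ws.iter_rows(), cur_ws.iter_rows()):
        for ref_cell, cur_cell in zip(ref_row, cur_row):
            if ref_cell.value != cur_cell.value:
                print(f"{ref_cell.coordinate}: "
                      f"expected {ref_cell.value!r}, got {cur_cell.value!r}")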


Depending on the chosen test mode the differences can be visualized differently (Source: IT-Logix POC)

Using the solution implemented during the POC, we could see very quickly which reports differed in their current state from the reference state.

Another important aspect of the POC was the scalability of the approach, as I work primarily with (large) enterprise customers. If I have not just 20 but hundreds of reports (and therefore test cases), I have to prioritize and manage the creation, execution and error analysis of these test cases somehow. Tosca helps here with a feature to model business requirements and connect them with the test cases. Based on that, we can derive and report classic test metrics like test case coverage or test execution rate.
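
The metrics themselves are simple ratios. As a hedged illustration (the requirement-to-test-case mapping below is made up):

    # Illustration of the two classic test metrics mentioned above;
    # the requirement -> test case mapping is a made-up example.
    requirements = {
        "REQ-01 monthly sales report": ["TC-01", "TC-02"],
        "REQ-02 regional dashboard":   ["TC-03"],
        "REQ-03 ad-hoc analysis":      [],        # no test case yet
    }
    executed = {"TC-01": "passed", "TC-03": "failed"}  # execution results

    coverage = sum(1 for tcs in requirements.values() if tcs) / len(requirements)
    all_tcs = [tc for tcs in requirements.values() for tc in tcs]
    execution_rate = sum(1 for tc in all_tcs if tc in executed) / len(all_tcs)

    print(f"test case coverage: {coverage:.0%}")        # 67%
    print(f"test execution rate: {execution_rate:.0%}")  # 67%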


Requirements and test cases are tightly related (Source: IT-Logix POC)

In my eyes an infrastructure like Tosca is a basic requirement for systematically increasing and maintaining quality in BI / DWH systems. In addition, advanced methods like test-driven development are only adoptable in BI / DWH undertakings if the necessary infrastructure for test automation is available.

In this blog post I’ve shown a first, rudimentary solution for regression tests of BI frontend tools. In the next article I’ll show possibilities for implementing regression tests for DWH backend components.

Event recommendation: Learn about a real-life scenario of running an SAP BusinessObjects migration project in an agile manner – test automation is key there and is explained in more detail. Join me at SAPinsider’s BI2015 in Nice in mid-June. Find more information here.

(This blog post was first published by me in German here)

The bug paradox: When fixing the bug leads to wrong reports

My workmate Christoph Gnodtke wrote an excellent blog about how to identify SAP BusinessObjects Web Intelligence reports which are impacted by various calculation changes in newer BO versions. What I would like to point out here is that not only BO 4.x migrations are concerned, but also “simple” service / support package upgrades, e.g. from XI 3.1 SP2 to SP6. In my current customer case we’ve found many, many reports which obviously were created in a wrong way: the table structure contains the merged dimension (e.g. [Merged Country]), whereas the cells within the row use a variable that applies e.g. a Where operator on the original dimension – something like =[Some Measure] Where ([Query1].[Country] = "CH") instead of the merged dimension [Merged Country]. In our case the business requirement would have been to use the merged dimension here as well. As outlined here, in former BO support package levels a bug had the effect that the example just mentioned still showed what the business expected. Now that the bug is fixed (e.g. in XI 3.1 SP6), these reports start to show wrong values. Although the software 360Eyes doesn’t solve the problem, it at least helps to identify the reports concerned. Unfortunately we still need to look into every single report and compare the version running on the XI 3.1 SP2 environment with the one on the SP6 environment. In order to speed up this process we use 360Cast. This software provides features similar to BO publications, e.g. for report scheduling and bursting. Compared to BO’s out-of-the-box features, the main advantages for report testing are twofold:

  1. Report selection for a schedule job can be done using good old BO categories. That means you can assign e.g. a test category to all the reports you want to test in a single run. In our customer case we use categories for each data mart. In 360Cast, instead of choosing every single report individually, we just select all reports of this test category.
    In order to run all these reports with one single click, there is just one thing missing: providing all the necessary prompt values, often the same values for the same prompts (like Year) across many reports. This is where the second advantage comes into play:
  2. To provide prompt values, 360Cast accepts both manually entered values (where a value can be applied to all prompts with the same name) and values from an Excel sheet (or even from an SQL query). We usually use the Excel alternative. Based on this we can easily vary input parameters for different test purposes by simply using another Excel sheet; a sketch of such a prompt sheet follows below the list. In addition we can specify the export format and the recipients, e.g. by providing an email address.
    (The values in the drop-down menus correspond to the columns in the underlying Excel spreadsheet.)
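
To illustrate the mechanism (this is a hypothetical sheet layout, not 360Cast’s actual file format): each column corresponds to one prompt or delivery setting, each row to one refresh run.

    # Hypothetical prompt sheet layout (NOT 360Cast's actual format):
    # columns = prompts/delivery settings, rows = one refresh run each, e.g.
    #   Report        | Year | Region | Recipient
    #   Sales Monthly | 2014 | EMEA   | user@example.com
    from openpyxl import load_workbook

    ws = load_workbook("prompts.xlsx").active
    header = [cell.value for cell in next(ws.iter_rows(max_row=1))]
    for row in ws.iter_rows(min_row=2):
        run = dict(zip(header, (cell.value for cell in row)))
        print(f"refresh {run['Report']} with Year={run['Year']}, "
              f"Region={run['Region']} -> send to {run['Recipient']}")

Varying the test data is then really just a matter of pointing to a different spreadsheet.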

In the end, 360Cast doesn’t solve the initial problem either. But at least we don’t need to run every report (identified by 360Eyes earlier) on its own: we can automate the refresh process and easily rerun reports (e.g. with different prompts, simply by modifying the values in the Excel list).

Issue with Null Filters prior to Webi XI 3.1 SP6.3

After some more “theoretical” blog posts back in 2013, I’d like to start the new year with a short technical contribution. As some of you may know, I’m trying to upgrade the BO XI 3.1 SP2.7 environment of one of our major customers to XI 3.1 SP6. This has been a rather painful experience, as we have been working on it for more than 12 months now. Still, there is some light on the horizon: back in December, Fix Pack 6.3 was released, which contains an important bug fix. Not to mention that the bug wasn’t there yet in SP2.7 but was introduced somewhere between SP3 and SP6. The issue is referenced in SAP KB 1897777 and seems to be fixed now.

What is our situation? We have Webi reports containing multiple queries and merged dimensions. If we use dimensions from two different queries in the same table, variables as well as filters that use the “IsNull” function do not work properly.

Here we are with the report in XI 3.1 SP2.7:


Now the result in SP6 (prior to Fixpack 6.3):


… and finally how it looks like with Fixpack 6.3 applied:


The tricky part was to detect this error in the first place (the screenshots above show very simplified tables for debugging purposes). Obviously even our business users didn’t catch it at first sight. Therefore I’m glad if this post makes you double-check the behaviour if you are on a version lower than Fix Pack 6.3. On the other hand: please let me know if you find other (newly introduced) bugs in FP 6.3…

And by the way: Happy New Year and lots of fun in the Business Intelligence world ;-)

Using HANA on Cloudshare Part 1: Setup connectivity

Hi everybody

As you may know, I’m a great fan of Cloudshare – you’ll find my previous post about testing in the cloud here. So far we had to use “traditional” databases like SQL Server or Oracle when working in Cloudshare. Finally SAP managed to get its new baby – HANA – onto various cloud platforms, including Cloudshare (see here for an overview). They provide you with a regular Cloudshare environment with 24 GB RAM and two machines: the HANA server on Linux and a Windows 7 client with HANA Studio. You can register for the 30-day trial sponsored by SAP here.


So far so good. But what is the value of an isolated HANA database? Pretty small. Usually a Cloudshare “environment” is quite isolated network-wise, so my first idea was to extend the 24 GB RAM and add another machine, e.g. with BO4 installed. Unfortunately the maximum RAM per environment is 32 GB. Even sadder, BO4 doesn’t really work with 8 GB of RAM… What to do? A first inquiry with Cloudshare showed that the HANA environment is somewhat special. After some trial and error I found out how you can easily connect to your HANA environment both from your local client and from another Cloudshare environment. Let me share my findings with you in this blog. As the title says, I plan further posts, especially about how to load data into HANA using SAP BO Data Services.

The first thing we need to do is create a static vanity URL for the Cloudshare machine. To do so, switch from “My environments” to “My Account”. There, go to “Vanity URLs” and specify whatever you want – the only thing you can’t take anymore is hana ;-)


As you can see, there are now two public URLs available: the regular one and a second one. In the background these two URLs are mapped to different public IPs. Whereas the first one gives you the default access to ports like 80, 8080 etc., the second one seems to forward HANA-specific ports like 30015 as well. Therefore you don’t need any kind of port forwarding as suggested in forum threads like here. Don’t forget to click “Save changes” at the end of the page.
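
Before adding the system in HANA Studio you can quickly verify that the HANA port is really reachable through this second URL. A minimal Python sketch (the hostname is a placeholder for your own vanity URL):

    # Quick reachability check for the HANA SQL port (30015) through the
    # second public URL. The hostname is a placeholder for your vanity URL.
    import socket

    host = "<your-name>.example-vanity-host.com"
    try:
        socket.create_connection((host, 30015), timeout=5).close()
        print("port 30015 reachable - HANA Studio should be able to connect")
    except OSError as exc:
        print(f"cannot reach {host}:30015 -> {exc}")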

You can now do a first test within the HANA Studio on Cloudshare itself – add a new system and use <your-name>



As you can see in the last screenshot, the only “issue” with the connectivity is that the status information of the HANA server somehow cannot be retrieved; therefore you get a yellow light instead of a green one. But don’t worry, everything works fine.

The next and, for now, final part is connecting from another Cloudshare environment, e.g. using the Information Design Tool:

Create a new relational connection using the HANA JDBC driver:

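For reference, the HANA JDBC URL follows the pattern jdbc:sap://<host>:30015. And if you prefer to verify the logon by script rather than through dialogs, a minimal sketch using SAP’s Python driver hdbcli (shipped with the HANA client; host and credentials are placeholders) could look like this:

    # Minimal logon test against HANA using SAP's Python driver hdbcli
    # (part of the HANA client). Host and credentials are placeholders.
    from hdbcli import dbapi

    conn = dbapi.connect(
        address="<your-name>.example-vanity-host.com",
        port=30015,
        user="SYSTEM",
        password="<your-password>",
    )
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM DUMMY")  # HANA's built-in one-row table
    print(cursor.fetchone())               # ('X',)
    conn.close()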

And finally you can start to build your data foundation based on this connection:


Hope this helps. I wish you a lot of fun playing around with HANA on Cloudshare!

Testing BO BI 4.x using the cloud

Update end of December 2012: The ITX Migration and Demo Environment on Cloudshare is currently no longer available to the public. IT-Logix customers can of course still apply for a shared copy of it. The reason why I have to end the public offering is increased workload on one hand; on the other hand we need the current environment for our customer projects. Unfortunately Cloudshare did not respond to my request to offer us a free environment solely for the purpose of sharing our migration environment.

(Update October 15th 2012 –> current machine list)

(Update October 28th 2012 –> Patch 4.5 installed)

(Update October 31st 2012 –> graphomate including demo dashboard installed –> see Cloudsrv012 in folder graphomate – here you’ll find the user manual too; or open Dashboards on Cloudclnt01 and drag and drop the graphomate component to the dashboard to test it yourself!)

While I discussed general migration challenges in my previous blog, this blog addresses the fact that every new release (even just a service / support package) of SAP BusinessObjects needs intensive testing (by the way, I’m not talking about versions in ramp-up but the generally available versions like, currently, BO 4.0 SP2). SAP seems to work based on the banana principle:

The product ripens with the consumer.

I could now elaborate on how bad this is and how much better other vendors do (do they really?). But I won’t. Instead, I would like to share an opportunity to better cope with the circumstance that you have to test, test and test again whatever you do with SAP BusinessObjects before you “go live”.

When SAP provided its HANA developer environment to partners and customers, I first came to know Cloudshare. In the meantime I’ve become quite enthusiastic about Cloudshare! It has never been easier (or cheaper) for me to create development and test environments, with a multitude of machine templates to choose from and full admin rights on all machines afterwards. But the best thing about Cloudshare is that you can easily share a virtual server environment with others for free (at least for an initial period of two weeks).

This inspired me to create what was finally named the “ITX BO 4.x Migration Assessment and Demo Environment”. This is a virtual server environment in the cloud. It allows for quick and easy «hands-on» tests of current and upcoming releases of SAP BusinessObjects BI products. You can import parts (or all) of an existing BO content from your XI 3.1 system into the XI 3.1 system in the cloud (using BIAR files). Afterwards you can test a migration to BO BI 4.0 SP4 (or you can use BO 4.0 SP4 simply for its own sake). You can get your own copy of the environment for free for two weeks. Afterwards you need a subscription to continue using it.

The environment also includes an installation of the products 360View+ and 360Eyes from GB&Smith. I highly recommend these two products in order to streamline your migration. There will be another blog post where I go into detail on this.

The 4 Available Machines

The Migration Assessment & Demo Environment consists of four machines:

  • BO XI 3.1 SP3 (Server + Client Tools + 360View + 360Eyes)
  • BO BI 4.0 SP4 Patch 5 (Server + Client Tools + Visual Intelligence + 360View + 360Eyes)
  • BO BI 4.0 SP4 Patch 5 (Client Tools, Crystal Reports 2011, Crystal Reports for Enterprise, Dashboards etc.)
  • BO DataServices 4.1 + Information Steward

Request your Free Copy

Please contact me and I’ll share a copy of the current migration environment with you. You’ll find my contact information in the PDF here, or use Twitter with @rbranger.
Please give me some keywords on why you’d like to use the environment, and allow up to two working days for me to grant you access to a copy of the system.

You’ll receive an invitation email directly from Cloudshare, including a link.

Register on Cloudshare

Afterwards you need to open a free account on Cloudshare.

After your successful registration please log in to Cloudshare ProPlus. Your environment is already starting up… Click on «View environment» to see more details…

Wait until all machines are up and running. In the meantime, read the description and get familiar with the machine names etc.

Let’s «own» the environment. Click on the corresponding button! On the right side you now have many more options available. The license is now valid for a longer time than just the original two days.

Testing BO BI 4.0 SP4

Let’s start with using the client tools and the BI Launchpad of BI 4.0 SP4. Select «Fullscreen RDP» from the drop-down menu of «CLOUDCLNT012»:

The password of the BOE Administrator is always IT-Logix32.
The SP4 CMS is running on cloudsrv012 on the default port 6400.
Here are some helpful links:
Open the BI Launchpad at http://cloudsrv012:8080/BOE/BI

Find shortcuts to the available client tools on the desktop or in the start menu.

Cloud Folders

If you need to upload files (e.g. a BIAR file with your own BO content), use «Cloud Folders» and transfer them via FTP:

On the virtual machine you’ll find a shortcut on the Desktop to access your cloud folders:

Have Fun and Happy Migration!

This is it. I hope you find this new opportunity useful. You can use the environment for free for at least 14 days; afterwards you need to purchase a subscription at Cloudshare. By the way, it isn’t expensive and I wouldn’t give mine back… Regarding BO and 360 licenses, only temporary keys are part of the environment. I recommend that you use your own keys. In case you have no keys but would like to test-drive BO or 360 products, please contact me for an extended trial period.

My own environment which is the base for the Migration Assessment Environment is sponsored by my employer IT-Logix. Please consider IT-Logix if you need dedicated expertise for your next BO migration project.

