AgileBI workshop London

Just a quick note for my UK based readers who are interested in AgileBI: I’m proud to have been selected as a speaker for the upcoming Enterprise Data & Business Intelligence (EDBI) conference in London, taking place on November 7 – 10. I’ll lead a half-day workshop on Monday afternoon, November 7, around my Agile BI Maturity Model and of course would be happy to welcome you there too!

Have a look at the workshop description: Introducing Agile Business Intelligence Sustainably: Implement the Right Building Blocks in the Right Order

If you are interested in participating in this event, drop me a note – either by leaving a comment or by contacting me on LinkedIn – and I can send you a voucher to save £200 on the registration fee.

In addition, follow #IRMEDBI on Twitter!


Teradata & WhereScape Test Environment in the Cloud

In this post I outline how I managed to get a cloud based training environment ready in which WhereScape RED, a data warehouse automation tool, connects to a Teradata database test machine.

A few weeks ago I had to organize a so-called “test drive” for a local WhereScape prospect. The prospect uses a Teradata database appliance, so they wanted to evaluate WhereScape RED on Teradata as well. As the local Swiss WhereScape partner, we received a virtual machine containing a SQL Server based WhereScape RED environment. The training had to be run onsite at the customer’s location; IT-Logix provided its set of training laptops, each with 4 GB of RAM. These were my starting conditions.

First of all I thought about how to deal with Teradata for a training setup. Fortunately, Teradata provides a set of preconfigured VMs here. You can easily download them as zipped files and run them using the free VMware Player.

Based on my previous experience with organizing hands-on sessions, e.g. during our local Swiss SAP BusinessObjects user group events, I wanted to use Cloudshare. This makes it much easier (and faster!) to clone an environment for multiple training participants compared to copying tons of gigabytes to multiple laptops. In addition, 4 GB of RAM wouldn’t be enough to run Teradata and WhereScape with decent performance. So I had two base VMs (one from WhereScape, one from Teradata) – a perfect use case for trying Cloudshare’s VM upload feature for the first time.

I started with this support note, which explains how to prepare your local VM and upload it to your Cloudshare FTP folder. From there you can simply add it to an environment:


After having uploaded both VMs it looks like this in Cloudshare:


I increased the RAM and CPU power a bit and, more importantly, configured the network between the two machines:

Go to “Edit Environment” -> “Edit Networks”:


Here I had to specify which virtual network to connect the VMs to. Please keep in mind that this doesn’t provide an automatic DHCP server or similar. Either you run one within one of your machines or – as in my case – you set static IPs within the individual VMs (both were delivered with dynamic IPs assigned by VMware Player). Changing the IP wasn’t a big deal, neither on Windows nor on Linux.
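On the Linux side, the static IP boils down to a small interface config file. Here is a rough sketch of what it looks like on the SLES-based Teradata VM; the interface name, IP and netmask are examples, and the real file lives at /etc/sysconfig/network/ifcfg-eth0 (written to a local example file here):

```shell
# Sketch of a static-IP interface config for the SLES-based Teradata VM.
# Device name and addresses are examples; on the VM this content goes into
# /etc/sysconfig/network/ifcfg-eth0, followed by a network restart.
cat > ifcfg-eth0.example <<'EOF'
BOOTPROTO='static'
IPADDR='192.168.10.20/24'
STARTMODE='auto'
EOF
cat ifcfg-eth0.example
```

On the Windows (WhereScape) VM the equivalent change is done via the network adapter’s IPv4 properties dialog.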


But I quickly found out that after this change the Teradata service no longer ran properly.

First of all I created a simple test case to check whether I could connect from the WhereScape VM to the Teradata machine. Besides a simple ping (which worked), I installed the Teradata Tools & Utilities on the WhereScape machine. As I couldn’t establish a proper connection, I had to google a bit. The following article gave me the hint to add a “cop” entry to the hosts file:
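In concrete terms, the entry looks roughly like this – the IP and machine name are examples from my setup. Teradata clients look up “&lt;name&gt;cop1” to locate the first node, so the alias has to end in “cop1”; the line goes into /etc/hosts on the Teradata VM (and the same alias into the hosts file on the client machine). The sketch writes to a local example file instead of the real hosts file:

```shell
# Sketch: the "cop" hosts entry (IP and name are examples). Teradata clients
# resolve "<name>cop1" to find node 1, so the alias must end in "cop1".
# On the VMs this line belongs in /etc/hosts (Linux) or
# C:\Windows\System32\drivers\etc\hosts (Windows).
echo "192.168.10.20  tdexpress tdexpresscop1" >> hosts.example
cat hosts.example
```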


After a restart of the machine, Teradata was up and running again. By the way, you can verify this with the command “pdestate -a”:
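A small guarded version of that check – pdestate only exists on the Teradata VM itself, so this sketch falls back to a hint when the tool isn’t installed:

```shell
# Hedged sketch of the status check: pdestate ships with Teradata and is only
# available on the Teradata VM, so fall back to a hint everywhere else.
if command -v pdestate >/dev/null 2>&1; then
  out=$(pdestate -a)    # reports the PDE/DBS state when Teradata is up
else
  out="pdestate not found - run this on the Teradata VM"
fi
echo "$out"
```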


The next step was to create a new metadata repository for WhereScape on the Teradata database. For this I first created a new schema and user in Teradata and then created the metadata repository using the WhereScape Administrator:
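The user creation part boils down to a single SQL statement. This is a hedged sketch – the user name, permanent space and password are placeholders, not the values from my environment; the WhereScape Administrator then creates the repository tables inside that user’s space. The sketch just writes the SQL to a local file:

```shell
# Sketch: SQL for a dedicated Teradata user to hold the WhereScape RED metadata
# repository (name, perm space and password are placeholders). On the Teradata
# VM this would be submitted with e.g.: bteq < create_ws_meta.sql
cat > create_ws_meta.sql <<'EOF'
CREATE USER ws_meta FROM dbc
  AS PERMANENT = 2000000000,  -- roughly 2 GB for the metadata tables
     PASSWORD = changeme;
EOF
cat create_ws_meta.sql
```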


In WhereScape RED I created a connection to point to the new Teradata database:


… and finally loaded a few tables from the SQL Server to Teradata:


Once I had finished the work, the most important step was to create a snapshot:


Based on this snapshot I finally cloned the environment for the number of test-drive participants with just a few clicks. In the end, every participant had their own isolated environment consisting of a full stack: source database (SQL Server), WhereScape, and the target DWH database (Teradata).