Teradata & WhereScape Test Environment in the Cloud

In this post I outline how I got a cloud-based training environment ready in which WhereScape RED, a data warehouse automation tool, connects to a Teradata test database.

A few weeks ago I had to organize a so-called “testdrive” for a local WhereScape prospect. The prospect uses a Teradata database appliance and therefore wanted to evaluate WhereScape RED on Teradata as well. As a Swiss-based WhereScape partner we received a virtual machine containing a SQL Server based WhereScape RED environment. The training had to be run onsite at the customer’s location, and IT-Logix provided its set of training laptops, each with 4 GB of RAM. These were my starting conditions.

First of all I thought about how to provide Teradata for a training setup. Fortunately, Teradata provides a set of preconfigured VMs here. You can easily download them as zipped files and run them using the free VMware Player.

Based on my previous experience with organizing hands-on sessions, e.g. during our local Swiss SAP BusinessObjects user group events, I wanted to use Cloudshare. This makes it much easier (and faster!) to clone an environment for multiple training participants than copying tons of gigabytes to multiple laptops. In addition, 4 GB of RAM wouldn’t have been enough to run Teradata and WhereScape with decent performance. So I had two base VMs (one from WhereScape, one from Teradata) – a perfect use case for trying Cloudshare’s VM upload feature for the first time.

I started with this support note, which explains how to prepare a local VM and upload it to your Cloudshare FTP folder. From there you can simply add it to an environment:

01_UploadVM1

After having uploaded both VMs it looks like this in Cloudshare:

02_CloudshareEnvironment

I increased the RAM and CPU power a bit and, more importantly, configured the network between the two machines:

Go to “Edit Environment” -> “Edit Networks”:

03_NetworkSettings

Here I had to specify to which virtual network I’d like to connect the VMs. Keep in mind that this doesn’t provide a DHCP server or similar. Either you run one within one of your machines or – as in my case – you set static IPs within the individual VMs (both were delivered with dynamic IPs assigned by VMware Player). Changing the IP wasn’t a big deal, neither on Windows nor on Linux.
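On the Linux side (the Teradata VMs are SUSE based) the change boils down to a few commands – a sketch only, with the interface name, addresses and config file path as assumptions; adapt them to your own VMs:

```shell
# Assign a static IP to the Teradata VM (all values are examples).
ip addr flush dev eth0
ip addr add 192.168.100.20/24 dev eth0
ip route add default via 192.168.100.1

# To survive a reboot on SUSE, persist the settings instead, e.g. in
# /etc/sysconfig/network/ifcfg-eth0:
#   BOOTPROTO='static'
#   IPADDR='192.168.100.20/24'
```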

04_TD_Setting1

But I quickly found out that afterwards the Teradata service no longer ran properly.

First of all I created a simple test case to check whether I could connect from the WhereScape VM to the Teradata machine. Besides a simple ping (which worked) I installed the Teradata Tools & Utilities on the WhereScape machine. As I couldn’t establish a proper connection, I had to google a bit. The following article gave me the hint to add a “cop” entry to the hosts file:
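The entry itself is a one-liner in the client’s hosts file: the alias must be the Teradata system name (TDPID) with the suffix cop1, because the Teradata client resolves <tdpid>cop1, <tdpid>cop2, … when connecting. A sketch – the name “tdexpress” and the IP are assumptions, and it works on a scratch copy so you can try it out before touching /etc/hosts (or C:\Windows\System32\drivers\etc\hosts):

```shell
# Append a COP alias for the Teradata node to a scratch copy of the
# hosts file; point HOSTS_FILE at the real one once it looks right.
HOSTS_FILE="$(mktemp)"
cp /etc/hosts "$HOSTS_FILE" 2>/dev/null || true
echo "192.168.100.20  tdexpress tdexpresscop1" >> "$HOSTS_FILE"
grep "cop1" "$HOSTS_FILE"
```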

04_TD_Setting2

After a restart of the machine, Teradata was up and running again. By the way, you can verify this with the command “pdestate -a”:

04_TD_Setting3

The next step was to create a new WhereScape metadata repository on the Teradata database. For this I first created a new schema and user in Teradata and then created the metadata repository using the WhereScape Administrator:
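That first step is plain DDL; a minimal BTEQ sketch might look like the following. The TDPID “tdexpress”, all names, passwords and space sizes are assumptions – check the WhereScape installation guide for the rights and space the repository user really needs:

```shell
# Sketch only: create a user for the WhereScape metadata repository.
# TDPID, credentials and sizes below are illustrative.
bteq <<'EOF'
.LOGON tdexpress/dbc,dbc
CREATE USER ws_meta FROM dbc
  AS PASSWORD = ws_meta_pw,
     PERM  = 1000000000,   -- ~1 GB for the metadata tables
     SPOOL = 1000000000;
.LOGOFF
.QUIT
EOF
```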

06_WhereScapeSetup

In WhereScape RED I created a connection to point to the new Teradata database:

05_WhereScapeConnection

… and finally loaded a few tables from the SQL Server to Teradata:

07_WhereScape_Data

Once I finished the work, the most important step is to create a snapshot:

08_Snapshot

Based on this snapshot I finally cloned the environment for the number of participants in the testdrive with just a few clicks. In the end, every participant had their own isolated environment consisting of a full stack: source database (SQL Server), WhereScape and the target DWH database (Teradata).

Using HANA on Cloudshare Part 1: Setting up connectivity

Hi everybody

As you may know I’m a great fan of Cloudshare; you’ll find my previous post about testing in the cloud here. So far we had to use “traditional” databases like SQL Server or Oracle in Cloudshare. Finally SAP managed to bring its new baby – HANA – to various cloud platforms, including Cloudshare –> see here for an overview. They provide you with a regular Cloudshare environment with 24GB RAM and two machines: the HANA server on Linux and a Win7 client with HANA Studio. You can register for the 30-day trial sponsored by SAP here:

01_Environment

So far so good. But what is the value of an isolated HANA database? Pretty small. In Cloudshare an “environment” is usually quite isolated network-wise, so my first idea was to extend the 24GB RAM and add another machine, e.g. with BO4 installed. Unfortunately the maximum RAM per environment is 32GB. Even more sadly, BO4 doesn’t really work with 8GB of RAM… What to do? A first inquiry with Cloudshare showed that the HANA environment is somewhat special. After some trial and error I found out how you can easily connect to your HANA environment both from your local client and from another Cloudshare environment. Let me share my findings with you in this blog. As the title indicates, I plan some further posts, especially about how to load data into HANA using SAP BO Data Services.

The first thing we need to do is create a static vanity URL for the Cloudshare machine. For this, switch from “My environments” to “My Account”. There, go to “Vanity URLs” and specify whatever you want – the only name you can’t take anymore is hana 😉

02_VanityURL

As you can see, there are two public URLs available now: the regular one with .cld.sr and a second one with vm.cld.sr. In the background these two URLs are mapped to different public IPs. Whereas the first one gives you default access to ports like 80, 8080 etc., the second one also seems to forward HANA-specific ports like 30015. Therefore you don’t need any kind of port forwarding as suggested in forum threads like here. Don’t forget to click “Save changes” at the end of the page.
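You can verify this from any remote machine with a simple port probe plus the matching JDBC URL – a sketch, where “myname” stands for your own vanity name and port 30015 assumes HANA instance number 00 (the pattern is 3<instance>15), assuming the nc utility is available:

```shell
# Probe the HANA SQL port through the vm.cld.sr address (name is an example).
nc -z -w 5 myname.vm.cld.sr 30015 && echo "HANA port reachable"

# The corresponding JDBC URL for HANA Studio or the Information Design Tool:
#   jdbc:sap://myname.vm.cld.sr:30015
```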

You can now do a first test within the HANA Studio on Cloudshare itself – add a new system and use <your-name>.vm.cld.sr:

03_AddSystem1 04_AddSystem2

05_AddSystem3 06_AddSystem4

As you can see in the last screenshot, the only “issue” with the connectivity is that the status information of the HANA server cannot be retrieved; therefore you get a yellow light instead of a green one. But don’t worry, everything works fine.

The next and so far final part is to connect from another Cloudshare environment, e.g. using the Information Design Tool:

Create a new relational connection using the HANA JDBC driver:

07_AddConnection1  08_AddConnection2

And finally you can start to build your data foundation based on this connection:

09_CreateDF

Hope this helps. I wish you a lot of fun playing around with HANA on Cloudshare!

How to promote a Crystal Report with Dynamic Cascading Prompts in BI4

This (and most probably some future) blog post will detail my experience using Promotion Management (LCM) in BusinessObjects release 4.0. The following explanations are mostly based on the description I’ve just handed in to SAP support. I will do my best to keep this post current regarding answers from SAP support…
Infrastructure: I did all my tests on Cloudshare (see my blog here). Currently using BI 4.0 SP4 Patch 4.

Source system: Cloudsrv012
Target system: Cloudsrv016

Promotion Management is primarily used on the Source System.

Update from SAP support

SAP support was quite quick and told me that the issue described in this post will be fixed in Patch 4.7 (including the problem of promoting BusinessViews residing in subfolders).

Terms

Dynamic Cascading Prompt (DCP): A parameter object in Crystal Reports 2011 which contains a dynamic list of values (LOV).

List of Values (LOV): A list-of-values object based on a Business View (BV). It can be created manually in the BV Manager.

Business View (BV): Business Views are created in the BV Manager (which is part of the Client Tools setup of the BI Platform). Business Views are based on Business Elements, Business Elements on Data Foundation objects, and Data Foundation objects on Data Connection objects. These items are generally considered “repository objects” (at least in the XI 3.1 Import Wizard this was the case).

Initial Setup

Create a LOV with its underlying BVs based on the Xtreme database (using an ODBC connection to a local Access file). Save them in a subfolder (in my example “rbra_Test”):

Create a simple Crystal Report (in CR 2011) containing a parameter with a DCP:

Save this report in the source system. In the BI Launchpad the parameter looks like this:

Problem Description

Goal: Simply promote the above created report from source to target system using promotion management.

Steps taken:

Create new promotion job in Source System including all dependencies:

Then promote:

Result: Partial Success:

My guess: the problem is that the BV objects are in a subfolder. Therefore I move the BV objects in the source system to the root folder:

Report still works in source system:

Take the same promotion job as before and refresh dependencies – no subfolder is shown in the Business View branch anymore:

Promote again:


Now it shows Success:

It looks like a success in BV Manager too:

and also in Crystal Reports:

BUT: If you open the report in BI Launchpad, you don’t see any list of values:

Tested Workarounds

Promote BusinessViews separately (not working)

I tried to promote the BusinessView and LOV objects separately from the report. I had the same issue regarding repository objects stored in subfolders. Besides this I found the following:

  • Just promoting the BusinessView and underlying objects works fine according to Promotion Management. But if you look into the BV Manager you’ll get errors like this:
  • Promotion Management doesn’t allow you to select LOV objects separately.
  • If you then promote the same Crystal Report containing the DCP but do NOT select the dependencies, all the BusinessView objects (and LOV objects) are promoted anyway and break the functioning of the BusinessView and LOV in the target system. So far we couldn’t find a way to promote a Crystal Report with a DCP without automatically promoting all dependencies and thereby breaking the target system.

Remove DCP, export / import LOV using BV-Manager (not working)

To escape the fact that Promotion Management automatically promotes DCP objects etc. (see the point above) I tried the following:

  • In the source system, set the Crystal Reports parameter to a static list of values and save the report.
  • Promote it – no repository objects are promoted.
  • In order to “promote” the LOV objects independently of the report, we used the option to export BV and LOV definitions in the BusinessView Manager.
  • We imported the LOV object into the target system using the import option of the BusinessView Manager.
  • Reset the static prompt to the imported LOV.
  • Result: the LOV of the second level doesn’t work.

Remove DCP, export / import BV, recreate LOV (working)

  • In the source system, set the Crystal Reports parameter to a static list of values and save the report.
  • Promote it – no repository objects are promoted.
  • Promote the BusinessView using export / import in the BusinessView Manager only.
    Using Promotion Management doesn’t work properly! (see the errors in BV Manager above)
  • Recreate the LOV objects manually in the target system.
  • Reset the static prompt to the newly created LOV.

Although this is NOT what I expect from SAP in terms of properly working software – at least these final steps lead to a working solution without too much manual recreation of repository and report objects!

For all the SAP internal guys who want to track this (and support me ;-)): the message number with the same case description as above is 971741 / 2012. I will open some more cases, as the things shown above are just the tip of the iceberg of what doesn’t work properly in Promotion Management.

Backup & Recovery in BO 4.0

This post is dedicated to the available means of backup & recovery in SAP BusinessObjects BI 4.0. There are several changes compared to the previous version XI 3.1, including some plainly missing functionality.

The recovery scenario: Partial restore of report and universe objects

In my eyes the typical recovery scenario is a partial restore. It happens quite quickly that you either delete a folder with a whole bunch of reports or want to revert a change in a report or universe. Especially if you consider the ad-hoc reporting capabilities of Web Intelligence, you probably don’t have a local copy of the corresponding report. In addition, people who make a mistake leading to a recovery procedure tend to notice it only after a certain time: they request the recovery of a given folder not immediately after its deletion, but perhaps two weeks later, when they realize they deleted a few reports too many. In the meantime the system may have been used heavily, which is why a full recovery of the system itself is not really an option. What you need in such a situation is the possibility to recover only selected objects from a backup set to the original system. In this blog I will concentrate on this scenario. I use “original system” as the term for the system on which I take the backup and to which I want to recover something back.

The available possibilities in BO 4.0

There are three major approaches to taking a backup of BO 4.0 and recovering partial content:

  1. Create some kind of BIAR file (multiple options available, see below) and try to recover selected elements back to the original system.
  2. Do a full backup, restore the full backup to a separate BO “recovery” system and finally use LCM to “promote” selected objects back to the original system.
  3. Use a professional backup & recovery solution like 360View from GB & Smith.

Let me evaluate the above approaches in the next few sections.

The BIAR approach

The BO Admin Guide states in section 12.1.1.3 (page 466):

It is recommended that you use the Lifecycle management console for SAP BusinessObjects Business Intelligence platform to regularly back up your Business Intelligence content, such as reports, users and groups, and universes. Having current backups of your content makes it possible to restore your Business Intelligence without having to restore your entire system or your server settings.

Whoever wrote this sentence at SAP doesn’t seem to have any concrete experience with LCM, nor a clear idea of what a backup & recovery tool should fulfil in practice – just compare it with any given freeware tool for backing up your Windows files. So to point this out right at the beginning: keep your hands off trusting LCM as your one and only backup solution for BO. LCM is a tool to promote (or in SAP jargon ‘transport’) objects from one environment to another. LCM was never made to be a backup solution. Let me explain in some more detail:

The preferred way to take a backup using LCM is to export an LCM job into an LCMBIAR file. With FP3 / SP04 you can finally schedule the export to such a file. But there are some critical shortcomings (as of SP04 Patch 1; anyone with differing experience on a higher patch level, please comment below!):

  • Reimporting the LCMBIAR file to the original system on which you created the file will fail as soon as you delete the original LCM job. What real backup solution makes itself dependent on the job object that created the backup set?
  • Whenever you import a BIAR file you don’t have an option to select / deselect objects to restore. It’s all or nothing: either you import all the contents of your (LCM)BIAR file or nothing at all.
  • LCMBIAR files do not save your successful instances; only recurring instances are backed up. And by the way, you cannot decide whether to restore recurring instances or not – as mentioned before, you have to restore everything belonging to the BIAR file.

The next approach using BIAR files is to use the new Upgrade Management Tool or the “legacy” biarengine.jar. The good news here is that LCM is finally capable of importing regular BIAR files created by these two means. The following things should be considered:

  • In contrast to LCMBIAR files, regular BIAR files can be imported without any dependency on an LCM job.
  • Both the Upgrade Management Tool and biarengine.jar back up recurring as well as successful report instances.
  • Unfortunately SAP was so stupid – sorry to say it like this, but I couldn’t find any other term to express my feelings about this situation – as to remove (or simply not allow…) the possibility to import a BIAR file of the same software version using the Upgrade Management Tool. In XI 3.1 it was quite standard during a recovery procedure to load the BIAR file using the Import Wizard and then select only the objects you need to recover. In combination with LCM’s inability to select individual objects, this is a really sad thing (#factoryofsadness …). Dear SAP: just give us back basic functionality like selective restore, either in the Upgrade Management Tool or in LCM!

For those interested in biarengine.jar – I couldn’t find any hints about it in the BI4 documentation, so I took the admin guide from XI 3.1, and it seems that everything still works as before (for more details see this blog):

First of all you need a properties file to specify what you want to be backed up:

exportBiarLocation=C:/temp/BiarEngineBackup.biar
action=exportXML
userName=Administrator
password=<your password>
CMS=cloudsrv012:6400
authentication=secEnterprise
exportDependencies=true
exportQuery=select * from ci_infoobjects where si_parent_folder = <your own id or query> OR SI_ID = <your own id or query>

Save these lines of text in a file, e.g. mybackup.properties. Then you can execute the following commands on the command line or in a batch file (replace C:\BOE40 etc. with your own BO install path):

cd "C:\BOE40\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win32_x86\jre\bin"
java -jar "C:\BOE40\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\java\lib\biarengine.jar" C:\Temp\mybackup.properties

You can use either the biarengine or LCM to restore content to the original system. As you can only restore the full BIAR file, I recommend having a dedicated recovery or sandbox system in place where you can import the BIAR file as a whole and then use LCM to restore only what you need back to the original system. You need such a dedicated system anyway for the second major approach: restoring objects from a full backup.

The full backup / restore approach

As long as you have a dedicated system available to “mount” the full backup into a running BO system, this approach is quite straightforward and nothing to be afraid of (as long as you know what you’re doing ;-)). The following high-level steps will guide you through the recovery process:

  1. Take a full backup of your original system on a regular basis. This includes at least a backup of your CMS system / repository database and the FileStore folder(s). As of FP3/SP04 SAP added an official “hot backup” option (see the “Settings” area in the CMC), so you don’t need to shut down your BO system to take the backup. Just define a time window in which you create both backups: first the system database, then the FileStore. In addition to the system database and FileStore, note the Cluster Key and Administrator password of the original system!
  2. Prepare the recovery system: I assume you have an already installed «Recovery» system. This can be a sandbox or e.g. a QA system you want to use temporarily as your recovery system. Stop all existing SIA and Tomcat services on the «Recovery» system. Have a look into Task Manager and make sure that all CMS.exe and sia.exe processes have stopped.
  3. Restore the system DB: Restore the backup of your «original» system database to a new, empty database / schema. After the restore, execute the following SQL statement on the restored DB to remove all server entries: DELETE FROM CMS_INFOOBJECTS7 WHERE ParentID=16
  4. Restore the FileStore: On the «Recovery» system rename the existing FileStore folder to «FileStore_orig». Restore the FileStore from the «original» system to the «Recovery» system into its original location.
  5. Create an ODBC source: In case your recovered system DB is hosted on SQL Server, create a 64bit ODBC source pointing to it on the «Recovery» system.
  6. Create Recovery SIA (1/2): On the «Recovery» system, create a new SIA with a new CMS. Point the CMS to the recovered system database (probably using the ODBC source created in the previous step). Select the «Use a temporary CMS» option.
  7. Create Recovery SIA (2/2): Once the new SIA is added, change the cluster name from the original name to a new one, e.g. «Recovery». Start the newly created SIA and check in Task Manager whether the CMS starts up and keeps running. Then stop the SIA again.
    (if you want you can combine step 6 and 7 and add only one additional SIA)
  8. Create a second SIA to add regular servers: Add a second SIA including regular servers; you can even add a second CMS. Start this SIA and Tomcat. Log in to the CMC on the «Recovery» system and check in the Servers area whether all expected servers are up and running.
  9. Verify the File Repository Servers: Check whether the file paths indicated in the properties of the Input and Output File Repository Servers correspond to the location where the FileStore has been recovered.
  10. Run the Repository Diagnostic Tool: Run the Repository Diagnostic Tool in order to remove any inconsistencies between the File Repository Servers and the (recovered) system database.
    (replace C:\BOE4\ with your own BO install path; more info about the command line parameters can be found in the BO admin guide):
    cd "C:\BOE4\SAP BusinessObjects Enterprise XI 4.0\win64_x64"
    reposcan.exe -dbdriver sqlserverdatabasesubsystem -connect "UID=sa;PWD=<password>;DSN=<ODBC_Name>" -dbkey <cluster key> -inputfrsdir "C:\BOE4\SAP BusinessObjects Enterprise XI 4.0\FileStore\Input" -outputfrsdir "C:\BOE4\SAP BusinessObjects Enterprise XI 4.0\FileStore\Output"
  11. Do a «selective restore» from the recovery to the original system using LCM (or one of the other ways explained above, depending mostly on whether you need to recover report instances or not).
  12. Recreate the original settings on the Recovery system: If you don’t need the «Recovery» system anymore, you can reset everything to match the original settings. Simply stop the created SIAs and either set their startup mode to disabled or delete them from the system entirely (a practical how-to can be found here). Rename the FileStore on the Recovery system from “FileStore_orig” back to FileStore; this means you first need to delete the recovered FileStore folder or give it another name. In addition you can remove the recovered database (schema).

Once you have exercised this process a few times, it will serve you as a reliable way to recover (partial) elements in a reasonable amount of time. But it is still not the “elegant” way to go. Therefore I would like to introduce you to my third and favored approach: what SAP fails to deliver is usually delivered by one of the add-on providers.

The professional approach

As a professional BO administrator I like professional tools. 360View is one of my favorites, and not only regarding backup & recovery. But backup & recovery is one of the major reasons why I recommend this solution. 360View doesn’t keep any separate information outside the regular BO system database; it’s just an alternative view of its content in addition to the CMC.

Let the pictures speak for themselves:

First of all you need to create a backup job in the web-based interface of 360View; you can choose from various object types. In addition you can choose whether to include subfolders, report instances or – in case you choose groups and users – Favorites folders:

You can schedule this job to run “now” or at a later point in time. By the way, all jobs scheduled with 360View can be triggered by an external scheduler like $Universe etc.

Once the backup job has run, you’ll find a new entry in the context menu of any given folder or document:

And for folders which don’t exist anymore at all, you’ll find the trash bin icon:

Then you can choose from the available recovery options, as you are used to from any other professional backup & recovery solution:

That’s it. The only thing you need to do in addition is to back up the 360View file folder on your BO server with a regular file backup tool.

Are you dissatisfied with the existing backup & recovery capabilities in BO 4.0 too? Or do you see other ways of improving this process? Let me and others know and write a comment! Thanks for your participation!

If you need some kind of playground for either approach, have a look at Cloudshare.com and / or use my preconfigured BO4 environment. Of course this includes a 360View installation. For those currently visiting the ASUG SAP BusinessObjects User Conference: go and visit GB & Smith at their booth 221!

For European / German-speaking people: have a look at www.boak.ch – I’ll give five presentations myself next week. Backup & Recovery will be covered in my “What’s New in BO 4.0” session.

BO 4.0 FP3: get eFashion and other MS Access datasources working

I’ve just noticed a problem with our IT-Logix Migration Assessment Environment. The problem concerns eFashion and other MS Access based demo databases: you get the following error in (online) Webi, both on BO 4.0 SP02 and with FP3 – due to 64bit connectivity problems:

You usually don’t get the error in Webi Rich Client. In this post I will quickly outline the reasons for this and how to solve it:

First of all: Others got these errors too:

http://scn.sap.com/thread/2118132

http://scn.sap.com/thread/2043784

The answers from SAP (namely http://scn.sap.com/people/henry.banks) are not really satisfying. Of course it is not very clever to use Access as a demo datasource – but why does SAP then provide these (Access based) samples in BO 4.0 and not e.g. within the database they include in the setup? Anyway, there are three options you can choose from:

  1. Move your eFashion and other MS Access databases to a “real” database like SQL Server (Express), MySQL etc. It just has to be accessible by both 32 AND 64 bit drivers.
  2. Migrate to BO4 FP3 – and read the rest of this blog on how to get Access databases running…
  3. If you are on BO4 SP2 – sorry, I don’t know a way to get Access running with a 64bit driver; if you are interested in the reason, read on… (If you know another solution, please post a comment!)

In BO 4.0 all the client tools (like the Webi Rich Client) still use 32bit drivers. Regarding eFashion this is not a problem, as any default Windows XP / 7 / Server installation provides preinstalled drivers. The BOE setup automatically creates the corresponding 32bit ODBC datasources. Therefore you’re all fine.

On the server side it is important to note that e.g. the Webi Processing Server always uses 64bit drivers – as far as I can see, also for MS Access. But these 64bit drivers don’t seem to be installed by default; at least they weren’t on my cloudshare.com environments. In addition, strangely enough, the BOE setup creates both 32bit and 64bit ODBC connections for eFashion and Club. The screenshot below shows the 64bit ODBC Administrator (trust me 🙂):

But be careful: whereas the 32bit ODBC connections work fine, I got the following errors when I wanted to modify e.g. the efashion connection:

If you want to create a new ODBC connection, you will notice that there are no 64bit drivers installed for MS Access:

My suggestion to solve this is to go here and download the Microsoft Access Database Engine 2010 Redistributable – because it comes with a 64bit setup / drivers:

http://www.microsoft.com/en-us/download/details.aspx?id=13255

Download the 64bit setup… and run it:

Finally your 64bit ODBC Admin “Add connection” dialog should look like this:

Now you can create the efashion, efashion-webi etc. data sources. Make sure you spell them exactly as they are written in the 32bit ODBC connections!
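If you prefer scripting over clicking through the ODBC Administrator, the same DSN can be created from an elevated command prompt – a sketch under the assumption that the database file lives at C:\Data\efashion.mdb; note that the odbcconf.exe in System32 creates 64bit DSNs, the one in SysWOW64 the 32bit ones:

```shell
rem Create the 64bit system DSN for eFashion (file path is an example).
C:\Windows\System32\odbcconf.exe /a {CONFIGSYSDSN "Microsoft Access Driver (*.mdb, *.accdb)" "DSN=efashion|DBQ=C:\Data\efashion.mdb"}
```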

So far everything works fine for both BO 4.0 SP02 and FP3. As usual there is a big BUT: you will still get the same errors shown at the beginning of this post. Remember, you just installed the Access 2010 Redistributable. This means you have to change your universe connection to use the appropriate driver (to do so, log in to the Universe Design Tool and choose Tools – Connections). And here is where, at least for SP02, I had to conclude there is no (obvious) way to solve it:

Sorry guys, no Access 2010 support in BO 4.0 SP02. But at least FP3 provides something for us:

And finally it should work. To sum up:

  1. On a BO 4.0 FP3 server, install the MS Access 2010 Redistributable 64bit.
  2. Create the necessary 64bit ODBC connections.
  3. Modify your universe connections to point to the Access 2010 driver.
  4. Have fun with efashion 😉

PS: I don’t have any issues with our BO 4.0 SP02 environment which has SP02 installed only as a patch. We installed this environment during the ramp-up for SP02 (back then Webi was still labeled Interactive Analysis, which is why I noticed the difference…) and only later applied SP02. I didn’t investigate further, but it seems the Webi Processing Server uses 32bit drivers here… (no 64bit drivers for Access are installed on this system…)

PPS: Don’t have FP3 available but you’d like to test it yourself? I can easily get you access to a copy on cloudshare.com – see the corresponding blog post.

Do you have similar experiences? Any other hint I missed? Please post your comment.

How to improve the process of hiring an (SAP BO) consultant

There are some recent blogs about fraud in the hiring of (SAP) consultants:

http://scn.sap.com/community/career-center/blog/2012/04/09/sap-consulting-fraud–disturbing-example
http://scn.sap.com/community/career-center/blog/2012/05/01/more-on-sap-consulting-fraud-video-discussion

In this post I would like to share our “best practice” at IT-Logix, as it is part of our daily business to evaluate and in some cases hire new consultants, especially for Business Intelligence in general and for SAP BusinessObjects and Microsoft BI, our major technology stacks. As I’m personally a BusinessObjects consultant (besides other skills…) I talk primarily about this area, but of course you can adapt these findings to your own domain.

The first thing I need to mention here is the lack of quality of SAP (BO) certifications. As long as they mainly consist of multiple choice questions, it is easy to simply learn the answers by heart, especially if you use the offerings of sites like examkiller.com:

On the SAP partner portal you can earn further “qualifications” like SAP Solution Consultant. You have to do some e-learnings and finally pass web-based multiple choice tests. Nothing easier than getting an experienced consultant to sit next to you and answer the questions you have to fill in…

Therefore I agree with all the blog voices that a face-to-face interview is an important element of the recruitment process. On the other hand, it is mentioned several times that it is problematic if you as the future employer don’t have the necessary technical knowledge to really evaluate an applicant. There are two ways to address this:

  • You bring in external specialists for all sorts of tasks – why not for hiring? I’m not talking about common head hunters but about another, task-specific consulting company. Of course this is a little bit of marketing for IT-Logix and our services: depending on the project and customer situation we are often too expensive, as we mainly address the top-level expertise market for Business Intelligence. Therefore we can’t compete with any of those outsourcers who provide their people for <400$ a day etc. But at least we can assist customers during the evaluation process and bring in our technical knowledge to make sure people really understand what they claim to do. Maybe you are in a similar situation: you cannot afford the high-rate top consultant but would like to have some cheaper ones. Nevertheless you want to be sure the quality of the skills provided matches what they tell you. Then really work together with someone who can challenge the skill set adequately.
  • Let applicants do some hands-on activities. Nothing gives you more insight than seeing how a candidate behaves on a “real” system. Let me share how we do it at IT-Logix: we use cloudshare.com and, for example, our preconfigured SAP BO 4.x environment (see my previous blog), and simply share a copy with the candidate, usually right at the beginning of the interview appointment. In addition we hand out a set of activity instructions with the tasks to be solved. He or she then has some time to work on these tasks and finally presents the results. It is sometimes amazing how vast the difference can be between a candidate’s CV and their hands-on performance… I have to add that it is of course fairly easy to set up a cloud environment for SAP BO. It might take more effort for other SAP applications like SAP NW, but maybe there are other cloud solutions out there, or you can somehow use your internal demo environment.

What is your experience with recruiting consultants? How do you verify their skill set? Do you think it is overkill to do hands-on assessments? I’m happy to read your comments!

Testing BO BI 4.x using the cloud

Update, end of December 2012: The ITX Migration and Demo Environment on Cloudshare is currently no longer available to the public. IT-Logix customers can of course still apply for a shared copy of it. I have to end the public offering partly because of my increased workload, and partly because we need the current environment for our customer projects. Unfortunately Cloudshare did not respond to my request to provide us a free environment solely for the purpose of sharing our migration environment.

(Update October 15th 2012 –> current machine list)

(Update October 28th 2012 –> Patch 4.5 installed)

(Update October 31st 2012 –> graphomate including demo dashboard installed –> see Cloudsrv012 in folder graphomate – here you’ll find the user manual too; or open Dashboards on Cloudclnt01 and drag ’n’ drop the graphomate component onto a dashboard to test it yourself!)

While I discussed general migration challenges in my previous blog, this post addresses the fact that every new release of SAP BusinessObjects (even just a service / support package) needs intensive testing. (By the way, I’m not talking about versions in Ramp-Up, but about the regularly available versions like the current BO 4.0 SP2.) SAP seems to work based on the banana principle:

The product ripens with the customer.

I could now elaborate on how bad this is and how much better other vendors do (do they really?). But I won’t. Instead, I would like to share an opportunity to better cope with the fact that you have to test, test and test again whatever you do with SAP BusinessObjects before you “go live”.

When SAP provided its HANA developer environment to partners and customers, I first came to know Cloudshare. In the meantime I have become quite enthusiastic about it! It has never been easier (or cheaper) for me to create development and test environments, choosing from a multitude of machine templates and getting full admin rights on all machines afterwards. But the best thing about Cloudshare is that you can easily share a virtual server environment with others for free (at least for an initial period of two weeks).

This inspired me to create what was finally named the “ITX BO 4.x Migration Assessment and Demo Environment”. This is a virtual server environment in the cloud. It allows quick and easy «hands-on» tests of current and upcoming releases of SAP BusinessObjects BI products. You can import parts (or all) of your existing BO content from your XI 3.1 system into the XI 3.1 system in the cloud (using BIAR files). Afterwards you can test a migration to BO BI 4.0 SP4 (or simply use BO 4.0 SP4 in its own right). You can get your own copy of the environment for free for two weeks; after that you need a cloudshare.com subscription to keep using it.

The environment also includes an installation of the products 360View+ and 360Eyes from GB & Smith (www.gbandsmith.com). I highly recommend these two products to streamline your migration. There will be another blog post where I go into detail on this.

The 4 Available Machines

The Migration Assessment & Demo Environment consists of four machines:

  • BO XI 3.1 SP3 (Server + Client Tools + 360View + 360Eyes)
  • BO BI 4.0 SP4 Patch 5 (Server + Client Tools + Visual Intelligence + 360View + 360Eyes)
  • BO BI 4.0 SP4 Patch 5 (Client Tools, Crystal Reports 2011, Crystal Reports for Enterprise, Dashboards, etc.)
  • BO DataServices 4.1 + Information Steward

Request your Free Copy

Please contact me and I will share a copy of the current migration environment with you. You’ll find my contact information in the PDF here, or use Twitter (@rbranger).
Please give me a few keywords on why you’d like to use the environment, and allow up to two working days for me to grant you access to a copy of the system.

You’ll receive an invitation email directly from cloudshare.com including a link.

Register on Cloudshare.com

Afterwards you need to open a free account on cloudshare.com:

After your successful registration please log in to Cloudshare ProPlus. Your environment is already starting up… Click on «View environment» to see more details…

Wait until all machines are up and running. In the meantime, read the description and get familiar with the machine names etc.

Now let’s «own» the environment: click on the corresponding button! On the right side you now have many more options available. The cloudshare.com license is now valid for longer than the original two days.

Testing BO BI 4.0 SP4

Let’s start with the client tools and the BI Launchpad of BI 4.0 SP4. Select «Fullscreen RDP» from the drop-down menu of «CLOUDCLNT012»:

The password of the BOE Administrator is always IT-Logix32.
The SP4 CMS runs on cloudsrv012 on the default port 6400.
Here are some helpful links:
Open the BI Launchpad at http://cloudsrv012:8080/BOE/BI

Find shortcuts to the available client tools on the desktop or in the start menu.
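If you prefer to check the connection details from a script rather than clicking around, here is a minimal Python sketch. It only uses the host names and ports given above (cloudsrv012, CMS port 6400, Tomcat port 8080); the helper names are my own and the reachability check is a plain TCP probe, not a BO API call.

```python
import socket

# Connection details as described in this post.
CMS_HOST = "cloudsrv012"
CMS_PORT = 6400          # default BO CMS port
LAUNCHPAD_PORT = 8080    # Tomcat port serving the BI Launchpad

def launchpad_url(host: str, port: int = LAUNCHPAD_PORT) -> str:
    """Build the BI Launchpad URL for a given server."""
    return f"http://{host}:{port}/BOE/BI"

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(launchpad_url(CMS_HOST))  # http://cloudsrv012:8080/BOE/BI
```

Running `port_open(CMS_HOST, CMS_PORT)` from inside the RDP session is a quick way to confirm the CMS is actually listening before you troubleshoot the client tools.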

Cloud Folders

If you need to upload files (e.g. a BIAR file with your own BO content), use «Cloud Folders» to transfer them via FTP:

On the virtual machine you’ll find a shortcut on the Desktop to access your cloud folders:
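The same upload can be scripted. Below is a minimal Python sketch using the standard ftplib module; the host name and credentials are placeholders (take the real ones from your Cloudshare account page), and `upload_biar` is my own helper name, not part of any Cloudshare API.

```python
import os
from ftplib import FTP

def remote_name(local_path: str) -> str:
    """Name the uploaded file after its local basename."""
    return os.path.basename(local_path)

def upload_biar(host: str, user: str, password: str, local_path: str) -> None:
    """Upload a BIAR file to the Cloud Folders FTP share in binary mode."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_path, "rb") as fh:
            ftp.storbinary(f"STOR {remote_name(local_path)}", fh)

# Example call (placeholder credentials):
# upload_biar("ftp.cloudshare.example", "myuser", "mypassword", "content.biar")
```

Binary mode (`storbinary`) matters here: a BIAR file is a ZIP archive, and an ASCII-mode transfer would corrupt it.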

Have Fun and Happy Migration!

That’s it. I hope you find this new opportunity useful. You can use the environment for free for at least 14 days; afterwards you need to purchase a subscription at cloudshare.com. By the way, it is not expensive, and I wouldn’t give mine back… Regarding BO and 360 licenses, the environment only contains temporary keys. I recommend that you use your own keys. In case you have no keys but would like to test-drive BO or the 360 products, please contact me for an extended trial period.

My own cloudshare.com environment, which is the base for the Migration Assessment Environment, is sponsored by my employer IT-Logix. Please consider IT-Logix if you need dedicated expertise for your next BO migration project.