Saturday, January 3, 2009

Displaying Your Own Queries in IT Performance Reporting

You can use IT Performance Reporting (part of the System Monitoring Work Center in SAP Solution Manager, available as of SAP Solution Manager 7.0 SP15) to centrally display the development over time of the most important monitoring data for your monitored systems. This means that you can identify potential problems early, and obtain an overview of the utilization and performance of your systems. We provide a set of predefined reports for this purpose (more information: http://help.sap.com/saphelp_sm40/helpdata/en/47/6595864609500de10000000a421937/frameset.htm). Of course, you already know all of this. What could be new to you, however, is that you can also integrate your own reports into IT Performance Reporting to determine yourself which data is to be displayed, and how it is to be displayed. This is the topic of this blog post.

To do this, you only need to be a little familiar with the operation of BEx Query Designer and BEx Web Application Designer to be able to perform all of the steps quickly and easily. You integrate your own queries in three steps:

1. Create the query
2. Wrap the query in a Web Template
3. Include the new Web Template in IT Performance Reporting

Step 1: Creating the Query

1. Start BEx Query Designer (Release 3.5), and log on to your SAP Solution Manager system; documentation for operating the Query Designer is available at: http://help.sap.com/saphelp_nw04/helpdata/en/9d/76563cc368b60fe10000000a114084/frameset.htm.

2. Create a new query. Choose the appropriate MultiCube from the following, depending on your desired data source:

   * If you want to create a query for CCMS monitoring data, choose Computing Center Management System --> Monitoring Performance Data --> CPH Performance Data (All Resolutions without Online Data) (technical name 0CPH_NALL).

   * If you want to create a query for ABAP statistics data, choose Computing Center Management System --> Statistics Data --> WebAS Kernel --> Aggregates --> Time Profiles (technical name 0CCMSMTPH); this MultiCube is used in the example screenshots below.

image
3. Irrespective of the details of your query, use the following dimensions in the query and drag them into the specified area:

   * For queries for CCMS monitoring data:

     Dimension | Technical Name | Area
     Time --> Calendar Year/Month | 0CALMONTH | Rows
     Time --> Calendar Year/Week | 0CALWEEK | Rows
     Time --> Calendar Day | 0CALDAY | Rows
     Time, Time Slots, and Time Zone --> Time | 0TIME | Rows
     Aggregation Type and Resolution --> Aggre. Type | 0CCM_ATYP | Free Characteristics
     Aggregation Type and Resolution --> Time Resolution | 0CCM_RES | Free Characteristics
     MTE --> System Name | 0CCM_SYS | Rows, Columns, or Free Characteristics

   * For queries for ABAP statistics data (refer to screenshot):

image

Dimension | Technical Name | Area
Time --> Calendar Year/Month | 0CALMONTH | Rows
Time --> Calendar Year/Week | 0CALWEEK | Rows
Time --> Calendar Day | 0CALDAY | Rows
Time --> Time Range | 0CCMMATIRA | Rows
Instance --> Server | 0CCMAPSRV | Free Characteristics
Period Type --> Period Type | 0CCMPTYP | Free Characteristics
System and Landscape --> System ID | 0SMD_LSID | Rows, Columns, or Free Characteristics
4. Restrict the dimension System ID to the systems that you want to have displayed in SAP Solution Manager. To do this, choose Restrict from the context menu, and drag the entry Systems with Solution Manager Authorizations from the Variables tab page into the selection on the right.

image
5. Drag the key figures that you want to display in the query into the Columns area.

image

For queries that are to display CCMS monitoring data, you need to think outside the box a little, because the CCMS uses a data model without fixed semantics; the meaning of the key figures is defined by the attributes of the characteristics:

1. From the key figures, select the type of values that you want to display, such as the average value from the Calculated Key Figures. By doing so, however, you have not yet decided which MTE classes you want to display.

2. To define the MTE class, select the entry MTE --> Attribute MTE Class --> Values from the dimensions, and drag the desired MTE class into the Filter area, or (for example, if you want to display multiple MTE classes in a query) into the Columns area (for an example, refer to the screenshot).

image
6. Save your query, entering a description and the technical name. The technical name should start with Z, so that the query is in the customer namespace.

Step 2: Wrapping the Query in a Web Template

1. Start BEx Web Application Designer (Release 3.5), and log on to your SAP Solution Manager system; documentation for operating the Web Application Designer is available at: http://help.sap.com/saphelp_nw04/helpdata/en/a9/71563c3f65b318e10000000a114084/frameset.htm.

2. When creating your own Web template, you are largely unrestricted. In this case, as an example, we are creating only a simple wrapper for the query created above. From the Web Items, drag the following Standard Items into the template:

   * Chart
   * Generic Navigation Block

image
3. To create the connection between the query and the Web Template, proceed as follows (refer to the screenshots):

image
   * In the properties (General) of your Chart, specify the query you created above. You can leave the name of the DataProvider unchanged (DATAPROVIDER_1).

   * In the properties (Web Item) of your Generic Navigation Block, under Affected Data Provider, select the data provider that you specified in the previous step.
4. Save your Web template; the technical name should start with Z, so that the template is in the customer namespace.

Step 3: Including the New Web Template in IT Performance Reporting

1. You now need to include the new Web Template in the list of IT Performance Reports to be displayed. To do this, start transaction RSA1 in the BI system that is associated with the SAP Solution Manager (usually the Solution Manager system itself).

2. In the navigation bar, choose the InfoProvider entry from the Modeling area.

3. In the displayed InfoProviders, expand the path Computing Center Management System --> Report, and right-click the Report entry to display the context menu.

image
4. Choose Maintain Master Data, and execute the selection on the selection screen without entering any additional data.

5. A list of the elements of a dropdown list box appears. This dropdown list box is displayed when you call the IT Performance Reports.

image
6. Generate two new entries in this list, in the following order:

   Report | Long Description
   [Letter]_00_TITLE | [desired title in dropdown list box]
   [Letter]_01_[Web Template] | [desired title in dropdown list box]

   Explanation:

   * [Letter] is any letter; the entries created by SAP in this list have a number as a prefix. For differentiation, your entries must start with a letter.

   * [Web Template] is the technical name of the Web template that you created in the second step. This Web template is called if the user chooses the corresponding entry in the dropdown list box.

   * The texts in the Long Description column determine the corresponding entries in the dropdown list box.

   For example, for a Web template ZMYTEMPLATE, you might create the entries Z_00_TITLE ("My Reports") and Z_01_ZMYTEMPLATE ("CPU Load History"); these names are purely illustrative.

7. Save your changes and return to the initial screen of transaction RSA1. Now choose Activate Master Data, and ignore the message that the master data is already active.

You have now performed all steps; the next time that you call the IT Performance Reports, the Web template that you created yourself will appear in the list of available reports:

image

Café Innovation – Leverage your platform’s capabilities


EcoHub was the big story in the SAP world when I was at Tech Ed '08 in Berlin. Coming on the heels of the launch of SAP's Innocentive program, SAP made it clear that innovation was high on the list of things needed for future success. I will return to EcoHub in a subsequent post. Let us explore what leveraging the SAP platform for innovation could mean.

Are you struggling with the ability to connect a mobile work force with a monolithic application base? Are you looking to make your workforce more agile? Are you seeking to plug a gap in your process because out-of-the-box SAP does not fulfill all your requirements? These are not unusual questions nor are they novel. What is interesting is that now we can look beyond the limitations of the delivered business application and the known abilities of your programmers to find ways within the platform to address the apparent functionality gap. For example, one might have a situation where the mobile work force needs to synchronize with, and leverage, backend capabilities. If this is not something that can be easily accomplished by out-of-the-box functionality, perhaps the use of Adobe Interactive Forms and NetWeaver Mobile can solve the dilemma without the need for seeking additional external solutions or writing copious quantities of code! If this were done, it could well illustrate business process innovation enabled by the SAP platform!

When you do something like this, you have shown additional return / benefit from the platform - perhaps something you had not accounted for earlier. Proper exploitation of the SAP platform going into the future is all about addressing business pain points with novel use of the platform. As many SAP customers engage in the upgrade process, they need to consider a comprehensive return on their upgrade investment - one that includes benefits from process innovation leveraging the power of the platform - and not just a mechanical ROI that is limited to a technical upgrade. I will concede that this comprehensive ROI will be difficult to estimate, but that cannot be a valid reason for not doing the right thing. When my colleagues and I evaluate upgrades for our clients, we challenge them and our own folks to identify and deliver value beyond what is possible with a mere technical upgrade. This helps these SAP customers derive better value from the platform, and sets them on the path of innovation with it.

According to Usman Sheikh, Vice President, Global Ecosystem & Partner Group (and I paraphrase), using the platform to create that differentiated value is what leveraging the platform is all about. It is my personal opinion that SAP customers who are ready to take a close look at the platform to leverage it more effectively will find a lot of support from SAP and from systems integrators. So they should not in any way hesitate to seek assistance in this matter.

New SAP Press Book – Content Integration with NetWeaver Portal – by Marty McCormick and Matt Stratford

Overview

This is the first half of a two-part blog series that introduces a new SAP Press book titled "Content Integration with SAP NetWeaver Portal". In this book, we provide detailed insight into the various impacts and considerations relevant for several popular portal content integration scenarios, such as Business Intelligence (BI), SAP Manager and Employee Self Service (XSS), Supplier Relationship Management (SRM), and SAP Composite Application Resource and Portfolio Management (RPM). For each portal content area, we discuss architecture and development impacts, a configuration overview, and project management considerations. In addition, we discuss the wide-ranging impacts of Federated Portal Networks (FPNs) as well as several overlapping areas relevant in portal implementations today, such as Adobe Document Services, System Landscape Directory, and NetWeaver Development Infrastructure. This book is unique because it covers several skill sets, such as systems and development, and as such allows SAP professionals to greatly broaden their knowledge of the NetWeaver Portal.

The book is divided into three sections. In the first section, we introduce critical concepts that provide a solid foundation for SAP professionals, such as FPN technology. In addition, we provide an overview of the fundamentals of portal content. In the second section, each chapter is dedicated to a specific portal content area, such as NetWeaver BI, XSS, CRM, SRM, and RPM. Within each chapter in Section II, we provide detailed analysis of architecture and development impacts, such as FPN implications and where to place various software components in your SAP landscape. In the third section, we address the overlapping areas that must also be accounted for in NetWeaver architectures, such as ADS, SLD, and NWDI.
Section I - Introduction
Chapter 1 - Introduction

After a quick "refresher" of SAP NetWeaver and the SAP NetWeaver Portal, Chapter 1 will provide readers with an overview of business content integration with the SAP NetWeaver Portal. Readers will learn how portal content integration options have evolved from early Portal releases. In addition, they will gain an understanding of the benefits of the latest Portal technology and be introduced to the challenges project teams face when integrating portal content.
Chapter 2 - Federated Portal Networks

In Chapter 2, readers will learn the importance of Federated Portals and their impact on Portal content integration. After this chapter, readers will possess an understanding of the intricacies of Federated Portals and be able to discuss the impact of FPN technology on various Portal usage scenarios. Readers will apply their knowledge of FPN concepts to specific content integration topics in subsequent chapters.

Challenges in BI upgrade

Scope & objectives

To minimize the risks involved in the upgrade, improve the performance of the upgrade, and reduce downtime.

Business Benefits:

Standardization of existing processes using new functionalities, thereby reducing custom developments.

Increased Value Added Activities – Updated Process, End User Training and Validation Testing Documentation

Salient Features:

Steps involved in a successful BI 7 upgrade and their costs to the business; technical dependencies that need to be considered in an upgrade; and dependencies between upgrading to BI 7 and integrating SAP's new acquisition.

By following the approach below, the risks can be mitigated for a successful BI upgrade.

The challenges facing BI/BW projects can be technical or fiscal in nature. Discover how to eliminate problems before they occur by preparing for them from the outset. Learn about common BW mistakes, find out how to avoid them, and understand how successful BW projects can be executed.

Thoroughly study the customer's requirements by analyzing their landscape, go through the specific Master Guide for the upgrade, and decide on the approach to be followed for the functionalities available in their current landscape.

Go through the component upgrade guides for the chosen approach and document all the relevant SAP Notes that have been referenced for similar upgrades.

Analyze whether there are any developments in the source BW release and decide on the strategy to follow for the subsequent SPDD and SPAU modifications. You may need an SSCR key to perform the modification adjustment; get the key before you upgrade the system. The adjusted objects are collected in a repair that is released to a transport request. You cannot release this transport request; instead, you must flag it for transport in transaction SPDD. Towards the end of the upgrade, SAPup exports the request to the transport directory /usr/sap/trans and registers it for transport in the umodauto.lst file.

Analyze whether any large tables require ICNV conversion, and decide on the approach for ICNV conversions depending on the upgrade strategy. For up-to-date information, see SAP Note 490788 - ICNV in upgrade to Web AS 6.20.

If the customer wants to implement security after the upgrade, first check whether the 3.x reports are used as-is or whether they are to be converted to 7.0. The authorization concept in BI 7.0 has changed dramatically; for an analysis of the new authorization concept, refer to SAP Note 955990 - BI in SAP NetWeaver 7.0: Incompatibilities with SAP BW 3.x.

Team structure for a BI Team

Changing skills requirements to upgrade BI; managing the relationship with external consultants; empowering the masses without killing the system; understanding the benefits a BI Competency Centre can deliver; understanding the critical skills of an effective BI team.

Demo on Data Extraction from SAP to Non-SAP systems such as Database tables and Flat Files using Open Hub Destination

Aim: To explain Open Hub Destination and to demonstrate the extraction of data from an InfoCube to database tables and flat files.

What is Open Hub Destination?

Open Hub is an SAP BI service that provides the framework for the scheduled and monitored extraction of consolidated and integrated data from SAP BI, and for the controlled distribution of that data to external systems or applications.
When do we use Open Hub Destination?

For example, if our company has an external data warehouse system and we want to make data available to that system in the form of tables or files, the Open Hub service is the right choice.
Where should we not use the Open Hub Service?

We should not use the Open Hub Service if we want to transfer data into other SAP systems or applications.
What are the important functions of Open Hub Service?

a) DB tables and flat files in CSV format: third parties can obtain the data from the database tables using APIs.

b) Full and delta extraction modes are possible.

c) OHD is part of the data flow: it uses data flow objects such as transformations and a target object for the data transfer process.

d) It supports all BI data targets, such as InfoCubes, MultiProviders, DSOs, and characteristic InfoObjects.

e) Monitoring: integrated monitor and application log.


The transaction code for creating an Open Hub Destination is RSBOH1.

image



The other way to reach the OHD creation screen is by selecting Open Hub Destination in the Modeling tab, as shown below…

image

Let us consider the InfoCube ZIC_ISET with the following data…

image
1. Extracting data into a database table…

Right-click the InfoArea to get the option to create an OHD…



image

Now, in the OHD screen, we have to enter the technical name and description of the OHD we are creating, and specify the InfoCube from which we want to extract the data…

image
In the Destination tab, enter the destination type Database Table; a database table will then be created automatically, as below…

image



In the Field Definition tab, we have to make sure that all the fields are copied successfully from the selected InfoCube…

image


When we try to activate the OHD, we get a pop-up with the following message; upon choosing OK, the OHD is activated.

image



image
After successfully activating the OHD, we have to create a transformation to map between the cube and the OHD.

image



Now we have to enter the name of the cube as the source of the transformation, as shown in the figure…

image

The following screen shows the mapping of objects…

image

After successfully creating the Transformation, we have to create the DTP for transferring the data.

image


Now activate and execute the DTP so that the data is transferred….

image



In the monitor we can check that the data is transferred successfully into the database table….

image

Go to SE16 and check the data in the database table…

image



The following data, which is in the InfoCube, is now loaded in the database table…

image


2. Extracting data into a flat file

For loading the data into a flat file, the procedure is the same as above, but while creating the OHD we need to select the destination type File; a CSV file is then created at the given path.



image



The following screen indicates the CSV file created and the directory taken.

image

In the monitor we check that the data is transferred successfully into the flat file….

image





The following shows the data in the CSV file that was extracted from the InfoCube.

image
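As a closing illustration, here is a minimal sketch of how an external, non-SAP consumer might pick up the CSV file written by the OHD. The file path and delimiter below are assumptions for the example; in practice they come from the OHD's Destination tab.

    # Sketch of an external (non-SAP) consumer reading the CSV file produced
    # by the Open Hub Destination. Path and delimiter are illustrative.
    import csv

    with open("/usr/sap/trans/ZIC_ISET.csv", newline="") as f:
        for row in csv.reader(f, delimiter=","):
            print(row)  # hand each record to the external warehouse's loader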

How to Apply Support Packages while Reducing Downtime

Customers have often commented about how long it takes to perform maintenance on SAP systems. One of the key concerns is the impact that the required downtime will have on the operations of the business. The increasingly global nature of business magnifies the impact that maintenance downtime periods have on an organization, and the main complications arise on dual-stack systems. This blog discusses ways to minimize the downtime required to apply Support Package Stacks (SPS) to SAP systems.

Overview

When applying support package stacks, it is important to understand the types of support packages and the procedures for applying the stacks. My blog on Support Packages and Corrections is a good start for this discussion. Downloading support package stacks will include the necessary software components required to update a system to the targeted level.

Example: Update an SAP NetWeaver BI system from Service Release 3 (SPS 14) to Support Package Stack 17.

To update the BI System, the following updates will be downloaded:

· SAP Kernel Updates

· ABAP Support Packages

· Java Support Packages

· Front End Patches

SAP provides an implementation guide for each stack to assist administrators with applying the updates. The guide for SPS17 provides a recommendation for a procedure for applying ABAP and Java support packages concurrently.

Reducing Downtime and Minimizing Manual Effort

When performing system updates, downtime consists of actual downtime and the period during which the system updates are deployed to the system. In response to concerns from customers, SAP has provided a process whereby the Java and ABAP support packages can be deployed in parallel. In addition, tools are available to minimize manual efforts expended by system administrators. Administrators can utilize the Java Support Package Manager (JSPM) to automate maintenance tasks.
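To see why the parallel approach matters, here is a toy back-of-the-envelope calculation; the durations are invented numbers for illustration, not SAP guidance. Run one after the other, the two deployments add up; run in parallel, the outage is only the longer of the two.

    # Toy arithmetic: why parallel deployment shortens the outage window.
    # The durations below are invented for illustration only.
    abap_hours, java_hours = 3.0, 2.0
    print("sequential downtime:", abap_hours + java_hours)     # 5.0 hours
    print("parallel downtime:  ", max(abap_hours, java_hours))  # 3.0 hours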

Update JSPM

The Java Support Package Manager has a number of uses. In this blog, JSPM is used to apply individual Java support packages and stacks. JSPM can also be used to update the SAP kernel on a dual stack system. Before initiating these tasks, the JSPM application itself needs to be updated. Please see my video blog that shows how to Patch the JSPM Application to the latest version. This process is non-disruptive to the system and can be applied during normal uptime.

Parallel Patching Procedure

The Support Package Stack guide for NetWeaver 7.0 (SPS17) lists the procedure for parallel patching as follows:

1. Prior to applying ABAP and Java Support Packages, use the Single Support Packages option in JSPM and select for update only the components SAP KERNEL and BC-FES-IGS.

Using JSPM automates the process of applying the kernel update. All SAP instances and services on the local system are stopped, patched, and restarted. My video blog on Applying SAP Kernel Updates with JSPM shows how to accomplish this task.
2. Import the Support Packages for the component SAP_BASIS into the ABAP stack using the ABAP Support Package Manager (SPAM).

The SPAM tool itself provides additional functionality to reduce downtime when applying ABAP support packages. SPAM has a reduced-downtime mode that gives system administrators more control over the import process. System administrators can execute much of the update effort during normal productive uptime and consolidate all downtime activities into a single phase. My video blog on Reducing downtime when applying ABAP Support Packages shows this process.
3. Import the remaining ABAP Support Packages with SPAM and, in parallel, apply the Java Support Package Stack with JSPM.

While the downtime activities for the ABAP packages are running, system administrators can apply the Java support package stack at the same time. My video blog on Applying Support Packages Stacks with JSPM provides an example of deploying Java support packages.

Conclusion

These procedures provide administrators with tools and processes that can reduce the amount of time it takes to apply support packages to a system. It should be noted that point-in-time recovery safeguards should be in place when performing any system maintenance. This blog provides a procedure that allows the application of both Java and ABAP support packages concurrently. In addition, the downtime-minimized option for SPAM reduces the amount of downtime required to apply ABAP-based support packages. Utilizing the tools and procedures in this blog reduces manual effort and addresses concerns raised by customers with regard to minimizing maintenance downtime.

The Road from and around Berlin (Part 2)

Ramble tamble

On the plane from Philly to Frankfurt (my first long-distance Airbus trip), there's an entertainment console stuck on the back of the seat in front of me, on the side of my chair, and worst of all, in two-thirds of the place where my feet and day pack would go. Not to mention that the system froze before very long, requiring an announcement that the entire plane (!) would be rebooted, and everyone's console would be down for 15 minutes. Within the hour, this process was repeated, but the downtime was another 30 minutes. Again, when the TV, movies and whatever else came back up for a third time, it needed a reboot. The splash panel said Rockwell Collins (the folks that got the U.S. to the moon and back 6 times, or so).

6:00 PM: Screen says "Announcement: Inflight entertainment will end within 5 minutes." 30 more minutes of down time.

According to my Windows Explorer, the moon shot displayed below was:

Created: Friday, October 10, 2008, 9:41:48 AM
Modified: Thursday, October 09, 2008, 3:38:42 PM

Whoa, created after being modified. That's strange.

I got my first inflight German lesson, being told "Guten Appetit" ("enjoy your meal") over my airplane food by my neighbor (who probably hasn't read Accidental Tourist). "Back at ya."

There's no record of my visit to a back alley in Bühl where I listened to jazz (without paying) standing around with the smokers. Late night, a little cool. Some things don't need photos.

Fast forward to the trains in Berlin.

Berlin's erfolgreichste Show ("Berlin's most successful show") - "show suitable for foreign tourists" - and the Bluemax Theater on Potsdamer Platz. Advertisements that caught my eye, not because I wanted to see these shows, but because I didn't understand what was up.

Hervé Couturier and Sal Visca meeting. The most memorable quote that I wrote down was "need massively parallel ABAP". See Dennis's blog for much more insight. With the increase in threaded CPU architectures over clock speeds, any code designs that scale up for parallel processing are total wins.

Code Jam / Steve Winwood jams

  • Firebug
  • Webkit

Louenas Hamdi, co-winner of the Berlin TechEd Demo Jam, gave me two key phrases as we hung out backstage before the Steve Winwood concert. With my omnipresent notepad, I jotted these down for later. Louenas said these are "revolutionary web" trends: Firebug "debugs the web," where previously one might just say "the web is broken" (and it's better than Adobe); Webkit is the "browser engine that powers Safari, Chrome," and more.

Those are a few links I found in a quick internet search for Firebug. My advice to you internet coders out there? "Get Firebug." And tell them Louenas sent you.

"Firebug is free and open source (BSD)" - sweet.

As for webkit, it's here: http://webkit.org/

People who write documentation, write code, or even "just" write email should look at this page for an example of clear writing:

If I Were King of the Forest

In a side conversation in the Community Clubhouse in Berlin, Michael Schwandt posed a question to me, "What would you change if you were in charge of SCN?" Similarly, Brian Bernard asked what I would do to clean up static areas of SCN. In the meantime, SDN has gotten a facelift.

Michael: Here are 3 things I'd do to improve user experience of SCN:

  1. Blog comment editor and blog editor improvements. I'm sure these are in the works for a future "support pack," but they're more important to me than what colors are in the toolbar. I have seen folks embed links into blog comments, and a year later finally was told "oh, just write raw HTML". Which sort of works, and often doesn't, like fails. I'm not experienced with many blog editors, but pop-up windows that lose my work, non-existent spell checkers, and an arcane process to just drop a darn image in the text are all no-brainers.
  2. Point system charity leverage. I think SCN has done tremendous good by connecting technical and other contributions to worthy causes. I'd say this can be taken a couple of steps farther, using models like PayPal and Facebook to allow an easier way to designate recipients, track where the donations go, and even see success stories. My suggestion would be: (1) food, (2) books, and (3) laptops as specific material projects.
  3. Modified ratings where we track not just individual contributions but a type of social capital. Oliver Kohl shared a lot of great ideas around this on the Clue Train ride, so you have heard it, and Oliver can explain it better than I can, but the theory would be to show not just who has a lot of friends, but who has influence due to the number of people listening to them. I've done a similar study of whose blogs are read, but that's just one element. The SCN business card is probably the locus for this. And the SAP investment in LinkedIn might be a clue for the future.

Brian: This one is even harder to answer than the question Michael posed. But after a few weeks, here's what I have:

  1. Improve contributor detail so that at least the deadwood can be visible. If I log on and see that my last contribution was a week ago, I know I'm sharing. It's a little harder to see how others are sharing. The "widget" that showed ratios of answered to unanswered questions was a step in the right direction.
  2. Run contests with specific prizes to fill out the sparse areas and remove stale content. I'd concentrate on the wiki areas first, since I already know that I've built pages that need tuning.
  3. Set up archiving with retention periods, compression and delete-by-dates. It's not going to be easy, but hopefully newer frontends like docupedia and collaboration workspace can be engineered with pruning in mind rather than as an afterthought.

In a Phone Booth in Baltimore

I watched Community Day Bangalore from afar, with images via twitpic, video and audio via ustream, and a series of tweets both inhouse and remote. I commented on the quality of the content, as well as the quality of the transmissions. See below for example photos/screen shots (I'm supposed to link the twitpic capture back to the original source, per the terms of use).

The xchrono screen shot is from my home workstation. I hacked some source code to get a clock view so I'd know what time it is in India. It turns out my Windows Mobile smartphone has a "world clock" that is limited to whole-hour settings, and India is 5.5 hours offset from Greenwich Mean Time. So I can't see the proper time on the latest OS? I'll post my diffs to the code snippet repository on SDN later, but the key was to use a time zone (TZ variable) of "Asia/Calcutta". The rest is just code.
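For the curious, the same TZ trick can be reproduced in a few lines of Python; a sketch under the assumption of a Unix-like system (time.tzset() is not available on Windows):

    # Render an India clock by pointing TZ at "Asia/Calcutta", as above.
    import os, time

    os.environ["TZ"] = "Asia/Calcutta"
    time.tzset()  # re-read the TZ variable; Unix-only
    print(time.strftime("%a %H:%M:%S %Z"))  # current time in India (UTC+5:30)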

The link above goes to lyrics and music from Primitive Radio Gods.

image
@MarilynPratt showing the Community Day schedule board, on TwitPic

image

Migration of Queries and Work Books from BW 3.x to BI 7

Introduction

This blog explains the step-by-step procedure for migrating queries and workbooks from BW 3.x to BI 7.0, and also covers the lessons learned while migrating reports and workbooks from BW 3.x to BI 7. It is useful for consultants who are going to work on report and workbook migration projects in BW.

Pre-requisites for Migration

Install the new SAP GUI release 7.10 with the latest patch level.

Make sure that everyone on the migration team has the same version of the GUI and Excel.

After the new GUI installation, make sure that both the 3.x BEx options and the 2004s BEx options are available.

Migration Process:

Migration happens when the existing 3.x objects (queries and workbooks) are opened in the new tools (using the BI 7 BEx options).

Query Migration:

Open the existing query in change mode with the new 2004s BEx Query Designer and save the query.

Once you save the query, it will be migrated automatically.

Query Migration Procedure:

Before migration, check the table RSZCOMPDIR:

Enter your query's technical name in the COMPID field of table RSZCOMPDIR and execute.

If the VERSION field in table RSZCOMPDIR has a value below 100, the query is still in the 3.x version.

If it is greater than 100, the query has already been migrated. Here it is '15', which means the query is not yet migrated.

image1

Open the existing query using the BI 7 Query Designer and save it. It will be migrated automatically.

image2

Now the RSZCOMPDIR table will have a value greater than 100 in the VERSION field. Here it is '101', which means the query is migrated.

image3
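To restate the check in executable form - a toy restatement of the rule described above, not SAP code or an SAP API:

    # RSZCOMPDIR-VERSION below 100 indicates a 3.x query; a higher value
    # (101 in this example, after migration) indicates a migrated BI 7 query.
    def is_migrated(version):
        return version >= 100

    print(is_migrated(15))   # False: still 3.x
    print(is_migrated(101))  # True: migrated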

Note: Once a query is migrated to BI 7, the system no longer allows editing the query with the 3.x Query Designer. However, it still allows execution of the query in 3.x.

Workbook Migration:

Open the existing 3.x workbook using the 2004s BEx Analyzer and save the workbook. Once you save the workbook, it will be migrated automatically.

Standard workbooks are upgraded automatically.

Workbooks with customer coding (for example, macros) need manual adaptation.

Work Book Migration Procedure:

Open the existing workbook from the BI 7 BEx Analyzer and save it… It will be migrated automatically.

image4

Note: Delete the data in the workbook before saving it.

Once a workbook is migrated, we can no longer open it using the 3.x version.

Lessons Learned from this Project:

The front end must be on the same version on all systems while migrating queries and workbooks.

Especially when working in a global delivery model, i.e., an onsite and offshore model, you need to be very careful about the GUI versions on all systems in the project.

We had a couple of issues due to mismatches in front-end versions.

The onsite team was on SAP GUI 710 with patch level 4, and the offshore team was on the same GUI version but patch level 8. Due to this variation in front-end patch levels, we encountered the following issues.

Issue: Text elements were not showing up properly in the workbooks after migration.

Solution: This is a program error in patch level 17. We rectified the problem by applying OSS Note 1143771.

It was fine up to patch level 16; there is a problem in patch level 17; and it is fine again as of patch level 18.

Issue: Text elements were showing as blank in the migrated workbooks.

Solution: Updated the GUI and Excel versions on all of the team's workstations.

Issue:

As part of the project we changed some existing 3.x workbooks by embedding an image in the workbook. We were copying the image from an existing workbook and pasting it into the target workbook, which we shouldn't do; rather, we should insert the image as an object into the target workbook. Due to this, we were getting the informative message below.

'The workbook contains links to other data sources. Do you want to update or not update?'

Solution: In Excel, choose Edit in the menu bar --> Links --> Startup Prompt --> "Don't display the alert and don't update automatic links" (second radio button); select this radio button and save the workbook.

SAP’s Consolidation Application Strategy: Clearing the Air

This is my first blog on SDN and I've decided to start off by attempting to set the record straight on the fairly hot topic of consolidation and SAP's strategy in this area following the acquisition of Business Objects, which saw us bring together three solutions in this space:

  • SAP Business Planning and Consolidation (formerly OutlookSoft)
  • BusinessObjects Financial Consolidation (formerly Cartesis)
  • SAP SEM BCS

Before I get into the detail, however, I think it's only right to set the scene a little...

On February 12th, 2008, Business Objects, an SAP company, published the new roadmap for our EPM applications, and while some argued that this took longer than they expected, it was in fact the result of very extensive research by our various product teams and incorporated the feedback of numerous customers and industry analysts including Gartner, IDC and Forrester.

Unlike some of our competitors, Business Objects decided to take some tough decisions early and as a result rationalize the EPM product set in order to allow us to drive not only integration but also innovation into our product strategy. As part of this product strategy we will optimize our applications for both SAP and non-SAP environments, drive integration between the EPM applications, our governance, risk and compliance applications (GRC) and the Business Objects BI Platform, and subsequently extend beyond finance by adding additional capabilities to measure and optimize operational performance.

This may sound like a lofty goal but this is a strategy that will be fulfilled by our roadmap over the next 3 years and is one that we are already delivering on with the release of EPM 7.0, a move which according to Gartner proves we are executing on our commitments. According to Gartner "Business Objects is building a strong vision around its combination of BI, performance management applications and GRC".

The fact is, this vision is not only unique and one that our competitors cannot even contemplate delivering (as they don't have all the component parts), but is made possible only by our decision to rationalize the EPM product set. Understandably, however, rationalization is not an easy process, and in order to achieve it we have two guiding principles: that no customer will be left behind, and that all customer investments will be protected.

Now we understand the strategy we can turn our attention back to consolidations...

The aim of the rationalization was to select a single application in each EPM domain on which we would deliver our vision, something we achieved in all cases except the consolidation market. Here, the breadth of our customer base meant that we wanted to take a multi-product strategy in order to continue to meet our existing and future customers' needs. Now, before certain parties start chastising us for this decision, I would like to point out that there is not a single vendor in the leaders' quadrant of the Gartner Magic Quadrant for Corporate Performance Management that doesn't also have a multi-product strategy in this space; SAP is at least being upfront about it.

So let me state the facts clearly:

  • To meet the demands of our customers in all sectors of the market and provide them with the optimal consolidation applications based upon their needs, Business Objects, an SAP company, will continue to support and sell both SAP Business Planning and Consolidation (formerly OutlookSoft) and BusinessObjects Financial Consolidation (formerly Cartesis)
  • Both applications are best in class and leading solutions for their respective target market segments
  • Both fully support complex calculations and both applications cover all the basic consolidation requirements
  • SAP SEM BCS is supported through 2013 with options on extended support to 2016. SEM BCS will also continue to receive enhancements required for legal/statutory changes through 2013.

I thought it would also make sense to share some of the common questions that we have received in relation to our consolidation applications strategy along with our formal responses to them.

Question: I currently use SAP SEM BCS. What should I do now?

Answer: You don't have to do anything. Your current application is still supported with legal enhancements and enhancement packs through 2013 with options on extended support to 2016.

Question: I am currently evaluating consolidation applications but how do I know which application to consider?

Answer: The decision on which application is most suited to a given customer is specific to the customer's use case and requirements. Both applications are market-leading solutions which meet all the standard consolidation requirements and are being sold worldwide. Business Objects has developed a decision-tree framework and a team of consolidation experts to help our Account Executives and Solution Engineers review a customer's consolidation requirements, roadmap and infrastructure to recommend the solution that best meets the customer's needs and delivers the biggest return on investment for that customer.

Question: When would you recommend SAP Business Planning and Consolidation?

Answer: SAP Business Planning and Consolidation is ideal for customers requiring strong financial consolidation capabilities tightly integrated with planning, budgeting and forecasting. This is a key differentiator for SAP Business Planning and Consolidation and a unique proposition in the market. SAP Business Planning and Consolidation also serves as the ideal solution for those that prefer their consolidation application to run on the SAP NetWeaver platform.

Question: When would you recommend BusinessObjects Financial Consolidation?

Answer: Our analysis has identified a number of scenarios and use cases, often process-related, where customers will benefit from BusinessObjects Financial Consolidation. For example, BusinessObjects Financial Consolidation suits large multinational customers with complex consolidation requirements and/or a highly distributed consolidation infrastructure, e.g. data entry / validation, audit / controls, and management of consolidations at local / corporate centre.

Question: What is the overall roadmap for EPM at Business Objects, an SAP company, and how do the consolidation applications feature in it?

Answer: Both SAP Business Planning and Consolidation and BusinessObjects Financial Consolidation form a strategic part of the overall EPM roadmap and feature in the latest release of EPM 7.0, and in the future 7.5 and 8.0 releases, as standalone applications. Released in August 2008, EPM 7.0 honors existing customer commitments and introduces the first phase of integration with the Business Objects BI platform and SAP NetWeaver. Release 7.5, expected in 2009, supports additional enhancements to all products, extends integration with the Business Objects BI platform and SAP NetWeaver, and introduces integration across both the EPM and GRC product sets.

Question: What integration plans exist for both consolidation applications into Business Objects BI?

Answer: Integration between BusinessObjects Financial Consolidation and BusinessObjects BI has already been delivered in the 7.0 release. This includes full integration with the BusinessObjects Enterprise platform, which allows users and connections to be managed from the central BusinessObjects Enterprise environment to simplify overall administration, and allows users to report on and analyze information from sources other than BusinessObjects Financial Consolidation. Further, it leverages the collaboration and search functionality of BusinessObjects Enterprise and provides support for all Business Objects front-end tools, including Crystal Xcelsius for dashboards, BusinessObjects Web Intelligence for ad-hoc reporting and BusinessObjects Live Office for integration with Microsoft Office. Similar integration is planned for SAP Business Planning and Consolidation in the 7.5 release.

Question: What integration plans exist for both consolidation applications into SAP NetWeaver BI?

Answer: With the 7.0 release of SAP Business Planning and Consolidation we are introducing a brand-new version of the product which is integrated with the NetWeaver BI platform. The official name for this release is SAP Business Planning and Consolidation, version for SAP NetWeaver. The release is currently going through the Ramp-Up process, which is expected to last approximately 6 months. Furthermore, the 7.5 release of BusinessObjects Financial Consolidation will allow customers to export consolidated financial information to NetWeaver BI for additional reporting and analysis.

Summary

This blog represents the ‘official SAP answers' on this topic. Anybody who contradicts a statement made above may be misinformed, so please direct them to come here and read this blog!

I fully expect there to be many more questions on the precise content of future releases and rest assured that we will be providing more and more information as and when we are able to do so.

However, I hope I was able to clear up any confusion, and this helps answer some of the big questions you might have had.

James Fisher

Senior Director Solution Marketing

Enterprise Performance Management

SAP

Survival tips for volatile retail fuel markets

With crude oil prices at 50% of the summer highs, no let-up in cost volatility, and customer demand falling by the day, fuel retailers are having to adapt their business models to survive.

Traditional practices and the use of homegrown tools cannot offer the flexibility and speed of response needed to help oil & gas and retail companies adapt to the changes. Very limited use of analytics and existing data is resulting in these companies making sub-optimal pricing decisions and operating in a highly reactive mode - usually reacting to the actions of competitors.

What's needed is an integrated, more automated business process that comprises near real-time data capture, online analytics with exception reporting, the ability to document and apply pricing strategy as rules, and tighter integration from head office (where decisions are made) to the stores (where decisions are implemented) to cut response times. With ERP as the data backbone, it is possible to replace in-house pricing systems (often Excel-based) with a more tightly integrated technology infrastructure that supports a far more dynamic business process, able to address the following (a sketch of one such rule follows the list):

  • responding in minutes, not hours, to competitor price changes, cost moves or shifts in local market demand
  • proactively changing pricing tactics according to quantified changes in customer demand patterns
  • pricing fuel in such a way as to drive main store traffic (for food and big-box retailers)
  • compliance with local, state or government regulations on price-gouging, anti-competitive practices, collusion, state-of-emergency practices, etc.
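As an illustration only - the function name, prices and margin below are assumptions, not any vendor's API - the first bullet might be encoded as a rule like this:

    # Match a competitor undercut within minutes, but never price below
    # cost plus a minimum margin. All values are illustrative.
    def reprice(our_price, competitor_price, floor_cost, min_margin=0.01):
        if competitor_price < our_price:
            return max(competitor_price, floor_cost * (1 + min_margin))
        return our_price

    print(reprice(our_price=1.299, competitor_price=1.279, floor_cost=1.20))  # 1.279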

Many retailers are hesitant to make a change because they perceive pricing as a business critical and highly emotional process, steeped in "artistry" and often the preserve of individuals who have dedicated many years to it. In practice, there are numerous opportunities to enhance the business process with technology as the enabler. Positioning the technology as such can overcome the reluctance of companies to embrace change, by offering them the chance to combine art and science in a more flexible and efficient business process.

Free our Public Data

Freeing Government data

The Show Us a Better Way competition announced its winners earlier this month. In doing so, it highlighted the imaginative ways in which public sector information can be used to meet the needs of citizens. It was set up by the UK's Power of Information Task Force and asked citizens to outline what they would like to see done with public information. The best ideas were to be funded to the tune of £20,000 in order to develop them to the next level.
UK Government authorities are making gigabytes of public information available, and the essence of the competition was to identify new applications to visualize and make use of this raw data.

The District of Columbia in the US ran a similar competition called Apps for Democracy, in which it solicited innovations using data from the Office of the Chief Technology Officer. This authority pioneered a new approach to public data with the creation of a CityWide Datawarehouse Data Catalog, established in 2006 as a means of allowing free access to various government data sets. At the time, City Administrator Robert Bobb outlined the guiding principle of the program as enabling:

"residents to better understand our government’s activities, thereby offering more opportunities to participate in improving the quality of life and promoting economic development in the District,”

The CityDW Data Catalog and Data Feeds provide access to current permit, crime, service request and geographical data in multiple formats including Google Maps, Google Earth, XML or spreadsheet format. Also, RSS feeds are available for live data feeds.

The Apps for Democracy competition awarded prizes and recognition to new applications that visualised this data in a manner useful to all. The competition resulted in 47 applications being built in 30 days, and an estimated 4,000% return on investment. Some of the innovations were enlightening in how they presented new ways to visualize the raw city data warehouse data.

A couple of applications concentrated on purchase order data, making it more easily accessible and informative. One application - Citystat.org - allows users to search 73,946 purchases by agency, supplier and category. This can provide accountability for procurement decisions to the general public and other agencies. Another entry displayed the procurement data as a Facebook application in which individual purchase orders could be clicked on and discussed. This allows for a level of debate on particular purchase decisions, and provides greater transparency about what authorities are purchasing and from whom.

Citystat.org

whereismymoneydc

Transparent Government and the Public Data Movement


Government authorities around the world are searching for new ways to improve service delivery, drive efficiencies and reduce cost. They're seeking more ways in which to engage with the public and make the inner workings of government more transparent. Government data, combined with the power of Web 2.0 tools, has the power to transform citizens from mere recipients of government services into partners in their creation and improvement. The applications and ideas created as part of the Show Us a Better Way and Apps for Democracy competitions highlight what can be achieved if public data is democratized by making it freely available and reusable in web-friendly formats.

W. David Stephenson outlines some of the benefits of freeing public data as:

-- more informed policy debate, grounded in fact rather than rhetoric
-- greater transparency and less corruption
-- optimized program efficiency and reduced costs
-- new perspectives, especially when "the wisdom of crowds" emerges

SAP and Freeing Public Data

Following the launch of the Innocentive program, SAP made it clear that innovation, and concepts such as crowdsourcing, were high on its list of priorities. Consequently, I believe SAP should empower public sector organisations to free their data by making back-office data available in accessible formats such as RSS, XML and Atom.

The SAP platform is integral to many government departments around the world, and stores many aspects of the data presented within the City Datawarehouse, e.g. purchase orders. Government organizations need to allow outside agencies and individuals to exploit and reuse their SAP data in ways not imagined internally. SAP offers many tools to extract data, e.g. SAP BI's Open Hub/APD functionality, but making live feeds such as RSS available is difficult. Exporting live and continuously updated streams of data to the internet (in RSS or XML formats) should be easier and more standard than is currently the case. These streams could then be subscribed to using RSS readers, or tools such as ESME/Twitter.
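To make the idea tangible, here is a minimal sketch of turning purchase-order records into an RSS 2.0 feed; the field names (po_number, vendor, amount) are illustrative assumptions, not an SAP structure or API:

    # Publish purchase-order records as a minimal RSS 2.0 feed.
    from xml.sax.saxutils import escape

    def render_rss(purchase_orders):
        items = "".join(
            "<item><title>PO %s - %s</title><description>Amount: %.2f</description></item>"
            % (escape(po["po_number"]), escape(po["vendor"]), po["amount"])
            for po in purchase_orders)
        return ('<?xml version="1.0"?><rss version="2.0"><channel>'
                '<title>Purchase Orders</title>%s</channel></rss>' % items)

    print(render_rss([{"po_number": "4500001", "vendor": "ACME Ltd", "amount": 1250.0}]))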

Government authorities should leverage the SAP platform as a base upon which others can create value. This concept is outlined by a group of academics at Princeton University in a paper called Government Data and the Invisible Hand:

"Today, government bodies consider their own websites to be a higher priority than technical infrastructures that open up their data for others to use. We argue that this understanding is a mistake. It would be preferable for government to understand providing reusable data, rather than providing websites, as the core of its online publishing responsibility."


An example

I'm currently working with a government authority on an SAP process improvement project. The authority wants more visibility into, and analysis of, its spend. As such, we'll be creating various SAP BI reports displaying purchase order information with category, vendor and amount details. This information will be displayed in reports within their internal portal. However, it would generate much more value if it were available in raw format on their external website. This would allow the data to be dissected and mashed up by others. While we as consultants can advise on how data should be visualized - in graphs, pie charts, etc. - we cannot think of all the possible permutations, or of how the public would like to see such data. Different public sector entities or individuals may be interested in plotting vendor addresses on Google Maps to see which area of the country receives the greatest spend. This could help direct procurement towards vendors in economically deprived regions, if that were a government strategy. Others may want to see various categories of spend benchmarked against private sector organizations with a similar number of employees. The idea is that it doesn't matter how we might present the data to the general public; what matters is that the data can be exploited by others and interpreted however they wish. Sites like http://www.fedspending.org/ could be created by harnessing "the wisdom of crowds" to find creative new approaches to interpreting data. The resulting data visualizations can highlight inefficiencies or anomalies, and can serve to gain the trust of the public.

Change

The Obama campaign, in its technology white paper, makes transparency one of its central goals: it will '..use cutting-edge technologies to create a new level of transparency, accountability and participation..'. A change in how public sector authorities treat their data, and in how SAP technology facilitates its distribution across the internet, is necessary to create a more open dialog between government and its citizens.

An irresistible term - BI 2.0

The term Web 2.0 means lots of things to lots of people. "Web 2.0, a phrase coined by O'Reilly Media in 2004[1], refers to a supposed second generation of Internet-based services, such as social networking sites, wikis, communication tools, and folksonomies, that emphasize online collaboration and sharing among users." (from the Wikipedia definition).

Interestingly, BI 2.0 is said to be based on SOA and Web 2.0. So what would BI 2.0 look like?

There is plenty of good content in the various BI 2.0 posts, white papers and blogs, along with criticism of the term as well…

Some people say BI 2.0 is people-centric business intelligence. Is it, then, all about people being able to create and share? And is it about people being able to talk to their data and have it talk back?

Despite the criticism, I believe that the goal of BI 2.0 should be to cut the time between when an event occurs and when an action is taken to improve business performance. The longer you take to respond to new data, the less value there is in your response.

BI tools today focus on the presentation of data, but BI is not just extracting data that is hours or days old and publishing it in reports. Users often report that the information arrives too late to be really useful. Simply delivering more reports faster doesn't solve the problem: customers expect instant results and don't want to wait for answers.

BI 2.0 will come about through a blending of consumer-oriented information mashup technologies with extranet-oriented traditional BI solutions.

Charles Nicholls has laid out some good ideas in a recent article.

BI 2.0 is driven by this need for intelligent processes and has the following characteristics:

  • Event driven. Automated processes are driven by events; therefore, it is implicit that in order to create smarter processes, businesses need to be able to analyze and interpret events. This means analyzing data, event by event, either in parallel with the business process or as an implicit process step.
  • Real time. This is essential in an event-driven world. Without it, it is hard to build in BI capabilities as a process step and nearly impossible to automate actions. By comparison, batch processes are informational - they report on the effectiveness of a process but cannot be part of the process itself unless time is not critical. Any application that involves trading, dynamic pricing, demand sensing, security, risk, fraud, replenishment or any form of interaction with a customer is a time-critical process and requires real-time processing.
  • Automate analysis. In order to automate day-to-day operational decision-making, organizations need to be able to do more than simply present data on a dashboard or in a report. The challenge is turning real-time data into something actionable. In short, businesses need to be able to automatically interpret data, dynamically, in real time. What this means in practice is the ability to compare each individual event with what would normally be expected based on past or predicted future performance. BI 2.0 products, therefore, must understand what normal looks like at both individual and aggregate levels and be able to compare individual events to this automatically (a minimal sketch of this idea follows this list).
  • Forward looking. Understanding the impact of any given event on an organization needs to be forward looking. For example, questions such as "Will my shipment arrive on time?" and "Is the system going to break today?" require forward-looking interpretations. This capability adds immediate value to operations teams that have a rolling, forward-looking perspective of what their performance is likely to be at the end of the day, week or month.
  • Process oriented. To be embedded within a process in order to make the process inherently smarter requires that BI 2.0 products be process-oriented. This doesn't mean that the process has been modeled with a business process management tool. Actions can be optimized based on the outcome of a particular process, but the process itself may or may not be explicitly defined.
  • Scalable. Scalability is naturally a cornerstone of BI 2.0 because it is based on event-driven architectures. This is critical because event streams can be unpredictable and occur in very high volumes. For example, a retailer may want to build a demand-sensing application to track the sales of every top-selling item for every store. The retailer may have 30,000 unique items being sold in 1,000 stores, creating 30 million store/item combinations that need tracking and may be selling 10 million items per day. Dealing with this scale is run of the mill for BI 2.0. In fact, this scalability itself enables new classes of applications that would never have been possible using traditional BI applications.
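
To make the “automate analysis” characteristic concrete, here is a minimal sketch, assuming a stream of numeric events and a simple rolling baseline; the window size, warm-up length and tolerance are illustrative assumptions, not anything a BI 2.0 product prescribes.

```python
# Minimal sketch: flag events that deviate strongly from the recent average.
# Window size, warm-up length and tolerance are illustrative assumptions.
from collections import deque

class EventChecker:
    def __init__(self, window=100, tolerance=3.0):
        self.history = deque(maxlen=window)  # rolling baseline of recent events
        self.tolerance = tolerance           # deviations (in std devs) considered "unexpected"

    def check(self, value):
        """Return True if the event deviates strongly from the recent baseline."""
        flagged = False
        if len(self.history) >= 10:  # require a short warm-up before judging
            mean = sum(self.history) / len(self.history)
            std = (sum((x - mean) ** 2 for x in self.history) / len(self.history)) ** 0.5
            flagged = std > 0 and abs(value - mean) > self.tolerance * std
        self.history.append(value)
        return flagged

checker = EventChecker()
for event in [100, 102, 98, 101, 99, 100, 103, 97, 101, 100, 250]:
    if checker.check(event):
        print("unexpected event:", event)  # here an automated action would fire
```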

Neil Raden, in his article on BI 2.0, says: “Rest assured, the current era of BI is coming to an end and will be succeeded by a BI 2.0 era that promises simplicity, universal access, real-time insight, collaboration, operational intelligence, connected services and a level of information abstraction that supports far greater agility and speed of analysis. The motivation for this ‘version upgrade’ for BI is the need to move analytical intelligence into operations and to shrink the gap between analysis and action.”

But I think nobody has yet had the last word on what exactly BI 2.0 means…

Data Quality with Business Objects Data Services and SAP NetWeaver BI

- A strong team -

Motivation

Poor Data Quality can impact an organization in many ways: it can result, for example, in misguided marketing promotions being sent to the wrong address with incorrect information. Surveys have suggested that up to 75% of wrong business decisions are made due to flawed data. Investing in Data Quality therefore not only improves the quality of decision making, but also significantly lowers Total Cost of Ownership (TCO).

Data Quality is also becoming an increasingly important topic for Enterprise Data Warehousing (EDW). Data for reporting is retrieved from all types of sources, including SAP and non-SAP sources. Some of this data, especially from non-SAP sources, needs various ETL (Extraction, Transformation and Loading) processing steps and Data Quality measures before it can be considered trusted data.

Business Objects Data Services provides a broad set of tools in the areas of ETL and Data Quality. For Data Quality in particular, Data Services goes way beyond the capabilities available in SAP NetWeaver BI. Using it in conjunction with SAP NetWeaver BI can therefore improve the quality of the data in the enterprise enormously.

Scenarios

Analyzing the potential of Business Objects Data Services features to improve Data Quality in the extraction and loading process for SAP NetWeaver BI, we identified the following scenarios as the most suitable.

Scenario 1 - Profiling and Cleansing (on Non-SAP data to be loaded into SAP NetWeaver BI)

  • DataSource - Flat File (or source application) with customer data
  • Use Data Services for profiling + cleansing
    o Domain values (occurrence of specific values)
    o Plausibility check (e.g. reasonable date range, existing region)
    o String function (Wildcard search)
    o Pattern recognition (for the structure of phone numbers, postal codes, etc.)
    o Addresses (based on country or parsing directories)
    o Matching of duplicate records (like Smith, John and John Smith)
  • Upload to SAP NetWeaver BI - invalid data will be excluded for further correction (a minimal sketch of comparable checks follows this list)
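
For illustration only, here is a minimal sketch of the kinds of checks listed above, written in plain Python against made-up flat-file customer records; the field names, domain values and patterns are assumptions for the example and do not reflect how Data Services implements them.

```python
# Minimal sketch of profiling/cleansing checks on a customer record.
# Field names, domain values and patterns are assumptions for the example.
import re
from datetime import date

VALID_REGIONS = {"CA", "NY", "TX"}                    # domain values
PHONE_PATTERN = re.compile(r"^\+?[\d\s\-()]{7,15}$")  # pattern recognition

def validate(record):
    """Return the list of data quality problems found in one record."""
    problems = []
    if record["region"] not in VALID_REGIONS:
        problems.append("unknown region")
    if not PHONE_PATTERN.match(record["phone"]):
        problems.append("malformed phone number")
    if not (date(1900, 1, 1) <= record["birth_date"] <= date.today()):
        problems.append("implausible birth date")     # plausibility check
    return problems

def same_person(a, b):
    """Naive duplicate match: 'Smith, John' equals 'John Smith'."""
    tokens = lambda name: frozenset(re.split(r"[,\s]+", name.lower())) - {""}
    return tokens(a) == tokens(b)

bad = {"region": "XX", "phone": "12-3", "birth_date": date(1850, 1, 1)}
print(validate(bad))                              # all three checks fail
print(same_person("Smith, John", "John Smith"))   # True -> duplicate candidate
```

Records with problems would then be excluded from the upload and routed for correction, as described in the last bullet.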

Scenario 2 - Address cleansing of already loaded SAP NetWeaver BI data (SAP)

  • Download data via Open Hub Service (no license needed)
  • Cleanse addresses / data via Data Services (or perform any other Data Quality measures)
  • Upload cleansed data to SAP NetWeaver BI (closed-loop)

Scenario 3 - Incorporate Data Services features in SAP NetWeaver BI Transformation (ETL)

  • WebService call to Data Services (e.g. to Universal Data Cleanse); see the sketch below
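
As a rough illustration of what such a call could look like from a transformation step, here is a minimal sketch against a hypothetical JSON-over-HTTP cleansing endpoint; the URL and payload are inventions for the example, and a real integration would go through the web service interface that Data Services actually publishes.

```python
# Minimal sketch: send one record to a hypothetical cleansing web service
# and return the cleansed copy. URL and payload format are assumptions.
import json
import urllib.request

CLEANSE_URL = "https://dataservices.example.com/cleanse"  # hypothetical endpoint

def cleanse(record):
    payload = json.dumps(record).encode("utf-8")
    request = urllib.request.Request(
        CLEANSE_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # cleansed record, e.g. normalized address

# A transformation routine would call cleanse() per record or per data package.
```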

Publications

Based on these investigations and the scenarios described above, we decided to provide two publications. We focused on the usage of Data Services as a standalone solution (uploading the cleansed data set into SAP NetWeaver BI), hence not covering Scenario 3 for the time being.

The publications are HowTo guides that provide an introduction to the topic and the main features of Data Services, as well as an assessment of which product (SAP NetWeaver BI or Data Services) can provide a solution for a specific requirement. In addition, the main Data Services features are explained step by step with easy-to-understand examples. The HowTo documents are targeted at people working with SAP NetWeaver BI who want to learn about Business Objects Data Services and look into using its features to improve their BI data. They do not provide an introduction to SAP NetWeaver BI.

The publications can be found in the HowTo area of SDN (SDN alias ‘howtoguides’, or SAP NetWeaver Capabilities --> SAP How-to Guides --> Business Information Management):

  • How To Use Data Services I - Data Quality Made Easy
    Link to HowTo guide
    This HowTo guide motivates the usage of Data Quality measures for the data within an organization and for the decision-making process. It introduces the Data Services architecture and its components and features. The document also provides decision support on which product (SAP NetWeaver BI or Data Services) to use to fulfill a specific requirement.
    It also describes the steps required to connect Data Services with SAP NetWeaver BI (and vice versa), and shows the usage of basic Data Services features like Profiling, Domain Value (Plausibility) Check, Pattern Matching and String Matching on sample customer data. Finally, the cleansed data is loaded into the SAP NetWeaver BI system.
  • How To Use Data Services II - Data Quality For Experts
    Link to HowTo guide
    The second HowTo guide takes Data Quality measures to the next level. The Data/Address Cleansing, Matching and Auditing features it introduces allow for powerful analysis and massaging of the data. Data Services delivers pre-defined configurations for these features, yet also allows users to define their own strategies based on custom-defined dictionaries and rules. The HowTo explains the main features and provides a step-by-step guide to using them in Data Services, based on sample customer data.

In order to implement Scenario 2 (the closed-loop approach of cleansing SAP BI data with Data Services), the SAP BI data can easily be extracted from the SAP BI system into a flat file with the OpenHub feature. After applying Data Quality measures according to the two HowTo guides above, the data is reloaded into the SAP BI system (also described in detail in the first HowTo guide).
In the next Business Objects Data Services release, Data Services will call the existing OpenHub service APIs (Application Programming Interfaces) directly. This will allow the OpenHub data to be initiated and processed from within Data Services, so the process can be completely automated for a closed-loop scenario.

Developers Tools Day - Last Call for a free event

I started out trying to write this blog with the idea that I’d embed a Google map showing the event location, with a few easy steps for you to click around, enter your home or office address, and figure out how far it would be to drive or fly to the MET center at the University of Kentucky. It’s near Cincinnati, Ohio. But then, if you are in the area, you probably already know how far away Cincinnati is.

Unfortunately for me, it is 549 miles away. Not sure, but that’s probably a bit farther than driving from Hamburg to Freiburg, Germany, and a couple of hours longer to drive as well. An airplane would work, but that’s expensive for a one-day “free” event and harder to justify to my management.

Eddy's map of the SDN world would be one way to chart a course from your place to the Developers Tools Day, but that link isn't working for me at the moment.

I was able to generate a new Google API key to replace the one that became useless when my long-time ISP shut their modems down earlier this year. As I had backups of my KML files, I was able to flail around and build a view after a struggle.

View Small Map

View Large Map

Unfortunately, I’m not seeing the forest for the trees (or Google is just messing with my head), as I keep getting blank maps or a building placed somewhere in Greenland. So I thought I’d better stop and get back to the original intent, which was announcing the event. Surely there must be some level of zoom that works.

The upcoming event is like an ASUG chapter meeting, but bigger, and focused on development technologies. We’ve got Thomas Jung, Peter McNulty and Katie Beavers (though there’s talk that we may have one of them only virtually). We’ve got a hackers’ night on the day, er, night before: that would be December 4th, so the 2nd Annual Developers Tools Day is Friday, December 5, 2008.

I did a podcast with Thomas Jung last month, and posted it in this blog.

Register here (limited to ASUG members):

http://www.asug.com/EventsCalendar/EventDetails/tabid/150/EventID/997/Default.aspx

Oh yeah, here’s the other guys’ map link:

http://maps.live.com/?where1=3861+Olympic+Boulevard+Erlanger+KY+41018

More links if you're really, really serious. No, I mean it.

ASUG Developer Tools Day Pre-Conference Hacker Night

ASUG Developer Tools Day Pre-Conference Hacker Night - Driving Directions

ASUG Developer Tools Day Pre-Conference Hacker Night - NKU Campus Map

By the way, while I was looking up references to geocoding, I updated the SDN wiki page that indexes different blogs on maps. Feel free to add any that snuck by me.

Here’s a completely gratuitous screenshot of Thomas Jung hand-waving in Bangalore last week.

image

BI as information stores : The next steps

A BI implementation often focuses initially on data completeness: all the modules are integrated into SAP BI, and SAP BI serves mostly as a centralized data store used for day-to-day operations...

Okay, you have implemented SAP BI over the last two to three years. You are now the proud owner of a TB-class data warehouse that is being used for day-to-day business. Over the years you have carefully integrated all the information of the enterprise into your BI system.

All the information, ranging from financials to sales to payroll, flows into your BI system, and day-to-day operations are completely dependent on the SAP BI system that you have built.

Great! But then you might ask: have I hit an architectural plateau where I see no new development, only maintenance mode, for the next two to three years? You expect data volumes to grow, but not much in terms of new tables and so on, with possibly some SP upgrades along the way.


If you are in the state described above, read on...

Here is a very simple question you can ask yourself: what next?

To answer that question in a conceptual sense:
1. You can turn your BI landscape into a rich information store that supports a variety of initiatives, such as:

a. Cross-functional dashboards

Simple dashboards that relate cross-functional information, for example:
employee cost per product sold, expressed as a trend over the year
product profitability
stock turnover versus cost of capital

A lot of these will feed back into the basic business processes that drive the organization, helping to achieve economies of scale and to make better forecasts and better decisions (a minimal sketch of the first example appears below).
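
To make the first example concrete, here is a minimal sketch, assuming that monthly payroll cost and units sold have already been extracted from the respective InfoProviders; the figures are made up.

```python
# Minimal sketch: employee cost per product sold, as a trend over the year.
# Monthly figures are invented; in practice they would come from the HR and
# Sales InfoProviders.
payroll_cost = {"2008-01": 120000.0, "2008-02": 118000.0, "2008-03": 125000.0}
units_sold = {"2008-01": 4000, "2008-02": 4400, "2008-03": 3900}

for month in sorted(payroll_cost):
    units = units_sold.get(month)
    if units:  # skip months without sales data
        print(month, round(payroll_cost[month] / units, 2))
```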

The tools to achieve this are already there, be it the ubiquitous Web Application Designer or the newer, more advanced Visual Composer; it is simply the requirements that will drive such initiatives.

b. Data Mining

Historical data lends itself to rich data mining methods that can determine future sales direction and help predict complex relationships across the product base. Data mining has traditionally been the exclusive domain of analysts who bring rich domain experience to the table.

But this need not stop such initiatives: you can look at simple data mining models like ABC analysis (a minimal sketch follows below), which can showcase the possibility of using such models and make a business case for applying them to more domains of data.
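
Here is a minimal, generic sketch of ABC analysis, assuming a list of (item, consumption value) pairs already extracted from the BI system; the 80%/95% cutoffs are the conventional choices and can be tuned.

```python
# Minimal sketch of ABC analysis: classify items by cumulative share of value.
def abc_analysis(items, a_cut=0.80, b_cut=0.95):
    ranked = sorted(items, key=lambda kv: kv[1], reverse=True)
    total = sum(value for _, value in ranked) or 1.0
    classes, cumulative = {}, 0.0
    for name, value in ranked:
        cumulative += value / total
        classes[name] = "A" if cumulative <= a_cut else "B" if cumulative <= b_cut else "C"
    return classes

sample = [("P-100", 50000), ("P-200", 30000), ("P-300", 12000),
          ("P-400", 5000), ("P-500", 3000)]
print(abc_analysis(sample))  # {'P-100': 'A', 'P-200': 'A', 'P-300': 'B', ...}
```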

c. Data Presentation based on web services

The BI system can be opened up using web services, and the data can then be consumed in various applications. A simple application would be a widget that displays data; a much more meaningful usage would be a demand planning application that calls up past sales data from the BI system and uses it in the external tool (a sketch of this pattern follows below).
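
Here is a minimal sketch of that pattern, assuming a hypothetical HTTP endpoint that exposes a BI query result as JSON; the URL, query name and fields are inventions for the example, not a real SAP interface.

```python
# Minimal sketch: an external tool pulling past sales data from a BI system
# through a hypothetical JSON-over-HTTP endpoint.
import json
import urllib.request

BASE_URL = "https://bi.example.com/api/queries"  # hypothetical endpoint

def fetch_past_sales(query="PAST_SALES", year=2008):
    url = f"{BASE_URL}/{query}?calyear={year}"
    with urllib.request.urlopen(url) as response:
        return json.load(response)  # e.g. [{"material": "P-100", "qty": 120}, ...]

# A demand planning tool would feed these rows into its own forecasting model.
```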

This would make the data from BI applications accessible across the enterprise and its development network. In this way, the limitation of only being able to use BI data within SAP BI is removed, provided the developer expertise exists.

Also of interest is the usage of this data within development environments like Adobe Flex, where mashups and desktop applications can be built.

The point of these presentation layers and tools is that usage of SAP BI across the enterprise increases by bringing a whole lot of new users to the landscape: users who hitherto stayed away, predominantly because of the complicated nature of the existing tools.

Once the user base increases, you can expect more interesting requests for data, and the BI system will increasingly be used for strategic decision making. This in turn will move your BI implementation to the next level, where you can unlock more value from the information that currently lies in your BI system by making it more responsive to user needs, helping to create a vibrant data environment.

BusinessObjects Integration with SAP NetWeaver BI - Technical Material

After four weeks of abstinence it feels good to be back on the blogging train, and the first thing I would like to do here (as promised a long time ago) is to share all the material that I used at the SDN Community Day in Las Vegas and Berlin.

I want to say THANK YOU to all the people helping to make it a great success and I hope that the sessions have been valuable for all attendees.

Here the material from SDN Community Day Las Vegas:

End-to-End Scenarios with Xcelsius

Best Practices for Web Intelligence on top of SAP NetWeaver BI

Here the material from SDN Community Day Berlin:

BusinessObjects Enterprise & SAP - Installation and configuration

Crystal Reports and SAP BI

Web Intelligence and SAP BI

Xcelsius and SAP BI

I hope the material is useful to other customers as well, and I am looking forward to your feedback.

Conversation regarding the importance of Dynamic Reporting (Part II)

Sorry it has taken so long to write the next part of this blog. I am sure we all know how easy it is to get distracted by other things, and suddenly a year has passed. This is still relevant for all BPC versions for Microsoft.

Part 1 can be found @ https://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/7488

As I stated in part one, I have a strong opinion about the advantages of dynamic reporting. In this blog I will attempt to explain the importance of using the control panel and the workbook options to override the current view. Again, I hope to do so in a clear and concise manner (no promises on clarity).

In my last installment I stated:

“Dynamic reports allow the user to change their current view and update the data presented”

However, what if you don’t want to rely on the user selecting the right members in the current view? What if you want to ensure the casual user doesn’t make a mistake and forget to update the current time period, and so end up viewing the wrong data? So you think hard-coding the member IDs in the report is the answer. Nope!

The right answer is to use a control panel. If you take a look at the templates that come standard with BPC 5 (dynamic reports/schedules), you will see a collapsed set of columns and rows in the upper left corner of the worksheet. This is where we (SAP) put a control panel. It is not important where it is, or whether it has all the bells and whistles you see in the standard templates. What’s important for this conversation is the left-hand side, where you see the current application, all the dimensions, and their current members.

Here you can set an override member for one or more dimensions. Your casual users don’t have to worry about the right selection and your advanced users can easily remove the override…”the best of both worlds”.

But wait: there is something you need to know. This control panel doesn’t work by magic; you have to tell BPC where it is. This is accomplished through the workbook options (see the eTools menu to locate them). There is a section to override the current view, where you can select the range of cells that contains the control panel. Be sure to include the application ID reference.

A note on the behavior of this option: if there is an Ev function in the cell, the system evaluates its values before reading the “Override current view” value. For example, say an EvGET function references the Finance application, with Entity on the rows and Account on the columns. When the system determines which values to return, it looks at the override current view values for all dimensions other than Entity and Account. If no members are specified, the system returns the values from the current view (the sketch below illustrates this precedence).
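
To make the precedence explicit, here is a minimal sketch in Python; the dimension names and dictionaries are hypothetical, purely to illustrate the order in which members are resolved, and this is not BPC code.

```python
# Minimal sketch of member resolution: members named in the Ev function win,
# then the control panel's override, then the user's current view.
def resolve_members(in_formula, override, current_view):
    resolved = {}
    for dim, member in current_view.items():
        if dim in in_formula:        # e.g. Entity on rows, Account on columns
            resolved[dim] = in_formula[dim]
        elif override.get(dim):      # control panel "override current view"
            resolved[dim] = override[dim]
        else:                        # fall back to the current view
            resolved[dim] = member
    return resolved

print(resolve_members(
    in_formula={"ENTITY": "US_EAST", "ACCOUNT": "REVENUE"},
    override={"TIME": "2008.DEC"},
    current_view={"ENTITY": "GLOBAL", "ACCOUNT": "ALL",
                  "TIME": "2008.JAN", "CATEGORY": "ACTUAL"}))
# TIME comes from the override; CATEGORY falls back to the current view.
```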

Stay tuned for more…

BI Redux

Up to a couple of years ago, a larger BI system gave a BI team bragging rights. And the threshold to join the club kept getting bigger (“You only have a size of 6? We are approaching 11.” Terabytes, of course).

Well, it may still be so. But the novelty is giving way to the weary realization that there are no easy tools to get the genie back in the bottle.

The database may be growing by a few TB every year, multiplying costs through the additional hardware and resources needed to manage it.

Organizations have looked at solutions like BIA (and overlooked the attendant budget increases for hardware and related services) to continue deriving incremental value from their BI systems. Questions not asked often enough include: “Does all that disk space contain real, meaningful data?”, “Is this data what users are using, or going to use, in the future?”, “Are there holes in my dataset: placeholders that are never used but eat up disk space?”, “Are there orphan fields in my schema?”, and so on.

It is good to explore and find the best combination of optimization techniques such as aggregates; however, that may no longer be enough. With many organizations reaching a critical database size, beyond which the incremental cost of further growth rises significantly due to the complexity of normal maintenance tasks (e.g. backup, system copy, DB reorganization), it is about time to look at what else a BI team should be exploring.

My opinion is that now that BI teams are confident and comfortable managing their BI systems, they need to proactively identify and implement methods that reduce database growth (or, in the best case, reduce the database size and then keep the growth rate minimal). This would involve analyzing existing data models, remodelling, archiving, and discarding data (yes, this is what it will ultimately come to; archiving offline is what it will mean in real terms), and so on. The sooner organizations adopt this mindset of managing the database size, in addition to applying other emerging SAP BI techniques and solutions, the better off they will be in the long term.

It would be a welcome change to see bragging rights (and bonuses) linked to SAP BI system growth (i.e., the lack thereof).

Global convergence through IFRS - Part II

Continuing the blog series on IFRS, in this post we discuss items 3 and 4 from the list below:

  1. History of IFRS and why/how it came into existence
  2. Basic principles of IFRS
  3. IFRS roadmap by different countries
  4. List of IFRS
  5. IFRS deep dive on standards
  6. Significant differences between IFRS and country GAAP e.g. US GAAP
  7. How IFRS is enabled in SAP products
  8. How should your company plan for IFRS adoption
  9. What global opportunities exists for IFRS consultants…..and more
3. IFRS roadmap by different countries

Many countries across the globe have adopted IFRS, and they have clearly laid out roadmaps whereby companies in their respective countries have to adopt IFRS. The graphic below represents a rough spread of which countries have adopted IFRS or are moving towards adoption.

image"

Information, updated as of August 2008, for 162 countries (or jurisdictions):

  • IFRSs required for all — 85 countries
  • IFRSs permitted — 24 countries
  • IFRSs required for some — 4 countries
  • IFRSs not permitted — 34 countries
  • No stock exchange — 15 countries

So you can see that 113 countries (85 + 24 + 4) either require or permit IFRS.

Let us examine the perspectives of some of these countries.

EU countries were the first to adopt IFRS; there was a mandate for all listed companies to follow IFRS by 2005.

The US has taken significant steps towards IFRS. In late August 2008, the U.S. Securities and Exchange Commission (SEC) announced its proposal for a roadmap that could lead to requiring public companies to issue their financial statements under IFRS in 2014; however, the final determination on adoption would be made by 2011. To this end, the SEC has identified a few milestones that would enable it to decide on the transition from US GAAP to IFRS:

  1. Continued improvement in IFRS — in line with the objective of continuous improvement as part of the convergence between U.S. GAAP and IFRS.
  2. Change in the IASB organization — to reinforce its accountability and to stabilize its funding. Earlier this year, the Trustees of IASB proposed to set up an oversight committee, which would include the SEC, the European Commission and the U.K.’s Financial Services Authority.
  3. Develop an IFRS eXtensible Business Reporting Language (XBRL) taxonomy — mirroring the existing U.S. GAAP XBRL taxonomy. The International Accounting Standards Committee (IASC) Foundation announced the publication of the IFRS taxonomy guide on August 28, 2008.
  4. Training and education — of users, such as preparers, auditors or investors in the U.S.

Early Adoption Option in 2009 for a Few Companies

Large companies that meet the following criteria would qualify for this early adoption:

  • The company must be in the top-20 of its industry;
  • IFRS must be the most used accounting standard in the industry, which does not mean that more than 50 percent of companies are under IFRS; and
  • The company must receive a no objection letter from the SEC.

While the SEC has singled out about 110 companies that would meet these criteria, it expects to exclude from early adoption large companies in activities in which U.S. groups are dominant.

While the SEC has made no firm decision, both companies and CPAs should prepare for what’s to come. The roadmap presents a clear path for the transition to IFRS, but the SEC has held off on the decision, which has been postponed until 2011, and left the discussion open.

However, a company’s strategy of waiting until 2011 for the SEC’s decision on the transition towards IFRS may backfire: once the decision is made, there would be almost no time left to move properly to IFRS.

Experience shows that the transition to IFRS will impact not only a company’s accounting department but the entire company, because:

  • Business decisions will be impacted by the accounting standards. For example: 1) in general, while research and development costs are expensed under U.S. GAAP, they can be capitalized under IFRS; and 2) hedge accounting differs between the two standards and will require a change in processes in order to maintain a similar result;
  • Tax planning has to be re-evaluated;
  • Internal controls and processes have to be reviewed; and
  • Reporting, including budgeting, will have to be revisited along with systems in order to maintain effective performance management.

China decided that Chinese companies would adopt IFRS in 2006, a move intended to boost foreign investment into China. However, state-run enterprises would be exempted from the ‘related-party’ disclosures.

Japan agreed to move to IFRS by 2011, claiming that its existing GAAP was already similar to IFRS and that it saw no issue with Japanese companies moving to IFRS in that timeframe.

South Korea has indicated that it will adopt IFRS by 2011, as has India.

As seen above, many leading economies have either adopted IFRS or specified a clear roadmap for its adoption. This presents a huge opportunity for the world to become unified in terms of financial governance and reporting.

4. List of IFRS

IFRS is a combination of IFRS and IAS standards and is very extensive. The list below shows what each standard is and what it covers:

IFRS 1 – First time adoption of IFRS

IFRS 2 – Share based payment

IFRS 3 – Business Combination

IFRS 4 – Insurance contracts

IFRS 5 – Non current assets held for sale and discontinued operations

IFRS 6 – Exploration for and evaluation of mineral assets

IFRS 7 – Financial Instruments : disclosures

IFRS 8 – Operating segments

IFRS also includes the International Accounting Standards, which specify accounting guidelines to be followed in different areas. There are about 41 International Accounting Standards:

IAS 1 – Presentation of Financial Statements

IAS 2 – Inventories

IAS 7 – Cash Flow Statements

IAS 8 – Accounting Policies, Changes in Accounting Estimates and Errors

IAS 10 – Events after balance sheet date

IAS 11 – Construction contracts

IAS 12 – Income Taxes

IAS 14 – Segment Reporting

IAS 16 – Property, Plant and Equipment

IAS 17 – Leases

IAS 18 – Revenue

IAS 19 – Employee Benefits

IAS 20 – Accounting for Government Grants and Disclosure of Government Assistance

IAS 21 – Effects of Changes in Foreign Exchange Rates

IAS 23 – Borrowing Costs

IAS 24 – Related Party Disclosures

IAS 26 – Accounting and Reporting by Retirement Benefit Plans

IAS 27 – Consolidated and Separate Financial Statements

IAS 28 – Investments in Associates

IAS 29 – Financial Reporting in Hyperinflationary Economies

IAS 31 – Interests in Joint Ventures

IAS 33 – Earnings Per Share

IAS 34 – Interim Financial Reporting

IAS 36 – Impairment of Assets

IAS 37 – Provisions, Contingent Liabilities and Contingent Assets

IAS 38 – Intangible Assets

IAS 39 – Financial Instruments: Recognition and Measurement

IAS 40 – Investment Property

IAS 41 – Agriculture

You can see some missing numbers in the above list; these are standards that have been combined into others or withdrawn, and therefore no longer exist.

Understanding the above standards in detail would be valuable for Business Process Expert consultants who want to implement IFRS.

We will continue this blog series as we move into the more technical aspects of IFRS and, logically, towards how SAP’s solutions are suited to enabling IFRS.

Also refer to Part I of the blog series.