
Summary of Myself:

To start with, right after I graduated from my Bachelor's program in Computer Engineering in India, I started my career as an ABAP consultant, where I dealt with various reports of the classical and interactive kind, dialog programming, BDCs, SAP Scripts, as well as function modules. After working on two projects I wanted to enhance my knowledge and move into a more techno-functional environment rather than remaining purely technical.

I saw a very promising future in SAP BW, since every company needs to analyze reports for decision making/support to keep an edge with its products and to plan future enhancements, and that is what drove me to establish myself as a BW consultant. Since then I have been exposed to all the major phases of Business Information Warehouse. In Reporting, I was involved in generating several reports on the SD, Purchasing, and Financials modules by designing queries in the Query Designer with all the components: calculated key figures, restricted key figures, variables, free characteristics, conditions, filters, BEx Analyzer, browser, and web reports. I used exceptions efficiently with the Reporting Agent, where you can specify what kind of exception is to be used. In the next phase, Extraction, I used the LO Cockpit, delta, and generic extractors extensively throughout all the projects, including customizing extract structures depending on the requirements and writing user exits for all enhancements made with respect to new InfoObjects, all the way to replicating DataSources in BW from the R/3 source system.

In the Modeling phase I developed InfoObjects, transfer rules, InfoSources, InfoPackages, update rules, InfoCubes, and ODS objects; developed aggregates; and worked with DataMarts and MultiProviders, loading master data and transaction data, scheduling jobs in the background, and using delta loads. I used Open Hub Services for retraction of data, and configured Standard Business Content cubes in BW with the appropriate mapping to R/3 DataSources.

Extraction:

The general LO Cockpit steps are:

Maintain Extract Structures

Maintain DataSource

Replicate DataSource in BW

Assign InfoSources

Maintain Communication Structure/transfer rules

Maintain InfoCubes and Update Rules

Activate Extract Structures

Delete the setup tables / run the setup extraction

InfoPackage for the delta Initialization

Set up the periodic V3 update

InfoPackage for the delta load

The process is as follows (transactional DataSources), initial upload: go to LBWE (LO Customizing Cockpit) [DataSource 2LIS_11_VAHDR] and click on Maintenance, which opens up the communication structure. The extract structures are filled from the R/3 communication structures of the individual logistics applications.

SAP already delivers extract structures, which you can enhance by adding appends to the communication structure.

Select the fields that you want in the selection criteria apart from the fields that are already there, click Yes, and when you come out and activate it you should be able to see the new fields in your DataSource. Go inside, uncheck the Hide and Field Only checkboxes for the new fields, and save (this activates the DataSource); also select the update mode (Serialized V3 Update). Go to RSA3, the extraction checker, to see whether the DataSource has any data; if yes, go ahead and replicate the DataSources in BW. If I need to append any fields to the extract structure, then inside the DataSource, double-clicking on the extract structure takes me to a screen where I can append the new user-defined fields. Once a field is added, I need to write a user exit to populate it: go to CMOD, choose Enhancement Assignment, scroll down to the INCLUDE statement, double-click it to open the code, write your logic there, and activate it (a minimal sketch of such an exit follows these steps); then you are all set on the R/3 side. Once the DataSource is replicated in BW (source system), assign an InfoSource to it and create new InfoObjects if needed,

Maintain Communication Structure/Transfer Rules

Maintain the InfoCube and update rules, and then on your InfoSource create an InfoPackage and load data using Initialization of Delta.
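For the transaction-data enhancement described above, the user-exit logic goes into EXIT_SAPLRSAP_001 (include ZXRSAU01). The following is only a rough sketch under stated assumptions: a hypothetical append field ZZAUGRU on the 2LIS_11_VAHDR extract structure MC11VA0HDR, filled from the order reason VBAK-AUGRU; the field and lookup are illustrative, not actual project code.

* Include ZXRSAU01 - user exit for transaction data (EXIT_SAPLRSAP_001).
* Assumption: hypothetical append field ZZAUGRU on extract structure
* MC11VA0HDR (DataSource 2LIS_11_VAHDR), filled from VBAK-AUGRU.
DATA: ls_vahdr TYPE mc11va0hdr,
      lv_augru TYPE vbak-augru.

CASE i_datasource.
  WHEN '2LIS_11_VAHDR'.
    LOOP AT c_t_data INTO ls_vahdr.
*     Look up the order reason for this sales document.
*     (A real exit would buffer this lookup for performance.)
      SELECT SINGLE augru FROM vbak INTO lv_augru
        WHERE vbeln = ls_vahdr-vbeln.
      IF sy-subrc = 0.
        ls_vahdr-zzaugru = lv_augru.
        MODIFY c_t_data FROM ls_vahdr.
      ENDIF.
    ENDLOOP.
ENDCASE.

The same pattern (loop over C_T_DATA, fill the appended field, MODIFY it back) applies to any LO extract structure enhancement.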

When you extract transaction data using the LO method, you need to set up the extraction. The extraction setup reads the dataset that needs to be processed and in turn fills the relevant communication structure with data.

The data is stored in the cluster tables, from where it is read when the initialization is run.

Delta Upload:

Go to LBWE, navigate under SD Sales BW to the extract structure, click on Job Control, supply the job parameters (start date, print parameters), and schedule the job.

Now on the BW side, create an InfoPackage and load the data using Delta Update.

Tcodes for LO

RSA7 - to check the delta queue

LBWE - LO Customizing Cockpit

LBWF - BW log (only used in QA; inside it, select the user Nishanth and the application number to check the log)

LBWG - Delete setup tables (if you need to change the extract structure, you must delete the setup tables, build the new structure, and then reload the data to be sent to BW; inside LBWG select your application, e.g. 11 for SD Sales or 13 for SD Billing, and press Execute to delete the setup tables)

SM13 - shows the transactions with pending update processes

OLI7BW - setup of SD sales orders (loading data into the LO setup tables)

OLI8BW - setup of LE deliveries

OLI9BW - setup of billing/invoices

OLI3BW - setup of purchasing

Delta Extraction with V3 Update:

A collective update technique called the V3 update is used for the extraction of transaction data in logistics applications.

Initially the data is collected in the R/3 update tables where it is picked up by the update process scheduled to start periodically

In the V3 Collection Run, the data is transferred to the BW delta Queue from where it is picked up by the requests from the BW.

Direct Delta: data is transferred to the BW delta queue with each document posting; suited to customers with low document volumes.

Queued Delta: extraction data is collected in an extraction queue for the affected application; serialization is ensured; suited to customers with large document volumes.

Non-serialized V3 Update: data is read from the update tables without regard to the document sequence.

Advantages of using LO

1. Improved Performance and reduced volumes of data

2. Only changes to data that are relevant to BW are uploaded.

3. LIS tables are not updated

4. Update using Batch processes

5. Central, standardized maintenance tool for logistics applications

6. No use of LIS Functions

Generic Extraction:

Initial load: execute transaction RSO2; select the DataSource name and description; select a table, view, or SAP Query/InfoSet as the source of data; flag the fields for selection; flag the fields to be hidden; activate the DataSource; then replicate it, assign an InfoSource, define the communication structure and transfer rules, create an InfoPackage, and load the data.

Company Situation:

I had a situation where I had to gather information on the 0VENDOR InfoObject, but the business also wanted the purchasing organization (0PURCH_ORG) it belongs to displayed in the same report. Since purchasing organization is not part of the 0VENDOR_ATTR DataSource, we had to create a custom DataSource to make this field part of the vendor attributes. Here we take the purchasing organization as the superior InfoObject, added on the Compounding tab of the InfoObject. Since table LFM1 holds the vendor master record purchasing organization data and LFA1 holds the general vendor master data, these tables were used to create the custom DataSources that met the business requirement.

A DataSource is always created on something: in FI-SL (Special Purpose Ledger) it is created on a summary table, and in CO-PA on an operating concern; in the same manner I created it on these two tables. Using LFM1 and LFA1, I created views on either a single table or on both tables together.

Two views are created: View1 only from table LFM1, and View2 from a join of both tables.

Go to SE11, give a name, and click Create. Inside, choose Database View and create the views (using the Table/Join Conditions tab and the View Fields tab, where you pull fields in via the Table Fields button). Two views, ZS_VATTR and ZS_VTEXT, are created (an illustrative sketch of the join follows).
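As a rough illustration of what a join view like ZS_VATTR returns, this ABAP sketch reproduces the LFA1/LFM1 inner join in Open SQL; the report name and field list are assumptions for illustration, not the actual view definition.

* Hypothetical report showing the LFA1/LFM1 join behind a view
* such as ZS_VATTR. The field list is illustrative only.
REPORT zs_vattr_preview.

DATA: BEGIN OF ls_vendor,
        lifnr TYPE lfa1-lifnr,  " vendor number (join key)
        ekorg TYPE lfm1-ekorg,  " purchasing organization (compounded)
        name1 TYPE lfa1-name1,  " vendor name
        land1 TYPE lfa1-land1,  " country
      END OF ls_vendor.
DATA: lt_vendors LIKE TABLE OF ls_vendor.

SELECT f~lifnr m~ekorg f~name1 f~land1
  FROM lfa1 AS f INNER JOIN lfm1 AS m
    ON m~lifnr = f~lifnr
  INTO TABLE lt_vendors.

LOOP AT lt_vendors INTO ls_vendor.
  WRITE: / ls_vendor-lifnr, ls_vendor-ekorg, ls_vendor-name1.
ENDLOOP.

In the actual implementation the join lives in the SE11 database view; the DataSource created next in RSO2 simply extracts from that view.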

Now create DataSources for both views in RSO2, giving the names ZS_VENDOR_ATTR and ZS_VENDOR_TEXT.

Inside, select the application component MM-IO (since this is vendor master data).

Fill in the text fields (short, medium, long) and save; this internally creates an extract structure.

Go to RSA6 to check the DataSource and RSA3 to check the data.

On the BW side, create an InfoObject ZVENDOR, replicate the DataSources, do the appropriate mapping, and then load the data.

Delta Load:

In the create DataSource Screen (RSO2), click on generic delta button

Enter the delta-specific field and select whether it is a time stamp, calendar day, or numeric pointer.

Specify the safety interval.

Select the delta type: New Status for Changed Records (an after-image, which can be used with an ODS) or Additive Delta (aggregated records).

Generate the data source

Now we can see the delta flag being enabled

Enhancement Example:

The business requirement is to add an additional field to populate Previous Account Number on the 0VENDOR InfoObject.

We used Standard Business Content on R/3 and BW for loading master data.

To see the field details of the attribute we are adding, use transaction FK01 to create a vendor, FK02 to change, FK03 to display, and FK04 for deletion.

Use FK03 and select the vendor and company code to pull in the general data as well as the company code data.

Once selected, on the fourth screen you find the Previous Account Number field; place the cursor on it and press F1, which opens a window; then select the Technical Information icon, which shows the field information, such as the table it reads data from and the technical name of the field. I remember the field was ALTKN, and we had to append a ZALTKN field to the extract structure.

Vendor data belongs to MM; company code data belongs to FI.

So go to RSA6, search for the DataSource 0VENDOR_ATTR, and inside it click on Enhance Extract Structure; supply the relevant information (short description, the field name, and the component type, e.g. ALTKN, so that it copies the properties of ALTKN to ZALTKN). Once checked and activated, go to RSA6 again to verify the new field.

Now we need to write a user exit to populate the field with data.

Back in RSA6, inside 0VENDOR_ATTR, click on Function Enhancement, which opens CMOD.

In CMOD, fill in the project and use the Enhancement Assignment radio button, choosing enhancement RSAP0001 for BW.

Get inside and choose your exit, which is EXIT_SAPLRSAP_002 for master data attributes; scroll down to the INCLUDE, double-click to open the code, and add your logic there (a minimal sketch follows).
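A minimal sketch of that exit logic, under stated assumptions: the code lives in include ZXRSAU02, the 0VENDOR_ATTR extract structure is BIW_LFA1_S with the appended field ZALTKN, and the previous account number ALTKN is read from the vendor company-code table LFB1. (Depending on the release, the import parameter may be I_CHABASNM carrying the InfoObject name instead of I_DATASOURCE; check the exit signature in your system.)

* Include ZXRSAU02 - user exit for master data attributes
* (EXIT_SAPLRSAP_002). Assumptions: extract structure BIW_LFA1_S
* with appended field ZALTKN; ALTKN read from LFB1.
DATA: ls_vendor TYPE biw_lfa1_s,
      lv_altkn  TYPE lfb1-altkn.

CASE i_datasource.
  WHEN '0VENDOR_ATTR'.
    LOOP AT c_t_data INTO ls_vendor.
      CLEAR lv_altkn.
*     Simplification: if the vendor exists in several company codes,
*     this just takes the first row found.
      SELECT altkn FROM lfb1 INTO lv_altkn UP TO 1 ROWS
        WHERE lifnr = ls_vendor-lifnr.
      ENDSELECT.
      ls_vendor-zaltkn = lv_altkn.
      MODIFY c_t_data FROM ls_vendor.
    ENDLOOP.
ENDCASE.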

Once everything is done and activated, go to RSA3 and check for data; then replicate and do the same steps on the BW side.

The first project is Pratt & Whitney, 400 Main Street, Hartford, CT.

www.pratt-whitney.com

Pratt & Whitney, one of the world's largest aerospace engine manufacturers, implemented Business Information Warehouse for the Sales and Purchasing areas. It carries a unique variety of products in commercial, military, space, industrial, business, regional, and specialty materials and services; some of them are aerospace engines, gas turbines, water jet systems, convergent spray technologies, and so on. I was involved in the modeling, extraction, and reporting of the Purchasing cubes and also the Sales cubes.

The 0PUR_C01 cube: with the data of the InfoCube Purchasing Data, you can carry out analyses of material groups, vendors, and materials. We had a requirement to find "which materials, and how much of each, have been ordered from a certain vendor" and also "how many PO items there are for a certain material group", and this cube gives all the information required. The DataSources needed for this cube are 2LIS_02_HDR (header-level data), 2LIS_02_ITM (purchasing item level), and 2LIS_02_SCL (purchasing schedule line).

Having these DataSources, some of the settings in SBIW are: (1) choosing an industry sector (standard, retail, or none) and (2) maintaining the process keys so that the data is interpreted correctly in BW (application component: MM, application: 02 (Purchasing), transaction key: 1 for external vendors).

The master data InfoObjects involved here are 0COUNTRY, 0MATERIAL, material group, 0VENDOR, 0PURCH_ORG, plant, version, number of purchasing info records, and contracts. The key figures are delivery date variance, delivery quantity variance, and effective purchase order value.

The Reports that need to be generated are:

1. Purchase Order Values - This query enables you to analyze the purchasing activities relating to a material by trying to figure out: "What level of expenditure has been incurred in the procurement of a certain material in a certain plant or in total?" and "How much has been ordered from a certain vendor?"

Free Characteristics: 0CALMONTH, 0CALWEEK, 0PLANT, 0VENDOR

Rows: 0MATERIAL

Column: 0PO_VAL (PO value in document currency)

Variables: Material [selection option], Calendar Month [selection option], and Plant

2. Price Trends over Last 3 Months - This query allows you to examine the average price (per unit of measure) for materials in a material group. The average price enables you to draw conclusions about the negotiation skills of your purchasing organization or, if you take the plant into account during valuation, about regional price differences. To enable you to assess short-term price changes, you are always shown the average prices for the last three months (including the current month). If you perform this evaluation in April 2000, the data for February, March, and April 2000 is evaluated.

Filters: 0VTYPE = 10 (value type = actual)

Free Characteristics: 0PURCH_ORG, 0PLANT, 0VENDOR

Rows: 0MATERIAL, 0MATERIAL_0MATL_GROUP(Material Group)

Columns: 0CALMONTH, 0PUR_C01_CK008

3. Goods Receipt Variances - This query enables you to analyze why the value of goods received varies from the purchase order value.

The following variance types exist:

The Price variance of the invoice receipt value gives you the difference between the value of goods received and the purchase order value. The difference is caused by a price variance, meaning the vendor has billed you for more (positive value) or less (negative value) of the material than was agreed in the purchase order.

The Quantity variance of the value of goods received gives you the difference between the value of goods received and the purchase order value. The difference is caused by a quantity variance, meaning the vendor has delivered more (positive value) or less (negative value) of the material than you ordered. Here the material is valuated at the purchase order price

4. Ordering Activities - This query enables you to display purchasing activities in relation to materials and vendors. In this way, you can figure out: "How frequently is a certain material ordered?", "How frequently is a certain material ordered from a certain vendor?", and "How is a material usually ordered: using discrete POs, or on the basis of longer-term scheduling agreements or contracts?"

5. Average Delivery Time of a Vendor - This query enables you to display a vendor's average delivery time. In the process, you can include the material and the plant in your analysis. In this way, I could figure out: "Who can deliver the material the fastest?", "Do a vendor's delivery times vary for different plants?", and "How do a certain vendor's delivery times compare with those of others?"

The second cube is 0PUR_C09, called Service Level Purchase Orders. This cube enables you to evaluate the service level at the level of purchase orders. You can thus analyze, for example, the total number of purchase orders and the share of purchase orders delivered late.

ODS:

0PUR_O02 (Purchase Orders) is the ODS that the cube sits on. The ODS contains data on purchase orders based on the purchase order header from the InfoSources, as well as aggregated information on purchase order items from the ODS 0PUR_O01.

InfoObjects for Master Data are Purchase Order, Vendor, Purchasing Organization, Purchasing Group

Data Sources used: 2LIS_02_HDR Purchasing Data (Document Header Level)

2LIS_02_ITM Purchasing Data (Item Level)

2LIS_02_CGR Delivery of Confirmations

2LIS_02_SGR Delivery of Schedule Lines

2LIS_02_SCN Confirmation of Schedule Lines

Units are 0SALES_UNIT (sales unit) and 0DOC_CURRCY (sales document currency).

These are connected in the flow: DataSources - transfer rules - InfoSources - update rules - ODS - cube - reports.

Reports Generated over this cube are:

1. Purchase Orders Confirmed as Requested - Use the query Purchase Orders Confirmed as Requested to display the purchase order value as well as the number of purchase orders confirmed to the requested delivery date. In this way, you can analyze to what extent the vendor can respond to a requested delivery date.

2. Service Level: Purchase Orders - Use this query to analyze the service level with respect to the requested delivery date and the confirmed delivery date at the level of purchase orders. You can thus conduct a vendor analysis in terms of on-time delivery performance.

3. Purchase Orders Delivered Complete - The query provides an overview of the purchase orders and purchase order values delivered complete. You can thus conduct a vendor analysis in terms of quantity reliability on the level of purchase orders.

Sales Module: We pretty much used 0SD_C03, which gives the overall sales overview. This InfoCube contains all transaction data from sales orders, deliveries, and billing documents/invoices. The InfoObjects for master data are Customer, Material, Value Type, Version, Sales Area, Document Classification, SD Document Category, and Organization. The DataSources utilized are:

2LIS_11_VAHDR (SD Sales Document Header)

2LIS_11_VAITM (SD Sales Document Item Level)

Units are Base Unit of Measure (0BASE_UOM) and Statistics Currency (0STAT_CURR).

Reports Generated:

1. Sales Values - This query displays information about order and sales values.

Variables: Division, Sold-to Party, Distribution Channel, Sales Organization

2. Order, Delivery and Sales Quantities - It displays the order, delivery, and sales quantities.

3. Incoming Orders per Customer - Displays the incoming order data for specific customers.

4. Billing Documents - It displays the billing document data.

5. Report on Fulfillment Rate Values - This query provides you with the percentage value of monthly order fulfillment rates by comparing the value of incoming orders with the value of open orders.

The second one is 0SD_C05. This InfoCube enables you to view general figures for quotations, including the number of quotations created, the net and gross values of the quotations, and the number of quotations rejected. You can also calculate the success rate of quotations by comparing quotation and order quantities. Information about the quotation success rate includes:

Orders that result from quotations and Comparisons with order probability.

Info Objects for Master Data are Customer, Material, Value type, Version, Sales Area, Status, Validity, Document Classification, Organization, SD Document Category

DataSources Utilized are:

2LIS_11_VAHDR SD Document Header

2LIS_11_VAITM SD Document Item Level Data

Units are the same as above plus SD Document Currency (0DOC_CURRCY).

Reports are:

1. Quotation Success Rate per Sales Area - These figures enable you to calculate the success rate of quotations, in other words, how many orders result from quotations for a particular sales area.

2. Quotation Success Rate per Customer - These figures enable you to calculate the success rate of quotations, in other words, how many orders result from quotations for a particular customer.

3. Quotation Tracking per Customer - This query enables you to see how many quotations have been rejected by particular customers.

4. Quotation Tracking per Sales Area - This query enables you to see how many quotations have been rejected in particular sales areas.

5. General Quotation Information per Customer - This query provides general information about quotations placed by a particular customer.

DataSources:

Sales (SD):

2LIS_11_VAHDR (SD Sales Document Header) is made up of VBAK (sales document header) and VBUK (header status and administrative data).

2LIS_11_VAITM (SD Sales Document Item) is made up of VBAK (sales document header), VBAP (sales document item), VBKD (sales document: business data), VBUK (header status and administrative data), and VBUP (item status).

Purchasing:

2LIS_02_HDR (Purchasing Header) has EKKO (purchasing document header) and EKPA (partner roles in purchasing).

2LIS_02_ITM (Purchasing Item) has EKKO (purchasing document header), EKPA (partner roles in purchasing), and EKPO (purchasing document item).

Pratt & Whitney in Brief:

The current project I worked on was at Pratt & Whitney, located in Hartford, Connecticut. It is one of the world's largest aerospace engine manufacturers and implemented Business Information Warehouse for the Sales and Purchasing areas. It carries a unique variety of products in commercial, military, space, industrial, business, regional, and specialty materials and services, among them aerospace engines, gas turbines, water jet systems, and convergent spray technologies. I was involved in the modeling, extraction, and reporting of the Purchasing cubes as well as the Sales cubes.

The business requirement I had in Purchasing was first to build the structure that holds the different types of materials and how much of each had been ordered from a certain vendor, then how many PO items there are for a certain material group, and also to evaluate the service level at the level of purchase orders. You can thus analyze, for example, the total number of purchase orders and the share of purchase orders delivered late. A wide range of reports can be generated with this information, so I worked closely with our functional team discussing the solutions and the cubes that needed to be incorporated to build the whole system. We then decided to work with 0PUR_C01 (the Purchasing Data InfoCube, where you can carry out analyses of material groups, vendors, and materials) and 0PUR_C09 (Service Level Purchase Orders) on the purchasing side; for sales we used 0SD_C03 (which gives the overall sales overview) and, for the most part, 0SD_C05 (orders/quotations) to handle the sales side of the company.

On the extraction side, in the R/3 source system I used the LO Cockpit extraction method quite extensively, and also the generic extraction method, to load data into the cubes all the way from the R/3 source system. At times I was also involved in enhancements, when I found that the fields I was looking for were not available in the standard extractors provided by SAP. I largely used the Business Content in BW; for the additional fields in R/3, I created corresponding InfoObjects in the InfoObject catalog, mapped the DataSource to the InfoSource, maintained the communication structure, and created transfer rules and update rules for the corresponding data targets. Once the link was active, I created an InfoPackage and loaded master data first through direct updating, then transaction data by initializing the deltas. Later on I loaded data through the delta update mode, scheduling the loads in the background. I monitored the whole flow of data from the source system to the PSA, then through transfer rules to the communication structure, and through update rules to the InfoProvider. Once the data resided in the cube, I used components such as calculated key figures, restricted key figures, variables, filters, conditions, exceptions (driven through the Reporting Agent), and free characteristics in the Query Designer to generate reports in the BEx Analyzer. I also worked on aggregates and MultiProviders to generate reports depending on company requirements.

Reporting Components:

Free Characteristics - The characteristics in this area are not displayed in the initial view of the query, but you can drill down and filter on them once you execute the query.

Filter Area - The characteristics in this area are restricted and cannot be filtered or drilled down further. A filter affects the entire output of the query, whereas a restriction does not affect the entire output but only the InfoObject it is applied to.

Restricted Key Figure - These are key figures restricted by one or more characteristics. For example, when you want to compare the sales of product A between different fiscal year periods, you create a restricted key figure with sales quantity as the key figure, restricted by 0MATERIAL with value A. It can be defined both at query level and at InfoProvider level.

Defining at InfoProvider level: in the Query Designer, on the left side, right-click on Key Figures and select New Restricted Key Figure. Once inside, give the description, then select your key figure (revenue) and the characteristic (sales region), pulling it over to the right. Click OK; in the next pop-up give the technical name and description and click OK to create it. It then appears on the left of the Query Designer and should be dragged to the right to be used in the query. An RKF is evaluated at run time, at the OLAP processor level.

Calculated Key Figure - It performs more complicated calculations on key figures, such as mathematical functions, percentage functions, total functions, and so on. It can also be defined at both levels.

At InfoProvider level: right-click on Key Figures (left side in the Query Designer) and click New Calculated Key Figure. On the right you have the functions, and on the left you can see all the key figures plus the RKFs. In the formula box, take the difference of two RKFs as the value of the CKF; when you run the report it then shows three key figures: two RKFs and one CKF. Creating a CKF at query level is done by right-clicking in the Columns area, selecting New Formula, and using the functions to perform the calculation; the result is specific to that query.

Hierarchy Variable - It gives us the flexibility to choose the hierarchy to use in the report. Select Revenue as the key figure and the sales rep hierarchy in the rows; right-click on the sales rep hierarchy and select Properties. Once inside, click the display icon on the right to see the underlying hierarchies, and a new window opens. Instead of selecting a particular hierarchy (hard-coding it), select Variable and click New. Give the variable name, description, and processing by (user entry); on the next screen the variable entry is mandatory; then click Finish. When you run the query, a window pops up allowing you to choose the hierarchy you want, and the result is displayed accordingly.

Hierarchy Node Variable - The options are a fixed hierarchy with a variable hierarchy node, where only the node is selected at run time, and a variable hierarchy with a variable node, where both hierarchy and node are selected at run time. In the Query Designer, with Revenue as the key figure and the sales rep hierarchy in the rows, right-click on the sales rep and select Restrict. Select the Variables tab and right-click in the box to create a new variable if you do not have one. In creating it, give the variable name, description, and processing by (user entry); on the next screen select what the variable represents (single value; note that a precalculated value set is one of the options here) and the variable entry (mandatory). Once everything is done, a new variable is created. Bring it to the selection screen on the right and click OK. You are all set to run the query; in the process, a window pops up asking you to select a particular node, such as the EAST or WEST region, to display the results for that specific region in the report.

Characteristic Variable - You can accept dynamic input for characteristics using this variable. For example, if you are developing a sales report for a given product, you define a variable for 0MATERIAL. The process: if the row has customer ID, the filter is sales org, and the column is sales revenue, right-click Revenue (the key figure) and select Edit. Now restrict by calendar year/month and click on Variables. Check for standard variables, and if you do not find what you require, create a new variable by right-clicking and choosing New Variable: variable name, description, processing by (user entry), then what the variable represents (single value) and the variable entry (mandatory). Once created, select it, bring it to the right, and click OK. When you run the query, a window pops up which gives you the option to choose a particular date to produce the result.

Text Variables - These are used to display dynamic texts for a given characteristic. For example, take the sales report for a given product for the last three months. The inputs prompted for are the product and the fiscal year period. At the column level you have drilled down by fiscal year period. Since the fiscal year period is dynamic (depending on the user input), you want to display the correct descriptions for the three months shown. So in this case you create a text variable for the fiscal year period and use it in the text properties. The process is much the same: right-click on Revenue and click Edit. Once inside, click the text variable icon at the top right, and in the next window click New to start. Give the variable name and description, processing by (replacement path); next select the characteristic; next choose to replace the variable with name/text; then finish. Once done, run the query, and the window pops up with the text shown clearly.

Formula Variable - These are variables used in calculations in the query. For example, take a customer discount report where each customer is given a different percentage discount. You create a formula variable for the discount percentage and use it in a calculated key figure. The basic steps are: first create a characteristic variable to accept values, secondly create a formula variable to store the values, and lastly use this formula variable in any computation. The process: with customer ID in the rows and sales revenue in the columns, first create a characteristic variable that holds some value, such as the difference of a date interval; then right-click on the key figure, choose New Formula, and check the available variables. For the formula variable, create a new one by giving the variable name, description, and processing by (replacement path), then choosing the characteristic value (0CALMONTH), then replace variable with name/text and, for the interval, choose Difference, and finish. Now in the formula box create revenue * the new formula variable. So in the query we get a calculated field implementing the logic of the formula box.

Conditions - If you want to filter on key figures or do ranked analysis, you use a condition. For example, you can use a condition to report the top 10 customers, or customers with more than a million dollars in sales yearly. The process: in the Query Designer, with customer ID in the rows and revenue in the columns, the condition is to generate a report of the top 5 customers by revenue. In the Query Designer top menu, click on Conditions and select New Condition. Give the description "top 5 customers" and choose to evaluate for all characteristics in the drilldown independently. Click New, choose Top N from the drop-down, give the number, and click the Transfer button. Once inside the box, check and activate it, then execute the query. In the report, click Show Display Conditions, and the description "top 5 customers, active" is shown. Double-click on Active to see all the customers. You can even create exceptions in the same report.

Exceptions - An exception is not a filter; it is used to highlight unusual key figure values in the report with different colors. For example, you may want to show red for all accounts receivable older than 90 days and yellow for those older than 60 days.

The process: with customer ID in the rows and revenue in the columns, click the Exceptions icon in the top menu and create an exception. Give the description "critical exception", select your key figure, and provide your exception values by creating new entries, choosing the bad/medium/good option, and transferring them. Once saved, when you run the report it highlights all the values you specified according to their levels. This can then drive the Reporting Agent: you can use the Reporting Agent to schedule exception reporting and alert users to any unusual data.

USF CORPORATION, CHICAGO, IL

It is a leader in the transport industry, specializing in delivering comprehensive supply chain management solutions. The USF team comprises a network of professionals who specialize in the following services: high-value, regional and national less-than-truckload (LTL) freight transportation

Premium regional and national truckload (TL) freight transportation

Distribution and logistics

Retail and consumer returns processing

I was mostly involved in the Controlling part of Financials, in Sales and Distribution, and partly in the Materials Management module. Since it was a new implementation, they provided the functional and technical specifications in Controlling and wanted me to build the structure for it. They wanted the first rollout to be smooth, so we largely used the Business Content cubes, and enhancements were held back until this phase was completed.

The cube 0OPA_C11 is the costs and allocations cube for internal orders. This InfoCube contains all costs and quantities on internal orders (plan and actual using delta extraction, budget, commitment) that were transferred from the linked source system(s). The InfoCube also contains the extended partner information (such as partner cost center with master data) for the allocation relationships. The main key figures are amount and quantity. There are no delta updates here. The InfoObjects used for loading master data are Version, Cost Element, Partner, Valuation, Currency Type, and Order Number.

The available data sources are:

0CO_OM_OPA_1: Internal Orders: Costs and Allocations

0CO_OM_OPA_2: Internal Orders: Budget

0CO_OM_OPA_3: Internal Orders: Overall/Annual Plan Values

0CO_OM_OPA_5: Internal Orders: Accruals

0CO_OM_OPA_6: Internal Orders: Actual Costs Using Delta Extraction

The reports made are:

1. Internal Order Detail (Plan/Actual) - This query is a regular report for the order manager. It provides an overview of the costs incurred on an internal order during the reporting time frame and the current fiscal year. It enables you to make an actual and plan cost analysis, taking cost elements and the corresponding variances into account.

2. Internal Orders Group (Budget/Actual/Commitments) - This query is a regular report for the area manager. It provides an overview of the fixed budget for an internal order group, as well as the incurred commitment values and costs throughout the reporting time frame, and in the current fiscal year. It enables you to analyze these values with regard to fiscal years. The system calculates the remaining available budget using these values.

3. Internal Orders (Group): Overall Plan/Actual/Commitments - This query is a regular report for the area manager. It provides an overview of the total planned costs incurred on an internal order group, as well as the incurred commitment values and actual costs throughout the reporting time frame, and in the current fiscal year. The system uses these values to calculate the allotted and available funds.

The second cube, 0OPA_C02, is the Statistical Key Figures cube of Internal Orders.

This InfoCube contains all the statistical key figures posted to internal orders that were transferred from the linked source system(s). The category of the statistical key figure determines whether the system updates quantities or inventory quantities for the statistical key figure. The system fills the 0QUANT_AVRG InfoObject for non-cumulative values, such as "Number of Employees"; for cumulative values such as "Telephone Units", the system fills the 0QUANTITY InfoObject.

The DataSource used is 0CO_OM_CCA_4

The reports made are:

1. Internal Orders Group: Statistical Key Figures (Plan/Actual) - This query is a regular report for the area manager. It provides an overview of the statistical key figures entered in an internal order group during the reporting timeframe and the current fiscal year.

2. Internal Orders List: Statistical Key Figures (Plan/Actual) - This query is a regular report for the order manager. It provides an overview of the statistical key figures entered on an order during the reporting time frame and in the current fiscal year.

Cost Center Accounting:

I also dealt with the 0CCA_C11 cube, which is the costs and allocations cube in Cost Center Accounting, and the reports made are:

0CCA_C11_Q0052: Cost Center (Range): Actual Costs - Periods

0CCA_C11_Q0051: Cost Center (Range): Actual Costs - Quarterly

0CCA_C11_Q0050: Cost Center (Range): Actual Costs - Situation

0CCA_C01_Q0005: Cost Center (Range): Plan/Actual - Periods

0CCA_C11_Q0061: Cost Center (Range): Plan/Actual - Quarterly

0CCA_C11_Q0060: Cost Center (Range): Plan/Actual - Situation

and the InfoSources are:

0CO_OM_CCA_9: Cost Centers: Actual Costs With Delta Extraction

0CO_OM_CCA_1: Cost Centers: Costs and Allocations

I also worked with the 0CCA_C03 cube, which is the statistical key figures cube in Cost Center Accounting, and the reports made are:

0CCA_C03_Q0001 Cost Center (Range): Key Figures Plan/Actual Situation

0CCA_C03_Q0002: Cost Center: Analysis - Statistical Key Figures - Plan/Actual

and the InfoSources are:

0CO_OM_CCA_4: Cost Centers: Statistical Key Figures

Brown-Forman Beverages, Louisville, KY

Brown-Forman Corporation, one of the largest American-owned companies in the wine and spirits business, is a diversified producer and marketer of fine quality consumer products.

Through Brown-Forman Beverages Worldwide, Brown-Forman produces and markets many of the most well-known and best-loved wines and spirits in the world. They implemented Business Information Warehouse for the Controlling and Profitability Analysis of the company's products. The 0COPA_C03 cube is the CO-PA quickstart cube. This InfoCube allows you to perform general, industry-independent results analysis. The InfoCube is compatible with the operating concern template Quickstart S_GO, which is delivered in the R/3 System from Release 4.6A and which you can use for testing and demonstration purposes.

You access the environment for operating concern templates in the R/3 System by choosing Structures -> Define Operating Concern -> Operating Concern Templates -> Use SAP Operating Concern Template in Customizing for CO-PA. From this environment, you can access the functions available for the template and detailed information about these functions.

In the template environment, you can generate test data (planning and actual data) for the template and analyze this data using the queries for this InfoCube. If your Profitability Analysis is a new installation, you can copy the operating concern template S_GO for use in the live system and extract your live data into this InfoCube. You can then use the queries delivered for this InfoCube to analyze your live data.

The Reports made are:

Incoming Sales Orders in Current Period / Enterprise

Contribution Margin 1 / Sales / Actual

Sales Revenue / Actual / Plan

Sales Volume Sales Revenue / Sales / IO / Period Comparison

Differences between BW 3.x, BW 7.0, and BW 7.3:

In BW 3.x, an InfoPackage loads all the way to the target (e.g., the cube); in BW 7.0, an InfoPackage loads only to the PSA, and the data must be loaded onward via a DTP. BW 7.3 adds a migration tool to aid the migration process, which was not there in earlier versions, where each object had to be migrated manually.

In BW 3.x, when creating transfer rules, mentioning target and source was not necessary; in BW 7.0, when creating transformations, it is mandatory to mention source and target.

Authorization in BW 3.x follows the roles concept; BW 7.0 uses analysis authorizations.

Other changes in BW 7.0:

In InfoSets you can now include InfoCubes as well.

The Administrator Workbench has been renamed the Data Warehousing Workbench.

The Remodeling transaction helps you add new key figures and characteristics while handling historical data without much hassle; this applies only to InfoCubes.

There are additional modeling options in the left-hand panel of transaction RSA1; the additions are 'DataSources' and 'Open Hub Destination'.

The BI Accelerator (for now, only for InfoCubes) helps reduce query run time by almost a factor of 10 to 100. The BI Accelerator is a separate box and costs more; vendors are HP and IBM.

The functions of the InfoPackage tree are fully covered by process chains, and process chains are used instead of the event collector.

Monitoring has been improved with a new portal-based cockpit, which means you need an Enterprise Portal resource on the project to implement the portal.

Search functionality has improved: you can search for any object, unlike in 3.5.

Transformations replace the transfer and update rules, and routines are passe, though you can always revert to the old transactions.

ODS objects have been renamed DataStore objects, with functional enhancements: a new type of DataStore object and enhanced settings for performance optimization of DataStore objects.

You can access objects, down to the InfoPackage, directly from the InfoCube.

The Data Source:

There is a new object concept for the Data Source.

Options for direct access to data have been enhanced.

From BI, remote activation of Data Sources is possible in SAP source systems.

There are functional changes to the Persistent Staging Area (PSA).

BI supports real-time data acquisition.

Introduction of "end routine" and "Expert Routine"

Unification of Transfer and Update rules

New features in BODS 4.0:

64-bit job server

Text data processing

Access text data in log files, spreadsheets, surveys, comment fields, and similar places. Sort through the noise of unstructured content by automatically identifying what your text content is about: who, what, where, when, how much, etc.

Native text data processing on the Data Services platform with the Entity Extraction transform to extract: predefined entities (company, person, firm, city, country, ...), sentiment analysis (e.g., strong positive, weak positive, neutral, weak negative, strong negative), and custom entities (customized via dictionaries). Languages supported in version 4.0: English, German, French, Spanish, Japanese, and Simplified Chinese.

Enhanced data integration and quality management

Data Quality Management on the Data Services platform includes :

Data cleansing: automatically correct your data based on reference data and data cleansing rules (address cleansing, data cleansing). Data enhancement: enhance the data with additional attributes, e.g. geocoding, adding latitude and longitude information. Match and consolidate: find duplicates and merge records into one consolidated record.

Address cleansing: 240 countries covered, of which 37 have country-specific directories; new in Data Services 4.0 is street-level validation for China. Geocoding: support for the United States, Canada, the UK, France, and Germany; new in Data Services 4.0 are Austria and Switzerland.

It empowers data stewards and domain experts to develop custom data cleansing solutions for any data domain, allowing users to enhance and change regional cleansing packages for person and firm data and to develop new data cleansing solutions for non-party data easily and quickly; it automatically creates the dictionary lists, rules, and patterns in a single repository, which is published to the Data Services Data Cleanse transform.

New security model

A central management console layer is added for security.

This integration offers many new features to Data Services customers: a single location to store all users, passwords, and connections to repositories; advanced user management with password policies and advanced security features; use of external authentication mechanisms like Active Directory, LDAP, or SAP NetWeaver; and management of permissions to control access to repositories and administrative tasks at a more granular level.

Customers already using the BI and EIM solutions from SAP BusinessObjects can now maintain all users and permissions in one central CMS (Central Management Server). Customers who do not have SAP BusinessObjects BI Platform 4.0 can use SAP BusinessObjects Information Platform Services for their Data Services user management. Information Platform Services is a mini-version of the SAP BusinessObjects BI Platform with only the CMS and related services (no reporting or dashboard engines); it offers all the platform functionality, such as user management, authentication, authorization, scheduling, and job processing, and it is available to all Data Services customers from their Data Services download.

New in BOBJ 4.0:

1) Universe Designer changed to "Information Design Tool"

2) Xcelsius changed to "Dashboard Design"

3) InfoView changed to "BI Launch Pad"

4) WebI changed to "Interactive Analysis Tool"

5) Voyager changed to "Advanced Analysis Tool"

(Please note that the product names might change again at launch.)

Some notable changes in the new product:

1) Universe design has changed a lot.

2) Xcelsius can report from a universe.

3) You can report directly from BEx; there is no need to have a universe.

4) The Interactive Analysis tool (i.e., WebI) looks dazzling; however, I haven't seen a fundamental change to the product.

5) There are a few new options in the CMC (like all the auditing settings grouped together in one service); again, no fundamental change.

6) You can create a Crystal Report against the enterprise version of the product through a universe only; another version of the product in 4.0 can connect to different systems.