
Y:\Testing\Data Warehouse\DW - Cognos\GL Datamart\Test Mgmt\Test Strategy - Cognos.doc

Saved Date: 6/8/05 8:28 AM

Page 1 of 3

DATA WAREHOUSE TESTING METHODOLOGY

INTRODUCTION

Due to the nature of the Data Warehouse project, testing the validity of each data mart requires an approach different from that used for a transactional system (e.g., Oracle Applications). Because the data warehouse and the source system are designed to perform different functions, their table structures differ greatly. The main difficulty in testing is validating query results between the systems. Not all of the data in the source system is loaded into the warehouse, and the data that is loaded may be transformed. Comparisons between the systems are therefore difficult, and troubleshooting becomes extremely complex when trying to identify points of failure. The testing method introduced in this plan is designed to streamline the process, making it easier to pinpoint problems, reducing confusion for the testers, and expediting the testing phases.

TESTING PHASES

Data mart testing will be divided into three distinct phases: Unit Testing, Conference Room Pilot (CRP), and System Integration Testing. These phases are designed to test the completeness, correctness, and performance of each data mart.

Unit Testing

o Back-End

? Extract, Transform, Load (ETL): The DW Team developers will create validation scripts for each dimension and fact table within the DW.

? Fact Tables – Validation scripts are written to compare record counts between the source system(s) and the DW for each significant measure. Additionally, for monetary measures, summarizations should be performed to verify that amounts match between the source system(s) and the DW.

For example, in the AR fact table:

a) Validate that the record count for FY2003-FY2004 matches between Oracle and the DW for all miscellaneous receipts.

b) Validate that for APR-FY2004, all miscellaneous receipts in Oracle exist in the DW.

c) Validate that the sum of all miscellaneous receipts in the DW matches the sum of all miscellaneous receipts in Oracle.
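As an illustration, a validation script along these lines could reconcile both the record count and the monetary sum for a given fiscal-year range in one pass. The sketch below uses an in-memory SQLite database as a stand-in for Oracle and the DW; the table and column names (src_receipts, dw_ar_fact, amount) are placeholders, not the actual GL datamart schema.

```python
import sqlite3

# Stand-in tables for the source system and the DW fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_receipts (receipt_id INTEGER, fiscal_year TEXT, amount REAL)")
cur.execute("CREATE TABLE dw_ar_fact   (receipt_id INTEGER, fiscal_year TEXT, amount REAL)")
rows = [(1, "FY2004", 100.0), (2, "FY2004", 250.5), (3, "FY2003", 75.25)]
cur.executemany("INSERT INTO src_receipts VALUES (?, ?, ?)", rows)
cur.executemany("INSERT INTO dw_ar_fact VALUES (?, ?, ?)", rows)

def reconcile(table):
    # Count and sum the measure for the fiscal years under test.
    cur.execute(f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {table} "
                "WHERE fiscal_year IN ('FY2003', 'FY2004')")
    return cur.fetchone()

src_count, src_sum = reconcile("src_receipts")
dw_count, dw_sum = reconcile("dw_ar_fact")
print(src_count == dw_count and src_sum == dw_sum)
```

A mismatch in either figure points the developer at the fiscal-year range to investigate, rather than at the whole load.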

Furthermore, business rules that have been identified during requirements gathering should also be validated.

For example, in the AP and PO tables:

a) Can a PO agent be inactive in the HR table but still active in the PO agent table?


b) Are there any payments being pushed to the DW with no invoice associated with them?

c) Does the PO distribution amount calculate correctly?

d) If the invoice status is CANCELLED, should the sum of all payment amounts equal 0?

e) Are there any translation or mapping errors? For example, ATTRIBUTE9 becomes TRAVELER_NAME in the DW.
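As an illustration of rule b) above, an orphan-payment check can be written as an anti-join: select DW payments whose invoice key has no match in the invoice table. The sketch below uses in-memory SQLite with invented table and column names; the real check would run against the DW schema.

```python
import sqlite3

# Stand-in DW tables; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dw_invoices (invoice_id INTEGER PRIMARY KEY, status TEXT)")
cur.execute("CREATE TABLE dw_payments (payment_id INTEGER, invoice_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO dw_invoices VALUES (?, ?)", [(10, "APPROVED"), (11, "CANCELLED")])
cur.executemany("INSERT INTO dw_payments VALUES (?, ?, ?)",
                [(1, 10, 500.0), (2, 99, 42.0)])  # payment 2 references no invoice

# Anti-join: payments whose invoice_id finds no matching invoice row.
cur.execute("""
    SELECT p.payment_id
    FROM dw_payments p
    LEFT JOIN dw_invoices i ON i.invoice_id = p.invoice_id
    WHERE i.invoice_id IS NULL
""")
orphans = [row[0] for row in cur.fetchall()]
print(orphans)  # a non-empty list indicates a business-rule violation
```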

? Dimension Tables – Validation for dimension tables is handled similarly; however, due to the nature of the records, there are no sums or measures to validate. Developers will validate by matching record counts between the source and DW systems. Additionally, there may be some business rules that apply to dimensions, filtering out records based on criteria such as status, type, etc. Queries are run by the developers to validate that these business rules have been met.
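As a sketch of this kind of dimension check: when the load filters records by a status rule, the DW row count should equal the count of source rows that pass the filter. The example below assumes a hypothetical vendor dimension loaded only with ACTIVE vendors; all names are placeholders.

```python
import sqlite3

# Stand-in source and DW dimension tables; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_vendors (vendor_id INTEGER, status TEXT)")
cur.execute("CREATE TABLE dw_vendor_dim (vendor_id INTEGER, status TEXT)")
cur.executemany("INSERT INTO src_vendors VALUES (?, ?)",
                [(1, "ACTIVE"), (2, "INACTIVE"), (3, "ACTIVE")])
# Assumed load rule: only ACTIVE vendors reach the DW dimension.
cur.executemany("INSERT INTO dw_vendor_dim VALUES (?, ?)",
                [(1, "ACTIVE"), (3, "ACTIVE")])

cur.execute("SELECT COUNT(*) FROM src_vendors WHERE status = 'ACTIVE'")
expected = cur.fetchone()[0]
cur.execute("SELECT COUNT(*) FROM dw_vendor_dim")
actual = cur.fetchone()[0]
print(expected == actual)
```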

o Front-End

? Report Net vs. Oracle

o Developers will validate the Report Net reports by running similar SQL queries against the source system(s). ATC Analysts will also validate the reports, ensuring that the data is correct by comparing the Report Net reports to similar source system reports or by querying individual transactions and forms. Additionally, developers and analysts should compare the new reports to the original requirements for complete functionality.

? Power Play vs. Oracle

o Developers will validate the Power Play cubes by running similar SQL queries against the source system(s). ATC Analysts will also validate the cube data, ensuring that the data is correct by comparing the data results to comparable source system reports or by querying individual transactions and forms. Additionally, developers and analysts should compare the cubes to the original requirements for complete functionality.

? Power Play vs. Report Net

o Report Net reports that contain supporting detail for Power Play cubes are also verified by Developers and Analysts. Validation includes checking that the summary records in the cube equal the total of the detail records in the report when queried using the same parameters.
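This summary-to-detail check can be sketched in a few lines: for a given set of parameters, the cube's summary cell should equal the sum of the detail rows in the supporting report. The data structures below stand in for Power Play and Report Net output; the dimension values and amounts are invented for illustration.

```python
# Summary cell from the cube, keyed by (fiscal year, category) - placeholder data.
cube_summary = {("FY2004", "Travel"): 1250.00}

# Detail rows from the supporting Report Net report, same parameters.
report_detail = [("FY2004", "Travel", 500.00),
                 ("FY2004", "Travel", 750.00)]

# Total the detail rows matching the cube cell's parameters.
detail_total = sum(amount for fy, cat, amount in report_detail
                   if (fy, cat) == ("FY2004", "Travel"))
print(detail_total == cube_summary[("FY2004", "Travel")])
```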

CRP

The CRP phase gives users a chance to demo the system using their own data. This phase is designed to:

o Work out any requirements that may have been misunderstood

o Make suggestions that could improve the usability of the system

o Perform data validation by comparing reports they use in their functional areas to the DW reports


o Test system performance

o Identify obvious bugs

System Integration Testing

This testing determines the usability of the information in the DW when using the front-end tools to perform multi-level (transactional and analytical) inquiries.

o Power Play – Cubes

This step tests the validity and usability of the pre-built cubes. Cubes should answer complex analytical questions.

For example:

a) What are the top ten vendors in dollar volume over the last year?

For cubes containing drill-through capability, testers should use a valid business scenario that begins with a ‘high-level’ business question, drilling through into further detail as each question is answered.

o Report Net

? Pre-developed Reports (Viewer Reports)

This step tests the validity of canned reports that have been developed to provide easy access to commonly requested information. Testers should use a variety of parameters when testing the reports. Data returned should produce meaningful results. When available, testers should use existing source-system reports to compare against the DW results.

? Ad-Hoc Queries (Query Studio)

This step tests the relationship between the tables (i.e., the validity of the results returned when data from one table is combined with data from another). It is strictly scenario driven and should answer a valid business question. Original user requirements should be the basis for the scenarios that are tested. This testing should be performed by users who have strong business knowledge about how the data is related.

For example:

b) How many purchase orders did a particular buyer process in the last month?

c) Which invoices were paid against a particular PTA in the last year?
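As an illustration of scenario b) above, an ad-hoc query exercises the join between a fact table and a dimension. The sketch below counts purchase orders for one buyer in one month, using in-memory SQLite and an invented schema; the real scenario would be built in Query Studio against the DW model.

```python
import sqlite3

# Stand-in buyer dimension and PO fact table; names and schema are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dw_buyer_dim (buyer_id INTEGER, buyer_name TEXT)")
cur.execute("CREATE TABLE dw_po_fact (po_id INTEGER, buyer_id INTEGER, po_month TEXT)")
cur.executemany("INSERT INTO dw_buyer_dim VALUES (?, ?)", [(1, "Smith"), (2, "Jones")])
cur.executemany("INSERT INTO dw_po_fact VALUES (?, ?, ?)",
                [(100, 1, "2004-04"), (101, 1, "2004-04"), (102, 2, "2004-04")])

# Join fact to dimension and count POs for one buyer in one month.
cur.execute("""
    SELECT COUNT(*)
    FROM dw_po_fact f
    JOIN dw_buyer_dim b ON b.buyer_id = f.buyer_id
    WHERE b.buyer_name = 'Smith' AND f.po_month = '2004-04'
""")
po_count = cur.fetchone()[0]
print(po_count)
```

A tester with business knowledge would then confirm the count against the source system or a known report for the same buyer and month.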