Wednesday, September 21, 2011

Migration from BPT 10 to 11: Changes to Data Tables

In standard QTP, data tables are the workhorses for managing test data.  They are used for data driving tests, passing data from one action to another, storing large volumes of test results, and for iterating tests.  In BPT, data tables take a back seat to other means of managing test data, most notably test parameter data.  But even in BPT, there is a place for data tables.  For example, test data can be read from associated Excel files at run time, and table data from the application under test can be stored in the component data table and reviewed in the test logs.

In BPT 10, each component has a local datasheet.  Because each component runs independently (remember the slow execution speed of BPT 10 – one component is loaded, it runs, the component is closed, the next component is loaded, it runs, …), there is no global data sheet.

There is a key benefit in BPT 10 to the way tests are constructed.  In a component, you can create data tables named “Actual” and “Expected.”  If this component is used more than once in a test, you don’t have to worry about name contention – the scope of a data table name is a single instance of a component.  In standard QTP, all data tables are exposed to the entire test and you have to make sure that no data tables share the same name.  In BPT 11, tests share much more in common with standard QTP tests, and data table names must now be unique.

In my migration from BPT 10 to 11, I had to rethink some of my key assumptions about how I use data tables.

  1. Because BPT 11 introduces a global data sheet, I changed from using the DataTable.Import method to DataTable.ImportSheet.  The Import method loads the Excel sheet into the first data table.  In BPT 10, the first data table is a local sheet; in BPT 11, the first data table is the global sheet.

When I ran components migrated from BPT 10 under BPT 11, I was surprised to find that the imported data was not where I expected it to be.  This caused verifications to fail.  I was further surprised when some of my tests ran extra iterations.  They did so because the Import method added rows to the global data sheet, and the global data sheet determines the number of iterations a test runs.
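One way to see why the extra iterations appeared is to report the global sheet’s row count right after the import.  This is a minimal sketch using the standard QTP DataTable and Reporter objects; strQCExcelSheet is the workbook name from my code above:

' In BPT 11, Import loads the workbook into the first data table – the global sheet
DataTable.Import GetCurrentTestPath() & "\" & strQCExcelSheet
' Each row in the global sheet becomes a test iteration when the test
' is configured to run on all rows, so imported rows add iterations
Reporter.ReportEvent micDone, "Global sheet rows", DataTable.GlobalSheet.GetRowCount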

ImportSheet enables you to specify which sheet to import and where to place it.  When using ImportSheet, you have to know the name of the data sheet you want to overwrite – if your test iterates, this must be a unique name.

  2. Because the scope of data sheet names is no longer local to business components, you must manage sheet names and make sure they are unique.  As you do this, consider how you will analyze test results.  If your test has 10 iterations, there may be 10 instances of the “Expected_Result” sheet, and ten sheets with similar names are difficult to tell apart.  Getting a component’s iteration number at run time is difficult, so I chose to make sheet names unique by appending part of a time stamp.  For clarity, I also added a line to the log file identifying which data sheet goes with which iteration.
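The naming-plus-logging scheme above can be sketched as follows.  This is illustrative, not my production code; the “Expected_Result_” prefix is a placeholder, and the calls are the standard QTP DataTable.AddSheet and Reporter.ReportEvent methods:

' Build a sheet name that is unique across iterations of the same component
strSheetName = "Expected_Result_" & Minute(Now) & Second(Now)
' Create the uniquely named sheet for this iteration's results
DataTable.AddSheet strSheetName
' Log which sheet belongs to this iteration so results are easy to trace
Reporter.ReportEvent micDone, "Result sheet", "Iteration results stored in " & strSheetName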

Here is a summary of the changes I made:
BPT 10:
DataTable.Import GetCurrentTestPath() & "\" & strQCExcelSheet

BPT 11:
strUniqueSheetName = "Reference_" & Minute(Now) & Second(Now)
Reporter.ReportEvent micDone, strUniqueSheetName, "Look for reference data at " & strUniqueSheetName
DataTable.ImportSheet GetCurrentTestPath() & "\" & strQCExcelSheet, 1, strUniqueSheetName

Generally, the migration from BPT 10 to 11 did not require changes to the automation code.  Underlying BPT is QTP, and the QTP code did not change.  The change in how BPT handles data tables may go unnoticed in many implementations, depending on which features of BPT are used.  For those who push the limits of how BPT handles data, this undocumented change to data tables will be a stumbling block.  I suppose there is a price to pay to get BPT to perform at an acceptable speed.  I’ll pay this one.