Oracle Database Testing

Automation Testing of Oracle DB Data

Which Oracle database use cases can be covered with ETL Validator?

Oracle DB for Application Development
Oracle Datawarehouse
Oracle DB as Source/Target during Data Migration
Oracle DB integrated to Cloud-Based Applications
Oracle DB integrated to On-Premise DBs


What is Oracle Database Testing?

Oracle database testing is the process of validating the quality, integrity and completeness of data in the database, as well as ensuring that the data has been consumed appropriately by the application or ETL process.

What are the challenges in Oracle Database Testing?

   Building robust and comprehensive test cases before signing off on any changes.
   Data transfer is usually not between similarly structured tables/databases.
   Writing custom SQL queries to verify data completeness, integrity and quality.
   An easy-to-read GUI is needed so that no gaps or differences are missed.
   Recording test results properly so that issues are addressed systematically.
   Every project must undergo thorough testing with a consistent set of use cases.
 

What kind of testing does ETL Validator do?

   Oracle Database for Application Development


Data Accuracy Checking

In order to ensure that the most accurate data is available for consumption, data accuracy checks need to be performed on the existing database.
   Rows and Columns validation
Columns need to be validated as per the data model; rows need to be validated against source records, business rules, etc.
Example: Manual data entry at the application level sometimes does not adhere to business rules, which can lead to data inconsistencies.
   Check for blank columns, occurrence of blank data, frequency of same data
Running frequent sanity checks on the database is important so that data hygiene is maintained at all times.
Example: With manual data entry or data imports from legacy systems, data rules are sometimes overridden to push through as much data as possible. See the sketch below.
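As a minimal sketch of such a sanity check, assuming a hypothetical CUSTOMERS table with a PHONE column, blank values and suspicious repetition can be counted like this:

    -- Count rows where PHONE is missing or blank
    -- (in Oracle, TRIM of an all-space string yields NULL)
    SELECT COUNT(*) AS blank_phone_count
    FROM customers
    WHERE phone IS NULL OR TRIM(phone) IS NULL;

    -- Spot suspicious repetition: the same phone number on many records
    SELECT phone, COUNT(*) AS occurrences
    FROM customers
    GROUP BY phone
    HAVING COUNT(*) > 10
    ORDER BY occurrences DESC;

The threshold of 10 is illustrative; a reasonable value depends on the business context.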

Metadata Checking

Metadata testing is needed to verify that the characteristics of the data in the tables are as expected.
   Data Type Check
Verify that the type and format of the data in the database tables matches the specified data type.
Example: Date, timestamp and time data types should have values in a specific format so that they can be parsed by the consuming process.
   Data Length Check
Length of string and number data values in the tables should match the minimum and maximum allowed length for those columns.
Example: If the country column is limited to 2 characters but a value in the table is longer, that is an anomaly.
   Not Null Check
Verify that all required data elements in the tables are populated.
Example: Date of Birth is a required data element but some of the records might be missing this value.
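These three checks can be scripted against Oracle's data dictionary and the table itself. A minimal sketch, assuming a hypothetical CUSTOMERS table with COUNTRY_CODE and DATE_OF_BIRTH columns:

    -- Data type and length metadata for the table (Oracle dictionary view)
    SELECT column_name, data_type, data_length, nullable
    FROM user_tab_columns
    WHERE table_name = 'CUSTOMERS';

    -- Data length check: values longer than the allowed 2 characters
    SELECT COUNT(*) AS too_long
    FROM customers
    WHERE LENGTH(country_code) > 2;

    -- Not null check: required column left unpopulated
    SELECT COUNT(*) AS missing_dob
    FROM customers
    WHERE date_of_birth IS NULL;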

Data Quality Checking

The purpose of Data Quality tests is to verify the accuracy of the data in the Oracle tables.
   Duplicate Data Checks
Check for duplicate records in the tables with the same unique key column or a unique combination of columns, as per business requirements.
Example: A business requirement says that the combination of First Name, Last Name, Middle Name and Date of Birth should be unique for a Customer record. Duplicates can be identified by using these criteria in a SQL query, as sketched below.
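A minimal sketch of that duplicate check, assuming a hypothetical CUSTOMERS table with the four columns named in the requirement:

    -- Name + date of birth combinations that occur more than once
    SELECT first_name, last_name, middle_name, date_of_birth,
           COUNT(*) AS duplicate_count
    FROM customers
    GROUP BY first_name, last_name, middle_name, date_of_birth
    HAVING COUNT(*) > 1;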

   Reference Data Checks
Business rules may dictate that the values in certain columns should adhere to a list of values. Verify that the values in the records conform to reference data standards.
Example: Values in the country_code column should have a valid country code from a Country Code domain.
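For example, assuming a hypothetical COUNTRY_CODES reference table, non-conforming values can be listed with an anti-join:

    -- country_code values that fall outside the reference domain
    SELECT DISTINCT c.country_code
    FROM customers c
    WHERE c.country_code IS NOT NULL
      AND NOT EXISTS (SELECT 1
                      FROM country_codes r
                      WHERE r.code = c.country_code);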
   Data Validation Rules
Many data fields can take a range of values that cannot be enumerated. However, reasonable constraints or rules can detect situations where the data is clearly wrong.
Example: A date of birth (DOB) field defined with the DATE datatype can hold any valid date. However, a DOB in the future, or more than 100 years in the past, is probably invalid. A sketch of this rule follows.
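A sketch of that rule as a SQL check (the table and column names are assumptions):

    -- Dates of birth in the future or more than 100 years in the past
    SELECT COUNT(*) AS suspect_dob
    FROM customers
    WHERE date_of_birth > SYSDATE
       OR date_of_birth < ADD_MONTHS(SYSDATE, -12 * 100);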

Data Integrity Checking

This check addresses "keyed" relationships between entities within a domain. The goal is to identify orphan records in the child entity that carry a foreign key to the parent entity.
   Null foreign key values
Null values in a foreign key column should not be allowed in order to maintain referential integrity, but that rule might have been overridden during legacy data imports.
Example: Find the count of records with null foreign key values in the tables, as in the sketch below.
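For instance, assuming a hypothetical ORDERS table whose CUSTOMER_ID column references CUSTOMERS:

    -- Child records whose foreign key was left null
    SELECT COUNT(*) AS null_fk_count
    FROM orders
    WHERE customer_id IS NULL;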
   Invalid foreign key values
All foreign key values in a table that do not have a corresponding primary key in the parent table should be addressed.
Example: Find the gaps in referential integrity by counting (1) null or unspecified dimension keys in a fact table and (2) invalid foreign key values in a contact list, as sketched below.
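A minimal sketch of both counts; SALES_FACT, CONTACTS, CUSTOMERS and the -1 'unspecified' key value are assumptions:

    -- 1. Null or unspecified dimension keys in a fact table
    SELECT COUNT(*) AS bad_dim_keys
    FROM sales_fact
    WHERE customer_key IS NULL OR customer_key = -1;

    -- 2. Orphan foreign key values in the contact list
    SELECT COUNT(*) AS orphan_contacts
    FROM contacts c
    WHERE c.customer_id IS NOT NULL
      AND NOT EXISTS (SELECT 1
                      FROM customers p
                      WHERE p.customer_id = c.customer_id);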

   Oracle Datawarehouse


Datawarehouse Data Completeness Testing

The purpose of Data Completeness tests is to verify that all the expected data is loaded in the tables from the source.
   Record Count Validation
Compare the record counts of the source data and the data warehouse tables. Check for any rejected records.
Example: A simple count of records comparison between the source data and target tables can be done using SQL queries.
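A minimal comparison, with src_customers and dw_customers as placeholder names for the source table and its warehouse counterpart:

    -- Record counts side by side; a mismatch points to rejected or missing rows
    SELECT (SELECT COUNT(*) FROM src_customers) AS source_count,
           (SELECT COUNT(*) FROM dw_customers)  AS target_count
    FROM dual;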
   Column Data Profile Validation
Column or attribute level data profiling is an effective tool to compare source and target data without actually comparing the entire data. It is similar to comparing the checksum of your source and target data.

Example: Compare unique values in a column between the source and target.
Compare max, min, avg, max length and min length values for columns, depending on the data type.
Compare null values in a column between the source and target.
For important columns, compare the data distribution (frequency) of values between the source and target. See the profiling sketch below.
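The same profile query can be run on both sides and the results diffed. A sketch over assumed column names (one string column, one numeric column):

    -- Column profile for the source; run the identical query against the target
    SELECT COUNT(*)                     AS row_count,
           COUNT(DISTINCT country_code) AS distinct_countries,
           SUM(CASE WHEN country_code IS NULL THEN 1 ELSE 0 END) AS null_countries,
           MIN(LENGTH(country_code))    AS min_len,
           MAX(LENGTH(country_code))    AS max_len,
           MIN(order_total)             AS min_total,
           MAX(order_total)             AS max_total,
           AVG(order_total)             AS avg_total
    FROM src_orders;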
   Compare entire source and target data
Compare data (values) between the source and target data effectively by validating 100% of the data.
Example: Write a source query on the table that matches the data in the target table after transformation.
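When both tables are reachable from one Oracle session (for example over a database link), a full compare can be sketched with set operators; the table and column names are placeholders:

    -- Rows that differ in either direction between source and target
    (SELECT customer_id, first_name, last_name FROM src_customers
     MINUS
     SELECT customer_id, first_name, last_name FROM dw_customers)
    UNION ALL
    (SELECT customer_id, first_name, last_name FROM dw_customers
     MINUS
     SELECT customer_id, first_name, last_name FROM src_customers);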

Datawarehouse Data Transformation Testing

Data from the source is transformed by the consuming process and loaded into the target. It is important to test the transformed data to ensure it has propagated correctly.
   White Box approach
In this technique, the program structure is examined and test cases are derived from the program logic/code.
ETL Validator can create these transformation test cases using data stored in its repository tables.
Example: In order to compute cumulative sales per customer, the ETL transformation logic aggregates individual sales records from the source data and loads them into target tables. This can be easily tested with the 'Component Test Case' of ETL Validator; a SQL sketch follows.
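The expected aggregation can be recomputed from the source and diffed against the loaded target; src_sales and dw_customer_sales are assumed names:

    -- Recompute cumulative sales per customer and diff against the target
    SELECT customer_id, SUM(sale_amount) AS total_sales
    FROM src_sales
    GROUP BY customer_id
    MINUS
    SELECT customer_id, total_sales
    FROM dw_customer_sales;

An empty result means the target matches the recomputed aggregation.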
   Black Box approach
In this method, the functionality of the application is examined without delving into its internal structures or workings. This involves reviewing the transformation logic in the mapping design document and setting up the test cases appropriately.

ETL Validator can create these transformation test cases using data stored in its repository tables.

Example: In a financial company, the interest earned on a savings account depends on the daily balance in the account for the month.
Compare the transformed data in the target table with the expected values, as sketched below.
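As a black-box sketch, the expected interest can be recomputed from an assumed DAILY_BALANCES table and diffed against the loaded MONTHLY_INTEREST table; the 2% annual rate and the month window are illustrative:

    -- Expected monthly interest from the average daily balance
    SELECT account_id,
           ROUND(AVG(daily_balance) * 0.02 / 12, 2) AS expected_interest
    FROM daily_balances
    WHERE balance_date BETWEEN DATE '2024-01-01' AND DATE '2024-01-31'
    GROUP BY account_id
    MINUS
    SELECT account_id, interest_earned
    FROM monthly_interest;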

Datawarehouse Data Accuracy Checking

In order to ensure that the most accurate data is available for consumption, data accuracy checks need to be performed on the existing database.
   Rows and Columns validation
Columns need to be validated as per the data model; rows need to be validated against source records, business rules, etc.
Example: Manual data entry at the application level sometimes does not adhere to business rules, which can lead to data inconsistencies.
   Check for blank columns, occurrence of blank data, frequency of same data
Running frequent sanity checks on the database is important so that data hygiene is maintained at all times.
Example: With manual data entry or data imports from legacy systems, data rules are sometimes overridden to push through as much data as possible.

Datawarehouse Metadata Checking

Metadata testing is needed to verify that the characteristics of the data in the tables are as expected.
   Data Type Check
Verify that the type and format of the data in the database tables matches the specified data type.
Example: Date, timestamp and time data types should have values in a specific format so that they can be parsed by the consuming process.
   Data Length Check
Length of string and number data values in the tables should match the minimum and maximum allowed length for those columns.
Example: If the country column is limited to 2 characters but a value in the table is longer, that is an anomaly.
   Not Null Check
Verify that all required data elements in the tables are populated.
Example: Date of Birth is a required data element but some of the records might be missing this value.

Datawarehouse Data Quality Checking

The purpose of Data Quality tests is to verify the accuracy of the data in the Oracle tables.
   Duplicate Data Checks
Check for duplicate records in the tables with the same unique key column or a unique combination of columns, as per business requirements.
Example: A business requirement says that the combination of First Name, Last Name, Middle Name and Date of Birth should be unique for a Customer record. Duplicates can be identified by using these criteria in a SQL query.

   Reference Data Checks
Business rules may dictate that the values in certain columns should adhere to a list of values. Verify that the values in the records conform to reference data standards.
Example: Values in the country_code column should have a valid country code from a Country Code domain.
   Data Validation Rules
Many data fields can take a range of values that cannot be enumerated. However, reasonable constraints or rules can detect situations where the data is clearly wrong.
Example: A date of birth (DOB) field defined with the DATE datatype can hold any valid date. However, a DOB in the future, or more than 100 years in the past, is probably invalid.

Datawarehouse Data Integrity Checking

This check addresses "keyed" relationships between entities within a domain. The goal is to identify orphan records in the child entity that carry a foreign key to the parent entity.
   Null foreign key values
Null values in a foreign key column should not be allowed in order to maintain referential integrity, but that rule might have been overridden during legacy data imports.
Example: Find the count of records with null foreign key values in the tables.
   Invalid foreign key values
All foreign key values in a table that do not have a corresponding primary key in the parent table should be addressed.
Example: Find the gaps in referential integrity by counting (1) null or unspecified dimension keys in a fact table and (2) invalid foreign key values in a contact list.

   Oracle Database as Source/Target during Data Migration


Post ETL Data Completeness Testing

The purpose of Data Completeness tests is to verify that all the expected data is loaded in the tables from the source.
   Record Count Validation
Compare the record counts of the source data and the data warehouse tables. Check for any rejected records.
Example: A simple count of records comparison between the source data and target tables can be done using SQL queries.
   Column Data Profile Validation
Column or attribute level data profiling is an effective tool to compare source and target data without actually comparing the entire data. It is similar to comparing the checksum of your source and target data.

Example: Compare unique values in a column between the source and target.
Compare max, min, avg, max length and min length values for columns, depending on the data type.
Compare null values in a column between the source and target.
For important columns, compare the data distribution (frequency) of values between the source and target.
   Compare entire source and target data
Compare data (values) between the source and target data effectively by validating 100% of the data.
Example: Write a source query on the table that matches the data in the target table after transformation.

Post ETL Data Transformation Testing

Data from the source is transformed by the consuming process and loaded into the target. It is important to test the transformed data to ensure it has propagated correctly.
   White Box approach
In this technique, the program structure is examined and test cases are derived from the program logic/code.
ETL Validator can create these transformation test cases using data stored in its repository tables.
Example: In order to compute cumulative sales per customer, the ETL transformation logic aggregates individual sales records from the source data and loads them into target tables. This can be easily tested with the 'Component Test Case' of ETL Validator.
   Black Box approach
In this method, the functionality of the application is examined without delving into its internal structures or workings. This involves reviewing the transformation logic in the mapping design document and setting up the test cases appropriately.

ETL Validator can create these transformation test cases using data stored in its repository tables.

Example: In a financial company, the interest earned on a savings account depends on the daily balance in the account for the month.
Compare the transformed data in the target table with the expected values.

Post ETL Data Accuracy Checking

In order to ensure that the most accurate data is available for consumption, data accuracy checks need to be performed on the existing database.
   Rows and Columns validation
Columns need to be validated as per the data model; rows need to be validated against source records, business rules, etc.
Example: Manual data entry at the application level sometimes does not adhere to business rules, which can lead to data inconsistencies.
   Check for blank columns, occurrence of blank data, frequency of same data
Running frequent sanity checks on the database is important so that data hygiene is maintained at all times.
Example: With manual data entry or data imports from legacy systems, data rules are sometimes overridden to push through as much data as possible.

Post ETL Metadata Checking

Metadata testing is needed to verify that the characteristics of the data in the tables are as expected.
   Data Type Check
Verify that the type and format of the data in the database tables matches the specified data type.
Example: Date, timestamp and time data types should have values in a specific format so that they can be parsed by the consuming process.
   Data Length Check
Length of string and number data values in the tables should match the minimum and maximum allowed length for those columns.
Example: If the country column is limited to 2 characters but a value in the table is longer, that is an anomaly.
   Not Null Check
Verify that all required data elements in the tables are populated.
Example: Date of Birth is a required data element but some of the records might be missing this value.

Post ETL Data Quality Checking

The purpose of Data Quality tests is to verify the accuracy of the data in the Oracle tables.
   Duplicate Data Checks
Check for duplicate records in the tables with the same unique key column or a unique combination of columns, as per business requirements.
Example: A business requirement says that the combination of First Name, Last Name, Middle Name and Date of Birth should be unique for a Customer record. Duplicates can be identified by using these criteria in a SQL query.

   Reference Data Checks
Business rules may dictate that the values in certain columns should adhere to a list of values. Verify that the values in the records conform to reference data standards.
Example: Values in the country_code column should have a valid country code from a Country Code domain.
   Data Validation Rules
Many data fields can take a range of values that cannot be enumerated. However, reasonable constraints or rules can detect situations where the data is clearly wrong.
Example: A date of birth (DOB) field defined with the DATE datatype can hold any valid date. However, a DOB in the future, or more than 100 years in the past, is probably invalid.

Post ETL Data Integrity Checking

This check addresses "keyed" relationships between entities within a domain. The goal is to identify orphan records in the child entity that carry a foreign key to the parent entity.
   Null foreign key values
Null values in a foreign key column should not be allowed in order to maintain referential integrity, but that rule might have been overridden during legacy data imports.
Example: Find the count of records with null foreign key values in the tables.
   Invalid foreign key values
All foreign key values in a table that do not have a corresponding primary key in the parent table should be addressed.
Example: Find the gaps in referential integrity by counting (1) null or unspecified dimension keys in a fact table and (2) invalid foreign key values in a contact list.

   Oracle Database Integrated to Cloud Applications


Data comparison between Cloud App and Oracle DB

   JDBC or ODBC Driver: ETL Validator already supports the Simba JDBC driver for Salesforce. Any of the test cases (e.g., Query Compare) can be used to compare data between Oracle and Salesforce. For other JDBC drivers, ETL Validator provides a Generic JDBC driver option.

   REST API / JSON Data: Apache Drill can be used to run SQL queries on JSON data. ETL Validator supports Apache Drill as a data source; a query is sketched below.
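For illustration, a Drill query over an exported JSON file; the file path and field names are assumptions:

    -- Query a JSON extract with Apache Drill SQL; dfs points at the local filesystem
    SELECT t.Id, t.Name, t.BillingCountry
    FROM dfs.`/data/salesforce/accounts.json` t
    WHERE t.BillingCountry = 'US';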

   SOAP Services: ETL Validator supports SOAP services as a data source. It also has a Web Service test plan for extracting data from SOAP services for testing and for comparison with data in the Oracle database.

Data Access from Cloud App for testing

   JDBC or ODBC Driver: Salesforce, for example, has JDBC/ODBC drivers from Simba, CData and Progress.

   REST API: Almost all cloud applications provide a REST API for accessing their data; Salesforce is one example.

   SOAP Services: Some cloud applications provide access to their data only through SOAP web services; Workday is one example.

Cloud App Data Completeness Testing

The purpose of Data Completeness tests is to verify that all the expected data is loaded in the tables from the source.
   Record Count Validation
Compare the record counts of the source data and the data warehouse tables. Check for any rejected records.
Example: A simple count of records comparison between the source data and target tables can be done using SQL queries.
   Column Data Profile Validation
Column or attribute level data profiling is an effective tool to compare source and target data without actually comparing the entire data. It is similar to comparing the checksum of your source and target data.

Example: Compare unique values in a column between the source and target.
Compare max, min, avg, max length and min length values for columns, depending on the data type.
Compare null values in a column between the source and target.
For important columns, compare the data distribution (frequency) of values between the source and target.
   Compare entire source and target data
Compare data (values) between the source and target data effectively by validating 100% of the data.
Example: Write a source query on the table that matches the data in the target table after transformation.

Cloud App Data Transformation Testing

Data from the source is transformed by the consuming process and loaded into the target. It is important to test the transformed data to ensure it has propagated correctly.
   White Box approach
In this technique, the program structure is examined and test cases are derived from the program logic/code.
ETL Validator can create these transformation test cases using data stored in its repository tables.
Example: In the integration of Oracle ERP and Salesforce.com, many transformations take place because the data models are very different. This can be easily tested with the 'Component Test Case' of ETL Validator.
   Black Box approach
In this method, the functionality of the application is examined without delving into its internal structures or workings. This involves reviewing the transformation logic in the mapping design document and setting up the test cases appropriately.

ETL Validator can create these transformation test cases using data stored in its repository tables.

Example: In a financial company, the interest earned on a savings account depends on the daily balance in the account for the month. Compare the transformed data in the target table with the expected values.

Cloud App Metadata Checking

Metadata testing is needed to verify that the characteristics of the data in the tables are as expected.
   Data Type Check
Verify that the type and format of the data in the database tables matches the specified data type.
Example: Date, timestamp and time data types should have values in a specific format so that they can be parsed by the consuming process.
   Data Length Check
Length of string and number data values in the tables should match the minimum and maximum allowed length for those columns.
Example: If the country column is limited to 2 characters but a value in the table is longer, that is an anomaly.
   Not Null Check
Verify that all required data elements in the tables are populated.
Example: Date of Birth is a required data element but some of the records might be missing this value.

Cloud App Data Integrity Checking

This check addresses "keyed" relationships between entities within a domain. The goal is to identify orphan records in the child entity that carry a foreign key to the parent entity.
   Null foreign key values
Null values in a foreign key column should not be allowed in order to maintain referential integrity, but that rule might have been overridden during legacy data imports.
Example: Find the count of records with null foreign key values in the tables.
   Invalid foreign key values
All foreign key values in a table that do not have a corresponding primary key in the parent table should be addressed.
Example: Find the gaps in referential integrity by counting (1) null or unspecified dimension keys in a fact table and (2) invalid foreign key values in a contact list.

   Oracle Database Integrated to On-Premise Databases like Teradata, HANA, DB2, etc.


On-Premise DB Data Completeness Testing

The purpose of Data Completeness tests is to verify that all the expected data is loaded in the tables from the source.
   Record Count Validation
Compare the record counts of the source data and the data warehouse tables. Check for any rejected records.
Example: A simple count of records comparison between the source data and target tables can be done using SQL queries.
   Column Data Profile Validation
Column or attribute level data profiling is an effective tool to compare source and target data without actually comparing the entire data. It is similar to comparing the checksum of your source and target data.

Example: Compare unique values in a column between the source and target.
Compare max, min, avg, max length and min length values for columns, depending on the data type.
Compare null values in a column between the source and target.
For important columns, compare the data distribution (frequency) of values between the source and target.
   Compare entire source and target data
Compare data (values) between the source and target data effectively by validating 100% of the data.
Example: Write a source query on the table that matches the data in the target table after transformation.

On-Premise DB Data Transformation Testing

Data from the source is transformed by the consuming process and loaded into the target. It is important to test the transformed data to ensure it has propagated correctly.
   White Box approach
In this technique, the program structure is examined and test cases are derived from the program logic/code.
ETL Validator can create these transformation test cases using data stored in its repository tables.
Example: In order to compute cumulative sales per customer, the ETL transformation logic aggregates individual sales records from the source data and loads them into target tables. This can be easily tested with the 'Component Test Case' of ETL Validator.
   Black Box approach
In this method, the functionality of the application is examined without delving into its internal structures or workings. This involves reviewing the transformation logic in the mapping design document and setting up the test cases appropriately.

ETL Validator can create these transformation test cases using data stored in its repository tables.

Example: In a financial company, the interest earned on a savings account depends on the daily balance in the account for the month.
Compare the transformed data in the target table with the expected values.

On-Premise DB Data Accuracy Checking

In order to ensure that the most accurate data is available for consumption, data accuracy checks need to be performed on the existing database.
   Rows and Columns validation
Columns need to be validated as per the data model; rows need to be validated against source records, business rules, etc.
Example: Manual data entry at the application level sometimes does not adhere to business rules, which can lead to data inconsistencies.
   Check for blank columns, occurrence of blank data, frequency of same data
Running frequent sanity checks on the database is important so that data hygiene is maintained at all times.
Example: With manual data entry or data imports from legacy systems, data rules are sometimes overridden to push through as much data as possible.

On-Premise DB Metadata Checking

Metadata testing is needed to verify that the characteristics of the data in the tables are as expected.
   Data Type Check
Verify that the type and format of the data in the database tables matches the specified data type.
Example: Date, timestamp and time data types should have values in a specific format so that they can be parsed by the consuming process.
   Data Length Check
Length of string and number data values in the tables should match the minimum and maximum allowed length for those columns.
Example: If the country column is limited to 2 characters but a value in the table is longer, that is an anomaly.
   Not Null Check
Verify that all required data elements in the tables are populated.
Example: Date of Birth is a required data element but some of the records might be missing this value.

On-Premise DB Data Quality Checking

The purpose of Data Quality tests is to verify the accuracy of the data in the Oracle tables.
   Duplicate Data Checks
Check for duplicate records in the tables with the same unique key column or a unique combination of columns, as per business requirements.
Example: A business requirement says that the combination of First Name, Last Name, Middle Name and Date of Birth should be unique for a Customer record. Duplicates can be identified by using these criteria in a SQL query.

   Reference Data Checks
Business rules may dictate that the values in certain columns should adhere to a list of values. Verify that the values in the records conform to reference data standards.
Example: Values in the country_code column should have a valid country code from a Country Code domain.
   Data Validation Rules
Many data fields can take a range of values that cannot be enumerated. However, reasonable constraints or rules can detect situations where the data is clearly wrong.
Example: A date of birth (DOB) field defined with the DATE datatype can hold any valid date. However, a DOB in the future, or more than 100 years in the past, is probably invalid.

On-Premise DB Data Integrity Checking

This check addresses "keyed" relationships between entities within a domain. The goal is to identify orphan records in the child entity that carry a foreign key to the parent entity.
   Null foreign key values
Null values in a foreign key column should not be allowed in order to maintain referential integrity, but that rule might have been overridden during legacy data imports.
Example: Find the count of records with null foreign key values in the tables.
   Invalid foreign key values
All foreign key values in a table that do not have a corresponding primary key in the parent table should be addressed.
Example: Find the gaps in referential integrity by counting (1) null or unspecified dimension keys in a fact table and (2) invalid foreign key values in a contact list.
