Data is loaded from the staging area to the target system. Depending on the requirement, it involves the following operations −
- Record count check from the intermediate table to the target system.
- Ensure the key field data is not missing or Null.
- Check if the aggregate values and calculated measures are loaded in the fact tables.
- Check modeling views based on the target tables.
- Check if CDC has been applied on the incremental load table.
- Check the data in the dimension tables and the history tables.
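The first two of these load checks can be sketched in Python with an in-memory SQLite database. The staging and target table names (`stg_orders`, `tgt_orders`) and the key column are hypothetical, used only for illustration:

```python
import sqlite3

# Hypothetical staging and target tables populated for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

# 1. Record count check: staging row count must match the target row count.
stg_count = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert stg_count == tgt_count, f"count mismatch: {stg_count} vs {tgt_count}"

# 2. Key field check: the key column must contain no NULLs in the target.
null_keys = conn.execute(
    "SELECT COUNT(*) FROM tgt_orders WHERE order_id IS NULL"
).fetchone()[0]
assert null_keys == 0, "NULL values found in key field"
```

The same two queries can be pointed at any staging/target table pair; only the table and key column names change.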
Source-to-target Data Testing
In this type of testing, a tester validates data values from the source to the target system. It checks the data values in the source system and the corresponding values in the target system after transformation. This type of testing is time-consuming and is normally performed in financial and banking projects.
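A source-to-target value check can be sketched as a join on the key column that flags rows where the transformed source value does not match the loaded target value. The tables and the transformation (upper-casing a name) are hypothetical:

```python
import sqlite3

# Hypothetical source and target tables; the assumed transformation
# applied by the ETL run is UPPER() on the name column.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_customer (id INTEGER, name TEXT);
    CREATE TABLE tgt_customer (id INTEGER, name TEXT);
    INSERT INTO src_customer VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO tgt_customer VALUES (1, 'ALICE'), (2, 'BOB');
""")

# Join source to target on the key and report rows where applying the
# expected transformation to the source does not reproduce the target.
mismatches = conn.execute("""
    SELECT s.id
    FROM src_customer s
    JOIN tgt_customer t ON s.id = t.id
    WHERE UPPER(s.name) <> t.name
""").fetchall()
assert mismatches == [], f"value mismatches for ids: {mismatches}"
```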
Data Integration / Threshold Value Validation Testing
In this type of testing, a tester validates the range of data. All the threshold values in the target system are checked to ensure they are as per the expected result.
It also involves integration of data in the target system from multiple source systems after transformation and loading.
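A minimal sketch of a threshold check: every value in a target column must fall inside an expected range. The table name, column, and bounds below are assumptions for illustration:

```python
import sqlite3

# Hypothetical target fact table with a percentage column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tgt_fact (discount_pct REAL)")
conn.executemany("INSERT INTO tgt_fact VALUES (?)", [(0.0,), (15.5,), (99.9,)])

# Assumed threshold values for the column, taken from the (hypothetical)
# requirement that a discount is a percentage between 0 and 100.
LOW, HIGH = 0.0, 100.0
out_of_range = conn.execute(
    "SELECT COUNT(*) FROM tgt_fact WHERE discount_pct < ? OR discount_pct > ?",
    (LOW, HIGH),
).fetchone()[0]
assert out_of_range == 0, "values outside the expected threshold range"
```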
Metadata Check
Checking the metadata involves validating the source and the target table structures with respect to the mapping document.
Data Length Check
The length of the target column data type should be equal to or greater than that of the source column data type.
Data Type Check
Data type checking involves verifying the source and the target data types and ensuring that they are the same.
Constraint / Index Check
Constraint checking involves verifying the index values and constraints as per the design specification document.
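The metadata, data-type, and data-length checks above can be sketched together by comparing declared column types between a source and a target table. In SQLite the declared types are available through `PRAGMA table_info`; the table names are hypothetical:

```python
import re
import sqlite3

# Hypothetical source and target tables; the target widens the VARCHAR.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_product (id INTEGER, name VARCHAR(50));
    CREATE TABLE tgt_product (id INTEGER, name VARCHAR(100));
""")

def columns(table):
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk).
    return {row[1]: row[2] for row in conn.execute(f"PRAGMA table_info({table})")}

src_cols, tgt_cols = columns("src_product"), columns("tgt_product")

# Metadata check: the target must contain every source column.
assert set(src_cols) <= set(tgt_cols), "missing columns in target"

def length(decl):
    # Extract a declared length such as the 50 in VARCHAR(50), if present.
    m = re.search(r"\((\d+)\)", decl)
    return int(m.group(1)) if m else None

for name in src_cols:
    # Data type check: base types must match (ignoring declared lengths).
    assert src_cols[name].split("(")[0] == tgt_cols[name].split("(")[0], name
    # Data length check: target length must be >= source length.
    s_len, t_len = length(src_cols[name]), length(tgt_cols[name])
    if s_len is not None and t_len is not None:
        assert t_len >= s_len, f"target column {name} shorter than source"
```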
ETL Testing - Data Transformations
Common checks for ETL data transformations are listed below −
- The first step is to create a list of scenarios of input data and the expected results, and validate these with the business customer. This is a good approach for requirements gathering during design and can also be used as a part of testing.
- The next step is to create test data that contains all the scenarios. Utilize an ETL developer to automate the entire process of populating the datasets with the scenario spreadsheet, since the scenarios are likely to change.
- Next, utilize data profiling results to compare the range and distribution of values in each field between the source and the target data.
- Validate the accurate processing of ETL-generated fields, e.g., surrogate keys.
- Validate that the data types within the warehouse are the same as specified in the data model or design.
- Create data scenarios between tables that test referential integrity.
- Validate the parent-to-child relationships in the data.
- The final step is to perform the lookup transformation. Your lookup query should be straight, without any aggregation, and expected to return only one value per row of the source table. You can directly join the lookup table in the source qualifier, as in the previous test. If this is not the case, write a query joining the lookup table with the main table in the source and compare the data in the corresponding columns in the target.
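The final step above can be sketched as a three-way join: the lookup table joined with the main source table, compared against the corresponding target column. All table and column names are hypothetical, and the assumed lookup resolves a country code to a country name:

```python
import sqlite3

# Hypothetical source, lookup, and target tables. The lookup is "straight":
# exactly one row per country code, no aggregation needed.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_sales (sale_id INTEGER, country_code TEXT);
    CREATE TABLE lkp_country (country_code TEXT, country_name TEXT);
    CREATE TABLE tgt_sales (sale_id INTEGER, country_name TEXT);
    INSERT INTO src_sales VALUES (1, 'US'), (2, 'IN');
    INSERT INTO lkp_country VALUES ('US', 'United States'), ('IN', 'India');
    INSERT INTO tgt_sales VALUES (1, 'United States'), (2, 'India');
""")

# Join the lookup table with the main source table and compare the resolved
# value with the corresponding column in the target.
bad_rows = conn.execute("""
    SELECT s.sale_id
    FROM src_sales s
    JOIN lkp_country l ON s.country_code = l.country_code
    JOIN tgt_sales t ON s.sale_id = t.sale_id
    WHERE l.country_name <> t.country_name
""").fetchall()
assert bad_rows == [], f"lookup mismatches: {bad_rows}"
```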
ETL Testing - Data Completeness
Checking data completeness is done to verify that the data in the target system is as per expectation after loading. The common tests that can be performed for this are as follows −
- Checking aggregate functions (sum, max, min, count).
- Checking and validating the counts and the actual data between the source and the target for columns without transformations or with simple transformations.
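The aggregate-function completeness check can be sketched by computing COUNT, SUM, MIN, and MAX for an untransformed column on both sides and comparing the tuples. The table and column names are hypothetical:

```python
import sqlite3

# Hypothetical source and target tables for an untransformed amount column.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_txn (amount REAL);
    CREATE TABLE tgt_txn (amount REAL);
    INSERT INTO src_txn VALUES (5.0), (7.5), (2.5);
    INSERT INTO tgt_txn VALUES (5.0), (7.5), (2.5);
""")

# Compute the same aggregates on both sides and compare them in one shot.
query = "SELECT COUNT(amount), SUM(amount), MIN(amount), MAX(amount) FROM {}"
src_stats = conn.execute(query.format("src_txn")).fetchone()
tgt_stats = conn.execute(query.format("tgt_txn")).fetchone()
assert src_stats == tgt_stats, f"aggregates differ: {src_stats} vs {tgt_stats}"
```

Matching aggregates do not prove the row-level data is identical, so this check is normally paired with the count-and-actual-data comparison listed above.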