Data Quality Module
COMPONENTS
DQM – There are two areas where the DQM can be deployed. The first consists of Data Profiling and Data Validation and Correction elements.
Here, the DQM is used to specify Data Profiling requirements as well as Data Validation rules through a user-friendly interface.
The Data Profiler can be used as a starting point for a detailed Data Validation specification.
DQM allows the specified Data Validation rules to be tested repeatedly against a real dataset before they are integrated into the actual Orchestration structure. Both testing and Data Validation processing produce summary and detailed reports on the items that failed validation, with enough information to pinpoint each problem item. The detailed report can be used in further processing to eliminate bad records.
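To illustrate the idea only (this is not DQM's actual rule syntax or API), a minimal Python sketch of testing validation rules against a dataset and producing summary and detailed reports might look as follows; the rule names, record fields and report structure are assumptions:

    # Illustrative sketch only: rule names, fields and report format are
    # hypothetical, not DQM's real rule definitions or output.
    from dataclasses import dataclass
    from typing import Any

    @dataclass
    class ValidationFailure:
        row_id: int      # enough information to pinpoint the problem item
        rule: str
        record: Any

    def validate(rows, rules):
        """Apply each rule to every row; return (summary, detailed) reports."""
        failures = []
        for i, row in enumerate(rows):
            for name, check in rules.items():
                if not check(row):
                    failures.append(ValidationFailure(i, name, row))
        summary = {
            "rows_checked": len(rows),
            "rows_failed": len({f.row_id for f in failures}),
        }
        return summary, failures

    # Example rules, e.g. derived from a Data Profiler run that showed
    # 'age' always falls in 0-120 and 'email' always contains '@'.
    rules = {
        "age_in_range": lambda r: 0 <= r.get("age", -1) <= 120,
        "email_has_at": lambda r: "@" in r.get("email", ""),
    }

    rows = [
        {"age": 34, "email": "a@example.com"},
        {"age": 250, "email": "not-an-email"},   # fails both rules
    ]

    summary, detailed = validate(rows, rules)
    print(summary)          # summary report
    for f in detailed:      # detailed report used to eliminate bad records
        print(f)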
Transformation Module
COMPONENTS
TD – The Task Designer module is an interactive, web-based graphical interface used to design various data processing tasks, such as transforming inputs and creating outputs, executing SQL statements, executing system commands, etc.
Input and output components in ams are called IO Objects, and processing or transformation components are called services.
The types of IO objects and services are easily recognisable by their colours and icons. IO objects and services are connected with arrows that represent the flow of data between them.
An IO object can be a dataset of any type and location, including tables from almost all known DB types and vendors, and files in most common formats.
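As a rough mental model only (TD tasks are built graphically, and this is not TD's internal representation), a task can be thought of as IO objects and services joined by arrows; every class, name and operation below is hypothetical:

    # Hypothetical model of a TD task as a data-flow graph; names and
    # structure are illustrative, not TD's actual metadata.
    from dataclasses import dataclass, field

    @dataclass
    class IOObject:                # a dataset: a DB table, a CSV file, etc.
        name: str
        kind: str                  # e.g. "oracle_table", "csv_file"

    @dataclass
    class Service:                 # a processing or transformation step
        name: str
        operation: str             # e.g. "transform", "sql", "system_command"

    @dataclass
    class Task:
        name: str
        arrows: list = field(default_factory=list)   # (source, target) = data flow

        def connect(self, source, target):
            self.arrows.append((source, target))

    # One input table, one transformation service, one output file.
    customers = IOObject("CUSTOMERS", "oracle_table")
    cleanse   = Service("CleanseAddresses", "transform")
    out_file  = IOObject("customers_clean.csv", "csv_file")

    task = Task("LoadCleanCustomers")
    task.connect(customers, cleanse)   # arrow: IO object -> service
    task.connect(cleanse, out_file)    # arrow: service -> IO object
    print([(s.name, t.name) for s, t in task.arrows])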
TD tasks can take environment variables defined at the level of the calling procedure or defined globally.
The same TD task can be used in different procedures.
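A minimal sketch of how such variables could be resolved for a single task run, assuming procedure-level values override global ones (that precedence is an assumption, not documented here); the variable names are purely illustrative:

    # Sketch of variable resolution when the same TD task runs under
    # different procedures. Precedence (procedure level over global) is
    # an assumption, not documented behaviour.
    GLOBAL_VARS = {"TARGET_SCHEMA": "PROD", "BATCH_SIZE": "1000"}

    def resolve_vars(procedure_vars):
        """Merge global variables with procedure-level ones for one task run."""
        merged = dict(GLOBAL_VARS)
        merged.update(procedure_vars)    # assumed: procedure level wins
        return merged

    # The same task called from two different procedures:
    print(resolve_vars({}))                            # nightly load, globals only
    print(resolve_vars({"TARGET_SCHEMA": "TEST"}))     # test procedure overrides schema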
All the details relevant to a TD task are stored in Meta Tables and are open for viewing.
The objects in the task pane can be moved around, and they retain their last position automatically.