Master Data Management (MDM-Lite)


This functionality has been introduced to make it easier for users to manage their data. Users can create, maintain, and integrate their data from a single, simple interface in our application. It also ensures that data quality rules keep the data clean and standardized. To learn more about this in detail, let's get started:

Manage Accounts

Step 1 : First time configuration (Prerequisite)

  1. Open the machine on which SAWS is running 
  2. Create a folder (this folder will be added in the DvSum application; it needs to be created only once)
  3. Copy the path of this folder 
  4. Now log in to an Owner account in the DvSum application. (Please note that to import data into a user's DB, the configuration must be done from an Owner account.) 
  5. Go to Manage Account >> Folders >> Add Folder 
    5.1. Enter the title 
    5.2. Enter the path of the folder copied above 
    5.3. Select the SAWS machine on which the folder was created 
    5.4. Click Save
    5.5. The folder name now appears on the main Manage Account page



File Upload 

Step 2 : Go to the File Upload section, select the folder, and upload an Excel file for data import. This creates the template into which data will be imported. 


Users can also set a properties file as desired. Click the Properties File button and set the properties. For more details on cleansing, check the link here. 


Manage Sources 

Step 3 : Now add that file as a source. Go to the Manage Sources section, click Add Source, select 'In memory', and add the basic information. In the Host Information section, provide the file system path (system path + file name) and save it. After saving, click the Configure button to catalog the file.


Step 3.1  Check the box "My data has headers", select all columns, and click the Add button. All the columns will be cataloged and our template is added as a source. 



Import Data 

Step 4 : Now go back to Manage Accounts >> Folders. Select the folder and click Edit.


Step 4.1 :

Select Import Data (enables data import from a file to the destination table):
- Load: Only inserts new data
- Delete and Load: Deletes records from the table based on the workbench filter and the selected user-view filter, then inserts the new data
- Truncate and Load: Deletes all records from the table, then inserts the new data
Stored Procedure: Provide the name of a stored procedure to be executed after the data loads successfully. Only the name of the stored procedure is required.
Enable Staging: Data will be imported into the staging table. The stored procedure then implements its own logic to transfer data from the staging table to the destination table.
For Column Mapping, you need to select the Destination and the Source along with their respective tables.
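The three load modes above differ only in what happens to existing rows before the new data is inserted. A minimal sketch using Python's built-in sqlite3, with an illustrative `accounts` table (the table, columns, and filter are assumptions, not the actual DvSum schema):

```python
import sqlite3

def load(conn, rows):
    """Load: only inserts new data; existing rows stay untouched."""
    conn.executemany("INSERT INTO accounts (id, name) VALUES (?, ?)", rows)

def delete_and_load(conn, rows, name_filter):
    """Delete and Load: delete rows matching the filter, then insert new data."""
    conn.execute("DELETE FROM accounts WHERE name LIKE ?", (name_filter,))
    conn.executemany("INSERT INTO accounts (id, name) VALUES (?, ?)", rows)

def truncate_and_load(conn, rows):
    """Truncate and Load: delete every record, then insert the new data."""
    conn.execute("DELETE FROM accounts")   # SQLite has no TRUNCATE keyword
    conn.executemany("INSERT INTO accounts (id, name) VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, name TEXT)")
load(conn, [(1, "Acme"), (2, "Globex")])
truncate_and_load(conn, [(3, "Initech")])
count = conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
```

With Enable Staging, the same inserts would target a staging table instead, and the configured stored procedure would move the data onward to the destination table.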

Step 4.2 : Select the source and table. In the source column section, select the source (the Excel file uploaded as a template in Manage Sources) and the table. All the staging columns will be populated under the staging section.



Step 4.3 In the source section, all the columns from the Excel file appear in the dropdown. You can select columns one by one, or click 'Auto Map' to map all columns automatically against the destination columns. Once all columns are mapped, verify that the mapping is correct and then save the changes. 
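The 'Auto Map' step can be pictured as pairing each destination column with the source column whose normalized name matches. This matching logic is an assumption for illustration only, not DvSum's actual implementation:

```python
def auto_map(destination_cols, source_cols):
    """Pair destination columns with same-named source columns (assumed logic)."""
    # Normalize names so "Account_ID", "account id", and "ACCOUNT ID" all match.
    normalize = lambda c: c.strip().lower().replace("_", "").replace(" ", "")
    by_name = {normalize(c): c for c in source_cols}
    # Destination columns with no match map to None and must be mapped by hand.
    return {dest: by_name.get(normalize(dest)) for dest in destination_cols}

mapping = auto_map(["Account_ID", "Account Name"], ["account id", "ACCOUNT_NAME"])
```

Verifying the result afterwards, as the step advises, catches any column that auto-mapped to the wrong (or no) source column.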


Step 4.4 To re-upload a file with updated data into the same folder, use the Import functionality directly from the Workbench (explained later in Step 5.2).
Note: Please note that the file name can be different, but the file's template/format/columns/sheet name must not change. 


Audit Trail Columns Configuration (Last Updated by and Updated on)

Step 5 : Go to Profiling >> select the source and table >> select the data, click the Edit button, and select Edit Configuration. In the Write Back Parameters section, select the 'Updated On', 'Updated By', 'Record Status', and 'Data Quality Status' fields (audit trail columns) and save. The correct configuration for Record Status and Data Quality Status is shown by default. 

  • Updated on: This column stores a UTC-0 timestamp whenever a change is made to the data.
  • Updated by: This column stores the external user ID provided in the user configuration. If no external user ID is provided, the user's full name from the user configuration is stored instead.
  • Record status: This column maintains the record status of the data. Stored values can be New, Published, Committed, etc.
  • Data quality status: This column maintains the validation status of the data, based on the rules applied to this table.
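The 'Updated On' / 'Updated By' rules above can be sketched in a few lines. The user-record shape (`external_id`, `full_name`) is an assumption used only for illustration:

```python
from datetime import datetime, timezone

def audit_values(user):
    """Return (updated_on, updated_by) for a data change, per the rules above."""
    updated_on = datetime.now(timezone.utc)       # timestamp stored in UTC-0
    # The external user ID wins; fall back to the full name from user config.
    updated_by = user.get("external_id") or user["full_name"]
    return updated_on, updated_by

ts, who = audit_values({"external_id": None, "full_name": "Jane Doe"})
```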



New identifier options have been added as well: 

  • Manual: Data is inserted into the identity column exactly as provided by the user.
  • Auto Increment: The maximum value of the identity column, incremented by 1, is used for insertion.
  • Sequence Generator: The database sequence generator is used to obtain the next value for insertion. For an identity column, select Sequence Generator but leave the input empty.
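The three strategies can be summarized in one hedged sketch. The existing IDs are a toy list and the sequence is simulated with a counter, since the real values come from the database:

```python
import itertools

def next_identity(mode, existing_ids, manual_value=None, sequence=None):
    """Pick the next identity value according to the selected strategy."""
    if mode == "Manual":
        return manual_value              # value supplied directly by the user
    if mode == "Auto Increment":
        return max(existing_ids) + 1     # max of the identity column, plus 1
    if mode == "Sequence Generator":
        return next(sequence)            # next value from the database sequence
    raise ValueError(f"unknown identity mode: {mode}")

seq = itertools.count(100)               # stand-in for a real database sequence
auto_id = next_identity("Auto Increment", [3, 7, 5])
```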




Data Preparation - Workbench 

Step 5.1 Now go to the Data Preparation section >> select Data Sets. Those two columns will be added there. Please note that those columns cannot be modified.

  • Initially, these two fields hold the date and the person who uploaded the file in the first place. 
  • If the user modifies any data/record in the grid, then 'Updated by' and 'Updated on' show the latest date and the name of the person who modified it.
  • In 'Updated by', the user's external ID appears if it is known/given; otherwise the user's name appears.
  • These two columns also hold information about the person who imports the data. 

For more information on Datasets, please check this article.



Step 5.2  Import, View, Export 


  • As soon as the user creates a view, its default name is the same as the workbench name.
  • Views can be created and shared with any other user. 
  • Default views cannot be added or deleted. 
  • Views can be cloned (the column selection, criteria setup, and so on can be changed as required). 
  • Note: Default workbench views cannot be shared with any other user. 


  • Users can import data directly from the Workbench if the template is configured, and visualize it right away without going back and forth to the File Upload section. 

  1. Click the Import button 
  2. Select the template/folder. Please note that the options (folders) linked on the Manage Account page automatically populate in this dropdown. 
    Note: Please note that the file name can be different, but the file's template/format/columns/sheet name must not change. 
  3. Choose a file (Excel/CSV)
  4. After choosing a file, the status/result of whether the data upload was successful is shown up front. 
  5. Import Failed and Import Successful scenarios:

    a. The file size limit is 100 MB for both Excel and CSV files.
    b. For CSV, the record count limit is up to 300k.
    c. For Excel, the record count limit is up to 10,000 (10k).



  6. In the case of CSV, the user will receive an email about the status of the import (successful or failed import)  
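The limits above determine up front whether an import succeeds or fails. A small sketch of that validation, with the limits taken from the article and the function name and return strings assumed for illustration:

```python
MAX_FILE_BYTES = 100 * 1024 * 1024            # 100 MB limit, Excel and CSV
MAX_ROWS = {"csv": 300_000, "xlsx": 10_000}   # record-count limits per format

def check_import(file_size_bytes, record_count, file_type):
    """Return an upfront status string based on the documented limits."""
    if file_size_bytes > MAX_FILE_BYTES:
        return "Import Failed: file exceeds 100 MB"
    if record_count > MAX_ROWS[file_type]:
        return f"Import Failed: more than {MAX_ROWS[file_type]:,} records"
    return "Import Successful"

status = check_import(5 * 1024 * 1024, 250_000, "csv")
```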

Record Statuses: 

Step 5.3 Workbench changes (New filters)

New filter options have been added. Based on these options, users can filter and view the data accordingly. 


  • Valid: Data on which the applied rule raises no "exception" is valid, and such data is populated under this filter. 
  • Invalid: Where no value/data exists, the record is considered invalid and appears as an exception. 
  • New: The latest uploaded data. 
  • Published: A published record can never be 'invalid'; only new and valid data can be published. To publish a record, select a record that is valid and new, then click the 'Publish' button at the top. The status changes from New to Published. 

  • Bulk Publish: The capability to select more than one record and publish all of them at once. 

  • Edited: New data cannot go directly into the Edited state; only published and valid data can. Once the user makes any update to the record, the status changes from 'Published' to 'Edited'. 


  • Re-Published : New or published data can never go directly into this state; only 'Edited' data can be re-published. First, select the record in the 'Edited' state and click the 'Commit' button at the top to confirm the changes. Then re-select the record and click the 'Publish' button; the status will change to 'Re-Published'. 

  • Committed: A background job runs at a defined time. When it runs, it moves published/re-published records into the Committed status automatically. There is no option to change a record's status to 'Committed' from the UI. 
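The status lifecycle above can be pictured as a small state machine. The transitions follow the article; the intermediate "Edited (committed)" state and the enforcement code are assumptions added purely to make the sketch explicit:

```python
TRANSITIONS = {
    ("New", "publish"): "Published",             # only new + valid data
    ("Published", "edit"): "Edited",             # any update flips the status
    ("Re-Published", "edit"): "Edited",
    ("Edited", "commit"): "Edited (committed)",  # Commit confirms the changes
    ("Edited (committed)", "publish"): "Re-Published",
    ("Published", "back-job"): "Committed",      # scheduled job, not the UI
    ("Re-Published", "back-job"): "Committed",
}

def transition(status, action):
    """Apply a UI action (or the back job) to a record status."""
    try:
        return TRANSITIONS[(status, action)]
    except KeyError:
        raise ValueError(f"{action!r} is not allowed from {status!r}")

status = transition("New", "publish")                         # -> Published
status = transition(status, "edit")                           # -> Edited
status = transition(transition(status, "commit"), "publish")  # -> Re-Published
```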


Users can export selected records or multiple records at once. There is also the capability to apply filters and export only the desired filtered data/records.


  1. The export file is downloaded in ZIP format, containing up to 300k records.
  2. If the Workbench has more than one template associated with it, multiple options are shown in the template dropdown.
  3. The file to be exported should follow the template, so that the chances of errors while re-importing it later are lower. 
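The export behavior can be sketched as writing the records to a CSV and zipping it, with the 300k cap from the article. The file names and layout are illustrative assumptions:

```python
import csv
import io
import zipfile

EXPORT_LIMIT = 300_000                         # records-per-export cap

def export_zip(records, header):
    """Write records (capped at 300k) to a CSV inside a ZIP archive."""
    rows = records[:EXPORT_LIMIT]
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    zbuf = io.BytesIO()
    with zipfile.ZipFile(zbuf, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("export.csv", buf.getvalue())
    return zbuf.getvalue()

archive = export_zip([(1, "Acme"), (2, "Globex")], ["id", "name"])
```

Because the CSV inside the archive follows the template's header, the same file can be re-imported with fewer errors, as item 3 advises.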


Stored Procedure Execution on 'Commit' and 'Consolidate' 

Step 6 : If the user wants a stored procedure to execute automatically on committing and on consolidating changes, go to Data Sets >> Edit Dataset >> page 3/3. 

  • On Commit: If a stored procedure exists, it is invoked after the commit; otherwise it is a simple commit. 
  • On Consolidate: If the user selects one or more records on the main grid, the 'Consolidate' button automatically appears at the top. When the user clicks it, the records are consolidated based on the logic in the stored procedure call. 



Uniqueness Rule Enable for Dataset 

Step 7 : The uniqueness rule is now enabled for datasets. Users can filter dataset values through this rule, which helps identify common records that can be consolidated.
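Conceptually, a uniqueness rule groups records on a key and flags any group with more than one row as a consolidation candidate. The key choice ("name") and the normalization are assumptions for illustration:

```python
from collections import defaultdict

def duplicate_groups(records, key):
    """Group records on a normalized key; return groups with duplicates."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key].strip().lower()].append(rec)
    return {k: v for k, v in groups.items() if len(v) > 1}

dupes = duplicate_groups(
    [{"id": 1, "name": "Acme"},
     {"id": 2, "name": "ACME "},   # same account, different casing/spacing
     {"id": 3, "name": "Globex"}],
    "name",
)
```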



Manage Roles 

Step 8 : To set or view permissions for a dataset, go to Manage Roles >> Data Preparation. Here the user can also set access for any other role. 

