The growing importance of data has placed increased emphasis and awareness on data management in financial institutions. With the vast volume of customer and market data collected every day, financial institutions need to ensure high reusability across the organisation to maximise the value of data, all while minimising re-collection and re-formatting efforts.
The content of each data set needs to remain aligned with both initial and evolving business expectations. Beyond that, all data sets should be governed by a common standard so that the data can effectively support multiple lines of business and different teams across the organisation in making strategic decisions and enable operational excellence.
Moving beyond the traditional setting, market players are increasingly drawn to open banking, a model that enables customers to quickly browse and purchase products from various, distinct financial services providers via digital platforms and APIs. However, to turn this concept into real-life applications, financial institutions will have to pay additional attention to data standards across a wider scope of participants – and this will become the key to success or failure.
In this article, we explore data standards as part of our data management series. Data standards are the second of the seven components within our DATAMANAGEMENTINABOX solution, a framework that supports our clients in setting up controls and monitoring so that data is used consistently and effectively across the organisation.
The same data content is often used and shared across business functions and with external entities. As such, having a standardised set of data is critical. Without agreed-upon standards within an organisation, it would be challenging for operational staff to know which format to follow and which data values are acceptable when recording data into the system.
For example, staff might capture the parts of a customer's name in the wrong order, or record the business's growth rate as a decimal instead of the percentage format expected by the management team.
The lack of standardised acceptance criteria can lead to data inconsistencies, difficulties in integrating data for downstream reporting, and ambiguity in interpreting data for other business processes. As a result, additional time and effort are required to rectify the data before it can be processed further for business use.
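To make this concrete, a data standard can be expressed as a set of validation rules against which captured records are checked. The sketch below is a minimal, hypothetical illustration: the field names and formats (family name before given name, growth rates as percentage strings) are assumptions for the example, not any specific institution's standard.

```python
import re

# Hypothetical data standards for two elements, expressed as format rules.
DATA_STANDARDS = {
    # Customer name captured as "FamilyName, GivenName"
    "customer_name": re.compile(r"^[A-Za-z'\- ]+, [A-Za-z'\- ]+$"),
    # Growth rate captured as a percentage string, e.g. "12.5%"
    "growth_rate": re.compile(r"^\d+(\.\d+)?%$"),
}

def validate(record: dict) -> list[str]:
    """Return the data elements in a record that violate the agreed standard."""
    return [
        field
        for field, pattern in DATA_STANDARDS.items()
        if field in record and not pattern.match(str(record[field]))
    ]

# "0.05" is a decimal, not the percentage format the standard expects,
# so this check surfaces exactly the kind of error described above.
issues = validate({"customer_name": "Tan, Mei Ling", "growth_rate": "0.05"})
```

Catching the violation at the point of capture avoids the downstream rectification effort the surrounding text describes.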
Having data standards in place enables the sharing and exchange of data between multiple parties in a way that guarantees the interacting parties have the same understanding of what is represented by the data elements. It also reduces the cost and complexity of defining data standards for every new data or report request, hence enhancing the reusability of data.
Synpulse’s DATAMANAGEMENTINABOX solution offers a Data Standards Management framework that helps our clients rapidly establish standards to govern how data is captured and presented. The toolbox is designed to enable our clients to understand their needs in establishing data standards so that they can define customised standards and controls.
The purpose of data standards control is to enable the capturing and exchange of standardised data between systems owned by different business functions. Our framework provides documented methods and templates to guide our clients in defining the rules for capturing, sharing, storing, and exchanging data. This allows business functions to discuss and agree on standards just once, and not have to make the same decision again for each project.
Data standards control can be implemented at the application level or system level, depending on the need as well as business and technical feasibility. The aim of the control is to determine whether the data is valid and accurate from a data standards perspective – that is, whether the values of the data elements conform to the allowed values and are in the right format.
Such controls can also prevent human error, especially when data is captured or recorded manually, and reduce the manual effort needed to correct erroneous data. Having sufficient data standards control in place can promote consistent results from the processes using them, improve cross-system compatibility, and increase the reusability of data.
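An application-level control of this kind can be sketched as a capture function that rejects any record failing the agreed checks, so non-conforming data never enters the system. The field names, allowed currency codes, and checks below are illustrative assumptions, not a prescribed implementation.

```python
# Hypothetical allowed-value list for one data element.
ALLOWED_CURRENCIES = {"USD", "EUR", "SGD", "CHF"}

class StandardsViolation(ValueError):
    """Raised when a captured record does not conform to the data standard."""

def capture_transaction(record: dict) -> dict:
    """Accept a record only if it passes the agreed data standards checks."""
    currency = record.get("currency")
    if currency not in ALLOWED_CURRENCIES:
        raise StandardsViolation(f"currency {currency!r} is not an allowed value")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        raise StandardsViolation(f"amount {amount!r} must be a non-negative number")
    return record  # accepted: conforms to the standard

accepted = capture_transaction({"currency": "SGD", "amount": 250.0})
```

Placing the check at the point of capture, rather than downstream, is what allows the control to prevent human error rather than merely detect it later.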
Our data standards monitoring framework assists our clients in monitoring, periodically reviewing, and updating their standards. The objective is to ensure that the controls can effectively implement the defined standards as well as minimise redundancy, time, and effort spent on data corrections.
Whenever new data or a new report is requested, the architects can take the agreed data standards as a reference and instruct the team to build accordingly.
Data stewards will then lead periodic examinations of the data quality based on the standard and identify any gaps. In instances where remediation is required, data owners need to take the lead in identifying the root causes. The leaders should reinforce the standards with teams to rectify any issues and prevent similar instances from happening again.
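The periodic examination described above can be supported by a simple conformance scan: measure, per data element, what share of records meets the standard, and flag any element falling below a threshold as a gap for the data steward to review. The checks and the 95% threshold below are illustrative assumptions.

```python
def conformance_report(records, checks, threshold=0.95):
    """Per data element, compute the conformance rate and flag gaps."""
    report = {}
    for field, is_valid in checks.items():
        values = [r.get(field) for r in records]
        ok = sum(1 for v in values if is_valid(v))
        rate = ok / len(values) if values else 1.0
        report[field] = {"rate": rate, "gap": rate < threshold}
    return report

# A tiny sample data set with one non-conforming value.
records = [
    {"growth_rate": "12.5%"},
    {"growth_rate": "0.05"},  # decimal instead of the percentage format
]
checks = {"growth_rate": lambda v: isinstance(v, str) and v.endswith("%")}
report = conformance_report(records, checks)
```

A report like this gives data owners a starting point for root-cause analysis: the flagged elements show where the standard is not being followed, while the rate indicates how widespread the issue is.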
It is pertinent for all teams across the business to be cognizant of the importance of staying compliant, as the reusability of data does not benefit a single department alone but impacts the organisation as a whole. Our guidance will also support our clients in continuously evaluating the standards and controls over time to reduce manual interventions, exception flows, and conversions.
With DATAMANAGEMENTINABOX, we consolidate our deep data expertise and hands-on industry experience to provide a solid reference model for financial institutions to address their data management challenges. This article introduces our Data Standards Management framework as the second of the seven components within our DATAMANAGEMENTINABOX solution. We will be introducing each component in the coming months, so do stay tuned.