Solving Data Quality Pain Points for Chief Risk Officers
Data quality issues severely impact a CRO's ability to build accurate risk models. In this article, we use a few concrete examples to describe the nature of those issues and show how data governance can fix them.
Chief Risk Officers (CROs) often encounter data quality pain points that, if not addressed, impact not only the effectiveness of their role but also the overall health of the bank or credit union.
The concept of data quality has been defined comprehensively by data governance experts. Typically, it comprises multiple dimensions, including the following (a brief code sketch after the list shows how a few of these can be checked in practice):
- Accuracy
- Completeness
- Consistency
- Validity
- Timeliness
- Uniqueness
- Integrity
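To make these dimensions concrete, here is a minimal sketch of how a few of them can be checked automatically. It assumes a small pandas DataFrame of transactions; the column names (transaction_id, amount, event_date) are illustrative, not taken from any particular system.

```python
import pandas as pd

# Hypothetical extract of risk transactions; the column names are illustrative only.
df = pd.DataFrame({
    "transaction_id": [101, 102, 102, 104],
    "amount": [250.0, None, 80.0, -10.0],
    "event_date": ["2024-01-05", "2024-01-07", "2024-01-07", "2099-12-31"],
})
df["event_date"] = pd.to_datetime(df["event_date"])

# Completeness: share of populated values per column.
completeness = 1 - df.isna().mean()

# Uniqueness: duplicated transaction identifiers.
duplicates = df[df.duplicated(subset="transaction_id", keep=False)]

# Validity: amounts must be positive and event dates must not lie in the future.
invalid_amounts = df[df["amount"] <= 0]
future_dated = df[df["event_date"] > pd.Timestamp.today()]

print(completeness)
print(duplicates)
print(invalid_amounts)
print(future_dated)
```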
In this article, we provide a few industry-specific examples to concretize the nature of the data quality problem many Chief Risk Officers typically face. We also discuss how data governance can fix the problem.
Example 1: Risk event tags
Although banks and credit unions do their best to help customers avoid scams, they are unsettlingly common. Let’s say a customer with a credit card from a particular bank falls victim to a scam, and a fraudulent transaction occurs. When the customer calls the bank to report the fraud, the person at the end of the line must tag the event properly. The question is, how do they do it?
The challenge is knowing whether to tag the transaction as a fraud, operational, or credit risk event. The bank often reimburses the fraudulent loss to the cardholder as a gesture of goodwill, and the event is tagged as a credit risk event (because it ultimately led to a credit loss for the bank). However, this tagging is inaccurate. The event actually represents an operational risk: a security deficiency at the bank allowed the fraudster to achieve their objective.
This error has a downstream impact on the CRO, who cannot develop specific data-driven risk models effectively because the data has been incorrectly tagged. Tagging data accurately is an important part of maintaining data quality.
Furthermore, different departments might tag such events differently, which creates a data consistency problem. Nor is this only a back-office issue: different branches of the bank may use different tags, and the inconsistency can spread from department to department.
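One way to surface this kind of inconsistency is to compare how each department or branch tags the same type of event. The sketch below is a generic illustration with hypothetical department names and tag values; it is not a description of any specific bank's systems.

```python
import pandas as pd

# Hypothetical log of reported fraud events and how each team tagged them.
events = pd.DataFrame({
    "department": ["Cards", "Cards", "Branch A", "Branch B", "Branch B"],
    "event_type": ["card_fraud"] * 5,
    "risk_tag":   ["operational", "credit", "credit", "operational", "fraud"],
})

# For each event type, list the distinct tags used by each department.
tags_by_department = events.groupby(["event_type", "department"])["risk_tag"].unique()

# Flag event types that carry more than one risk category overall, a signal
# that teams are applying different definitions to the same kind of event.
tag_counts = events.groupby("event_type")["risk_tag"].nunique()
inconsistent = tag_counts[tag_counts > 1]

print(tags_by_department)
print(inconsistent)
```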
Solution
Data governance helps solve both of the above problems, specifically with the Business Glossary feature and its integration with the Data Catalog.
OvalEdge brings together all the stakeholders of a critical term (e.g., Event Tag) to collaboratively agree on its exact definition and how it should be used. With a well-curated definition in the Business Glossary, other users can search for and view the standardized term, eliminating confusion. The term can be assigned to Data Catalog objects, and its definition is available when reviewing reports that contain it via the OvalEdge browser extension. Together, these measures make manual errors far less likely.
A business glossary enables users to find common terms and definitions, collaborate more easily on data assets, and move forward fluidly with data-driven growth initiatives.
Related posts: Building a Business Glossary - Why and How
Example 2: Property reappraised values
The properties behind real estate loans must be reappraised periodically, let’s say every 10 to 15 years.
The trouble is that, in many cases, the reappraisal is completed and the updated amount is logged in a file, but that file is never synced to the centralized IT systems. The original appraisal value therefore remains on record and isn't refreshed as it should be, leading to an incorrect risk assessment. This is a data validity problem, and invalid data ultimately leads to flawed decision-making.
Solution
With OvalEdge, you identify critical data elements (CDEs) within the organization. The complete IT/data transformation process of those CDEs is well-documented and audited in a central repository. If an error occurs when loading a CDE file, OvalEdge alerts the appropriate IT team members, who can investigate the problem.
By first defining Property Value as a critical data element, then performing Data Anomaly Detection and applying Data Quality Rules, OvalEdge can perform appropriate data validation checks.
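As a rough illustration of what such validation checks can encode, the sketch below flags properties whose appraisal on record is older than an assumed 10-year threshold, and properties whose reappraised value in the file never reached the central system. The table, column names, and threshold are all assumptions for the example, not OvalEdge functionality.

```python
import pandas as pd

REAPPRAISAL_INTERVAL_YEARS = 10  # assumed policy threshold for the example

# Hypothetical join of the central system's property records with the
# separately maintained reappraisal file.
properties = pd.DataFrame({
    "loan_id": [1, 2, 3],
    "system_appraisal_value": [300_000, 450_000, 275_000],
    "system_appraisal_date": ["2010-06-01", "2022-03-15", "2012-09-30"],
    "file_reappraisal_value": [410_000, 450_000, None],
})
properties["system_appraisal_date"] = pd.to_datetime(properties["system_appraisal_date"])

# Rule 1: the appraisal on record is older than the allowed interval.
cutoff = pd.Timestamp.today() - pd.DateOffset(years=REAPPRAISAL_INTERVAL_YEARS)
stale = properties[properties["system_appraisal_date"] < cutoff]

# Rule 2: a reappraisal exists in the file but was never propagated to the
# central system, so the two values disagree.
out_of_sync = properties[
    properties["file_reappraisal_value"].notna()
    & (properties["file_reappraisal_value"] != properties["system_appraisal_value"])
]

print(stale)
print(out_of_sync)
```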
Related posts: Data Quality Purpose-Built for Banking
Example 3: Loan repayment reason
The reason for early loan repayment is a critical data element (CDE) that must be captured. Without it, CROs cannot accurately calculate capital risk; this is a data completeness problem.
Solution
Data governance can address this by defining the early loan repayment reason as a CDE, ensuring it is captured by default and isn't missing from the data sets used for capital risk calculations.
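A completeness rule for such a CDE can be as simple as measuring how often the field is actually populated and flagging the records where it is missing. The sketch below uses a hypothetical repayments table; the field names are illustrative.

```python
import pandas as pd

# Hypothetical extract of loan repayments; field names are illustrative only.
repayments = pd.DataFrame({
    "loan_id": [11, 12, 13, 14],
    "repaid_early": [True, True, True, False],
    "early_repayment_reason": ["refinance", None, "property_sale", None],
})

# Completeness rule: every early repayment must carry a reason code.
early = repayments[repayments["repaid_early"]]
missing_reason = early[early["early_repayment_reason"].isna()]
completeness_pct = 100 * (1 - len(missing_reason) / len(early))

print(f"Early-repayment reason completeness: {completeness_pct:.1f}%")
print(missing_reason)
```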
Related post: Data Quality Challenges for Fair Lending Compliance
Example 4: Depositor profile across the credit lifecycle
Banks must profile depositors. They must identify whether depositors are likely to keep their deposits in the bank or withdraw funds when interest rates are less favorable. CROs must capture depositor behavior across the entire credit lifecycle, which can span years. This information enables CROs to predict depositor behavior when interest rates rise or fall. CROs cannot build deposit models if they don't have data for the entire cycle. This is a data completeness problem.
Solution
Data lineage tools ensure the completeness of data by tracking its lifecycle. This is critical when looking at fluctuating customer behavior over time.
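To illustrate the completeness aspect, the sketch below checks whether each depositor's balance history covers every month of an assumed observation window; any gap means the behavioral model would be built on an incomplete picture. The table, column names, and window are hypothetical.

```python
import pandas as pd

# Hypothetical monthly balance snapshots for two depositors.
snapshots = pd.DataFrame({
    "depositor_id": [1, 1, 1, 2, 2],
    "month": ["2023-01", "2023-02", "2023-04", "2023-01", "2023-02"],
})
snapshots["month"] = pd.PeriodIndex(snapshots["month"], freq="M")

# Assumed observation window for the deposit model.
window = pd.period_range("2023-01", "2023-04", freq="M")

# For each depositor, report the months with no snapshot at all.
gaps = {
    depositor: sorted(set(window) - set(group["month"]))
    for depositor, group in snapshots.groupby("depositor_id")
}
print(gaps)  # depositor 1 is missing 2023-03; depositor 2 is missing 2023-03 and 2023-04
```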
Related post: Top Features of a Data Lineage Tool in 2024
How Can OvalEdge Help?
The data used to calculate risk is usually captured in source applications. This data is then retrieved by risk management teams to develop their models. Technically, risk data quality issues can only arise in the following three areas (a reconciliation sketch after the list shows how issues in the latter two can be caught):
- In source applications, where a business performs various actions.
- In transit, when data is moved from source systems to data warehouses and spreadsheets.
- Through integration mistakes, when data from two dispersed systems is merged.
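A common way to catch problems in the second and third areas is a reconciliation check: compare record counts, control totals, and keys between the source extract and what actually landed in the warehouse. The sketch below uses hypothetical tables; in practice these checks would run against your real source and target systems.

```python
import pandas as pd

# Hypothetical source extract and the corresponding warehouse load.
source = pd.DataFrame({"loan_id": [1, 2, 3, 4], "exposure": [100.0, 250.0, 75.0, 300.0]})
warehouse = pd.DataFrame({"loan_id": [1, 2, 4], "exposure": [100.0, 250.0, 300.0]})

# Reconciliation: row counts and control totals should match after the move.
checks = {
    "row_count_match": len(source) == len(warehouse),
    "exposure_total_match": source["exposure"].sum() == warehouse["exposure"].sum(),
}

# Integration check: records present in one system but not the other.
merged = source.merge(
    warehouse, on="loan_id", how="outer", indicator=True, suffixes=("_src", "_dwh")
)
mismatches = merged[merged["_merge"] != "both"]

print(checks)
print(mismatches)
```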
No AI solution can tell your mortgage loan officer to tag data correctly. That's why you need comprehensive data governance to solve risk data quality issues.
Data governance can seem complex, but OvalEdge has simplified and streamlined the process, so you can expect to see results in just a few months. Our data governance tool automatically compiles the complete lineage for all your data, while our crawlers scan your source systems, risk calculation logic, and reports to map your entire data ecosystem.
We Focus on Critical Data Elements (CDEs)
We've compiled a list of all the CDEs required to calculate various risks. All you need to do is add or remove your desired elements. With OvalEdge, machine learning algorithms effortlessly identify the data associated with CDEs, so you can quickly standardize the CDE definitions and formulas critical for risk management.
With OvalEdge, you can:
- Train all the users associated with specific CDEs so they don’t make operational errors.
- Maintain a backlog of all bad data events so that IT controls can be built to catch corrupted data at the source.