
The Importance of Data Quality in CCAR Compliance

Posted by Sally McCormack on Apr 25, 2016 1:54:13 PM


When the Comprehensive Capital Analysis and Review (CCAR) was mandated by the Federal Reserve in response to the financial crisis of 2008, it provided a framework for assessing large banking organizations, originally bank holding companies with $50 billion or more in consolidated assets. Designed to help prevent future turmoil in the financial services industry, this stress testing ensures large institutions are able to withstand adverse economic conditions and continue operating through them.

Data quality is a critical component of CCAR compliance. The Federal Reserve Board (FRB) provides detailed rules, called schedule instructions, that define the specific checks to be performed against a financial institution’s data. This testing, known as edit checks, covers a wide variety of issues related to overall data quality.


Fluctuating Requirements

For any financial institution, there are hundreds of schedule instructions that generate the need for thousands of edit checks. To further complicate matters, requirements change each time the FRB issues new schedule instructions. Banking organizations need to leverage the right technology and tools to ensure their data quality process supports efficient, accurate CCAR testing and reporting.

IT teams are responsible for developing and implementing the optimal data infrastructure and supporting processes, based primarily on the size of the data set. Best practices suggest IT should provide:

  • An architecture that supports the scale of CCAR reporting
  • A development process that responds flexibly to rapidly changing circumstances
  • An evolving set of tools to simplify reporting and make the process as automated as possible

Architecting the Solution

With hundreds of schedule instructions defining thousands of edit checks, the underlying data architecture must be able to handle the load, supporting the scale of each individual organization’s CCAR reporting needs.

An ideal way to architect the environment is a dimensional model. In this approach, the edit checks, the fields being checked, the schedules to which the checks belong, and so on become the dimensions, and the results of the checks are populated into fact tables. To keep the process efficient, consider loading only the records that fail an edit check into the fact tables, since these are the only ones requiring attention. A simple star schema then supports easy query definition and reporting.
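As a rough illustration of this star schema, the sketch below uses Python’s built-in sqlite3 module. The table and column names (and the sample check) are illustrative assumptions, not FRB-defined structures:

```python
# Minimal star-schema sketch: dimensions for schedules, fields, and
# edit checks; a fact table that holds only FAILING records.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_schedule (schedule_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_field    (field_id    INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_check    (check_id    INTEGER PRIMARY KEY, rule TEXT,
                           schedule_id INTEGER, field_id INTEGER);
-- Fact table: only records that fail a check are loaded here.
CREATE TABLE fact_failure (check_id INTEGER, record_key TEXT, value TEXT);
""")

cur.execute("INSERT INTO dim_schedule VALUES (1, 'FR Y-14Q Schedule A')")
cur.execute("INSERT INTO dim_field VALUES (1, 'loan_balance')")
cur.execute("INSERT INTO dim_check VALUES (1, 'loan_balance must be >= 0', 1, 1)")

# Run the edit check over source records; load failures only.
records = [("LN-001", 2500.0), ("LN-002", -75.0)]
for key, balance in records:
    if balance < 0:  # the edit check itself
        cur.execute("INSERT INTO fact_failure VALUES (1, ?, ?)", (key, str(balance)))

# A simple star-schema query: failures joined back to their dimensions.
rows = cur.execute("""
    SELECT s.name, c.rule, f.record_key
    FROM fact_failure f
    JOIN dim_check c    ON c.check_id = f.check_id
    JOIN dim_schedule s ON s.schedule_id = c.schedule_id
""").fetchall()
print(rows)  # only LN-002 appears: [('FR Y-14Q Schedule A', 'loan_balance must be >= 0', 'LN-002')]
```

Because only failures land in the fact table, dashboard queries stay small even when the underlying portfolios are large.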

With the edit failures identified and published, the business goes to work correcting the data in the source systems. IT then runs the edit checks again, and the iterations between IT and the business continue until every edit check passes.
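The rerun-until-pass loop above can be sketched as follows. `run_edit_checks` and `correct_records` are hypothetical stand-ins for the real IT process and the business’s manual corrections:

```python
# Sketch of the IT/business iteration: rerun the edit checks until
# every one passes. The check and the "correction" are toy examples.

def run_edit_checks(records):
    """Return the records that fail the (example) non-negative check."""
    return [r for r in records if r["balance"] < 0]

def correct_records(failures):
    """Stand-in for the business fixing data in the source systems."""
    for r in failures:
        r["balance"] = abs(r["balance"])  # example correction

records = [{"id": "LN-001", "balance": 2500.0},
           {"id": "LN-002", "balance": -75.0}]

iterations = 0
while (failures := run_edit_checks(records)):
    correct_records(failures)  # business fixes the source data
    iterations += 1            # IT reruns the checks

print(iterations)  # 1 pass of corrections was enough here
```

In practice each iteration may take days, which is why minimizing the number of cycles through clean source data pays off.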

Remaining Flexible and Adaptable

Routine CCAR edit checks occur monthly, quarterly, and annually – typically with additional requirements for every iteration. This requires a development process that can remain flexible and adaptable, responding easily to rapidly changing circumstances.

In addition to the routine edit checks, the development process should be able to simultaneously and seamlessly handle exceptions, including the possibility of an MRIA (Matter Requiring Immediate Attention) issued by the Federal Reserve. Corrective action and response must be thorough and accurate, or more MRIAs can be triggered.

Automation is Key

IT should create an evolving set of tools to enable and support automation. With automated CCAR processes, organizations can create and maintain the agility needed to respond to rapid change, and efficiently manage the substantial reporting workload.

The first of two recommended automation strategies is reusing code: code developed for previous edit checks, or elsewhere within the same check, can often be adapted rather than rewritten. Each time a banking organization receives a new set of instructions, the thousands of related edit checks present prime opportunities for reuse, and identifying those opportunities early keeps the coding effort efficient.
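One common way to realize this reuse, sketched below under assumed field and check names, is to write a small number of parameterized check builders and instantiate them once per field instead of hand-coding each check:

```python
# Generic, reusable edit-check builders; a few builders can cover
# many schedule instructions. All names here are illustrative.

def non_negative(field):
    """Build a check that a numeric field is present and >= 0."""
    return lambda rec: rec.get(field) is not None and rec[field] >= 0

def required(field):
    """Build a check that a field is present and non-empty."""
    return lambda rec: bool(rec.get(field))

# The same two builders instantiate three distinct edit checks:
edit_checks = {
    "EC-001": non_negative("loan_balance"),
    "EC-002": non_negative("interest_rate"),
    "EC-003": required("obligor_id"),
}

record = {"loan_balance": 1000.0, "interest_rate": -0.5, "obligor_id": "OB-9"}
failed = [cid for cid, check in edit_checks.items() if not check(record)]
print(failed)  # ['EC-002'] -- the negative rate fails
```

When the FRB issues new schedule instructions, new checks become one-line additions to the table rather than new code.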

The second automation strategy involves batch processing. Once the edit check instructions are fully coded, a production team can run the batch process, allowing business users to review the results in a dashboard. Moving forward, the business can either submit a request when they want to run the batch, or IT can develop an application to allow the business to run the process themselves.
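A batch run of this kind can be sketched as a loop over all coded checks that produces a per-check failure summary for the dashboard. The structures below are illustrative assumptions, not a real CCAR data feed:

```python
# Sketch of the batch step: apply every coded edit check to the
# source records and summarize failures per check for review.

def run_batch(records, edit_checks):
    """Return, for each check ID, the IDs of records that fail it."""
    return {cid: [r["id"] for r in records if not check(r)]
            for cid, check in edit_checks.items()}

edit_checks = {
    "EC-001": lambda r: r["balance"] >= 0,
    "EC-002": lambda r: r.get("obligor_id") is not None,
}
records = [
    {"id": "LN-001", "balance": 2500.0, "obligor_id": "OB-1"},
    {"id": "LN-002", "balance": -75.0,  "obligor_id": None},
]

summary = run_batch(records, edit_checks)
print(summary)  # {'EC-001': ['LN-002'], 'EC-002': ['LN-002']}
```

Scheduling this as an unattended job, with the summary feeding the dashboard, is what lets the business rerun the process on demand without IT involvement.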

An Intelligent Investment

As banking organizations manage CCAR compliance, the importance of data quality cannot be overstated. Developing and implementing a fully functioning system to monitor and improve data quality, then automating it, is a wise choice from both a short- and long-term perspective. In addition to improved data accuracy and processing efficiency for the IT team, business users can take greater ownership of data quality management and enjoy better access to improved reporting capabilities. At first glance, the technology and tools necessary for robust data quality may appear to be a daunting commitment. However, the resulting benefits of efficient, accurate CCAR testing and reporting make it well worth the investment.



