
“Instead of asking how much of your time is left, ask how much of your mind.” Data and CECL

September 15, 2016

“Instead of asking how much of your time is left, ask how much of your mind.” – Let’s Go Crazy by Prince Rogers Nelson

These slightly paraphrased lyrics from an iconic pop song aptly capture the preference for quality over quantity. Yes, the late Prince was writing and singing about life, not banking, but the Current Expected Credit Loss (CECL) standard is “life” for those charged with transitioning their institution’s allowance process from today’s incurred loss model to tomorrow’s expected loss model. And as more expected loss details unfurl, there is increasing anxiety among lenders about managing the allowance, profitability and capital – subjects critical to the life of the institution.

Some advisors have simplified preparation for expected loss as a data exercise, encouraging banks and credit unions to gather and retain as much loan-level information as possible. Good advice, but not the first order of business. Our experience at MST, supported by Risk Management Association research and an international study by Experian, reveals that loan and customer data collected by financial institutions for decades are in terrible disarray. Even some current data will be difficult to mine and use for estimating life-of-loan losses. Lenders will need more data for expected loss than is required for current ALLL methodologies; if the quality of the data is questionable, the assumptions and conclusions will be as well, creating problems no additional quantity of data can overcome.

Is it really so difficult to believe that the condition of your data, at the most basic level of a specific customer or loan, is a hot mess? Thanks to technological advancements in hardware and software, the process for capturing information has grown ever faster and more flexible. How many different staff members in a bank or credit union enter data on a daily basis? How many managers have authority to alter or delete data? How many of these individuals have their own way of doing things, with entries into core systems that reflect that individuality? Does staff ever change, and with it the ways of doing things? Does anyone maintain organization-wide standards for entering data?

Manual entry is the main culprit behind poor data quality. Technology has been an unwitting accessory to the crime, providing far too many methods of entry deemed acceptable by core and other systems. Then there are the entry errors inherent in any manual process. Over time, through automated processes, system upgrades and data conversions, inconsistencies mount, making it difficult to assemble usable data sets.
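A quick profiling pass over a loan-level extract can surface these kinds of problems before they ever reach an expected-loss calculation. Below is a minimal sketch in Python with pandas, using hypothetical column names (loan_id, origination_date, risk_rating, collateral_code) and made-up sample rows; it is not any particular vendor's or core system's check, just an illustration of the duplicates, gaps and inconsistent hand-entered codes described above.

import pandas as pd

# Minimal sketch of a loan-level data quality check (hypothetical column
# names and sample rows; any real core-system extract will differ).
loans = pd.DataFrame({
    "loan_id": ["1001", "1002", "1002", "1004"],
    "origination_date": ["2012-03-01", "03/15/2013", None, "2014-07-09"],
    "risk_rating": ["3", "Pass", "3", None],
    "collateral_code": ["RE", "re", "R.E.", "RE"],
})

report = {
    # Duplicate keys, a common artifact of conversions and re-keyed entries.
    "duplicate_loan_ids": int(loans["loan_id"].duplicated().sum()),
    # Missing values in fields a life-of-loan loss estimate would depend on.
    "missing_values": loans[["origination_date", "risk_rating"]].isna().sum().to_dict(),
    # Hand-entered variants of what should be a single collateral code.
    "collateral_code_variants": sorted(loans["collateral_code"].str.upper()
                                       .str.replace(".", "", regex=False).unique()),
    # Dates that do not conform to one consistent format.
    "nonconforming_dates": int(pd.to_datetime(loans["origination_date"],
                                              format="%Y-%m-%d",
                                              errors="coerce").isna().sum()),
}
print(report)

Even a simple pass like this tends to surface the variants and gaps that years of individual keying habits and system conversions leave behind, and it produces a concrete list to work from before any modeling begins.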

For institutions with long-term staff members, experts who understand the “rules” of how information has been gathered and entered, some of these issues can be overcome. The problem is that many of those experts have moved up the ladder, departed for better opportunities or been shed during the recent financial crisis. When it comes to data quality, the loss of this institutional expertise is a close second to manual entry as a contributor to unusable data.

Other factors should also raise red flags and call data quality into question. Few, if any, are intentional. As the 2016 RMA data quality survey states, “benign neglect” by financial institutions over many years is at fault. Preparing to implement the expected credit loss standard is but one situation where the consequences of that neglect are revealed.

There are solutions, many of them to be discussed next week at the American Bankers Association CFO Exchange in Charleston, South Carolina. With just a few short years before CECL moves from preparation to reality, acknowledging the data quality problem and addressing that challenge with a thoughtful, orderly process right now could result in meaningful improvement.

Think this does not apply to your institution? Staying with a wait-and-see approach? Expecting less rigorous expected loss requirements because your institution’s asset size will lower your bar for loan-level data quality? Then dust off your “Purple Rain” soundtrack, and drop the needle on the first song. Planning to do nothing is crazy.

About Abrigo

Abrigo enables U.S. financial institutions to support their communities through technology that fights financial crime, grows loans and deposits, and optimizes risk. Abrigo's platform centralizes the institution's data, creates a digital user experience, ensures compliance, and delivers efficiency for scale and profitable growth.

Make Big Things Happen.