

The Devil’s in the Data

September 14, 2018

Even as the new accounting standard for estimating the allowance was being announced in the summer of 2016, we knew the key issue – and the hardest part of transitioning to CECL – was going to be data. We understood from the beginning that complying with CECL would require a lot of data, and that the data had to be of good, consistent quality. That concern remains paramount; in fact, it has grown. We see it everywhere: in webinars, white papers and conference presentations. The devil’s in the data.

There are many legitimate reasons for not having enough reliable data:

  • We’ve used Excel to manually estimate our allowance and haven’t archived the data.
  • We’ve changed core systems and no longer have access to previous years’ data.
  • We’re a relatively new bank with little loss history.
  • We’ve changed risk rating parameters, so we don’t have the consistency required for pooling.

There are more. But what are we doing about it? To a great degree, it’s been a case of analysis paralysis. Like a tsunami, it’s coming and there’s nothing we can do about it.

Still, we know we have to get our house in order. Regulators have repeatedly told institutions they shouldn’t have to incur a lot of additional costs to comply with CECL. But we also know you will be held to the standard, and that will include looking at what you had in terms of data when CECL guidance was issued in 2016 and what you have done since. You will have had three to five years. 

While you might not have great depth of data by your CECL implementation date, you could have an appropriate breadth of data. Considering FASB’s recently proposed extension of effectively 90 days, non-public business entities still have about three years to build their data reservoirs. Start now and you could double or triple the data you have by your implementation date.

Here are a couple of things you can do: 

If you want to cast a broad net: Start capturing as much data as you can and archiving it. Get whatever data you can from your core. Gather loan types and archive them. If you have an automated system, like the MST Loan Loss Analyzer, you can leverage that system as a data warehouse, not only to store your data, but to organize it by loan types, risk ratings and other ways useful for pooling. 
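As a minimal sketch of the "broad net" approach, the snippet below archives monthly loan snapshots from a core extract into a simple SQLite table, then pulls pooled history by loan type and risk rating. The schema and column names are illustrative assumptions, not taken from any specific core system or from the MST Loan Loss Analyzer.

```python
import sqlite3

# Hypothetical schema: column names are illustrative, not from any core system.
conn = sqlite3.connect(":memory:")  # use a file path for a persistent archive
conn.execute("""
    CREATE TABLE IF NOT EXISTS loan_snapshots (
        as_of_date   TEXT,  -- snapshot month, e.g. '2018-09-30'
        loan_id      TEXT,
        loan_type    TEXT,  -- pooling dimension
        risk_rating  TEXT,  -- pooling dimension
        balance      REAL,
        charge_off   REAL
    )
""")

# Archive one month's extract from the core system.
snapshot = [
    ("2018-09-30", "L001", "CRE", "3", 250000.0, 0.0),
    ("2018-09-30", "L002", "C&I", "5", 80000.0, 1200.0),
]
conn.executemany("INSERT INTO loan_snapshots VALUES (?, ?, ?, ?, ?, ?)",
                 snapshot)

# Later, pull pooled history by loan type and risk rating.
pools = conn.execute("""
    SELECT loan_type, risk_rating, SUM(balance), SUM(charge_off)
    FROM loan_snapshots
    GROUP BY loan_type, risk_rating
""").fetchall()
```

The point is not the storage technology but the habit: capture each period's extract as-is, keyed by date, so the history accumulates even before a methodology is chosen.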

Or better, if you can be more specific: MST Advisory has been working with our clients to help them avoid data gaps by identifying the data points they will need for their chosen CECL methodology, then beginning to capture those specific types of data and warehousing them.
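The methodology-first approach above can be sketched as a simple gap check: map each candidate methodology to the fields it needs, then compare against what the archive already holds. Both the methodology names and the field names here are hypothetical illustrations, not a definitive requirements list.

```python
# Hypothetical mapping of CECL methodology to required data fields;
# the field names are illustrative, not an authoritative requirements list.
REQUIRED_FIELDS = {
    "vintage": {"origination_date", "original_balance", "charge_offs_by_period"},
    "migration": {"risk_rating_history", "balance", "charge_off_amount"},
    "pd_lgd": {"default_flag", "exposure_at_default", "recovery_amount"},
}

def find_data_gaps(methodology: str, available_fields: set) -> set:
    """Return the fields the chosen methodology needs but the archive lacks."""
    return REQUIRED_FIELDS[methodology] - available_fields

# Example: an archive that tracks current balances and ratings
# but has never stored rating history.
gaps = find_data_gaps("migration", {"balance", "charge_off_amount", "risk_rating"})
```

Running the gap check early tells an institution exactly which fields to start capturing now, so the missing history builds up over the years remaining before the implementation date.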

Instead of looking at gathering data for CECL as a futile exercise, do what many banks and credit unions are now doing and consider it an opportunity. Take the next couple of years to capture and archive quality data. You might not have the depth of data you’d like at your implementation date, but you will have the breadth.

More articles about data for CECL:

Data Fields: What Types of Data Should You Collect for CECL?

CECL: Get More Out of Your Investment than Compliance

Confessions of a Data Analyst
