
How to Find Data Adequacy Ahead of CECL

June 28, 2017

FASB’s guidance for estimating expected credit losses is not prescriptive, so examiners are not asking exactly how you plan to calculate your reserve under CECL today. However, because of the shift from an incurred-loss to an expected-loss model, banks need to begin loan-level data collection now as a first step toward compliance under the future GAAP standard.

In this recent webinar, Garver Moore and Tim McPeak, principal consultants with Sageworks’ advisory services group, cover how community banks can improve data quality, assure data adequacy, and take the first steps toward data validation and modeling. The session walks listeners through data collection methods, suggested data fields for community banks to consider by methodology type, and the advantages and disadvantages of starting CECL preparations today.

The keys to preparing today, as illustrated and outlined below, are understanding the methods used for data collection, determining the adequacy of the data collected, and filling gaps when they are identified.

CECL data aggregation methods

Data Collection Methods

  • Limited method: Not a viable approach for most core systems due to limited data storage.
  • Static method: Preserves optionality later in the project; consider the consistency and coherency of the snapshots.
  • Dynamic method: Significantly reduces risk and offers the most optionality for later use.
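The static method above amounts to preserving a dated, loan-level snapshot each period rather than letting the core system overwrite history. The sketch below illustrates the idea using only the Python standard library; the archive directory, file naming, and field names are illustrative assumptions, not a prescribed schema or Sageworks' implementation.

```python
import csv
from datetime import date
from pathlib import Path

def snapshot_loans(loans, archive_dir="loan_archive"):
    """Write a dated, loan-level snapshot so prior periods are never overwritten.

    `loans` is a list of dicts; the field names below are illustrative
    assumptions, not a required schema.
    """
    Path(archive_dir).mkdir(exist_ok=True)
    # One immutable file per period, e.g. loans_2017_06.csv
    out = Path(archive_dir) / f"loans_{date.today():%Y_%m}.csv"
    fields = ["loan_id", "balance", "risk_rating", "origination_date"]
    with open(out, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(loans)
    return out

path = snapshot_loans([
    {"loan_id": "A-100", "balance": 250000, "risk_rating": "3",
     "origination_date": "2015-06-01"},
])
print(path)
```

Because each period's extract lands in its own file, the institution keeps the option of moving to a range of loss-estimation methodologies later, which is the "optionality" the static method preserves.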

Data Adequacy Checklist

  1. The data is labeled appropriately (headers are consistently applied and understandable)
  2. Data does not contain duplicates (fields, rows or entities)
  3. There are no inconsistencies in values (e.g., truncated by 000’s vs. not truncated)
  4. Data is stored in the right format (e.g., numbers stored as numbers, zip codes stored as text)
  5. The file extracted from the core system is stored as the right file type
  6. File creation is automated; not requiring manual file creation
  7. Data is reliable and standardized throughout the institution, across all departments
  8. Data fields are standardized and governed to ensure consistency going forward
  9. Data storage does not have an archiving time limit (e.g., 13 months)
  10. Data is accessible (usable format like exportable Excel files, integrates with other solutions)
  11. Archiving function captures data points required to perform range of robust methodologies
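Several of the checklist items above lend themselves to automated checks on each extract. The sketch below runs a few of them (consistent headers, no duplicate entities, correct storage formats) over loan-level rows; the specific checks and field names such as `loan_id` and `zip_code` are illustrative assumptions, not an official standard.

```python
def check_adequacy(rows):
    """Run a handful of data-adequacy checks on loan-level rows.

    `rows` is a list of dicts that should share one header set; returns a
    list of human-readable issues found. Field names are assumptions.
    """
    issues = []

    # Checklist item 1: headers are consistently applied across all rows.
    headers = set(rows[0])
    if any(set(r) != headers for r in rows):
        issues.append("inconsistent headers across rows")

    # Checklist item 2: no duplicate entities (here keyed on loan_id).
    ids = [r["loan_id"] for r in rows]
    if len(ids) != len(set(ids)):
        issues.append("duplicate loan_id values")

    # Checklist item 4: right formats (numbers as numbers, zip codes as text).
    for r in rows:
        if not isinstance(r["balance"], (int, float)):
            issues.append(f"non-numeric balance on {r['loan_id']}")
        if not isinstance(r["zip_code"], str):
            issues.append(f"zip code not stored as text on {r['loan_id']}")

    return issues

rows = [
    {"loan_id": "A-100", "balance": 250000, "zip_code": "27601"},
    {"loan_id": "A-100", "balance": "250,000", "zip_code": 27601},
]
print(check_adequacy(rows))
```

Running checks like these on every extract, rather than once at project kickoff, is what keeps a core-system feed reliable and standardized across departments over time.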

Filling data gaps

If you would like to learn more, watch the on-demand CECL – Understanding Data Webinar.

About the Author


Sageworks, founded in 1998 and based in Raleigh, N.C., is a leading provider of lending, credit risk, and portfolio risk software that enables banks and credit unions to grow efficiently and improve the borrower experience. Using its platform, Sageworks has analyzed over 11.5 million loans and aggregated the corresponding loan data.


About Abrigo

Abrigo enables U.S. financial institutions to support their communities through technology that fights financial crime, grows loans and deposits, and optimizes risk. Abrigo's platform centralizes the institution's data, creates a digital user experience, ensures compliance, and delivers efficiency for scale and profitable growth.

Make Big Things Happen.