Evolution of Loan Loss Reserves
I started working with the ALLL in 2007, a year after the release of the 2006 Interagency Policy Statement on the Allowance for Loan and Lease Losses. At that time many community banks took a very simple approach to estimating loan loss reserves, something like: Pass 5%, Watch 25%, Special Mention 50%, Substandard 75%, Doubtful 100%. With the introduction and eventual adoption of the ’06 guidance, however, banks were responsible for bringing more structure, thought and documentation to their estimates in order to satisfy both regulatory and audit requirements. The ’06 guidance calls for an incurred loss methodology, whereby the bank’s ALLL calculation includes only losses that have already been incurred, including losses that have yet to be revealed. The methodology requires both quantitative analysis (calculations based on actual loss experience) and qualitative analysis (adjustments to the calculated loss experience based on management judgment).
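The simple grade-based approach described above can be sketched in a few lines. The portfolio below is hypothetical; the reserve percentages are the ones cited in the text.

```python
# A minimal sketch of the simple grade-based reserve approach many
# community banks used before the '06 guidance: each loan balance is
# multiplied by a fixed percentage tied to its risk grade, and the
# results are summed. The loan data here is hypothetical.

GRADE_RESERVE_PCT = {
    "Pass": 0.05,
    "Watch": 0.25,
    "Special Mention": 0.50,
    "Substandard": 0.75,
    "Doubtful": 1.00,
}

def grade_based_reserve(loans):
    """Sum each loan's balance times the reserve percentage for its grade."""
    return sum(balance * GRADE_RESERVE_PCT[grade] for balance, grade in loans)

# Hypothetical portfolio: (outstanding balance, risk grade)
portfolio = [
    (1_000_000, "Pass"),          # 1,000,000 x 0.05 = 50,000
    (200_000, "Watch"),           #   200,000 x 0.25 = 50,000
    (50_000, "Substandard"),      #    50,000 x 0.75 = 37,500
]

print(grade_based_reserve(portfolio))  # 137500.0
```

The appeal of this method was its simplicity; the ’06 guidance asked banks to go well beyond it in both analysis and documentation.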
With the impending implementation of the expected loss methodology (Current Expected Credit Losses, or “CECL”), banks will again be required to make substantial and fundamental changes to how they calculate their allowances, including, among other things, estimating future losses based on past loss experience.
In reviewing the December 2012 CECL exposure draft, as well as the many FASB meeting notes and documents, I have noticed in particular multiple references to “Probability of Default” and “Loss Given Default.” In fact, the draft alone mentions Probability of Default some nine times as a method for estimating losses based on past events.
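The PD/LGD approach referenced in the draft can be illustrated with the standard expected-loss formula: exposure at default times probability of default times loss given default. The rates below are hypothetical, not drawn from the draft.

```python
# A minimal sketch of a PD/LGD expected-loss calculation. PD is the
# probability a loan (or segment) defaults over the measurement horizon;
# LGD is the share of the exposure lost if default occurs. The inputs
# here are illustrative assumptions, not figures from the exposure draft.

def expected_loss(exposure, pd_rate, lgd_rate):
    """Expected loss = exposure at default x PD x LGD."""
    return exposure * pd_rate * lgd_rate

# Hypothetical segment: $2M of exposure, 3% lifetime PD, 40% LGD.
print(expected_loss(2_000_000, 0.03, 0.40))  # 24000.0
```

Under CECL, the PD and LGD inputs would be derived from the bank’s own historical loss experience, adjusted for current conditions and reasonable forecasts.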
Banks need not be overly concerned about the changes CECL will require in calculating their allowances, at least not from an operational perspective. Most community banks have already incorporated a risk metric (e.g., loan grade, delinquency status) into their reserve process; once losses are captured at the transaction level, the balance of the task is merely a calculation exercise.
Johnathon Closs, MST