This is the third installment in a series of blog posts about the proposed ASU for codification improvements to ASU 2016-13, better known as the CECL standard. Click here to see the initial post in the series and to read some background on the proposed ASU.
One of the new disclosures required by the CECL standard is the disclosure of the carrying value of loans, disaggregated by both credit quality indicator and vintage, or year of origination. The entity would have to produce this disclosure with columns for term loans originated in each of the last five years, as well as separate columns for term loans originated prior to the last five years and for revolving loans. See the illustrative example below, taken directly from the guidance:
An issue discussed at the November 1 meeting of the CECL Transition Resource Group was how the vintage disclosure should treat loans that had previously fallen into the revolving category but have since converted to term loans. A variety of scenarios were raised, including revolving loans where an eventual conversion to term was written into the original agreement, and loans where the lender performs a new underwriting and converts the existing revolving loan to a term loan. This might also include loans restructured as part of a Troubled Debt Restructuring (TDR) agreement. Opinions differed on the correct treatment of these loans for disclosure purposes: whether they should be included in the column based on their origination date, included in the column based on the date they converted to term loans, left in the revolving column even after the conversion to term, or whether an entity should be allowed to make a policy election among these approaches.
Ultimately, the approach FASB landed on was twofold:
An example of the new disclosure can be seen below:
Abrigo’s Take: In this case, it seems like FASB found a pretty elegant solution to this tricky issue. By basing the inclusion in the term loans column on a subsequent credit decision, this should allow institutions to utilize data fields they are already tracking for other purposes (Last Renewal Date, for example) to also track which vintage year column the loan should appear in for this disclosure. However, institutions may have to look closely at what types of products they have that would fall into the new category outlined here, and make sure that those products are able to be identified for proper placement in this disclosure. Once those product types are identified, it should be simply a matter of combining those product types with whatever revolving indicator is already being used to determine what term loans should fall into this category.
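As a rough sketch of the kind of field logic described above — the field names (`is_revolving`, `origination_year`, `converted_by_credit_decision`, `term_conversion_year`) are hypothetical, and the rule shown is one simplified reading of the approach, not the text of the ASU — the column assignment might look like:

```python
def vintage_column(loan, reporting_year, n_vintage_years=5):
    """Assign a loan to a column of the vintage disclosure (simplified sketch).

    Hypothetical fields:
      is_revolving                 -- True while the loan is still revolving
      origination_year             -- year the loan was originated
      converted_by_credit_decision -- True if a subsequent credit decision
                                      converted the revolver to a term loan
      term_conversion_year         -- year of that conversion (None otherwise)
    """
    if loan["is_revolving"]:
        return "Revolving"
    # A subsequent credit decision resets the vintage to the conversion year
    if loan["converted_by_credit_decision"]:
        vintage = loan["term_conversion_year"]
    else:
        vintage = loan["origination_year"]
    if reporting_year - vintage < n_vintage_years:
        return str(vintage)
    return "Prior"
```

In practice, the "revolving indicator" and renewal-date fields an institution already tracks would feed inputs like these.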
This is the second in a series of blog posts about the proposed ASU for codification improvements to ASU 2016-13, better known as the CECL standard. Click here to see the initial post in the series and to read some background on the proposed ASU.
The treatment of recoveries of previously charged-off amounts has been an ongoing topic of discussion. It was discussed at the June 11 TRG meeting, again at a subsequent August FASB board meeting, and again at the November 1 TRG meeting.
Two main issues were brought up and deliberated around this:
Some preparers did not feel the standard was clear on whether recoveries should be included in the estimate of expected credit losses. Many financial institutions use net charge-off rates today, that is, loss rates that reflect both charge-offs and any subsequent recoveries, which theoretically produces an allowance that is already net of those amounts. There were also questions about which types of recoveries should be included in the estimate, and whether considering recoveries was required or optional. Some of these questions stemmed from the difficulty of obtaining data related to recoveries.
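For readers unfamiliar with the term, a net charge-off rate simply nets recoveries against gross charge-offs before dividing by the average balance. A minimal sketch, with purely illustrative numbers:

```python
def net_charge_off_rate(gross_charge_offs, recoveries, avg_balance):
    """Net charge-off rate: gross charge-offs less recoveries, over average balance."""
    return (gross_charge_offs - recoveries) / avg_balance

# Illustrative: $500k charged off, $120k recovered, $40M average balance
rate = net_charge_off_rate(500_000, 120_000, 40_000_000)  # 0.0095, i.e. 0.95%
```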
Related to the question above, if recoveries are included in the estimate of credit losses, the result could be allowance amounts at either the loan or segment level that are at times negative. Examples were given of banks that had very high levels of losses during the financial crisis, where subsequent recoveries of those amounts resulted in net recoveries in the years following the crisis. Preparers felt clarification was necessary in light of this definition in the ASU:
326-20-30-1 The allowance for credit losses is a valuation account that is deducted from the amortized cost basis of the financial asset(s) to present the net amount expected to be collected on the financial asset. At the reporting date, an entity shall record an allowance for credit losses on financial assets within the scope of this Subtopic. An entity shall report in net income (as a credit loss expense) the amount necessary to adjust the allowance for credit losses for management’s current estimate of expected credit losses on financial asset(s).
In the definition, the allowance is defined as a valuation account that is deducted from the amortized cost basis of the asset. It wasn’t clear whether that could include negative amounts, which would essentially be added to the amortized cost basis rather than deducted.
Much of the clarification in the proposed ASU concerns the second issue, negative allowances. The FASB appears to have believed that little clarification was necessary on the inclusion of recoveries in the estimate, as the existing guidance was already clear, particularly this paragraph:
326-20-30-7 When developing an estimate of expected credit losses on financial asset(s), an entity shall consider available information relevant to assessing the collectability of cash flows. This information may include internal information, external information, or a combination of both relating to past events, current conditions, and reasonable and supportable forecasts. An entity shall consider relevant qualitative and quantitative factors that relate to the environment in which the entity operates and are specific to the borrower(s). When financial assets are evaluated on a collective or individual basis, an entity is not required to search all possible information that is not reasonably available without undue cost and effort. Furthermore, an entity is not required to develop a hypothetical pool of financial assets. An entity may find that using its internal information is sufficient in determining collectability.
In this paragraph, institutions are asked to “consider available information relevant to assessing the collectability of cash flows,” which would include information on expected recoveries. It also notes that institutions are “not required to search all possible information that is not reasonably available without undue cost and effort,” which should address the questions about recovery data that is difficult to acquire. For negative allowances, the proposed ASU states that they are acceptable in this new addition to paragraph 326-20-30-7:
Recoverable amounts included in the valuation account shall not exceed the aggregate of amounts previously written off and expected to be written off by the entity.
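Read as a calculation, the added language is a cap on how far recoverable amounts can carry the allowance negative. A hedged sketch — the function and input names are illustrative, not language from the ASU:

```python
def capped_expected_recoveries(expected_recoveries,
                               amounts_previously_written_off,
                               amounts_expected_to_be_written_off):
    """Cap recoverable amounts in the valuation account at the aggregate of
    amounts previously written off and expected to be written off."""
    cap = amounts_previously_written_off + amounts_expected_to_be_written_off
    return min(expected_recoveries, cap)

# A $150k recovery expectation against $100k prior and $30k expected write-offs
# is limited to $130k; anything within the cap passes through unchanged.
```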
Similar language is also added to the collateral-dependent practical expedient, providing that those individual loans may also carry negative allowances as long as they do not exceed amounts previously written off.
For the most part, the clarifications in this portion of the proposed ASU related to recoveries should require minimal change to ongoing implementation efforts at community financial institutions. Most institutions were already planning to include recoveries in their allowance estimate and likewise include them in their modeling for developing this estimate. As for the possibility of negative allowances, we feel these situations are likely to be extremely rare at the pool level, but may become more common for some individual loans, particularly at institutions that are very conservative with their initial charge-offs. However, institutions that wish to carry negative allowances on individual collateral-dependent loans should make sure that they have all their documentation and justification prepared, as the regulatory scrutiny applied to these loans will likely be quite high.
On January 28, 2019, the FASB hosted a public roundtable on CECL intended to cover one main topic as well as a secondary topic. A large group was in attendance, with representatives from banks of various sizes, regulators, representatives from audit firms, as well as bank stock analysts and other readers of financial statements.
The primary topic was a letter containing a proposal made by the Bank Policy Institute (BPI) to FASB on November 5 identifying their concerns with CECL, as well as some proposed changes to address these concerns. The BPI is a banking trade group whose membership includes over 30 of the largest and most influential banks in the country. The first part of the letter asked for a delay in the implementation timeline as well as a more comprehensive study to be done to assess the “systemic and economic risks posed by CECL.” The BPI’s proposed change to the standard is to essentially split the impact of CECL into three categories:
The entire proposal can be read here. If this splitting of the total expected losses sounds familiar, it may be because the proposal has a lot in common with the “three bucket approach” that was floated back when FASB and IASB were attempting to converge on expected loss standards, and in fact it shares some elements with IFRS 9, IASB’s expected loss standard that went into effect in 2018. The BPI points to the negative capital and earnings impacts of the current CECL standard and claims that CECL, as written, would be significantly more procyclical than the current incurred loss standard, and that this procyclicality could lead to an unintended tightening in lending during an economic downturn. The letter and proposal were signed by around 20 banks within the BPI’s membership, although notably absent were the signatures of some of the very largest banks in the country: JPMorgan Chase, Bank of America, Wells Fargo, Citigroup, and TD Bank, all of which are members of the BPI.
Discussion during the roundtable consisted primarily of the various signatories of the BPI proposal explaining it and answering questions from FASB board members and staff, as well as other stakeholders around the table. It was clear that the position of the signatories was that the proposal, while substantial, was simply meant to lay out a framework, and not intended to be a fully mature accounting standard on its own. They also expressed praise for FASB’s deliberate standard-setting process and felt that some of the issues being addressed regarding the proposal could be more fully worked out in that process. Discussion of this topic extended for nearly three hours, with FASB board members and staff in particular appearing eager to get into the details of the proposal and better understand it. Support among banks for the proposal was far from universal, with several of the banks in attendance expressing views in opposition. FASB staff committed to doing more research on the proposal and bringing more information to the board later in Q1.
It seems hard to imagine that the FASB would take up the proposal without changing adoption timelines for CECL. The change, including the associated costs, for banks that are already well down the road on their CECL implementation would be significant, to the point of potentially having to go completely back to the drawing board in the most extreme cases. FASB gave the proposal a fair hearing, but the board is also holding its cards close to its chest, as CECL has become something of a political football at this point. Hopefully the picture becomes clearer when FASB staff presents to the board later in Q1.
The secondary topic of discussion at the roundtable was an interesting disclosure issue related to the new credit quality vintage disclosure in the CECL standard. The background of this issue is that the illustrative example of this disclosure in the guidance contains lines for “current period gross write-offs and recoveries,” but this requirement was not outlined in the text of the ASU itself, leading to some confusion among banks and analysts as to whether it was part of the disclosure or not.
This issue was discussed at the 11/7 board meeting and then again at the 12/18 board meeting. The conclusion reached was essentially that the gross write-offs and recoveries were intended to be part of the disclosure requirement; however, FASB was sympathetic to the operational challenges of adding the requirement so late in the implementation process and, therefore, asked the staff to do more research and outreach on the issue. The inclusion of the topic at this roundtable was part of that outreach.
Early in the discussion, one of the issues brought up by analysts was that they wanted this information as cumulative gross write-offs and recoveries against a particular vintage of loans, which went even beyond what the illustrative example in the ASU outlined. Banks expressed that it would be extremely difficult to go back and get the cumulative losses and recoveries against vintages of loans for the last five years, as this desire would require. A few bankers also said that they do not use the information in this way today for managing credit risk and questioned the value of disclosing information that is not even used by management.
It seems likely that this will end in a compromise, with both analysts and banks having to move from their current positions: banks not wanting to provide the information at all, and analysts wanting cumulative write-offs and recoveries against the last five vintage years. The logical compromise, which FASB board members seemed to propose, was requiring only the current period write-offs and recoveries, and then leaving it to the analysts to build the cumulative amounts over the next few years of disclosures. What is less clear is whether any update to this disclosure will have the same rapidly approaching adoption date as the rest of the ASU, or whether it will be split into a separate update with a different adoption date.
To learn more about Abrigo and our ABA Endorsed CECL Solutions, click here.
The number of jobs created in January as revealed in the February 1 Bureau of Labor Statistics’ Employment Situation report could only be characterized as strong. The national economy added 304,000 jobs, approaching double the expectation of about 170,000. Strength in hiring was widespread. Construction, manufacturing, and leisure and hospitality stood out as job gaining sectors, but most other sectors also saw increases. Only wholesale trade, financial activities, and information technology were essentially flat.
The report comes on the heels of a similarly strong report for December, although it did include a 90,000 downward revision to December’s numbers, leaving that month’s job creation at 220,000, still well beyond what had been expected.
The headline unemployment rate, U3, ticked up 0.1 percentage point to 4 percent. The broader labor underutilization measure, U6, which counts individuals not formally included in the narrow definition of “unemployed” but available for full-time work, jumped from 7.6 to 8.1 percent.
The two sets of numbers – job creation and unemployment rates – are generated in different surveys; the partial government shutdown, for technical reasons, was expected to affect the “household survey,” which produces the unemployment rates. That appears to have been the case: expectations were that headline unemployment would be unchanged but might tick up a bit due to the shutdown, and indeed it did.
January’s job creation report, even given the revision to December’s numbers, indicates, at least as measured by the labor markets, that the U.S. economy remains strong and on firm footing.
Tom Cunningham holds a Ph.D. in economics from Columbia University and was senior economist with the Federal Reserve Bank of Atlanta from 1985 to 2015. Mr. Cunningham serves as a consultant to MST in the creation and ongoing development of the MST Virtual Economist and is the MST Advisory economics specialist.
As employment is a key factor in projecting loan portfolio performance, current employment statistics and longer term trends are likely to be primary considerations for most banks and credit unions as they incorporate forward-looking economic factors in their ALLL estimations under the CECL accounting standard.
Under the new accounting standard, CECL, financial institutions will be required to consider economic factors in estimating their reserves. The MST Virtual Economist is an efficient, automated way to evaluate qualitative economic factors and project their impact on the institution’s loss rate, find new variables that impact the loss rate and determine the relevance of the economic factors you are already using to make qualitative adjustments. Click here for more information or to schedule a demonstration.
One of the challenges bankers often cite about implementing the current expected credit loss (CECL) accounting standard is knowing where to begin the process. Controllers, senior credit analysts and administrators, and others involved in calculating the allowance for loan and lease losses (ALLL) have heard and read about the changes to the standard for accounting for credit losses for more than half a decade, and yet “CECL paralysis” is a primary hurdle for many involved in the transition efforts.
Having a CECL action plan that lays out steps to prepare for and comply with the Financial Accounting Standards Board’s (FASB) methodology for estimating allowances is one way to help overcome that paralysis. The action plan provides a road map that can be adjusted for your institution based on its size, staffing, and deadline for compliance. The updated accounting standard is effective in less than a year at banks with the earliest deadlines. Nevertheless, many other banks know that gathering years of loan data, selecting a loss methodology, and refining their estimates will take long enough that CECL is on the agenda in board rooms across the country.
Another sound option for lenders looking to kick-start CECL implementation is to attend hands-on training that provides the opportunity to dig into the standard and move beyond simply reading about it. Having the chance to evaluate different loan segmentation options and loss rate methodologies and to discuss various ideas about the best way to implement CECL can set off lightbulbs and perhaps avoid missteps.
John Richardson, Credit Administration Analyst at Bank of Washington in the St. Louis area, says his financial institution was already using software that automates the allowance calculation under current U.S. GAAP and is capable of running the estimate under CECL when he attended a CECL Transition Workshop in November. However, he had not really dived headfirst into implementing CECL, beyond spending a significant amount of time attending webinars and seminars and reading the various guidance that has come out about CECL.
“A lot of what I’d been reading and hearing was very theoretical, so I was having trouble wrapping my head around the notion of ‘Here’s how it works in your world,’ ” he says. “Before I went to the seminar, we read a lot about CECL and talked a lot about CECL, but we hadn’t really been able to translate that research into practice.”
Abrigo, the technology provider behind Sageworks ALLL and MainStreet Technologies (MST)’s Loan Loss Analyzer, will host CECL Transition Workshops in six locations this year that will provide the same caliber of hands-on case studies and training that Richardson received. The workshops are relevant for institutions regardless of whether the institution subscribes to Sageworks ALLL, MST’s Loan Loss Analyzer or neither.
The workshops combine a morning panel discussion about approaches to CECL with an afternoon of attendees working together (in some cases, working on laptops pre-loaded with CECL solutions and case studies) to discuss the inputs, assumptions, and decisions required for producing an allowance for credit losses. One change this year to the workshop setup is that registrants will receive a short survey upon registration to help determine which agenda track would best fit his or her financial institution. The goal is to make sure participants get the kind of training and advice out of the workshop that is most beneficial to their particular situations.
Regan Camp, Abrigo’s Senior Director of Advisory Services, says CECL Transition Workshops help demystify CECL for those bankers who have been reading and hearing about it for so long that they have gotten a “Chicken Little” mentality: That the sky is falling and they don’t know where to run. The workshops will also, he says, keep bankers focused on the relevant aspects of implementation so they don’t get caught in “analysis paralysis,” where they get caught up in less important details at the expense of making progress on implementation.
“It’s one thing to understand all of the basic conceptual approaches” related to CECL implementation, Camp says. “However, there’s often a disconnect between the theory of it and the practical application of it. There are a number of different inputs, assumptions, and decision points that institutions need to make to actually implement it that are rarely discussed” in CECL educational programs.
Richardson says that the November workshop really helped to make the abstract concepts of CECL much more tangible. “The case study materials provided were helpful, because they walked us through step-by-step and encouraged discussion within the groups. Consultants and other staff were walking around to provide assistance for the times when our group got stuck or had questions,” he says. “I really enjoyed the fact that we were able to work as a team to process through some of the subjective questions, too, because so much of CECL is subjective. In fact, one of the things they said in the panel discussion was that so much of this implementation is going to be how we document the decisions we make as we implement the standard. So we discussed as a group: Which of these outlier readings can we remove, and how do we justify that? What factors do we apply here? What forecasts do you think are reasonable? Just thinking through that process was very beneficial.”
Banking regulators have repeatedly said that financial institutions should document their processes for determining their methodologies and for changing their methodologies, as well as documenting their process for determining the amount of the allowance.
Abrigo’s CECL Transition Workshops begin Feb. 26 in Kansas City, Mo. Additional workshops will be held:
• Feb. 28 in the Los Angeles area
• March 12 in Nashville
• March 14 in Dallas
• May 14 in Minneapolis
• May 16 in Boston
For information and registration, learn more here.
Many thanks to Nathan Kelly, SVP, Credit Risk & Reporting Officer, United Bank, and David Jaques, FVP, Credit Analytics Manager, Valley National Bank, for discussing their CECL journey and how partnering with a third party, Abrigo (formerly MST), and running parallel methodologies using the MST Loan Loss Analyzer’s “Shadow Loss Analysis” module are helping them determine which CECL models will best suit their institutions.
Nathan Kelly on the impact of automating the allowance process . . .
We moved from Excel to automation in 2015 to improve the efficiency of our existing process. We chose MST at that time, now Abrigo, and our partnership has grown over time as we needed to respond to the increasing complexity of our incurred loss estimations. One of the benefits of converting to an automated solution early has been the warehousing of our data. We also use Abrigo Advisory Services as a trusted resource to address issues that have come up during transition. I think a financial institution needs to focus on a partnership as opposed to just an automated solution. We have found it valuable to bring various bank departments onto our transition team, to better understand the impact CECL is going to have on the bank and capital. We’ve got a lot of bench strength, people who’ve been with the bank a long time. We have used the Advisory team as a sounding board all along the way. They are professionals with real life experience, including at large audit firms.
David Jaques on the impact of automating the allowance process . . .
We went live with the MST Loan Loss Analyzer in 2016. It has allowed us to compile a data warehouse for our 149,000+ loans. CECL implementation is critical and highly visible, so when it came time to consider a CECL solution, Abrigo (formerly MST) was the only choice for us. We formed working groups involving Credit, Risk, Accounting, Capital Planning and Treasury, and in the first quarter of 2017 we engaged the Advisory team. They helped us segment our loan portfolio and evaluate our portfolio data by reviewing how we had been segmenting the portfolio historically. They also provided us with peer data as well as recommendations for CECL methodologies we’ll be testing with Shadow Loss Analysis. Their role has increased as we proceed. They help us get to decisions quicker than we could do on our own.
David Jaques on CECL methods and testing with Shadow Loss Analysis. . .
In an effort to get more granular, we started with a cohort method and with 25 pools, later revised to 13 pools. Unexpectedly, a number of pools had zero losses due to the lack of historical loan level data – we only have loan level data back to Q4 2014. After a favorable review of an MST (now Abrigo) presentation on Transition Matrix Methodology, we chose to load that additional model into Shadow Loss Analysis. This allowed us to compare the migrated loss rates from the cohort model with the transition matrix probability of default/loss given default rates. The results so far from cohort have been similar to our incurred loss model, with TMM results generally higher. We’re going to run both models side by side for all of 2019. We have $25 billion in loans, including a multi-billion dollar portfolio of PCD loans under CECL. We’ll likely combine the PCD loans with our legacy loans, but have the ability to report a separate reserve for those.
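For readers unfamiliar with the transition matrix methodology mentioned above, the core idea can be sketched in a few lines: migrations between risk ratings are estimated as a matrix, and repeated multiplication gives the probability of reaching default over time. The matrix below is invented for illustration and is not Abrigo’s or Valley National’s actual model:

```python
import numpy as np

# Hypothetical one-year transition matrix over three risk states:
# Pass, Substandard, Default (absorbing). Each row sums to 1.
T = np.array([
    [0.94, 0.05, 0.01],
    [0.20, 0.70, 0.10],
    [0.00, 0.00, 1.00],
])

def prob_default_within(T, start_state, years):
    """Probability a loan starting in `start_state` defaults within `years`."""
    return np.linalg.matrix_power(T, years)[start_state, -1]

pd_3y = prob_default_within(T, start_state=0, years=3)   # ~4.15% for a Pass loan
# An expected loss estimate would then combine this PD with loss given default
# and exposure, along the lines of PD * LGD * EAD.
```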
Nathan Kelly on CECL methods and testing with Shadow Loss Analysis. . .
Your data can drive your pooling and methodology selection. We grabbed loan level data back to 2011, not everything you need for a complete economic cycle, but enough key fields to use cohort for commercial loans and vintage for our consumer portfolios. We considered the loss behavior in those portfolios and how we pool to determine which methodologies are most appropriate. We may look to transition to more sophisticated modeling down the road, should the need arise, possibly working toward a transition matrix methodology. We looked at all our existing portfolios and tried to understand the risks in those as driven by loss patterns; our commercial portfolio has staggered losses for certain periods, and consumer portfolios lend themselves to a vintage methodology where there is a sort of bell curve in the loan losses.
Nathan Kelly on data gaps . . .
We knew that we’d have to make some internal decisions on how to handle the calculations for those pools with limited internal historical loss history. Data is everything and we performed a lot of reviews of our historical data, determined what we needed to add, consolidate and obtain from a historical standpoint as well as what we need for future pooling or methodologies. Documentation is key here: where you get the data, how you validate it, how you use that data for your pooling. Shadow Loss Analysis gives us a test environment. We can test different models, different pooling to understand the CECL calculation then go back and add the historical data, like risk rating codes and FICO scores. All that can be incorporated to analyze your pooling or methodologies in different ways – and determine your data gaps. Every bank has fields in the core system they will use with CECL that they didn’t use in the past. We all have wish lists and have to go back to clean up data. Those are the basics.
David Jaques on data gaps . . .
We were very concerned about our limited historical data, which only goes back to the end of 2014. We thought the only model that would work for us would be cohort, so we loaded it into Shadow Loss Analysis. But transition matrix doesn’t need as much data as some other models, because it’s based on movements between risk ratings, or other loan states over time. So what historical data we have has driven our CECL methodology selection. Our Phase 3 parallel test will help us determine which model is most appropriate for each loan pool. We’re also working on fields that will support the undisbursed, non-cancellable lines-of-credit, as part of our CECL model.
Nathan Kelly on pooling structures and Q-factors . . .
We completed a risk matrix for all our loans, including reviewing differences in rates, terms, geographies and so on, and looked at that matrix to determine how loss was realized differently based on those characteristics. When you go back as far as you do with CECL, you can pinpoint where an additional loss might have taken place more so than under the incurred loss model. We also consolidated some smaller pools. We realized we could use a lot of the fields that drive our pooling now for our CECL pooling. We’re trying to refine how we use Q-factors, as CECL guidance is more specific about tying historical quantitative data closer to qualitative considerations for current as well as future estimates.
David Jaques on pooling structures and Q-factors . . .
We had eight pools in our incurred loss model, but just within CRE you can have a lot of different mixes, so our commercial pool now is by call code. Our consumer pools pretty much mirror the call codes and haven’t changed much. We have $1.5 billion in co-op mortgages that have never had a loss. We isolate those from the rest of the multi-family mortgages because we didn’t want the loss rates to be skewed by co-op. A lot of our pooling resulted from our lack of historical data. It wasn’t what we expected, but our lack of historical and loss data forced our hand in pulling back from being more granular. We will leverage call report data from 2012 forward on a pool basis and make adjustments that make sense. We use a lot of Q-factors. We went back and did correlation analysis on each of them and evaluated factors against each other to make sure we weren’t measuring the same thing with different factors. Then we correlated them to loss rates. If there was a strong correlation we kept them; if not, we generally threw them out. CECL has afforded us the opportunity to overhaul our Q-factor process, to cut back the number of Q-factors we employ. Certain factors apply to cohort that won’t apply to transition matrix, and currently we’re evaluating which factors to use with each model.
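The correlation screen described here can be sketched as follows — the factor names, series values, and 0.7 cutoff are all illustrative, not Valley National’s actual data or thresholds:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def screen_q_factors(factors, loss_rates, cutoff=0.7):
    """Keep only Q-factors whose |correlation| with loss rates meets the cutoff."""
    return {name: round(pearson_r(series, loss_rates), 3)
            for name, series in factors.items()
            if abs(pearson_r(series, loss_rates)) >= cutoff}

factors = {
    "unemployment_rate": [4.0, 4.5, 5.5, 6.0],   # moves with losses -> kept
    "office_vacancy":    [9.0, 8.0, 9.5, 8.5],   # weakly related -> dropped
}
loss_rates = [0.2, 0.3, 0.5, 0.6]
kept = screen_q_factors(factors, loss_rates)
```

A real screen would also compare factors against each other, as described above, to avoid keeping two factors that measure the same thing.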
David Jaques on engaging with auditors and regulators through transition . . .
We have been very engaged from the start. From our engagement with MST Advisory (now Abrigo Advisory) to our Phase 1 report and our Phase 2 (Design) progress, they’ve been regularly informed and we look for their feedback. We recently did a presentation on transition matrix to get their input. It’s critical to make sure your auditors know how your project is unfolding over time and the processes and data utilized to reach key decision points. On all aspects, from Q-factors to loss calculations, it’s important for them to understand what we’re looking at and how. The more comfortable we can get them, the smoother our transition will be.
Nathan Kelly on engaging with auditors and regulators through transition . . .
We have had limited discussion with auditors and regulators, enough to make sure we’re focused and letting them know where we are in the transition process. In the first half of 2019 we hope to sit down with them and go through our implementation process and documentation. Our regulators expect us to be well along the way in our transition in 2019.
Nathan Kelly on whether CECL will be a “set it and forget it” process . . .
There’s no such thing. We’re always looking at our pooling and our look-back periods, for example in our current incurred model, and I expect the same or more under CECL. We constantly look to add historical data, improve our modeling and review our methodology, to make sure we have sufficient data to move to a methodology that drives a better answer.
David Jaques on whether CECL will be a “set it and forget it” process . . .
Implementation is just the start of the CECL process. We expect a continuous evolution, like when auditors come out with best practices. I think we can expect a lot of changes in our methodology in the first few years. We already are discussing ideas for CECL 2.0, once the initial 2020 adoption phase is behind us. We have to validate all the models, and will submit whatever methodology we choose to be validated annually. So set it and forget it will never happen.
Want to learn more about how Abrigo can help your institution through the CECL transition with software or advisory services? Contact us.