
Shadow Loss Analysis: Running Parallel Methodologies

February 13, 2019

Many thanks to Nathan Kelly, SVP, Credit Risk & Reporting Officer, United Bank, and David Jaques, FVP, Credit Analytics Manager, Valley National Bank, for discussing their CECL journey and how partnering with a third party, Abrigo (formerly MST), and running parallel methodologies using the MST Loan Loss Analyzer’s “Shadow Loss Analysis” module are helping them determine which CECL models will best suit their institutions.

Automating the Allowance Process

Nathan Kelly on the impact of automating the allowance process . . .

We moved from Excel to automation in 2015 to improve the efficiency of our existing process. We chose MST, now Abrigo, at that time, and our partnership has grown as we've needed to respond to the increasing complexity of our incurred loss estimations. One of the benefits of converting to an automated solution early has been the warehousing of our data. We also use Abrigo Advisory Services as a trusted resource to address issues that have come up during the transition. I think a financial institution needs to focus on a partnership as opposed to just an automated solution. We have found it valuable to bring various bank departments onto our transition team to better understand the impact CECL is going to have on the bank and on capital. We've got a lot of bench strength, people who've been with the bank a long time. We have used the Advisory team as a sounding board all along the way. They are professionals with real-life experience, including at large audit firms.

David Jaques on the impact of automating the allowance process . . .

We went live with the MST Loan Loss Analyzer in 2016. It has allowed us to compile a data warehouse for our 149,000+ loans. CECL implementation is critical and highly visible, so when it came time to consider a CECL solution, Abrigo (formerly MST) was the only choice for us. We formed working groups involving Credit, Risk, Accounting, Capital Planning, and Treasury, and in the first quarter of 2017 we engaged the Advisory team. They helped us segment our loan portfolio and evaluate our portfolio data by reviewing how we had been segmenting the portfolio historically. They also provided us with peer data as well as recommendations for the CECL methodologies we'll be testing with Shadow Loss Analysis. Their role has increased as we've proceeded. They help us get to decisions more quickly than we could on our own.

CECL Methodologies and Testing in Parallel

David Jaques on CECL methods and testing with Shadow Loss Analysis. . .

In an effort to get more granular, we started with a cohort method and 25 pools, later revised to 13. Unexpectedly, a number of pools had zero losses due to the lack of historical loan-level data – we only have loan-level data back to Q4 2014. After a favorable review of an MST (now Abrigo) presentation on Transition Matrix Methodology, we chose to load that additional model into Shadow Loss Analysis. This allowed us to compare the migrated loss rates from the cohort model with the transition matrix probability of default/loss given default rates. The results so far from cohort have been similar to our incurred loss model, with TMM results generally higher. We're going to run both models side by side for all of 2019. We have $25 billion in loans, including a multi-billion-dollar portfolio of PCD loans under CECL. We'll likely combine the PCD loans with our legacy loans, but we have the ability to report a separate reserve for those.
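
For readers newer to the approach, the sketch below shows the basic mechanics behind a transition matrix: observe how loans migrate between risk ratings from one period to the next, turn the counts into probabilities, and read the default column as the period probability of default. It is a minimal, hypothetical illustration – the loan IDs, quarters, and ratings are invented, and this is not the model either bank runs in the Loan Loss Analyzer.

```python
import pandas as pd

# Hypothetical loan-level snapshots: one row per loan per quarter, with the
# loan's risk rating at that date ("D" = default). All values are illustrative.
snapshots = pd.DataFrame({
    "loan_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "quarter": ["2018Q4", "2019Q1"] * 4,
    "rating":  ["3", "3", "4", "5", "5", "D", "4", "4"],
})

# Line up each loan's rating with its rating one quarter later.
wide = snapshots.pivot(index="loan_id", columns="quarter", values="rating").dropna()

# Count rating migrations and normalize each row so it sums to 1.0; the result
# is a one-quarter migration matrix, and the "D" column is the quarterly
# probability of default by starting rating.
counts = pd.crosstab(wide["2018Q4"], wide["2019Q1"])
transition_matrix = counts.div(counts.sum(axis=1), axis=0)
print(transition_matrix)
```

Combining those default probabilities with a loss-given-default assumption and the pool's exposure gives the expected-loss view that gets compared against the cohort model's migrated loss rates.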

Nathan Kelly on CECL methods and testing with Shadow Loss Analysis. . .

Your data can drive your pooling and methodology selection. We grabbed loan-level data back to 2011 – not everything you need for a complete economic cycle, but enough key fields to use cohort for commercial loans and vintage for our consumer portfolios. We considered the loss behavior in those portfolios and how we pool them to determine which methodologies are most appropriate. We may look to transition to more sophisticated modeling down the road, should the need arise, possibly working toward a transition matrix methodology. We looked at all our existing portfolios and tried to understand the risks in them as driven by loss patterns; our commercial portfolio has staggered losses in certain periods, and our consumer portfolios lend themselves to a vintage methodology, where there is a sort of bell curve in the loan losses.
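
To make the vintage idea concrete, the hypothetical sketch below groups consumer charge-offs by origination year and loan age and builds a cumulative loss curve for each vintage; the "bell curve" Kelly describes appears as losses peaking a few years after origination. The figures and field names are invented for illustration and are not data from either bank.

```python
import pandas as pd

# Hypothetical consumer charge-offs by origination year (vintage) and loan age
# at charge-off, plus originated balances per vintage. Figures are invented.
losses = pd.DataFrame({
    "vintage":   [2015, 2015, 2015, 2016, 2016, 2017],
    "age_years": [1, 2, 3, 1, 2, 1],
    "net_loss":  [120_000, 310_000, 150_000, 90_000, 280_000, 110_000],
})
originated = pd.Series({2015: 50_000_000, 2016: 45_000_000, 2017: 60_000_000})

# Loss rate by vintage and age; cumulating across age gives each vintage's
# loss curve. Mature vintages trace the full curve, and younger vintages can
# be projected forward using that shape.
rates = (losses.groupby(["vintage", "age_years"])["net_loss"].sum()
               .unstack(fill_value=0)
               .div(originated, axis=0))
cumulative = rates.cumsum(axis=1)
print(cumulative)
```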

CECL Data Gaps

Nathan Kelly on data gaps . . .

We knew that we’d have to make some internal decisions on how to handle the calculations for those pools with limited internal loss history. Data is everything, and we performed a lot of reviews of our historical data, determining what we needed to add, consolidate, and obtain from a historical standpoint, as well as what we need for future pooling or methodologies. Documentation is key here: where you get the data, how you validate it, how you use that data for your pooling. Shadow Loss Analysis gives us a test environment. We can test different models and different pooling to understand the CECL calculation, then go back and add historical data, like risk rating codes and FICO scores. All of that can be incorporated to analyze your pooling or methodologies in different ways – and to determine your data gaps. Every bank has fields in the core system they will use with CECL that they didn’t use in the past. We all have wish lists and have to go back to clean up data. Those are the basics.
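
As one hypothetical example of that kind of test, the sketch below re-pools a loan extract by FICO band and measures how many records are missing the field – the sort of quick check that surfaces a data gap before a new pooling is adopted. The field names and figures are invented and do not come from the Loan Loss Analyzer.

```python
import pandas as pd

# Hypothetical loan extract; field names are illustrative. A missing FICO
# score shows up as NaN and quantifies the data gap for this pooling idea.
loans = pd.DataFrame({
    "loan_id":   [101, 102, 103, 104, 105],
    "call_code": ["CRE", "CRE", "Consumer", "Consumer", "Consumer"],
    "fico":      [None, None, 645, None, 780],
    "balance":   [1_200_000, 850_000, 15_000, 22_000, 30_000],
})

# Trial pooling: layer FICO bands on top of the existing call-code pools;
# loans without a score fall into a NaN band and are easy to spot.
loans["fico_band"] = pd.cut(loans["fico"], bins=[0, 660, 720, 850],
                            labels=["<660", "660-719", "720+"])
print(loans.groupby(["call_code", "fico_band"], observed=True, dropna=False)["balance"].sum())

# Share of records missing the field – the gap to close before adopting the pool.
print(loans["fico"].isna().mean())
```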

David Jaques on data gaps . . .

We were very concerned about our limited historical data, which only goes back to the end of 2014. We thought the only model that would work for us would be cohort, so we loaded it into Shadow Loss Analysis. But transition matrix doesn’t need as much data as some other models, because it’s based on movements between risk ratings, or other loan states, over time. So what historical data we have has driven our CECL methodology selection. Our Phase 3 parallel test will help us determine which model is most appropriate for each loan pool. We’re also working on fields that will support the undisbursed, non-cancellable lines of credit as part of our CECL model.

Expected Credit Loss Pooling Structures and Q-Factors

Nathan Kelly on pooling structures and Q-factors . . .

We completed a risk matrix for all our loans, including reviewing differences in rates, terms, geographies, and so on, and looked at that matrix to determine how loss was realized differently based on those characteristics. When you go back as far as you do with CECL, you can pinpoint where an additional loss might have taken place more readily than under the incurred loss model. We also consolidated some smaller pools. We realized we could use a lot of the fields that drive our pooling now for our CECL pooling. We’re trying to refine how we use Q-factors, as CECL guidance is more specific about tying historical quantitative data closer to qualitative considerations for current as well as future estimates.

David Jaques on pooling structures and Q-factors . . .

We had eight pools in our incurred loss model, but just within CRE you can have a lot of different mixes, so our commercial pools are now by call code. Our consumer pools pretty much mirror the call codes and haven’t changed much. We have $1.5 billion in co-op mortgages that have never had a loss. We isolate those from the rest of the multi-family mortgages because we didn’t want the loss rates to be skewed by the co-ops. A lot of our pooling resulted from our lack of historical data. It wasn’t what we expected, but our lack of historical and loss data forced our hand in pulling back from being more granular. We will leverage call report data from 2012 forward on a pool basis and make adjustments that make sense. We use a lot of Q-factors. We went back and did correlation analysis on each of them and evaluated factors against each other to make sure we weren’t measuring the same thing with different factors. Then we correlated them to loss rates. If there was a strong correlation, we kept them; if not, we generally threw them out. CECL has afforded us the opportunity to overhaul our Q-factor process and cut back the number of Q-factors we employ. Certain factors apply to cohort that won’t apply to transition matrix, and currently we’re evaluating which factors to use with each model.
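
The correlation screen Jaques describes can be illustrated in a few lines. The hypothetical sketch below checks candidate Q-factors against one another for redundancy and against historical loss rates for relevance; the factor names and values are invented and are not the bank’s data.

```python
import pandas as pd

# Hypothetical quarterly history for one pool: candidate Q-factors alongside
# the observed loss rate. All values are invented for illustration.
history = pd.DataFrame({
    "unemployment":    [4.1, 4.3, 4.8, 5.6, 6.2, 5.9, 5.1, 4.7],
    "hpi_change":      [0.8, 0.5, -0.2, -1.1, -1.4, -0.6, 0.1, 0.4],
    "delinquency_30d": [1.1, 1.2, 1.5, 1.9, 2.3, 2.0, 1.6, 1.3],
    "loss_rate":       [0.20, 0.22, 0.31, 0.45, 0.52, 0.40, 0.33, 0.27],
})
factors = history.drop(columns="loss_rate")

# Factors that are highly correlated with one another are likely measuring
# the same thing; keep one from each redundant pair.
print(factors.corr())

# Factors with little correlation to realized loss rates are candidates to drop.
print(factors.corrwith(history["loss_rate"]))
```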

Engaging with Your Auditors and Regulators

David Jaques on engaging with auditors and regulators through transition . . .

We have been very engaged from the start. From our initial engagement with MST Advisory (now Abrigo Advisory) through our Phase 1 report and our Phase 2 (Design) progress, our auditors and regulators have been kept regularly informed, and we look for their feedback. We recently did a presentation on transition matrix to get their input. It’s critical to make sure your auditors know how your project is unfolding over time and the processes and data used to reach key decision points. On all aspects, from Q-factors to loss calculations, it’s important for them to understand what we’re looking at and how. The more comfortable we can get them, the smoother our transition will be.

Nathan Kelly on engaging with auditors and regulators through transition . . .

We have had limited discussion with auditors and regulators, enough to make sure we’re focused and to let them know where we are in the transition process. In the first half of 2019 we hope to sit down with them and go through our implementation process and documentation. Our regulators expect us to be well along the way in our transition in 2019.

Will CECL be a “set it and forget it” process?

Nathan Kelly on whether CECL will be a “set it and forget it” process . . .

There’s no such thing. We’re always looking at our pooling and our look-back periods, for example in our current incurred model, and I expect the same or more under CECL. We constantly look to add historical data, improve our modeling and review our methodology, to make sure we have sufficient data to move to a methodology that drives a better answer.

David Jaques on whether CECL will be a “set it and forget it” process . . .

Implementation is just the start of the CECL process. We expect a continuous evolution, like when auditors come out with best practices. I think we can expect a lot of changes in our methodology in the first few years. We already are discussing ideas for CECL 2.0, once the initial 2020 adoption phase is behind us. We have to validate all the models, and will submit whatever methodology we choose to be validated annually. So set it and forget it will never happen.

Want to learn more about how Abrigo can help your institution through the CECL transition with software or advisory services? Contact us.

About Abrigo

Abrigo enables U.S. financial institutions to support their communities through technology that fights financial crime, grows loans and deposits, and optimizes risk. Abrigo's platform centralizes the institution's data, creates a digital user experience, ensures compliance, and delivers efficiency for scale and profitable growth.

Make Big Things Happen.