AI could one day protect banking from human frailties.
When it collapsed in March 2023 with $209 billion in assets, Silicon Valley Bank (SVB) became the second-largest bank failure in US history. Government officials and private finance professionals are only beginning the postmortem, yet SVB’s failure may well have risked triggering a financial crisis of 2008 proportions. Meeting the bank’s financial obligations has already cost the US government billions of dollars. When the news broke, perhaps many readers, like me, thought: “Could AI have helped to prevent this from happening?”
In the few months since the launch of OpenAI’s ChatGPT in late 2022, we can already identify a wide range of activities to which large language models (LLMs) can be applied. As a university professor, I have seen first-hand how the internet and search engines have changed the nature of teaching and learning: the first article I wrote for this journal, eight years ago, was dubbed ‘Make Way for Professor Google’ (Dialogue, September/October 2015).
When I now read about how students are using ChatGPT to produce AI-generated research papers and essays, I am impressed. Or, perhaps, depressed. If LLMs can help complete school assignments, compose music and write novels, imagine their potential in business, particularly banking and finance.
Yet, as reported in the Financial Times, the problems that led to the fall of SVB were more a result of human frailties than of a lack of the deep analysis that AI applications can provide. Allegedly, there was evidence of weaknesses long before the run on the bank by its depositors. A conventional assessment of SVB’s balance sheet, including financial ratio analysis, flagged such risk factors as the large proportion of its loans made to startups, and the large share of its investment assets held in long-term bonds.
Professionals with knowledge and experience should have seen these as part of a recipe for trouble – without the help of an LLM. Reportedly, SVB assumed that its startup borrowers could repay their loans with later rounds of venture-capital funding, rather than with the cashflow from current operations on which established companies rely.
Why should we care about these factors? As SVB matured, venture-capital activity slowed, leaving several of SVB’s fledgling borrowers without the liquidity to keep up their repayment schedules. Moreover, when interest rates rose, bond prices fell accordingly, eroding the market value of SVB’s long-term holdings. Thus, to shore up the liquidity drained by the flight of customer deposits, SVB had to sell its bonds at a loss.
Lloyd Blankfein, who was chief executive of Goldman Sachs during the 2008 banking crisis, summarized the importance of the human element. “I generally don’t second-guess what someone should or shouldn’t have seen when I have the benefit of hindsight,” he told the Financial Times. “I’ll make an exception in this case, because the [problems] were very apparent.”
Every banking blunder is met with the traditional rounds of finger pointing, and SVB has been no different. One well-publicized argument has focused on the law passed by the US Congress in 2018 that weakened some of the key regulatory powers of the Dodd-Frank Act of 2010, which was passed in the hope of preventing future banking and financial meltdowns.
Like that of any transformative technology, the growth of AI applications will be fast and furious. AI in finance and banking is no exception. Its advent is already enhancing the customer experience. It will likely also reduce the sector’s demand for labour: some predict that AI will cut jobs in finance by up to a quarter by eliminating the need for routine and repetitive work.
Could AI help finance decision-makers overcome their human frailties and make better business decisions? We don’t yet have an app for that. But, someday soon, we might.