Lending and AI

Buying a house these days can feel almost out of reach.  Who can afford to pay cash for a decent house, or even the minimum down payment?  That’s where lenders come in.  Banks and finance companies have been making these loans for years.  But now there is an elephant in the room: AI.  The top US bank regulator is warning that lenders need to ensure that artificial intelligence tools don't perpetuate biases and discrimination in credit decisions.[1]

Federal Reserve Vice Chair for Supervision Michael Barr said that the central bank is working on its supervisory efforts around AI.  He added that if used safely, the technology could have a positive impact on access to loans.  AI could leverage data "at scale and at low cost to expand credit to people who otherwise can't access it,” Barr said earlier this week in prepared remarks for the National Fair Housing Alliance conference in Washington.  "While these technologies have enormous potential, they also carry risks of violating fair lending laws and perpetuating the very disparities that they have the potential to address.”

Barr said machine learning and other artificial intelligence could amplify bias or errors in data or might make incorrect predictions.  "There are also risks that the data points used could be correlated with a protected class and lack a sufficient nexus to creditworthiness,” he said.
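
To make that proxy risk concrete, here is a minimal sketch of the kind of screen a lender might run before training a credit model: checking whether a candidate input correlates with a protected attribute while adding little to predicting repayment.  The data, column names, and thresholds below are hypothetical illustrations, not a regulatory standard.

```python
# Minimal sketch of a "proxy feature" screen, assuming synthetic data: flag model
# inputs that correlate strongly with a protected attribute while adding little
# to predicting repayment.  Column names and thresholds are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "protected_class": rng.integers(0, 2, n),   # stand-in for a fair-lending attribute
    "income": rng.normal(60_000, 15_000, n),
    "zip_cluster": rng.integers(0, 10, n).astype(float),  # candidate proxy feature
})
# Construct the synthetic proxy so it tracks the protected attribute, not repayment.
df["zip_cluster"] += 5 * df["protected_class"]
df["repaid"] = (df["income"] + rng.normal(0, 10_000, n) > 55_000).astype(int)

for col in ["income", "zip_cluster"]:
    corr_protected = df[col].corr(df["protected_class"])  # proxy risk
    corr_outcome = df[col].corr(df["repaid"])              # nexus to creditworthiness
    needs_review = abs(corr_protected) > 0.3 and abs(corr_outcome) < 0.1
    print(f"{col}: corr with protected class {corr_protected:+.2f}, "
          f"corr with repayment {corr_outcome:+.2f}, review={'yes' if needs_review else 'no'}")
```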

Gary Gensler, head of the Securities and Exchange Commission, also sounded warnings about the use of AI in finance.  He said that companies need to be aware of how their use of AI may not be in line with securities rules.  The proliferation of AI means governments will probably have to overhaul regulations to maintain global financial stability, Gensler said.

Artificial intelligence is already widespread across banking, payments and insurance. Whether we know it or not, algorithms make decisions about our finances every day.  At present, the technology is most commonly used to market products and to enhance customer service, where AI chatbots have become the first port of call for a growing number of customers.[2]

As these chatbots help to answer common queries about payment balances, order statuses and returns, human customer service teams are freed up to address more complex issues.  Theoretically, this improves the customer experience and lowers costs.  And, as the new wave of generative AI, based on large language models such as ChatGPT, is applied to more banking and payment services, it will become capable of taking on these more complex queries, too.

 

Applications of AI in banking and payments - Accenture’s generative AI lead for banking in the UK says: “We can expect even greater and more precise personalization specific to each customer’s unique circumstance.  This will be down to how letters and emails are written to give the customer information only they need, at a time when they need it.”

Still, new benefits come with new risks.  UK consumer group Which? warns that, if automated decisions are based on biased or inaccurate data, some consumers could be excluded from certain products or suffer financial losses.

Which?’s director of policy and advocacy says: “Ultimately, if consumers are going to benefit from AI, then they need to know that the [regulators] will adopt a robust approach to supervision, with tough enforcement for firms not delivering for their customers.”  No wonder most firms are treading carefully.  The chief data officer at brokerage Hargreaves Lansdown says: “Like the vast majority of businesses, we are still understanding how we can best use the technology.”

That may be in ways that have not yet been identified.  Simon Lyons, lecturer at The London Institute of Banking & Finance, says: “When we think of AI, we assume that its usage is to take over tasks that humans do and do them better.  However, the true value in AI is the identifying of trends and making judgments from them.”

AI is helping with number crunching, processing, and the heavy lifting of data analysis.  Investment fund Augmentum Fintech says: “The vast majority of AI deployments today involve predictive AI, where machine-learning models are trained on historic data and then used to support rules-based decision making in use cases such as underwriting, fraud detection and trading strategies.”
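
As a rough illustration of that predictive pattern, the sketch below trains a model on synthetic “historic” loan outcomes and then applies a rules-based cut-off to its probability output, loosely mirroring an underwriting decision.  The features, the scikit-learn choice, and the approval threshold are assumptions for illustration only.

```python
# Minimal sketch of "predictive AI plus rules-based decisioning", assuming synthetic
# historic loan data.  The features, library choice (scikit-learn) and approval
# cut-off are illustrative assumptions, not any lender's actual policy.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 10_000
X = np.column_stack([
    rng.normal(650, 60, n),     # credit score
    rng.uniform(0.0, 0.6, n),   # debt-to-income ratio
])
# Synthetic historic outcomes: higher score and lower DTI repay more often.
p_repay = 1 / (1 + np.exp(-(0.02 * (X[:, 0] - 650) - 6 * (X[:, 1] - 0.3))))
y = (rng.random(n) < p_repay).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
model = LogisticRegression().fit(X_train, y_train)

# Rules-based layer on top of the model's probability output.
APPROVE_THRESHOLD = 0.7   # hypothetical policy cut-off
probs = model.predict_proba(X_test)[:, 1]
decisions = np.where(probs >= APPROVE_THRESHOLD, "approve", "refer to underwriter")
print(dict(zip(*np.unique(decisions, return_counts=True))))
```

In a setup like this, the cut-off itself is a policy choice, and it is exactly the kind of decision rule that the fair-lending concerns described earlier would scrutinize.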

Checking payments and transactions for evidence of financial crime, by spotting suspicious behavior patterns, is a top use case.  Banks are using AI and the data they collect when processing transactions and authorizations to predict fraud.  In fact, Accenture says: “Many of the frauds and scams discovered in recent years would not have been found without the advanced algorithms which look for signal in the noise.”
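
The sketch below shows one common shape such pattern-spotting can take: an unsupervised anomaly detector scoring synthetic transactions and flagging the outliers for human review.  The features, contamination rate, and review workflow are illustrative assumptions, not any bank’s actual system.

```python
# Minimal sketch of pattern-based fraud screening, assuming synthetic transactions
# and an unsupervised anomaly detector.  Features, contamination rate and the
# human-review step are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)
# Synthetic transactions: [amount, hour of day, distance from home in km].
typical = np.column_stack([
    rng.gamma(2.0, 40.0, 2_000),       # everyday purchase amounts
    rng.normal(14, 4, 2_000) % 24,     # mostly daytime activity
    rng.exponential(5, 2_000),         # usually close to home
])
suspicious = np.array([[4_500.0, 3.0, 800.0],   # large amount, 3 a.m., far from home
                       [2_900.0, 2.5, 650.0]])
transactions = np.vstack([typical, suspicious])

detector = IsolationForest(contamination=0.01, random_state=2).fit(transactions)
scores = detector.decision_function(transactions)   # lower means more anomalous
flagged = np.argsort(scores)[:5]                     # top candidates for analyst review
print("transaction rows flagged for review:", flagged)
```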

Applications of AI in insurance - Similarly, AI’s ability to process data, spot patterns and make decisions is finding practical applications in insurance.  It is already being used to better assess claims liability, to optimize pricing, and to personalize cover.

Debbie Kennedy, chief executive of insurance broker LifeSearch, says insurers are “leveraging the ability to use advanced analytics to consume and learn from vast data sources.” 

Risks from the use of AI - But there are downsides to the pursuit of delivering the perfect price for each risk.  Consumer group Fairer Finance is calling for boundaries around what insurers can price on, and transparency around what data is being fed into pricing algorithms.  Fairer Finance warns: “The more we move away from the pooling of risk to individualized pricing, the more we exclude people at the margins.  We also end up penalizing people for things they have no control over, or by using statistical correlations to place consumers in the wrong bucket.  You may well be able to show that people from profession A are prone to have more car accidents than people from profession B, but there’s unlikely to be any causality in that link.”
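
A small worked example can make that mispricing risk concrete.  In the sketch below, accident risk is driven by an unobserved factor (annual mileage) that merely happens to differ between two hypothetical professions; pricing off the profession average then overcharges a low-mileage driver in the “riskier” group.  All figures and the pricing rule are invented for illustration.

```python
# Minimal sketch of group-based pricing versus individual risk, assuming an invented
# scenario: accident risk is driven by annual mileage, which merely happens to be
# higher on average in hypothetical profession A.  All figures are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
profession = rng.choice(["A", "B"], n)
miles = np.where(profession == "A",
                 rng.normal(15_000, 3_000, n),   # profession A drives more on average
                 rng.normal(10_000, 3_000, n))
accident_prob = 0.02 + miles / 1_000_000          # the causal driver is mileage

# Pricing off the profession-level correlation (premium = expected claim cost).
CLAIM_COST = 10_000
group_premium = {p: accident_prob[profession == p].mean() * CLAIM_COST for p in ("A", "B")}

# A low-mileage driver who happens to work in profession A is overcharged.
individual_premium = (0.02 + 5_000 / 1_000_000) * CLAIM_COST
print(f"priced by profession A average: {group_premium['A']:.2f}")
print(f"priced on individual mileage:   {individual_premium:.2f}")
```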

 

29% - Percentage of savers comfortable with an adviser using AI.  However, in future, AI could prove beneficial in supporting consumers with financial decisions.  Financial education website Boring Money found that 29% of savers and investors are comfortable with their financial adviser using AI technology to provide a cheaper and better service, and 28% are comfortable taking investment recommendations given as a result of using AI technology.  Even so, there will be natural limits on how transformative the technology can be, says Boring Money.  For example, one of the biggest barriers to taking financial advice remains trust, and “AI is not going to solve this problem,” it notes.

The director of OneStep Financial Planning at Charles Stanley agrees.  “Money is emotional and personal,” she says.  “AI can be many things, but it can’t be human, and it can’t understand you as an individual.”

This article is presented at no charge for educational and informational purposes only.

Red Sky Alliance is a Cyber Threat Analysis and Intelligence Service organization.  For questions, comments, or assistance, please get in touch with the office directly at 1-844-492-7225, or feedback@redskyalliance.com

Reporting:    https://www.redskyalliance.org/
Website:       https://www.redskyalliance.com/
LinkedIn:      https://www.linkedin.com/company/64265941

Weekly Cyber Intelligence Briefings:

REDSHORTS - Weekly Cyber Intelligence Briefings

https://attendee.gotowebinar.com/register/5993554863383553632  

 

[1] https://menafn.com/1106623689/Feds-Barr-Says-Ai-Risks-Amplifying-Bias-And-Errors-In-Lending

[2] https://www.ft.com/content/15ae2b65-7722-4870-8741-b0ddcd54a534
