The Value Gap is a MarketWatch Q&A series with business leaders, academics, policymakers and activists on reducing racial and social inequalities.
When a congressional task force on artificial intelligence convened virtually in May for a hearing on how “human-centered AI” can increase racial equity in housing and financial services, the technology’s promise was closely intertwined with concerns that machine-learning algorithms may exacerbate systemic racism.
Dave Girouard, the chief executive of the Silicon Valley AI lending platform Upstart Holdings Inc., understood the worry.
“The concern that the use of AI in credit decisioning could replicate or even amplify human bias is well-founded,” he said in his testimony at the hearing.
But Girouard, who co-founded Upstart in 2012, also said he had created the San Mateo, Calif.-based company to broaden access to affordable credit through “modern technology and data science.” And he took aim at the shortcomings he sees in traditional credit scoring.
The FICO score, introduced in 1989, has become “the default way banks judge a loan applicant,” Girouard said in his testimony. “In reality, FICO is extremely limited in its ability to predict credit performance, because it’s narrow in scope and inherently backward looking.”
Girouard testified that “for the past two decades, study after study has found that African American and Latino communities have lower credit scores as a group than white borrowers.” The CEO said Upstart views potential borrowers as more than their credit scores and has a more inclusive lending model.
“The heart of an AI system is that it keeps learning and is constantly getting better,” Girouard told MarketWatch in an interview. “Someone in my system who might not have been approved a year ago, today, it will approve them, or it might approve them at a lower rate.”
Upstart provides its machine-learning software to community banks and credit unions across the U.S. to help them assess the creditworthiness of customers seeking personal loans, he said.
Related: Upstart prices IPO at $20, for market cap of $1.45 billion
For 2020, Upstart estimates that its AI lending model approved 30% more Black borrowers than a traditional model, and with interest rates that are 11% lower, according to an email from a company spokesman. Similarly for 2020, Upstart approved 27.2% more Hispanic borrowers with 10.5% lower rates compared to a traditional model, based on the company’s estimates using standard industry methodology, according to the email.
‘It’s very reasonable to be concerned about AI because it’s a very sophisticated system and it does have the potential to introduce bias or unfairness. Our answer to that concern is very rigorous testing.’
As promising as AI may be for expanding credit to underserved communities, academic researchers have pointed to potentially discriminatory pricing in both traditional and fintech lending.
In an updated paper titled “Consumer-lending discrimination in the FinTech Era,” published this year in the Journal of Financial Economics, researchers from the University of California, Berkeley, said they found rate disparities in the mortgage market that appeared unrelated to creditworthiness.
“Risk-equivalent Latinx/Black borrowers pay significantly higher interest rates on GSE-securitized and FHA-insured loans, particularly in high-minority-share neighborhoods,” they wrote. “We estimate these rate differences cost minority borrowers over $450 million yearly.”
GSE refers to the government-sponsored enterprises Fannie Mae and Freddie Mac, which buy mortgages from lenders and package them into securities; the FHA, or Federal Housing Administration, insures mortgages. A spokesman for Freddie Mac couldn’t immediately provide comment on the study when reached by MarketWatch. Fannie Mae and the FHA didn’t immediately return requests for comment.
“Fintech lenders’ rate disparities were similar to those of non-fintech lenders for GSE mortgages, but lower for FHA mortgages issued in 2009–2015 and for FHA refi mortgages issued in 2018–2019,” the researchers wrote.
Study co-author Robert Bartlett, a professor of law at UC Berkeley, said in an interview that AI lending requires caution to avoid inadvertently further entrenching inequality. “The risk of bias is very real,” he said, even for “well-intentioned” algorithms.
Longstanding structural disparities underlie lending outcomes, and the data used to train models may be embedded with bias, Bartlett said. He added that machine learning should be done in an environment highly attentive to fairness, with some form of oversight that allows regulators to understand how lenders’ models work.
Upstart’s platform doesn’t include home mortgages, though “it’s certainly something we could do in the future,” said Girouard, who was previously president of Google’s enterprise business and built the tech giant’s cloud-apps business globally. Upstart has mainly focused on personal loans, he said. Last year the company, which also has an office in Columbus, Ohio, added auto refinancing products.
Upstart connects consumers with banks that offer the loans, while also providing banks with its machine-learning software, according to Girouard. One borrower who came to Upstart in search of a loan is Dayana Flores, who was introduced to MarketWatch by a company spokesman.
In a phone interview, Flores, 26, said she received an $8,000 personal loan through Upstart to pay off credit-card debt she had accumulated while juggling a low-paying cashier job with studying at Lone Star College in Texas. Flores got a “cheaper rate” from Upstart, fully repaying the personal loan in January, before it was due, she said. The loan, obtained through Upstart in late 2017, had a 17.54% interest rate, she added in an email.
Flores, who came to the U.S. from Mexico as a child, told MarketWatch that she is now free of debt and has started taking classes at Houston Community College, with the aim of transferring to a university to study psychology. She said she expects to work various jobs while attending school, and hopes to one day open her own therapist practice.
Upstart’s AI model seeks “accuracy in predicting and understanding who has the capacity to repay” their loans, Girouard said during the congressional hearing. The CEO, who said “we believe bias is always wrong,” also defended Upstart in what he described at the hearing as a “disagreement” with the Student Borrower Protection Center, a Washington, D.C., advocacy group that last year raised concerns about its lending model.
See also: How artificial intelligence could replace credit scores and reshape how we get loans
The SBPC in February 2020 called on Congress and regulators to examine the use of education data in consumer lending due to concerns it may discriminate against people of color. In its report, titled “Educational redlining,” the group alleged that Upstart charged borrowers who went to a Historically Black College or University (HBCU) more for their loans.
“Their conclusions, in our view, were inaccurate,” said Girouard at the hearing. “The use of education data without question improves access to credit” for Black and Latino Americans, as well as for “almost any demographic you can speak to,” he said. “Our models aren’t perfect, but they certainly are not discriminatory.”
‘FICO scores are very narrow in scope and backward looking, meaning they’re an accumulation of your historical use of credit. But how do you successfully use credit if you don’t have a FICO score?’
Upstart agreed last year to work with the SBPC and NAACP Legal Defense and Educational Fund Inc. on a review of its fair lending practices for possible improvements. The company also works with the Consumer Financial Protection Bureau (CFPB) in an effort to “build the most inclusive program possible,” according to a statement from Girouard emailed to MarketWatch.
“Upstart runs fairness tests on every applicant and every loan that goes through our platform,” he said in the statement. “Because these models are new, we share the test results with the government and consumer groups on a regular basis.”
As for industry monitoring, Girouard told the congressional task force that he believes a supervisory system is needed to ensure companies aren’t introducing bias into their AI models.
MarketWatch spoke further with the Upstart CEO for a Value Gap interview, which has been edited for length and style:
MarketWatch: Why do FICO scores fall short?
Girouard: A three-digit number can only represent so much. FICO scores are very narrow in scope and backward looking, meaning they’re an accumulation of your historical use of credit. But how do you successfully use credit if you don’t have a FICO score? It’s a bit of a circular argument.
FICO scores tend to serve people well if they have 20 or 30 years of credit, steady income and a record of repaying all their loans. It mostly serves people right down the middle, those in traditional roles.
That leaves out a lot of people on the margins. It leaves out young people and recent immigrants, while disparately leaving out Black Americans and Americans with low and moderate incomes. So many people aren’t well served by the traditional system through no fault of their own. The heart of what our system tries to do is use a lot more data to identify creditworthy people.
MarketWatch: What are the key data points that you focus on in your system?
Girouard: There’s a whole variety of them. There’s actually like 1,600 different data points in our system. The more things you can know about a person, the more chance you have that you can identify reasons that they’re creditworthy.
To simplify it, one person might be given a good rate because they do have a high FICO score, with the use of credit being a good thing. Another person might get a good loan because he works as a nurse, and nurses tend to be very steadily employed. Another person might have studied economics at a prestigious school, which tends to mean they’re going to have good economic outcomes. Another person might be in the military, and people in the military also tend to have steady employment.
I’m kind of humanizing it. It’s really done by the software. People are far more creditworthy than a three-digit FICO number could ever recognize.
If a nurse’s credit score is 580, which is not a very good credit score, most lenders would say, “No, thank you, I’m sorry I can’t offer you a loan.” But our system might say, I know you have a 580 credit score, but it turns out it’s because you’re really young and haven’t used credit much. Also, you’re a nurse working for a hospital system, and that’s a very reliable source of income.
MarketWatch: So your software considers data points like profession and age?
Girouard: Yes, your credit score, your age, maybe your highest degree of education, your area of study, the industry you work in. It’s this forever list of things. There’s just a lot of subtlety to all of it.
Here’s an important thing to realize: Less than half of Americans have credit scores that qualify them for prime credit, but something like 80% to 85% of Americans have never defaulted on anything.
What our software is trying to do is fill in that gap. Who are these people who are 40% to 50% of Americans who have never defaulted on anything, and yet they don’t have good access to credit? We’re trying to identify them in as many other ways as we can.
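The idea of scoring on many signals rather than a single number can be sketched in a few lines of code. This toy model is purely illustrative: the feature names, weights and normalization are invented for this sketch and are not Upstart’s actual variables or methodology.

```python
# Toy illustration of multi-signal credit scoring; NOT Upstart's model.
# All feature names and weights below are invented for this sketch.

def toy_score(applicant):
    """Combine several signals into a single approval score."""
    score = 0.0
    # A thin file keeps the FICO contribution small rather than disqualifying.
    score += 0.4 * (applicant["fico"] - 300) / 550        # normalize the 300-850 range
    score += 0.3 * applicant["income_stability"]          # 0..1, e.g. steady employer
    score += 0.2 * min(applicant["years_in_field"], 10) / 10  # capped occupational tenure
    score += 0.1 * (1 - applicant["debt_to_income"])      # lower DTI is better
    return score

# A young nurse with a 580 FICO but a steady hospital job:
nurse = {"fico": 580, "income_stability": 0.9, "years_in_field": 3, "debt_to_income": 0.2}
# A borrower with the same 580 FICO but volatile income and a high debt load:
other = {"fico": 580, "income_stability": 0.2, "years_in_field": 1, "debt_to_income": 0.6}

print(round(toy_score(nurse), 3))   # prints 0.614
print(round(toy_score(other), 3))   # prints 0.324
```

The point of the sketch is only structural: two applicants with an identical 580 FICO score can diverge sharply once other signals are allowed to contribute, which is the gap Girouard says the software tries to fill.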
MarketWatch: How has Upstart helped marginalized communities?
Girouard: If you just look at a FICO count, which is one way to look at it, our model approves 86% more near-prime borrowers. Near prime means borrowers whose credit score is between 620 and 660. So almost twice as many people in that range are approved by our model compared to a traditional model, and with an interest rate that’s an average 5 percentage points lower.
In another example, banks that use our technology will approve on average 34% more Black applicants than a traditional model would have, and with an average 4 percentage points lower interest rate. [Editor’s note: That’s over the past three years, Girouard explained in his written testimony for the congressional hearing.]
MarketWatch: How do you address concerns that bias may be built into AI lending models?
Girouard: It’s very reasonable to be concerned about AI because it’s a very sophisticated system and it does have the potential to introduce bias or unfairness.
Our answer to that concern is very rigorous testing. You have to test every single application for bias, not just every loan, looking at how rates, or approval rates, compare for different groups of people. We’ve been doing this for many years, and the results of the tests are submitted every quarter to the Consumer Financial Protection Bureau.
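The kind of approval-rate comparison Girouard describes can be sketched as a simple disparate-impact check. The "four-fifths" threshold used here is a common fair-lending and employment-testing convention, not Upstart’s disclosed methodology, and the data is invented for illustration.

```python
# Sketch of a disparate-impact screen on approval decisions, assuming the
# conventional "four-fifths rule" benchmark; not Upstart's actual test suite.

def approval_rate(decisions):
    """Share of applications approved (decisions are 1 = approved, 0 = denied)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected, reference):
    """Ratio of the protected group's approval rate to the reference group's."""
    return approval_rate(protected) / approval_rate(reference)

# Toy decision data for two groups of applicants:
group_a = [1, 1, 0, 1, 1, 0, 1, 1]   # reference group: 6/8 = 75% approved
group_b = [1, 0, 1, 1, 0, 1, 0, 1]   # protected group: 5/8 = 62.5% approved

ratio = adverse_impact_ratio(group_b, group_a)
print(round(ratio, 3))               # prints 0.833, above the 0.8 benchmark
print(ratio >= 0.8)                  # prints True: passes this simple screen
```

A real fair-lending test program would run comparisons like this across every application and every priced loan, on rates as well as approvals, which is what makes the quarterly reporting Girouard describes meaningful.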