Credit scores have long been the key measure of the likelihood that an American consumer will repay a loan, from mortgages to credit cards. But the factors that FICO and other credit-scoring companies rely on, such as credit history and credit card balances, often depend on already having credit.
In recent years, a crop of startups has launched on the premise that borrowers without such a history may still be very likely to repay, and that their likelihood of doing so can be determined by analyzing large amounts of data, especially data that has traditionally not been part of credit assessment. These companies use algorithms and machine learning to find meaningful patterns in the data, alternative signs that a borrower is a good or bad credit risk.
These companies are still young, and to date there is no clear indication that their approaches have significantly expanded access to credit; the lenders that use them often charge high interest rates, according to a report by the National Consumer Law Center, a consumer advocacy group. Consumer advocates fear that some of these new data sources, such as insights into online consumer behavior or financial data traditionally excluded from credit analysis, may unintentionally skew the results, leading to unfair judgments of some borrowers. In the United States, lenders are prohibited by law from considering race, gender, and religion in lending decisions.
Los Angeles-based ZestFinance, founded by former Google CIO Douglas Merrill, claims to have solved this problem with a new credit scoring platform, called ZAML. The company sells machine learning software to lenders and also offers consulting services. Zest does not lend money itself.
The platform was refined through Zest's experience with the Chinese search engine Baidu, where only 20% of the population has a known credit history. By studying 21 different factors, such as the way people search and the way they browse web pages, Zest found patterns in Baidu's data that could be used to decide whether to grant these customers small loans for purchases like clothing. Among the things Zest evaluated was how a person's self-reported income compared with their "modeled income," Zest's estimate of what that person actually earns, based on other behavior. Just as important as the gap between reported and modeled income is whether a person inflates their income, in other words, reports more than the model implies they actually earn, and by how much, Merrill said.
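The income-gap signal described above can be sketched in a few lines. This is an illustrative toy, not Zest's actual method: the function name, the features derived, and the sample figures are all assumptions made for the example.

```python
# Hypothetical sketch of features a lender might derive from the gap between
# self-reported income and a model's income estimate. Illustrative only;
# not ZestFinance's actual implementation.

def income_signals(reported: float, modeled: float) -> dict:
    """Derive simple signals from reported vs. modeled income."""
    gap = reported - modeled                  # absolute gap
    inflated = gap > 0                        # did the applicant over-report?
    # relative inflation, guarding against a zero modeled income
    inflation_ratio = gap / modeled if modeled > 0 else float("inf")
    return {"gap": gap, "inflated": inflated, "inflation_ratio": inflation_ratio}

# Example: an applicant reports 60,000 but the model estimates 45,000
signals = income_signals(reported=60_000, modeled=45_000)
```

A downstream scoring model could then treat both the direction of the gap (inflated or not) and its size as inputs, which matches the article's point that how much a person over-reports matters, not just whether they do.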
Within two months, Baidu, which runs a small lending business, approved 150% more borrowers with no increase in loan losses, and it has granted hundreds of thousands of loans since, Merrill says.
Andrew Ng, Baidu's chief scientist, credits Zest's technology with helping his company accelerate its entry into consumer financial services by improving the "predictability" of its credit models using behavioral data from borrowers' online searches, mobile wallets, and other sources. With Zest, Baidu found that borrowers who engage in risky behavior online, such as gambling or visiting risky websites like those that sell illicit goods or cater to thrill-seekers, have a statistically higher probability of defaulting on a loan.
“While perhaps ‘obvious’ in hindsight, clues like these can have a significant effect on underwriting performance,” Ng wrote via email.
Zest has also worked with two credit card issuers and an auto lender. Among credit card holders, an important signal turned out to be calls to the help desk, something the lender had not connected to creditworthiness before Zest's work. It turns out that a person who calls to ask for more time to pay a balance, despite delaying the payment, is probably in fact a reliable customer. "The intuition is sometimes wrong," says Merrill.
The company says its protection against bias is that the system evaluates 100,000 different data points for each borrower, and no single point plays a determining role. To test for bias, Zest again relies on machine learning, using the system to check its own results. It applies an algorithm that the Consumer Financial Protection Bureau uses to check for discrimination, and it also runs other tests to find unexpected correlations with factors that lenders are prohibited from considering.
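One common family of discrimination checks compares outcomes across groups. The sketch below shows a generic disparate-impact test of this kind; it is an assumption-laden illustration, not Zest's or the CFPB's actual algorithm, and the function names, threshold, and sample scores are invented for the example.

```python
# Illustrative disparate-impact style check: compare approval rates between
# two groups of applicants. Generic fairness arithmetic, not the CFPB's or
# ZestFinance's actual test.

def approval_rate(scores, threshold=0.5):
    """Fraction of applicants whose model score clears the approval cutoff."""
    return sum(s >= threshold for s in scores) / len(scores)

def adverse_impact_ratio(scores_a, scores_b, threshold=0.5):
    """Ratio of group A's approval rate to group B's. Values well below 1.0
    (a common rule of thumb flags < 0.8) suggest potential disparate impact."""
    return approval_rate(scores_a, threshold) / approval_rate(scores_b, threshold)

# Example: group A is approved half as often as group B at the same cutoff
ratio = adverse_impact_ratio([0.9, 0.4, 0.7, 0.3], [0.8, 0.6, 0.9, 0.7])
```

A lender could run a check like this against any attribute it is barred from considering, or against proxies correlated with one, which is the kind of "unexpected correlation" testing the article describes.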
Baidu's Ng endorsed Zest's technology for its ability to explain what he called "black-box machine learning underwriting models" and for its focus on detecting and correcting both explicit and hidden biases.
Explaining credit decisions to borrowers and regulators will be essential, says Chi Chi Wu, an attorney at the National Consumer Law Center, in particular showing that the data models being relied on are truly predictive, and not merely correlated. "Alternative data is not the ultimate solution," she says.