> Affirm makes an assessment of creditworthiness based on a person's name, email, mobile number, birthday and the last four digits of his or her social security number, as well as behavioral factors like how long he or she takes to remember all that information. If that combination doesn’t quite add up to a loan, Affirm may also ask borrowers to share information from other online sources, like a GitHub coding profile or a savings account that shows cash flow history.
It's going to be interesting to see how these lenders show compliance with various aspects of the Fair Credit Reporting Act and Equal Credit Opportunity Act.
The latter, for instance, prohibits lenders from denying credit on the basis of race, color, religion, national origin, gender, marital status, etc. But it's possible some of the data points used by these lenders will turn out to be proxies for these attributes. There is already a lot of discussion around how lenders are using "big data" and the associated potential pitfalls[1][2].
There are two scenarios: someone whose FICO score indicates more risk than Affirm's algorithm does, and someone whose FICO score indicates less risk than Affirm's algorithm does.
Affirm chooses a rate based on its algorithm's prediction, and I believe in the worst case it just offers standard FICO-based rates... Because it has a slightly different business model, it has no incentive to ding customers with a $15 fee if they pay the third payment three days late.
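Reading that claim literally, here is a toy sketch (my interpretation of the comment above, not Affirm's documented pricing): quote the algorithm's rate, but never worse than the standard FICO-based one.

    def offered_rate(model_rate, fico_rate):
        # never quote a rate worse than the standard FICO-based one
        return min(model_rate, fico_rate)

    print(offered_rate(0.12, 0.19))  # algorithm sees less risk than FICO -> 0.12
    print(offered_rate(0.24, 0.19))  # algorithm sees more risk -> fall back to 0.19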
You're oversimplifying the issue here. The point is that Affirm is apparently using an amalgamation of potentially thousands of data points to make decisions about creditworthiness. Some of these data points, including name, email and mobile number, as well as behavioral factors, could very well prove to be proxies for characteristics which the law prohibits lenders from using to make credit decisions.
Interesting. At some level, things like income could be considered a proxy as well... arguably the absence of some of the more stringent consumer lenders, like Chase, from one's credit report might predict membership in a disadvantaged group.
>denying credit on the basis of race, color, religion, national origin, gender, marital status
Are any of these independent factors for greater (or lower) risk once other factors like income, education and conscientiousness are considered? If they aren't, why do any lenders care? And if they are, then any lender is just going to use cruder proxies for these factors, disadvantaging more people.
Yes let's make it easier for youngsters to get themselves in the hock for that hit of instant gratification. Get 'em on the never-never instead of teaching them to save.
What I'd like to see is a disruption to banks that encourages people to save properly, by integrating proper budgeting and accounting into the bank account itself, i.e. walling off money that is allocated for future expenses such as insurance, utilities, presents, etc. And on the other hand, no extortionate fees for going overdrawn. I'd love to see something like that.
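A minimal sketch of the "walled pots" idea, with made-up names and numbers, just to make the behavior concrete: allocated money can't be spent from the everyday balance, and an over-limit payment is declined rather than hit with a fee.

    class PotAccount:
        """Toy 'walled pots' account: allocated money can't be spent by accident."""
        def __init__(self, balance=0.0):
            self.balance = balance   # freely spendable money
            self.pots = {}           # money walled off for future expenses

        def allocate(self, pot, amount):
            if amount > self.balance:
                raise ValueError("not enough free balance to allocate")
            self.balance -= amount
            self.pots[pot] = self.pots.get(pot, 0.0) + amount

        def spend(self, amount):
            if amount > self.balance:
                return False         # decline the payment instead of charging an overdraft fee
            self.balance -= amount
            return True

    # acct = PotAccount(1000)
    # acct.allocate("insurance", 300)
    # acct.spend(800)  # -> False: only 700 is free, the insurance money stays walled off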
Serious question: Why would any financial institution implement something that will reduce their revenues?
Followup question: How would a company be able to come in and offer what you describe and offer users savings, and take a cut for themselves along the way, so that the users and the new company wins, while the incumbent banks lose?
Some banks offer something a little bit in this direction (at least in the UK and Australia). I have heard of banks with 'savings pots' and online saving planners.
For a bank it is probably good if you save. Eventually you will spend the money, and in the meantime the saved money makes them money (due to how the banking system works).
The way I can see this playing out is based on consumer demand. Banks have little in the way of a genuine USP: they all pretty much offer the same thing and must compete on service, interest rates, fees, branding, reputation, etc. So offering a way to help people save could be a good USP for the right kind of bank.
I agree. Even if, by some miraculous technology, they are able to extend loans only to good debtors and to treat their bad debtors ethically... is a conversion-optimized "Make me a debtor!" button really the kind of incentive society should have?
shame.
However Max and Affirm justify their business, there isn't an algo smart enough to offer small credit only to people who won't get into trouble. In fact, there is a huge incentive to lure people into using credit when they won't be able to handle it, which is evidenced by almost every consumer credit business in history. Even if they say they won't charge late fees (for now) to play up their ethics, believe that they are still going to be monetizing outstanding loans. Businesses like this (payday loans / small credit / auto title loans / rent-to-own) typically sell their bad debt to recovery agencies and still turn a profit on the interaction. Not to mention Affirm will get a point or two from the merchant in all cases, I assume.
This is driven by profit, and no bank is interested in people saving money when it could instead be charging interest and fees on debt.
Also, this seems to be more a failing of the education system. Young people should learn proper financial skills in school instead of relying on corporations to help them out. All these loans are voluntary and help plenty of people who need a way to borrow.
> Yes let's make it easier for youngsters to get themselves in the hock for that hit of instant gratification. Get 'em on the never-never instead of teaching them to save.
I think the people who are more likely to save are also the ones less likely to default on loans. Maybe it's just over-simplified thinking, but I think the reason Affirm works so well is that they keep their audience in mind. They pull in a ton of data points to assess the risk on these loans, and I would love to see the percentage/demographics of people who get them.
There are a couple of companies trying to do this right now. Digit just raised $11MM to do something similar, and Acorns does something similar with investing. It is definitely a hard business to be in, though; at the end of the day someone has to pay for these things to work, be it users, merchants, or businesses.
Just had a thought: maybe pension/superannuation companies, which have a vested interest in you saving more. Also mortgage companies, which could help you save for a deposit on a house (and therefore eventually take out a mortgage).
Of course a mortgage is debt, but it is backed by an asset and usually comes with an excellent interest rate, very close to zero or even negative in "real terms".
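To make "negative in real terms" concrete, the standard Fisher relation works; the example rates below are invented, not current market figures.

    def real_rate(nominal, inflation):
        # Fisher relation: (1 + nominal) / (1 + inflation) - 1
        return (1 + nominal) / (1 + inflation) - 1

    print(real_rate(0.02, 0.03))  # 2% mortgage with 3% inflation -> about -0.0097, i.e. negative in real terms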
>>..last four digits of his or her social security number, as well as behavioral factors like how long he or she takes to remember all that information. If that combination doesn’t quite add up to a loan,...
What I find really frustrating with most of these fintech companies is that they claim they use all kinds of data. But they don't really disclose whether this is used for 'underwriting' or 'fraud'. When it comes to underwriting there are specific FCRA guidelines on reason codes. For example, if they decline credit to a consumer because s/he did not type in their address fast enough, they need to tell them that. That would make for a good sound bite for the company.
Also, what if disabled people cannot type fast enough? That seriously touches on discrimination. Too bad CFPB is not going to care about these companies until they get to $10Bn in loan size.
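A rough sketch of what reason-code reporting would imply for a model-driven decline; the feature names, reason text, and threshold here are hypothetical, not actual FCRA reason codes. The point is that whatever pushes the score below the cutoff has to be explainable back to the applicant.

    REASON_TEXT = {
        "thin_credit_file":   "Insufficient credit history",
        "address_entry_time": "Behavioral signal: time taken to enter address",
        "low_income":         "Income insufficient for amount requested",
    }

    def decide(score, contributions, threshold=0.5, top_n=2):
        """contributions maps each feature to how much it pushed the score down."""
        if score >= threshold:
            return {"approved": True, "reasons": []}
        worst = sorted(contributions, key=contributions.get, reverse=True)[:top_n]
        return {"approved": False, "reasons": [REASON_TEXT.get(f, f) for f in worst]}

    # decide(0.41, {"address_entry_time": 0.20, "thin_credit_file": 0.05})
    # -> declined, and "Behavioral signal: time taken to enter address" must be disclosed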
I have a hard time believing that Github commit history actually factors into the loan process. Is the typical customer of a service like this likely to be a programmer, let alone even know what Github is? Also, is there data that shows that someone with a long commit streak is more creditworthy than someone without a Github profile?
Presumably, Affirm is an application of the same "all-encompassing many-sources-as-possible Bayesian-analysis engine" behind both Paypal's anti-fraud and Palantir's anti-terrorism technology. As such:
> Also, is there data that shows that someone with a long commit streak is more creditworthy than someone without a Github profile?
That data doesn't exist; the precise job of this sort of correlation engine is to create it. In credit analysis, you start with a decent creditworthiness model of all your customers, derived not from predictions but from how people with a given feature-set did historically. Then you go looking for new feature datasets that, when incorporated into your model, decrease its RMSE at predicting held-out subsets of your outcome data.
Github commit history could easily be one of these. I would guess it would be a good prediction of both average employment tenure, and of the Big 5 Conscientiousness personality trait.
But I don't have to be right—I just throw the feature-set into the engine, and it either finds signal in it and turns up its weighting in the model, or finds that it's noise and turns it down to zero.
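A minimal sketch of that loop, assuming a scikit-learn-style setup; the column names, target, and model choice are illustrative, not Affirm's. You refit with the candidate feature and keep it only if held-out RMSE drops.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    def rmse_with_features(X, y, cols):
        # hold out 30% of historical borrowers and score the model on them
        X_tr, X_te, y_tr, y_te = train_test_split(X[cols], y, test_size=0.3, random_state=0)
        model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
        return np.sqrt(mean_squared_error(y_te, model.predict(X_te)))

    def try_feature(X, y, base_cols, candidate):
        # keep the candidate feature only if it lowers held-out RMSE
        return rmse_with_features(X, y, base_cols + [candidate]) < rmse_with_features(X, y, base_cols)

    # X: DataFrame of historical borrowers, y: their realized loss rate (both hypothetical)
    # try_feature(X, y, ["income", "employment_tenure"], "github_streak_days")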
I'm kind of curious: is this new to the US market? There was Paypal BillMeLater, etc. In India we have something called EMI (Equated Monthly Installments), which a lot of credit card companies offer for certain online purchases (like mobile phones).
The article claims that credit assessment is partially based on "..behavioral factors like how long he or she takes to remember all that information..". Is data entry latency being used as a proxy for human memory latency?
It's just kitchen-sink AI applied to as much data as possible, plus two plays: 1) a trustworthy lender that doesn't maximize for the short term, and 2) in some cases a better rate, thanks to the AI's data-driven prediction.
There is plenty of disruption in the small business loan space, including billion dollar valuations (one went public in December). They're just not from SV.
>Because of incumbent banks’ endless surcharges, late fees and other abuses, “there’s room for a product by young people for young people,” Levchin, 39, said in a phone interview this week. “That’s what Silicon Valley’s good for, and that’s what we’re doing.”
Levchin is not young, so I guess his startup is DOA.
Levchin must have been having the reporter on. This is not serious.
> based on a person’s name, email, mobile number, birthday and the last four digits of his or her social security number
Onomancy, hotmailancy, astrology, and numerology, respectively. Ok, I made up hotmailancy, but really. He just needed to add tiromancy and they could adopt Wallace and Gromit as their spokespersons.
There are definitely waaaay more data points that go into the loan process. I remember reading that it uses anything from social-profile information to your GitHub commit streak to see if it's worthwhile to provide a loan.
> “There’s an unbelievable amount of information available on American consumers — so it’s not exactly Affirm that you should be freaked out about. Having said that, we’re probably some of the most ethical users of such information.”
Rationally speaking, how is his "trust us" remark any different from the US government NSA spying program's "trust us" remark?
You choose to do business with Affirm in order to get something from them that you deem valuable enough to let them gather information about you, and they tell you that up front.
You don't choose to do business with the NSA, you have no say in whether you consider their "product" worth the cost, and they cover up their own activities.
> You choose to do business with Affirm in order to get something from them that you deem valuable enough to let them gather information about you, and they tell you that up front.
True. That makes the business model more compelling, but it doesn't negate the fact that it takes just one bad actor at Affirm to screw up that trust (just as it takes only one bad actor at the NSA to screw up theirs).
[1] http://www.microfinancegateway.org/library/big-data-big-disa...
[2] https://www.aclu.org/blog/ftc-needs-make-sure-companies-aren...