
September 17, 2020

How Credit Scores Are Built And Used


Amy Crews Cutts | President, AC Cutts and Associates

Amy Crews Cutts details how credit scores are built, what factors and variables contribute to how they are evaluated, how credit scores from FICO and others have evolved, and the limits of the formulas and data used in building credit scoring models. Richard Green asks questions about how different debts are treated in the calculations, the issues with using outdated FICO scoring models, and what can be done about social disparities created by flaws in credit scoring.



Please note this automated transcription may contain errors.

00:01:04.590 --> 00:01:22.020
richardgreen: we are talking about how borrowers are underwritten, and one of the most important aspects of underwriting is evaluating the creditworthiness of a borrower.

00:01:23.880 --> 00:01:32.760
richardgreen: As we've talked about, credit scores are one of the leading determinants of whether people can get a loan from a government-sponsored enterprise, Fannie Mae or Freddie Mac, or

00:01:33.630 --> 00:01:46.740
richardgreen: from a bank. And so it's worthwhile understanding credit more deeply, and for that I can think of nobody better than our guest today, Amy Crews Cutts.

00:01:47.430 --> 00:01:56.460
richardgreen: I've known Amy for a longer period of time than either of us would probably like to admit. I first got to know her when she was a professor at Syracuse University.

00:01:57.390 --> 00:02:14.610
richardgreen: Since then, I actually had the pleasure of working with her at Freddie Mac for about 16 months. She was Deputy Chief Economist at Freddie Mac, chief economist at Equifax, and knows all things about credit scores. So Amy, thank you very much for joining us tonight.

00:02:15.210 --> 00:02:16.050
Amy Cutts: Sure. My pleasure.

00:02:17.160 --> 00:02:23.820
richardgreen: So Amy, let's just start with, if you could take a minute to tell us about your background, and then we'll get down to business.

00:02:24.420 --> 00:02:35.580
Amy Cutts: Sure. So I started my life as a public finance economist at Syracuse, looking at income distribution and public housing and other subsidy programs on

00:02:36.390 --> 00:02:44.220
Amy Cutts: what we call the expenditure side of public finance, not the tax side, and then went to Freddie Mac and had a very long, great career there.

00:02:44.970 --> 00:02:52.080
Amy Cutts: It was probably the best university I have ever been to, in terms of the staff that were there, who knew more than anybody else about housing finance.

00:02:53.190 --> 00:03:04.200
Amy Cutts: Then eight years at Equifax, which was also a phenomenal experience, and I've been on my own for about 18 months now as a freelance economist, and that means I can speak my mind, which is great.

00:03:05.130 --> 00:03:20.250
richardgreen: That's one of the reasons I love having you on, Amy. I've known about Amy for years that even when she should pull her punches, sometimes she doesn't, which is, I think, one of the great things about her. Um, so let's talk a little bit about

00:03:21.270 --> 00:03:37.620
richardgreen: the construction of a credit score. Could you just walk us through it? Everybody in this audience knows, I think, what their FICO score is, but I don't know that they know where it comes from. So if you could talk us through: how are these things built?

00:03:37.980 --> 00:03:50.460
Amy Cutts: I'd be happy to. Okay, so at its most basic level, they take a consumer's credit history, actually they take millions of consumers' credit histories. So Equifax, TransUnion and Experian

00:03:50.940 --> 00:04:03.120
Amy Cutts: are the three what we call national credit bureaus, and they provide anonymized data to FICO, and FICO builds models on it. The same thing happens at VantageScore, their only major competitor.

00:04:05.190 --> 00:04:11.310
Amy Cutts: They take all of those credit reports, and if you haven't ever looked at your credit report, now would be a great time to do so.

00:04:11.940 --> 00:04:26.550
Amy Cutts: Um, in it is every account you've opened at a bank. And many finance companies, though not all, report to credit bureaus. One of the things that is unique about the United States in this respect is really

00:04:26.580 --> 00:04:31.050
richardgreen: Just a clarification. When you say every account with a bank, you mean a credit

00:04:31.620 --> 00:04:32.700
Amy Cutts: A credit account. Credit, credit.

00:04:32.820 --> 00:04:34.200
Amy Cutts: Credit, not your savings.

00:04:35.280 --> 00:04:37.350
Amy Cutts: Savings are also reported, but differently.

00:04:39.720 --> 00:04:40.350
Amy Cutts: So,

00:04:41.970 --> 00:04:56.370
Amy Cutts: In the United States, credit reporting is voluntary on the part of the lender. It is not required by any laws, and it started a very long time ago, in the 1800s,

00:04:57.450 --> 00:05:07.980
Amy Cutts: as a way, it's all about reputation; what the bureaus do is broker trust. In any event, your credit accounts are reported to the credit bureaus and

00:05:08.910 --> 00:05:18.330
Amy Cutts: put in a file. In it would be the date that the account was opened and the high credit. So if it's a credit card, think of your credit limit, whatever that is at the moment.

00:05:18.720 --> 00:05:28.560
Amy Cutts: On a mortgage or another installment loan, it would be the opening balance. Um, sometimes the interest rate is reported, but that's not usually a required field.

00:05:29.730 --> 00:05:49.110
Amy Cutts: And the terms: term length and so on. They then report every month whether you have paid as agreed or are current on the account; how many days late you might be: 30, 60, 90, 120; is it in collections; have you been referred to foreclosure; have you filed for bankruptcy.

00:05:50.340 --> 00:05:59.700
Amy Cutts: Um, and then there are, I don't know the exact number, but hundreds of what they call special narrative codes, and the special narrative codes

00:06:00.840 --> 00:06:09.210
Amy Cutts: provide additional information. The sad thing about this, as old as credit scoring is, is that it came out of the era when every byte

00:06:09.750 --> 00:06:19.800
Amy Cutts: of computer storage was very expensive. So they compress all of this information into just a few bytes' worth, and that's where the special narrative codes come in.

00:06:21.000 --> 00:06:29.340
Amy Cutts: All of that credit reporting is dictated by the Consumer Data Industry Association, which is

00:06:30.720 --> 00:06:43.350
Amy Cutts: the trade association for credit reporting. They set up the codes and how they're reported in. So all of that information that's been reported in on your account: how much credit, the balance that you have on it this month,

00:06:44.520 --> 00:06:51.540
Amy Cutts: whether it's jointly held with someone else or if there's another user on the account. I'm just trying to think of all the various things.

00:06:52.560 --> 00:07:02.790
Amy Cutts: All of that information then goes into the mixing bowl of credit scores. It starts off, in its most basic form, if you go back to the original scores:

00:07:03.120 --> 00:07:15.600
Amy Cutts: they were a simple logistic regression. What is the likelihood that a consumer will go delinquent on an account in the next X number of months? Six months, I believe, is the target for FICO scores.
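
The original setup Amy describes, a logistic regression mapping credit-file attributes to a delinquency probability, looks like this in miniature. The features, weights and intercept below are made up for illustration; the attributes and coefficients in real scoring models are proprietary.

```python
import math

def delinquency_probability(features, weights, intercept):
    """Logistic-regression form of an early credit score: combine
    credit-file attributes (utilization, count of past delinquencies,
    and so on) into a probability of going delinquent within the
    target window. Weights here are illustrative, not FICO's."""
    z = intercept + sum(w * x for w, x in zip(weights, features))
    return 1 / (1 + math.exp(-z))  # logistic (sigmoid) function

# Higher utilization and more past delinquencies push the probability up:
p_low = delinquency_probability([0.10, 0], [2.0, 1.5], -3.0)
p_high = delinquency_probability([0.95, 2], [2.0, 1.5], -3.0)
```

The output is a probability between 0 and 1, which, as discussed later in the conversation, is then converted onto the familiar score scale.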

00:07:16.770 --> 00:07:22.530
Amy Cutts: But credit behaviors, like so many things, are learned behaviors, and what you do over time, sort of,

00:07:23.070 --> 00:07:35.550
Amy Cutts: has some lingering effects. So we could look at mortgage loans five years out, and the credit score would still be predictive, though not as powerful as it was at origination, but it still indicates something about

00:07:36.570 --> 00:07:38.730
Amy Cutts: Your likelihood of default, but

00:07:39.960 --> 00:07:47.490
Amy Cutts: Today, the most advanced credit scores are very, very complicated. They took that simple logistic regression

00:07:48.780 --> 00:08:07.050
Amy Cutts: trying to estimate that simple probability, and changed it into what they call a cascade model. So you kind of get sorted into a bucket before they even evaluate your score. As an example, if you don't have very many credit accounts, you would be called a thin-file borrower.

00:08:08.220 --> 00:08:10.950
Amy Cutts: And you don't want to be in the general-population

00:08:12.810 --> 00:08:25.230
Amy Cutts: model, because you don't have a long history behind you. So they might put you in the thin-file part of the cascade to estimate that score. In the end, they've weighted all those cascade models together to come up with

00:08:25.920 --> 00:08:31.710
Amy Cutts: this likelihood of default. They then convert that into a scale. The scale runs from

00:08:32.490 --> 00:08:45.330
Amy Cutts: 300 to 850, with most borrowers falling between 600 and about 750; that's where the meat of it is. And in fact, I think right around 700 is about the median

00:08:45.750 --> 00:08:57.510
Amy Cutts: of those scores. So most people are pretty good at paying their bills. They weight things like, on your credit card, how much of the

00:08:59.160 --> 00:09:05.940
Amy Cutts: credit limit is in use at any time. If you had three cards, it would be far better,

00:09:06.540 --> 00:09:24.210
Amy Cutts: assuming they're all equal in every way, to have a third of your credit limit used on each card, rather than zero, zero, maxed. Okay, so there are certain behaviors they don't like to see: credit cards maxed out. They certainly don't want to see delinquencies. And then, yep.
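
The utilization point can be made concrete with a toy calculation. The two features below are illustrative only; the actual attributes scoring models derive from revolving accounts are proprietary.

```python
def utilization_features(cards):
    """cards: list of (balance, limit) pairs for revolving accounts.
    Returns (aggregate utilization, worst single-card utilization),
    two different ways a model might view the same borrower."""
    total_balance = sum(balance for balance, _ in cards)
    total_limit = sum(limit for _, limit in cards)
    aggregate = total_balance / total_limit
    worst_card = max(balance / limit for balance, limit in cards)
    return aggregate, worst_card

# Same aggregate utilization (one third), very different per-card picture:
spread = utilization_features([(1000, 3000), (1000, 3000), (1000, 3000)])
maxed = utilization_features([(3000, 3000), (0, 3000), (0, 3000)])
```

Both borrowers use a third of their total credit, but only the second has a card at 100% of its limit, which is the behavior Amy says the models penalize.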

00:09:24.990 --> 00:09:34.200
richardgreen: Let me just add, that's, I think, a really important point. Is there empirical evidence that, I mean, if you total the credit capacity, right, if you're 100%, zero, zero,

00:09:34.680 --> 00:09:43.920
richardgreen: basically your unused credit is the same as if you were a third, a third, a third, right? Is there any empirical evidence that one is better than the other?

00:09:45.540 --> 00:09:52.200
Amy Cutts: Yes. So these are things, I mean, FICO has never revealed to me the secret sauce.

00:09:52.890 --> 00:10:04.680
Amy Cutts: But I have worked on Freddie Mac's automated underwriting system, so I know that when they generate these scores, they are looking at every combination they can find to get a little more lift out of the model.

00:10:05.370 --> 00:10:13.890
Amy Cutts: And the person who goes a third, a third, a third, as an example, compared with the people who max out their cards, tends to be

00:10:15.330 --> 00:10:34.680
Amy Cutts: less risky. Having said that, the penalty for one card maxed versus zero, zero, we're talking, that's probably the difference between, say, a 770 and a 771, versus somebody who had everything else equal and a 30-day delinquency, right? So these things matter. And the other thing that sometimes

00:10:36.570 --> 00:10:42.480
Amy Cutts: Well, a lot of other things matter: recency of credit. Have you applied for a lot of credit recently?

00:10:43.770 --> 00:10:45.960
Amy Cutts: Um, have you

00:10:51.000 --> 00:10:59.430
Amy Cutts: The types of credit. So that was a big thing, gosh, maybe when I first came to Freddie Mac. One of the big things that they didn't like about FICO scores

00:10:59.910 --> 00:11:13.830
Amy Cutts: was that people who had an account at a finance company versus a bank had lower scores, all else equal. So you have everything else equal in terms of numbers of accounts, total credit outstanding and so on:

00:11:14.670 --> 00:11:25.320
Amy Cutts: the finance account would create a lower score. Well, part of that is, if you think about where finance companies are in the food chain, there are finance companies like

00:11:26.340 --> 00:11:29.880
Amy Cutts: Credit card issuers who just do credit cards, but they're not a bank.

00:11:31.470 --> 00:11:42.840
Amy Cutts: That's a relatively new development, versus Household Finance, which was, again, 20 or 30 years ago, a way for people with marginal credit to get access to credit markets,

00:11:43.740 --> 00:12:00.870
Amy Cutts: and therefore they were deemed higher risk, all else equal. Um, I think if you look at scores today, that difference has largely gone away, as so many other things have been found to be more powerful, especially as these models get better at segmenting risk. I found,

00:12:02.850 --> 00:12:05.640
Amy Cutts: in the research that I did at Freddie Mac, as an example, that

00:12:06.780 --> 00:12:21.990
Amy Cutts: a 620 borrower, who would be viewed as on the edge between subprime and not, who went to a regular bank to get a mortgage loan, versus a 620 borrower who went to a mortgage company that specialized in subprime loans,

00:12:23.520 --> 00:12:33.330
Amy Cutts: behaved differently. Those two things were very different: kind of, which door did you go in, door A or door B? Those kinds of behaviors turned out to be very, very powerful.

00:12:33.900 --> 00:12:44.160
Amy Cutts: And it's one of the things that we find today with the so-called fintech companies, these finance companies that have been born of the Internet age that are non-banks.

00:12:45.330 --> 00:12:51.600
Amy Cutts: You apply online and all of these kinds of things. They've hired a lot of very good engineers

00:12:52.080 --> 00:12:59.130
Amy Cutts: to come up with models to predict behavior, and you've heard this from Zuckerberg, who said, I can predict credit behavior off of your Facebook account.

00:12:59.790 --> 00:13:11.940
Amy Cutts: And the reason I bring this up is because some of your behaviors, the company you keep, as an example, or the university that you go to, could be very strong predictors of your

00:13:13.440 --> 00:13:23.610
Amy Cutts: credit performance, but they are not allowed under the Fair Credit Reporting Act. So this is kind of a segue, Richard, into how credit scores are created. I can give you

00:13:24.150 --> 00:13:36.540
Amy Cutts: hundreds of variables that would be very strong predictors of your credit performance that are not allowed by federal law, because they are in violation of the Fair Credit Reporting Act, the Equal

00:13:37.650 --> 00:13:42.600
Amy Cutts: Credit Opportunity Act, the Fair Housing Act, these anti-discrimination

00:13:44.550 --> 00:14:00.540
Amy Cutts: laws. And then there's also the court of public opinion: even if it was legal but it was sketchy, I probably wouldn't want to put my company's reputation on the line on the basis of that variable, because anything out there now is going to be a small lift relative to what

00:14:01.800 --> 00:14:09.120
Amy Cutts: we have already in the credit files; even anything I put in the legal space would be a very small lift compared to that.

00:14:11.940 --> 00:14:15.720
Amy Cutts: Having said that, the most important things about

00:14:17.880 --> 00:14:26.760
Amy Cutts: your credit score are going to be things like: how long have you had credit? How many credit accounts do you have? More is better, in the sense that

00:14:27.210 --> 00:14:38.310
Amy Cutts: your ability to manage credit matters. One of the things that affects both young people and very old people is that, because you don't have much credit, how do I know that you're a good credit risk?

00:14:38.940 --> 00:14:46.980
Amy Cutts: You kind of have to have enough to do that. The first credit card is very important. The second is even better. Once you get, you know, four or five accounts,

00:14:47.580 --> 00:15:02.880
Amy Cutts: we start to trust that we know you pretty well and can verify that. And certainly the capacity matters: how much of that credit limit you've used. Coming out of the Great Recession, the laws changed rather dramatically

00:15:04.050 --> 00:15:17.610
Amy Cutts: with respect to granting credit without verification of the ability to repay. And I bring this up because, as Richard knows, in the financial crisis, what led to that,

00:15:18.720 --> 00:15:28.530
Amy Cutts: I wouldn't even call it underwriting, because basically the mortgage market abandoned underwriting. And what happened was, house prices were rising so fast,

00:15:29.040 --> 00:15:38.130
Amy Cutts: you didn't need to verify collateral, you didn't need to verify capacity, because house prices were rising fast enough that you could refinance out of trouble,

00:15:38.520 --> 00:15:46.710
Amy Cutts: take out some cash and use the home as an ATM to fund this house you really couldn't afford, for a while. All underwritten on the basis of a credit score.

00:15:48.030 --> 00:16:07.140
Amy Cutts: But remember that any model, in the econometric sense, is based on all else equal, on the market not changing fundamentally. When underwriting abandoned those other important pillars of the marketplace, credit scores became worth a lot less,

00:16:08.250 --> 00:16:20.490
Amy Cutts: because they were built on the premise that capacity would be underwritten and collateral would be verified. And so the credit scores became less valuable from that standpoint.

00:16:21.630 --> 00:16:33.630
Amy Cutts: So, a long-winded way of saying credit scores are a statistical model, weighted on the basis of the information that's in your credit report. It's not as simple as saying

00:16:34.170 --> 00:16:49.950
Amy Cutts: a 30-day derogatory on a mortgage is worth two times the derogatory on an auto loan, or that it's worth six points or 20 points or 100 points on your score, because it really depends on all the other things in your credit report.

00:16:50.970 --> 00:17:08.010
Amy Cutts: They do a look-back of about two years; the data that they're looking back at is really covering the last two years of performance. The other critical thing to note is that, by federal law, and this goes back to the federal Fair Credit Reporting Act,

00:17:09.060 --> 00:17:16.410
Amy Cutts: from the date of the first derogatory on an account, that would be the time you first went 30 days delinquent,

00:17:18.090 --> 00:17:28.260
Amy Cutts: that derogatory has to be erased at seven years. So you might go delinquent on a mortgage. Think of the great financial crisis: you go delinquent on the mortgage,

00:17:28.740 --> 00:17:45.990
Amy Cutts: you go through foreclosure. It's not seven years from foreclosure; it's seven years from the 30-day derogatory that the negative information has to be deleted from the account. The existence of the mortgage will still be there, it might still say closed negatively, but all of that

00:17:47.100 --> 00:17:51.780
Amy Cutts: length of history goes away and is deleted from the account.
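
The timing rule Amy describes works out like this. This is a simplified sketch of the seven-year clock; real FCRA compliance handling has more edge cases than a single date calculation.

```python
from datetime import date

def purge_date(first_delinquency):
    """Date the derogatory must come off the file: seven years from the
    date of first delinquency on the account, not from the foreclosure
    that may have come years later. Simplified illustration."""
    return first_delinquency.replace(year=first_delinquency.year + 7)

# First 30-day late in March 2009, foreclosure completed in 2011:
# the seven-year clock still runs from the 2009 delinquency.
off_file = purge_date(date(2009, 3, 1))
```

So the negative information in this example comes off in 2016, only five years after the foreclosure itself, which is the point Richard picks up on next.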

00:17:53.310 --> 00:18:05.100
richardgreen: So there are three questions I want to ask you. The first is sort of a technical detail, on the mapping of your statistical model to the score.

00:18:06.330 --> 00:18:11.760
richardgreen: I assume what goes on is that you rank everybody from

00:18:12.780 --> 00:18:30.030
richardgreen: lowest to highest risk, and then you somehow assign a score to that. It's not just some sort of normal distribution or lognormal distribution, or how is it done? Because, as you said, there are very thin tails and a lot of fatness in the middle, so

00:18:30.300 --> 00:18:32.760
richardgreen: it's a tight distribution. How does that work?

00:18:32.850 --> 00:18:40.800
Amy Cutts: So, I really meant to look this up before: it's a mathematical formula. There's no magic in that part; it's just a straight mathematical conversion,

00:18:41.730 --> 00:18:49.440
Amy Cutts: where they take the logistic regression probabilities that come out of it, and, I can't remember, they map it to an odds ratio first, but there's a conversion

00:18:50.460 --> 00:18:52.230
Amy Cutts: where they multiply it into that score.

00:18:53.370 --> 00:18:53.760
richardgreen: So,

00:18:53.790 --> 00:18:56.880
Amy Cutts: It's a straight conversion. There's no magic in it.
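
That conversion is commonly done with a log-odds scaling. A minimal sketch follows; the three constants here, a 700 score pinned to 50:1 good:bad odds and 20 points to double the odds, are illustrative assumptions, since FICO's actual scaling parameters are proprietary.

```python
import math

def probability_to_score(p_bad, base_score=700, base_odds=50, pdo=20):
    """Convert a model's default probability into a scaled score:
    pick a base score pinned to given good:bad odds, with a fixed
    number of points doubling the odds (PDO). Constants are invented
    for illustration; FICO's are not public."""
    odds = (1 - p_bad) / p_bad               # good:bad odds ratio
    factor = pdo / math.log(2)               # points per doubling of the odds
    offset = base_score - factor * math.log(base_odds)
    score = offset + factor * math.log(odds)
    return max(300, min(850, round(score)))  # clamp to the 300-850 range

# A 1-in-51 chance of going bad is 50:1 odds, i.e. the base score of 700.
score_at_base = probability_to_score(1 / 51)
```

Because the mapping is logarithmic in the odds, equal point differences mean equal multiplicative changes in risk, which is one reason the score distribution bunches up in the middle of the range.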

00:18:57.450 --> 00:19:09.060
richardgreen: Okay, the second thing, on the seven years: so a foreclosure disappears from your credit score after seven years, and, again, I didn't know this till just now, it's seven years from when you're first 30 days late,

00:19:09.690 --> 00:19:13.800
richardgreen: which may have been only, you know, five years

00:19:13.800 --> 00:19:14.190
Amy Cutts: Going in

00:19:14.610 --> 00:19:29.940
richardgreen: since the foreclosure actually happened. Um, but, like, Fannie and Freddie can still look at that event and still judge you negatively based on it, independent of what your FICO score is. Is that correct?

00:19:30.600 --> 00:19:34.890
Amy Cutts: Anything in your credit report is fair game for them to evaluate separately.

00:19:36.000 --> 00:19:36.720
richardgreen: Okay, so

00:19:36.750 --> 00:19:39.210
Amy Cutts: So, but it would be deleted from your

00:19:39.300 --> 00:19:45.720
Amy Cutts: So that foreclosure that happened in the great financial crisis is no longer on your record at all.

00:19:46.440 --> 00:19:50.790
richardgreen: But, I mean, Fannie and Freddie can still look up these things, right?

00:19:51.300 --> 00:19:56.640
Amy Cutts: They could look it up in their own records. Right, so they could view that. Yeah, absolutely. Um,

00:19:57.270 --> 00:20:07.410
Amy Cutts: they don't share that information. But if you had a Fannie loan before, Fannie might say, oh, you know, we're not going to give Richard Green a loan because he went through foreclosure. Burned me once, right? That kind of thing.

00:20:07.770 --> 00:20:10.230
Amy Cutts: So yeah, anything like that

00:20:10.800 --> 00:20:12.480
Amy Cutts: Could be used, um,

00:20:14.910 --> 00:20:23.010
Amy Cutts: Yeah, and that's a very important thing, because Fannie Mae no longer uses credit scores. I can't speak to Freddie because I haven't asked.

00:20:23.490 --> 00:20:33.960
Amy Cutts: But Fannie Mae no longer uses credit scores in their automated underwriting system. Back a long time ago, when I was at Freddie Mac, we tried to engineer an automated underwriting

00:20:34.680 --> 00:20:42.690
Amy Cutts: system. It's like a credit score, right, you're evaluating all these things, but it would include the loan-to-value ratio and capacity and other things,

00:20:43.560 --> 00:20:54.480
Amy Cutts: without using credit scores, and we found the credit score still provided meaningful lift on top of the variables in the credit report. Fannie Mae has now

00:20:55.290 --> 00:21:08.280
Amy Cutts: changed their automated underwriting system completely so that it does not use credit scores at all. They are used after the automated underwriting system, for pricing the add-on fees.

00:21:09.450 --> 00:21:10.980
richardgreen: The loan-level price adjustments.

00:21:11.040 --> 00:21:11.670
Amy Cutts: Exactly.

00:21:11.790 --> 00:21:22.170
Amy Cutts: It's not in the whether we would approve this loan or not.

richardgreen: So, before I get to my third question, my understanding is they're still using FICO 5

00:21:22.950 --> 00:21:23.580
richardgreen: For that

00:21:23.880 --> 00:21:25.230
Amy Cutts: So-called Classic FICO. Yep.

00:21:25.470 --> 00:21:43.410
richardgreen: Yeah, and we're now at FICO 9, and when I've talked to FICO, they tell me they come out with a new one every five years, which means that we're using a roughly 20-year-out-of-date score to price loans. Could you comment a little bit on how FICO has evolved and why it might be important?

00:21:44.880 --> 00:21:45.900
Amy Cutts: It's not just FICO.

00:21:46.080 --> 00:21:50.280
Amy Cutts: It's not just FICO. Importantly, VantageScore was created by

00:21:51.480 --> 00:22:04.710
Amy Cutts: a joint effort of the three credit bureaus to compete with FICO. And I'm a free-markets person; I firmly believe that you need competition. Competition is healthy for everyone. And

00:22:05.730 --> 00:22:16.440
Amy Cutts: the reason why they're not using these more advanced scores is simply inertia; this stuff kind of gets baked into all the systems.

00:22:17.520 --> 00:22:24.990
Amy Cutts: FHFA was tasked, with some things coming out of the Dodd-Frank legislation, to look at

00:22:26.550 --> 00:22:36.900
Amy Cutts: opening up the underwriting at Freddie and Fannie to use VantageScore or other competing scores, not just these two scores; there could be a third company out there that creates a score.

00:22:38.400 --> 00:22:39.060
Amy Cutts: And

00:22:42.270 --> 00:22:47.430
Amy Cutts: I'm firmly of the belief that the best score should win, whatever that is. That's the one that should be used.

00:22:47.790 --> 00:22:57.570
Amy Cutts: And it might not be the best score for everybody, meaning that I might segment one market one way and another market a different way and use a different score. Having said that, John Gibbons, a former

00:22:58.200 --> 00:23:01.740
Amy Cutts: Freddie Mac executive really thought that this would be

00:23:02.250 --> 00:23:18.030
Amy Cutts: a problem in the capital markets, that they are so used to looking at these loans and trading these loans on the basis of the FICO score that's been in existence, that he thought this would create a disruption. I think that actually it is, in some sense, just

00:23:19.530 --> 00:23:22.920
Amy Cutts: an unwillingness to invest in the infrastructure to make the change.

00:23:24.330 --> 00:23:27.420
Amy Cutts: Lots of changes happen but they just don't want to do that.

00:23:27.810 --> 00:23:34.230
richardgreen: Yeah, you know, FHFA made that argument too; they have already decided to stay with FICO.

00:23:34.830 --> 00:23:39.660
richardgreen: Yeah. But the argument they made, and I think it's a silly argument, is that there'd be a race to the bottom

00:23:40.710 --> 00:23:50.100
richardgreen: if you had competition in this market. And I'm not sure where the incentive for this race would come from, because ultimately the score is

00:23:51.030 --> 00:24:00.030
richardgreen: being used to determine pricing. And that pricing, by the way: you can be approved for a Freddie or Fannie loan, but you still might not want it, because the FHA loan might be better,

00:24:00.300 --> 00:24:03.150
richardgreen: because when they layer on these pricing characteristics,

00:24:07.110 --> 00:24:15.420
richardgreen: yeah, it becomes expensive. So I'm just not entirely clear where that race-to-the-bottom argument comes from, how that would work.

00:24:15.570 --> 00:24:18.810
Amy Cutts: Well, if you look at the comments that came in, I think that was

00:24:18.870 --> 00:24:23.970
Amy Cutts: It's ultimately an excuse rather than a meaningful argument.

00:24:25.410 --> 00:24:30.630
richardgreen: But I think the thing that bothers me, and if you'd comment on this, I'd appreciate it,

00:24:31.770 --> 00:24:32.700
richardgreen: you can tell me I'm wrong,

00:24:33.300 --> 00:24:41.430
richardgreen: is, I'll give an example of a way that FICO 5, Classic FICO, really bothers me: it doesn't differentiate between

00:24:42.930 --> 00:24:59.640
richardgreen: credit events that happened as a result of exogenous shocks to the household and those that really reflected behavior. And the example I'd give is, suppose your kid gets sick and they have a procedure that's not covered by insurance and you just can't pay.

00:25:00.990 --> 00:25:03.300
richardgreen: And so you have this

00:25:04.440 --> 00:25:18.420
richardgreen: collection item on your record, and that really kills your score. Compare that to somebody who buys a Camaro on a 100% loan and crashes it within

00:25:19.860 --> 00:25:24.630
richardgreen: two weeks and then defaults on that auto loan. Okay, they're both defaults.

00:25:26.820 --> 00:25:40.710
richardgreen: Classic FICO doesn't differentiate particularly well between those defaults, and yet one actually reflects attitude toward credit; the other reflects somebody getting a bad hand dealt to them and trying to make do as best they can.

00:25:41.760 --> 00:25:42.210
richardgreen: And

00:25:43.770 --> 00:25:45.660
richardgreen: My supposition is, and I actually

00:25:45.780 --> 00:25:48.540
richardgreen: do have some evidence, but I can't really get into it or

00:25:49.200 --> 00:25:50.580
richardgreen: share it with people,

00:25:50.970 --> 00:26:02.220
richardgreen: is that that family with a sick kid is a much better credit risk than the guy who crashes his Camaro. And so that lack of differentiation is a real problem.

00:26:04.140 --> 00:26:12.150
Amy Cutts: Yeah, I'll give you a better example. The reason I say this is that medical collections are no longer reported

00:26:13.710 --> 00:26:24.780
Amy Cutts: by credit bureaus; they have elected not to do that. Um, now, if you put it on your credit card and you default on the credit card, it's still a medical expense.

00:26:25.950 --> 00:26:33.750
Amy Cutts: The impact may be the same, that you defaulted on it, but it's not tagged as a medical collection. So things that are tagged as medical collections are not

00:26:35.130 --> 00:26:36.930
Amy Cutts: Used in credit underwriting

00:26:38.580 --> 00:26:53.700
Amy Cutts: This is relatively new; I'm not saying this was always the case. Having said that, um, you know, we experienced Katrina, and all these people who stayed in their homes, the homes were flooded and, you know, condemned.

00:26:54.960 --> 00:27:03.870
Amy Cutts: And the insurance wouldn't cover it because of the type of event: flood events are not covered, but wind is, and, you know, it depends on which kind of hurricane hits you.

00:27:05.190 --> 00:27:21.540
Amy Cutts: Those kinds of things really do test people, right? And some people will fight for that and make the payments and scratch the money together, and other people view it as, you know, a contract to be broken.

00:27:23.010 --> 00:27:30.270
Amy Cutts: So there are very different behavioral characteristics of those two types of people, the people who walk away and the ones who make payments on that.

00:27:31.860 --> 00:27:40.590
Amy Cutts: And I agree. You know, independent of the moral aspect of what you brought up, I simply think that anytime we are not using

00:27:41.610 --> 00:27:46.140
Amy Cutts: The very best information available to us and the very best models available to us.

00:27:48.270 --> 00:27:55.380
Amy Cutts: Let's forget for a moment the pro-consumer view, and let's take the pro-business view.

00:27:56.490 --> 00:28:08.070
Amy Cutts: What they're doing by not allowing these better scores into their systems, from a business perspective, is: I'm saying yes to borrowers I should say no to, which also means that I'm not letting in borrowers that I should say yes to.

00:28:09.300 --> 00:28:10.800
Amy Cutts: Right, the swap-in of the goods and

00:28:10.800 --> 00:28:11.430
Amy Cutts: The bads.
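
The swap-in/swap-out idea can be sketched as a toy comparison of two scores' decisions over the same applicants. Everything below, the function name and the tiny data set, is hypothetical; real swap-set analyses are run over large loan portfolios.

```python
def swap_set(old_approve, new_approve, is_good):
    """Compare two scores' approval decisions over the same applicants
    (parallel boolean lists). 'Swap-ins' are approved only under the new
    score; 'swap-outs' only under the old. A better score swaps in goods
    and swaps out bads. Toy version of the analysis."""
    goods_swapped_in = sum(
        new and not old and good
        for old, new, good in zip(old_approve, new_approve, is_good))
    bads_swapped_out = sum(
        old and not new and not good
        for old, new, good in zip(old_approve, new_approve, is_good))
    return goods_swapped_in, bads_swapped_out

# Four applicants: the new score approves applicant 2 (a good) that the
# old score rejected, and rejects applicant 3 (a bad) the old approved.
result = swap_set([True, False, True, True],
                  [True, True, False, True],
                  [True, True, False, True])
```

Counting the goods gained and the bads shed is exactly the business case Amy makes for adopting a better score.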

00:28:11.670 --> 00:28:17.160
richardgreen: And in fact, it's not just that, it's that you are

00:28:20.220 --> 00:28:34.650
richardgreen: creating, I think, one of the elements that leads to disparate outcomes: the way these things work, non-Hispanic whites will score better than people of color, even if the person of color is a better credit risk.

00:28:37.560 --> 00:28:43.170
Amy Cutts: Yeah, and that's where these cascade models come in. When FICO goes to these new models, it's because

00:28:43.620 --> 00:28:51.450
Amy Cutts: the cascades get better at sorting people into the right model, the models themselves get better, the use of information gets better. I'll give you an example of recent

00:28:52.170 --> 00:28:58.590
Amy Cutts: Innovations in credit scoring, and this is one of the things where it's technology as well as

00:28:59.160 --> 00:29:07.530
Amy Cutts: An evolution of thinking. We knew about the value of trended data a long time ago; Freddie Mac has been using it for 20, 25 years.

00:29:08.250 --> 00:29:18.630
Amy Cutts: That was in their Early Indicator for loan default. So for loans that had gone late, you want to know which ones to prioritize in trying to get money from people, calling them and saying, hey, what's going on.

00:29:19.470 --> 00:29:25.590
Amy Cutts: You don't want to do all your calls on the first day, because some people were just out of town and are going to mail their payment; other people you want to catch first thing.

00:29:26.070 --> 00:29:39.870
Amy Cutts: Trended data is not just looking at, are you 30 days late now. It's what were the balances over the last several months, how are your payments timed, that sort of thing. We want to look at the trends in that, not just the static, in-the-moment snapshot.

00:29:41.220 --> 00:29:48.150
Amy Cutts: Well, it's not that we didn't want to do that. It's that you need massive pipes to bring in that information.

00:29:49.350 --> 00:30:02.640
Amy Cutts: Right, so you're not sending a static credit report, you're sending multiple snapshots. It's like the difference between an X-ray and an MRI. That kind of information intensity is what trended data is all about, and the newest credit scores take advantage of trended data.
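The static-versus-trended distinction Amy describes can be made concrete with a small sketch. The balances and the trend feature below are invented for illustration; real trended attributes are more elaborate:

```python
# Illustrative sketch of "trended" vs "static" credit data, with made-up
# numbers. A static pull sees only the latest balance; trended data keeps
# a history of monthly snapshots, so a model can see the trajectory.

def balance_trend(monthly_balances):
    """Average month-over-month change across the snapshot history."""
    changes = [b - a for a, b in zip(monthly_balances, monthly_balances[1:])]
    return sum(changes) / len(changes)

# Two hypothetical borrowers with the same current balance of $3,000:
paying_down = [6000, 5000, 4000, 3000]   # balance falling each month
revolving_up = [500, 1000, 2000, 3000]   # balance climbing each month

# A static snapshot cannot tell them apart...
assert paying_down[-1] == revolving_up[-1] == 3000

# ...but the trended view can.
print(balance_trend(paying_down))   # negative: deleveraging
print(balance_trend(revolving_up))  # positive: building up debt
```

Shipping that history for every account on every pull is the "massive pipes" problem Amy mentions: many snapshots per tradeline instead of one.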

00:30:03.660 --> 00:30:13.740
Amy Cutts: You had to have the pipes to do that. So these are technological innovations. It's not that we didn't know about it; we just didn't have the infrastructure, the technological infrastructure

00:30:14.100 --> 00:30:20.190
Amy Cutts: To make it happen. So I look at these things and say we are shortchanging consumers, we're shortchanging lenders

00:30:21.300 --> 00:30:30.540
Amy Cutts: And investors. So the whole economy loses out when the wrong people get loans and the right people are denied access to credit. So it's a

00:30:31.410 --> 00:30:45.090
richardgreen: Let me ask a little bit about thin files. Okay. You know, there's a view, basically the way traditional FICO works, that if there's limited information, we're going to just treat you as a bad credit risk.

00:30:45.750 --> 00:30:49.050
richardgreen: Um, yeah. So you have somebody.

00:30:49.170 --> 00:30:51.780
richardgreen: They pay their rent, they pay their utilities.

00:30:53.880 --> 00:30:56.760
richardgreen: And they've managed to save some money, but they don't have credit cards.

00:30:58.020 --> 00:31:09.840
richardgreen: They're not going to have a good FICO score, so they're not going to have access to the best-quality credit. Is there something we can do about that, other than telling people that they should go get credit cards?

00:31:10.590 --> 00:31:14.550
Amy Cutts: Well, another example is somebody who just moved from Europe to here.

00:31:16.290 --> 00:31:22.620
Amy Cutts: You know, the head of Oxford Economics, a very prominent organization, a very good consulting firm, came over from the UK.

00:31:23.790 --> 00:31:24.480
Amy Cutts: Couldn't get credit

00:31:26.190 --> 00:31:29.010
Amy Cutts: A guy, you know, making $500,000 a year. Couldn't get credit.

00:31:29.430 --> 00:31:37.650
Amy Cutts: And the reason for that is simply: well, how do we know we can trust you? So several things are happening at the same time. One is

00:31:38.640 --> 00:31:46.350
Amy Cutts: Anything you can do to get that first credit card, that's key, getting the first one. After that you work on getting the second one, and so building up that history.

00:31:47.370 --> 00:32:02.610
Amy Cutts: What we're starting to see more of, which is very good: we mentioned earlier that banks only report credit. They are now starting to report and use deposit information within their own underwriting. So if, for example, you have your bank account with Chase

00:32:04.710 --> 00:32:15.000
Amy Cutts: And automated deposits into Chase. Chase will use that income, that automated payment coming in, to underwrite things, because they've verified your income

00:32:15.810 --> 00:32:20.100
Amy Cutts: Through this behavior of money coming into the account. So whether you have a job, that sort of thing.

00:32:20.610 --> 00:32:35.010
Amy Cutts: They are looking at payments for rent. They'll look at the information in your checking account about rent payments and they will use that. Now, that's proprietary to your checking account at Chase or Wells or wherever, that specific lender.

00:32:36.090 --> 00:32:50.100
Amy Cutts: More and more of this information is getting reported to credit bureaus. Equifax has a proprietary database that they are the stewards of, not the owners but the stewards of, called the NCTUE, the National Consumer Telecom and Utilities Exchange.

00:32:51.600 --> 00:32:55.830
Amy Cutts: That is wireless companies, so Sprint, Verizon, those guys.

00:32:57.300 --> 00:33:02.220
Amy Cutts: A lot of, but not all, public utilities. So think of your power and your

00:33:03.960 --> 00:33:06.210
Amy Cutts: Gas Company and so on. And then

00:33:08.250 --> 00:33:11.820
Amy Cutts: Some utilities, cable, cable bills reported into there.

00:33:12.900 --> 00:33:22.620
Amy Cutts: And then there are other databases. There are companies, and you can find these online now, that are go-betweens between the landlord and the renter.

00:33:24.870 --> 00:33:33.120
Amy Cutts: The difficulty with collecting rent information is that you can go to a corporate office and maybe get a hundred or a thousand of their properties,

00:33:33.480 --> 00:33:42.120
Amy Cutts: Of the renters, and report that to a credit bureau, but much of the rental property is run on a mom-and-pop basis. Right, so you can't get it in a single place. So

00:33:42.810 --> 00:33:52.890
Amy Cutts: What these companies have done, and this is a great innovation in fintech, is the renter pays the fintech company, and the fintech company pays the landlord.

00:33:53.460 --> 00:34:00.960
Amy Cutts: They get a fee for this; they might do some other things for the landlord, like manage the taxes or something on the property. And then

00:34:01.650 --> 00:34:11.010
Amy Cutts: They send a note to the credit bureau that the renter paid their rent. So there's something in it for the consumer, something in it for the landlord.

00:34:11.580 --> 00:34:24.060
Amy Cutts: And that's a way that more of this information is getting reported to credit bureaus. So it's not that we don't want to use the rent data or provide that to them. It's that it's been very hard to collect it.

00:34:25.050 --> 00:34:27.060
Amy Cutts: So as more of that gets used, it's better.

00:34:28.140 --> 00:34:41.820
richardgreen: So I'd like to finish up with a question on credit scores and social disparity, which is something that has become an issue, and I'm grateful that everybody is talking about it. Yeah.

00:34:43.680 --> 00:34:53.610
richardgreen: So consider the following, and forgive the personal story, but when one of my kids went to buy a car for the first time, she had me come along so that she wouldn't get

00:34:54.570 --> 00:35:10.230
richardgreen: screwed by the dealer. She didn't actually need me; she did just great on her own, but I was in the background. I learned what her credit score was, and it is very good. And I'm thinking, yay, you know, good job, you've been responsible. And she has been.

00:35:11.400 --> 00:35:22.380
richardgreen: But here's the thing: she had her college paid for by her parents, just as I had my college paid for by my parents, which means it's really much easier to

00:35:23.610 --> 00:35:40.500
richardgreen: Manage your credit. You make some income, you save some income, you get a credit card, you pay your credit card every month. And again, I'm very proud of my daughter, she's done great, but she had an enormous advantage, just as I had an enormous advantage.

00:35:41.640 --> 00:35:57.660
richardgreen: For people who come out of college with lots of loans, and these are disproportionately people of color, or people who don't go to college at all and so can't get that really well-paying job,

00:35:58.860 --> 00:35:59.730
richardgreen: With only a high school education,

00:36:01.140 --> 00:36:12.300
richardgreen: Building that credit, building that credit score, is just much harder to do. It doesn't reflect a difference of character. It reflects a difference in initial position.

00:36:13.050 --> 00:36:32.400
richardgreen: And so in that way credit scores continue to, I think, exacerbate these unequal outcomes that are a real problem in our society. Is there anything we can do to address this, while at the same time recognizing the legitimacy of making loans to people who can pay them back?

00:36:33.270 --> 00:36:44.160
Amy Cutts: So let me back that up a little bit. You know, I've looked at this in work I did at Equifax, because one of the databases they have is payroll income. So we could compare income

00:36:45.330 --> 00:37:05.910
Amy Cutts: And things like that. So it is uniformly true that people who have higher incomes have average scores that are higher; this is almost a tautology. The more money you have, the more money you have left over for discretionary things. Once you've paid for the basics, you have more money left over.

00:37:07.380 --> 00:37:12.360
Amy Cutts: That said, many people who have lower incomes

00:37:13.380 --> 00:37:21.930
Amy Cutts: Have great credit scores. So it's not black and white, and it's not a simple thing where I can map your income and get your credit score. That's definitely not the case.

00:37:23.100 --> 00:37:29.010
Amy Cutts: Having said that, um, credit scores are not the problem.

00:37:30.180 --> 00:37:31.530
Amy Cutts: They are the result of the problem.

00:37:33.150 --> 00:37:39.060
Amy Cutts: They then are an intermediate step to exacerbating the problem. So, you know, we've talked,

00:37:40.290 --> 00:37:55.080
Amy Cutts: Throughout discussions of racial disparities, about the lack of wealth, generational wealth, right, the stepping stones that people of color have been deprived of for 500 years now, right, the lack of property ownership, the lack of

00:37:56.640 --> 00:38:00.780
Amy Cutts: disproportionately low college rates and so on. So all of these things right incomes low and

00:38:00.780 --> 00:38:04.680
richardgreen: It goes back to, like, access to FHA and the HOLC.

00:38:05.280 --> 00:38:07.500
Amy Cutts: Back. Exactly, exactly.

00:38:07.950 --> 00:38:11.400
richardgreen: You know, if you had a loan in 1937

00:38:11.490 --> 00:38:22.950
richardgreen: And that was in technical default, which is what most of them were, and you were living in a black neighborhood, you couldn't get relief from the Home Owners' Loan Corporation, whereas if you were in a white neighborhood you could.

00:38:23.310 --> 00:38:26.700
richardgreen: So white people got to keep their houses and black people lost their houses.

00:38:27.390 --> 00:38:40.440
Amy Cutts: Right. And that strips that generation of wealth. Right. And so this has just perpetuated itself to today. This is an incredibly important thing and I feel very passionately about it.

00:38:40.920 --> 00:38:57.570
Amy Cutts: Now, there are some things that have come out. This is a great natural experiment, if you want to know what the impact is of negative information on people's credit scores and their ability to gain credit. The CARES Act amended the Fair Credit Reporting Act.

00:38:58.980 --> 00:39:05.430
Amy Cutts: All federal student loan borrowers were automatically put into deferment through September 30.

00:39:06.930 --> 00:39:16.020
Amy Cutts: And, there's a little asterisk on that, if they were delinquent on their account, it gets marked current for the period of the accommodation.

00:39:18.210 --> 00:39:24.090
Amy Cutts: Now, coming out of that automatic deferment, if you've lost your job, you can go into a regular deferral

00:39:25.860 --> 00:39:30.690
Amy Cutts: Or apply for, you know, the income-based repayment plans and so on for student loans.

00:39:32.190 --> 00:39:43.650
Amy Cutts: The New York Federal Reserve Bank looked at credit scores for all student loan borrowers. The average FICO score went up 10 points

00:39:45.120 --> 00:40:00.480
Amy Cutts: Between March, when the CARES Act was enacted, and today. And that's not just the borrowers that were affected; I think if we looked at just the borrowers that were affected, their scores went up by a very large amount. The asterisk I put on that is that $300 billion worth of

00:40:01.980 --> 00:40:02.970
Amy Cutts: Student loan

00:40:04.470 --> 00:40:14.430
Amy Cutts: Accounts are not on credit bureau files, because they've gone into default and have gone back to the Department of Education, so it sits in their portfolio.

00:40:14.820 --> 00:40:23.670
Amy Cutts: They can garnish wages, they can take tax refunds, they can do all kinds of nefarious things to get the money back. That's no longer reported in the credit bureaus.

00:40:25.020 --> 00:40:28.110
Amy Cutts: So that part is still out there. Um,

00:40:29.610 --> 00:40:41.280
Amy Cutts: Now, the Freddie and Fannie fee. So, by the way, on the rest of your credit accounts, if you asked for a COVID-related accommodation: Freddie and Fannie, it's free for the asking; FHA, free for the asking.

00:40:42.750 --> 00:40:54.690
Amy Cutts: If you have a car loan and a credit card and you ask for a COVID-related deferral, there is no code. I mentioned these special narrative codes. There is no special, you know, CARES Act COVID deferral code.

00:40:54.930 --> 00:40:57.600
Amy Cutts: So they're using natural disaster. They're using

00:40:58.350 --> 00:41:03.810
Amy Cutts: Deferred student loan, even though it's a credit card. They're using these other codes to indicate this deferral.

00:41:06.870 --> 00:41:23.160
Amy Cutts: If the account is current when it enters the accommodation, it is to be counted as current going forward. This is a dramatic change from what happened in the financial crisis, where if you were in a forbearance on a mortgage, your credit was trashed. That was a negative.
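The furnisher rule Amy describes can be sketched as a small function. This is my reading of the rule as she states it, not the statutory text: an account current at entry keeps being reported current during the accommodation, and a delinquent account is not advanced to a worse status. The field names and status strings are hypothetical:

```python
# A minimal sketch of the CARES Act reporting rule described above, under
# one reading: while a COVID accommodation is in effect, an account that
# was current at entry stays reported as current, and a delinquency is
# frozen rather than advanced. Status strings are illustrative only.

def reported_status(status_at_entry, in_accommodation, actual_status):
    """Return the status a furnisher would report to the bureaus."""
    if not in_accommodation:
        return actual_status          # normal reporting applies
    if status_at_entry == "current":
        return "current"              # kept current per the CARES Act
    return status_at_entry            # delinquency frozen, not advanced

# Current borrower misses payments during forbearance: still reported current.
print(reported_status("current", True, "60_days_late"))
# Borrower 30 days late at entry stays at 30 days, not 90.
print(reported_status("30_days_late", True, "90_days_late"))
```

This is also why, as Amy notes next, a lender looking at the file can no longer tell a distressed borrower in forbearance from one who is simply paying on time.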

00:41:24.180 --> 00:41:41.040
Amy Cutts: The CARES Act expressly says it's not a negative, don't treat it that way. The problem is, what I do know is that you are in distress. So you have the forbearance, but it's not reflected in your credit account in the normal sense of a derogatory. So what did Freddie and Fannie do?

00:41:42.330 --> 00:41:50.100
Amy Cutts: They slapped a 50 basis point origination fee on all refinances, because we can't trust the information we're getting.

00:41:50.370 --> 00:42:01.980
richardgreen: Which, by the way, is absurd, because the refinance, by definition, is going to be a safer loan than the loan they have on their balance sheet, because the payments are lower. That's one thing we learned from the financial crash. I'm actually very upset.

00:42:04.830 --> 00:42:18.900
Amy Cutts: It's the price of that uncertainty. There were two bills introduced at the same time as the CARES Act, one in the House, one in the Senate, that would suspend all negative reporting during the declared national pandemic crisis.

00:42:19.920 --> 00:42:26.940
Amy Cutts: Those have not gone anywhere, thank goodness, because I think that's a terrible thing. What I would rather see, Richard,

00:42:27.960 --> 00:42:40.050
Amy Cutts: Your question was about racial disparity and credit scores. I don't want to change credit scores, I don't want to change credit reporting. What I want to do is amend the Fair Credit Reporting Act to allow

00:42:40.680 --> 00:42:50.400
Amy Cutts: A small amount of discretion. Suppose I built the very best model out there, but I happen to know that you are Black.

00:42:51.300 --> 00:43:02.730
Amy Cutts: And all the things I have in my model say that you are a 645 credit score. But I also know that you're a 645 credit score and Black.

00:43:03.630 --> 00:43:15.000
Amy Cutts: If I gave you five bonus points for being Black, I could get you into the right loan at the right price, because you are a great credit risk. But I can't do that, even for your benefit, today. Yeah.
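Amy's thought experiment, which she is explicit lenders cannot legally act on today, is ultimately cutoff arithmetic. A toy sketch using her numbers; the 650 approval cutoff is my assumption, since she only gives the 645 score and the five-point adjustment:

```python
# Toy arithmetic for the hypothetical described above (explicitly NOT
# something current law permits): a modeler who believes a group's scores
# are systematically under-predicted applies a small corrective offset.
# The 650 cutoff is assumed for illustration.

CUTOFF = 650

def decision(score, correction=0):
    """Approve if the (possibly corrected) score clears the cutoff."""
    return "approve" if score + correction >= CUTOFF else "decline"

print(decision(645))      # raw model score falls just short
print(decision(645, 5))   # the five-point correction flips the decision
```

The point of the sketch is how small the margin is: a borrower who is in fact a good risk sits a handful of points below the line that determines the loan and its price.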

00:43:16.020 --> 00:43:26.910
Amy Cutts: And that, I think, is like affirmative action, right? If you turn off affirmative action, then how do I boost people who are ready to take on the

00:43:27.720 --> 00:43:42.030
Amy Cutts: Responsibility of college, right? I don't want to admit somebody to a university who is going to fail. I don't want that. I want somebody who's going to succeed to get in. So in the same way that I look at affirmative action, we can't today legally discriminate for someone

00:43:43.050 --> 00:43:48.480
Amy Cutts: As well. So that's where I would look to fix things. I think it would be a better fix.

00:43:49.860 --> 00:43:54.060
richardgreen: So, Amy Cutts, always great to talk to you. That was an incredibly informative

00:43:55.200 --> 00:43:59.370
richardgreen: Presentation. I learned a couple of things I didn't know, which always excites me.

00:43:59.880 --> 00:44:02.490
richardgreen: Thank you very much for taking some time with us this morning.

00:44:03.210 --> 00:44:04.290
Amy Cutts: Sure. Thank you, Richard.

00:44:04.620 --> 00:44:05.550
richardgreen: Okay, see you soon.