I’m not super happy with the lack of supporting data, just colorful graphs and a lot of emotional anecdotes, but there is a clear disparity that I think warrants greater investigation. I love the credit union movement and was shocked to read this article. I would love to hear your opinions, lemmings!
How would a bank or credit union even know what racial background a loan applicant comes from? I have a mortgage, and I’ve had auto loans and personal loans in the past. Not once did I ever see a bank employee face-to-face, even for my mortgage.
I suppose the sound of a person’s voice or their name could give some clues on certain occasions.
In the US lots of forms have race or ethnicity input fields. It always baffles me as a European. Like how is that relevant, except for discrimination.
I think the logic is to track data like this… I’ve never understood it either though…
We’re tracking the data to prove we’re doing what we said we’re going to do.
Very much for this. In schools in decent states, students self-identify. That data is then used to look for over-representation in suspensions or expulsions.
How else can you mathematically prove bias or discrimination?
Individually, the ethnicity is known or presumed by staff, but without it being known systemically it can’t be addressed systemically.
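To make the "mathematically prove" part concrete: once demographic data is collected, a standard two-proportion z-test can flag over-representation. This is a minimal sketch with made-up numbers, not data from the article:

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """Two-proportion z-test: are rates x1/n1 and x2/n2 significantly different?"""
    p1, p2 = x1 / n1, x2 / n2
    # Pooled proportion under the null hypothesis of equal rates
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: 90 suspensions among 1,000 students in group 1,
# 50 among 1,000 in group 2.
z = two_prop_z(90, 1000, 50, 1000)
print(f"z = {z:.2f}")  # → z = 3.51; |z| > 1.96 is significant at the 5% level
```

Without the self-identified data, there are no group denominators, so no such test is possible.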
It’s easier to ask for it upfront and ban companies from using it than to try to reconstruct the data after the fact. The scale of discrimination in housing was so severe that the government forces this information to be collected and reported for all applications, because it’s easier to detect discrimination that way. There are also penalties for not submitting the race or ethnicity on enough applications.
Creditors have access to your entire life in the background. So even if your loan application doesn’t have race/ethnicity on it, your credit file sure as hell does.
Equifax knows a whole lot about you, more than just your race.
They run your information against a number of services when you apply for a loan. That data is 100% available to them. Source: I work with this data daily.
Like the author says, this is probably due to how the automated systems were trained. They weren’t made to be racist, but certain traits make them more likely to reject an application, and that ends up making them discriminate against Black applicants. I’m thinking: neighborhood, street, schools attended, stuff like that.
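The proxy effect described above can be shown in a toy simulation. Everything here is invented for illustration (the groups, zip codes, and default rates are all made up): a scoring rule that never sees race, only zip code, can still produce very different approval rates by group when residential segregation makes zip code a stand-in for race.

```python
import random

random.seed(0)

def make_applicants(n=10_000):
    """Synthetic applicants: group membership strongly predicts zip code."""
    applicants = []
    for _ in range(n):
        group = random.choices(["A", "B"], weights=[0.7, 0.3])[0]
        if group == "A":
            zip_code = random.choices(["10001", "10002"], weights=[0.8, 0.2])[0]
        else:
            zip_code = random.choices(["10001", "10002"], weights=[0.2, 0.8])[0]
        applicants.append((group, zip_code))
    return applicants

# The "trained" model: historical default rate per zip code.
# Race is never an input, but the history encodes past disparities.
historical_default = {"10001": 0.05, "10002": 0.15}

def approve(zip_code, threshold=0.10):
    return historical_default[zip_code] < threshold

applicants = make_applicants()
for g in ["A", "B"]:
    members = [a for a in applicants if a[0] == g]
    rate = sum(approve(z) for _, z in members) / len(members)
    print(f"group {g}: approval rate {rate:.0%}")
```

Group A ends up approved at roughly 80% and group B at roughly 20%, even though the rule is nominally race-blind. That’s the mechanism the article is gesturing at.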