Julie Cassidy: New HUD rule would make it easier to discriminate in housing
Witherell Street in Detroit | Ken Coleman
At a recent staff training on racial equity here at the Michigan League for Public Policy, we considered a question: If you threw a Frisbee and accidentally hit someone in the head, would your lack of ill intent absolve you of responsibility? Or would you recognize the impact of your action, though inadvertent, and apologize?
This question was posed to us in the context of microaggressions — slights against a member of a marginalized group that are usually unintentional. The lack of intent behind a given action doesn’t make it any less harmful to the person on the receiving end — something the U.S. Department of Housing and Urban Development (HUD) would do well to bear in mind as it considers a proposed rule change that would make it easier for those engaged in housing discrimination to escape accountability. We’re calling on our fellow Michiganders to join us in opposition.
The Fair Housing Act prohibits discrimination in the sale, rental or financing of housing and other housing-related activities on the basis of race, color, religion, sex, disability, familial status or national origin. In 2013, HUD adopted a rule specifying that the prohibition applies to practices that have an unjustifiable disparate impact on a protected class, even in the absence of discriminatory intent.
The proposed rule would substantially raise the burden of proof on the person making the complaint, tilting the playing field in favor of the defendant. Additionally, it would expand the ability of landlords, lenders and others to assert plausible deniability for discrimination through reliance on algorithms.
That’s concerning because these housing gatekeepers increasingly use computer programs to screen prospective tenants and mortgage applicants. Under the proposed rule, a defendant could evade liability for a disparate impact simply by showing that a particular model’s use is standard in the industry.
That should alarm us. There’s a common perception that increased reliance on technology decreases the influence of human bias in decision-making — that because computers and mathematical processes aren’t sentient beings, their outputs can’t be tainted by bigotry. But algorithms are created by humans, and automated systems learn from records of past decisions. Rather than advancing objectivity, our technological advances often result in the mass reproduction of our biases.
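To make "disparate impact" concrete: regulators and courts have long used a rough screen known as the four-fifths rule, which compares the rate at which a protected group is approved to the rate for the most-favored group. The sketch below uses entirely hypothetical numbers for an automated tenant-screening model; it illustrates how a facially neutral system can flag a disparity even when no one involved intended to discriminate.

```python
# Illustrative sketch of the "four-fifths rule" screen for disparate impact.
# All numbers are hypothetical; this is not any agency's official test.

def selection_rate(approved, applicants):
    """Fraction of applicants a screening model approves."""
    return approved / applicants

def disparate_impact_ratio(protected_rate, reference_rate):
    """Ratio of the protected group's approval rate to the reference group's.
    Ratios below 0.8 are conventionally flagged as potential disparate impact."""
    return protected_rate / reference_rate

# Hypothetical outcomes from an automated tenant-screening model:
reference_rate = selection_rate(approved=60, applicants=100)  # 0.60
protected_rate = selection_rate(approved=30, applicants=100)  # 0.30

ratio = disparate_impact_ratio(protected_rate, reference_rate)
print(f"Disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Flagged: potential disparate impact")
```

Notice that nothing in the arithmetic asks about anyone's intent — the disparity itself is the evidence, which is exactly what the 2013 rule recognizes and the proposed change would weaken.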
In a racist, sexist and ableist society, racism, sexism and ableism are the industry standard. It’s no coincidence that the people who are harmed by supposedly unbiased systems are disproportionately economically disadvantaged or members of the very groups our civil rights laws aim to protect: people with disabilities, people of color, LGBTQIA+ people and women.
The proposed rule reflects the incorrect belief that the absence of explicitly racist intent on the part of a particular individual or institution equals the absence of racism, that if disparities exist between racial groups, it’s a matter of happenstance and not the cumulative result of centuries of deliberate decisions. As writer and Holocaust survivor Stanislaw Lec said, “No snowflake in an avalanche ever feels responsible.”
If we can’t identify anyone to blame for disparate impacts, then we can’t task anyone with responsibility for fixing them. That framing is a sham used to justify a status quo that actively harms people in protected classes.
Please join us in opposition to the proposed rule. Visit Defend Civil Rights to get all of the information you need to submit a comment and tell HUD how this proposal would perpetuate oppression in your community. We have until October 18 to tell this administration that impact matters more than intent and there’s no excuse for discrimination.
Our stories may be republished online or in print under Creative Commons license CC BY-NC-ND 4.0. We ask that you edit only for style or to shorten, provide proper attribution and link to our web site. Please see our republishing guidelines for use of photos and graphics.