Banks and landlords want to overturn federal rules on housing algorithms

Landlords and lenders are pushing the Department of Housing and Urban Development to make it easier for businesses to discriminate against prospective tenants using automated tools. Under a new proposal that just finished its public comment period, HUD suggested raising the bar for some legal challenges, making discrimination cases less likely to succeed. Fair housing advocates have cried foul, arguing that the change would open the door for companies to discriminate with algorithms and get away with it.

Like most modern industries, the housing market relies on automation. In deciding whether to rent or sell someone a home, businesses run background checks, calculate insurance costs, examine credit, and generally take account of an applicant’s history. These tools are largely hidden from public view, but they can carry a devastating cost: a faulty or biased algorithm won’t just harm a single applicant; it can shut people out of housing across entire neighborhoods.

To help ensure all communities are treated equally by those tools, HUD finalized a rule in 2013 known as the disparate impact standard. Under the rule, if a protected group of people is harmed by a policy — even if that policy isn’t directly targeted at that group — then the company or government agency that implemented the policy can be held liable. If a zoning algorithm disproportionately harms people of color, for example, the city might face a lawsuit under the rule.

The standard has proven to be a crucial aid for advocates dealing with algorithmic discrimination. In one recent case out of Connecticut, a fair housing group used the standard to sue over an automated background check system. Under the new rule, attorneys would have to jump through additional legal hoops to make a disparate impact case. The proposed change would make suits like the Connecticut one far less likely to succeed.
