AI Landlord Screening Tool to Cease Scoring Low-Income Tenants After Discrimination Settlement

SafeRent, an AI-powered tenant screening tool, will stop scoring low-income applicants who rely on housing vouchers under a settlement resolving claims of algorithmic discrimination and Fair Housing Act violations, a change with broad implications for tenant screening practices.

SafeRent, an AI-powered tenant screening tool used by landlords, will stop scoring low-income tenants as part of a class action settlement aimed at preventing discrimination based on income and race.

The Discrimination Lawsuit

The settlement stems from a lawsuit filed in Massachusetts alleging that SafeRent's scoring system disproportionately harmed applicants who use housing vouchers, particularly Black and Hispanic applicants, in violation of Massachusetts law and the federal Fair Housing Act, which prohibits housing discrimination.

Algorithmic Tenant Screening

SafeRent's algorithm assigned each prospective tenant a SafeRent Score based on factors such as credit history and debts unrelated to rental payments. Landlords relied on these scores when deciding whether to approve rental applications, yet the lack of transparency in how the scores were calculated raised concerns, particularly about the model's impact on marginalized groups.

Settlement Terms

Under the terms of the settlement, SafeRent will no longer display scores for applicants nationwide who use housing vouchers. In addition, landlords using SafeRent's "affordable" score model cannot factor scores into their evaluations. The change pushes landlords toward a more holistic review of each applicant's overall record.

Implications and Reactions

Experts have questioned whether credit scores and similar models accurately predict a tenant's ability to pay rent. The settlement's elimination of these scores for certain groups reflects growing awareness of algorithmic bias in the housing sector.

Conclusion

The resolution of this case marks a step toward fairer tenant screening practices and underscores the need to address algorithmic discrimination in the housing market. Moving away from opaque scoring systems can help make housing access more inclusive and equitable.
