Submission + - AI scoring of tenant's characteristics leads to rejection - and payout (theguardian.com)
Bruce66423 writes: 'Despite a stellar reference from a landlord of 17 years, Mary Louis was rejected after being screened by the firm SafeRent. The software didn't explain in its 11-page report how the score was calculated or how it weighed various factors. It didn't say what the score actually signified. It just displayed Louis's number and determined it was too low. In a box next to the result, the report read: "Score recommendation: DECLINE".'
However, there is a question as to whether this is AI or merely an automated tally of standard factors. The prospective tenant's problem was her low credit rating. Per the article, despite 'a low credit score and some credit card debt, she had a stellar reference from her landlord of 17 years, who said she consistently paid her rent on time. She would also be using a voucher for low-income renters, guaranteeing the management company would receive at least some portion of the monthly rent in government payments. Her son, also named on the voucher, had a high credit score, indicating he could serve as a backstop against missed payments.'
Surely what's needed is a clear algorithm. It is fair to argue that an algorithm is wrong, but for it to be hidden inside a black box called 'AI' is especially problematic.
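To illustrate what a "clear algorithm" might look like, here is a minimal sketch of a transparent, rule-based scorer. Every factor, weight, and threshold below is a made-up assumption for illustration only (nothing here reflects how SafeRent actually scores applicants); the point is that each contribution to the score is itemized and explainable, unlike an opaque "DECLINE" box.

```python
# Hypothetical, illustrative only. Factors, weights, and the cutoff are
# invented assumptions, NOT SafeRent's actual methodology.
from dataclasses import dataclass

@dataclass
class Applicant:
    credit_score: int           # e.g. 300-850
    years_on_time_rent: int     # landlord-verified payment history
    has_rent_voucher: bool      # voucher guarantees partial rent payment
    cosigner_credit_score: int  # 0 if no co-signer on the application

def score(a: Applicant) -> tuple[int, list[str]]:
    """Return a total score plus an itemized explanation of every factor."""
    total, reasons = 0, []

    pts = (a.credit_score - 600) // 10  # points above/below a 600 baseline
    total += pts
    reasons.append(f"credit score {a.credit_score}: {pts:+d}")

    pts = min(a.years_on_time_rent, 10) * 2  # capped at 10 years
    total += pts
    reasons.append(f"{a.years_on_time_rent} years of on-time rent: {pts:+d}")

    if a.has_rent_voucher:
        total += 10
        reasons.append("voucher guarantees partial rent: +10")

    if a.cosigner_credit_score >= 700:
        total += 10
        reasons.append("co-signer with credit >= 700: +10")

    return total, reasons

THRESHOLD = 20  # arbitrary illustrative cutoff

# An applicant loosely resembling the case in the article: low credit,
# 17 years of on-time rent, a voucher, and a high-credit co-signer.
applicant = Applicant(credit_score=550, years_on_time_rent=17,
                      has_rent_voucher=True, cosigner_credit_score=720)
total, reasons = score(applicant)
decision = "ACCEPT" if total >= THRESHOLD else "DECLINE"
```

With such a scorer, a rejected applicant (or a court) can point to the exact line item that sank the score and dispute the weight it was given, which is precisely what the 11-page report described above did not allow.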