
Submission + - AI causing problems on campuses (theguardian.com)

Bruce66423 writes: 'The email arrived out of the blue: it was the university code of conduct team. Albert, a 19-year-old undergraduate English student, scanned the content, stunned. He had been accused of using artificial intelligence to complete a piece of assessed work. If he did not attend a hearing to address the claims made by his professor, or respond to the email, he would receive an automatic fail on the module. The problem was, he hadn’t cheated...'

'There is also evidence that suggests AI detection tools disadvantage certain demographics. One study at Stanford found that a number of AI detectors are biased against non-native English speakers, flagging their work 61% of the time, compared with 5% for native English speakers (Turnitin was not part of this particular study). Last month, Bloomberg Businessweek reported the case of a student with autism spectrum disorder whose work had been falsely flagged by a detection tool as being written by AI.'

'Many academics seem to believe that “you can always tell” if an assignment was written by an AI, that they can pick up on the stylistic traits associated with these tools. Evidence is mounting to suggest they may be overestimating their ability. Researchers at the University of Reading recently conducted a blind test in which ChatGPT-written answers were submitted through the university’s own examination system: 94% of the AI submissions went undetected and received higher scores than those submitted by the humans.'

'The AI cheating crisis has exposed how transactional the process of gaining a degree has become. Higher education is increasingly marketised; universities are cash-strapped, chasing customers at the expense of quality learning.'

Submission + - AI scoring of tenant's characteristics leads to rejection - and payout (theguardian.com)

Bruce66423 writes: 'Despite a stellar reference from a landlord of 17 years, Mary Louis was rejected after being screened by the firm SafeRent. The software didn't explain in its 11-page report how the score was calculated or how it weighed various factors. It didn't say what the score actually signified. It just displayed Louis's number and determined it was too low. In a box next to the result, the report read: "Score recommendation: DECLINE".'

However, there is a question as to whether this is AI or merely automated scoring of standard factors totted up to produce a result. The prospective tenant's problem was her low credit rating: although she had 'a low credit score and some credit card debt, she had a stellar reference from her landlord of 17 years, who said she consistently paid her rent on time. She would also be using a voucher for low-income renters, guaranteeing the management company would receive at least some portion of the monthly rent in government payments. Her son, also named on the voucher, had a high credit score, indicating he could serve as a backstop against missed payments.'

Surely what's needed is a transparent algorithm. It is fair to argue that a given algorithm is wrong, but that argument becomes impossible when the algorithm is hidden inside a black box labelled 'AI'.
