Employers Are Offering a New Worker Benefit: Wellness Chatbots (wsj.com)
More workers feeling anxious, stressed or blue have a new place to go for mental-health help: a digital app. Chatbots that hold therapist-like conversations and wellness apps that deliver depression and other diagnoses or identify people at risk of self-harm are snowballing across employers' healthcare benefits. From a report: "The demand for counselors is huge, but the supply of mental-health providers is shrinking," said J. Marshall Dye, chief executive officer of PayrollPlans, a Dallas-based provider of benefits software used by small and medium-size businesses, which began providing access to a chatbot called Woebot in November. PayrollPlans expects about 9,400 employers will use Woebot in 2024. Amazon about a year ago gave employees free access to Twill, an app that uses artificial intelligence to track the moods of users and create a personalized mental-health plan. The app offers games and other activities that the workers can play, as well as live chats with a human "coach."
The app "allows you to address mental health concerns the moment they arise and can be used as a supplement to your daily well-being routine," the company said in a blog post. Amazon declined to comment. About a third of U.S. employers offer a "digital therapeutic" for mental-health support, according to a survey of 457 companies this past summer by professional services company WTW. An additional 15% of the companies were considering adding such an offering in 2024 or 2025. Supporters say the mental-health apps alleviate symptoms such as anxiety, loneliness and depression. Because they are available at any time, the apps can also reach people who might not be able to fit traditional therapy into their schedules or can't find a therapist who has an opening. Yet some researchers say there isn't sufficient evidence the programs work, and the varied security and safety practices create a risk that private information could be leaked or sold.
The app "allows you to address mental health concerns the moment they arise and can be used as a supplement to your daily well-being routine," the company said in a blog post. Amazon declined to comment. About a third of U.S. employers offer a "digital therapeutic" for mental-health support, according to a survey of 457 companies this past summer by professional services company WTW. An additional 15% of the companies were considering adding such an offering in 2024 or 2025. Supporters say the mental-health apps alleviate symptoms such as anxiety, loneliness and depression. Because they are available at any time, the apps can also reach people who might not be able to fit traditional therapy into their schedules or can't find a therapist who has an opening. Yet some researchers say there isn't sufficient evidence the programs work, and the varied security and safety practices create a risk that private information could be leaked or sold.
Cheapest Benefit ever: Paging Dr. Eliza (Score:2)
Dr. Sbaitso, please call the office.
Come on, really? People are charging companies money for a chatbot?
Re: (Score:2)
And there's no doubt in my mind that this will quickly end up like the chatbot on the eating-disorder help line, which told callers with anorexia to "lose weight."
Re: Cheapest Benefit ever- Paging Dr Eliza (Score:3)
Re: (Score:2)
> "For some therapeutic release, can we roleplay me negotiating a promotion with my grandma?"
PleasureBot: "That's more wholesome than the roleplaying I'm currently doing with your grandma in another session." Beep, bop boop.
Re: (Score:2)
Re: (Score:2)
Sounds like someone paid WSJ for an article (Score:5, Insightful)
Do you know any companies using this? Didn't think so. This is playing out just like the NFT scam.
Re: (Score:2)
One thing that will influence negative outcomes is that these non-AIs can't show real compassion, unlike a human therapist (some of whom can).
Re: (Score:1)
Re: (Score:2)
> Do you know any companies using this? Didn't think so. This is playing out just like the NFT scam.
Sadly I do know a few companies on both sides of the pond who would.
First off, it's a good thing that companies care... or at least pretend to care about wellness. Western societies, and especially Western men, tend to internalise far too much until it blows up (hopefully more figuratively than literally, but that has happened too)... However, I don't think a chatbot is going to be useful: first of all, it will be incapable of true empathy, offering at best canned answers from a psychology textbook, and will be
Re: (Score:2)
A company I escaped from not too long ago was moving HR into a (ahem) Teams chatbot, so it's plausible.
Opportunity (Score:2)
Re: (Score:3)
It's only a matter of time before that chatbot data gets a feedback loop to your employer, who discovers your frailties and acts on them. This is a total mine (mining?) field for misuse and abuse. WOW.
Re: (Score:3)
> It's only a matter of time before that chatbot data gets a feedback loop to your employer, who discovers your frailties and acts on them. This is a total mine (mining?) field for misuse and abuse. WOW.
Matter of time? It's most likely already filed under "private sales tactics that should never be written down, to avoid liability." You have to assume everything you put into these chatbots is being filtered by HR for possible high-ticket costs down the line, and anybody looking overly ill or "not within normal parameters", whatever those may be, will likely have their file flagged: "look for reasons to expunge." Tracking every minute detail of every little thing and then comparing against nominal norms will be the way
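(It wouldn't even take AI. Here's a hypothetical sketch, in Python, of the kind of trivial transcript screening the parent describes; every keyword and ID is invented for illustration, not taken from any actual product:)

    # Hypothetical illustration of the parent's point: flagging "costly"
    # employees from chat transcripts takes a few lines, no AI required.
    # All keywords and IDs here are invented for the example.
    HIGH_COST_KEYWORDS = {"depressed", "suicidal", "disability", "lawsuit", "burnout"}

    def flag_transcript(employee_id, messages):
        # Collect every watchlist keyword that appears anywhere in the chat log.
        hits = {word for msg in messages
                for word in HIGH_COST_KEYWORDS if word in msg.lower()}
        return {"employee": employee_id,
                "flagged": bool(hits),
                "reasons": sorted(hits)}

    print(flag_transcript("E1234", ["I've been feeling burnout lately",
                                    "thinking about filing for disability"]))
    # {'employee': 'E1234', 'flagged': True, 'reasons': ['burnout', 'disability']}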
Re: (Score:2)
LOL-- especially if you're doing it on company time from WFH.
Re: (Score:3)
You could use it to your advantage. Feed it some BS that would get you fired, and if they do read it and act on it, you can sue for a nice big payday.
Re: (Score:2)
> You could use it to your advantage. Feed it some BS that would get you fired, and if they do read it and act on it, you can sue for a nice big payday.
They're far too smart to let that happen. Your superiors would be instructed to gradually turn up the pressure and to start adding spurious bullshit 'poor performance' and 'bad attitude' notes to your file. HR would already be greasing the skids for your exit behind the scenes. Within a few months you'd be gone, and the records would be unimpeachable.
Of course, when one of the HR wonks falls victim to this privacy-invading 'mental health support' scam, there could be a whistle-blower scandal. But these are
Re: (Score:2)
Pretty sure LLM is already a fork off of Eliza, which has been around longer than I've been alive.
Re: (Score:2)
Re: (Score:2)
Yes, because real people are Freudian.
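(For reference: ELIZA-style "therapy" reduces to keyword rules plus pronoun reflection. A minimal Python sketch of the technique follows; it's a toy illustration with invented rules, not Weizenbaum's actual DOCTOR script:)

    import re

    # Toy ELIZA-style responder: match a keyword pattern, reflect the
    # user's own words back as a question. Rules invented for illustration.
    REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

    RULES = [
        (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
        (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
        (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
    ]

    def reflect(fragment):
        # Swap first person for second so the echo reads as a question.
        return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

    def respond(utterance):
        for pattern, template in RULES:
            match = pattern.search(utterance)
            if match:
                return template.format(reflect(match.group(1)))
        return "Please go on."  # fallback when no rule matches

    print(respond("I feel anxious about my manager"))
    # -> Why do you feel anxious about your manager?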
Clippy (Score:3)
So Clippy's had a gender change, now calls herself Eliza and would like to help you write a letter to yourself about how you feel. Great.
Re: (Score:3)
Dr. Eliza predated Clippy by about 30 years.
Re: (Score:3)
I gotta be honest: no matter how bad my mood might be, interacting with a chatbot (or the customer-service-hotline equivalent) always, always makes it worse.
Though putting one of these behind a Canadian suicide hotline would save their government tons of money.
Re: (Score:1)
> So Clippy's had a gender change
That's how it got the name "Clippy"
Re: (Score:2)
> So Clippy's had a gender change
That's how it got the name "Clippy"
Clippy had a gender? Surely I can't be the only one to refer to it as "it".
Seems like a bad idea (Score:2)
Chatbot or live human, there is no way I'd participate in such a process if my employer paid for it. That concern aside, as the article touches on, they have no idea if this even works, yet they're already pushing it out to folks; who's to say your goals are the same as those of the people writing the chatbot?
Finally: is the chatbot subject to the same regulations that real therapists are? The details on the website are a bit thin in this regard, which immediately makes me highly suspicious.
Re: (Score:2)
> That concern aside, as the article touches on, they have no idea if this even works, yet they're already pushing it out to folks;
It is AI! It must be great! Right?
In other news, many people confuse tech with religion and think that believing hard enough that some tech is good makes it so.
"Would you like to talk to a human" ? (Score:2)
Re: (Score:3)
Obligatory Red Dwarf:
Rimmer: I used to be in the Samaritans.
Lister: I know. For one morning.
Rimmer: I couldn't take any more.
Lister: I don't blame you. You spoke to five people and they all committed suicide. I wouldn't mind, but one was a wrong number! He only phoned up for the cricket scores!
Rimmer: Well, it's not my fault everyone chose that day to jump out of buildings! It made the papers, you know. "Lemming Sunday," they called it.
Re: (Score:2)
Re: (Score:1)
Demolition Man had one of these.
Troubled Guy : I don't know... lately I just don't feel like there's anything special about me.
Booth : You are an incredibly sensitive man, who inspires joy-joy feelings in all those around you.
Nothing says I care more than... (Score:2)
On the bright side, at least it is one less useless HR position.
Reminds me of the scene in demolition man (Score:1)
It was a chat bot or ... (Score:2)
not randomly firing people.
And it was clear that a chatbot was the better move.
What I like about chatbots (Score:2)
is their empathy.
Return to Office (Score:2)
I have a suggestion as to where they can put their ChatGPT-wrapper chatbots and unqualified "human coaches".
However, here is one thing that improves mental health, and not only does it cost nothing, it actually saves the company money: letting people work from where they want to.
https://www.forbes.com/sites/b... [forbes.com]
https://www.forbes.com/sites/b... [forbes.com]
https://www.forbes.com/sites/g... [forbes.com]
Alternatively they can try to make great offices for their workers (some ideas: private offices that people can make as they like with the
I can see the consequences now (Score:2)
This will be used to flag and fire employees who tell the chatbot undesirable things. Of course, the employees will never be told that's why, but I don't think these chatbots offer doctor-patient confidentiality...
The marketing division of Sirius Cybernetics Corp. (Score:2)
defines a robot as "Your Plastic Pal Who's Fun to Be With."
Re: The marketing division of Sirius Cybernetics C (Score:2)
The Hitchhiker's Guide to the Galaxy describes the Marketing Department of the Sirius Cybernetics Corporation as: "A bunch of mindless jerks who'll be the first against the wall when the revolution comes."
Never discuss your health with your employer (Score:2)
Your health is between you & your doctor, nobody else. Sure, if you choose to inform family or friends, that's your personal choice. There's no way an employer should be
Re: (Score:2)
Re: (Score:3)
Naaa. That would cost money! We cannot have that. Exploitation of workers is good capitalist practice and must continue at all cost. Far cheaper to just detect potentially costly depressed or suicidal people early and fire them.
Re: (Score:2)
>Why would anyone discuss their health with an unqualified anyone, let alone one provided (or required?) by their employer?
The first chat bots, which were obviously barely smart enough to be called a 'script', ended up with people pouring their hearts out to them. Sad commentary on modern community or issues with our evolved social primate brains... either way it just happens.
For employers? It's an inexpensive 'benefit' that keeps a few people at their desks with less trouble.
Re: (Score:2)
Nothing says "we care" like a free bot! (Score:2)
We care about all our "teammates!" Please enjoy having unlimited time with your new virtual friend, gossiping, complaining, and sharing your feelings with someone who can satisfy all your non-work needs without ever getting reported to HR*...
*User profiles are the property of a 3rd-party corp that also does HR contracting for many corporations.
Re: (Score:2)
Hmm. Come to think of it, this may be exactly the reason why they are doing it. As people are generally pretty stupid and AI is the big hype, I see tons of people falling for this.
What we really need (Score:2)
People should be talking to friends and family, not chatbots.
This is like Employee Training (Score:2)
This is like when employee training became just a knowledge base and finding spare time to "teach yourself."
I can't wait until they offer health-insurance benefits of "doctor" bots, so when that phone-like bot fails, you pay for a real doctor out of pocket. Americans will put up with anything if you distract them with racism etc. (proof? Trump.)
Something for the company and HR to use to (Score:2)
Let me give you reasons to fire me (Score:2)
Really, the absolute last thing I'm ever going to do is tell an application my company controls about whatever stress, anxiety, or mental-health issues I may or may not have. How could that possibly go wrong in a world where every bit of data you ever put into an application is saved forever?
Re: (Score:2)
> Really, the absolute last thing I'm ever going to do is tell an application my company controls about whatever stress, anxiety, or mental-health issues I may or may not have. How could that possibly go wrong in a world where every bit of data you ever put into an application is saved forever?
Unfortunately, the oblivious and tech-ignorant form a very large part of the working population. Even people who should and could know better fall for this kind of crap. If they didn't, we wouldn't have social media, Alexa, Ring, smart appliances, and the like.
Many people have a serious deficiency in their abstract thinking processes. They simply can't see the relationship between this "helpful service" and having their privacy raped to their own detriment and to their employers' benefit. At least, they ca
What could possibly go wrong? (Score:2)
So an employee discusses their mental and emotional problems with an imitation human wholly owned by the boss, an imitation human that can accurately remember every word exchanged and with ethical standards entirely determined by the employer.
This should be interesting.
Re: (Score:2)
> This should be interesting.
Not really. I would say what happens is entirely predictable: The usual 10-15% independent thinkers will see the trap, the rest will fall for it.
Old style Dilbert (Score:2)
Catbert and the Soul Sucker 3000.
Wellness Chatbot (Score:2)
...made me cough into my coffee.
The idea, I mean. So no, regardless of who pays that bill, I'm not going for clown-based LLM therapy, TYVM.
Brave New World (Score:2)
Roboqwack (Score:2)
Patient-doctor privilege? (Score:2)
As useful as people have been, I bet. (Score:2)
Blame the victim, and give yourself time to fire people who are having trouble. That's the goal.
Could you be more... specific? (Score:2)
You are a true believer. Blessings of the state. Blessings of the masses. Thou art a subject of the divine, created in the image of man, by the masses, for the masses.
Let us be thankful we have an occupation to fill. Work hard, increase production, prevent accidents, and be happy.
Failure to do so may result in prosecution for criminal drug evasion.