Salesforce CEO Benioff Says Microsoft's Copilot Doesn't Work, Doesn't Offer 'Any Level of Accuracy' And Customers Are 'Left Cleaning Up the Mess' (x.com)
Salesforce founder and chief executive Marc Benioff has doubled down on his criticism of Microsoft's Copilot, the AI-powered tool that can write Word documents, create PowerPoint presentations, analyze Excel spreadsheets and even reply to emails through Outlook. In a post on X, he writes: When you look at how Copilot has been delivered to customers, it's disappointing. It just doesn't work, and it doesn't deliver any level of accuracy. Gartner says it's spilling data everywhere, and customers are left cleaning up the mess.
To add insult to injury, customers are then told to build their own custom LLMs. I have yet to find anyone who's had a transformational experience with Microsoft Copilot or the pursuit of training and retraining custom LLMs. Copilot is more like Clippy 2.0.
Hi there! It looks like you’re frustrated wi (Score:2, Funny)
Throwing your hands up in despair?
Googling alternatives?
Creating a meme comparing Copilot to Clippy? (Spoiler: I was way cuter, and I didn’t spill data everywhere!)
And hey, at least with me, you knew what you were getting—a friendly paperclip, not a half-baked AI with a cleanup crew!
Re: (Score:2)
Binging* alternatives.
Does sales force have its own LLM (Score:2)
Re: (Score:2)
They either have one or have announced intent to introduce one, I forget which.
If it's as good as Salesforce, it will be a million times shittier than Copilot.
Re:Does sales force have its own LLM (Score:4, Interesting)
If it's as good as Salesforce, it will be a million times shittier than Copilot.
As someone who was forced to clean up one of our git repos at work in which a junior programmer committed dozens of Copilot-generated changes, that's a scary thought.
(And yes, the dude was told he'd be summarily fired if he used that shit ever again without asking permission)
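For anyone stuck with a similar cleanup: assuming the bad commits are contiguous at the tip of the branch, `git revert` is the non-destructive route, since it records new commits undoing the old ones instead of rewriting history that teammates may have already pulled. A minimal sketch in a throwaway repo (commits and file names are placeholders, not the actual repo from the story):

```shell
#!/bin/sh
# Sketch: backing out a bad (e.g. Copilot-generated) commit without
# rewriting shared history. Everything here is a placeholder demo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email dev@example.com
git config user.name dev

printf 'good\n' > app.txt
git add app.txt && git commit -qm "baseline"

printf 'junk\n' >> app.txt
git add app.txt && git commit -qm "bad: generated change"

# git revert creates a NEW commit that undoes the bad one, so people who
# already pulled the branch aren't broken (unlike `git reset --hard`).
# For a contiguous range: git revert --no-edit <oldest-bad>^..HEAD
git revert --no-edit HEAD

cat app.txt    # back to: good
```

For dozens of scattered commits an interactive rebase on a fresh branch may be less painful, but then you are force-pushing and everyone has to re-sync.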
Re: Does sales force have its own LLM (Score:2)
Nobody was ever fired for choosing Microsoft (Copilot).
Re: (Score:2)
Well that's about to change with AI, because you should have seen the mess. We were supposed to ship our new firmware update yesterday and the release date has been pushed back one week just to make sure the cleaned-up code is sane and passes production tests again.
Copilot is not welcome at our company, and we're a Microsoft cloud customer through and through sadly - mostly because our IT guy is utterly incompetent. That should tell you how bad Copilot is.
Re: (Score:2)
...we're a Microsoft cloud customer through and through sadly - mostly because our IT guy is utterly incompetent.
I have found those two things to go together almost exclusively. The deeper a shop is into Microsoft, the more incompetent its leaders tend to be. I witness it personally time after time.
Re: (Score:2)
In one sense I can understand why he's rubbishing Microsoft so vigorously: not only does 'copilot' deserve it; MS has a fairly obvious interest and fairly obvious ability(at l
Re: (Score:2)
Having said that, I will worry about AI when it congeals into something proven.
It works relatively well for doing web searches on Bing. Mostly gets past all the SEO.
Re: Does sales force have its own LLM (Score:2)
This means that Google actually has a way to defeat the SEO crapwave... But it just doesn't want to. Fvck them and their greed.
Re: (Score:2)
Yeah CoPilot is such bullshit (Score:2)
You should be using "Einstein AI" [salesforce.com], it's so much better, it says right there this chatbot is so much better than my old one!
What? No, I don't stand to make billions of dollars from this. How preposterous.
First thing to ask copilot (Score:4, Informative)
How do I disable copilot? And it gives a pretty accurate answer to that one. It's literally the only thing I've ever used copilot for.
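For the record, the answer it gives points at a group-policy registry value; the key and value names below match Microsoft's published Copilot policy documentation at the time of writing, but are subject to change between Windows builds, so treat this as a sketch rather than gospel:

```shell
REM Turn off Windows Copilot for the current user via the group-policy
REM registry value; takes effect after signing out and back in.
reg add "HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot" ^
    /v TurnOffWindowsCopilot /t REG_DWORD /d 1 /f
```

No guarantees a later update won't quietly flip it back on, of course.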
Not wrong (Score:2)
He's not wrong - but there's apparently still an upside to copilot. I've personally not really seen it, but LLMs in general can be helpful for filling out "puff" in documents. Take this comment for example, I could write all of it out, or I could use an LLM:
You know, folks, Copilot is a tool that can be really, really helpful—tremendous, actually. But let me tell you, it doesn’t always get it right. Sometimes it misunderstands the context and gives suggestions that are just plain wrong. You
Re: (Score:1)
Indeed. Some of us have access to Copilot licenses in Teams. I asked it to summarize yesterday's standup meeting (which, coincidentally, nobody remembered to record) and it hallucinated a whole legitimate-sounding summary of key points, actions and blockers... most of which had nothing whatsoever to do with the actual meeting.
Re: (Score:3)
Take this comment for example, I could write all of it out, or I could use an LLM:
Just write the prompt into your comment, and save us the time from reading the filler nonsense.
Re: (Score:1)
Did you really have to use the Trump LLM for this?
Ugh.. Make AI Great Again..
orly? (Score:2)
CEO = Clueless Executive Officer (Score:3)
He's a clueless dweeb, who listened to sales pitches from clueless dweebs at Microsoft. He probably hoped he could fire half of his developers. That would really boost his bonus! Turns out that's not the case, so he's disappointed.
Re: (Score:2)
Re: (Score:2)
True conservatism required (Score:3)
Not the political kind... Just "hey, maybe let us not jump blindly into this trend without careful consideration and some reasonable testing".
"AI" is supposed to be doing all sorts of things that it clearly cannot, and people are losing their jobs to it while we're being told it's magically creating new ones.
This economic disruption is bad for the average person in the short term, and in the long run it's not great for companies. Of course, it's a lot easier for the companies to change course after a few years, and executives' bonuses won't be affected at all...
HIlarious, because he's right (Score:3)
I have used it to generate scaffolding / boilerplate policy wording, which I then fill in / tailor to my needs, and it does that fairly decently. I've also used it in my IDEs to help with basic boilerplate code generation, and it's maybe 40% accurate at the simple stuff, enough that it saves me time. Would I ever use it in a professional, unguided, unwatched, and unverified capacity? Absolutely not, even accidentally. Gen AI is not ready for professional use cases that a human can't do better or more accurately.
"Clippy 2.0" (Score:2)
The wrong way to use LLMs. (Score:2)
We're talking to computers here.
I'll ask you all:
1. How accurate are humans?
2. Do humans accidentally leak data or cause security problems?
3. Do humans understand how to interact effectively with LLMs?
4. Let's check driving safety, comparing humans and auto-driving AIs.
I think you already know the answer.
1. Humans are not accurate in general and praise their own ac
Re: (Score:2)
3. I took a course through a university regarding Prompt Engineering.
It's kind of hard to take you seriously after that.
Re: (Score:2)
Really, I find it very interesting that a university would offer this, presumably to the wider public and not just students working toward a degree. An LLM is a difficult concept to wrap your head around without taking the time to learn exactly what it does. And the emergent behavior from what it does have is already counterintuitive without a lot of mental gymnastics.
I see very smart people write very bad Google search queries. And some of them even know how modern search engines work. This is at least a few
Re: (Score:2)
And some of them even know how modern search engines work.
I don't know how modern search engines work.
Re: (Score:2)
The simple answer is that this is what the marketing says it can do.
These LLMs do exactly what marketers should be saying they can do, but instead the moon was promised.
A prediction from 1987 (Score:1)
https://youtu.be/VsE0BwQ3l8U?t... [youtu.be]
I caught it from a Win 10 update (Score:2)
OK, I'm a dummy. I never got around to disabling automatic updates for the one Win 10 Pro computer I have for work purposes. I recently had the unalloyed pleasure of finding out I had an icon for Co-pilot squatting on my task bar like a turd on the carpet. Fortunately, it seemed easy enough to get rid of. I have a fresh disk image I will revert to if I find out "Uninstall" actually means "Hide and Continue 'Recall' Functionality".
It's hard to describe the sinking feeling in my stomach when I found that
Your first mistake... (Score:2)
If you are depending on AI blindly then that is your mistake. These companies shoving AI in your face are as annoying as sites such as FB constantly shoving autocompletion links at you.
why does it need to be transformational to add.. (Score:2)
why does it need to be transformational to add value?
Nothing Salesforce has brought to market has been transformational, but some of it suits a purpose and thus is adopted.
For someone to say it's crap because it's not "transformational" is in need of a mirror.
It's a tool... it's not a replacement for an expert in every field... it's there to assist... if it can make your team slightly more productive, it's a success. And it looks to be a hell of a lot more effective than anything the leadership team a
Checking Output (Score:2)
The trouble with AI output is that it needs to be checked by a competent person, and often that checking, if done to a suitable standard, takes longer than doing things the old-fashioned way. For coding, AI can be useful in suggesting things to try, but the programmer must understand the output and be able to correct any problems. Now if, for example, I'm writing something in Python or Rust, then AI can be a great source of suggestions as to what packages I should look for, and also possible search terms to go
As expected (Score:2)
While it's likely that future AI will be a very useful tool, and early versions like AlphaFold are already producing results, today's consumer-focused AI offerings are just crap generators that produce stuff that appears to be well written but is, in fact, crap. It's kinda like a BS artist who confidently claims expertise while spewing nonsense.