If you want change, maybe educate yourself on IP laws and then work to change them. Venting on Slashdot won't do any good.
There is no world where making shit up about other people via AI is a long-term acceptable strategy.
I believe that the Mac Studio fills the role of the Pro via the M5 Max and M4 Ultra. In most head-to-head performance tests, they've been trouncing Windows machines, be it Ryzens, Core Ultras, or Snapdragons.
That's CPU performance. Now compare GPU performance against a PC built out with eight GPUs doing parallel 3D rendering.
I think the Mac Pro - particularly the trash can - was excellent.
The trash can was thermally limited by its design, and could never be upgraded to hold newer CPUs or GPUs. Anyone for whom the trash can Mac Pro would work could just as easily use a Mac Studio, give or take the lack of ECC (which the Apple Silicon Mac Pro also lacked).
I disagree with "Apple really should have just been honest with its pro users and said 'We no longer care about you.'" They've abandoned a very specific and shrinking segment of pro users, but the vast majority of pro users are covered by today's lineup, with the Mac Studio at the top.
Depends on what you mean by covered. Can they do their work? Yes. Are they negatively impacted by hardware limitations? Also yes. A lot of professionals would be willing to pay extra for ECC. The fact that Apple doesn't offer ECC makes their machines less than ideal for use cases where a crash would be expensive. The fact that a lot of pros put up with crashes doesn't mean they like the situation. It just means that they dislike it less than switching platforms and tools.
But the pro users I was specifically talking about here are the ones doing high-performance computing tasks involving GPUs. Their only real option is to change platforms. Even though the new Apple Silicon CPUs are great in terms of performance per watt, the total wattage is really low, which matters if you genuinely need boatloads of GPU. To the extent that Apple was still in that game without NVIDIA support, it completely dropped out when Apple Silicon dropped support for AMD. At that point, Apple computers became nearly useless for most modern high-performance computing/AI workloads, large-scale 3D video rendering, etc., because they're underpowered as shipped, can't be expanded with more GPUs, and parallelizing work across multiple machines is way more expensive and not always practical.
One minor peeve - what is "pro" today? Most office workers can do their work just fine with some of the cheapest equipment you can get - isn't that "professional" enough?
The historical definition of "pro" is people who are running software beyond what a typical user would run. Web browsers and productivity software (word processors, spreadsheets) are not pro apps; they are business apps. Pro apps are mostly things like high-end photo editing (think Photoshop or Capture One, not iPhoto or Pixelmator), 3D modeling, audio/video production, etc.
Even most developers can do most of their work on laptops these days - and if they need more horsepower, that's likely to be on the server side anyway. Don't they count?
Developers at least arguably fall into that category, though they are borderline, because they don't have huge storage requirements or huge compute requirements. Developers can do most of their work on laptops, though Apple's non-Pro laptops are pretty thermally throttled, so working on those will be miserable. And developers are probably the group who care most about ECC RAM, because they understand enough to know why it matters, but they still often use laptops because they don't want to be tethered to a desk. It's a tradeoff.
And what about project managers, lawyers, and CEOs - aren't they "pro" too?
No, and they never were. While the users might have professional occupations, their computing requirements are indistinguishable from a high school student. "Pro" in this context doesn't mean "users with money". It means "users with needs that exceed typical requirements".
Eh, traditionally the big "need many HDMI inputs" task was multicamera editing.
I know that at some point multicamera support got broken or removed, but apparently it came back? Honestly, it's been a long, long time since I've been anything close to knowledgeable about FCP.
Oh. Today I learned that FCP regained live multicamera switching support in 2024... four years after everybody stopped caring and started using OBS, vMix, or TriCaster.
I don't understand why these two fell for it.
They spent too much time believing the Orange Drivel machine about how America rules the World.
To be fair, outside of GPUs there really isn't much need for third-party cards, and arguably even GPUs aren't a showstopper, given third-party GPU enclosures.
And given Apple's total lack of third-party GPU support on Apple Silicon (beyond a few kludges that use them for AI workloads, with no display output), losing the current Mac Pro is no great loss.
And yeah, the Blackmagic stuff I've bought lately is Thunderbolt. Also, most of the software for things like real-time switching runs on Windows anyway, so I'd imagine the market for that on the Mac is not huge. And so much stuff gets brought in over networks these days (NDI, SRT, etc.) that HDMI ingest probably isn't that interesting anyway. You're more likely to use a dedicated encoder box that provides streams over Ethernet. My production work has been done that way since the pandemic.
Even more than the PCIe lanes, there wasn't hardware to justify it. With Apple Silicon, the GPU is built in, and you can't fill the case with NVIDIA cards to make it a CUDA monster or handle graphics beyond the (impressive) abilities of the combined CPU/GPU.
Exactly this. Apple neutered the Mac Pro by making all of its additional functionality useless.
Years ago, they announced that they were killing support for kernel-space drivers. Then they announced a user-space replacement, DriverKit, that is basically half-assed when it comes to PCI, providing no support for any of the sorts of PCIe drivers that anyone would actually want to write. The operating system already comes with built-in support for USB xHCI silicon and most major networking chipsets, nobody builds PCIe audio hardware anymore, FireWire support was dropped in the current OS release, and video drivers can't be written because Apple didn't bother writing the hooks.
That last one is the showstopper for PCI slots on a Mac. The main reason people bought Mac Pro or bought Thunderbolt enclosures was to support high-end video cards. With Apple not supporting any non-Apple GPUs on Apple Silicon, the slots are basically useless. I'm not saying that PCIe is useless by any means, just that the neutered, broken, driverless PCIe-lite hack that Apple actually makes available on macOS is basically useless.
I suppose you could theoretically provide DriverKit support for RAID cards, but really at this point everybody just uses external RAID hardware attached over a network anyway, so the number of people who would buy a Mac Pro for something like that is negligible.
And I guess in theory you could port Linux video card drivers over if the only thing you're doing is using GPUs for non-video purposes (e.g., AI model training or offline 3D rendering). But tying them into the operating system as video output devices is likely impossible without additional support from Apple, and nobody is going to bother doing that porting work for the tiny number of people who would want it when you can just run Linux on x86 instead. After all, for those sorts of tasks, you probably aren't benefitting much from the OS or the CPU or memory performance anyway.
So basically the Mac Pro was dead on arrival because of Apple dropping support for very nearly every single thing that the Mac Pro could do that couldn't be done just as easily with a Studio (without even attaching a Thunderbolt PCIe enclosure). And once the Studio came out and had a comparable CPU in a much smaller form factor, the writing was on the wall.
More than that, the Apple Silicon Mac Pro is a sad toy that was never truly worthy of the Mac Pro name by any stretch of the imagination. It doesn't even have ECC memory or upgradable RAM. IMO, Apple really should have just been honest with its pro users and said "We no longer care about you," and then they should have dropped the Mac Pro as part of the Apple Silicon transition, rather than shipping something so massively downgraded that is so many miles from being a true pro desktop machine.
Anyone who is even slightly surprised by it being discontinued was obviously not paying attention.
I am not even going to assess whether Biden was compos mentis. Maybe it was some medication or some other benign reason; it doesn't matter. But what I can say is that his performance during the debate caused many die-hard Democrats to declare him incompetent and made it acceptable for the media and pundits to turn on him (which they had never done before).
What phrase would you use to describe that, other than "spoke incoherently"? I mean, I can think of some medical terms that might apply, but that's how I would describe his debate performance. He wandered off the subject, had trouble forming a complete thought... basically like a Trump speech, only he paused a lot when he lost his train of thought instead of rambling about illegal aliens eating pets or whatever.
Even for authors of encyclopedia articles, there is nothing wrong with telling ChatGPT to "take this list of bullets and write it up as a paragraph."
Until it hallucinates and adds something that wasn't there or changes the meaning significantly. In my experience, AI is really good at screwing things up in ways that nobody expects. And if the people making the changes aren't subject-matter experts, but are just doing drive-by edits to try to make things more digestible, they might not notice the errors if they are subtle enough. Allowing any random person to do stuff like that could potentially cause a lot of damage really quickly.
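To illustrate how narrow the mechanical checks are, here's a hypothetical sketch of the kind of sanity check an editor could run on a bullets-to-paragraph rewrite. It's something I'm making up for illustration, not a real hallucination detector; it only catches dropped or mangled figures, not subtly reworded facts:

```python
# A minimal sanity check on an AI rewrite: every number that appears in
# the source bullets should also appear in the generated paragraph, or
# something probably got dropped or altered. (Illustrative sketch only;
# it won't catch reworded facts, just missing or changed figures.)
import re

def numbers_in(text: str) -> set[str]:
    """Extract numeric tokens (integers and decimals) from text."""
    return set(re.findall(r"\d+(?:\.\d+)?", text))

def check_rewrite(bullets: str, paragraph: str) -> set[str]:
    """Return the numbers from the bullets that the paragraph lost."""
    return numbers_in(bullets) - numbers_in(paragraph)

bullets = "- Founded in 1987\n- 3,200 employees\n- Revenue of 4.5 billion"
paragraph = ("The company, founded in 1987, has 3,200 employees "
             "and revenue of 5.4 billion.")  # note the transposed digits

print(check_rewrite(bullets, paragraph))  # {'4.5'} -- 4.5 became 5.4
```

And that check says nothing about whether the prose still means what the bullets meant, which is exactly the part a non-expert drive-by editor won't catch.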
Nor is there anything wrong with asking it to make a diagram of some process, etc.
Until it blatantly steals the chart from somebody's published book, and Wikipedia gets sued for copyright infringement. Wikipedia isn't just trying to protect itself from erroneous data. It's trying to protect itself from liability. With user-uploaded content, the user can self-certify that they have the right to upload it, and apart from user incompetence, that's usually going to be good enough. With AI-generated images, it is impossible for a user to know for certain whether what they are uploading is infringing, and it would be hard to later prove which AI generated the diagram in order to transfer the liability to the AI company.
But the biggest risk, IMO, would be asking it to make a chart with numbers from some table. It could manipulate the numbers, and if someone isn't checking closely, they might not see the error, but the incorrect chart could easily mislead people. AI-based chart generation seems way more likely to introduce errors than a human copying and pasting the table into a spreadsheet and generating the chart with traditional non-AI-based tools.
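To make the comparison concrete, this is roughly what I mean by a traditional non-AI toolchain: a deterministic script that plots exactly the numbers it's given, so nothing between the table and the chart can silently change a value. The table values here are made up for illustration:

```python
# Deterministic chart generation: the numbers in the output are exactly
# the numbers in the input table, with no model in between that could
# "helpfully" adjust them. (Values below are placeholders.)
import matplotlib.pyplot as plt

# Table rows copied verbatim from the source: (year, value)
table = [
    (2020, 41.2),
    (2021, 44.7),
    (2022, 39.9),
    (2023, 47.3),
]

years = [str(row[0]) for row in table]
values = [row[1] for row in table]

fig, ax = plt.subplots()
ax.bar(years, values)
ax.set_xlabel("Year")
ax.set_ylabel("Value")
ax.set_title("Chart generated directly from the table")
fig.savefig("chart.png")
```

The only transcription step a human has to verify is the table itself; everything after that is mechanical.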
Someone else is going to clone Wikipedia, and the authorship will no doubt migrate to wherever contemporary tooling is allowed.
And after a few months, people will complain that the content is constantly wrong, the editors over there will give up trying to keep the error rate under control, and anyone with a clue will come running back to Wikipedia.
The moving cursor writes, and having written, blinks on.