
Comment Re:the last mac pro had an big upchange for very l (Score 1) 70

I believe that the Mac Studio fills the role of the Pro, via the M5 Max and M4 Ultra. In most head-to-head performance tests, they've been trouncing Windows, be it on Ryzens, Core Ultras, or Snapdragons.

That's CPU performance. Now compare GPU performance against a PC built out with eight GPUs doing parallel 3D rendering.

I think the Mac Pro - particularly the trashcan - was excellent

The trash can was thermally limited by its design and could never be upgraded to hold newer CPUs or GPUs. Anyone for whom the trash can Mac Pro would work could just as easily use a Mac Studio, give or take the lack of ECC (which the Apple Silicon Mac Pro also lacked).

Comment Re:the last mac pro had an big upchange for very l (Score 1) 70

I disagree with "Apple really should have just been honest with its pro users and said 'We no longer care about you.'" They've abandoned a very specific and shrinking segment of pro users, but the vast majority of pro users are covered by today's lineup, with the Mac Studio at the top.

Depends on what you mean by covered. Can they do their work? Yes. Are they negatively impacted by hardware limitations? Also yes. A lot of professionals would be willing to pay extra for ECC. The fact that Apple doesn't offer ECC makes their machines less than ideal for use cases where a crash would be expensive. The fact that a lot of pros put up with crashes doesn't mean they like the situation. It just means that they dislike it less than switching platforms and tools.

But the pro users I was specifically talking about here are the ones doing high-performance computing tasks involving GPUs. Their only real option is to change platforms. The new Apple Silicon CPUs are great in terms of performance per watt, but the wattage is really low, which doesn't help if you genuinely need boatloads of GPU. To the extent that Apple was still in the game without NVIDIA support, it completely dropped out of the game when Apple Silicon dropped support for AMD. At that point, Apple computers became nearly useless for most modern high-performance computing/AI workloads, large-scale 3D video rendering, etc., because they're underpowered as shipped, they can't be expanded with more GPUs, and parallelizing work across multiple machines is way more expensive and not always practical.

One minor peeve - what is "pro" today? Most office workers can do their work just fine with some of the cheapest equipment you can get - isn't that "professional" enough?

The historical definition of "pro" is people who are running software beyond what a typical user would run. Web browsers and productivity software (word processors, spreadsheets) are not pro apps; they are business apps. Pro apps are mostly things like high-end photo editing (think Photoshop/Pixelmator, not iPhoto/Capture One), 3D modeling, audio/video production, etc.

Even most developers can do most of their work on laptops these days - and if they need more horsepower, that's likely to be on the server side anyway. Don't they count?

Developers at least arguably fall into that category, though they are borderline, because they don't have huge storage or compute requirements. Developers can do most of their work on laptops, though Apple's non-Pro laptops are pretty thermally throttled, so the experience will be miserable. And developers are probably the group who care most about ECC RAM, because they understand enough to know why it matters, but they still often use laptops because they don't want to be tethered to a desk. It's a tradeoff.

And what about project managers, lawyers, and CEOs - aren't they "pro" too?

No, and they never were. While the users might have professional occupations, their computing requirements are indistinguishable from a high school student's. "Pro" in this context doesn't mean "users with money". It means "users with needs that exceed typical requirements". :-)

Comment Re:All copper is "oxygen-free" (Score 1) 68

The only thing stopping you from calling the water pipes in your house "copper-phosphorus pipes" is laziness and poor attention to detail.

A truly non-lazy person, then, would have to conduct a detailed spectrographic assay of all of the pipes (or at least sufficient samples from each lot) to accurately determine the precise composition of each, because all of them contain impurities and aren't merely copper and phosphorus.

In general, getting a truly pure sample of almost any element is incredibly hard, and outside of laboratories (and even in laboratories, most of the time) it just doesn't matter. In the case of transporting antiprotons, standard "pure" copper is apparently inadequate, because it's not pure enough.

Comment Re:Nobody (Score 1) 70

Eh, traditionally the big "need many HDMI" task was multicamera editing.

I know that at some point multicamera support got broken or removed, but apparently it came back? Honestly, it's been a long, long time since I've been anything close to knowledgeable about FCP.

Oh. Today I learned that FCP regained live multicamera switching support in 2024... four years after everybody stopped caring and started using OBS, vMix, or TriCaster.

Comment Read the ruling on CourtListener (Score 1) 35

https://storage.courtlistener....

IANAL, but it sure seems to me the administration lost on all of the claims (except for one or two where the judge said, "I don't need to go here, because I've already made it moot.")

Now there's still the other case on this, which is in the DC Court of Appeals, addressing one specific law for which that court is the proper venue. Briefs are due in April, so the first hearing will probably be in May. Track that case here: https://www.courtlistener.com/...

Comment Re:Water is what scares me (Score 1) 47

After decades of decreasing water supplies coupled with irresponsible explosive growth in the Great Basin, Front Range, and SW in particular, it's just asking for trouble.

Even with the reduced precipitation there's still plenty of water for residential and commercial use. The problem, at least where I live (Utah), is agriculture. 80% of our water goes to agriculture. It would be one thing if we were growing regionally-appropriate crops for local consumption, but nearly all of that agriculture is to grow alfalfa (a water-hungry crop that isn't appropriate for the high desert climate), and nearly all of that alfalfa is shipped out of state, much of it out of the country, to feed cattle elsewhere. China is one of the biggest buyers. Essentially, our farmers are selling the contents of our aquifers to the world.

If we had plenty of water, letting our farmers buy it at a deep discount and sell it to willing buyers elsewhere would be fine, just another commercial use of a local resource, which is what trade is all about. But we definitely don't have plenty of water.

The solution is simple and straightforward (though legally complicated): no discounts. Set the same price for water across the board: residential, commercial, and agricultural. There can and should be minor differences in delivery cost, and surcharges for purification, but the base cost of the water should be set through a single government-managed market, probably at the state level, probably divided up by drainages (drainages with more abundant water will have cheaper water; if this creates an arbitrage opportunity for someone to pipe water between drainages, great!).

Yes, this would probably put the alfalfa farmers out of business, but that's good because growing alfalfa in the desert is a bad idea. It might also raise the price of local produce, but that's as it should be, putting agricultural water use directly in competition with other water use. If prices go up, people will find ways to be more efficient. Farmers may switch to drip irrigation. If you build too many houses for the available water supply, well, those houses are going to have very expensive water and residents are going to want to find ways to conserve -- and maybe the high cost of water will disincentivize new move-ins.

The bottom line is that efficiently allocating scarce resources is what markets are good at. The problem with water isn't that there are too many people or not enough water, the problem is that we don't properly allocate the water or encourage conservation in the right places. Trying to fix this through regulation rather than market pricing will always be subject to regulatory capture and will never be as efficient or as effective as just enabling a competitive market and letting it work.

Comment Re: Well cult followers (Score 1) 326

Well, we're talking about around 50 MWh here. MCS chargers go up to 3.75 MW. So that would be a bit over thirteen hours. Except that there's no reason the battery system can't have parallel chargers: use four of them and you can fill up in under three and a half hours. If the port can't handle 15 MW, you can just have containerized batteries onshore that charge the batteries in the ship. Of course, battery swapping is even faster, but you could charge by cable instead.
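A quick back-of-the-envelope sketch of that arithmetic (the 50 MWh pack and 3.75 MW MCS figures are from above; the charger counts and the no-taper assumption are mine):

    # Ideal charge-time arithmetic for the ~50 MWh ship battery above.
    PACK_MWH = 50.0   # pack capacity from the comment, MWh
    MCS_MW = 3.75     # max power of one Megawatt Charging System plug

    def charge_hours(pack_mwh, chargers, mw_per_charger):
        """Ideal charge time in hours, ignoring charge taper and losses."""
        return pack_mwh / (chargers * mw_per_charger)

    print(charge_hours(PACK_MWH, 1, MCS_MW))  # ~13.3 h on one plug
    print(charge_hours(PACK_MWH, 4, MCS_MW))  # ~3.3 h on four plugs (15 MW)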

Comment Re: Well cult followers (Score 1) 326

But running things with batteries is not new technology. People tried it before gas; it didn't work. All of this is technology that has been around for 100 years.

I'm getting this weird feeling like maybe your posts on this are parody. The rechargeable batteries available back in 1926 were unsealed lead-acid batteries -- the kind you had to fiddle around with a hydrometer and top up with water to maintain. They had a specific energy of maybe 30 watt-hours per kg, basically around a tenth of modern EV batteries. Of course things have changed since then.
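To make the factor of ten concrete, here's a rough sketch (30 Wh/kg is from this comment; 300 Wh/kg is a round "about ten times better" stand-in for modern cells; the 50 MWh pack size is borrowed from the sibling thread purely as an example):

    # Battery mass needed for a given energy budget, 1926 vs. today.
    LEAD_ACID_WH_PER_KG = 30.0   # unsealed lead-acid, per the comment
    MODERN_WH_PER_KG = 300.0     # hypothetical round figure, ~10x better

    def pack_tonnes(energy_mwh, wh_per_kg):
        """Battery mass in metric tonnes for a given energy budget."""
        return energy_mwh * 1_000_000 / wh_per_kg / 1000.0

    print(pack_tonnes(50, LEAD_ACID_WH_PER_KG))  # ~1667 t of lead-acid
    print(pack_tonnes(50, MODERN_WH_PER_KG))     # ~167 t of modern cells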

Comment Re:Nobody (Score 1) 70

To be fair, outside of GPUs there really isn't much need for third-party cards, and arguably even GPUs aren't a showstopper with third-party GPU cages.

And given Apple's total lack of third-party GPU support on Apple Silicon (beyond a few kludges that use them for AI workloads, but with no display output), losing the current Mac Pro is no great loss. :-)

And yeah, the Blackmagic stuff I've bought lately is Thunderbolt. Also, most of the software for things like real-time switching runs on Windows anyway, so I'd imagine the market for that on Mac is not huge. And so much stuff gets brought in over networks these days (NDI, SRT, etc.) that HDMI ingest probably isn't that interesting anyway. You're more likely to use a dedicated encoder box that provides streams over Ethernet. My production work has been done that way since the pandemic.

Comment Re:the last mac pro had an big upchange for very l (Score 1) 70

Even more than the PCI lanes, there wasn't hardware to justify it. With Apple Silicon, the GPU is built in, and you can't fill the case with cards from NVIDIA to make it a CUDA monster or handle graphics beyond the (impressive) abilities of the combined CPU/GPU.

Exactly this. Apple neutered the Mac Pro by making all of its additional functionality useless.

Years ago, they announced that they were killing support for kernel-space drivers. Then they announced a user-space replacement, DriverKit, that is basically half-assed when it comes to PCI, providing no support for any of the sorts of PCIe drivers that anyone would actually want to write. The operating system already comes with built-in support for USB xHCI silicon and most major networking chipsets, nobody builds PCIe audio hardware anymore, FireWire support was dropped in the current OS release, and video drivers can't be written because Apple didn't bother writing the hooks.

That last one is the showstopper for PCI slots on a Mac. The main reason people bought the Mac Pro or Thunderbolt enclosures was to add high-end video cards. With Apple not supporting any non-Apple GPUs on Apple Silicon, the slots are basically useless. I'm not saying that PCIe is useless by any means, just that the neutered, broken, driverless PCIe-lite hack that Apple actually makes available on macOS is basically useless.

I suppose you could theoretically provide DriverKit support for RAID cards, but really at this point everybody just uses external RAID hardware attached over a network anyway, so the number of people who would buy a Mac Pro for something like that is negligible.

And I guess in theory you could port Linux video card drivers over if the only thing you're doing is using GPUs for non-display purposes (e.g., AI model training or offline 3D rendering), but tying them into the operating system as a video output device is likely impossible without additional support from Apple. Nobody is going to bother doing that for the tiny number of people who would want it when you can just run Linux on x86 and skip all the porting work. After all, for those sorts of tasks, you probably aren't benefitting much from the OS or the CPU or memory performance anyway.

So basically, the Mac Pro was dead on arrival because Apple dropped support for very nearly everything the Mac Pro could do that couldn't be done just as easily with a Studio (without even attaching a Thunderbolt PCIe enclosure). And once the Studio came out with a comparable CPU in a much smaller form factor, the writing was on the wall.

More than that, the Apple Silicon Mac Pro is a sad toy that was never truly worthy of the Mac Pro name by any stretch of the imagination. It doesn't even have ECC memory or upgradable RAM. IMO, Apple really should have just been honest with its pro users and said "We no longer care about you," and then dropped the Mac Pro as part of the Apple Silicon transition, rather than shipping something so massively downgraded, so many miles from being a true pro desktop machine.

Anyone who is even slightly surprised by it being discontinued was obviously not paying attention.

Comment Re:Good. Now copyright terms (Score 1) 90

There is more than one study and more than one way to look at it. For streaming in particular, having a catalog matters, especially for the smaller artists who will never have a chart-level hit:

"In 2024, nearly 1,500 artists generated over $1 million in royalties from Spotify aloneâ"likely translating to over $4 million across all recorded revenue sources. What's remarkable is that 80% of these million-dollar earners didn't have a single song reach the Spotify Global Daily Top 50 chart. This reveals a fundamental shift from hit-driven success to sustainable catalog-based income, where consistent engagement from devoted audiences matters more than viral moments or radio dominance."

https://cord-cutters.gadgethac...

Also don't forget that many studies, such as DiCola's "Money from Music", focus on the superstars and the big hits. It's true that chart pop generates 80% or so of its income within the few weeks it stays in the charts and then drops off sharply.

Honestly, I don't care about the charts and superstars. They wouldn't starve if we cut copyright terms to six weeks. I do care about the indie artists that I enjoy -- the ones who, after ten years, get the band back together for another tour through clubs with a capacity of 200 or 500 people. I'm fairly sure they would suffer if the revenue from those albums disappeared. And disappear it would. Maybe fans would still buy the CDs from the merch booth, but Spotify would certainly not pay them if it didn't have to.

Comment Re:Who gave Paul modpoints? (Score 1) 88

I am not even going to assess whether Biden was compos mentis. Maybe it was some medication or some benign reason; it doesn't matter. But what I can say is that his performance during the debate caused many of the die-hard Democrats to declare him incompetent and made it acceptable for media and pundits to turn on him (which they never did before).

What phrase would you use to describe that, other than "spoke incoherently"? I mean, I can think of some medical terms that might apply, but that's how I would describe his debate performance. He wandered off the subject, had trouble forming a complete thought... basically like a Trump speech, only he paused a lot when he lost his train of thought instead of rambling about illegal aliens eating pets or whatever.

Comment Re:AI data centers guzzle water (Score 1) 47

A quick search shows 5 million gallons daily. The Southwest states are currently fighting over the Colorado River, or what's left of it, and everyone wants to build data centers there because they get very few natural disasters.

In order to get numbers like 5 million gallons, one has to be looking at the very largest data centers, counting all water use as single-use (even though water used for cooling is often reusable), and, as discussed earlier, counting the water used by the power plants rather than just the center itself. Typical data center consumption is much lower. For example, see https://www.brookings.edu/articles/ai-data-centers-and-water/ which has one of the high-end estimates for what a typical data center consumes. As for the idea that there are a lot of data centers being built in the Southwest, more are being built or planned in California or on the East Coast; Northern Virginia is the fastest-growing region for data centers. See the map here: https://usdatamap.com/ (This isn't a perfect map. The situation is in flux, and admittedly the map doesn't show the size of each center. My impression is that at least some of the ones being built in Arizona are very large, so the map isn't showing everything.)

Never mind the fact that we are seeing dozens of these data centers built. A large city might use 100 million gallons a day, so the 10 data centers you might easily see near a large city could guzzle 50% of the water.

Yes, building some of the largest data centers and putting them all near one city would take up a lot of water. But that would be silly; the people building these are not idiots and aren't going to shove all their centers into a region that they know won't have enough water for all of them. Moreover, in many jurisdictions, one has historical water rights to contend with: for major water resources, historical users get priority over new users, so farmers and others would get priority over data centers if it came down to that. (Yes, this does mean that in parts of California, golf courses get priority over some other uses.)
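To make the sensitivity of the parent's 50% figure explicit, here is the same arithmetic as a sketch (the 5 million and 100 million gallon/day numbers are the parent's; the lower per-center figure in the second call is purely hypothetical, since the sources above give no single "typical" number):

    # Fraction of a city's daily water use consumed by nearby data centers.
    CITY_GAL_PER_DAY = 100e6   # large city, per the parent comment

    def city_share(centers, gal_per_day_each):
        """Data centers' share of the city's daily water use."""
        return centers * gal_per_day_each / CITY_GAL_PER_DAY

    print(city_share(10, 5e6))    # 0.5 -- the parent's case: ten of the largest
    print(city_share(10, 0.5e6))  # 0.05 -- if typical use is 10x lower (hypothetical)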

All of this because the rich don't want to have to pay people, and they don't like to have to pretend to be civil to consumers or employees.

This is not remotely why AI systems are being used. ChatGPT is used daily by hundreds of millions of people: https://explodingtopics.com/blog/chatgpt-users. Right now, ChatGPT is the fifth most visited website in the world by some independent metrics: https://en.wikipedia.org/wiki/List_of_most-visited_websites. These systems are not being used just because some rich people want to avoid paying people or bothering with civility; regular people are using them. Understanding where this is going, and the impacts it will have, both positive and negative, requires understanding the actual usage, not what one imagines it to be.

Comment Re:Water is what scares me (Score 3, Interesting) 47

The water use for AI seems to be greatly exaggerated. Estimating water use is complicated. Different data centers use different amounts of water. Systems need more water for cooling when the weather is hot, so centers may use more water in summer. A data center will use more water when it is close to maximum load, and less when few users are querying the system. Complicating things even further, some people count not just cooling water but also the indirect water use from the needed electricity production (fossil fuel and nuclear plants use a fair bit of water for their steam turbines).

There's a good article here discussing the difficulties in making water estimates: https://theconversation.com/ai-has-a-hidden-water-cost-heres-how-to-calculate-yours-263252 All things considered, they estimate that it takes about 39 milliliters of water per typical query. For comparison, a high-efficiency shower uses about 1.5 gallons of water a minute, which is about 95 mL of water a second. So making a query to an LLM AI system costs less than a second of shower water. Even if this estimate is off by a factor of 3, it's equivalent to taking about a second longer in the shower.

The total water use is also not very high. If, for example, you use estimates for how much water is used by golf courses in the US https://www.usga.org/content/dam/usga/pdf/Water%20Resource%20Center/how-much-water-does-golf-use.pdf, the largest estimates of AI use put it at about a tenth of the water used by golf courses, and golf course estimates put that at most at about 1% of total US water use. So even if one is concerned, just getting rid of some of the gigantic water-hungry golf courses in California and Arizona (seriously, who the heck puts a golf course in Arizona?) would largely offset it. Now, it is true that as data centers grow, more water will likely get used. But as we switch to more wind and solar power, the indirect water use will go down, and data center builders are working hard on reducing water use since it is such a hot-button issue.
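For anyone who wants to check the shower comparison, here's the arithmetic (39 mL/query and 1.5 gal/min are the figures cited above):

    # Seconds of shower time equivalent to one LLM query's water use.
    QUERY_ML = 39.0            # per-query estimate cited above
    SHOWER_GAL_PER_MIN = 1.5   # high-efficiency shower head
    ML_PER_GAL = 3785.41

    shower_ml_per_sec = SHOWER_GAL_PER_MIN * ML_PER_GAL / 60  # ~94.6 mL/s
    print(QUERY_ML / shower_ml_per_sec)      # ~0.41 s of shower per query
    print(3 * QUERY_ML / shower_ml_per_sec)  # ~1.24 s even if off by 3x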

There are a lot of legitimate concerns about AI. Water use should not be high on the list.

Comment Re:My TV is a monitor (Score 3, Informative) 73

A little computer with Mint on it does a great job accessing streaming as well as my NAS. And it doesn't report my activities to anyone.

What are you using for the streaming services? Netflix etc? A web browser?

If so, that's a complete non-starter; it fails the ease-of-use expectations for watching TV -- the wife expects a remote control to turn it on and make it go. (And honestly, it fails my own expectations for that matter too; having to reach for a keyboard or mouse to watch a movie or stream a show is just clunky.) It also keeps you from watching content in 4K.

At the moment, I've got a Roku HD of some sort on one TV and an NVIDIA Shield on another. Plex, Netflix, F1 TV, and a couple of other things on both of them. The TV remote can fairly seamlessly control the TV/soundbar and the attached box, and it works well and passes the usability test, but both devices are still more ad-laden than I want.

I've also got computers and consoles hooked up to TVs for gaming and whatnot, but I find them utterly miserable to use for streaming. There is no native app for Linux that I'm aware of. Even the app for Windows is regularly just complete ass to use, it's a PITA to switch from Plex to Netflix and back, and using them with a remote control is pretty trashy. So I've been using the aforementioned boxes for streaming as the least awful way to run things for some years now.

But if there's a better way now, I'm listening.
