Comment Re:This reminds me of something (Score 2) 39

No matter how idiot-proof you make technology, God will always create a better idiot. That's why the right way to solve this problem is:

  • Make it as hard as possible for users to accidentally do something that is irreversible, and as easy as possible to roll back even serious mistakes. This means, among other things, keeping more than just a single backup. (Apple, I'm talking about your borderline useless iCloud backups here when I say that.)
  • Make SSNs easily changeable and less easily guessable.
  • Make it technologically as hard as possible to send out messages in a way where the sender's identity can be forged to look like it comes from someone else.
  • Aggressively prosecute phone companies who allow calls and text messages onto their network from fake phone numbers.
  • Aggressively track down, prosecute, and very publicly make an example of every person who tries to pull one of these scams, along with the people who employ them, so that anybody considering pulling such a scam is aware of previous scammers who have ended up behind bars for thirty to life within six months of starting their scam.
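
The anti-forgery point is essentially what schemes like DKIM do for email: the sending infrastructure cryptographically signs each message so that receivers can detect a spoofed sender. A minimal sketch of the underlying idea using an HMAC (the key and message fields here are made up for illustration; real systems use public-key signatures with keys published in DNS):

```python
import hmac
import hashlib

SECRET_KEY = b"example-shared-key"  # hypothetical; real schemes use per-domain keys

def sign_message(sender: str, body: str) -> str:
    """Attach a signature binding the claimed sender to the message body."""
    payload = f"{sender}\n{body}".encode()
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def verify_message(sender: str, body: str, signature: str) -> bool:
    """Reject any message whose signature doesn't match the claimed sender."""
    expected = sign_message(sender, body)
    return hmac.compare_digest(expected, signature)

sig = sign_message("alice@example.com", "Wire me $500")
assert verify_message("alice@example.com", "Wire me $500", sig)
# A forged sender (or an altered body) fails verification:
assert not verify_message("mallory@example.com", "Wire me $500", sig)
```

The design point is that the signature covers both the sender identity and the body, so a scammer can't lift a valid signature from one message and attach it to a forged one.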

But IMO, the most important one is that last one. We would be a lot better off if the right to a speedy trial were taken seriously. If a year or more passes between committing a crime and being prosecuted, the threat of prosecution ceases to be a meaningful deterrent to crime.

If I were in charge, there would be two nationwide statutes of limitations added that apply to all crimes:

  • Charges must be filed within six months* of law enforcement having solid evidence showing who committed a crime. Just cause must be shown for any exceptions to this. If law enforcement fails to show that they received significant supporting evidence making their case possible to bring during the six-month period prior to filing charges, the charges are automatically dropped.
  • Cases must begin within thirty days* of bringing charges. If the case cannot begin within 30 days, the charges are dropped.

* I'm willing to consider arguments that these numbers should be slightly higher, but not dramatically so.

If legitimate extenuating circumstances outside the control of prosecution warrant a delay (e.g. the defendant being impossible to locate or in another country), a judge could order the statute of limitations tolled. But otherwise, the only exceptions should be in situations where a mistrial or similar forces a new trial (which obviously starts more than 30 days after the initial charges are filed). And even for a retrial, there should be a hard limit of maybe 90 days from the end of the previous trial or thereabouts.

This would result in a very large number of cases not getting prosecuted, but by forcing the prosecution to triage cases and bring important cases quickly, it would ensure that fear of being brought to justice would be a real deterrent to committing crimes. Right now, it is not. Good people don't (intentionally) commit crimes, because they have morality and ethics. Bad people do, because they have neither. Almost nobody avoids doing crime merely out of fear of punishment, and that's a bad thing.

Comment Re:Dolby is run by fuckwads (Score 1) 41

Errr no, they very much do make technology. Quite a bit of it, actually. Lots of what is marketed under Dolby Vision and Dolby Audio was developed in-house, and they spend a quarter of a billion dollars every year on R&D. Heck, even the noise-cancelling ability in video conferencing software, along with music detection, was largely developed by Dolby.

I would still consider them patent trolls at this point. Legitimate patent holders use patents immediately or hold them to use defensively. They do not sit on patents for an entire decade, waiting for the patented technology to be ingrained in the industry, and then use them to earn income. The patent having been created in-house rather than acquired doesn't change the fact that the behavior is fundamentally similar.

Just because you don't see their products on the shelves at Best Buy doesn't mean they don't make those either. They produce reference monitors for colour grading Dolby Vision content, they have an entire line of cinema audio speakers, and they make the rest of the cinema audio stack as a first-party product, including multichannel amplifiers and audio pre-processors for Atmos content - a codec they also developed from the ground up.

Dolby Atmos was 2012. Dolby Vision was 2014. How are they not basically a non-practicing entity at this point?

The fact they sit on a bunch of related patents is just the nature of any R&D development.

Yes, but using them offensively after sitting on them runs afoul of the doctrine of laches. In effect, they sat on the patents so that people would end up depending on AV1, because if they had sued too early, AOMedia would have designed around the patent, and they would get nothing. So they deliberately delayed action to cause prejudice to the defendant.

At this point, it would be entirely reasonable for a judge to declare that because they failed to act against AOMedia within the 6-year window prescribed by patent law, they lost their right to sue AOMedia for damages from creating the patented technology, and that patent exhaustion applies to all downstream users. And if that happens, I will laugh so hard.

Comment Re: Why are lawsuits allowed against end users? (Score 1) 41

Imagine your little startup patents something and is egregiously copied by a large, rich company. If the startup doesn't immediately have the funds to sue, the other company just gets to use the patented tech with no consequences. Seems unfair.

Dolby is not a startup. It was founded in 1965.

Also, the doctrine of laches says you cannot unreasonably delay filing a lawsuit. Waiting ten years from the first release of the specification is clearly unreasonable. Waiting eight years from the first finished implementation is clearly unreasonable.

The bigger problem for Dolby is that patent law won't let you recover damages for infringement that occurred more than six years before filing suit, and the standard has been available for eight. So unless somehow this is some wacky patent where Dolby claims that some use of an otherwise non-patent-protected codec is patented (which should almost certainly result in that patent getting overturned for obviousness), Dolby should be laughed out of court.

But I'm sure they're hoping that Snapchat caves and agrees to go back to a Dolby codec or pay them royalties rather than fight them in court. This is patent troll behavior. Dolby has effectively become a patent troll, IMO.

Comment Re:Why are lawsuits allowed against end users? (Score 2) 41

Unfortunately, from a legal point of view, AOMedia hasn't done anything against Dolby. It's simply created a video compression codec. It doesn't use the codec, it just publishes documentation on how to use it.

From a patent law point of view, it is illegal to create something that violates a patent, not just to use it. Patent law kicks in when you create, offer for sale, sell, import, or otherwise distribute a patented invention.

IMO, one of the biggest flaws in patent law is that it covers the use of inventions in all cases except for patent exhaustion (sale of an already-licensed product). With the exception of pure process patents, IMO, that should not be a violation, as a user has no realistic way of knowing that something they bought violates someone else's patent, and should not even need to worry about such nonsense.

This "feature" of patent law exists solely to give the patent holder more leverage to screw the company accused of violating the patent by holding their innocently infringing customers liable, causing irreparable reputational damage to both companies, irreparable harm to countless others, etc., and it should have been eliminated decades ago.

That said, having seen this behavior by Dolby, I hereby vow to never knowingly buy any product that they manufacture, nor support their products or technology, nor use it except in situations where the content creator or distributor leaves me no alternative. They've gone from being a legitimate technology company to a glorified patent troll. Instead of innovating and making the world better to enrich themselves, they are suing anybody and everybody and making the world worse to enrich themselves.

Moreover, absent gross incompetence by Dolby's legal counsel, it seems clear that Dolby flagrantly and willfully ignored the doctrine of laches, allowing damages to accumulate for eight full years from the final release (and ten years from the first specification release) and allowing AV1 to become the dominant codec so that they could then predatorily use their patents to squeeze money out of the industry. Their behavior is nothing short of unconscionable, and whether due to incompetence or malice, their legal counsel should be formally sanctioned for it.

Finally, if Dolby wins, it is paramount that the entire technology industry agree never to license *any* future Dolby technologies going forward, because doing so will only encourage them to use the patent system to prevent free and open standards. The only way to prevent patent abuse is to stop feeding the companies that abuse patents.

It is my fundamental belief that data formats should not be allowed to be protected by copyright or patents under any circumstances, because doing so fundamentally violates the rights of the owners and creators of that content. It makes it possible for users to lose access to data that they created. And this is wholly unacceptable for the same reason that renting software is unacceptable.

In short, Dolby and its lawyers can go f**k themselves with a shovel.

Comment Re:the last mac pro had an big upchange for very l (Score 1) 90

I believe that the Mac Studio fills the role of the Pro, via the M5 Max and M4 Ultra. In most head-to-head performance tests, they've been trouncing Windows, be it on Ryzens, Core Ultras, or Snapdragons.

CPU performance. Now compare GPU performance against a PC built out with eight GPUs to do parallel 3D rendering.

I think the Mac Pro - particularly the trashcan - was excellent

The trash can was thermally limited by its design and could never be upgraded to hold newer CPUs or GPUs. Anyone for whom the trash can Mac Pro would work could just as easily use a Mac Studio, give or take the lack of ECC (which the Apple Silicon Mac Pro also lacked).

Comment Re:the last mac pro had an big upchange for very l (Score 1) 90

I disagree with "Apple really should have just been honest with its pro users and said 'We no longer care about you.'" They've abandoned a very specific and shrinking segment of pro users, but the vast majority of pro users are covered by today's lineup, with the Mac Studio at the top.

Depends on what you mean by covered. Can they do their work? Yes. Are they negatively impacted by hardware limitations? Also yes. A lot of professionals would be willing to pay extra for ECC. The fact that Apple doesn't offer ECC makes their machines less than ideal for use cases where a crash would be expensive. The fact that a lot of pros put up with crashes doesn't mean they like the situation. It just means that they dislike it less than switching platforms and tools.

But the pro users I was specifically talking about here are the ones doing high-performance computing tasks involving GPUs. Their only real option is to change platforms. Even though the new Apple Silicon CPUs are great in terms of performance per watt, the wattage is really low, so if you genuinely need boatloads of GPU power, Apple was barely in the game even without NVIDIA support, and they dropped out entirely when Apple Silicon dropped support for AMD. At that point, Apple computers became nearly useless for most modern high-performance computing/AI workloads, large-scale 3D video rendering, etc., because they're underpowered as shipped, they can't be expanded with more GPUs, and parallelizing work across multiple machines is way more expensive and not always practical.

One minor peeve - what is "pro" today? Most office workers can do their work just fine with some of the cheapest equipment you can get - isn't that "professional" enough?

The historical definition of "pro" is people who are running software beyond what a typical user would run. Web browsers and productivity software (word processors, spreadsheets) are not pro apps; they are business apps. Pro apps are mostly things like high-end photo editing (think Photoshop/Pixelmator, not iPhoto/Capture One), 3D modeling, audio/video production, etc.

Even most developers can do most of their work on laptops these days - and if they need more horsepower, that's likely to be on the server side anyway. Don't they count?

Developers at least arguably fall into that category, though they are borderline, because they don't have huge storage requirements or huge compute requirements. Developers can do most of their work on laptops, though Apple's non-Pro laptops are pretty thermally throttled, so they will be miserable. And developers are probably the group who care most about ECC RAM, because they understand enough to know why it matters, but they still often use laptops because they don't want to be tethered to a desk. It's a tradeoff.

And what about project managers, lawyers, and CEOs - aren't they "pro" too?

No, and they never were. While the users might have professional occupations, their computing requirements are indistinguishable from a high school student. "Pro" in this context doesn't mean "users with money". It means "users with needs that exceed typical requirements". :-)

Comment Re:This is the right decision (Score 1) 91

You don't get to pick and choose what people post (with some obvious exceptions like fraud or CSAM), while also claiming immunity for the stuff you couldn't or wouldn't.

Exactly, thanks for the excellent example. That's the kind of statement that nobody ever explains, but always presents as pure axiomatic dogma.

I do think that you might have revealed a clue in your unusual phrasing, though. You said "claiming immunity for the stuff you couldn't or wouldn't" but how can there ever be any possibility of liability there? If your computer denies someone else's request to publish something, what liability is there to be immune from?

Comment Re:Nobody (Score 1) 90

Eh, traditionally the big "need many HDMI inputs" task was multicamera editing.

I know that at some point multicamera support got broken or removed, but apparently it came back? Honestly, it's been a long, long time since I've been anything close to knowledgeable about FCP.

Oh. Today I learned that FCP regained live multicamera switching support in 2024... four years after everybody stopped caring and started using OBS, vMix, or TriCaster.

Comment Re:Nobody (Score 1) 90

To be fair, outside of GPUs there really isn't much need for third party cards, and arguably even GPUs aren't a show stopper with third party GPU cages.

And given Apple's total lack of third-party GPU support on Apple Silicon (beyond a few kludges that use them for AI workloads, but no display), losing the current Mac Pro is no great loss. :-)

And yeah, the BlackMagic stuff I've bought lately is Thunderbolt. Also, most of the software for things like real-time switching runs on Windows anyway, so I'd imagine the market for that on Mac is not huge. And so much stuff gets brought in over networks these days (NDI, SRT, etc.) that HDMI ingest probably isn't that interesting anyway. You're more likely to use a dedicated encoder box that provides streams over Ethernet. My production work has been doing it that way since the pandemic.

Comment Re:the last mac pro had an big upchange for very l (Score 1) 90

Even more than the PCI-lanes, there wasn't hardware to justify it. With Apple Silicon, the GPU is built in and you can't fill the case with cards from NVidia to make it a CUDA-monster or handle graphics beyond the (impressive) abilities of the combined CPU/GPU.

Exactly this. Apple neutered the Mac Pro by making all of its additional functionality useless.

Years ago, they announced that they were killing support for kernel-space drivers. Then they announced a user-space replacement, DriverKit, that is basically half-assed when it comes to PCI, providing no support for any of the sorts of PCIe drivers that anyone would actually want to write. The operating system already comes with built-in support for USB xHCI silicon and most major networking chipsets, nobody builds PCIe audio anymore, FireWire support was dropped in the current OS release, and video drivers can't be written because Apple didn't bother writing the hooks.

That last one is the showstopper for PCI slots on a Mac. The main reason people bought Mac Pro or bought Thunderbolt enclosures was to support high-end video cards. With Apple not supporting any non-Apple GPUs on Apple Silicon, the slots are basically useless. I'm not saying that PCIe is useless by any means, just that the neutered, broken, driverless PCIe-lite hack that Apple actually makes available on macOS is basically useless.

I suppose you could theoretically provide DriverKit support for RAID cards, but really at this point everybody just uses external RAID hardware attached over a network anyway, so the number of people who would buy a Mac Pro for something like that is negligible.

And I guess in theory, you could port Linux video card drivers over if the only thing you're doing is using GPUs for non-video purposes (e.g. for AI model training or offline 3D rendering), but tying it into the operating system as a video output device is likely impossible without additional support from Apple, and nobody is going to bother to do that for the tiny number of people who would want that when you can just run Linux on x86 and not have to do all that porting work. After all, for those sorts of tasks, you probably aren't benefitting much from the OS or the CPU or memory performance anyway.

So basically the Mac Pro was dead on arrival because of Apple dropping support for very nearly every single thing that the Mac Pro could do that couldn't be done just as easily with a Studio (without even attaching a Thunderbolt PCIe enclosure). And once the Studio came out and had a comparable CPU in a much smaller form factor, the writing was on the wall.

More than that, the Apple Silicon Mac Pro is a sad toy that was never truly worthy of the Mac Pro name by any stretch of the imagination. It doesn't even have ECC memory or upgradable RAM. IMO, Apple really should have just been honest with its pro users and said "We no longer care about you," and then they should have dropped the Mac Pro as part of the Apple Silicon transition, rather than shipping something so massively downgraded that is so many miles from being a true pro desktop machine.

Anyone who is even slightly surprised by it being discontinued was obviously not paying attention.

Comment Re:Who gave Paul modpoints? (Score 1) 88

I am not even going to assess whether Biden was compos mentis. Maybe it was some medication or some benign reason; it doesn't matter. But what I can say is that his performance during the debate caused many die-hard Democrats to declare him incompetent and made it acceptable for media and pundits to turn on him (which they never did before).

What phrase would you use to describe that, other than "spoke incoherently"? I mean, I can think of some medical terms that might apply, but that's how I would describe his debate performance. He wandered off the subject, had trouble forming a complete thought... basically like a Trump speech, only he paused a lot when he lost his train of thought instead of rambling about illegal aliens eating pets or whatever.

Comment Re:Bye bye Wikipedia (Score 5, Insightful) 31

Even for authors of encyclopedia articles, there is nothing wrong with telling ChatGPT to "take this list of bullets and write it up as a paragraph."

Until it hallucinates and adds something that wasn't there or changes the meaning significantly. In my experience, AI is really good at screwing things up in ways that nobody expects. And if the people making the changes aren't subject-matter experts, but are just doing drive-by edits to try to make things more digestible, they might not notice the errors if they are subtle enough. Allowing any random person to do stuff like that could potentially cause a lot of damage really quickly.

Nor is there anything wrong with asking it to make a diagram of some process etc.

Until it blatantly steals the chart from somebody's published book, and Wikipedia gets sued for copyright infringement. Wikipedia isn't just trying to protect itself from erroneous data. It's trying to protect itself from liability. With user-uploaded content, the user can self-certify that they have the right to upload it, and apart from user incompetence, that's usually going to be good enough. With AI-generated images, it is impossible for a user to know for certain whether what they are uploading is infringing, and it would be hard to later prove which AI generated the diagram in order to transfer the liability to the AI company.

But the biggest risk, IMO, would be asking it to make a chart with numbers from some table. It could manipulate the numbers, and if someone isn't checking closely, they might not see the error, but the incorrect chart could easily mislead people. AI-based chart generation seems way more likely to introduce errors than a human copying and pasting the table into a spreadsheet and generating the chart with traditional non-AI-based tools.
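
That last comparison is easy to make concrete: with ordinary tooling, turning a table into chart data is a deterministic transformation, so every plotted value can be traced directly back to the source and checked. A minimal Python sketch (the table contents here are made up purely for illustration):

```python
import csv
import io

# A made-up source table, standing in for a table in an article.
table = """year,users
2021,120
2022,180
2023,260
"""

# Deterministic extraction: every value in the chart data comes straight
# from the table, so any discrepancy is detectable by direct comparison.
rows = list(csv.DictReader(io.StringIO(table)))
chart_data = [(row["year"], int(row["users"])) for row in rows]

assert chart_data == [("2021", 120), ("2022", 180), ("2023", 260)]
```

A generative model, by contrast, re-synthesizes the numbers rather than copying them, so there is no equivalent guarantee that the chart matches the table.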

Someone else is going to clone Wikipedia, and the authorship will no doubt migrate to where they are allowed to use contemporary tooling.

And after a few months, people will complain that the content is constantly wrong, the editors over there will give up trying to keep the error rate under control, and anyone with a clue will come running back to Wikipedia.
