
Comment Re:Isn't this the idea? (Score 1) 113

Google, Microsoft, Apple, Facebook, Amazon, or another one of the big software development companies could easily fork ffmpeg itself, fix the open CVEs, provide their own (likely incompatible) features, and become the new standard — leaving the original developers out in the cold. Google did this with Blink (forked from WebKit, which itself was forked from KHTML). They took a fork of a KDE-backed project, put it into what is now the #1 browser in the world, allowed Microsoft, Opera, and others to then use it in their own browsers — and now Google owns the entire narrative and development direction for the engine (in parallel with, and to a lesser extent constrained by, Apple, which still maintains WebKit). The original KHTML developers really couldn’t keep up, and stopped maintaining KHTML back in 2016 (with full deprecation in 2023).

That is the risk for the original developers here. You’re right in that there isn’t really anything out there that can do what ffmpeg does — but if the developers don’t keep up on CVEs then organizations are going to look for new maintainers — and a year or two from now everyone will be using the Google/Microsoft/Apple/Facebook renamed version of ffmpeg instead.

That’s the shitty truth of how these things work. We’ve seen these same actors do it before.

Yaz

Comment Re:Isn't this the idea? (Score 1) 113

Look — I’m a developer. I get it. I’m personally all for having organizations do more to support the OSS they rely on. But the people in the C-suite are more worried about organizational reputation and losing money to lawsuits. If a piece of software they rely on has a known critical CVE that allows for remote code execution, and someone breaks in and steals customer data, that software either needs to be fixed or it needs to be scrapped. Those are the choices. Our customers in the EU are allowed to request SBOMs of everything we use and pass them through their own security validation software — and if they find critical-severity CVEs in software we’re using, there is going to be hell to pay. And the people in the C-suite can’t abide that level of risk.
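To make the SBOM-validation step concrete, here’s a minimal sketch of what a customer’s security tooling effectively does: cross-reference the components in an SBOM against a vulnerability feed and flag critical findings. The component versions, CVE IDs, and feed layout below are all invented for illustration — real tooling consumes CycloneDX/SPDX documents and feeds like NVD or OSV.

```python
# component name -> version, as it might appear in an SBOM (names invented)
sbom = {"ffmpeg": "6.1", "zlib": "1.3", "openssl": "3.2"}

# hypothetical vulnerability feed: component name -> [(cve_id, severity), ...]
known_cves = {
    "ffmpeg": [("CVE-2024-0001", "CRITICAL"), ("CVE-2024-0002", "LOW")],
    "zlib": [("CVE-2024-0003", "MEDIUM")],
}

def critical_findings(sbom, feed):
    """Return (component, cve_id) pairs with CRITICAL severity."""
    return [(name, cve)
            for name in sbom
            for cve, sev in feed.get(name, [])
            if sev == "CRITICAL"]

print(critical_findings(sbom, known_cves))  # [('ffmpeg', 'CVE-2024-0001')]
```

Any non-empty result is the point where, per the above, the C-suite starts asking whether the component gets patched or scrapped.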

Most software development companies (outside some of the biggest ones) don’t really have the kind of expertise in house to supply patches to something as complex as ffmpeg. But a company like Google has the staff with sufficient experience in this area that they could fork the project, fix the issues, and redistribute it as their own solution to the problem — and now Google is driving ffmpeg development. Organizations that need a security-guaranteed version will simply switch to Google’s version, which will likely slowly become incompatible with the original. They’ve done it before — Blink was Google’s fork of WebKit, huge swaths of users flocked to Chrome, and Google has over the years made enough changes that their patches often aren’t compatible with WebKit (and, of course, WebKit itself did similar when it was forked from KHTML).

Forking like this is great for the community, but it can be tough on individual developers who see their work co-opted and then sidelined by massive corporations. And that’s really why the ffmpeg developers need to be very careful about ignoring CVEs like this. They do so at their own peril, as anyone can fork their code, fix the issues, and slowly make it incompatible with the original. And a big enough organization can ensure their fork becomes the new standard, leaving the original developers out in the cold.

Yaz

Comment Re:Isn't this the idea? (Score 2) 113

Eventually whoever has most to lose is bound to step up and help.

That, or your project gets sidelined. Which is where the danger lies.

I work for a big multinational software company that uses a lot of Open Source Software. We have a security office that audits all of our products several times a year. If any piece of our stack shows any open CVEs, we have a fixed amount of time to fix the issue, with the amount of time varying from a few days (for CRITICAL severity issues) to roughly half a year for the lowest severity issues. A lack of an upstream fix for a published CVE isn’t an excuse for not fixing the issue on our end — the software still has a security flaw in it, and the organization is so incredibly risk-averse about security (thanks in part to having contacts in the defence industry) that they don’t want to risk expensive lawsuits and the loss of reputation if a vulnerability is exploited.
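The severity-to-deadline policy described above is easy to express in code. This is a sketch only — the exact windows below are assumptions modeled on “a few days for CRITICAL, roughly half a year for the lowest severity,” not any real company’s policy.

```python
from datetime import date, timedelta

# Assumed remediation windows per severity level (days); invented values
REMEDIATION_DAYS = {
    "CRITICAL": 3,
    "HIGH": 30,
    "MEDIUM": 90,
    "LOW": 180,
}

def fix_deadline(published: date, severity: str) -> date:
    """Date by which a CVE of the given severity must be remediated."""
    return published + timedelta(days=REMEDIATION_DAYS[severity])

print(fix_deadline(date(2025, 1, 1), "CRITICAL"))  # 2025-01-04
```

In practice the clock starts when the audit flags the CVE, and missing the deadline escalates to either replacing the component or patching it in-house, as described below.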

A lot of bigger organizations now work this way. We’ve all seen what has happened to organizations that have had significant security breaches, and it’s not pretty. Our customers are big corporations and government entities — and if they even sniff a risk there are going to be problems. So if there is an unpatched exploit, we’re expected to either switch to something comparable, or DIY a solution (either replacing the library in question, or potentially patching it ourselves).

If ffmpeg allows known and published vulnerabilities to languish, the risk here is that organizations that use their code will simply stop using it and will look for other solutions. That’s a tough pill for an Open Source Software developer to swallow, especially when they’ve made something as big and important as ffmpeg. You might wind up in a situation where an entity like Google forks your code and takes ownership, and eventually gets everyone to migrate to using their version instead (as they did when they forked WebKit into Blink for Chrome), leaving you sidelined. Or maybe someone else jumps in with a compatible solution that works well enough for enough users that they switch to that instead.

Now in an ideal world, the Googles of this world would not only submit a CVE but would also submit a patch. Having been an OSS developer myself, I’ve always encouraged my staff, if they find a bug in a piece of software we use, to file a bug report and ideally a patch if they know how to fix the issue correctly — but I know that is hardly universal within our organization, and probably even less so elsewhere.

TL;DR: a lot of OSS success relies on having lots of users, or at least some big and important users. But you risk losing those if you leave CVEs open for too long, as company policies may require scrapping software with unfixed CVEs. That loss of users and reputation is dangerous for an OSS project — it’s how projects get supplanted, either by a fork or by a new (and similar) project.

Yaz

Comment History repeats itself (Score 3, Insightful) 63

At one point in history, people believed that all you needed to run a business was an MBA. Actual knowledge of the product, processes, people, etc. was irrelevant.

A narrow focus on "managing" a team of AI chat bots suffers from a similar narrow-mindedness. Without actual knowledge of the areas that you're relying on the chat bots to manage, you have no way of determining if the work product you are getting is of any use.

Using a team of bots to produce code without domain knowledge or even general-purpose computer science knowledge can only eventually end in tears. But I guess if you can generate the kind of mess a bureaucratic consultancy staffed with MBAs would produce, at a fraction of the price... that's progress of sorts?

Comment Re: Make it stop quickly (Score 1) 134

Actually, it doesn't even take five minutes of manual work on Google.

Just checking to see if a given citation exists, never mind the actual content, is a simple matching query. If I were a lawyer, the very first thing I would do would be to dump every public listing of caselaw I could get my hands on into a local searchable case index, in parallel with having access to a tool like LexisNexis or PACER (with RECAP) for more in-depth research. It honestly should be one of the intermediate steps in an AI bot workflow: validate that the citations exist... and then verify that each citation actually bolsters the argument.
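
The existence check really is that simple. Here’s a minimal sketch of the “local searchable case index” idea: load citations into SQLite, then flag any citation in a brief that isn’t in the index. The cases, the `N U.S. N` citation pattern, and the brief text are simplified placeholders — a real index would cover many reporters and formats.

```python
import re
import sqlite3

# Tiny in-memory case index; a real one would be loaded from bulk caselaw data
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE cases (citation TEXT PRIMARY KEY, name TEXT)")
db.executemany("INSERT INTO cases VALUES (?, ?)", [
    ("410 U.S. 113", "Roe v. Wade"),
    ("347 U.S. 483", "Brown v. Board of Education"),
])

def missing_citations(brief_text: str) -> list[str]:
    """Return citations in the brief that are not in the local index."""
    cited = re.findall(r"\d+ U\.S\. \d+", brief_text)
    return [c for c in cited
            if db.execute("SELECT 1 FROM cases WHERE citation = ?",
                          (c,)).fetchone() is None]

brief = "As held in 347 U.S. 483, and per the fabricated 999 U.S. 999 ..."
print(missing_citations(brief))  # ['999 U.S. 999']
```

Anything this check flags is exactly the kind of hallucinated citation that should never survive to a filing.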

Suggested tools:

https://free.law/recap
https://case.law/
https://www.courtlistener.com/...

If you're going to turn an AI bot loose to generate arguments for you, the very least you can do is check its homework, same as you would do for any supervised clerk, paralegal, or lawyer in training working for you. That it can generate bullshit faster than you can is no excuse for shutting off your brain and signing off on it without even doing the bare minimum of due diligence.

The next level of work would be having an AI bot analyze each case and the citations used for the arguments in each, to generate a tree of citations that can be used to argue in one direction or another. Then depending on what arguments you want to bolster, you can selectively cite the cases that give more weight to your case, and prepare counterarguments in case the opposition has prepared an equivalent set of trees that will cherry pick case citations against your argument.
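
The citation tree described above is just a directed graph walk. Here’s a minimal sketch under that assumption — the case names are invented placeholders, and a real system would also attach which proposition each citation supports.

```python
# Which cases cite which (invented placeholder names)
cites = {
    "Case A": ["Case B", "Case C"],   # A relies on B and C
    "Case B": ["Case D"],
    "Case C": [],
    "Case D": [],
}

def supporting_chain(case: str, graph: dict) -> set[str]:
    """All cases reachable from `case` by following its citations."""
    seen, stack = set(), [case]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(sorted(supporting_chain("Case A", cites)))  # ['Case B', 'Case C', 'Case D']
```

Run in both directions (who cites this case, and whom does it cite), this is the raw material for the cherry-picking and counterargument preparation described above.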

But forget robot lawyers generating bullshit cases. What I want to see is the robot trial judge (and the robot red team lawyer playing the part of the opposition) that can audit your case beforehand and pick it apart so you can be better prepared before you go to trial.

Comment Re:Todo: (Score 1) 55

I wonder about that. But there definitely seems to be a drive for puffery on individual resumes, and a collective drive for puffery on the entire platform in order to drive mindshare. It's the same kind of short-term thinking that has people ripping out existing features to "improve" a product, so they can claim that they actually did something in their tenure there.

Instead of doing something to fix a hard problem — say, the obscene memory consumption of tabs in the base browser — they do things to make Firefox more attractive to, say, advertisers who want placement on Firefox's default home page.

This is my impression as a user - I have no window into the workings of the Mozilla team aside from depressing news bits like this one featured on Slashdot...

Submission + - First-ever Star Trek Lego U.S.S. Enterprise (the-independent.com)

semper_statisticum writes: Made from 3,600 pieces, the [first-ever] Star Trek-inspired Lego set is of the U.S.S. Enterprise NCC-1701-D, the starship that serves as the main setting of the Star Trek: The Next Generation series, which ran for seven seasons, as well as the 1994 film Star Trek Generations.

“[It] allows builders to craft a detailed replica of the iconic starship, complete with a detachable command saucer, secondary hull, and warp nacelles with distinctive red and blue detailing,” according to a press release from Lego. “The model also features an opening shuttlebay and two mini shuttlepods, perfect for recreating classic scenes.”

The set comes with nine mini-figures of Star Trek: The Next Generation characters, including Captain Jean-Luc Picard, Commander William Riker, Lieutenant Worf, Lieutenant Commander Data, Dr. Beverly Crusher, Lieutenant Commander Geordi La Forge, Counsellor Deanna Troi, bartender Guinan, and Wesley Crusher.

Comment Cognitive dissonance (Score 3, Interesting) 41

One of my state's Republican senators is all-in on chemtrails and "Solar Radiation Modification" lunacy. It's curious how these are the same people who think humanity isn't capable of affecting the climate by burning fossil fuels and pumping tons of CO2 into the atmosphere.
