Comment Objecting to military use is just selfish (Score 1) 299

I respect the objections that a committed pacifist (or opponent of standing armies) might have to their company taking on military contracts -- even if I disagree. But anyone else is just being a selfish fucker. They are saying: yes, I agree that we need to maintain a military, so someone needs to sell it goods and services, but I want it to be someone else so I don't have to feel guilty.

Doing the right thing is often hard. Sometimes it means doing things that make you feel uncomfortable or icky because you've thought through the issue and realized it's the right thing to do. Avoiding anything that makes you feel icky or complicit doesn't make you the hero -- it makes you the person who refused to approve of interracial or gay marriages back in the day because you went with what made you feel uncomfortable rather than what made sense.

Comment Nowhere Near Kolmogorov Complexity (Score 1) 22

We are likely nowhere near the true Kolmogorov complexity. Note the restriction on running time/space. Kolmogorov complexity is defined without regard for running time, and in all likelihood the truly minimal program uses some algorithm that is hugely super-exponential (with large constants) in time and space.
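For reference, a minimal statement of the distinction in standard notation (U is a universal machine and p a program; these symbols are the textbook convention, not from the article): plain Kolmogorov complexity is

    K(x) = \min\{\, |p| \;:\; U(p) = x \,\}

while what a contest with resource limits can measure is closer to the time-bounded variant

    K^{t}(x) = \min\{\, |p| \;:\; U(p) = x \text{ within } t(|x|) \text{ steps} \,\}

and K(x) \le K^{t}(x) for every bound t, since dropping the time limit can only admit shorter (possibly absurdly slow) programs.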

Comment Fundamentally Useless Without Threat Model (Score 1) 23

Also, data regulations are fundamentally useless as long as we can't agree on (or even really try to define) a threat model. Mostly, what the regulations do now is limit what kind of creepy ads you might get, which just gives people a false sense of security.

If your concern is people breaking the law to use data for blackmail and other bad acts, then regulations limiting what data is actively placed into corporate databases are useless when the same data can still be harvested and reconstructed from raw weblogs.

If you're concerned about the ability of governments to use data against citizens, then you don't want to be passing laws about data collection (the government can collect data in a dark program) but about encryption and guarantees of anonymity.

If you're concerned about the loss of 'privacy' (really pseudo-anonymity in public), then the regulations need to focus on what *other* people can post to the web (e.g., people streaming cameras covering public spaces).

If you're concerned about the ability of hackers to gain illicit access to data, then the focus needs to be on meaningful security (require regular red-team attacks, not just meaningless standards) for organizations that hold sensitive information, not on its collection.

However, no one seems really interested in taking these questions seriously, so we just get more useless laws that impose barriers to entry for small companies and help concentrate more power in the hands of a few tech giants -- which tends to make the serious concerns even more troubling.

Comment Count on the EU for bad regs (Score 1) 23

I suspect we can yet again count on the EU for more stupid regulations here: more in the line of data-privacy protections that are deeply concerned about the US not including certain formal legal protections but have no issue with Chinese firms that, in practice, don't protect data at all. Not to mention the inconsistent, silly regulations about cookies that do nothing to protect real privacy but make us all click through dumb consent screens.

Comment Misses the point (Score 1) 45

I don't find arguments for AI-alignment x-risk very compelling; however, the whole point of those arguments is to suggest that AI won't just sorta get things wrong in the ways that governments or corporations might, but that it will be highly systematic and unstoppably effective in pursuing goals that take no note of human concerns. The whole argument is that it won't just be like a government or corporation that gets too wrapped up in profit, but that it will turn the world to ash to build more paperclips.

I find the concern about kludgily misaligned AI far more compelling and probable, but it's misleading not to phrase that as an argument against AI x-risk and to make the argument explicitly.

After all, this suggests we should be pretty hopeful. Sure, no one would claim there are no issues with corporations or states, but it's hard to deny that humans are better off now than at any point in the past. Billions have been lifted out of poverty; we have antibiotics, air conditioning, labor-saving devices, etc. A far smaller percentage of our populations die in violence now than in hunter-gatherer populations or even in the ancient world. And if we don't find the x-risk stuff compelling, we should be pretty optimistic we'll be able to do the same with AI.

Comment Just Bureaucratic Stupidity (Score 1) 78

Bernstein seems to be correct that NIST did something dumb in calculating the time needed to break this algorithm. Basically, they said each iteration requires an expensive giant-array access, which takes about the time of 2^35 bit operations, and that each iteration also requires 2^25 bit operations. However, rather than adding the cost of the memory access to the cost of the bit operations in each iteration, they multiplied them. That's bad [1].
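As a quick sanity check on the arithmetic, here's a minimal sketch in Python using the figures above (the 2^35 and 2^25 are the numbers quoted here, not pulled from NIST's actual worksheet):

    # Per-iteration costs, in bit-operation equivalents.
    mem_access = 2**35  # one expensive giant-array access per iteration
    bit_ops = 2**25     # bit operations per iteration

    added = mem_access + bit_ops       # ~2^35: sequential costs add
    multiplied = mem_access * bit_ops  # = 2^60: the apparent mistake

    print(f"added:      ~2^{added.bit_length() - 1}")       # ~2^35
    print(f"multiplied:  2^{multiplied.bit_length() - 1}")  # 2^60

Multiplying overstates the per-iteration cost by a factor of about 2^25, which makes the attack look vastly more expensive than it actually is.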

But then Bernstein has to imply that this isn't just your usual bureaucratic stupidity but somehow an attempt by the NSA to weaken our encryption standards. Yes, of course NIST consulted with the NSA, because that's what they should do, and it's part of the NSA's job to protect the security of US information. Any algorithm that NIST approves is going to be used by all sorts of government agencies and government contractors, and we'd be horribly unsafe if that algorithm could be cracked. That's presumably why the NSA helped with the design of the S-boxes in DES, making them more resistant to differential cryptanalysis before the rest of the world knew about that technique.

This isn't even the *kind* of error that would benefit the NSA. We aren't going to trick the Chinese or Russians into using an algorithm by multiplying numbers that should have been added. Even if the NSA were trying to make the new algorithm breakable, they'd want to inject a flaw that only someone with their secret knowledge could exploit, not just encourage NIST to adopt an algorithm where the best *public* attack takes fewer operations than claimed.

--

[1]: Though it's still possible the issue is more subtle than this and the claim is that each bit operation requires one of these expensive array accesses; I can't find the source for these figures, so I can't check. That doesn't seem quite right, but it might be how the calculation got confused.

Comment Better Corporate governance (Score 1) 43

This is why we need to make it easier for shareholders to control executives, or easier to execute a hostile takeover. Otherwise, the incentive for executives is just to use all the cash they control to expand their business in dumb ways that let them feel like they have more influence or control more things.

Sure, Netflix merchandise associated with their shows isn't a terrible idea, but why open your own stores? Just sell it online or via existing outlets.

Comment Sue Kodak (Score 1) 89

You could say the same thing about a bunch of other technologies. Photography and cameras made it much easier to stalk people. Messaging apps made it easier to send people unwanted messages. And so on.

All new technologies come with upsides and downsides, and I think it sets an awful precedent to suggest that anyone who introduces a new technology is responsible for mitigating any downside it might have. It's particularly ridiculous in a country that accepts the idea that gun makers aren't liable for the foreseeable fact that people might use guns illegally.

Realistically, I think Apple has done more than can be expected of someone introducing a new product. It shouldn't be their legal responsibility to head off all the bad ways someone could use a new technology, yet they included warnings about nearby AirTags and other mitigating measures. Morally, I'm glad they did so, but I don't see what more could reasonably be demanded of them without adopting a rule that you can't make life better for most people in any way that might also harm some others.

Comment This is a skill children need to learn (Score 1) 23

Those children are going to grow into adults who live in a world full of things that desperately cry for their attention. The best time for them to learn how to deal with this is when they are children and have the most safety nets (and the most time they can afford to lose). This feels too much like the controversies over comic books, role-playing games, satanic music, or rap music -- or, go back far enough, even reading itself -- for me to take it too seriously.

Can social media have a negative effect on children's mental health? Of course it can. But guess what else can have a hugely negative effect: real-life interactions with other children. Based on my experiences and those of people I know, children today are actually far safer and face far less pervasive and awful bullying than they did back in the 80s and 90s, when everything happened offline.

Ultimately, if these kids have electronic devices, they *will* socialize online in some fashion. The question is just where and how. If you push them off TikTok and other big social media sites, the danger is that they move to WhatsApp and random sites hosted in other countries beyond the reach of US law. It's far from clear this won't make things much worse. And I particularly question the idea that using algorithms to predict what they might want to see is somehow especially invasive of their privacy. The same information is stored even if you never aggregate it or use it to target videos.

Comment Buy Backups (Score 1) 44

For the electric grid, wouldn't the wise thing be to maintain a national emergency stockpile of transformers and other equipment that could be used to bring the grid back up after a devastating event?

Events in Ukraine suggest there are important national-security reasons to have an emergency stockpile of transformers and other electrical-grid equipment anyway. Even if we aren't expecting our own grid to be hit, having that stockpile would let us help other countries whose grids are attacked.

Comment Re:There's a much simpler solution (Score 1) 114

And how does keeping encrypted messaging apps out of the hands of children help protect them? Do you expect parents to submit images of their children's phones to the UK police, who will then decrypt those messages so the parents can check whether there is anything dangerous?

This smells, because doing that wouldn't actually help keep children safe.

Parents who want to guard their children from online threats can just demand to look at their children's phones. If kids can defeat those parents with encryption, they can defeat them just as easily by deleting or hiding their history. The parents can't subpoena the messaging app and demand decryption any time their kid starts acting squirrelly.

Either the real goal is to use this to *universally* scan for CSAM (child porn) in encrypted texts (the exact opposite of what the minister claimed), or the real fear is that criminals will use encryption to avoid prosecution for their ordinary, non-child-related crimes.

Comment Is it a lie or just stupid? (Score 1) 114

This only makes sense if the plan is, in direct conflict with what was just specified, to do something like scan all messages for content like CSAM (aka child porn). Then the need for the decryption keys does make sense.

But if you accept their claim that this power will be used only rarely, when they have particularized evidence of some kind of criminal conduct or threat, it sounds like it will backfire. For that to happen, someone has to report that their kid is getting disturbing or suspicious messages, and whoever does the reporting can just hand the received messages over to the government. At that point, mainstream encrypted messaging apps still allow the government to get the sending phone number, and they can serve a warrant on whoever has that phone (even if it's a burner, they can get a warrant for its position). Even if you argue it's necessary to see what that person has sent to other individuals, you have their phone, so you get all the messages still kept in the history (and even a decryption key doesn't let you rewind time and undelete messages once the retention period has passed).

OTOH, if they pass this law, people are likely to turn to other options, such as sideloaded apps on Android or computer-based apps that might use a phone number to prove identity. Unlike Meta, Apple, etc., it's quite plausible these apps either won't collect that information at all or won't expose the true phone number to the person receiving the messages (and overseas servers can make it legally difficult to recover the information even when it is present).

So is it really about protecting children, or are they lying out of fear that adults will get away with crimes and the police won't be able to gather evidence? For that application it makes sense to worry about how hard it is to access messages sent on convenient messengers that neither party is willing to share. After all, the police might only discover a bank robbery because the bank gets robbed.

Comment Re:Just another shit-show (Score 4, Interesting) 144

It matters because how we resolve issues like this affects how much economic benefit consumers receive.

Yes, economic inequality matters, but so does the size of the pie. It's a fuckton better to be relatively low on the economic hierarchy in 2023 than it was in 1923 or 1823. And, as we saw with the Soviet Union, not all economic rules produce the same degree of economic growth.

I want a world where I get to benefit from neat new websites and services that make my life better. If you burden these new technologies with all sorts of extra fees that the ISPs will use to pad their own pockets, then we're less likely to get those improvements in our lives.

Comment Public Benefit Public Ownership (Score 1) 144

The ISPs need to decide if they are a for-profit business or are offering some kind of public good.

If you are offering a for-profit product, you don't get to argue that other people should pay for it because they benefit. OTOH, if you want to argue that internet access is some kind of public good that warrants asking other people for funding, then you shouldn't get to claim ownership of the result. It should be owned by the public as a public utility.

ISPs want to have it both ways.
