
Comment Re:Intern (Score 2) 105

Well if that's true it's one of the most spectacularly boneheaded decisions I've ever heard of. People rely on mapping programs for important stuff.

People who rely on mapping programs for "important stuff" might want to use a service they pay for with money, instead of one that they pay for with their personal data and the availability of their eyeballs. The former tends to come with either explicit or legally assumed obligations on the part of the provider, while the latter has terms and conditions that hold the provider harmless and put the user at a legal disadvantage. Besides, if it's really important, then it's incumbent upon the user to check more than one source.

Comment Re:Logic Says It Should Be Legal (Score 5, Insightful) 365

Many on the left love protectionism...except when they don't.

Many on the right hate protectionism...except when they don't.

Corporations just love having unfettered access to other markets for their products. They also love unrestricted access to supplies of (cheaper) materials and labour in other countries; but let their customers demand the same, and all of a sudden the hypocritical bastards lobby for protectionism, and start spreading FUD about the supposed dangers of products from other countries. Their idea of a 'free market' is really a 'captive market' - one that is kept captive by the legislation they buy, the lies they spread, and the dirty deals they strike with their counterparts in other countries.

Comment Social engineering - the next new tech frontier? (Score 1) 24

Yes, I know that what we now call 'social engineering' has been around for as long as humans have existed, and probably longer. But when I say "new tech frontier", I mean the marriage of the scientific method, technological processes, and technologically-gathered data, with more scientifically-rigorous studies and experiments in sociology, psychology, neurology, and biology.

Criminals are now systematically, and probably even experimentally, exploiting employees' psychological and social traits in combination with various technical vulnerabilities. The companies being attacked will feel they have no choice but to respond with their own research and experiments in the areas of vetting, monitoring, influencing, and outright brainwashing their employees (not to mention both prescribing and proscribing certain actions and behaviours) on a 24/7 basis. There will be a lot of science applied to this kind of problem; we're seeing some of it already with things like the Predictive Policing program in Chicago.

George Orwell's work has often been mentioned here on Slashdot, and 1984 was in many ways an eerily prescient work. But if current trends such as those I've outlined above play out as I imagine, we may end up with a less metaphorical, more literal version of Orwell's dystopia.

Comment Re:Be careful what you wish for (Score 3, Interesting) 110

If anyone at Google gets arrested for linking to infringing content, it will be the end of the Internet as we know it.

If anyone at Google gets arrested for linking to infringing content, it will be the beginning of civilization as we used to know it. You know, that short, golden age when corporate overlords were at least occasionally arrested and jailed for breaking the law, when the average citizen had at least a small say in the policies enacted by their elected representatives, and when corporations actually cared about what their customers thought because those customers were still capable of hurting them financially.

Arrests at Google would be a possible sign of the turning of the tide; hope that 'government by the people, for the people' would stop being an empty, embarrassing slogan; perhaps a harbinger of the playing field at last being levelled, where there would be no distinction before the law between Artem Vaulin and Sergey Brin. Yeah, I know it will never happen, but it's nice to dream sometimes.

BTW, it seems pretty likely that if Vaulin and company had managed to become sufficiently rich and well-connected before the heat was turned up on KAT, they'd be enjoying the kind of immunity and spurious respect that Page and Brin now take for granted. And PS, the Internet might be a more interesting, more vibrant, fairer place if the Googletards and slagvertisers and marketing wankers had left it the fuck alone.

Submission + - FCC proposes 5G cybersecurity requirements, asks for industry advice

Presto Vivace writes: “Cybersecurity issues must be addressed during the design phase for the entire 5G ecosystem,” FCC chairman Thomas Wheeler previously said.

The FCC published a request Wednesday for comment on a new set of proposed 5G rules in the Federal Register, focused on adding specific “performance requirements” for developers of internet-connected devices. ... If a company hopes to secure a license to access higher-frequency 5G spectrum in the future, it will need to adhere to these specific requirements — in other words, compliance is non-negotiable. Notably, these FCC “performance requirements” now include the submission of a network security plan.

Submission + - Princeton Researchers Announce Open Source 25-core Chip

An anonymous reader writes: Researchers at Princeton announced at Hot Chips this week their 25-core Piton Processor. The processor was designed specifically to increase data center efficiency, with novel architecture features enabling over 8,000 of these processors to be connected together to build a system with over 200,000 cores. Fabricated on IBM’s 32nm process with over 460 million transistors, Piton is one of the largest and most complex academic processors ever built. The Princeton team has opened up their design and released all of the chip source code, tests, and infrastructure as open source in the OpenPiton project, enabling others to build scalable manycore processors with potentially thousands of cores.
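The scaling claim in the summary is easy to sanity-check: 25 cores per chip times "over 8,000" chips does land above 200,000 cores. A quick back-of-the-envelope sketch (the figure of 8,192 chips is a hypothetical power-of-two system size chosen for illustration; the submission only says "over 8,000"):

```python
# Sanity-check the manycore scaling numbers from the Piton summary.
CORES_PER_CHIP = 25      # per the announcement
chips = 8192             # hypothetical system size ("over 8,000")

total_cores = CORES_PER_CHIP * chips
print(total_cores)       # 204800 — consistent with "over 200,000 cores"
```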

Submission + - SPAM: NASA aircraft probe Namibian clouds to solve global warming puzzle

sciencehabit writes: Off the coast of Namibia, for several months a year, a layer of smoke from African savanna fires drifts over a persistent deck of low clouds. It’s the perfect place to investigate the thorniest problem in all of climate science: how haze and clouds interact to boost or moderate global warming. Now, after weeks of delay and uncertainty, an airborne research campaign is about to begin. On 29 August, NASA will fly aircraft into the heart of this natural laboratory for about a month, with plans to return in 2017 and 2018. Complementary efforts from France and the United Kingdom would have expanded the sampling area but were postponed when the teams couldn’t get diplomatic clearances from Namibia.

Submission + - MedSec Disclosure Ethics

An anonymous reader writes: Ok, so apparently a security research company called MedSec has discovered vulnerabilities in a slew of medical devices produced by St Jude Medical. It's alleged that St Jude's devices and ecosystem are demonstrably less secure than competitors'.

Rather than disclose the vulnerabilities to the manufacturer, they approached Muddy Waters, an investor known for shorting companies, and MedSec stands to benefit from the trade.

I can't recall this having ever occurred before. Where does this fall in the spectrum of research and disclosure ethics?

Comment Re:"Some" data? (Score 4, Interesting) 102

The majority of my friends aren't geeks. What really weirds me out is that they say they wouldn't tell their friends everything about their private lives, but if I tell them that IT admins with access to their entire online life are just people like me, their eyes glaze over.

I try to explain it in simple terms: You don't want me to know this private stuff about you - but in my professional capacity I have access to all this information about you. There are numerous examples of governments with political agendas or individuals with personal agendas abusing access to private information. You are relying on the fact that you will never knowingly or unknowingly get on the wrong side of anybody in that position.

But still, blank.

I have the same problem. I think it has something to do with 'out of sight, out of mind'. If our friends have never met, and will likely never meet, the people who have access to their private data, then it's easy for them to keep their heads in the sand. It's comfortable, it requires no additional effort, and the threat of having to change their daily routines and upset their social structures feels more imminent and more dangerous than the (in their minds still abstract) threat of having their private info revealed to the world. I think this is partly just a human trait, and partly the result of indoctrination in public schools in an industrial society.

I don't know how to explain it to people. I mean when I was a kid life was simpler, as actions were less likely to have consequences: I'd just go into l33t hax0r mode and obtain files from their machine / school computer account and then show them what I can do...

I just don't know what to do.

I've never been remotely close to being a hacker, never mind 'l33t'. But I also don't know what to do. I offer my friends help with making their online activities safer and more private, and all I hear are crickets. And I'm not talking about ditching Facebook, Twitter, and the like - I'm just talking about ad blockers, NoScript, and a basic education about the types of places and behaviours to avoid. If they won't even do the Internet equivalent of asking a partner about STDs before having sex, how the hell would they ever come to terms with the fact that companies like Facebook are just using them and plundering their very lives for profit? Sometimes I feel like Neo in The Matrix.

Comment Re:"Some" data? (Score 5, Interesting) 102

...People need to start boycotting companies that do this kind of thing.

The vast majority of people don't care and don't want to know. They've been trained from birth to not be analytical and to follow the herd. For those in power, making "the people" feel powerless is good; making them feel that everything is OK and that they have neither need nor desire for power, is even better.

Also time to bring back anti-trust laws and break up any companies that are "too big to fail".

To a large extent, laws are effectively written and enforced by the companies that are "too big to fail" and their friends. Unless and until corporate hegemony is upended or destroyed this kind of abuse will continue to grow.

Comment Re:Just what America needs (Score 2) 242

Let Asians build the world's fastest trains and the continent-wide energy systems we can only dream about. We have lawsuit AI technology we can use to rob each other blind as we cash those unemployment checks.

Some of those Asians will undoubtedly invest in American lawsuits in order to help fund those trains and energy systems, with the added bonus that they're also helping the competition bankrupt itself.

Comment Re:Definition of a broken system. (Score 4, Insightful) 242

When your legal system becomes the realm of financial investment trading you KNOW your system is broken.

When your political system becomes the realm of financial investment trading you KNOW the legal system isn't far behind and your society is well and truly fucked.

Comment Re:Credibility of the system (Score 4, Insightful) 242

It's a lot harder to defend the integrity of the system when supposedly impartial actors have quantifiable effects.

With that in mind, and given that algorithms (and soon big data as well?) are now significant factors in the justice system, can 'algorithmic judges' be far behind? The court system will push back; but inevitably, the job of judging will have to at least be informed by computerized analyses of pertinent data. And eventually, the position of judge might simply be taken over by AI. Yes, that's a long way off, if it ever happens at all - but developments such as 'lawsuit as investment' are among many factors that will further drive the development of artificial intelligence.

On a side note, I seem to recall something recently about automation being poised to take over something like 40% of law practice jobs in the next couple of decades. It seems that even the law biz isn't immune from digital disruption.
