
Submission + - Uber Gets Sued Over Alleged 'Hell' Program To Track Lyft Drivers (techcrunch.com)

An anonymous reader writes: Uber has another lawsuit on its hands. This time, it’s about Uber’s alleged use of a program called “Hell.” The plaintiff, Michael Gonzales, drove for Lyft during the time Uber allegedly used the software. He’s seeking $5 million in a class action lawsuit. As the story goes, Uber allegedly tracked Lyft drivers using a secret software program internally referred to as “Hell.” It allegedly let Uber see how many Lyft drivers were available to give rides and what their prices were. Hell could allegedly also determine whether people were driving for both Uber and Lyft. The lawsuit, filed in the U.S. District Court for the Northern District of California, alleges that Uber broadly invaded the privacy of the Lyft drivers, specifically violated the California Invasion of Privacy Act and the Federal Wiretap Act, and engaged in unfair competition. Uber has neither confirmed nor outright denied the claims.

Submission + - A caterpillar may lead to a "plastic pollution" solution (bbc.com)

FatdogHaiku writes: Researchers at Cambridge University have discovered that the larvae of the wax moth, which eat wax in bee hives, can also degrade plastic.

They think microbes in the caterpillar — as well as the insect itself — might play a role in breaking down plastic. If the chemical process can be identified, it could lead to a solution for managing plastic waste in the environment.

Submission + - How Online Shopping Makes Suckers of Us All (theatlantic.com)

Thelasko writes: Will you pay more for those shoes before 7 p.m.? Would the price tag be different if you lived in the suburbs? Standard prices and simple discounts are giving way to far more exotic strategies, designed to extract every last dollar from the consumer.

Submission + - NSA's DoublePulsar Kernel Exploit A 'Bloodbath' (threatpost.com)

msm1267 writes: A little more than two weeks after the latest ShadowBrokers leak of NSA hacking tools, experts are certain that the DoublePulsar post-exploitation Windows kernel attack will have similar staying power to the Conficker bug, and that pen-testers will be finding servers exposed to the flaws patched in MS17-010 for years to come.

MS17-010 was released in March and it closes a number of holes in Windows SMB Server exploited by the NSA. Exploits such as EternalBlue, EternalChampion, EternalSynergy and EternalRomance that are part of the Fuzzbunch exploit platform all drop DoublePulsar onto compromised hosts. DoublePulsar is a sophisticated memory-based kernel payload that hooks onto x86 and 64-bit systems and allows an attacker to execute any raw shellcode payload they wish.

“This is a full ring0 payload that gives you full control over the system and you can do what you want to it,” said Sean Dillon, senior security analyst at RiskSense. Dillon was the first to reverse-engineer a DoublePulsar payload, and published his analysis last Friday.

“This is going to be on networks for years to come. The last major vulnerability of this class was MS08-067, and it’s still found in a lot of places,” Dillon said. “I find it everywhere. This is the most critical Windows patch since that vulnerability.”

Dan Tentler, founder and CEO of Phobos Group, said internet-wide scans he’s running have found about 3.1 percent of vulnerable machines are already infected (between 62,000 and 65,000 so far), and that percentage is likely to go up as scans continue.

“This is easily describable as a bloodbath,” Tentler said.

Submission + - Lyrebird claims it can recreate anyone's voice based on just a 1-minute sample

Artem Tashkinov writes: Today a Canadian artificial intelligence startup named Lyrebird unveiled a voice-imitation deep learning algorithm that can mimic a person's voice and have it read any text with a given emotion, based on the analysis of just a few dozen seconds of recorded audio. The website features samples using the recreated voices of Donald Trump, Barack Obama and Hillary Clinton. A similar technology was created by Adobe around a year ago, but it requires over 20 minutes of recorded speech. The company plans to open its APIs to the public, with the computing for the task performed in the cloud.

Submission + - Ontario launches Universal Basic Income Pilot (www.cbc.ca)

epiphani writes: The Ontario government will pilot universal basic income in a $50M program supporting 4,000 households over a 3-year period. While Slashdot has vigorously debated universal basic income in the past, and even Elon Musk has predicted its necessity, experts continue to debate and gather data on the approach in the face of increasing automation. Ontario's plan will study three communities over three years, with participants receiving up to $17,000 annually if single, and $24,000 for families.

Submission + - SPAM: Will the high-tech cities of the future be utterly lonely?

randomErr writes: Humans are inherently social animals, and our health suffers if we're cut off from social ties. Loneliness can happen to anyone. In Britain, more than one in eight people say they don't consider anyone a close friend, and the number of Americans who say they have no close friends has roughly tripled in recent decades. One pervasive source of our loneliness is technology. While it offers an easy way to keep in contact with friends — and meet new people through dating and friendship apps — technology's omnipresence encourages shallow conversations that can distract us from meaningful, real-life interactions. Researchers at the University of Essex found that having a phone nearby, even if we don't check it, can be detrimental to our attempts at connecting with others. Smartphones have transformed post office lines from a chance for some small talk with the neighbors into an exercise in email-checking, and sealed the fate of coffee shops as nothing more than places of mutual isolation.

Comment Re:Common? (Score 4, Insightful) 55

Aurora are highly variable objects; they differ in shape, shade, intensity, the speed at which they move back and forth across the sky, the speed at which they can appear and disappear, and so on. It's not that they haven't been noticed before; in fact, that's how they were identified as being so common, by finding examples captured in previous images of the night sky taken by aurora watchers and the like. It's just that no one had realised they were a distinct form of interaction between particles in the upper atmosphere until now. You've also got to keep in mind that for many people living at latitudes where aurora are common, they're just a fact of life, not much more notable than the moon in the night sky. So the chances are pretty high that these jets have been seen on countless occasions, maybe even photographed, dismissed as a band/ribbon aurora (not the most photogenic type, and of little interest unless you're new to aurora watching), and that was that.

Submission + - SPAM: Why I Published in a Predatory Journal

randomErr writes: Last month I was invited to submit a paper to a dubious urology journal. I'm an editor of scientific writing who has a strong antipathy for predatory journals. So I decided to troll this publication, the MedCrave Group’s Urology & Nephrology Open Access Journal, to see whether it would agree to publish a totally made-up, Seinfeld-themed “case report” about a man who develops “uromycitisis poisoning.” Seinfeld argued that, due to his illness, he could die if he didn't relieve himself whenever he needed to. To my surprise, a representative at Urology & Nephrology Open Access Journal wrote to say that my manuscript had been sent out for peer review. Three days later, it was conditionally accepted.

Submission + - A battery made of molten metals (mit.edu) 1

Z00L00K writes: This story came out a while ago but doesn't seem to have surfaced here:

A novel rechargeable battery developed at MIT could one day play a critical role in the massive expansion of solar generation needed to mitigate climate change by midcentury. Designed to store energy on the electric grid, the high-capacity battery consists of molten metals that naturally separate to form two electrodes in layers on either side of the molten salt electrolyte between them. Tests with cells made of low-cost, Earth-abundant materials confirm that the liquid battery operates efficiently without losing significant capacity or mechanically degrading — common problems in today’s batteries with solid electrodes. The MIT researchers have already demonstrated a simple, low-cost process for manufacturing prototypes of their battery, and future plans call for field tests on small-scale power grids that include intermittent generating sources such as solar and wind.


Comment Re:Here is my clever idea... (Score 1) 171

Try explaining that to the legacy mainstream media dinosaurs that are still busy taking Google to court for spidering, indexing, and linking to their content, despite the debacle in Spain a few years back, and see how far it gets you. Common sense is in short supply in some corners of the Internet, and fairly large corners at that.

Comment Re:Cautiously saying yes to this (Score 1) 171

I think the law of averages would take care of that. Bandwidth is pretty cheap, and the chances are that even if you are constrained by bandwidth, as might be the case with a smaller site on an "xGB/day" hosting plan, it's likely there won't be many GB of content to spider in the first place. There are always exceptions, though, and where there is a real problem there are still workarounds, e.g. explicit opt-out clauses for spiders like IA's or, if all else fails, denying access based on User-Agent strings.

It does clearly depend on what effect this might have on the value of "everyone", though. Spidering (for legit purposes and otherwise) is mostly just background noise at present; the real bad actors - cyber criminals - already ignore robots.txt, and not every good actor would significantly benefit from ignoring it. The only real reasons a good actor might have for ignoring it are better archiving (as with IA's proposals) or more complete search engine indices, but if the reason for content being excluded via robots.txt is that it is highly dynamic, transient, or just fodder for bad robots, then it's of minimal value to search engines anyway. Even if some (or all) of the search engines were to follow IA's lead on this, I think they'd still be looking at balancing that with more intelligence in their spidering just to avoid the risk of cluttering up their databases with broken links and expired data, and that's likely to limit the bandwidth requirements considerably.
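Honoring an opt-out like that is trivial for any spider that wants to. Here's a minimal sketch using Python's standard-library robots.txt parser; the user-agent string and URLs are placeholders, not anyone's real crawler config:

import urllib.robotparser

# Rules a site might publish to opt out of one particular spider.
# Passed in as a list to keep the example self-contained; a real
# crawler would fetch and parse the site's /robots.txt instead.
rules = [
    "User-agent: ia_archiver",
    "Disallow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved spider runs this check before every fetch.
print(rp.can_fetch("ia_archiver", "https://example.com/page.html"))   # False
print(rp.can_fetch("SomeOtherBot", "https://example.com/page.html"))  # True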

Comment Re:yeah (Score 5, Informative) 171

IA does still spider, but they seem to use a more nuanced system than the rudimentary "start at /, then recursively follow every link" approach used by more trivial spidering algorithms. Firstly, they don't download an entire site in one go - they spread things out over time to avoid putting large spikes into the traffic pattern, which is friendlier to sites that are bandwidth-limited and on things like "xGB/month" plans. Secondly, they have a "popularity weighting" system that governs the order in which they spider and refresh sections of a given site, which is the main reason for the difference in the level of content between popular and less popular sites - although I have no idea whether that's based entirely on something like the site's Alexa ranking or is also weighted by how dynamic the content is (e.g. a highly dynamic site like Slashdot would get a bump up the priority list, whereas a mostly static reference site might get downgraded). Combine the two approaches and you get the results you are seeing: major web homepages get spidered more or less every day with several levels of links retrieved, while some random personal blog only gets spidered every few weeks or more, with only the homepage and the first level or two of links ever getting looked at.
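Purely as an illustration, a popularity-weighted, rate-limited refresh schedule can be sketched in a few lines; the weights, intervals, and fetch stub below are invented for the example and have nothing to do with IA's actual implementation:

import heapq
import time

# Hypothetical popularity weights: higher weight = more frequent refresh.
# A real system might derive these from traffic rank and content churn.
SITES = {"bignews.example": 1.0, "popularblog.example": 0.3, "personal.example": 0.05}
BASE_INTERVAL = 2.0  # seconds, to keep the demo short; think "about a day" in practice

def fetch(site):
    # Stand-in for crawling one slice of the site (homepage plus a level of links).
    print(time.strftime("%H:%M:%S"), "spidering", site)

# Min-heap of (next_due_timestamp, site) pairs.
queue = [(time.time(), site) for site in SITES]
heapq.heapify(queue)

for _ in range(10):  # a handful of rounds, for demonstration
    due, site = heapq.heappop(queue)
    time.sleep(max(0.0, due - time.time()))  # spreads load out rather than spiking it
    fetch(site)
    # Low-weight sites get pushed proportionally further into the future.
    heapq.heappush(queue, (time.time() + BASE_INTERVAL / SITES[site], site))

Run it and the high-weight site comes up for refresh several times before the low-weight one is visited a second time, which is exactly the pattern described above.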

Comment Re:yeah (Score 5, Informative) 171

An even more specific robots.txt directive for this case:

User-agent: ia_archiver
Disallow: /


As is often the case, Lauren is going off half-cocked with only part of the story. The IA already has a policy for removal requests (email info@), and is only considering expanding its current practice of ignoring robots.txt beyond the "test zone" of the .gov and .mil TLDs, where it has not caused any problems so far. They probably will do that (and for their archival purposes it's a good idea in principle), but I think it's only fair to see whether they listen to the feedback and provide a specific opt-out policy and technical mechanism - honoring at least one of the above, say - before going live on the rest of the Internet, rather than starting to scream and shout now. It's going to be a two-way street anyway: they're going to find a lot more sites that feed multiple MB of pseudo-random crap to spiders that ignore robots.txt, to do things like poison spammers' address lists, so it's actually in their best interests to provide an opt-out they honor.
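That kind of trap is old hat (wpoison is the classic example). As a purely illustrative toy sketch, with every address and link randomly generated:

import random
import string

def random_word(n=8):
    return "".join(random.choices(string.ascii_lowercase, k=n))

def poison_page(n_addresses=50, n_links=10):
    # Build an HTML page of fake mailto: addresses plus self-referencing
    # trap links - harvesters that ignore robots.txt hoover up the garbage,
    # while well-behaved spiders that honor it never see the page at all.
    parts = ["<html><body>"]
    for _ in range(n_addresses):
        addr = f"{random_word()}@{random_word()}.example"
        parts.append(f'<a href="mailto:{addr}">{addr}</a><br>')
    for _ in range(n_links):
        parts.append(f'<a href="/trap/{random_word()}.html">{random_word()}</a><br>')
    parts.append("</body></html>")
    return "\n".join(parts)

print(poison_page(5, 3))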

Besides, it's going to be interesting to see what kind of idiotic crap web admins who should know better think is safely hidden and/or secured because of robots.txt - it's useful to know who is particularly clueless so you can avoid them at all costs. :)

Submission + - Devuan Jessie 1.0.0 stable release candidate announced (devuan.org)

jaromil writes: Devuan 1.0.0-RC has been announced, following its beta 2 release last year. The Debian fork that was spawned by the systemd controversy is reaching stability and plans long-term support. Devuan deploys an innovative continuous-integration setup: falling back on Debian packages, it overlays its own modifications and then uses the merged source repository to ship images for 11 ARM targets, desktop and minimal live images, Vagrant and QEMU virtual machines, and the classic installer ISOs. The release announcement contains several links to projects that have already adopted this distribution as a base OS.
