Submission + - Tar Pitch Drop Captured on Camera (nature.com)

Ron024 writes: How long would you be willing to wait for a drop of the black stuff in Dublin? After 69 years, one of the longest-running laboratory investigations in the world has finally captured the fall of a drop of tar pitch on camera for the first time. A similar, better-known and older experiment in Australia missed filming its latest drop in 2000 because the camera was offline at the time.

Submission + - Google starts sending adverts as emails to Gmail users (geek.com) 1

An anonymous reader writes: Back in May, Google rolled out an update to Gmail that it marketed as “a new inbox.” What it did was to split the email you receive into categories and then display them in different tabs. The Gmail redesign wasn’t just to help users, though. It turns out Google has decided to introduce a new form of advertising because of it, one that you could view as being much more intrusive than before.

Some users have started noticing that new emails they haven't signed up to receive are appearing in the Promotions tab. These emails are marked as "Ad" under the sender name. A little further investigation reveals they are actually Google adverts packaged as emails.

Submission + - Ask Slashdot: What Is The Most Painless Intro To GPU Programming? 3

dryriver writes: Dear Slashdotters. I am an intermediate-level programmer who works mostly in C#/.NET. I have a couple of image/video processing algorithms that are highly parallelizable — running them on a GPU instead of a CPU should result in a considerable speedup (anywhere from 10x to perhaps 30x or 40x, depending on the quality of the implementation). Now here is my question: What, currently, is the most painless way to start playing with GPU programming? Do I have to learn CUDA/OpenCL — which seems a daunting task to me — or is there a simpler way? Perhaps a Visual Programming Language or "VPL" that lets you connect boxes/nodes and access the GPU very simply? I should mention that I am on Windows, and that the GPU computing prototypes I want to build should be able to run on Windows. Surely there must be a "relatively painless" way out there, with which one can begin to learn how to harness the GPU?
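A rough way to frame the 10x-40x expectation is Amdahl's law: the overall speedup is capped by whatever fraction of the algorithm stays serial on the CPU. A minimal sketch, where the parallel fractions are illustrative assumptions rather than measurements:

```python
# Amdahl's law: overall speedup when a fraction p of the runtime is
# parallelizable and that portion is accelerated by a factor s.
def amdahl_speedup(p, s):
    return 1.0 / ((1.0 - p) + p / s)

# Assumed parallel fractions for illustration, not measured values:
print(round(amdahl_speedup(0.95, 100), 1))  # ~16.8x overall
print(round(amdahl_speedup(0.99, 100), 1))  # ~50.3x overall
```

The takeaway: even with a GPU running the parallel part 100x faster, a 95%-parallel algorithm tops out near 17x overall, so the quoted range is realistic only if well over 90% of the runtime actually moves to the GPU.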

Submission + - MIT computer program makes TCP twice as fast (mit.edu)

An anonymous reader writes: MIT is claiming they can make the Internet faster if we let computers redesign TCP/IP instead of coding it by hand. They used machine learning to design a version of TCP that's twice the speed and causes half the delay, even with modern bufferbloated networks. They also claim it's more "fair." The researchers have put up a lengthy FAQ and source code where they admit they don't know why the system works, only that it goes faster than normal TCP. On the same day that MIT went to court to stop Aaron Swartz's documents from being published, the school is devoting its main website to an animated GIF about faster TCP.

Submission + - Google is going Puritan on us (zdnet.com)

DougDot writes: In three days, on Monday, Google's Blogger will begin to delete scores of blogs that have existed since 1999, under its vague new anti-sex-ad policy.

On Wednesday night at around 7pm PST, all Blogger blogs marked as "adult" were sent an email from Google's Blogger team.

The email told users with "adult" blogs that after Sunday, June 30, 2013, all adult blogs will be deleted if they are found to be "displaying advertisements to adult websites" — while the current Content Policy does not define what constitutes "adult" content.

To say that Twitter ignited with outrage would be an understatement. Blogger users are panicked and mad as hell at Google.

Submission + - IT Spending In Engineering

An anonymous reader writes: I work in the engineering division at a large organization, about 2,000 people total and about 900 in the engineering division. Like many institutions recently, we are dealing with reduced budgets. We have a new director who has determined that the engineering division spends too much on "IT" and has given us a goal of reducing IT spending by 50%. We currently spend about 8% of the total engineering budget on IT-related purchases. About 10% of that (i.e. 0.8% of the total budget) goes to what I consider traditional IT, such as email and office automation software. The rest goes towards engineering-related IT, such as clusters for large computations, workstations for processing, better networks to handle the large data sets generated, and data collection systems for testing facilities. My gut says that 8% is low compared to other engineering institutions. What do other engineering organizations spend on IT (traditional and engineering)? What strategy would you use to convince your management that 8% spending on IT is already very efficient?
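The percentages are easy to lose track of; a quick sketch of the arithmetic as stated in the submission, normalized to a budget of 1.0:

```python
# Budget shares as stated in the submission, normalized to a total of 1.0.
total_budget = 1.0
it_spend = 0.08 * total_budget        # ~8% of the engineering budget is IT
traditional_it = 0.10 * it_spend      # ~10% of that is traditional IT
engineering_it = it_spend - traditional_it

print(round(traditional_it, 4))       # 0.008 -> 0.8% of the total budget
print(round(engineering_it, 4))       # 0.072 -> 7.2% is engineering-specific IT
print(round(0.5 * it_spend, 4))       # 0.04  -> the director's 50%-cut target
```

The numbers make the submitter's point: even eliminating traditional IT entirely frees up only 0.8% of the budget, nowhere near the 4% target, so any 50% cut must come out of the engineering-specific systems.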

Submission + - Japan Spends Millions of 2011 Tsunami Relief Funds On Sea Turtle Research (ibtimes.com)

Rebecka writes: “Priorities” is the word residents of Japan may be uttering today after reports that officials spent a majority of the 200 billion yen set aside for 2011 tsunami relief efforts not on rebuilding their cities, but on sea turtle research.

The prefecture of Kagoshima, located roughly 800 miles from the affected zone of Ishinomaki, was awarded three million yen after the devastating 2011 disaster. A new report claims the donated funds were spent on efforts by researchers to observe and protect sea turtles, according to The Telegraph via the Asahi Shimbun newspaper.

Submission + - How Unity3D Became a Game-Development Beast (slashdot.org)

Nerval's Lobster writes: In the early 2000s, three young programmers without much money gathered in a basement and started coding what would become one of the most widely used pieces of software in the video game industry. “Nobody really remembers how we survived in that period except we probably didn’t eat much,” said David Helgason, the CEO and co-founder of Unity Technologies, maker of the Unity3D game engine. A decade later, untold numbers of developers have used Unity3D to make thousands of video games for mobile devices, consoles, browsers, PCs, Macs, and even Linux. The existence of Unity3D and similar products (such as the Unreal Engine and CryEngine) helped democratize game development, making the kinds of tools used by the world’s largest game companies available to developers at little or no cost. This has helped developers focus less on creating a video game’s underlying technology and more on the artistic and creative processes that actually make games fun to play. In this article, Helgason talks about how Final Cut Pro helped inspire his team during the initial building stages, how it's possible to create a game in Unity without actually writing code, and how he hopes to make the software more of a presence on traditional consoles despite Unity3D being several years late in supporting the PS3 and Xbox 360.

Submission + - Google Glass Developers Angered at Facial Recognition Ban (ibtimes.co.uk)

DavidGilbert99 writes: Google Glass has caused its fair share of headlines already and it's not even on sale yet, but developers creating apps for the device are the latest to throw their toys out of the pram, angered by Google's move to ban apps which use facial recognition in response to growing concerns from privacy groups. One commenter on the Google Glass page said: "The sky isn't falling and Glass will not steal your identity because it can see you. Go back to your caves and let the rest of us live in the future."

Submission + - Congressman's Chief of Staff Implicated in Internet Voter Fraud (miamiherald.com) 3

An anonymous reader writes: Voting technology and fair elections have been the topics of many discussions on Slashdot. Now the repercussions of voter fraud, often asserted to be mythical despite previous sightings and more than one recent conviction, have cost a congressman's chief of staff his job: "Congressman Joe Garcia’s chief of staff abruptly resigned Friday after being implicated in a sophisticated scheme to manipulate last year’s primary elections by submitting hundreds of fraudulent absentee-ballot requests. . . Hours earlier, law enforcement investigators raided the homes of another of Joe Garcia’s employees . . . The raids marked a sign of significant progress in the probe that prosecutors reopened in February, after a Herald investigation found that 2,552 fraudulent requests for the Aug. 14 primaries originated from Internet Protocol addresses in Miami. The bulk of the requests were masked by foreign IP addresses."

Submission + - Atomic clock built with 10^-18 instability. (arxiv.org)

c0lo writes: Apropos accuracy polls and missing options:

A collaboration of NIST, the University of Colorado, the Istituto Nazionale di Ricerca Metrologica and the Politecnico di Torino announces (warning: PDF linked) the creation of an atomic clock with an instability of 10^-18. Such a level of instability is equivalent to specifying the Earth’s diameter with a precision of less than the width of an atom.
Better still, consider the gravitational redshift, a consequence of general relativity dictating that clocks ‘tick’ slower in stronger gravitational fields. With a maximum instability of 10^-18, one can discern the difference between two such clocks separated by only 1 cm in elevation above the Earth’s surface.

Now, it is likely that an operator of half-a-world-away-remote-controlled drones won't actually need such precision, but it becomes important in designing experiments to test unification theories employing non-metric couplings without requiring a cosmological setup (may one dream of gravitational-wave detection without causing — or expecting — supernova explosions or the birth of black holes?)
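Both equivalences quoted in the submission can be sanity-checked with standard constants; a back-of-envelope illustration, not the paper's own calculation:

```python
# Back-of-envelope check of the two precision claims, standard constants.
g = 9.81                    # m/s^2, Earth's surface gravity
c = 299_792_458             # m/s, speed of light
earth_diameter = 1.2742e7   # m, mean diameter

# Gravitational redshift between clocks 1 cm apart in height:
# delta_f / f = g * delta_h / c^2
shift = g * 0.01 / c**2
print(f"{shift:.2e}")       # ~1.09e-18, just above the 1e-18 instability floor

# A fractional precision of 1e-18 applied to Earth's diameter:
length = earth_diameter * 1e-18
print(f"{length:.2e} m")    # ~1.27e-11 m, well under an atomic width (~1e-10 m)
```

Both claims hold up: a 1 cm height difference produces a fractional frequency shift just resolvable at 10^-18, and that same fraction of Earth's diameter is roughly a tenth of an atomic diameter.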
