
Comment Finally, the invention I've been looking for. (Score 1) 286

Interesting. It looks like the US military has not only invented, but also improved on the technology needed to stab someone in the face over the internet. This will be a boon for reasonable internet discussions everywhere!
http://bash.org/?4281

Comment Re:Nice article (Score 1) 127

It is a mixture of geographical separation and pork-barrel politics. There are actually two LIGO interferometers, and the other one is "out west somewhere," right next to the Hanford Nuclear Reservation in Washington state. The Hanford location has the advantage of being about as close to the middle of nowhere as anyone could find during WWII, while still being near a city that now has a lot of technical people and resources. The choice for the eastern location was less clear, and Louisiana was chosen as being "good enough" while also appeasing members of Congress hungry for some good old-fashioned pork.

However... there were problems with the Louisiana location. The facility passed all the seismic surveys that were run before the site was chosen and construction started, and it sure looks nice and pretty with all that woodland around it. Soon after the site came online, though, that same forest matured and a logging company started harvesting it for timber. As it turns out, dropping a tree to the ground (even miles away) causes enough seismic noise that the interferometer was useless for much of the day while logging operations were going on. This forced them to install the active seismic damping system planned for Advanced LIGO well ahead of schedule in order to get functionality back. In the end it worked out OK and gave the active seismic isolation some early testing data, but it definitely caused a lot of extra work for a while.

Submission + - ATM Bombs Coming Soon to United States

HughPickens.com writes: Nick Summers has an interesting article at Bloomberg about the epidemic of 90 ATM bombings that has hit Britain since 2013. ATMs are vulnerable because the strongbox inside an ATM has two essential holes: a small slot in front that spits out bills to customers and a big door in back through which employees load reams of cash in large cassettes. "Criminals have learned to see this simple enclosure as a physics problem," writes Summers. "Gas is pumped in, and when it’s detonated, the weakest part—the large hinged door—is forced open. After an ATM blast, thieves force their way into the bank itself, where the now gaping rear of the cash machine is either exposed in the lobby or inside a trivially secured room. Set off with skill, the shock wave leaves the money neatly stacked, sometimes with a whiff of the distinctive garlic odor of acetylene." The rise in gas attacks has created a market opportunity for the companies that construct ATM components. Several manufacturers now make various anti-gas-attack modules: Some absorb shock waves, some detect gas and render it harmless, and some emit sound, fog, or dye to discourage thieves in the act.

As far as anyone knows, there has never been a gas attack on an American ATM. The leading theory points to the country’s primitive ATM cards. Along with Mongolia, Papua New Guinea, and not many other countries, the U.S. doesn’t require its plastic to contain an encryption chip, so stealing cards remains an effective, nonviolent way to get at the cash in an ATM. Encryption chip requirements are coming to the U.S. later this year, though. And given the gas raid’s many advantages, it may be only a matter of time until the back of an American ATM comes rocketing off.

Comment Re: Yawn (Score 1) 556

You are correct about things like pork, and there are several interesting religious conventions based on practices that science tends to agree with. (Many of these seem to be dietary in nature.) However, I find it a useful contrast to look at how religion and science change as time passes. For instance, pork then and now...

Then:
Jews: Don't eat pork.
Scientific hindsight: Don't eat pork, because it is infested with parasites and likely undercooked.

Now:
(Orthodox) Jews: Don't eat pork.
Science: Cook your pork properly.

The difference seems to me to be that religion has only maintained the action that should be taken, while science looks at the reasons behind it. When cooking methods and parasite populations change, science tells you to take another look at whether pork is safe to eat. Since most religious teachings seem to have lost the rationale behind them (if there ever was one), it is much harder to figure out when or if things should be reconsidered. Science has been much better about keeping the reasoning attached to the actions, to let future generations guide their own actions knowingly instead of blindly.

That isn't to say that this holds true for everything when comparing religion vs science, but I find it an interesting comparison.

Comment Re:headline fail (Score 1) 276

As someone who is currently writing up a dissertation dealing with this topic, I can assure you that mil spec is not sufficient. Hardening chips for radiation is completely different from hardening them for other hostile environments, especially when you look at the heavy ion strikes you can get in space.

Radiation effects are generally split into two basic categories: Total Ionizing Dose (TID) effects and Single Event Effects (SEEs). TID results from lots of little ion strikes, which gradually build up charge and/or defects and screw with transistor characteristics. Often the result is that transistors leak a lot more current when off, reducing your margins. Since this takes time to build up, it is highly unlikely to be what caused the issues with the probe. Mil spec chips often have a bit more tolerance for this, so mil spec does help, but not enough for long exposures.

SEEs are the result of a single, high-energy particle hitting the chip. The area of effect varies greatly depending on the energy of the particle, but the typical result of a strike is that a logic gate or a cluster of nearby logic gates ends up forced to output the wrong value. Essentially, one or more of your "0"s just became "1"s, and vice versa. If those values happened to be important to the current state of the machine or the OS running on it, then congratulations, you just got screwed. The two most common ways to harden a chip against this are temporal redundancy and logic redundancy. Temporally redundant circuits assume that any ion will only upset the logic for a short period of time, and they wait for the signal to become stable before storing values. This has been the staple of custom hardened chips for a while now, because it is relatively easy to convert all your flip-flops into hardened flip-flops and thus harden the entire circuit.

Logically redundant circuits essentially have 3 copies of the logic that vote to determine the correct value. This was often used in the early days of hardening, since you could just stick 3 chips in there and add some basic voting circuits outside the chips to correct the values. However, as processors got more complex, it became harder and harder to restore their state properly in a reasonable amount of time, so people tended to move to temporal hardening for custom chips, and only used logic hardening for things like FPGAs.
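For anyone curious what the voting part looks like, here's a minimal sketch in C. It's a software model of the idea, not output from any real hardening flow; the "logic" being triplicated and the simulated bit flip are made up for illustration:

#include <stdint.h>
#include <stdio.h>

/* Bitwise majority vote over three redundant copies of the same value:
 * each result bit is 1 iff at least two of the inputs have a 1 there.
 * This is the classic TMR voter, modeled in software. */
static uint32_t majority_vote(uint32_t a, uint32_t b, uint32_t c)
{
    return (a & b) | (a & c) | (b & c);
}

int main(void)
{
    uint32_t input = 7;
    uint32_t a = input * 3 + 1;            /* copy A of some combinational logic */
    uint32_t b = (input * 3 + 1) ^ 0x10u;  /* copy B, with a simulated bit flip  */
    uint32_t c = input * 3 + 1;            /* copy C                             */

    /* A single upset corrupts only one copy, so the voted result is still 22. */
    printf("a=%u b=%u c=%u voted=%u\n", a, b, c, majority_vote(a, b, c));
    return 0;
}

The voter cells on an actual chip implement the same (a & b) | (a & c) | (b & c) majority function in gates.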

Currently, however, temporal hardening is breaking down, since it doesn't scale well to smaller processes. A heavy ion deposits a fixed amount of charge, but smaller processes have less current flow per transistor, so it takes longer to remove that charge and restore proper operation. Thus, the length of time temporal designs have to wait for the signal to stabilize keeps increasing. This is one of the main reasons why hardened chips lag behind in terms of transistor size and the processes they can use. My graduate research has produced a method for high-speed, logically redundant circuits that scales well: you can automatically create three copies of the logic that vote on the same chip, using commercial synthesis and APR tools to automate the process. I firmly believe this is going to be the standard once people realize how much faster they can make chips run on new processes.
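To see why the wait time blows up, treat it as a first-order t = Q/I problem. The numbers below are invented purely to show the trend, not measurements from any particular process:

#include <stdio.h>

/* First-order recovery time for a single-event transient: the deposited
 * charge is roughly fixed by the ion, while the restoring drive current
 * shrinks with the process node.  All values are illustrative only. */
int main(void)
{
    double q_deposited_C = 100e-15;               /* ~100 fC from a heavy ion strike  */
    double drive_uA[] = { 400.0, 200.0, 100.0 };  /* drive current, shrinking per node */

    for (int i = 0; i < 3; i++) {
        double t_ns = q_deposited_C / (drive_uA[i] * 1e-6) * 1e9;
        printf("I = %3.0f uA  ->  t ~ %.2f ns\n", drive_uA[i], t_ns);
    }
    return 0;
}

Halve the drive current and the transient you have to wait out roughly doubles, which is exactly the scaling problem for temporal designs.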

Comment Re:Local Neighborhood (Score 1) 125

That isn't the point. This isn't a local vs. big store issue, and it isn't really about comic books. It is about publishers and bookstores, and the balance of power between them. Exclusive deals between publishers and a specific store with a specific DRM tied to a specific brand of ebook reader are bad. Even if it happens to be the biggest store that services the most people, it just helps to support a monopoly. And even if it were a smaller store that just happened to use your personal brand, so you personally were not inconvenienced, it still artificially segments the market and tries to lock people into different personal playgrounds so that they can be milked by one company. The market needs to be open and unrestricted to promote quality services, not locked into whichever bookstore happened to score a deal on what you most want to read, even if that bookstore is worse than the one you really want to use.

Insofar as they're standing up for a less segmented market, I applaud B&N for taking the long view on this issue, despite any short-term loss in profits. Although they may be in it for selfish reasons (would they have protested if they were the ones with the exclusive deal and Amazon was shut out?), they are at least making some sort of stand. Now we just need someone big enough to take a stand against the entire idea of DRM on ebooks....

Comment We don't need another friend list. (Score 1) 88

Even if the store isn't broken and works flawlessly (and from what I've heard, that is a big if), I'm still opposed to exclusive Origin titles just on general principle. Withdrawing titles from other services is a huge pain in the ass for the consumer and does a huge disservice to PC gaming as a whole. It fragments the PC community even more, creating yet another friends list you need to keep track of and yet another program that needs to be running. Ultimately, PC gaming can only benefit from a single unified service that tracks friends, achievements and such, just like there is only one Xbox ecosystem and one PS3 ecosystem. But there is no point in trying to force people to adopt your system as the primary one. While EA may think it sucks that Valve got there first, the solution is not to keep fracturing the marketplace and forcing people to use a special system just for your games. The only responsible policy is to provide your games on any existing systems, so that people have a choice of which one they want to use. Then you make yours good enough that people want to use it.

From a business standpoint, I can see why they pulled their titles from Steam: if they didn't, no one would ever use Origin. But if that is the case, you have to ask yourself why you're forcing people to use a service they would never normally use. You want people to use your service because they like its features, not resent it because it's their only choice. It seems like the only reason they want to push Origin is that they want some of the money Steam is making, not because they actually feel they can offer a new and/or superior product.

Like a lot of EA's decisions, this seems very short-term and focused tightly on pure monetary numbers. Cash grabs can work, but build up enough ill will among your customers and eventually they'll stop buying your stuff. It takes a long time to get to that point, but EA has been working at it for years. More and more of my friends seem to be aware of the crap they're pulling, and that is not a good sign for them. I'm already fairly careful about which games I buy, and I tend to skip ones that aren't available on Steam. Refusing to put your games where I do most of my shopping can only hurt you more.

Comment Re:Mine it. (Score 3, Informative) 500

http://mitnse.com/2011/03/16/what-is-decay-heat/

There is no more uranium fission; that was stopped within seconds of the earthquake hitting. The problem is the decay products of the reaction, which are unstable and thus radioactive. The power given off by the reactor at this point is just a percent or so of its original output, and all of it comes from those unstable isotopes decaying on their own. There is no real point to separating the fuel: the byproducts will keep generating decay heat with or without any neutrons hitting them. Removing them to make them easier to cool is pointless, since by the time anyone could set that up, they could have set up a real cooling system and solved the problem on site.
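If you want a rough feel for the numbers, the standard Wigner-Way textbook fit estimates decay heat as a fraction of pre-shutdown power. This is a generic approximation, not something taken from the linked article, and the one-year operating time is just an assumption:

#include <stdio.h>
#include <math.h>

/* Wigner-Way approximation for decay heat as a fraction of pre-shutdown
 * thermal power.  t_s = seconds since shutdown, t_op_s = seconds the
 * reactor had been operating.  The 0.066 coefficient and -0.2 exponents
 * are the usual textbook fit; treat the result as order-of-magnitude. */
static double decay_heat_fraction(double t_s, double t_op_s)
{
    return 0.066 * (pow(t_s, -0.2) - pow(t_s + t_op_s, -0.2));
}

int main(void)
{
    double t_op = 365.0 * 24 * 3600;  /* assume roughly 1 year of operation */
    printf("1 hour after shutdown: %.2f%%\n", 100 * decay_heat_fraction(3600, t_op));
    printf("1 day  after shutdown: %.2f%%\n", 100 * decay_heat_fraction(86400, t_op));
    printf("1 week after shutdown: %.2f%%\n", 100 * decay_heat_fraction(7 * 86400, t_op));
    return 0;
}

That works out to roughly 1% an hour after shutdown, falling slowly from there, which is where the "a percent or so" figure comes from.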

Comment Re:Starting from full stop ..... (Score 1) 776

My current car has cruise control with resume functionality, but it clears the stored speed whenever the car drops below a certain threshold (somewhere around 10-30 mph; I really only notice it when stopping). So you can tap the brake pedal to take it off cruise, coast for a bit, then hit resume to get back to the previous speed, but if you slow to a stop or close to it, the stored speed is cleared and you have to get back up to the desired speed manually.

Comment Re:Industry slow to respond to challenges (Score 1) 305

This, also, shouldn't be news.

Niche applications have a much smaller install base and must make more money on each sale to pay for the same amount of development. Since niche markets often have orders of magnitude fewer users, you have to both jack up the price and cut back on development.

It's the difference between having 50,000 users and 100 developers versus 500 users and 10 developers. Assuming the projects are of comparable complexity, you're going to pay 10x as much and get something 10x less polished.
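Spelled out with made-up numbers (the per-developer cost is pure illustration):

#include <stdio.h>

/* Toy per-seat pricing: to fund the same yearly cost per developer, a
 * smaller install base has to pay proportionally more per license.
 * All figures are invented for illustration. */
int main(void)
{
    double cost_per_dev = 150000.0;  /* assumed fully loaded yearly cost */

    /* Mass-market product: 100 developers, 50,000 paying users. */
    double mass_price  = (100.0 * cost_per_dev) / 50000.0;
    /* Niche product: 10 developers, 500 paying users. */
    double niche_price = (10.0 * cost_per_dev) / 500.0;

    printf("mass-market price per user: $%.0f/yr\n", mass_price);   /* $300  */
    printf("niche price per user:       $%.0f/yr\n", niche_price);  /* $3000 */
    return 0;
}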

Comment They just don't get it. (Score 5, Insightful) 123

Games? Social networking? The fact that Murdoch is part of this venture does not surprise me, because it shows an astounding lack of understanding of why people are buying ebook readers and what the market actually wants in a book reader appliance. Namely, they failed to do a prior-art search and find the millions of PDAs people were using to do exactly what this new format is proposing. Or rather... not doing exactly what this format is proposing, because no one really needs it and it is an energy hog.

The Kindle and other ebook readers (e.g. the Sony one I've owned for the past 3 years) did not become popular because they were a new idea or a new device; they became popular because of a new technology: e-ink. There were book readers before e-ink displays came around, but very few people used them because they suffered from two major drawbacks. The first was that the power consumption of their displays meant you had to plug them in and let them charge once or twice a day. People already have to charge their cell phones daily, but charging a device twice a day when you use it a lot is pretty annoying, and a huge amount of power goes to the display when a cell phone is being used. The second drawback was simply screen real estate and the interface for getting at it. PDAs could do exactly what is being proposed, but people didn't use them that way because it was hard to use a handheld device in that manner. Sure, handheld gaming devices exist and are used... but they have buttons and layouts specifically tailored to using the device for games. The same goes for cell phones, PDAs, and ebook readers. You can play games on cell phones, but not easily, and the power usage sucks up the battery. The new format proposal looks to do exactly the same thing to ebook readers. Congratulations, you just re-invented the N-Gage.

The major "killer app" in the ebook market that no one is mentioning is really quite simple. It isn't a killer display (black and white is fine for books), it isn't a fancy new display (though color would be nice, it would also be mostly useless and a major expense), and it isn't a whiz-bang new DRMed file format. What is missing from the ebook marketplace is simply a universal storefront. Amazon books only work with the kindle. Sony's store only works with their ebook readers. The same for most other ebook stores (with a wider list of readers that can use their store... but a lower percentage of people who actually have those readers). DRM has fractured the marketplace, but selling to the entire install base of ebook readers is really quite simple because all ebook readers out there can read non-DRMed files. It is only the stores that are enforcing DRM. The first store to offer a wide selection of books in non-DRMed format at reasonable prices will suddenly be able to sell to 100% of people interested in ebooks and steal market share from everyone else out there.

I could rant on this subject for days, but the bottom line is: I can get almost any book out there for free from pirates, and I don't have to worry about losing those books when I migrate from my Sony Reader to whatever device I end up using next (the battery is finally dying). However, I've bought most of my books from the Baen store, because I can get them fast, easily, and with good proofreading. They are easier to read and to find, and they aren't some OCRed crap with forced line breaks and errors. Publishers have to understand that on the web, they're not competing against the price and convenience of other publishers; they're competing against some random pirate scanning in a copy of their book and giving it away for free. If it isn't easy to find a copy of their book that works on my system for a reasonable price ($15 for a paperback selling for $8 at the local bookstore?), there is no reason to give them money.

That said, there is one thing in the proposed format I can see some value in: daily deliverables. This is something that isn't done all that well in current-generation ebook readers, but it isn't exactly a new idea. Freeware for the Sony Reader that could download and sync online newspapers has existed for quite some time now. I first ran into it a couple of years back, but didn't actually use the functionality. The only real drawback was having to connect the reader to your computer to update, so smooth wireless updating would be worth some money. So it is valuable, but not nearly as new and unique as they seem to think. For that matter, I saw info on the new "Sony Daily" that is supposed to come out soon, and its entire premise is that it can download content wirelessly. If they can actually deliver content easily and smoothly over a wireless link, I see no real reason to move to a special format and the inevitable device-specific DRM that tries to lock you in.

Comment Re:Not the engineers fault (Score 1) 383

The Therac-25 incidents happened partly because there were hardware interlocks on previous versions but not on the updated version. However, a simple "don't kill the patient" interlock would not have worked. The basic problem is that the machine handled both e-beam and X-ray treatments, and you get X-rays by hitting a target with an e-beam of much, much greater power; the target absorbs the e-beam and emits a much weaker X-ray beam. If I remember what I read about this correctly, all of the incidents were some form of "we wanted X-rays, but the target was rotated out as if we wanted an e-beam, so the full-power e-beam was applied to the patient instead of the X-ray target." In standard X-ray operation (by far the majority of the treatments requested), the beam had to be active at a high power level. Since that beam was more than strong enough to kill anyone if the target was improperly placed, almost every single treatment would involve someone bypassing a "don't kill the patient" safeguard. That is just begging to be bypassed each and every time without thinking about it.

The fact that there were several bugs that led to similar results, with no backup, is the major issue. There are various ways to address that, including hardware interlocks, actual software review, and an exhaustive test methodology (which includes designing the software so that it can be tested exhaustively). In the end, they cut corners and this killed patients. They reduced cost by removing the "extraneous" hardware interlocks found on the Therac-20 model, because they didn't realize those interlocks had been activating and saving lives. They reduced cost by hiring programmers who clearly did not understand proper code design and by reusing old code that depended on the interlocks. They reduced cost by not requiring exhaustive testing, or code that supported exhaustive testing. In particular, the hardware interlocks were not simple "low power or else" checks, but more involved checks on which power levels were valid for the other settings. That is more complicated than a simple "don't go to high power without authorization" check, and thus more expensive.

I can remember two examples of errors that caused problems. One of the incidents involved an 8-bit integer that was incremented, rather than simply set, each time a continuous loop checked the setup and found it not ready. This integer was part of the check that verified the target was in place. So a procedure where you make a slight mistake, fix it, but forget to rotate the target back in would be stopped by this check... 255 out of 256 times. The other 1 time out of 256, the counter had just rolled over to zero and gave an incorrect answer. Someone lost that game of Russian roulette.
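The arithmetic of that 1-in-256 failure is easy to reproduce. This is a toy reconstruction of the pattern in C, not the actual Therac-25 code:

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Toy reconstruction of the rollover bug: a "setup incomplete" flag that is
 * incremented on every check instead of being set to a fixed nonzero value.
 * After 256 increments the 8-bit counter wraps back to 0, which then reads
 * as "setup complete" even though it is not. */
int main(void)
{
    uint8_t setup_incomplete = 0;

    for (int check = 1; check <= 600; check++) {
        setup_incomplete++;                       /* bug: should be "= 1" */
        bool safe_to_fire = (setup_incomplete == 0);
        if (safe_to_fire)
            printf("check %d: spuriously reports safe\n", check);  /* fires at 256, 512 */
    }
    return 0;
}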

Another of the incidents involved fast data entry. You enter the dosage as if you were going to give the patient an X-ray beam (which was much more common than e-beam treatments and became a habit for some operators) and hit enter at the bottom of the setup form. This starts the beam strength calibration. If you then realize you really wanted an e-beam of the same strength for this patient, you go back to the top, change one entry from X-ray to e-beam, and fly through the rest of the form, hitting enter, in under 8 seconds to get back to the bottom. The beam strength calibration finishes 8 seconds after you hit enter the first time, exits its loop, and checks that the form is still properly filled out (which by now it is). Then it rotates the target out because you asked for an e-beam, without double-checking the power setting, which was originally set for X-rays. Since it never goes back to re-validate the power setting against the e-beam/X-ray mode and just checks the single "form properly filled out" variable, it is inherently dangerous. This was "fixed" by the infamous hack of removing the up-arrow key from the keyboard, forcing people to take more than 8 seconds to fill out the form again.
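Again as a toy model rather than the real code: the dangerous pattern is re-checking a single "entry complete" flag instead of re-deriving a safe beam power from the (possibly edited) mode. The struct fields and power limit below are invented for illustration:

#include <stdbool.h>

/* Toy model of the data-entry race described above, not the real Therac-25
 * code.  The unsafe version re-checks only a single "entry complete" flag
 * when calibration finishes, instead of re-reading the mode and checking
 * that the frozen beam power is still legal for it. */
enum beam_mode { MODE_XRAY, MODE_ELECTRON };

struct treatment {
    enum beam_mode mode;        /* operator can still edit this...           */
    double         beam_power;  /* ...after power was frozen at X-ray levels */
    bool           entry_complete;
};

/* Illustrative limit: the highest power that is safe to deliver without the
 * X-ray target rotated into the beam path. */
static const double ELECTRON_MODE_POWER_LIMIT = 1.0;

/* BUG (deliberate, mirroring the story): only the flag is consulted. */
bool may_proceed_unsafe(const struct treatment *t)
{
    return t->entry_complete;
}

/* What the check should do: re-validate power against the current mode. */
bool may_proceed_safe(const struct treatment *t)
{
    bool power_ok = (t->mode == MODE_XRAY)
                 || (t->beam_power <= ELECTRON_MODE_POWER_LIMIT);
    return t->entry_complete && power_ok;
}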

While I'm more of a hardware engineer than a software one, even I can see that neither of these errors should have been made by anyone who knew what the heck they were doing. The fact that they were not caught in an exhaustive review before going into a product as potentially dangerous as a radiation treatment machine is... well, a case study in how to do things wrong.

Comment Re:Sharing books? (Score 1) 503

Yes, Amazon's store is not a good advertisement for the value of electronic publishing vs. paper. It is better than the Sony store, but that is not a ringing endorsement. The price differences are pretty random and nonsensical, but at least the prices are often in the same range as a paperback instead of hardback pricing for a book that has been out in paperback for months. Compare that to Baen, however. The last book I was reading is $8 on Amazon for the dead-tree edition; at the Baen online store it is $6 with no DRM and in a variety of formats. And they've been selling all their books for 1-2 bucks less than the paperback edition for the past several years, even while the book was still in hardback. It gets even cheaper when you consider the "webscription" bundle of all the books they're publishing or republishing in a month, typically 3-4 new hardback books and 3-4 older paperback books for $15. If you'll read half those books, that is $5 or less per book.

I've spent around $1000 at Baen's store in the last 3 years to put books on my Sony Reader. The only problem is that they can only publish new and interesting books so fast. I've bought pretty much nothing from any other online bookseller (unless you count the free $50 coupon that came with my Sony Reader, which is how I discovered exactly how badly the Sony store sucked). It's not that I don't want to buy other books; it's that every other bookstore I've found either had restrictive DRM that didn't work with the Sony Reader or had a pretty horrible selection. Or both. If I could have found a store like Baen with a wider selection, I could easily have spent another $2000 on my book habit. Instead, I reread old books or look for them from pirated sources in badly OCRed and formatted versions.

Comment Re:Non-Toxic inert? (Score 1) 237

This distinction is part of what makes the Hanford area in Washington such a difficult cleanup effort. The separation of plutonium for WWII isn't really the problem; it's all the poorly documented experimental methods they used during the Cold War. You end up with radioactive metals dissolved in all sorts of chemicals, and then nobody bothers to document which chemicals. I wouldn't even consider it radioactive waste exactly; it's some nasty chemical waste that just happens to be radioactive from the dissolved metals. Separating the radioactive metals from the rest of the chemical soup would be a significant first step, because then you can treat each part differently: the radioactive stuff won't try to eat its way through its container, and the chemical stuff won't try to kill you just for standing next to it.

Comment Re:They exist. (Score 1) 553

Exactly. The initial design for LIGO was only expected to be sensitive to the largest possible sources of gravity waves. The theories on most of these weren't proven in any real sense, so it was decided to go look for them. Many theories predict a much lower level of background signal, which LIGO cannot yet detect without a _very_ long run time. It is faster to upgrade the device and keep looking with more sensitivity than to integrate longer, hunting for coherency in a noisy signal. The evidence we've seen so far of gravity waves comes from sources that would be much weaker than the current LIGO sensitivity can detect, or else from sources that are expected to be very rare (two neutron stars colliding should give off a ton of gravitational energy, but how often does that happen within X light years?).

The original LIGO sensitivity also matched or slightly exceeded that of the other gravitational wave observatories being designed around the globe, so it was considered a good place to start. However, while other observatories in more populated areas get a lot of their sensitivity from very complicated mirror suspension systems and active isolation systems that reduce outside noise, LIGO gets its sensitivity from being in the middle of nowhere with plenty of space to build massively long beam tubes, and sensitivity scales directly with arm length. Thus, it was easier to get LIGO running with simpler suspension systems, and upgrading the sensitivity does not require replacing the entire device. Simply replacing the mirror suspensions with the more advanced ones that others have been working the bugs out of should give a large boost to the sensitivity. And since others have been working with these new systems, they'll be better understood and will hopefully take less time and fiddling once they're installed in the LIGO facilities.
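The "scales directly with arm length" point is just dL = h * L: strain is a fractional length change, so longer arms turn the same wave into a bigger displacement at the photodetector. A quick illustration in C (the strain value is only a representative order of magnitude, and the arm lengths are the commonly quoted ones for GEO600, Virgo, and LIGO):

#include <stdio.h>

/* Strain h is a fractional length change, so the displacement an
 * interferometer must resolve is dL = h * L: arm length multiplies the
 * signal directly.  The strain below is a representative order of
 * magnitude, not a LIGO specification. */
int main(void)
{
    double h = 1e-21;                            /* representative strain         */
    double arms_m[] = { 600.0, 3000.0, 4000.0 }; /* e.g. GEO600, Virgo, LIGO arms */

    for (int i = 0; i < 3; i++)
        printf("L = %6.0f m  ->  dL = %.2e m\n", arms_m[i], h * arms_m[i]);
    return 0;
}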
