Comment They just don't get it. (Score 5, Insightful) 123

Games? Social networking? The fact that Murdoch is part of this venture does not surprise me, because it shows an astounding lack of understanding of why people buy ebook readers and what the market actually wants in a book-reader appliance. Namely, they failed to do a prior-art search and find the millions of PDAs people were using to do exactly what this new format is proposing. Or rather... not doing exactly what this format is proposing, because no one really needs it and it is an energy hog.

The Kindle and other ebook readers (e.g. the Sony one I've owned for the past 3 years) did not become popular because they were a new idea or a new device; they became popular because of a new technology: e-ink. There were book readers before e-ink displays came around, but very few people used them because they suffered from two major drawbacks. The first was power consumption: their displays meant you had to plug them in and charge them once or twice a day. People already charge their cell phones daily, but charging a device twice a day under heavy use is genuinely annoying, and a huge fraction of a phone's power goes to the display while it's in use. The second drawback was simply screen real estate and the interface to it. PDAs could do exactly what is being proposed, but they didn't, because it was hard to use a handheld device in that manner. Sure, handheld gaming devices exist and are used... but they have buttons and layouts specifically tailored to gaming. The same goes for cell phones, PDAs, and ebook readers. You can play games on cell phones, but not easily, and the power draw eats the battery. The new format proposal looks to do exactly the same thing to ebook readers. Congratulations, you just re-invented the N-Gage.

The major "killer app" in the ebook market that no one is mentioning is really quite simple. It isn't a killer display (black and white is fine for books), it isn't a fancy new display (color would be nice, but also mostly useless and a major expense), and it isn't a whiz-bang new DRMed file format. What is missing from the ebook marketplace is simply a universal storefront. Amazon's books only work with the Kindle. Sony's store only works with their ebook readers. The same goes for most other ebook stores (some support a longer list of readers, but a lower percentage of people actually own those readers). DRM has fractured the marketplace, yet selling to the entire installed base of ebook readers is really quite simple, because every ebook reader out there can read non-DRMed files. It is only the stores that are enforcing DRM. The first store to offer a wide selection of books in non-DRMed formats at reasonable prices will suddenly be able to sell to 100% of the people interested in ebooks and steal market share from everyone else out there.

I could rant on this subject for days, but the bottom line is: I can get almost any book out there for free from pirates, and I don't have to worry about losing those books when I migrate from my Sony Reader to whatever device I end up using next (the battery is finally dying). However, I've bought most of my books from the Baen store, because I can get them fast, easily, and with good proofreading. They are easier to read and easier to find, and they aren't some OCRed crap with forced line breaks and errors. Publishers have to understand that on the web, they're not competing against the price and convenience of other publishers; they're competing against some random pirate scanning in a copy of their book and giving it away for free. If it isn't easy to find a copy of their book that will work on my system for a reasonable price ($15 for a paperback selling for $8 at the local bookstore?), there is no reason to give them money.

That said, there is one thing in the proposed format I can see some value in: daily deliverables. This is something current-generation ebook readers don't do all that well, but it isn't exactly a new idea. There has been freeware for the Sony Reader that could download and sync online newspapers for quite some time now. I first ran into it a couple of years back, though I never actually used the functionality. Its only real drawback was having to connect the reader to a computer to update, so smooth wireless updating would be worth some money. So it is valuable, but not nearly as new and unique as they seem to think. For that matter, I saw info on the new "Sony Daily" that is supposed to come out soon, and its entire premise is wireless content delivery. If they can actually deliver content easily and smoothly over a wireless link, I see no real reason to move to a special format and the inevitable device-specific DRM that tries to lock you in.

Comment Re:Not the engineers fault (Score 1) 383

The Therac-25 incidents happened partly because there were hardware interlocks on previous versions but not on the updated version. However, a simple "don't kill the patient" interlock would not have worked. The basic problem is that the same machine handled both e-beam and X-ray dosage. You get X-rays by hitting a target with an e-beam of much, much greater power; the target absorbs the e-beam and emits a much weaker X-ray beam. If I remember what I read about this incident correctly, all of the accidents were some form of "we wanted X-rays, but the target was rotated out as if we wanted an e-beam, so the full e-beam was applied to the patient instead of the X-ray target." Since X-ray operation made up the large majority of treatments, the beam had to run at the high power level in most cases. That beam was more than strong enough to kill anyone if the target was improperly placed, so almost every single treatment would involve someone bypassing a "don't kill the patient" safeguard. That is just begging to be bypassed every time without thinking about it.

The fact that several bugs led to similar results, with no backup, is the major issue. There are various ways to fix this, including hardware interlocks, actual software review, and exhaustive test methodology (including designing the software so that it can be tested exhaustively). In the end, they cut corners, and this killed patients. They reduced cost by removing the "extraneous" hardware interlocks found on the Therac-20 model, not realizing those interlocks were activating and saving lives. They reduced cost by hiring programmers who clearly did not understand proper code design and by reusing old code that depended on the interlocks. They reduced cost by not requiring exhaustive testing, or code written to support it. In particular, the hardware interlocks were not simple "low power or else" checks, but more complicated checks of which power levels were valid for the other settings — more expensive than a simple "don't go to high power without authorization" check, and thus a tempting corner to cut.

I can remember two examples of errors that caused problems. One incident involved an 8-bit integer that was incremented, in a continuous loop, every time a setup check ran and found the machine not ready. This integer was part of the check that verified the target was in place. So a setup sequence where you make a slight mistake, fix it, but forget to rotate the target back in would be stopped by this check... 255 times out of 256. The other 1 time in 256, the counter had just rolled over to zero and the check gave an incorrect "all clear." Someone lost that game of Russian roulette.
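The rollover is easy to show in a few lines. This is a sketch of the pattern, not the actual Therac code (the function name and return strings are hypothetical): a flag meant to be a boolean is incremented instead of set, and the 8-bit wrap turns "unsafe" into "safe" once every 256 checks.

```python
FLAG_MODULUS = 256  # an 8-bit counter wraps modulo 256

def run_setup_checks(not_ready_passes):
    """Simulate the flawed pattern: a shared flag is *incremented*
    (rather than set to 1) each time setup is found incomplete.
    Nonzero means 'unsafe, keep waiting'; zero means 'safe to fire'."""
    flag = 0
    for _ in range(not_ready_passes):
        flag = (flag + 1) % FLAG_MODULUS  # 8-bit increment, wraps to 0
        if flag == 0:
            # Rollover: the check silently reads as 'safe'.
            return "beam fired with target out"
    return "blocked until ready"

# 255 consecutive not-ready passes: the interlock holds.
assert run_setup_checks(255) == "blocked until ready"
# On the 256th increment the flag wraps to zero and the interlock vanishes.
assert run_setup_checks(256) == "beam fired with target out"
```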

Another incident involved fast data entry. An operator enters the dosage as if giving the patient an X-ray beam (much more common than e-beam treatments, and a habit for some operators) and hits enter at the bottom of the setup form. This starts the beam-strength calibration, which takes 8 seconds. If the operator then realizes the patient actually needed an e-beam at that strength, goes back to the top, changes one entry from X-ray to e-beam, and flies through the rest of the form within those 8 seconds, the calibration finishes, exits its loop, and checks only that the form is properly filled out (which by now it is). It then removes the target because e-beam was requested, without re-checking the power setting, which was originally calibrated for X-rays. Since it never re-validates the power setting against the e-beam/X-ray mode and just checks a single "form properly filled out" variable, the design is inherently dangerous. The company's fix was the infamous "remove the up-arrow key from the keyboard" hack, forcing operators to take more than 8 seconds to fill out the form again.
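The race above comes down to the software trusting one "entry complete" flag instead of re-validating mode against power at the end of calibration. A minimal sketch of that flawed check (Python; the class, field names, and power limit are all made up for illustration, not Therac code):

```python
EBEAM_SAFE_LIMIT = 25  # made-up power limit, illustration only

class BeamSetup:
    def __init__(self, mode, power):
        self.mode = mode           # "xray" or "ebeam"
        self.power = power         # power entered for the *original* mode
        self.form_complete = True  # the one flag calibration re-checks

    def edit_mode(self, new_mode):
        # Operator flies back up the form during the 8-second calibration;
        # only the mode changes, power stays at the X-ray level, and
        # form_complete is never cleared.
        self.mode = new_mode

def finish_calibration(setup):
    """Flawed end-of-calibration check: trusts the single flag instead
    of re-validating power against the (possibly edited) mode."""
    if not setup.form_complete:
        return "recalibrate"
    if setup.mode == "ebeam" and setup.power > EBEAM_SAFE_LIMIT:
        return "raw e-beam at X-ray power"  # target out, power far too high
    return "safe treatment"

setup = BeamSetup("xray", power=100)  # X-ray prescription entered
setup.edit_mode("ebeam")              # quick edit before calibration finishes
assert finish_calibration(setup) == "raw e-beam at X-ray power"
```

The fix is obvious in this form: recompute safety from the current mode and power, never from a stale completeness flag.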

While I'm more of a hardware engineer than a software one, even I can see that neither of these errors should have been made by anyone who knew what the heck they were doing. The fact that they were not caught in exhaustive review before going into a product as potentially dangerous as a radiation-treatment machine is... well, a case study in how to do things wrong.

Comment Re:Sharing books? (Score 1) 503

Yes, Amazon's store is not a good advertisement for the value of electronic publishing vs. paper. It is better than the Sony store, but that is not a ringing endorsement. The price differences are pretty random and nonsensical, but at least they're often in the same range as a paperback, instead of pricing a book like a hardback even though it has been out in paperback for months. Compare that to Baen Books, however. The book I was just reading is $8 on Amazon for the dead-tree edition. At the Baen online store it is $6, with no DRM, in a variety of formats. And they've been selling all their books for 1-2 bucks cheaper than the paperback edition for the past several years, even while the book was still in hardback. It's cheaper still if you consider the "webscription" bundle of all the books they're publishing or republishing in a month, typically 3-4 new hardbacks and 3-4 older paperbacks for $15. If you read even half those books, that is $5 or less per book.

I've spent around $1000 at Baen's store in the last 3 years to put books on my Sony Reader. The only problem is that they can only publish new and interesting books so fast. I've bought pretty much nothing from any other online bookseller (unless you count the free $50 coupon that came with my Sony ebook reader, where I discovered exactly how badly the Sony store sucked). It's not that I don't want to buy other books; it's that all the other bookstores I've found either had restrictive DRM that didn't work with the Sony Reader or had pretty horrible selection. Or both. If I could have found a store like Baen with a wider selection of books, I could easily have spent another $2000 on my book habit. Instead, I reread old books, or look for them from pirated sources in badly OCRed and formatted versions.

Comment Re:Non-Toxic inert? (Score 1) 237

This distinction is part of what makes the Hanford area in Washington such a difficult cleanup effort. The separation of plutonium for WW2 isn't really the problem; it's all the poorly documented experimental methods they used during the Cold War. You end up with radioactive metals dissolved in all sorts of chemicals, with no record of which chemicals. I wouldn't even consider it radioactive waste, exactly; it's nasty chemical waste that happens to be radioactive from the dissolved metals. Separating the radioactive metals from the rest of the chemical soup would be a significant first step, because then you can treat each part appropriately: the radioactive stuff won't try to eat its way through its container, and the chemical stuff won't try to kill you just for standing next to it.

Comment Re:They exist. (Score 1) 553

Exactly. The initial design for LIGO was only expected to be sensitive to the largest possible sources of gravitational waves. The theories on most of these weren't proven in any real sense, so it was decided to go look for them. Many theories predict a much lower level of background signal, which LIGO cannot yet detect without a _very_ long run time. It is faster to upgrade the device and keep looking with more sensitivity than to integrate longer, hunting for coherency in a noisy signal. The evidence we've seen so far for gravitational waves comes from sources that would be much weaker than the current LIGO sensitivity, or from events that are expected to be very rare (two neutron stars colliding should give off a ton of gravitational energy, but how often does that happen within X light years?).

The original LIGO sensitivity also matched or slightly exceeded the other gravitational-wave observatories being designed around the globe, so it was considered a good place to start. However, while observatories in more populated areas get much of their sensitivity from very complicated mirror-suspension systems and active isolation to reduce outside noise, LIGO gets its sensitivity by being in the middle of nowhere, with plenty of space to build massively long beam tubes — and sensitivity scales directly with arm length. Thus it was easier to get LIGO running with simpler suspension systems, and upgrading the sensitivity does not require replacing the entire device. Simply replacing the mirror suspensions with the more advanced ones that others have been working the bugs out of should give a large boost to the sensitivity. And since others have been working with these new systems, they'll be better understood and should take less time and fiddling once installed in the LIGO facilities.
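Since a gravitational-wave strain is just a fractional length change, h = ΔL/L, the arm-length advantage can be sketched in a few lines (Python; the strain value is an order-of-magnitude assumption for illustration, not a LIGO spec):

```python
import math

def arm_displacement(strain, arm_length_m):
    """Length change delta_L = h * L produced by a strain h
    on an interferometer arm of length L (meters)."""
    return strain * arm_length_m

h = 1e-21                                # assumed order-of-magnitude strain
lab_arm = arm_displacement(h, 300.0)     # a shorter lab-scale instrument
ligo_arm = arm_displacement(h, 4000.0)   # LIGO's 4 km beam tubes

# Same strain, but ~13x the length change to measure at 4 km:
assert math.isclose(ligo_arm / lab_arm, 4000.0 / 300.0)
```

That direct proportionality is why long beam tubes in the middle of nowhere buy sensitivity that crowded sites have to earn with elaborate isolation hardware.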

Comment Re:I think I see the problem. (Score 2, Funny) 553

I worked at LIGO Hanford a few years back before going back to grad school. Since it is essentially a scaled-up prototype, new things were always being fiddled with and the device was very temperamental. If we could have blown the dust out of the cartridge, we would have. How easy or hard it was to gain and hold lock (when the laser is resonating properly) varied on a daily or sometimes hourly basis, with no obvious way to tell what was wrong this time.

As a joke, I put together an emergency kit for the control room. It consisted of:
1) one(1) cardboard box with "emergency locking kit" written on it. Also suitable for use as an altar.
2) one(1) rubber chicken for use as a sacrifice for any suitable god.
3) one(1) butter knife stolen from the lunch room.

To my knowledge, it was never officially used. But the rubber chicken did end up with some suspicious marks on its neck and the butter knife did end up with red marks along the edge. It was claimed to be accidental damage and a slip with one of the whiteboard markers, but I suspect something else was at play.

Comment Re:It is not the volts (Score 1) 336

I'll expand on this, since people keep claiming I'm wrong. It all depends on where you measure the voltage. If it's at the device itself, then technically I'm wrong. But if you look at the body itself, and the important parts of it, I'm correct. Everything has some capacitance and inductance associated with it, even the human body. It isn't a great capacitor or a great inductor, but it does act somewhat like one. This doesn't matter at DC or low frequencies, but for AC or high-frequency transients (shocks from rubbing your feet, the initial hit of a spark plug or taser, etc.) these values start to have an effect.

Without going into gory details, the main effect of these values is that they smooth out the voltage that is actually applied. The capacitance of your body resists instantaneous changes in voltage, so the "12,000V" discharge is not applied to your body in the exact picosecond you touch it. Instead, it starts charging your body's capacitance, and your body's voltage starts to rise. If the voltage source can't actually sustain 12,000V across your body, its output voltage drops very, very quickly, eventually reaching equilibrium at a lower voltage across the body — hopefully one that is not fatal.
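A minimal charge-sharing sketch of that sag (Python; the capacitance values are typical ESD-model ballpark assumptions, not measurements of any particular device):

```python
def equilibrium_voltage(v_source, c_source, c_body):
    """Final voltage after a small charged capacitance dumps into the
    body. Charge is conserved: Q = C_src * V0 = (C_src + C_body) * V_f."""
    q = c_source * v_source
    return q / (c_source + c_body)

v0 = 12_000.0    # advertised open-circuit voltage
c_src = 10e-12   # ~10 pF: a small charged object (assumed)
c_body = 100e-12 # ~100 pF: human-body-model capacitance (assumed)

v_final = equilibrium_voltage(v0, c_src, c_body)
# The body's capacitance drags the shared voltage down by roughly 11x,
# from 12,000 V to about 1,100 V, before any resistive discharge begins.
```

The exact numbers don't matter; the point is that a source with no sustained drive behind it cannot hold its advertised voltage across a body.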

So, to correct my statement: anything that can sustain 12,000V can not only kill you, it can jump air gaps to do so. Anything that can't sustain that voltage is likely just painful.

Of course, if it can't sustain the voltage, usually the number is just given by the marketing department to sound large. Or for very specific purposes (ESD testing, spark generation, etc.).

Comment Re:It is not the volts (Score 5, Informative) 336

Somewhat off topic, but...

While true and oft-repeated, the volts/amps comment ignores the fact that there is a definite relation between the two. It is easier to determine the exact effect on the body if you know how many amps went through the person's heart and/or other muscles, but ballpark figures in volts give some idea of the danger. The body is essentially just a resistor, so there is a linear relation between volts and amps once you know where the voltage is applied and thus the resistance of the body between those two points. With 12 volts it takes some ingenuity to kill someone; 120 volts from a wall socket is dangerous if mishandled; 1,200 volts applied directly to the skin almost anywhere can be fatal; and 12,000 volts will not only kill you, it will arc through small air gaps to do so. (With tasers, you don't get all of the claimed thousands of volts across the body; most is dissipated across the air gap or regulated by the circuitry to keep the current low.)
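A back-of-envelope Ohm's-law illustration of those ballparks (Python; the 1 kΩ body resistance and ~30 mA hazard threshold are commonly cited rough figures, not exact physiology):

```python
BODY_RESISTANCE_OHMS = 1_000.0  # assumed wet-skin, hand-to-hand ballpark
DANGEROUS_CURRENT_A = 0.030     # ~30 mA across the chest, a rough threshold

def body_current(volts, resistance=BODY_RESISTANCE_OHMS):
    """Ohm's law: current through an assumed fixed body resistance."""
    return volts / resistance

for v in (12, 120, 1200, 12000):
    amps = body_current(v)
    print(f"{v:>6} V -> {amps * 1000:8.1f} mA, "
          f"dangerous: {amps >= DANGEROUS_CURRENT_A}")
```

Under these assumptions 12 V stays near 12 mA (unpleasant, not lethal without ingenuity), while everything from 120 V up clears the hazard threshold, matching the intuition in the paragraph above.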

The way I look at it, amps give you a good idea of how dead you are. Volts give you a measure of how hard something is trying to kill you.

Comment Re:That could be pretty cool (Score 1) 80

My current annoyance with Sony is product support. I bought the Sony Reader 3 years ago. I just upgraded to Vista 64 after my computer had a minor meltdown, and their software for the very first version of the device does not support 64-bit OSes. Apparently the PRS-500 had its own specialized USB driver, while the current models act more like a thumb drive, so Sony never ported the specialized drivers to 64-bit Vista; they just let the normal USB drivers handle the 505 and later models. 64-bit Vista has been out for how long now? At this point I just assume it will never happen: they've dropped support for the device only a couple of years after putting it on the market. Inexcusable.

Comment Re:When Will the Average Consumer Learn? (Score 5, Insightful) 311

I completely disagree. Buying an item you don't intend to actually use is sending the wrong message. You're rewarding the book publishers for their insane DRM when you should be discouraging them.

Finding pirated books can be a pain in the ass. If they're going to force me to spend time looking for a copy with bad proofreading and odd line breaks, I'm going to ask for a refund on the money I spent on the book. Or better yet, just not spend it in the first place. It's not that I'm unwilling to buy ebooks; it's that I value my time, and spending 10-60 minutes digging through websites and peer-to-peer applications costs me more than the book is worth in the first place.

And for the record, I've spent just under $1000 at Baen's online store over the last 3 years, because the books there are unencumbered by DRM and are easy to find and buy. I'm more than willing to buy books if I'm given a fair deal. It just seems that a lot of book publishers are so scared by the piracy boogieman that they piss off their real customers.

Comment Re:The real questions is: (Score 1) 170

This is the major question, and I can't seem to find any info about it. If the books are sold without DRM, Google is in a position to force other online publishers to follow suit, but at the same time fewer publishers will want to list books with Google (due to perceived losses from piracy). It seems like more publishers are wising up to the fact that DRM is only hurting them, but there is still a long way to go before all books are available in non-DRMed formats. I suspect Google will take the middle ground again and let publishers choose whether their books are DRMed, which means all the major publishers will continue trying to make DRM work.

The basic issue is that every major ebook reader can handle a large number of non-DRMed formats, but only one DRM format. If you can't find the book in that specific DRM format, you're out of luck. Typically these formats are specific to the company that makes the reader (i.e. Amazon's Kindle format, Sony's Reader format, etc.). The Sony store is expensive and has a limited selection. Amazon has a much better selection, though not a perfect one, and is often expensive as well. Fictionwise has a mediocre selection (better than Sony's in my area of interest, at least), but their DRM doesn't work with the two most common ebook readers (Sony's and Amazon's).

Since Google is not going to put out its own ebook reader, I'm hoping they will go without DRM so that I can use their store with my Sony Reader. If not, I'll end up pirating books again: free but often badly formatted copies, found after spending 4-5x as much time looking as I would with a proper store. I've already tapped out Baen's back catalog of interesting books (spent close to $1k getting the ones that looked interesting) and I read books faster than they can publish them. I'm willing to buy books online; I just can't find someone to take my money and give me something that works. Yes, this frustrates the hell out of me. I refuse to buy books and then pirate them, because that sends the signal that DRM is acceptable. If you're going to make me spend time hunting for crappily formatted books out of fear that I'll steal something, I'm not going to pay you for it. I don't like that this means authors don't get paid, but I'm more than happy that publishers don't get paid because of it.

Comment Re:Written to be released on DVD (Score 1) 834

Exactly. I like Heroes, but I refuse to watch it as it comes out. When every single episode has a "to be continued" cliffhanger and you have half a dozen or more stories in the air at once, it is just too dang annoying to wait a week between episodes. Seasons 1 and 2 I watched on the NBC website. This last season I just finished watching on Hulu (much better interface, no surprise there). At 3ish episodes a night it took about a week, and was much more enjoyable and understandable than stretching it out over half a year.
