Comment Re:Teribad summary (Score 3, Insightful) 221

All current input hardware uses fairly rectangular input grids, but the grid is often less regular than it appears. A digital camera has a rectangular array of pixels, but each pixel sits under a color filter, so each site records only one color channel. In a JPEG from the camera, all pixels carry full color values (and are compressed), but much of that information is purely the result of interpolation. That is one reason RAW is preferable: the file does not pretend to contain information that was never captured. One could also make hexagonal sensors, sensors with varying pixel density for a larger and more affordable field of view, or unusual lens designs where the projection is not rectangular. With vector or mesh-free image data you have far greater freedom in designing both input and output methodology.
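To make the interpolation point concrete, here is a minimal sketch of the kind of estimation a camera pipeline bakes into a JPEG: filling in the green channel at red/blue sites of an RGGB Bayer mosaic by averaging neighbours. The function name and the assumption of an RGGB layout are illustrative, not taken from any real camera firmware.

```python
def interp_green(raw):
    """Return a full-resolution green plane from an RGGB Bayer mosaic.

    raw: 2D list of sensor values. In an RGGB pattern, green samples
    sit where (row + col) is odd. At red/blue sites, green is
    estimated as the mean of the 4-connected green neighbours --
    exactly the kind of made-up data a JPEG contains and a RAW
    file does not.
    """
    h, w = len(raw), len(raw[0])
    green = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (y + x) % 2 == 1:           # a real green sample
                green[y][x] = raw[y][x]
            else:                           # interpolated, not measured
                vals = [raw[ny][nx]
                        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                        if 0 <= ny < h and 0 <= nx < w]
                green[y][x] = sum(vals) / len(vals)
    return green
```

On a flat scene the estimate happens to be exact; on edges and fine detail it is where demosaicing artifacts come from.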

Comment Re:First confirmation? Really? (Score 3, Interesting) 44

The main problem with the experiments of the Watson-Crick era was that the diffraction pattern was created by averaging along the helix, so you could not really discern individual nucleotides. Considering diffraction by itself to be somehow inferior to transmission techniques is not very convincing, in my opinion. It's not as though the scientists can see this with their own eyes, and even if they could, the interpretation would be totally dependent on the design of the equipment, just as reconstruction of diffraction data depends on a number of assumptions.

What's relevant and interesting is that we are getting close to observing individual molecules of DNA in detail, but that could be done with single-molecule diffraction techniques as well.

Comment Re:Programming is applied Math (Score 4, Insightful) 233

Programming, in this sense, is applied method calling into your supporting libraries and framework. It has more in common with designing a nice-looking Word template or using Excel in not overly creative ways. If a programmer of this kind ends up designing her own algorithms, or even worse a full class hierarchy, it will surely end up on The Daily WTF. The thing is, they should not need to. You don't expect a household electrician to rewire things with a new transformer design just because it seemed fun for one specific customer and maybe 10% more efficient. You do standard stuff in standard ways. It's not trivial, but it's all done within well-defined bounds.

I would never want this kind of job, but if you consider how many things are still done manually in one way or another by people with GFLOPS on their desktops, it's also obvious that cheaper and more plentiful access to people able to just crank out code has tremendous value.

Comment Re:Counterpoint (Score 1) 409

600,000,000 meters does not make 50,000,000 meters peanuts. Sure, they are different, but not that different. For an effective energy weapon, you are probably not fine with, say, 1% efficiency, so beam collimation becomes crucial. If the energy you hit the enemy with is a mere fraction of the energy you have to dump off safely from your own ship, you are in trouble. (Incidentally, that makes big defensive weapons based on atmosphere-free planetary bodies all the more realistic.)
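The waste-heat problem above is simple arithmetic: to put E joules on target at wall-plug efficiency η, your own ship has to dissipate E(1−η)/η joules. A back-of-envelope sketch, with purely illustrative numbers:

```python
def waste_heat(delivered_j, efficiency):
    """Joules of waste heat generated on your own ship in order to
    land `delivered_j` joules on the target at the given wall-plug
    efficiency (0 < efficiency <= 1)."""
    return delivered_j * (1.0 - efficiency) / efficiency

# Delivering 1 GJ on target at 1% efficiency means radiating 99 GJ yourself:
print(waste_heat(1e9, 0.01))   # 9.9e10 J
```

At 1% efficiency the shooter absorbs 99 times the energy the target does, which is why collimation (and a planetary body to sink the heat into) matters so much.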

Comment Re:128gb??? (Score 1) 278

No 2.5" 120 GB drive existed in 2001 either. 60 GB was a high-end laptop option in 2004. The first 1 TB desktop drive was released in 2007. And according to an old Engadget posting, the 120 GB Momentus was a big deal in 2005. Now, 7 years is of course a long time, but it's almost half of what you claimed. The minimum storage needed for a modern OS has barely increased in that time, except that you may to some extent carry dual userspaces if you used to ship only 32-bit and now support 64-bit as well.

Comment Re:And NASA has made mistakes with this before... (Score 5, Insightful) 228

why the NASA engineers want to take such a risk

Similar to some devices here on Earth, the rover should have an automatic revert mechanism. For instance, non-updatable software running on a separate processor detects specific conditions (such as no signal from Earth for a while) and flashes the updatable software back to its original version when that condition occurs.

Such things tend to be present, but how many times have they tested the automatic revert in actual conditions? An alternative codepath is always a risk.
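The revert mechanism described above amounts to a dead-man's switch. A minimal sketch, with all names (`RevertWatchdog`, the golden image, the flash callback) being hypothetical; a real rover would implement this in radiation-hardened hardware or ROM, not Python:

```python
import time

TIMEOUT_S = 7 * 24 * 3600   # e.g. a week of radio silence triggers revert

class RevertWatchdog:
    """Non-updatable supervisor: if Earth stays silent too long,
    reflash the updatable software with the known-good image."""

    def __init__(self, golden_image, flash_fn, now=time.monotonic):
        self.golden_image = golden_image   # immutable factory software
        self.flash_fn = flash_fn           # writes an image to main flash
        self.now = now
        self.last_contact = now()

    def heard_from_earth(self):
        """Call whenever a valid uplink command is received."""
        self.last_contact = self.now()

    def tick(self):
        """Call periodically; returns True if a revert was performed."""
        if self.now() - self.last_contact > TIMEOUT_S:
            self.flash_fn(self.golden_image)
            self.last_contact = self.now()   # avoid reflashing every tick
            return True
        return False
```

The untested part is exactly the point of the comment: the `flash_fn` path only ever runs in anger, so unless it has been exercised in realistic conditions it is itself a risk.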

Updating the software can have great advantages. Even a slightly more reliable connection would allow vastly more science to be done. Adapting the algorithms for autonomous functions such as simple navigation or sample processing also makes a great difference when the lag time for a single command is measured in minutes and you don't even have that level of "real-time" access most of the time.

Comment Re:"The smearing of a computer legend" (Score 5, Insightful) 286

From your link: "What is the evidence, then, that QDOS was a derivative work – a rip-off? The answer lies in the API, which describes how software can call up the underlying operating system and make it work for the user. The first 26 system calls of MS-DOS 1.0 are identical to the first 26 system calls of CP/M."

Yeah, just like Linux and WINE are rip-offs. The need to map system calls by number and not only name was of course due to the fact that the actual calling mechanism worked by number. However, the IEEE article is still strange, since the matters described are already settled. On the other hand, the legend of DOS being stolen and not only a clone lives on, in some places.
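The point about numeric calling can be made concrete: a CP/M-style dispatcher resolves a call by indexing a table with the function number, so any binary compiled against those numbers only works on a clone that keeps the same ordering. A sketch, where the table contents are an illustrative fragment, not the actual CP/M or MS-DOS list:

```python
# A dispatcher like the CP/M BDOS entry point takes a function
# *number* (historically in a register) and jumps through a table.
# Matching the first N entries is therefore a compatibility
# requirement, not evidence of copied code.
SYSCALLS = [
    "system_reset",    # 0
    "console_input",   # 1
    "console_output",  # 2
    # ...further entries in a fixed, binary-compatible order
]

def dispatch(number):
    """Resolve a numeric call the way a BDOS-style entry point did."""
    if not 0 <= number < len(SYSCALLS):
        raise ValueError("unknown function number")
    return SYSCALLS[number]
```

Renumbering a single entry breaks every existing binary, which is why an API-compatible clone necessarily shares the call ordering of the original.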

Comment Re:After Rage (Score 1) 635

A game is far easier to move to WinRT than a desktop app using a window-based GUI with GDI for drawing. You get access to a fullscreen surface and you can manipulate it using DirectX. I don't see a Windows 8-style requirement as much of a problem for a lot of titles. It will probably start with touch-based titles similar to what you find on iOS and Android, but if that succeeds, I think you will see many more "serious" desktop games ported over.

Comment Re:Reason? GNOME3 (Score 2) 535

However, you are not only polluting RAM with duplicate copies of code. You are also polluting L2 (and the L1 instruction cache, though that will probably be flushed anyway). There is no reason to make those context switches more expensive than really needed. And a large statically linked executable is in no way trivial to ignore; you can easily reach tens of MBs.
