US != the rest of the world and vice versa
Since the article is about US models of cars, the US numbers are appropriate. Since the article claims that this feature is already common in European diesels, I would guess that it is still a win in those cars as well.
These conditions apparently don't show up enough to justify the cost of determining safe operating parameters. Therefore, no flying. It isn't really complicated -- if a bunch of airlines want to get together and pay for the testing, they can fly. Otherwise, they stay on the ground.
I think that if the average person walking around had a 10% chance of carrying a camera that could do thermal imaging, it would be hard to argue that you had a reasonable expectation of privacy, and the police would probably be allowed to use it without a warrant.
If LIDAR cost $100k, I think law enforcement would still be entitled to use it, but if FLIR cost $100, that would make a difference.
Or, learn what you are talking about.
IR cameras and film detect NIR in the 800 nm to 1.3 micron range. Your stove heating element that is glowing a dim red will light up brightly in such a device, but it is completely useless for this type of application. IR thermometers and thermal imaging systems for the 0-100F range use much longer wavelengths, around 10 microns.
Note that you can't even make IR film that is any good at thermal wavelengths, because it would get exposed just sitting in a box. The film would have to be prepared, stored, used, and developed in a cryogenic environment. This may have been done (perhaps for IR astronomy), but you obviously can't just buy a roll of 35mm "thermal" film and pop it in a Nikon.
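A quick back-of-the-envelope check with Wien's displacement law (peak wavelength = b / T, with b ≈ 2898 μm·K) shows why the two bands are so far apart; the temperatures below are rough illustrative guesses, not measurements:

```python
# Wien's displacement law: lambda_peak = b / T
WIEN_B_UM_K = 2897.8  # Wien constant, in micron-kelvins

def peak_wavelength_um(temp_kelvin):
    """Peak blackbody emission wavelength, in microns."""
    return WIEN_B_UM_K / temp_kelvin

# Object near 100 F (~311 K): emission peaks around 9.3 microns,
# squarely in the thermal IR band that 10-micron imagers cover.
print(round(peak_wavelength_um(311), 1))

# Stove element glowing dim red (~900 K): peaks near 3.2 microns;
# only its short-wavelength tail reaches the 800 nm - 1.3 micron
# NIR band that IR film and cameras see.
print(round(peak_wavelength_um(900), 1))
```

So a sensor built for the NIR band misses nearly all of the radiation from a room-temperature scene, which is the commenter's point.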
Openfiler's web gui is buggy as hell, its local LDAP server option is poorly documented and provides terrible diagnostic messages when improperly configured, and it has no official support for installing/booting from flash. Never trust a product that wants to charge money for the admin guide.
I only tried FreeNAS briefly and did end up using Openfiler, but I would love to see anything beat Openfiler.
He is represented by a public defender, which means he can't afford a new lawyer, and his current lawyer can't afford to put together a respectable case.
GCD is a mechanism to let one central authority dispatch threads across multiple cores, for all running applications (including the OS).
This is what most people talk about, and what is most obvious from the name, but it is not the interesting part of GCD.
The interesting part of GCD is blocks and tasks, and it is useful to the extent to which it makes expressing parallelism more convenient for the programmer.
The "central management of OS threads" is marketing speak for an N:M scheduler with an OS-wide limit on the number of heavyweight threads. This is only useful because OS X has horrendous per-thread overhead. On Linux, for instance, the correct answer is usually to create as many threads as you have parallel tasks and let the OS scheduler sort it out. Other operating systems (Solaris, Windows) have caught up to Linux on this front, but apparently not OS X. If you can get the overhead of OS threads down to an acceptable level, it is always better to avoid multiple layers of scheduling.
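The thread-per-task approach described above can be sketched like this (in Python rather than C, purely for illustration; the work function and item list are made up):

```python
import threading

def work(n, results, idx):
    """Toy parallel work item: square a number and store the result."""
    results[idx] = n * n

# Thread-per-task: spawn one OS thread per parallel work item and let
# the kernel scheduler sort out placement across cores -- no extra
# user-space scheduling layer in between.
items = list(range(8))
results = [None] * len(items)
threads = [threading.Thread(target=work, args=(n, results, i))
           for i, n in enumerate(items)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

With cheap kernel threads this is the whole program; an N:M layer like GCD's thread pool only pays off when creating and scheduling those threads is expensive.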
So we've had a defined standard that was, arguably, not the easiest to understand. THEN hard drive manufacturers started their fraud. And THEN people started complaining. So what, and please think about this, would be the right decision here?
This is revisionist at best and really just wrong. Despite all "wisdom" to the contrary, there has never been a universal acceptance of 1 MB = 2^20 bytes on computers. For instance, all of IBM's mainframe hard drives from the 60s and 70s were sold using base-10 prefixes. Early desktop hard drives from the 80s used both. I think the ST506 used base-2, but some other models used base-10. All networking and communications standards (Ethernet, modems, PCI, SATA...) use base-10 prefixes for MB/s and Mbit/s. 3.5" floppy disks used NASA-style units where 1 MB = 10^3 * 2^10 bytes. Even while RAM is still almost always measured in base-2 units (due to manufacturing issues making it much easier to produce in power-of-2 sizes -- something which is not true for hard drives), the speed of the memory bus on your CPU is still measured in base-10 units.
It is a *good* idea to have K and M mean the same thing everywhere. A system where a 1 GB/s link transfers 0.93 GB every second is stupid. This is especially important as computers are being used in more and more environments. Should a 1 megapixel camera mean 2^20 pixels? What about CDs with a 44.1 kHz sampling rate?
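The arithmetic behind the two examples above is easy to check; the "1.44 MB" figure is the nominal capacity printed on HD 3.5" floppies:

```python
# The competing meanings of "mega":
SI_MB = 10**6              # base-10: 1,000,000 bytes
BIN_MB = 2**20             # base-2:  1,048,576 bytes
FLOPPY_MB = 10**3 * 2**10  # the floppy hybrid: 1,024,000 bytes

# A "1.44 MB" floppy actually holds 1.44 * 1,024,000 = 1,474,560 bytes,
# which is neither 1.44 * 10**6 nor 1.44 * 2**20.
print(int(1.44 * FLOPPY_MB))

# A link rated at 1 GB/s (10**9 bytes/s) moves only ~0.93 "GB"
# per second if you insist that 1 GB means 2**30 bytes.
print(round(10**9 / 2**30, 2))
```

The ~7% gap between 10^9 and 2^30 is exactly the discrepancy the lawsuit-style complaints about drive capacity are built on.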
Why would you say something as silly as that the SPICE netlist format is not a programming language?