Comment Re:Maybe (Score 2, Informative) 128
A stream of alpha particles would have a well-defined current, despite the lack of electrons.
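To put a number on that claim: each alpha particle carries a charge of +2e, so a beam delivering N particles per second carries a conventional current of I = 2eN. A minimal sketch (the 10^12/s flux is just a hypothetical round number):

```python
# Current carried by a stream of alpha particles: no electrons required.
# Each alpha carries charge +2e, so flux N per second gives I = 2 * e * N.

E_CHARGE = 1.602176634e-19  # elementary charge in coulombs (exact SI value)

def alpha_beam_current(particles_per_second: float) -> float:
    """Conventional current in amperes for a given alpha-particle flux."""
    return 2 * E_CHARGE * particles_per_second

# Hypothetical example: 1e12 alphas per second is about 0.32 microamperes.
print(alpha_beam_current(1e12))
```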
Yes. The antenna on my GPS receiver wouldn't fit into any smartphone, even if you took out the phone bits. I suspect if you took a smartphone into a deep valley with thick tree cover, you would find that it couldn't track satellites as well as even the cheapest dedicated receivers, let alone as well as a high-end unit. Also, no smartphone that I've yet heard of has a barometric altimeter, which means the altitude readings off a smartphone are pretty much a novelty. (The last time I went sea kayaking with a GPS, the altitude reported was -6 ft the whole time, despite the receiver clearly being 8 inches above the water.) An accurate altimeter can be very useful when navigating and geocaching in the mountains.
However, all the above reasons pale in comparison to the fact that my heart doesn't skip a few beats when I drop my GPS onto a rock in a river. I know it will be fine as long as I fetch it out before it gets washed downstream. If the same happened with an iPhone (which is far more likely to be dropped, due to the smooth, slick surfaces), I'd have good reason to panic.
And what is the actual thing ISPs provide access to? The Internet. But what is "the internet"? It certainly isn't a physical object that resides in a specific place. It's a communication system. Packet based, even. Which makes it very much an electronic embodiment of the idea of a system of post offices and post roads. All that internet service providers really do is deliver packets of information for their customers. At a high level, the service that ISPs provide is fundamentally the same kind of service as the one the USPS provides.
I think it's more that nobody is taking seriously the fundamental differences between hard drives and flash. Nobody has really stopped to do a comprehensive assessment of which existing assumptions, embodied in our software and in our users' habits, will be broken by flash memory that is asymmetric in both access speed and access granularity. As a result, the pre-Intel flash SSD controllers made really stupid trade-offs, and they ended up with drives that were less suitable for the consumer market than ordinary hard drives. Once Intel made everybody realize that latency and IOPS mattered a lot more to consumers than throughput, people moved on to the next difference, and started complaining about the lower write performance of a nearly full SSD. Even today, I still see people referring to it as a "bug", when it is nothing more than an inherent difference from the spinning platters of hard drives. Smart garbage collection (which requires smart OS support) is a way of hiding the limitation, but the lack of it isn't a bug any more than a hard drive with a small cache is faulty. It just has obvious room for improvement.
OS X has the only OpenCL implementation that allows you to use CPUs and GPUs to run compute kernels from the same context. NVidia's implementation is GPU-only, and ATI's seems to still be CPU-only, and you can't use them simultaneously.
The difference between one ISO terabyte and 1 TiB is relatively smaller than the variance among normal fingernails.
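The gap is easy to compute: an ISO terabyte is 10^12 bytes, while a tebibyte is 2^40 bytes, so the relative difference works out to just under 10%:

```python
# Relative difference between an ISO/SI terabyte and a tebibyte.
TB = 10**12   # ISO terabyte: 1,000,000,000,000 bytes
TiB = 2**40   # tebibyte:     1,099,511,627,776 bytes

relative_gap = (TiB - TB) / TB
print(relative_gap)  # about 0.0995, i.e. roughly a 10% difference
```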
To be fair, it doesn't really matter what version of Flash you're running. It still sucks, and is very insecure. The embarrassing part is more that it downgrades the version than that it exposes users to an extra security risk.
Apple's been using the UNIX trademark in relation to Snow Leopard for quite a while. Either 10.6 is certified (as Apple's website seems to imply), or The Open Group is in danger of losing their trademark.
What do you do when a security vulnerability or other serious bug is found in an app's bundled version of the DLL? Do you trust the app vendor to be around to release an update in a timely fashion? Or would you prefer Microsoft releases an update that looks for any outdated DLLs anywhere on your hard drive and overwrites them? What if there are games that, as an anti-cheating measure, check the hash of their version of the DLL to detect tampering?
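The anti-cheating scenario above is essentially a file-integrity check. A minimal sketch of the idea (the file path and expected digest are hypothetical, and real games may use different schemes):

```python
# Sketch of a bundled-DLL integrity check: hash the file and compare it to a
# digest baked into the game. Any replacement of the DLL, even a legitimate
# security update, changes the hash and fails the check.
import hashlib

def file_sha256(path: str) -> str:
    """SHA-256 digest of a file, read in chunks to handle large DLLs."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def dll_untampered(path: str, expected_digest: str) -> bool:
    """True only if the on-disk bytes exactly match what shipped."""
    return file_sha256(path) == expected_digest
```

This is exactly why a centrally pushed DLL fix would break such games: the check cannot distinguish a cheat from a patch.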
I think the point of the article is that new computers must be 64-bit capable in order to be advertised as Win7-ready. This is quite different from saying that computers being upgraded need 64-bit capabilities. In fact, Microsoft would be in huge trouble if they made Win7 refuse to install on non-64-bit capable machines, because the "release candidate" runs on machines as old as my 1.5 GHz Athlon XP, and such a drastic change in specs from something called a release candidate might not go over well with the FTC or the EU.
At worst, a phone in repeater mode would last as long as the normal talk time. However, if it's acting as a repeater in a dense mesh, it probably wouldn't need to (and shouldn't) transmit at as high a power as it would to reach a tower a mile away.
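The power argument can be made concrete. Assuming simple free-space (inverse-square) propagation, the transmit power needed to deliver the same received signal scales with the square of the distance, so a short mesh hop is dramatically cheaper than reaching a distant tower. The 100 m hop and 1-mile tower distances below are hypothetical round numbers:

```python
# Rough free-space estimate: required transmit power scales as distance^2,
# so a short mesh hop needs far less power than a long haul to a tower.
# Real-world propagation (obstacles, fading) usually makes this worse than
# inverse-square, which would favor short hops even more.

def relative_tx_power(hop_m: float, tower_m: float) -> float:
    """Factor by which a mesh hop needs less power than reaching the tower."""
    return (tower_m / hop_m) ** 2

MILE_M = 1609.344
print(relative_tx_power(100.0, MILE_M))  # about 259x less power per hop
```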
You haven't watched the hour+ long tech demo, have you? You seem to be completely unaware of its capabilities for collaboratively building a document, or its extension system that means people will be adding new capabilities all the time. It's a lot more than just an integration of email and IM.
They've had Super User for a while now. It's not Linux-specific, but Linux questions are very much welcome.
At my university, we have the VCL, a pool of blade servers accessible by RDP or SSH that get imaged on the fly when a user requests a machine with certain apps. These blades get wiped on log-out. (Home directories are of course stored elsewhere, and accessed over AFS.) This is very secure, yet it still lets students get admin access to their machines, and it also helps keep software licensing costs down, because it is trivial to limit the number of concurrent users of a package that isn't volume licensed. Performance when accessing the VCL on-campus is great, and in a corporate environment it could work well with thin clients.
Where are the calculations that go with a calculated risk?