The efficiency of electric motors is around 90%, so I'm assuming the fuel-powered pumps are so inefficient that it's worth using batteries instead of fuel despite the weight. These are also unlikely to use rechargeable batteries, so the energy density may be an order of magnitude higher than that of, say, rechargeable LiPo batteries.
About 17Hz or a bit more with most single DVI outputs, although 14Hz is the minimum the actual DVI spec requires. Twice that with two DVI signals. The display handles this by partitioning itself into tiles: either 3840x2400@14+Hz, or 2x1920x2400@28+Hz side by side, or 4x1920x1200@60Hz in a 2x2 grid, capped by the display at 41Hz or 48Hz depending on the model.
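Those refresh rates fall out of single-link DVI's 165 MHz pixel clock. A rough sketch of the arithmetic (the ~5% blanking overhead is my assumption; exact timings vary with the modeline):

```python
# Rough refresh-rate math for driving a 3840x2400 T221 over DVI.
# Assumes single-link DVI's 165 MHz pixel clock and reduced blanking
# (~5% overhead on top of the active pixels).
PIXEL_CLOCK = 165e6  # Hz, single-link DVI limit

def max_refresh(width, height, links=1, blanking=1.05):
    """Approximate max refresh rate (Hz) for a given number of DVI links."""
    active = width * height
    return links * PIXEL_CLOCK / (active * blanking)

print(round(max_refresh(3840, 2400), 1))           # one single-link DVI: ~17 Hz
print(round(max_refresh(3840, 2400, links=2), 1))  # two links, side-by-side halves: ~34 Hz
print(round(max_refresh(3840, 2400, links=4), 1))  # four links, 2x2 grid: enough for 60 Hz
```

The four-link figure exceeds 60Hz on paper, which is why the cap ends up being the display's internal logic rather than cable bandwidth.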
I have one of these (a rebadged T221; ViewSonic VP2290b), got it second-hand in 2008 or 2009. It's not just the display connection bandwidth: the 41Hz limit (48Hz on later models) comes from the display internals. They use huge custom FPGA logic chips to drive the signals, which are apparently not fast enough for more than that, although some of them can be overclocked to almost 60Hz. Without these internal limitations, four DVI cables have enough bandwidth to run one at 60Hz (4x1920x1200@60Hz).
I haven't bothered to drive mine with four cables, because with just two DVI signals (2x1920x2400) I get it up to 34Hz. I've dialed it back to 30Hz, though, because that divides evenly into 60Hz. For normal desktop use it's fast enough; for gaming and movies there are other displays.
Eventually resolutions will reach a point where, bandwidth-wise, it's better to put the GPU on the display side and run some future Thunderbolt-esque long cable than to push ever-higher bandwidths to the display. An 8K display with a resolution of 7680x4320 would require about 50Gbps of bandwidth to be driven at 60Hz with 8 bits per channel, or about 60Gbps at 10 bits per channel. The data rate actually required between the CPU, RAM and GPU is much lower nowadays, especially because most of the heavy lifting, like rendering and video decoding, is done by the GPU.
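The 8K figures are just raw pixel arithmetic (active pixel data only; blanking and link encoding overhead ignored):

```python
# Raw uncompressed video bandwidth for a display mode.
# bpc = bits per color channel; 3 channels (RGB) per pixel assumed.
def raw_bandwidth_gbps(width, height, hz, bpc):
    """Raw pixel data rate in Gbps, ignoring blanking and encoding overhead."""
    return width * height * hz * bpc * 3 / 1e9

print(round(raw_bandwidth_gbps(7680, 4320, 60, 8)))   # 8K@60, 8 bits/channel: ~48 Gbps
print(round(raw_bandwidth_gbps(7680, 4320, 60, 10)))  # 8K@60, 10 bits/channel: ~60 Gbps
```

Real links need somewhat more than this once blanking intervals and 8b/10b-style encoding are included.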
Well, someone who wants a modern car isn't buying an American V8 anyway.
Shouldn't maintainers of compromised systems be held liable for skimping on security?
iPhone 4 and newer iPhone battery replacement is fairly trivial:
1: Buy a battery and a pentalobe driver or bit from dealextreme or ebay for about $10
2: Unscrew the two case screws
3: Slide the back cover off
4: Unscrew the battery connector screw
5: Replace the battery and reassemble the back cover
I've done it about once a year on my iPhone 4, whenever the average recharge interval drops from about five days to about three days.
No, you must be thinking of 1970s stuff. Integrated circuits and surface-mounted components were mainstream by the late 1980s.
We've sent more spacecraft to Mars than to any other planet. We've had space stations with sustained life-support environments for quite a while. The Apollo program, on the other hand, started pretty much from scratch as far as space-faring goes.
Yes, but it was all-new tech back then. It's not so much about science and research anymore, just about finance and engineering to pull this off.
Because there's already an awesome open-source Mach kernel out there: XNU, and it ships with most Apple devices.
If anything, humans are polygamous. About a third cheat, and the other two thirds don't only because of social, financial and other consequences, or because they aren't attractive enough to find someone to cheat with.
I wouldn't count out the possibility of Samsung's Android diverging from mainline Android. That'd leave everyone else with whatever Google releases, while Samsung provides its own separate stack and exclusive third-party apps.
Probably 43.5 GWh over its lifetime. The article is badly written, and so is the summary: classic Slashdot style.
And that's a 2007 device, isn't it?