I tried it for a few minutes, and the Prius never suddenly accelerated. Clearly the simulation is flawed.
There have also been some concerns over possible problems related to hydrogen gas leakage.[50] Molecular hydrogen leaks slowly from most containment vessels. It has been hypothesized that if significant amounts of hydrogen gas (H2) escape, hydrogen gas may, because of ultraviolet radiation, form free radicals (H) in the stratosphere. These free radicals would then be able to act as catalysts for ozone depletion. A large enough increase in stratospheric hydrogen from leaked H2 could exacerbate the depletion process. However, the effect of these leakage problems may not be significant. The amount of hydrogen that leaks today is much lower (by a factor of 10–100) than the estimated 10–20% figure conjectured by some researchers; for example, in Germany, the leakage rate is only 0.1% (less than the natural gas leak rate of 0.7%). At most, such leakage would likely be no more than 1–2% even with widespread hydrogen use, using present technology.[50]
...
[50] ^ a b "Assessing the Future Hydrogen Economy (letters)" (PDF). Science. 10 October 2003. Retrieved 2008-05-09.
The implication there is that even if leakage were a major problem, the gas doesn't escape the planet. Even if it did, and we switched entirely to hydrogen, and consumed 100 times the current rate of energy, I have a hard time believing we'd actually make a dent in the oceans. I'm going to guess that, by volume, all the oil that was ever on the planet is pretty trivial compared to the size of the oceans. And unlike what happens to oil when we burn it, most or all of the hydrogen would eventually be converted back into water.
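That guess can be sanity-checked with a back-of-envelope calculation. The figures below are rough order-of-magnitude assumptions (cumulative world oil production of very roughly 1.5 trillion barrels, ocean volume of roughly 1.3 billion cubic kilometres), not precise data:

```python
# Back-of-envelope: how does all the oil ever produced compare to the oceans?
# All constants are rough order-of-magnitude assumptions.

OIL_BARRELS = 1.5e12        # ~cumulative world oil production, barrels (assumption)
M3_PER_BARREL = 0.159       # one barrel is about 0.159 cubic metres
OCEAN_VOLUME_M3 = 1.3e18    # ~1.3 billion km^3 of ocean (assumption)

oil_m3 = OIL_BARRELS * M3_PER_BARREL
fraction = oil_m3 / OCEAN_VOLUME_M3

print(f"Oil ever produced: ~{oil_m3:.1e} m^3")
print(f"Fraction of ocean volume: ~{fraction:.1e}")  # ~1.8e-07
```

Even with these crude numbers, the total oil comes out to well under a part per million of the ocean's volume, which supports the "pretty trivial" intuition.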
If Mr. Sullivan needs [the fact that Jobs doesn't talk about the general problem with proprietary technology] explained to him, then maybe he should hold his comments until he understands it. Does he actually expect *every* article, blog post, or story to rehash this basic concept?
I think it's reasonable to expect that an editorial whose first big bold bullet point complains that Flash is "not open" would somehow address why Jobs thinks we should care. I know why I care, but it's not at all clear why Jobs thinks I should care.
My take as a "young developer" isn't that the kernel is too complex or that I don't like developing for it (although it certainly has its issues) so much as that the drawbacks outweigh the benefits. I've written device drivers for some hardware I've got lying around, as well as done some board support and bring-up, and the experience isn't any worse than one could expect from such a task. However, working on the *mainline* Linux kernel:
a) Doesn't get me paid.
b) Isn't "hip" - you don't see kernel developers speaking at media conferences or hanging out with celebrities, like you do with "web 2.0 kids." The no-e-fame aspect is actually appealing to me, but not to many of the people I've met.
c) Involves dealing with a lot of douchebags.
d) Involves wasting my time convincing an old-hands crowd self-assured of their relative place in the development world that my ideas have merit (also see c).
I "develop for the kernel" just fine, but I have no interest or desire in getting my patches to mainline - they benefit few people, Git makes it easy to track trunk while keeping my own code around, and I don't want to waste my time dealing with the douchebaggery and politics involved in reaching the mainline kernel.
This just doesn't make any sense. People who are using a social network are using it because they want to be found - because they want an easy way to keep in touch with a lot of people. Changing to a darknet model completely eliminates all these benefits. The only people who would buy such a device are people who shouldn't be using online social networks anyway (making the import aspect odd).
Not only that, but there are actually two entirely different TI-Nspire models (Nspire and Nspire-CAS) that differ only in software (and cost).
So if it became possible to flash one model's firmware onto the other, TI would both lose money and anger standardized testing organizations (most allow only the Nspire and not the CAS, and rely on the different labelling on the hardware to ensure students are using the approved unit).
DirectX is a forward-looking standard - Microsoft sits down (or stands up and yells) with developers and graphics manufacturers, and hammers out a spec which a "DirectX X.XX" card must support. Then vendors go and make a card and drivers that support those features. In this way every DirectX 10 or DirectX 11 card can be assumed to support the same things using the same APIs, and if they don't, it's the vendor's fault and they have recourse.
OpenGL, at this point, looks back - graphics card manufacturers make a card and then shoehorn its features into OpenGL. The result is that every single card supports a different set of OpenGL features, implemented in different ways.
So sure, "OpenGL" gets some features first via extensions (it's debatable whether or not it's even OpenGL at that point, since the OpenGL standard doesn't even really play into it) - good luck using them, though.
The choice for game developers is pretty easy: support a lot of people (Windows and Xbox 360) using one consistent API, or support a few more people (Linux mostly, with some additional work required for PS3 or Wii) at a huge cost (debugging across vendors, platforms, and consoles).
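The contrast above can be sketched as a toy model. Every version number, vendor, and extension name below is made up for illustration; this is not a real capability database, just a demonstration of spec-first guarantees versus intersecting per-vendor extension lists:

```python
# Toy sketch: spec-first (DirectX-style) vs extension-driven (OpenGL-style)
# capability models. All names below are invented for illustration.

# Spec-first: the version number alone determines the guaranteed feature set.
SPEC_FEATURES = {
    10: {"geometry_shaders", "texture_arrays"},
    11: {"geometry_shaders", "texture_arrays", "tessellation", "compute"},
}

def guaranteed_features(version):
    """Any conforming card of this spec version must support all of these."""
    return SPEC_FEATURES[version]

# Extension-driven: each vendor exposes its own list; the portable subset
# is whatever happens to be in the intersection of the cards you target.
VENDOR_EXTENSIONS = {
    "vendor_a": {"EXT_tessellation", "ARB_compute", "A_special_blend"},
    "vendor_b": {"EXT_tessellation", "B_fast_paths"},
    "vendor_c": {"ARB_compute", "C_texture_tricks"},
}

def portable_extensions(vendors):
    """Features you can rely on across all the listed vendors."""
    return set.intersection(*(VENDOR_EXTENSIONS[v] for v in vendors))

print(guaranteed_features(11))                                 # fixed by spec
print(portable_extensions(["vendor_a", "vendor_b"]))           # {"EXT_tessellation"}
print(portable_extensions(["vendor_a", "vendor_b", "vendor_c"]))  # empty set
```

The point of the sketch: in the spec-first model the feature set grows with the version and is the vendor's problem to meet, while in the extension-driven model the portable feature set shrinks toward empty as you target more vendors.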
If I want a landline, I can go buy any old phone I want, and as long as it speaks the right protocols (which are pretty simple for analog landlines) I can plug it into my wall, and it works.
It took the US government to end enforced landline phone rentals and open up the analog telephone network, in the Carterfone decision (13 F.C.C.2d 420).
With today's moves towards "deregulation" I don't think we'll see the cell industry being forced to do anything similar in the near future.