It is local news, so using the Fahrenheit scale is appropriate. It's not as if people living outside of the US have to dress accordingly.
- higher temperature -> more energy in the system (higher pressure gradients, stronger winds, etc.).
- changing temperature -> upsetting the steady state of the system (ditto).
The biggest problem at the moment is the price - ultimately it will have to come down, because there is no reason why a mass-produced EV has to be more expensive than an ICE car. But for now Tesla is doing it right, by targeting the premium market and selling a car that appeals to early adopters.
Does the supply of precious metals vary? Sure. Can it be ramped up 1000x, 1000000x or 1000000000x as easily as typing this line? Hell no!
Guys, you are of course right that there is no black-and-white choice here. Gold is only an approximation of a commodity with a stable price, and therefore it can be used as money. Binary formats can be specified as clearly as text ones. But then, the *scale* is what sometimes makes all the difference.
The first has to remain compatible with millions of LOCs like:
s.write('GET / HTTP/1.0\n\n')
which in practice prevents anyone (well-intentioned or not) from extending the format in an incompatible way.
The other can be arbitrarily extended by upgrading libhttp.2.so and a handful of the other most commonly used implementations.
The second option is tempting if you want more control over the specification. Which raises the question: why would anyone need or want to change HTTP2 once it is deployed? Or will the IETF turn evil once they get that power in their hands?
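To be fair, the binary framing itself is specified cleanly. A sketch of a parser for the 9-byte HTTP/2 frame header (24-bit length, 8-bit type, 8-bit flags, 31-bit stream id, per RFC 7540) fits in a few lines - and it also shows where the extension point lives: a new frame type is just a new value of one byte, which only the handful of widely deployed libraries needs to learn.

```python
def parse_frame_header(buf: bytes):
    """Parse an HTTP/2 frame header: 24-bit length, 8-bit type,
    8-bit flags, 1 reserved bit + 31-bit stream id (RFC 7540, sec. 4.1)."""
    assert len(buf) >= 9, "frame header is 9 octets"
    length = int.from_bytes(buf[0:3], "big")
    frame_type = buf[3]   # the extension point: unknown types are simply skipped
    flags = buf[4]
    stream_id = int.from_bytes(buf[5:9], "big") & 0x7FFFFFFF  # clear reserved bit
    return length, frame_type, flags, stream_id

# An empty SETTINGS frame (type 0x4) on stream 0:
print(parse_frame_header(b"\x00\x00\x00\x04\x00\x00\x00\x00\x00"))  # (0, 4, 0, 0)
```

The point stands either way: clarity of specification is not the issue, control over the few implementations is.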
I agree with you that it is technically possible to define a text format that is more obfuscated than a binary one. Yet it almost never happens.
Text formats are designed to be open and human-readable (that's why they are *text* formats). This encourages writing multiple implementations of parsers and formatters, often partial or ad-hoc ones (perl one-liners, etc.). At some point critical mass is reached and no one, not even the original author, can fiddle with the format, which effectively prevents any embrace-extend-extinguish efforts.
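For illustration, here is the kind of partial, throwaway parser a text format invites (a hypothetical snippet, not any particular library). Multiply it by thousands of scripts nobody will ever update, and the format is frozen:

```python
# An ad-hoc, partial HTTP/1.x response "parser" - written in a minute,
# good enough for one script, and yet another anchor pinning the format down.
response = "HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n<html>hi</html>"
head, _, body = response.partition("\r\n\r\n")
status_line, *header_lines = head.split("\r\n")
version, status, reason = status_line.split(" ", 2)
headers = dict(line.split(": ", 1) for line in header_lines)
print(status, headers["Content-Type"], body)  # 200 text/html <html>hi</html>
```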
Binary formats generally do not reach this level of standardization. On purpose - people want to control the format and the handful of its implementations. There are some exceptions, like TCP or IPv4 (which, BTW, proves your point that a binary format *can* be open), but they are considered a failure in terms of extensibility precisely because they escaped that control.
Finally, in the case of HTTP there is exactly *zero* benefit from switching to a binary representation. The bandwidth overhead is negligible and has never been smaller, and parsing has to be robust against errors for security reasons, so you cannot save much processing power either. These things were important in the early 1990s (yet we chose to communicate in plain text anyway), but not now. The only place where HTTP could be improved is latency, which scales more slowly than network bandwidth, but that has nothing to do with the format.
"Peg rates", "certificates"? Pegging fiat currencies to precious metals is just like assuring that binary protocols will always be 1:1 convertible into an equivalent text representation. Over time, the ratio becomes 1:2, 1:5, ..., 1:1000. It will, because there are no technical obstacles preventing it, and organizational ones are never effective (drugs, child porn and terrorism will justify just about anything).
And no, 1g of gold will always be worth 1g of gold. It may become diluted in new coins, or replaced with paper certificates. But your existing savings are as safe as you are.
I could play with the Wayland API and help it take off, but not if I have to wait 5+ years for Wayland to get X11's features and drivers.
Are you seriously asking "why would you want the coolest and most technologically advanced car in the world"?
When did you last see a development like this in conventional cars? In the '30s?
Moore's law is all about a positive financial feedback loop:
- better products (capacity, performance, usability) ->
- more money ->
- process development (scaling feature size) ->
- better products.
It worked so well because there was a single variable - process feature size - that translated investment money into more attractive products. That produced 30 years of exponential growth and increased transistor density ~1e+4 times.
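The ~1e+4 figure checks out against the commonly quoted cadence (assuming one density doubling every two years; quoted periods range from 18 to 24 months):

```python
# 30 years at one transistor-density doubling per two years:
doublings = 30 / 2
growth = 2 ** doublings
print(f"{growth:.0f}x")  # 32768x - on the order of 1e+4
```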
Tackling multiple problems at once (design, IO, packaging) doesn't work nearly as well - you need more money for the same result, and most techniques deliver only a temporary boost. Hitting fundamental limits like the sub-threshold slope capping the supply voltage (~90nm) or quantum-tunneling leakage (DRAM at 20nm, flash at 15nm) doesn't help either.
Development continues, only more slowly. This is visible in performance-sensitive applications (PCs), which have been improving only incrementally for 10 years now - they are getting better, but people are no longer tempted to replace their devices every 2-3 years.
It is depressing to see how many Americans are falling into this trap. If there is anyone who should know better, it is you.
A healthy economy should encourage:
- working - this is what brings value to the economy - an economy is only worth as much as the goods and services it produces,
- saving - this is the only thing that provides money for investment and social security.
By extension, it should discourage:
- consumption - it depletes the pool of goods and services, and savings,
- debt - it depletes savings and encourages excessive consumption.
When you organize your country around these rules, you get a period of constant growth in purchasing power, and deflation. If you flip the rules upside down, you get a constant drop in purchasing power, and inflation (bubbles are inflation concentrated in a small fragment of the market). It is really that simple. The difficult part is moving from a heavily indebted economy to one based on healthy principles. So far no one has managed to do it without going bust or, worse, killing millions of people.
Why would researchers publish their code? They have only one target - getting their *papers* published in reputable venues. More often than not such venues are closed and paywalled, so it is not surprising that they do not require opening up bits of the research (in fact, to say the least, they discourage it).
Some researchers would be happy to publish their code anyway (as a matter of principle, or to promote themselves through non-academic channels), but at best they would be frowned upon by their superiors for misallocating resources. At worst, they would be accused of undermining team efforts (by disclosing too much information, or exposing inconvenient assumptions to competing researchers) or of risking legal conflicts with publishers (copyright).
As mentioned earlier, code written as part of research is often poor. This is caused by the same underlying mechanism - getting as many papers published with as little work as possible. It is not (only) about procrastination: any effort put into making the code better is better spent working on the next project.
As usual, "you get what you test for". In the case of publicly funded academic projects, this means "plenty of good-enough papers and nothing more".