
Comment Re:2 Words (Score 1) 810

Good point: pure EVs are currently unsuitable for apartment dwellers or people commuting 100 km to work. But no one is talking about market dominance. If only 5% of people decided to buy an EV, that would already be a major success. Infrastructure and technology would follow, and several years down the line we would find more of us falling into the "can buy" category.

The biggest problem at the moment is the price - ultimately it will have to go down, because there is no reason why a mass-produced EV has to be more expensive than an ICE car. But for now, Tesla is doing it right by targeting the premium market and selling a car that appeals to early adopters.

Comment Re:For non-Americans (Score 1) 174

The sources say it is 500 km/h. That's the problem with unit conversion: on one hand there is always a loss of accuracy (500 km/h ≈ 310 mph), on the other it paints a false picture. "310" (two significant digits) sounds like a maximum speed they happened to achieve; "500" tells you they have reached a milestone or a design target. Bloomberg did it right: they provided the original numbers with translations in parentheses for casual readers. Slashdot, which supposedly serves a geek and technical audience, has done no better than a popular magazine.
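For illustration, a quick Python sketch of the conversion and the rounding that produced the headline number (the factor of 1.609344 km per mile is exact by definition):

    KM_PER_MILE = 1.609344                # exact by definition

    speed_kmh = 500                       # the original, round design-target figure
    speed_mph = speed_kmh / KM_PER_MILE   # 310.686... mph
    print(f"{speed_kmh} km/h = {speed_mph:.1f} mph")
    print(f"rounded for the headline: {round(speed_mph, -1):.0f} mph")  # 310

And the round figure survives in one direction only: converting "310 mph" back gives 498.9 km/h, and the design target is gone.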

Comment Re:Hyper TEXT (Score 1) 566

Does the supply of precious metals vary? Sure. Can it be ramped up 1000x, 1000000x or 1000000000x as easily as typing this line? Hell no!

Guys, you are of course right that there is no black-and-white choice here. Gold is only an approximation of a commodity with a stable price, and that is why it can be used as money. Binary formats can be specified as clearly as text ones. But then, the *scale* is what sometimes makes all the difference.

Comment Re:Hyper TEXT (Score 1) 566

The first has to remain compatible with millions of LOCs like:
s.write('GET / HTTP/1.0\n\n')
which in practice prevents anyone (well-intentioned or not) from extending the format in an incompatible way.
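For context, a minimal sketch (in Python; the hostname is just a placeholder) of the kind of ad-hoc client code that is deployed everywhere:

    import socket

    # Open a TCP connection and speak raw HTTP/1.0 by hand - no library involved.
    s = socket.create_connection(("example.com", 80))
    s.sendall(b"GET / HTTP/1.0\r\n\r\n")  # plenty of deployed code even skips the \r
    print(s.recv(4096).decode("latin-1"))
    s.close()

Every such snippet is one more stake pinning the format down.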

The other one can be arbitrarily extended by upgrading libhttp.2.so and a handful of the most commonly used implementations.

The second option is tempting if you want more control over the specification. Which raises the question: why would anyone need or want to change HTTP/2 once it is deployed? Or, will the IETF turn evil once they get power into their hands?

Comment Re:Hyper TEXT (Score 2) 566

I agree with you that it is technically possible to define a text format that is more obfuscated than a binary one. Yet it almost never happens.

Text formats are designed to be open and human-readable (that's why they are *text* formats). This encourages writing multiple implementations of parsers and formatters, often partial or ad-hoc ones (perl one-liners etc.). At some point critical mass is reached and no one, not even the original author, can fiddle with the format any more, effectively preventing any embrace-extend-extinguish efforts.
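A sketch of such an ad-hoc, partial parser (Python here instead of perl; deliberately minimal, with no error handling):

    # Good enough for one job, wrong in a dozen edge cases - and one more
    # independent implementation the format now has to stay compatible with.
    def parse_response_head(raw: bytes):
        head, _, body = raw.partition(b"\r\n\r\n")
        lines = head.decode("latin-1").split("\r\n")
        status_line = lines[0]  # e.g. "HTTP/1.0 200 OK"
        headers = {}
        for line in lines[1:]:
            name, _, value = line.partition(":")
            headers[name.strip().lower()] = value.strip()
        return status_line, headers, body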

Binary formats generally do not reach this level of standardization - on purpose: people want to control the format and the handful of its implementations. There are some exceptions, like TCP or IPv4 (which BTW proves your point that a binary format *can* be open), but they are considered a failure in terms of extensibility precisely because they escaped that control.

Finally, in the case of HTTP there is exactly *zero* benefit from switching to a binary representation. The bandwidth overhead of the text format is negligible and, relative to available bandwidth, has never been smaller; parsing has to be robust against errors for security reasons, so you cannot save much processing power either. These things were important in the early 1990s (yet we chose to communicate in plain text anyway), but not now. The only place where HTTP could be improved is latency, as it scales more slowly than network bandwidth, but that has nothing to do with the format.

Comment Re:Hyper TEXT (Score 1) 566

"Peg rates", "certificates"? Pegging fiat currencies to precious metals is just like assuring that binary protocols will always be 1:1 convertible into equivalent text representations. Over the time, the ratio becomes 1:2, 1:5,..., 1:1000. It will, because there are no technical obstacles preventing that, and organizational ones are never effective (drugs, child porn and terrorism will justify about anything).

And no, 1g of gold will always be worth 1g of gold. It may become diluted in new coins, or replaced with paper certificates. But your existing savings are as safe as you are.

Comment Re:Hyper TEXT (Score 0) 566

Text protocols are a guarantee of openness, just like precious metals are a guarantee of sound money. In theory there is nothing wrong with binary protocols or fiat currencies; they may even perform better or be more convenient. But as we all know by now, promises like "we will keep this protocol open" or "we will never print money out of thin air" are *always* broken.

Comment Re:Journalist Wanted Moore Hits (Score 1) 147

Moore's law is all about a positive financial feedback loop:
- better products (capacity, performance, usability) ->
- more money ->
- process development (scaling feature size) ->
- better products.

It worked so well because there was a single variable - process feature size - that translated investment money into more attractive products. That produced 30 years of exponential growth and increased transistor density ~1e+4 times.
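A quick sanity check on those two numbers (back-of-the-envelope, assuming a constant doubling period):

    import math

    years = 30
    density_gain = 1e4  # ~10^4x transistor density over that period
    doubling_period = years / math.log2(density_gain)
    print(f"implied doubling period: {doubling_period:.1f} years")  # ~2.3 years

which lines up nicely with the classic "doubling every two years" formulation.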

Tackling multiple problems at once (design, IO, packaging) doesn't work nearly as well: you need more money for the same result, and most techniques deliver only a temporary boost. Hitting fundamental limits, like the sub-threshold slope capping the supply voltage (~90nm) or quantum tunneling leakage (DRAM at 20nm and flash at 15nm), doesn't help either.

Development continues, only more slowly. This is visible in performance-sensitive applications (PCs), which have been improving only incrementally for 10 years now - they are getting better, but people are no longer tempted to replace their devices every 2-3 years.

Comment Re:Duh (Score 1) 339

Are you in the US? Then perhaps you realize that the greatest period of growth in the US coincided with constant deflation (the second half of the 19th century)? That was when the nation was running consistent surpluses (at both the individual and the state level), there was plenty of money for investment (real money, coming from savings, not from debt), many social issues were non-existent (drug user? - die in hell; immigrant? - earn your wages), and the standard of living greatly improved. All this without a single dollar collected through income tax.
It is depressing to see how many Americans are falling into this trap. If there is anyone who should know better, it is you.

----------

A healthy economy should encourage:
- working - this is what brings value to the economy; the economy is only worth as much as the goods and services it produces,
- saving - this is the only thing that provides money for investment and social security.

By extension, it should discourage:
- consumption - it depletes the pool of goods and services, and savings,
- debt - it depletes savings and encourages excessive consumption.

When you organize your country around these rules, you get a period of constant growth in purchasing power and deflation. If you flip the rules upside down, you get a constant drop in purchasing power and inflation (bubbles are inflation concentrated in a small fragment of the market). It is really that simple. The difficult part is moving from a heavily indebted economy to one based on healthy principles. So far no one has managed to do it without going bust or, worse, killing millions of people.

Comment Why? (Score 1) 84

Why would researchers publish their code? They have only one target: getting their *papers* published in reputable venues. More often than not, such venues are closed and paywalled, so it is not surprising that they do not enforce opening up the bits of research (in fact, they discourage it, to say the least).

Some researchers would be happy to publish their code anyway (as a matter of principle, or to promote themselves through non-academic channels), but at best they would be frowned upon by their superiors for misallocating their resources. At worst, they would be accused of undermining team efforts (by disclosing too much information or exposing inconvenient assumptions to competing researchers) or of risking legal conflicts with publishers (copyright).

As mentioned earlier, code written as part of research is often poor. This is caused by the same underlying mechanism: getting as many papers published with as little work as possible. It is not (only) about procrastination - the effort put into making the code better is better spent working on another project.

As usual, "you get what you test for". In case of publicly funded academic projects this means "plenty of good enough papers and nothing more".

Comment Consonance (Score 3, Informative) 183

It's perhaps not obvious, but there is no such thing as perfect consonance in music:

- Tone C3 is an exact second harmonic of C2 and a fourth harmonic of C1. That's why they sound so nice together.

- Tone G2 is a third harmonic of C1, but (surprise) not an exact one. That's because if you stack 12 third harmonics (C G D A E B F# C# G# D# A# F C') you are supposed to arrive back at the same tone. But you don't; there is a slight frequency offset (the Pythagorean comma). In practice, this offset is distributed evenly among all 12 intervals, so we are generally unable to notice it.

- The fifth harmonic tone (C1 -> E3') is also inexact. It is fairly close to the tone (here E) obtained from the scale above, but again there is a slight frequency offset.

- The sixth harmonic (C1 -> G3) is 2*3 times the fundamental frequency, so it is as (in)exact as the third harmonic.

- The seventh harmonic (C1 -> ~A#3, noticeably lower) is not on the (twelve-tone) scale, but it still sounds nice.

- The eighth harmonic is exact (2*2*2, C1 -> C4). And so on...

The twelve-tone scale is a rather clever invention: it manages to approximate a fairly large number of harmonics with a small number of tones. But it is still only an approximation - perfect consonance can only be obtained for octaves.
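A small Python sketch puts numbers on these offsets, in cents (1/100 of an equal-temperament semitone); the interval-to-semitone mapping is the standard one:

    import math

    def cents(ratio):
        return 1200 * math.log2(ratio)  # 1200 cents per octave

    # Pure (harmonic) intervals vs. their nearest twelve-tone
    # equal-temperament approximations (2 ** (semitones / 12)).
    intervals = [
        ("3rd harmonic (fifth, C->G)",       3 / 2, 7),
        ("5th harmonic (major third, C->E)", 5 / 4, 4),
        ("7th harmonic (C->~A#)",            7 / 4, 10),
        ("8th harmonic (octave, C->C')",     2 / 1, 12),
    ]
    for name, ratio, semitones in intervals:
        deviation = cents(ratio) - 100 * semitones
        print(f"{name}: {deviation:+.1f} cents off")

    # Stacking 12 pure fifths overshoots 7 octaves by the Pythagorean comma:
    print(f"12 fifths vs 7 octaves: {cents((3 / 2) ** 12 / 2 ** 7):+.1f} cents")

The fifth is off by only ~2 cents (inaudible), the major third by ~14, the seventh harmonic by a very audible ~31 (hence "noticeably lower"), and the comma comes out to ~23.5 cents.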

Comment Re:Why? (Score 2) 118

Nope, strong inductive coupling is efficient and reliable. But it only works at a short distance and requires the coils to be aligned. That's perfect for recharging buses at bus stops, but too cumbersome for other uses.

Resonant coupling can still be efficient at larger distances, but: it stores a huge amount of energy in the LC tanks (currents and voltages roughly 100 times larger than in non-resonant coupling), produces a proportionally stronger magnetic field, is very sensitive to losses in the environment (puddles, metal objects), and is not suitable for high-power recharging (think trickle charging on the order of 100 W).
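To put a rough number on the "100 times larger" claim: in a resonant tank, the circulating reactive power is about Q times the real power delivered. A sketch with assumed figures (Q, power and frequency are illustrative, not from any specific system):

    import math

    Q = 100         # assumed loaded quality factor of the LC tank
    P_real = 100.0  # watts actually delivered to the load
    f = 85e3        # Hz, an assumed operating frequency

    P_circulating = Q * P_real                    # reactive volt-amperes in the tank
    E_stored = P_circulating / (2 * math.pi * f)  # energy held in the tank, joules

    print(f"circulating power: {P_circulating / 1000:.0f} kVA")  # 10 kVA for 100 W out
    print(f"stored energy: {E_stored * 1000:.1f} mJ")

Ten kilovolt-amperes sloshing around to deliver a hundred watts is exactly why the field is strong and the losses are so sensitive to the surroundings.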

So, we are talking about an expensive, difficult-to-install, weak, inefficient, unreliable, interfering and dangerous solution to a very pressing problem: sticking a plug into a socket. I can see (inductive) wireless charging being deployed for fixed-route buses, but other than that this technology is only a distraction from solving truly important problems (batteries, specialized range-extending ICEs etc.).
