Having the correct tools does not magically make broken things unbroken, but it helps you repair them.
Contrary to whatever silly fantasy world you live in, 99.9999% of the population DOES NOT GIVE A SHIT ABOUT DISASSEMBLING THEIR PHONE. They just use the damn thing.
They use it until they crack the screen; then they want a non-broken phone.
But these serve different purposes. The kind of app you'd use Sinatra for is the kind of app Rails would be worse at, and vice versa. Sinatra is more in the same space as Camping, and I don't know if anyone still uses Camping.
Sure, but hasn't the Python community kinda embraced Django as the One True fullstack framework?
Excuse me... Free has little to do with price. And in the case of the GPL and LGPL, the price is as follows: "If you use the software, you have to provide the source and any modifications you might have made to the people you sold/gave the software to." That's the price. Whether you view that as worth nothing or priceless depends on whether you're just a user or a developer.
Well, to be pedantic, that is the *cost* of the (L)GPL. The price is 0.
I am not an economist.
How exactly do you put something into the public domain legally, such that you can legally ensure it stays in the public domain?
That's OK. Maybe some day Slashcode will actually render <comic book guy> and </comic book guy> tags. About the time they decide to implement more than 2% of the HTML entity set.
Of course, by that time, everyone else will have been using Markdown (or similar) for 10 years.
My statement is still valid. You hand someone your card to pay for gas, and they can duplicate its magnetic stripe very easily just by swiping it through a reader.
You go inside to pay for gas? I just use the cardswipe/pinpad on the gas pump, which I thought was pretty standard practice these days.
So in summary they haven't been getting similar improvements in speed.
Perhaps that's because they didn't suck so much in the first place (with the possible exception of MRI)?
Because everyone chooses a language based on raw execution speed, amirite?
You were comparing relative increases between different implementations of one language before; now you're comparing the fastest implementation of one language to the not-fastest implementations of others. =_=
Even if they look vaguely similar on the surface, all of the languages you've mentioned are quite different internally. Some performance improvements would take too much effort to implement, some won't happen for philosophical reasons, and a great many more are not applicable to anything other than the language they were written for.
> There's "fat-val", "tracer JIT" and "method JIT". Just curious, given all these advances in JS speed, are there technical reasons why stuff like Python, Ruby and Perl aren't getting similar improvements in speed?
Perl... well, it's either dead or incredibly alive, depending on who you ask, but all development seems to be focused on Perl6.
Ruby doesn't have an "official" interpreter. The standard C implementation uses YARV for 1.9, which is considerably faster than MRI (used in 1.8). Rubinius is supposed to be faster still, but isn't quite ready yet. Ruby Enterprise Edition seems to be fairly popular, but doesn't have 1.9 compatibility yet. I believe JRuby is fairly widely used, and it seems that it can perform better than YARV sometimes (and consistently better than MRI).
Nowhere. But right now it's the most widely adopted and implemented (pretty much everyone but Firefox either does or is planning to support it).
And pretty much everyone except Apple is planning on supporting WebM (or is, currently). Sure, IE will only support it if a vp8 codec is installed on the machine, but that counts enough for me.
"Our vision is to speed up time, eventually eliminating it." -- Alex Schure