The original pngquant was quite bad indeed (it did support alpha, but poorly). It made a lot of quality trade-offs for MS-DOS-era machines.
The modern rewrite is much better in terms of quality and it's especially tuned for good alpha channel support.
I'm working on it -- or rather squeezing every last drop of the existing format.
With good PNG8+alpha quantization you can get compression in the same league as WebP (although WebP is still better) and basically 100% browser support (it degrades gracefully in IE6).
Cut out the middlemen and give it directly to recording industry lobbyists.
Of course replacing Adobe lock-in with Apple lock-in would be dumb. HTML has 100% market share and CSS+JS are still ahead of Flash.
Jobs didn't block Flash on desktop, where Flash has high market share, so why quote that statistic? He refused to support it on mobile, where currently iOS has much higher market share than Flash.
> Because the bank is (presumably) chartered in the country you live in and heavily regulated, and you have recourse if they screw something up.
Indeed, Opera won't get a trillion-dollar bonus if they screw something up.
It could be accepted.
Apple has already accepted a number of WebKit-based browsers, so browsers in general aren't forbidden.
And for iPhone users, especially on EDGE, there is a very good reason to use Opera Mini: it's going to be faster. iPhones before the 3GS are also very low on RAM, and Safari caches only in RAM. Presumably Opera Mini would be able to keep many more tabs open and fully cached.
> It is patented, and in exactly the same way as h264 will form a toll booth on the internet
All known Theora patents carry a royalty-free license. The only thing that is "exactly the same way" here is the risk of submarine patents.
Remember that Opera proposed the video element in the first place, and they chose Theora from the start. They're not fond of patents, and may not want to choose H.264, especially if Mozilla doesn't.
> But how exactly do you go about doing unit tests of the front end of a web application?
Unit testing in PHP isn't different from unit testing in other languages. You split the code into testable chunks and hammer them with PHPUnit.
How is that related to C++/PHP? And would you just run Facebook without unit tests? (good luck!)
Anyway, for JS there's Selenium and it can integrate with PHPUnit. UI testing is difficult, and browser-hosted tests are especially fragile and finicky, but that's not PHP's fault and C++ won't fix it.
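Since the point is that the pattern is language-agnostic, here is the same xUnit shape sketched in Python's unittest (PHPUnit follows the same structure of test classes and assertion methods); the `slugify` helper is made up for illustration, not anything from the thread:

```python
# A hypothetical "slugify" helper and a unit test for it, using Python's
# unittest. The xUnit pattern (test class, assertion methods) is the same
# one PHPUnit uses.
import re
import unittest

def slugify(title):
    """Turn a title into a URL slug (example function, not from the thread)."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

class SlugifyTest(unittest.TestCase):
    def test_spaces_become_dashes(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_punctuation_is_dropped(self):
        self.assertEqual(slugify("Unit testing!"), "unit-testing")

if __name__ == "__main__":
    unittest.main()
```

The "testable chunk" here is a pure function with no interpreter or browser state, which is exactly what makes it cheap to hammer with assertions.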
> If there is so much PHP out there, why wouldn't/couldn't there be an efficient compiler
There are PHC and Roadsend.
However there are PHP-specific problems that make it harder than it should be.
PHP's "standard library" is heavily dependent on the interpreter, so you either lug the interpreter around and maintain its state, or rewrite 5000 functions.
And of course there's eval(), extract(), dynamic include/autoload, and other magic that makes static analysis pretty hard or impossible.
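A rough sketch of why that kind of dynamism defeats static analysis, using Python's globals() as a stand-in for PHP's extract() (the `inject` helper is hypothetical, chosen only to make the problem visible):

```python
# PHP's extract() copies array keys into local variables at runtime.
# Python's closest analogue is mutating globals(). Either way, a compiler
# or static analyzer cannot know which names exist until the code runs.
def inject(values):
    # Names created here are invisible to any ahead-of-time analysis pass;
    # a linter would flag every later use of them as "undefined".
    globals().update(values)

inject({"greeting": "hello", "count": 3})

# Works at runtime, but no static tool could prove these names exist.
print(greeting, count)
```

This is why an ahead-of-time PHP compiler either has to bail out to interpretation whenever it sees such constructs, or forbid them outright.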
Developers who are diligent enough to make only one memory-related bug per year can certainly spell variable names correctly.
If you have a statically typed language, you rely on types. If you have a dynamic one, you rely on unit tests. Both are probably equally slow.
This is not a problem in PHP.
If you have an editor with auto-completion, misspellings are uncommon. Other kinds of errors are caught (and logged) as soon as the faulty code executes. With a short edit-run cycle, that's pretty quick.
It's not perfect, but OTOH a C++ compiler won't tell you about all the memory leaks, dangling pointers and buffer overflows at compile time either.
C++ has runtime errors too, and you won't easily get logs with line numbers showing where your memory corruption happened.
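A minimal sketch of that edit-run cycle, in Python for illustration (the `total_price` function and its deliberate misspelling are made up): the bad name fails loudly, with an error type and an exact line number ready for a log, the first time the faulty path executes.

```python
# A misspelled variable in a dynamic language fails immediately at runtime,
# with a traceback pointing at the exact line, unlike silent memory
# corruption in C++.
import traceback

def total_price(items):
    total = 0
    for item in items:
        total += item["price"]
    return totl  # deliberate misspelling of "total"

try:
    total_price([{"price": 5}])
except NameError as exc:
    # The exception carries the offending name...
    print(f"caught: {exc}")
    # ...and the traceback carries the file and line number for the log.
    traceback.print_exc()
```

That runtime-plus-logging loop is the "caught as soon as the faulty code is executed" workflow the comment describes.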
Technological progress has merely provided us with more efficient means for going backwards. -- Aldous Huxley