No, it's not bullshit. Even if Webkit were actively attempting to defeat compatibility, they can't exactly just change how -webkit-border-radius or other such extensions work without breaking existing markup, which would hurt themselves more than it would hurt their competitors (rendering would break in their engine while it remained compatible in competing products).
Sure, maybe a toolkit-specific extension would have to be reverse engineered before you could provide a proper compatibility layer, but that's not the case with Webkit. Not only does Webkit submit official recommendations with extremely detailed descriptions of their extensions, but even if they didn't, it's open source; worst case, you could read the implementation yourself.
Which is exactly what Firefox and other browsers did. The problem was that IE's broken rendering was a failure to adhere to standards (it rendered standards-based markup differently from the standard), while Microsoft's current complaint is that other browsers implemented standards deviations in exactly the manner the standards provide for, but which apparently MS can't be arsed to provide a compatibility layer for.
And nothing says that Microsoft can't translate -webkit- prefixed properties in a compatible manner. Just because it's -webkit-something doesn't mean only Webkit is allowed to implement it, but rather that any implementation should be compatible with Webkit's.
A lot of these -webkit- prefixes exist because these are CSS 2 or CSS 3 properties that predated finalization of those standards, and most of them are largely compatible with the final standard once the prefix is removed. Webkit was complying with standards by adding features that were not yet finalized and prefixing them so there would be no conflict with the final standard. MS is essentially upset that Webkit's presence is sufficiently strong that developers, for the first time in many years, don't feel the need to test against Microsoft's platform.
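A rough sketch of what "translating -webkit- prefixes compatibly" could look like inside an engine: alias the prefixed property to the engine's own implementation of the now-standardized feature. The property names below are real CSS, but the alias table and function are purely illustrative, not any browser's actual code.

```python
# Illustrative only: map vendor-prefixed CSS properties to their
# unprefixed, standardized equivalents before handing the declaration
# to the engine's style system.
ALIASES = {
    "-webkit-border-radius": "border-radius",
    "-webkit-box-shadow": "box-shadow",
    "-webkit-transition": "transition",
}

def normalize_declaration(prop: str, value: str) -> tuple:
    """Return (property, value) with any known prefix aliased away."""
    canonical = prop.strip().lower()
    return (ALIASES.get(canonical, canonical), value)
```

Unknown properties pass through untouched, so nothing breaks for markup that never used a prefix.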
Depending on what the exercises are, they might not leave much room for slight differences. "Construct a rounded-rect button with 12px corner radius, a vertical gradient from #RRGGBB to #RRGGBB, and a 15% drop-shadow with a 7px radius, offset by 14px at 120."
A better solution is to digitally watermark the solution files, or, if the students have access to the solution files independently (e.g., they came with the book on a CD), watermark the source files and require students to start with those rather than the ones from the CD. Honestly, the book ought to have shipped with watermarked solution files already.
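One way a per-student watermark could work, sketched below: derive a short fingerprint from the student's identifier and encode it as trailing whitespace on successive lines. Everything here is hypothetical and only illustrates the idea; a real scheme would need to survive reformatting.

```python
import hashlib

def fingerprint(student_id: str) -> str:
    """Derive a 16-bit string from the student's identifier."""
    digest = hashlib.sha256(student_id.encode()).hexdigest()
    return bin(int(digest[:4], 16))[2:].zfill(16)

def watermark(source: str, student_id: str) -> str:
    """Encode each fingerprint bit as a trailing space (1) or nothing (0)."""
    bits = fingerprint(student_id)
    lines = source.split("\n")
    return "\n".join(
        line + (" " if bits[i % len(bits)] == "1" else "")
        for i, line in enumerate(lines)
    )

def extract(source: str) -> str:
    """Recover the embedded bit pattern from a submitted file."""
    lines = source.split("\n")[:16]
    return "".join("1" if ln.endswith(" ") else "0" for ln in lines)
```

A submission that matches another student's fingerprint rather than the submitter's own is a red flag, not proof; trailing whitespace is trivially destroyed by an editor that strips it.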
Alternatively, alter the exercise enough that the students can't depend on the provided solution. On the whole this is no different from any other course where you're given problems to solve along with the solutions. You're going to be depending, by and large, on the students' honesty. If you don't feel you can do so (and that's probably safest), alter the parameters so the provided solution and the correct solution do not agree.
Do you mean the TARP program? The same one that earned the government $19.6 billion more than they put into it? Seems like a good investment to me, it both rescued the economy and turned a profit.
They are a useful tool, but not a replacement for human QA, since tests inherently depend on a programmer's guess about what might go wrong.
All testing depends on what a human guesses might go wrong, automated or otherwise. The advantage of automated testing is that you can perform those tests thousands of times faster than a human can.
It depends on the nature of your application, and how strong your test suites are, and how damaging it is to your customer or company if your product fails.
If your application has very strong and clear determinism along with few or no cross-interactions, then test cases can be fairly easy to write and can also (importantly) provide a comprehensive quality analysis. With good test cases, you can run a full regression test multiple times per day. With sufficient confidence in your automated tests, some companies even deploy to production automatically upon successful test execution; this is called continuous deployment.
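The kind of test this is talking about can be very small; the value comes from running thousands of them on every change. A minimal sketch, where `parse_price` is a stand-in for real production code and the "bug" being guarded against is hypothetical:

```python
def parse_price(text: str) -> float:
    """Parse a display price like '$1,234.56' into a float."""
    return float(text.replace("$", "").replace(",", ""))

def test_plain_price():
    assert parse_price("$19.99") == 19.99

def test_thousands_separator():
    # Regression check for a (hypothetical) bug where commas broke parsing.
    assert parse_price("$1,234.56") == 1234.56

def run_suite():
    # A real project would use a test runner; this loop makes the point
    # that the whole suite costs microseconds, not a QA engineer's day.
    for test in (test_plain_price, test_thousands_separator):
        test()
    return "all tests passed"
```

Once a bug is found and fixed, it gets a test like `test_thousands_separator` so it can never silently return.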
Not too many big shops can realistically provide sufficient test coverage for continuous deployment to be reliable, but small shops can get away with it much more easily. In part this is because their product is probably simpler and easier to test in an automated manner, plus the engineering staff probably understands every aspect of it extremely well (also a side effect of the simpler product).
Finally, you can do continuous deployment if your product is a version-controlled API. Your customers bind and test against a specific version, and the features of that version are locked; only bug fixes are deployed to it. New versions can safely be deployed because customers only use a newer version if they go out of their way to do so, and they will presumably provide test coverage of their own when they increment API versions.
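The version-pinning idea can be sketched in a few lines. The routing scheme and names below are illustrative, not any real framework: the point is that because every client names a version explicitly, deploying v2 cannot change behavior for a client still bound to v1.

```python
from typing import Callable, Dict

# version -> endpoint name -> handler
_handlers: Dict[str, Dict[str, Callable[..., object]]] = {}

def route(version: str, name: str):
    """Register a handler under an explicit API version."""
    def register(fn):
        _handlers.setdefault(version, {})[name] = fn
        return fn
    return register

def dispatch(version: str, name: str, *args):
    """Clients always name the version they were built against."""
    return _handlers[version][name](*args)

@route("v1", "greet")
def greet_v1(who: str) -> str:
    return "Hello, %s" % who

@route("v2", "greet")
def greet_v2(who: str) -> str:
    # Behavior changes ship only under the new version.
    return "Hello, %s!" % who.title()
```

Deploying `greet_v2` continuously is safe precisely because `dispatch("v1", ...)` still hits the frozen v1 handler.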
if we pushed Adobe to open the format
This is a fairly common criticism of Flash, and it's an invalid one. Flash is an open format; you can download the specification from Adobe's website. There are even open source players available: see Gnash, Swfdec, and Lightspark. Unfortunately none of them is feature-complete, and most lack some major features.
What Flash is not is an open standard: only Adobe gets to advance it, and I don't believe the licensing allows for a fork of their standard. They'll tell you how to interoperate, but only they get to guide the technology and decide what to include.
But then, no sane coder uses a light background for coding, right?
Actually a light background is (somewhat counter-intuitively) easier on the eyes, especially in dim lighting. The reason comes down to the optical properties of your eyes, which we can describe in camera terms. A narrow aperture creates a broad depth of field, while a wide aperture creates a very shallow depth of field. Bright scenery calls for a narrow aperture and thus a broad depth of field, while dark scenery (certainly moodier) requires a wide-open aperture and thus a very shallow depth of field.
That means the brighter things are, the smaller your pupil is, and the more you can move your head without your screen going out of focus. Very dark setups with dark code (the stereotypical coding setup; it certainly looks cool) actually lead to more eye strain than a bright working environment with a white background behind your code. Eye strain is caused by constantly shifting focus, and that is alleviated by bright environs and bright code. Dark setups can allow only a few millimeters of movement before your eyes have to refocus; bright setups can give you several centimeters.
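Putting rough numbers on it with the standard thin-lens depth-of-field approximation. All the figures here are ballpark assumptions (a ~17 mm eye focal length, a screen 60 cm away, pupil diameters of roughly 2 mm in bright light versus 7 mm in the dark), not measurements:

```python
def depth_of_field_mm(distance_mm, pupil_mm, focal_mm=17.0, blur_circle_mm=0.01):
    """Approximate total DOF = 2 * c * N * u^2 / f^2 (thin-lens model).

    N is the f-number focal_mm / pupil_mm, u the subject distance,
    c the acceptable circle of confusion on the retina.
    """
    n = focal_mm / pupil_mm
    return 2 * blur_circle_mm * n * distance_mm ** 2 / focal_mm ** 2

bright = depth_of_field_mm(600, pupil_mm=2.0)  # screen ~60 cm away, bright room
dim = depth_of_field_mm(600, pupil_mm=7.0)     # same screen, dark room
```

Since DOF scales with the f-number and the f-number scales inversely with pupil diameter, the constricted 2 mm pupil gives exactly 7/2 = 3.5 times the depth of field of the 7 mm pupil in this model, matching the millimeters-versus-centimeters contrast above.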
[Neon] is commercially extracted by the fractional distillation of liquid air. It is considerably more expensive than helium, since air is its only source.
And that's at more than 3.5x the concentration too.
Never mind that the boiling point of Ne is 27 K, while the boiling point of He is 4.2 K (roughly one sixth the absolute temperature). That makes fractional distillation dramatically more efficient for neon than for helium even if they were at similar concentrations. Fractional distillation is also the process used to separate radiogenic He from natural gas reserves, but with the much higher concentrations there, the yield per unit of energy invested is much higher. Fractional distillation of He from the atmosphere is expensive.
It's the patent office that has absolutely fucked up for awarding lame patents such as that rectangle with rounded corners patent to Apple.
There's a difference between being granted a bad patent and litigating that bad patent. Apple doesn't get a free pass on the lawsuits just because the patent office granted the patents.
They should be made to cough up at least 3 times the total revenue they've received that is linked to the clock design they have so blatantly stolen.
So it's ok to steal as long as you don't directly make any money off the theft?
What's the per-device profit margin, and how many apps are shipped on the default firmware? Your recommended damages are probably a lot higher than the Swiss Railway would seek in terms of licensing costs.
Your article states that twice an Exxon Valdez's worth of oil seeps into the Gulf naturally each year. Their methodology is pretty suspect: measuring the thickness of naturally occurring oil on the surface, extrapolating the expected bacterial consumption rate and natural churn rate, and multiplying by the surface area of the Gulf. But I'll accept their figures for the sake of argument. So that's 84,000 m^3 per year. Deepwater Horizon was 780,000 m^3: 9.3 times the annual natural seepage, or 18.6 times a single Exxon Valdez.
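For anyone checking the arithmetic (figures as quoted above; the Exxon Valdez volume is inferred from the article's "twice an Exxon Valdez per year" claim):

```python
natural_seep_per_year = 84_000               # m^3, per the article
exxon_valdez = natural_seep_per_year / 2     # ~42,000 m^3 per spill
deepwater_horizon = 780_000                  # m^3

vs_valdez = deepwater_horizon / exxon_valdez          # ~18.6x one Valdez
vs_annual_seep = deepwater_horizon / natural_seep_per_year  # ~9.3x a year's seepage
```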
You're saying that releasing 18 times an Exxon Valdez over the course of only a few months, in a single location about 40 miles from the coast, probably doesn't have much if any measurable ecological impact? Maybe Exxon Valdez was no big deal either; I mean, that's the Pacific Ocean, and I'm sure there are hundreds of times that much oil seeping naturally into the ocean, right?
The money quote from that article regarding whether there is a corresponding explosion of population of life that feeds on this bacteria:
In late 2012 local fishermen report that crab, shrimp, and oyster fishing operations have not yet recovered from the oil spill and many fear that the Gulf seafood industry will never recover. One Mississippi shrimper who was interviewed said he used to get 8,000 pounds of shrimp in four days, but this year he got only 800 pounds a week. Mississippi's oyster reefs have been closed since the spill started. A Louisiana fisherman said the local oyster industry might do 35 per cent this year, "If we're very lucky." Dr Ed Cake, a biological oceanographer and a marine and oyster biologist, said that many of the Gulf fisheries have collapsed and "If it takes too long for them to come back, the fishing industry won't survive".
So... no. If I had to speculate, the bacteria are most effective in high concentrations of dispersant; that dispersant is likely detrimental to higher lifeforms, so it's probably a smorgasbord of poisoned food. A shrimper who pulls in around 6% of his pre-disaster haul sounds like a completely devastated ecology. Also from the above article: they used dispersants right as tuna were spawning, and it takes a tuna 5-15 years to mature, so the effects of that might not hit the tuna fishing industry for 3 more years.
Disclaimer: "These opinions are my own, though for a small fee they can be yours too." -- Dave Haynie