You're still alive, old man?!
Well, if it means we're going from small devices with small apps and small amounts of resources to suddenly making them full-on desktop machines, I just don't see the point.
And that's totally fine. The point isn't what YOU want; it's what some private company wants to do, and these actions will in no way, shape, or form negatively impact your life, so getting all up in a huff about it is a little over the top.
What percentage of Android owners even remotely want any of this?
Users don't know what they want until it's provided to them. Honestly, if you don't want any part of it, that's cool, but perhaps it will really help developers port their work cross-platform and bring us to a completely different level.
I would love to see Android or iOS apps come back across the divide in some cases, so there's likely a market in reverse.
No sense in getting all fired up about CodeWeavers doing this.
I agree with you, just not with your example. Pharmacy techs are trained on the job in a few days and get paid just north of minimum wage. The technical skills required for that job aren't complex, and those leading the area should have to do the same on-the-job training as the staff. Comparing that world to most IT specializations is a HUGE leap.
If I had to guess, I'd say heritable immunity.
const int one = 65536;
As an aside (that means off-topic, guys), this looks like part of a fixed-point arithmetic implementation. It may not be as silly as you think.
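For anyone who missed the trick: 65536 is 2^16, which is exactly 1.0 when you reserve 16 bits for the fraction, i.e. 16.16 fixed-point. A minimal sketch of the kind of code that constant usually lives in (the type and function names here are my guesses, not anything from the original source):

#include <stdio.h>
#include <stdint.h>

/* 16.16 fixed-point: high 16 bits are the integer part, low 16 the fraction. */
typedef int32_t fixed;

static const fixed one = 65536; /* 1.0 == 1 << 16 */

static fixed fx_from_double(double d) { return (fixed)(d * one); }
static double fx_to_double(fixed f) { return (double)f / one; }

/* The product of two 16.16 numbers carries 32 fraction bits,
   so widen to 64 bits and shift 16 of them back off. */
static fixed fx_mul(fixed a, fixed b)
{
    return (fixed)(((int64_t)a * b) >> 16);
}

int main(void)
{
    fixed half = one / 2; /* 0.5 */
    fixed q = fx_mul(half, fx_from_double(3.25));
    printf("%f\n", fx_to_double(q)); /* prints 1.625000 */
    return 0;
}

The 64-bit widening in fx_mul is the whole game; on hardware without an FPU, this kind of thing used to be the only sane way to do fractional math.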
Those genes are not expressed, and we don't have copies of those viruses floating around our bloodstream.
Probably, and for the most part. But we used to think the genome was mostly "junk DNA" before we understood that much of it was homeotic in function. It seems to me that virus copies would not be conserved over time unless they were serving some function.
With a hammer.
Every time I see an article mention "software programs" I cringe. I guess they're different from "hardware programs" or "exercise programs". Or you could just call it "software" like everyone else. It reads like it was written by someone who still uses a typewriter rather than one of these fancy new computers running "word processing software programs". *sigh*
There used to be a web page called "Your Eyes Suck at Blue". You might find it on the Wayback Machine.
You can tell the luminance of each individual channel more precisely than you can perceive differences in mixed color. This is due to the difference between rod and cone cells. Your perception of the color gamut is, sorry, imprecise. I'm sure you really can't discriminate 256 levels of blue in the presence of other, varying colors.
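To put a number on the blue thing: the Rec. 709 relative-luminance weights are 0.2126 R + 0.7152 G + 0.0722 B. Those coefficients are the published standard; everything else below is just a back-of-envelope illustration in C, ignoring gamma:

#include <stdio.h>

/* Rec. 709 relative-luminance weights (published constants). */
#define WR 0.2126
#define WG 0.7152
#define WB 0.0722

/* Luma of an 8-bit RGB triple, gamma ignored for simplicity. */
static double luma(int r, int g, int b)
{
    return WR * r + WG * g + WB * b;
}

int main(void)
{
    /* How far a one-level step in each channel moves the luma. */
    double base = luma(128, 128, 128);
    printf("step R: %+.4f\n", luma(129, 128, 128) - base); /* +0.2126 */
    printf("step G: %+.4f\n", luma(128, 129, 128) - base); /* +0.7152 */
    printf("step B: %+.4f\n", luma(128, 128, 129) - base); /* +0.0722 */
    return 0;
}

One 8-bit step in green moves the luma roughly ten times as far as the same step in blue, which is "your eyes suck at blue" expressed as a single number.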
Rather than abuse every commenter who has not joined your specialty on Slashdot, please take the source and write about what you find.
Given that CPU and memory get less expensive over time, it's no surprise that algorithms that weren't practical when the various standards groups started meeting work fine today. Ultimately, someone like you can state in clear English what the trade-offs are, and indeed whether they work at all, which is more productive than trading naah-naahs.
I am NOMAD!