Nah, you just missed the part where, when they talk about running Windows Apps, they mean Microsoft Windows Apps--the ones that come preinstalled or that you'll buy later on the Microsoft online store. Because (a) no one but Microsoft would be crazy enough to develop for the Surface RT and/or (b) at least the people involved with the Surface RT are so tunnel-visioned, they really only see Microsoft Windows Apps as the apps you'd actually want to run. I mean, chalk this up to the same sort of people who were so wowed by Gadgets in Windows Vista--which were about as much a clusterfuck as every other gadget setup I've seen on every other platform (slow/choppy behavior, huge memory leaks, huge security holes, unstable)--just to see them disappear just as fast. *shrug* I guess you shouldn't clone Hollywood movie UIs.
No, it's a bit more insidious than that. School administrators, even in rural midwestern towns, are mandating that students rent tablets (like iPads*) for school. But, as outlandishly expensive as iPads can be (compared to Android tablets), obviously Surface RT prices are even more galling. The obvious answer then is to undercut iPads and at least be price competitive with Android tablets. Add to that the promise of "it's like a laptop" and "comes with MS Office", and you'd be hard pressed to not see plenty of schools pushing for such devices.
*Yea, as ridiculous as that. You'd think they'd go for Android tablets, if anything, to be cheaper for everyone concerned. I guess schools may be delusional and think those iPads are actually secure walled gardens, perfect for controlling their rented-out property. Personally, I think tablets for every student is an outright absurd idea even if they were free. They're simply too much of a distraction for adults, let alone for children. Beyond that, I'm generally against the idea of just about any company dumping (and possibly selling at all) their branded products at schools, precisely because it's designed to create life-long lock-in. I mean, why else would MS or Apple or whoever push for schools to use their products, possibly even selling at a loss? Now, if the Gates Foundation were doing such a thing...I'd still be suspicious, because I don't see the Surface RT as the right tool for the job anyways.
HFT is a symptom of a deeply broken system.
Nah, I'd argue HFT is a symptom of a moderately broken system. The whole point of exchanges is to be market makers--to provide the means for transactions to occur. HFTs are supposedly part of the solution to issues of liquidity--that two parties who are technically willing to trade, but on their face claim otherwise, can be coaxed into revealing that information and allowing the transaction to occur. It seems pretty clear that if exchanges are failing in their part and a 3rd party can succeed, then it's exchanges which need to change.
To that end, I propose a simple solution. Exchanges will run their own equivalent of HFT, without any unnecessary price spread. And since HFTers, no matter how close they are, are outside the exchange, they'll never be as fast as the exchange in carrying out those all-important liquidity transactions. Of course, it'd also help if exchanges used price ranges for buys/sells; then the overlap would be very apparent and HFTers would have little ability to step in and earn much of anything.
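For what it's worth, the range-overlap idea above can be sketched in a few lines. This is purely my toy illustration (the order format and midpoint rule are my assumptions, not how any real exchange book works):

```python
# Toy sketch of range-based orders: each side quotes a price range
# rather than a single limit price. If a buyer's range overlaps a
# seller's range, the exchange itself can cross them at a price inside
# the overlap -- no third party needed to "discover" the willingness.

def match(buy_range, sell_range):
    """Return a crossing price if the ranges overlap, else None."""
    lo = max(buy_range[0], sell_range[0])
    hi = min(buy_range[1], sell_range[1])
    if lo > hi:
        return None           # no overlap: genuinely no trade
    return (lo + hi) / 2      # cross at the midpoint of the overlap

print(match((99.0, 101.0), (100.0, 103.0)))  # overlap is [100, 101] -> 100.5
print(match((99.0, 100.0), (101.0, 103.0)))  # no overlap -> None
```

The point being: once both sides reveal a range, the overlap (and hence the liquidity) is visible to the exchange directly, leaving no spread for an intermediary to skim.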
But more importantly, don't you see the irony that his "poor education" allowed him to know the difference between right and wrong where apparently you don't?
That's not irony. That's a non sequitur. Or are you unaware of the concept of a "mad scientist"? No, the only irony I see is you placing some sort of faith in "poor[ly] educated" people. Being "smarter" may make you more aware of how you can extend your lust for power over your duty to morality, but poorly educated people are quite able to, and do, lust for power. The real difference, of course, is who is more likely to be caught and punished, and for what crime. To that end, I'd suggest that you're a lot more likely to be caught and punished if what you do is unusual. Hence, in a high-murder area, murderers go free. Now, consider the probability of this clear case of abuse and corruption in the US government leading to an arrest and punishment, and to whom it would apply.
And that's the reason, of course, Snowden is a hero.
If I may impose, could you explain how it is known that CO2 drives warming, and not vice versa?
It's both. More CO2 drives warming. More warming causes CO2 to bubble out of the ocean, permafrost to thaw, and organic matter to rot and release CO2, etc. And thanks to the practice of carbon dating, we can say reasonably well that a large part of the current CO2 increase is from long-buried carbon sources--aka fossil fuels.
Regarding the assertion that the temperature rise in the last century has been exceptional: should I presume that it is rate of rise that is being discussed, not the level? Because there were far warmer periods in the past; for example the late Jurassic, when Dinosaurs roamed Canada in tropical conditions. Do we have any reliable basis for CO2 measurements during this period?
Yes, it's the rate that's troubling. Because in the past it took thousands of years to see the sort of warming that gradually resulted in tropical conditions at Canada's latitude. But with the rapid rate we're seeing now, the honest fear is that even if we were to simply stop fossil fuel CO2 emissions completely, we'd still continue to see the unprecedented rapid temperature rise because of the previously mentioned warming->CO2 release.
It is also interesting to me that there have been warmer periods in the past during which, at least to my understanding, CO2 was lower than it is presently. Presumably there is a lot more involved than CO2 level. That suggests to me that at this time the situation needs further study more than it needs extreme and precipitous action. I would be receptive to having any faults in this reasoning pointed out.
Except that your point is sort of superfluous. Even if what you state is true--which I'm not certain of--the fact is that we know pretty confidently that CO2 is a greenhouse gas and that higher CO2 concentrations mean greater temperatures. We also know, pretty confidently, that greater temperatures have the above-mentioned forcing cycle. That there have been possible exceptions to this cycle isn't comforting unless we have a good reason to believe the mentioned cycle won't repeat itself. That is, even if someone could come up with a good explanation for past higher-temp/lower-CO2 periods, it doesn't resolve the current higher-temp/higher-CO2 period. A better place to look would be lower or flat temp/high CO2 periods, and to consider why or how we could take that track. To that end, I haven't remotely heard anything to suggest we could be or are on that track.
The closest I've heard to anything along those lines is consideration of mitigating global warming with dust clouds (either in the atmosphere or in space). The general problem with that is a matter of scale--human CO2 emissions are so great that countering them with dust would have to be of similarly great scale, and that introduces a lot of unknowns like (a) how much dust to use, (b) how to remove dust if we go too far, (c) all the atmospheric (if done in the atmosphere) risks of increased dust, (d) the cost/risks of doing the same in space (a dust cloud could slow asteroids and increase the risk of them hitting Earth), etc. In essence, anything of the scale that could fix the problem is probably also of the scale of the problem itself. So, we have the real risk of solving one problem just to produce another one. Hence, it'd seem a lot wiser to head off CO2 release as much as we can and only really consider alternatives as a last resort.
But, seriously, we're so far from even seriously trying to deal with CO2 release.
Everything about Big Data relies on the assumption that having more complete information allows a particular business to improve efficiency.
That word. I don't think it means what you think it means. Now, if you mean "improve sales"...
For advertising and medicine, this is pretty obvious. Just saying a brand name to the right person at the right time makes a sale.
And taking that same person at the right time and showing them that generic X is equivalent to the brand name except for the label and the much cheaper price increases efficiency. Funny, that. Not very "Big Data" efficient, though.
A doctor who can see the symptoms and outcomes of tens of millions of patients can better match a particular patient's case with an earlier example. If that assumption holds true, Big Data is useful.
Congratulations. You've proven correlation == causation. Oh, wait, no you didn't. Ten million people might have a cough because it's flu season. But, Joe may cough because he has lung cancer. Big Data *is* more efficient, though, if it means Joe is given some zinc pills and dies. And by efficient, I mean cheaper for the hospital when it can push people through on a diagnostic assembly line and cheaper for insurance companies that have to pay for a cheap prescription instead of having to fund expensive anti-cancer drugs.
This ultimately boils down to the issue of anecdotal vs. statistical evidence. Each individual's information is an anecdote, and holds value to the person (or people) it relates to, but the anecdote doesn't really provide insight for the future. On the other hand, statistical information is only useful on a large scale with a large sample, collected from people who know little enough about the project to alter its outcome. As you said, the statistical information is worth buying, but anecdotes aren't.
So true. Too bad the plural of anecdote isn't data. So Big "Data" isn't inherently good for statistical analysis. Oh, and did I mention that real statistical analysis is really, really hard? I mean it-requires-a-human hard. But, since Big "Data" is all about trying to push through (aka "analyze" with an algorithm) more anecdotes than individuals can reasonably process... That's not to say it's impossible, sometimes, to pull out little threads of insight. I know throughout medical history there have been plenty of examples of that happening; I mean, sure, it might take thousands of false positives and a dedicated researcher to manually go over them... But the hype and hope of Big "Data" is in the same scope of naivety (or con, if you're a cynic) as trying to make a programming language/UI/whatever that anyone can use. A few isolated examples or scopes of problems? Sure. But remotely in general? Completely, insanely impossible.
PS - I really hope your work is an exception to the rule.
I don't really think so myself. Efficiency is the key thing to making "stuff" more affordable and therefore more ubiquitous.
Being relatively cheap is the key to making stuff more ubiquitous. Having a good deal of *demand* is probably even a lot more important. Efficiency may or may not enter into it, based heavily on just how much a prototype costs relative to what the market can bear. Which leads to...
For example, efficiencies in semiconductor fabrication enabled personal computers to be affordable by the average joe, even really poor people, whereas it used to be only the very rich owned them.
No, the integrated circuit (aka semiconductor fabrication itself) is the primary basis upon which more than the very rich came to own a personal computer. Before that point, all the transistors and wires were a bulky mess. The IC didn't by itself put PCs in the hands of the non-very-rich, but it certainly set the stage for the progressive miniaturization in lithography that constitutes those "efficiencies" you speak of. Yet well before that point, of course, plenty of people had their own PC. Of course, the IC itself could be said to be an example of the "efficiencies" you speak of, but that's not really right. What made ICs so important is that they allowed everything to be packaged together in bulk, which in itself made production a unified thing. I guess that again could be chalked up as a point of efficiency, like the assembly line, but it was the fact that computers can do so much that spurred so much adoption. Yes, without a lower floor on the price of a PC, computers would certainly be a lot less ubiquitous. But they'd hardly be just for "the very rich".
The same thing can be said for cars and Ford's original Model T.
Again, no. Yes, at the start the Model T was more of a novelty to the very rich, who could potentially waste a great deal of money on something that might not pan out. But the very nature of the current shipping system we have today, I think, shows that there's a strong commercial interest in having a horseless carriage. So, the ubiquitous nature of cars would seem to be based heavily on demand there as well. Of course, being cheaper makes them *more* ubiquitous, but without the demand in the first place...
One key part of this is economies of scale, which means you need to sell large quantities of something in order for it to be affordable by the masses. And subsequently, a key part of that is marketing. Marketing is expensive as hell, and goes into the cost of those goods. If big data makes marketing cheaper, then that savings will eventually (though not immediately) make its way to joe sixpack.
Except the rule has consistently been to spend yet even more money on marketing to spur even further sales to produce further revenue to pay for said marketing. The truth is, big data as a concept is very, very old. It's based on the idea that if you can collect enough information about potential customers, you can make them actual customers. Yet, just like directed advertising, it forgets the key point of marketing is not to sell products but to inform a consumer who has a demand, even if he hasn't fully realized it yet. The thing is, we're already well to the point of having a system where ads can read potential customers. And ads continue to stray further and further into trying to conjure up a demand out of nothing, rather than informing a consumer into a real demand they have.
So, the real specter of big data is to manipulate consumers even more than directed ads do--and if you don't see how trying to create a per-user view of available products manipulates the consumer, you're already well lost--by further trying to take what consumers already buy or know and match them up with...stuff the select few companies who pay the most want to sell. Meanwhile, all sorts of products the consumer actually knows they want aren't shown, and things they don't even realize they want aren't mentioned, because marketers and algorithms (and humans in general) are pretty horrible at predicting the future. Instead, it'll all be about "trends" and "keeping up with the Joneses", another manifestation of the disgusting aspect of consumerism.
But, yea, keep telling yourself that efficiencies are the key. No, marketing is like DRM. And consumerism keeps manipulating people into buying crap they don't really want or need. Yet, just like with DRM, people keep coming up with interesting ways to circumvent the trends and forge new territory. Instead of focusing on trying to take old data and churn out new details, why not spend a bit more on listening to the real-time demands of people? And I don't mean by being a me-too company that clones the latest game.
Methinks the start screen is just a highly visible rallying point for people to whine about Windows.
Like a nuclear bomb is just a critical mass detonation of tremendous energy in the form of significant heat, force, and radiation. Throwing in that "just" is rather belittling to the point. The major advantage Windows 95, as a UI, had over its predecessors (and other competing UIs of the era) was the always-on-top taskbar + start menu. And MS decided to throw those away, effectively, in Metro mode. And best of all, they're pushing people into using Metro mode to do things--the fact that you can circumvent MS's efforts doesn't change that fact.
But, yea, it's "just" a bit of whining from people being manipulated into using an inferior UI. As others have pointed out, the only reason such an inferior UI is at all accepted on tablets and smartphones is their inferior resolution. But I'd venture a guess that in short order, people with tablets will "whine" for a start menu + taskbar too, as the resolution issue is a much more moot point there. This is, btw, a major reason why Windows CE sucked so much--there, it tried to shove a start menu + taskbar onto too small of a screen resolution.
I mean, let it not be said "use the best tool for the job" or "there's no such thing as a universal UI". Oh, wait... And here we see why Unity is a failure as well, with the same group of whiners.
Yet one person gets murdered here and everyone seems to be yelling "terrorist" and going weak at the knees in fear and stupidity.
In part. A tiny minority group has an agenda and uses anything and everything to pursue it. The vast majority of people are too cowardly or too stupid to confront that minority with any sort of logic or reason. Instead, the mere fact that they may be painted as pro-terrorism, pro-murder, anti-nationalistic, or simply non-compassionate leaves them to be walked all over, often going as far as parroting the party line instead of making a stand on principle or character or integrity. So, congratulations Britain; you're just like America.
Well, I tend to think of quantum mechanics as proving the universe functions on call-by-need, with faster than light being the lack of support for mutation. Entangling then is really just call-by-need evaluating out a circumstance backwards far enough to note that when two waves/particles/whatever were at the same place, they had to have certain exclusionary properties (for the article, one photon was polarized vertically and the other horizontally) which cause the interpretation of entanglement.
Of course, all of the above says nothing of the how or why of it. And no doubt, I'm likely far off in really understanding quantum mechanics. But, it at least helps me better understand it.
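For anyone unfamiliar with the term: "call-by-need" is the lazy-evaluation strategy from programming languages--a value is a suspended computation that's only forced when observed, and once forced, the result is fixed (no mutation). A minimal sketch of the analogy above, with all names and the "entanglement" framing being my own loose illustration, not any actual QM formalism:

```python
import random

# A thunk: the "value" doesn't exist until something observes it,
# and once forced it is cached -- every later observation must agree.

class Thunk:
    def __init__(self, compute):
        self._compute = compute   # suspended computation
        self._forced = False
        self._value = None

    def force(self):
        """Observe the value; the first observation fixes it forever."""
        if not self._forced:
            self._value = self._compute()
            self._forced = True
        return self._value

# Two "entangled" photons as views of one shared suspended computation:
# forcing either one fixes both outcomes at once, and the outcomes are
# complementary (one vertical, one horizontal) by construction.
shared = Thunk(lambda: random.choice([("V", "H"), ("H", "V")]))
photon_a = lambda: shared.force()[0]
photon_b = lambda: shared.force()[1]

a, b = photon_a(), photon_b()
assert {a, b} == {"V", "H"}   # always complementary polarizations
assert photon_a() == a        # re-observation never changes the result
```

The caching-on-first-force behavior is what the "lack of support for mutation" hand-wave above is gesturing at: nothing propagates between the two views at observation time; they were always projections of the same unevaluated expression.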
They could probably make a very secure system, but people would complain too much because all their applications would have to come vetted from MS and it would be like running IOS on your desktop.
Um...yeah...perhaps you've never heard of "iOS jailbreaking"? Seriously, even with MS-vetted drivers (a mandatory part of 64-bit Windows), almost entirely non-Admin user programs (because of how Windows is designed, there are a handful of MS programs that run at higher privilege to provide the Win32/64 environment), things like stack smashing protection, data execution prevention, randomized application offsets, and even sandboxing (admittedly, only rarely done), Pwn2Own still clearly shows that IE and Windows 8 are vulnerable. Slapping on Secure Boot wouldn't do a damn thing. And the idea that MS can successfully vet software just falls flat on its face when it fails to adequately protect its own software--unless you think that's some sort of conspiracy.
No, in all seriousness, writing secure software is incredibly hard. My personal problem with MS has more to do with their marketing of Windows as "secure" and "robust" for ages--it was a big selling point as far back as NT 4.0 (probably sooner). And in retrospect, we can see that that was either ignorant/arrogant optimism or just marketer bullshit. I'd attribute it to both, and I don't see the situation changing with MS or any other non-conservative organization. Really, OpenBSD is about the closest you'll ever get to a secure/robust system, but even there that really translates into a box with very limited software options--anything more and you've stepped outside of vetted secure/robust.
PS - And yea, I'd say just about every *nix is guilty of overclaiming robustness and security. The big difference is the degree, and just how much of it comes from the organization itself versus its many members/fans. I really don't see the same sort of out-of-the-horse's-mouth BS that you see in MS PR (or just about any company PR, really). But knowing that that's the nature of the beast sort of proves the point of why your claims seem absurd.
A much better analogy would be sheet music and performances. It makes little sense to pretend that sheet music and performances are identical and simply handing performances to the sheet music author seems pretty preposterous.
The fact that there's a mechanical licensing system for music has a lot to do with preventing the sort of absurdity that Nintendo seems to be pulling--the idea that they can monopolize all photos, videos, whatever of any game they ever made, outside a very limited scope. Of course, with music, the absurdity was realized very early on. It took something like YouTube and micropayments before the situation came down to the level of game performances; before then, there wasn't enough hard currency for Nintendo, or anyone really, to flex their muscle on the subject.
PS - Asking "Which path do you think the Tolkien Estate would take?" is sort of like asking "What would a pyromaniac do with kindling?". It's the great shortsightedness of many authors, like Twain, to not understand that in trying to extend copyright past their own lives, all they have done is given "Estates" an excuse to be nothing more than useless, money-grubbing parasites. It entirely misses the point of copyright: to reward authors--not their children, grandchildren, et al. That a *live* author should shower his children, grandchildren, et al. with that reward is his/her own prerogative. To codify it into law is absurd. And yea, I can see the sticky situation of an author dying young, or of a very old person seeing very little financial worth in writing. But that's more a core truth of the inequities of life. Granting copyright past death doesn't really fix anything.
Only 68 papers [rankexploits.com] out of 12,000 asserted greater than 50% of the cause to humans, while 78 explicitly rejected it.
That reminds me of the joke about the man who finds a genie. The genie says he can grant three wishes, but for each one granted, the man's worst enemy will receive twice as much of it. The man makes two wishes of great fortune. For his final wish, he wishes to be beaten half to death.
The point, then, is that if nature alone in greenhouse gas emissions is slowly (read, on the scale of thousands of years) shifting us into another ice age, man suddenly pumping anywhere close to the same amount (making humans on the order of 50% responsible for current greenhouse gas emissions) may radically (read, on the scale of hundreds of years or perhaps even decades) shift us towards global warming. So, the question isn't the strawman of 50% man-made greenhouse gases but (a) whether there are significant man-made greenhouse gases and (b) how much of a shift, if any, those gases will have on the climate. Well, the studies show man's emissions are on the order of what nature puts out, and our knowledge of nature's own greenhouse gas emissions and their forcing effects (read, they're what keep us from being in a perpetual ice age) tells us that there's something to worry about, with the only real debate as of now being the exact scale and speed of warming.
But, yea, keep tilting at the irrelevant points that you make up.
Or perhaps evidence about the stupidity of man? Who, after all, spent all the time and money developing war games computers just to either (a) "win" at Global Thermonuclear War or (b) never actually fight (yet still spend billions of dollars to achieve nothing)?
PS - Yea, I know you're just joking. But, to me the point is more sad than funny.
Nearly 4% in 6 months isn't bad especially when you consider the lower demand for PCs in general.
Last year's PC sales were on the order of 348 million, down about 4.5 million from the year before. That means that if all PCs in the last 6 months were sold with Windows 8, you'd expect about 174 million in sales--obviously that figure is off because there's likely a burst in sales at different times in the year, not every manufacturer instantly switches to selling Windows 8, those that do switch often offer multiple model lines of which some won't include Windows 8 for corporate/other purposes, and then there's all the Mac OS X/Linux machines... In any case, hand-waving about "lower demand for PCs in general" is a good deal of heavy BS.
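Spelling out the back-of-envelope arithmetic above (the 348M and 4.5M figures are the ones quoted; treating exactly half the annual sales as the 6-month ceiling is a simplifying assumption):

```python
# Back-of-envelope check on the "lower demand for PCs" hand-wave.
annual_pc_sales = 348_000_000                    # last year's PC sales
prior_year_sales = annual_pc_sales + 4_500_000   # down ~4.5M year over year

# Ceiling: if every PC in a 6-month window shipped with Windows 8.
half_year_ceiling = annual_pc_sales // 2
print(half_year_ceiling)                          # 174,000,000

# The year-over-year decline in total PC demand, as a percentage.
decline_pct = 4_500_000 / prior_year_sales * 100
print(round(decline_pct, 2))                      # ~1.28%
```

A ~1.3% dip in overall PC demand is nowhere near big enough to explain weak Windows 8 numbers on its own, which is the point.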
Also, some portion of the XP and 7 users will never upgrade so the potential growth for a new OS is even lower.
That's a point that hits it on the head. Windows 8 isn't compelling enough to buy a new machine for--like, say, a tablet--and direct retail upgrades have never been the main way in which one version of Windows (since at least Windows 95) has supplanted the previous version. Instead, it has pretty consistently been ever-greater PC sales coupled with a lower install base to begin with. I.e., people with Windows 95 bought new machines with Windows 98, but more importantly, a lot of people without computers bought machines with Windows 98 and perhaps doubled the number of computer owners.
So, yea, ever since Windows XP or so, the market has become rather saturated and the only way one tends to see a shift in numbers is either (a) a whole new "type" of computer (smart phones, for example) which increase the pool and can shift percentages massively or (b) the slow, yet methodical, death of machines and their new replacements having the latest version of Windows (usually). Given how Metro was trying to push for (a) (ignoring how it'd likely be an incarnation of (b)), I'd say the figures are pretty dismal.
It's already the fourth most popular OS.
Which isn't saying much. Windows has heavy vendor lock-in on programs people need/want/expect to use. Couple that with a lot of OEMs preloading Windows 8 on millions of machines, and you'd expect it to become a "popular" OS, if nothing else because people can still run said programs, no matter how much they otherwise hate the OS. Hell, Windows 98 might score higher than Windows 8 if it had been an option instead of Windows 8 over the last 10 years--especially because of the Vista days.