The problem is not that MS launched a new OS that underwhelmed. The problem is that we have a machine with a ridiculous amount of CPU and GPU power compared with the portable shit (tablets and phones), yet we can't seem to put this power to meaningful use. I mean, if you don't do scientific computing or video/photo editing or gaming, what's the point of a PC over an underpowered piece of junk or a console? Software developers should really start thinking hard (yes, MS too). But I guess it's far easier developing 2D games for a shiny new platform than doing real innovation.
Obligatory Douglas Adams quote:
Orbiting this at a distance of roughly ninety-two million miles is an utterly insignificant little blue-green planet whose ape-descended life forms are so amazingly primitive that they still think digital watches are a pretty neat idea...
Although I wear a relatively smart digital watch, I often wonder whether an automatic piece of jewelry (say, a Breguet) would look nicer on my wrist. Can't afford one, so the question is purely philosophical.
There are many negative results in clinical medicine. For example, every drug that fails a phase III trial deserves its own publication. That's a costly failure for pharma, but less costly than failing post-marketing and being sued by everyone.
Anyway, the term "negative results" is rather vague. A negative result coming from a well-designed and adequately powered experiment can be very exciting (say, not finding the Higgs boson despite adequate design) because it makes us reconsider current theories. In my domain, for example, showing beyond reasonable doubt that smoking does NOT cause cancer would be a result of profound significance in preventive medicine. This kind of negative result is interesting, but rare. On the other hand, most of the time when a result goes against a very well-established theory, the method is probably flawed or underpowered, or the interpretation is incomplete. This is the frequent kind of negative result, the one most PhD students fear. There is yet another kind of negative result, also frustrating: when your new code/algorithm proves inferior to the competition. At least in that case you do contribute something new that might be of use in specific circumstances, or in designing a better version in the future.
So, what I'm saying is that most of the time unexpected negative results come from bad methodology, which is why everyone hates them. True negative results are great but require extreme rigor and luck.
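To make "adequately powered" concrete, here's a quick back-of-the-envelope sketch (my own illustration using the standard normal-approximation formula, not taken from any specific trial):

```python
from scipy.stats import norm

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> float:
    """Approximate sample size per arm for a two-sided, two-sample comparison,
    using the normal approximation. effect_size is Cohen's d
    (difference in means divided by the pooled standard deviation)."""
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value for two-sided alpha
    z_beta = norm.ppf(power)          # quantile matching the desired power
    return 2 * ((z_alpha + z_beta) / effect_size) ** 2

# A "small" effect (d = 0.2) needs ~392 patients per arm at 80% power;
# a "large" one (d = 0.8) only ~25.
for d in (0.2, 0.5, 0.8):
    print(f"d = {d}: ~{n_per_group(d):.0f} per group")
```

Run the same comparison with a fraction of that sample size and a "negative" result tells you essentially nothing, which is exactly the frequent, boring kind.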
Actually, that's why clinical trials are now supposed to be registered (clinicaltrials.gov), so that when they end, we get to know what they found (or didn't). This way, pharma cannot avoid bad publicity, for example. It doesn't work perfectly, because I'm not aware of anyone actually verifying that studies did get published, but the mechanism is there, and if agency "X" decides to have a look, it can quickly get an idea of who studied what. The situation is of course much less well documented when it doesn't concern real patients, but most funding agencies do want to know what you did with their money, including not finding stuff.
Higher-quality PSUs provide stable voltage and current with much less ripple than low-end units. Furthermore, you get extras like overcurrent protection, modular cabling and, if you choose wisely, low noise. In my opinion, a high-quality PSU is a critical component and helps you get a longer life out of the rest of your components. For example, a Seasonic G-550 80+ Gold can be found for $90 and it should keep almost any user happy. I'm not saying you should get it for the Gold rating, but for the overall quality...
I treat people with brain cancer for a living in a university hospital. As someone once said, 10 barbers won't make your haircut 10x faster. In the end, if his disease is bad, there is simply not much that can be done today to obtain a cure. When I say "bad disease", I don't mean stage or grade or histology. I mean the specific population of cells with the specific DNA alterations that he has in his head. His best chance is probably in a clinical trial. Believing that we somehow, somewhere, already have a cure for his disease, and that getting access to it is just a matter of publicity, is unfortunately naive.
a) Games have variable frame rates, so people want 60 fps on average in order to never drop below, say, 25 fps at the minimum.
b) Games are competitive. If I play at 20 fps, I get new information (for example, enemy position) every 50 ms. If I play at 60 fps, I get it every 16.6 ms. Even if I don't use 100% of that information, it is likely to be an advantage. At a pro level, these things count.
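To put rough numbers on b), a trivial sketch (my own arithmetic, hypothetical fps values):

```python
def frame_interval_ms(fps: float) -> float:
    """Time between two rendered frames, i.e. the worst-case age
    of the newest information on screen."""
    return 1000.0 / fps

for fps in (20, 30, 60, 144):
    print(f"{fps:>4} fps -> new information every {frame_interval_ms(fps):.1f} ms")

# 20 fps -> 50.0 ms, 60 fps -> 16.7 ms: the 60 fps player can see an enemy
# peeking around a corner up to ~33 ms sooner, before even counting ping
# and input lag.
```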
How about a decent interface for a desktop Linux-based OS instead of a horrible one-size-fits-all interface for netbooks, laptops, 24" monitors, tablets and TVs? How about they get that right for a start...
Omg, I wanted to rate you +32 Mega-insightful, but this is not technically possible because you are already at +5. Ubuntu gets worse with every iteration...
You silly newb. HDMI supports 24 fps for compatibility reasons, and the original decision was probably based on a quality-cost tradeoff back in the days when actual film was used and the NTSC/PAL specifications were defined. Shooting at 60 fps would mean, for example, that the film stock would last a fraction of the time. There is the famous "notion" that eyes cannot see over 24 fps, but in fact eyes are very sensitive to some kinds of motion, colors and contrast and less sensitive to others, so you cannot generalise that 24 fps is "enough" for all kinds of motion, image and people (yes, people are different too). Furthermore, even if the above were not true, you in fact need an average of at least 50-60 fps in most games to ensure that the MINIMUM will not go below 30 fps, which is not only visible but also implies a between-frame reaction time of 33 ms (plus ping, plus input lag, plus keyboard lag etc). In hardcore-land this means PWNAGE for you and your silly rig.
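And if you want to see the average-vs-minimum effect with numbers, here's a toy simulation (the log-normal frame-time jitter and its magnitude are completely made up for illustration; real games have spikier distributions):

```python
import random

def one_percent_low_fps(median_fps: float, jitter: float = 0.3,
                        frames: int = 10_000) -> float:
    """Draw per-frame render times that jitter log-normally around
    1000/median_fps ms and report the 99th-percentile frame time
    as an instantaneous fps ("1% lows"). Purely illustrative."""
    median_ms = 1000.0 / median_fps
    times = sorted(random.lognormvariate(0.0, jitter) * median_ms
                   for _ in range(frames))
    worst_ms = times[int(frames * 0.99)]  # 99th-percentile frame time
    return 1000.0 / worst_ms

random.seed(1337)
for fps in (30, 60):
    print(f"{fps} fps typical -> ~{one_percent_low_fps(fps):.0f} fps at the 1% lows")
```

With this (made-up) amount of jitter, a machine rendering 60 fps most of the time still dips to roughly 30 fps at the 1% lows, which is the whole point.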
Facebook will soon be irrelevant. People will migrate to other services, probably Google+. We've seen it happen before: myspace, hi5 etc. I give it 2 more years before it fades away.
I would just like to add the fact that doctors don't just think about the diagnosis; they also obtain data from patients. Not only by clinical examination (try pushing Watson around when a patient is dying from e.g. pneumothorax in the street) but, most importantly, from the interview, which I assume is not Watson's or any machine's strong point. And before you say anything about ELIZA or whatever, do not underestimate the subtlety and difficulty of verbal and non-verbal communication with patients in distress, or with the demented, intoxicated, delirious, neurotic or comatose.
In real life, Watson will be just a step above huge databases like UpToDate or AccessMedicine, providing more intelligent feedback to actual doctors. I don't think Watson will be able to feed itself clinical (as opposed to laboratory) data for the moment.
Also, don't forget that under current law you cannot sue an AI. A doctor has to sign somewhere at some point...
I was a Slackware user from the very low single-digit versions until I decided I really wanted 64-bit, and then I never went back.
The 1337 version number is a clear sign. I am tempted to give it a go.
Well, seeing a Mandelbrot algorithm running on an interpreted language on top of an interpreted language and struggling on my super-powerful quad core makes me suffer. I had coded the Mandelbrot fractal in assembly and it ran faster on an 80386...
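For reference, the escape-time loop in question is only a few lines. A plain-Python sketch of the same classic algorithm (mine, nothing to do with the original assembly version):

```python
def mandelbrot(width: int = 60, height: int = 24, max_iter: int = 50) -> None:
    """Classic escape-time Mandelbrot: iterate z = z*z + c and count
    how quickly |z| escapes past 2. ASCII output, one char per pixel."""
    palette = " .:-=+*#%@"
    for row in range(height):
        line = []
        for col in range(width):
            # Map the character grid onto the complex plane.
            c = complex(-2.0 + 3.0 * col / width, -1.2 + 2.4 * row / height)
            z = 0j
            for i in range(max_iter):
                z = z * z + c
                if abs(z) > 2.0:
                    break
            line.append(palette[i * (len(palette) - 1) // (max_iter - 1)])
        print("".join(line))

mandelbrot()
```

Even this pure-Python version crawls in the inner loop, which is rather the point.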
Now get off my lawn...