Well sure, latency is a bitch but imagine the throughput once it got moving!
Genetically engineer atmospheric terrestrial microbes
Just curious: what would the flagella-producing engineered terrestrial microbes do for water once they were in the Venusian atmosphere?
The DOS vulnerability seems odd... so if I opened a file in WordPerfect (for DOS), you're saying the kernel would try to execute it before passing the contents on to WordPerfect? Somehow that doesn't seem likely.
Ok, as you say, it's a bug. Not the same thing as balancing security vs. convenience.
A quick Google search makes it appear to be more of a bug (a suitably malformed MIME header could result in code execution) than an attempt to find the right balance between security and usability.
I remember when Microsoft automatically executing email attachments was intended to strike the right balance between security and usability. That was a long time ago, in a galaxy far, far away.
I'm no fan of Microsoft's security history, but when did they ever have attachments auto execute?
The only reason Oracle has this flaw is because Microsoft's DB lineup hardly forces them to compete from a security perspective.
My screens are both Samsung T260s, 1920x1200 at 25.5", which works out to roughly 89 DPI.
I'm running them at 120 DPI instead of 96 DPI, which scales things up by 25%. That is simply nowhere near enough. Until tonight, I was zooming in web pages at ~150%. Now I'm trying 200%, which, given the 25% boost from the higher DPI, really translates into someone viewing at 96 DPI at 250%.
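The arithmetic above can be checked directly; a minimal sketch using only the numbers from these comments (the variable names are mine, nothing here is a real API):

```python
import math

# Physical pixel density of a 1920x1200 panel with a 25.5" diagonal:
physical_dpi = math.hypot(1920, 1200) / 25.5   # ~88.8 DPI

# Running at a 120 DPI setting vs. the 96 DPI baseline scales the UI by 120/96:
dpi_scale = 120 / 96                           # 1.25, the "25% boost"

# A 200% page zoom on top of that is what a plain 96 DPI setup would call 250%:
effective_zoom = dpi_scale * 2.00
print(f"{effective_zoom:.0%}")                 # prints "250%"
```

So the two scaling factors simply multiply: the DPI setting and the browser zoom compound rather than add.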
I was already using an 18-point monospace font for coding - I've just bumped it up to 24 point.
Your strategy seems to be to make things larger, as if you were trying to cope with a minification (as in optics, not programming) problem. Have you experimented with radical adjustments to the distance of the object your eyes are focusing on? Relaxed, the human eye focuses at infinity: think scanning the horizon for threats, with occasional refocusing on a near object before going back to scanning for stuff moving in the distance. Focusing on something a meter away all day long for years on end isn't what the eye does gracefully.
trying 200%, which given the 25% boost from the higher DPI, really translates into someone viewing at 96DPI at 250%
I didn't have your condition (in my case, just plain old "eye fatigue"), but realizing focal length was the root cause and adjusting for it led to a dramatic improvement after years of experimenting with image scaling in exactly the way you are. Image scaling helped with the symptoms (I could see what was happening more easily) without any effect on the root cause. In my case the difference was immediate and dramatic enough to notice after a weekend of playing games on a friend's projector, which led to my adjusting my work environment for longer focal distances. So perhaps you could find out just as quickly whether focal length is a factor in your condition, if you haven't already.
Well since you said "please"...
Unfortunately, since you had to ask someone for a URL with an extremely simple syntax, I think you'll be required to hand in your geek card now.
Amusing to see how http://dyndns.org/ has changed over the years: from complaining on the front page in 1999 about the programmer leaving and taking all his code with him, to a completely anonymous, plasticky "professional" look in 2011, and all the slow changes in between.
Umm, how do you defect without leaving Soviet airspace? I somehow doubt he had permission to be in Japanese airspace.
The United States regularly broadcast specific instructions for defecting pilots to follow. Defecting pilots during the cold war (bringing their planes with them) absolutely had permission to enter Japanese airspace as long as they obeyed the flight plan directly to the airfield they were told to land at.
You tell him!
Nobody else has said that IBM or any customer using Watson is actually pursuing this use.
They're currently awaiting the final touches on Watson's T-800-style "meatspace interface/communication avatar" endoskeletons. Then the troll-pursuing begins.
Dude, you realize I was kidding about trends in overall CPU performance, right? Hence the "funny" moderation. I don't think anyone here seriously thinks we're approaching Bremermann's Limit and things just can't get faster.
Perhaps it "whooshed" you?
Behind the curtain, they're secretly executing chains of RISC instructions with private, semi-asynchronous clocks as fast as they can & just presenting the public facade of a CISC architecture responding to a system-wide clock...
I like your description though...I always suspected there was something unwholesome and conspiratorial about CISC processing.
But there were far more megahertz than we'll get in gigahertz! You'll never get a 150GHz machine...
I'm with you totally...that's like well more than an order of magnitude faster than what we commonly have now, it would take huge advancements in technology to get there...heck, that would make the totally advanced and awesomely powerful computers we have these days feel like pocket calculators. Never'll happen