And almost no one who was working there in the '90s is still there. Hardly anything at Microsoft is the same as it was back then, except the name, of course.
Not sure what problem you have with Firefox bookmarks. To use one, just click it. To arrange them, just drag them into the order you want, or drag them (in the order you want) onto your bookmarks toolbar.
I'll even go one further, and say that according to their study, you are SAFEST (on the freeway) traveling 15 MPH over the average speed. The risk curve is "U"-shaped around that point: it is approximately as safe to drive 30 MPH over the average freeway speed as it is to drive the speed limit on normal streets.
Sure. It was really hard to google "NTSA studies speed" and click the first link, but here you go:
Since you couldn't take the time to google it, I'll even give you the excerpt:
Low-speed drivers were more likely to be involved in crashes than relatively high speed drivers
I can't recall a time when it referred to anything else. Here's another example, from 1985: the original Atari ST service manual. http://www.atarimania.com/docu...
So it has been in use for at least 30 years. Not exactly "new." See if you can find an earlier reference that describes resolution in terms of pixel density (which would be hard, I imagine, as screens didn't have pixels back then). You might be able to find a reference to a TV (which isn't the same field) describing resolution in terms of lines, but again, lines aren't density either.
QEMM's optimize was awesome back in the day. It'd get me 95% of the way there.
Your blind test isn't actually testing to see if people can tell the difference between 1080p and 4k. It's testing to see if people can correctly identify which is which. That's not the same thing.
A better blind study is to place two TVs of the same make and model side by side. Turn off all upscaling, then show a 1080p image on one and a 4K image on the other, randomly alternating which is which, and have the participant identify which image looks better. You will get a much different result, and will have shown that people can tell the difference quite easily.
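Here's a sketch of how you could score such a side-by-side test. The function and numbers are purely illustrative (not from any actual study): it's just a one-sided binomial test of whether the participant beats coin-flip guessing.

```python
import math

def p_value(successes, trials):
    """One-sided binomial test: probability of getting at least `successes`
    correct picks out of `trials` if the participant were guessing (p = 0.5)."""
    return sum(math.comb(trials, k) for k in range(successes, trials + 1)) / 2**trials

# Hypothetical session: the participant picks the 4K image 18 times out of 20.
print(p_value(18, 20))  # ~0.0002, far below 0.05 -> better than chance
```

If the p-value is small, "people can tell the difference" is supported; if it hovers near 0.5, they were guessing.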
Let's try an example. Enter "resolution" into google. What does it say?
the degree of sharpness of a computer-generated image as measured by the number of dots per linear inch in a hard-copy printout or the number of pixels across and down on a display screen.
Google says you are wrong.
Let's see what Microsoft says. Right click your desktop, and choose "Screen Resolution". What does it say? Microsoft says:
Resolution: 2560x1440 (Recommended)
Boy, those silly software guys must have gotten it all wrong. Let's check the hardware guys... how about Dell?
Under the tech specs, that monitor's page says:
Native Resolution 1920 x 1200
Guess the hardware guys are wrong too. So who uses it the one true "Jane Q. Public" way?
That's exactly what it means: resolution is the number of pixels, always has been.
No it doesn't. It's the measurable degree of detail.
Yes it does. It's the number of pixels available on the screen, usually described as WxH.
Don't misuse the word then try to tell me it "always has been". That's just plain false. "Never was" would be closer to the truth.
Resolution has always referred to the number of pixels available on the screen.
1. the maximum number of pixels that can be displayed on a monitor, expressed as (number of horizontal pixels) x (number of vertical pixels), i.e., 1024x768. The ratio of horizontal to vertical resolution is usually 4:3, the same as that of conventional television sets.
In Windows, when you go to the Screen Resolution dialog, you tell it how many pixels by how many pixels your display is. It's been that way since Windows 2.0 back in 1987.
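The distinction this whole thread circles (pixel count vs. pixel density) is easy to make concrete. Resolution is just WxH; density (PPI) also needs the physical panel size. The panel sizes below are made up for illustration:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch along the diagonal: same resolution, different density
    depending on how big the physical panel is."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The same 2560x1440 resolution on two hypothetical panel sizes:
print(round(ppi(2560, 1440, 27), 1))   # ~108.8 PPI on a 27" monitor
print(round(ppi(2560, 1440, 5.5), 1))  # ~534.0 PPI on a 5.5" phone
```

Same resolution, roughly 5x the density: which is exactly why "resolution" and "pixel density" aren't interchangeable terms.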
Sort of like Windows-L?
Too bad he wasn't running Windows. Linux is so insecure.
Perhaps it was perceived, but they determined that the market of people willing to face fines and possible imprisonment so they can save $10 on their insurance wasn't big enough to warrant the expense of building in all that extra security.
Actually, a great location would be Fermilab in Batavia, IL. Plenty of space there, considering they built it for the large collider; in fact, he could probably build it right above the collider ring, and there should be very little/no resistance and no environmental impact. Not to mention nearby access to some of the country's best minds right on campus.
When I was in the USAF I had great fun telling users that they could have a wireless keyboard & mouse just as soon as they found FIPS 140-2 compliant ones. I then told them that not only did none exist to our knowledge, but none were planned. The main problem is that once you put serious encryption in there (as 140-2 requires), you're looking at a keyboard/mouse that is closer to a smartphone than a keyboard, i.e., a AA won't last a few months; you'll need to charge it like you do your smartphone. AES encryption also isn't intended for 8-16 bits at a time, so it's not really efficient there.
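For anyone wondering about the "8-16 bits at a time" point: AES works on fixed 128-bit (16-byte) blocks, so naively encrypting each key event individually is mostly padding. A back-of-the-envelope sketch (payload size is illustrative):

```python
# AES operates on fixed 16-byte (128-bit) blocks, but a single key event
# is only a byte or two. Encrypting one keystroke per block means padding
# almost the entire block and transmitting it all over the radio.
AES_BLOCK_BYTES = 16
SCANCODE_BYTES = 1   # one key-down scancode (illustrative payload size)

padding = AES_BLOCK_BYTES - SCANCODE_BYTES
overhead = AES_BLOCK_BYTES // SCANCODE_BYTES
print(f"{padding} padding bytes per keystroke, {overhead}x radio payload")
```

More airtime per keystroke means more radio-on time, which is where the battery goes.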
That's easy to solve. Since the keyboard and mouse are very likely near a PC, just run a charging cable to one of its USB ports and never disconnect it. Then you can get rid of the battery completely. Problem solved: you've got a nice battery-less, always-charged wireless keyboard and mouse. Tada!
WTF is Git? Is that a new fork of git? 'Cause I can't tell what you're talking about, since you used the wrong case.