The design is definitely obsolete compared to modern handguns, even if the core ideas it introduced remain the same. It's not that it's inherently bad - it's just that we've found ways to improve things since then, and realized that certain features were simply bad ideas (e.g. combining a manual safety with a grip safety). Field stripping a 1911 next to any modern gun is enough to see how modern designs are simpler in the details while preserving the concept.
Let me guess. You also believe that throwing live ammunition into a fire is extremely dangerous?
It is, actually. When the casing ruptures, there will often be quite a few fragments. They are unlikely to kill you, but they will hurt, and if one hits an eye, that eye is done for.
So whose career ended over 9/11?
Muhammed Atta and 3,000 others.
Those deep field photos always give me vertigo.
I remember that back when I was a kid in the '80s, instead of touch screens there were light pens. They were super expensive, and only a few places I knew of had them. Then there were touch screens, but they only responded to the touch of a pen that had a wire going back into the device. I'm not sure exactly how those worked; capacitance between the pen and the screen, I guess.
The pen detected the passage of the CRT electron beam beneath its tip and sent a signal. Because CRT timings are well known, the pen's signal can be synced against them to determine which point of the screen the beam was drawing when the pen fired.
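For the curious, the timing arithmetic is simple enough to sketch. This is a toy model with made-up NTSC-ish numbers (none of it is from the parent post), and it ignores blanking intervals and interlacing:

```python
# Toy sketch: recover screen position from when the light pen saw the beam.
LINE_PERIOD_US = 63.5                     # one horizontal scanline
ACTIVE_US_PER_LINE = 52.6                 # visible portion of a line
PIXELS_PER_LINE = 640
ACTIVE_LINES = 480
PIXEL_PERIOD_US = ACTIVE_US_PER_LINE / PIXELS_PER_LINE

def pen_position(us_since_vsync: float) -> tuple[int, int]:
    """Map the instant the pen saw the beam (microseconds after vertical
    sync) to the (x, y) pixel the beam was painting at that instant."""
    line = int(us_since_vsync // LINE_PERIOD_US)
    us_into_line = us_since_vsync % LINE_PERIOD_US
    col = int(us_into_line / PIXEL_PERIOD_US)
    return (min(col, PIXELS_PER_LINE - 1), min(line, ACTIVE_LINES - 1))

# e.g. the pen fires 3,200 us after vsync -> roughly line 50
print(pen_position(3200.0))
```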
Then finally we had actual touch screens. All you had to do was touch it: with your finger if a blunt press was OK, or with a piece of plastic with a pencil-like point if you wanted precision.
You had to mash it if you wanted to make sure that the press registered on the first go. And don't even get me started on swipe scrolling, which was an exercise in extreme frustration without a stylus.
Now we are back to the damn pens.
Well, first of all, touch is still there, so no - we're not "back".
Second, modern digitizer styluses are far less bulky than the light pens of old (most are slimmer than your typical ballpoint pen), and they are not tethered to the device. They also have many genuine benefits compared to a resistive screen plus a sharp stick: they can sense pressure much more precisely, and also the angle of the pen, which means that strokes are captured far more accurately - and that matters a great deal when you're drawing something, for example.
Why is a digitizer the 'proper' way to do it? Because the marketing people say so?
Because resistive UIs suck at responsiveness. Anyone who has used both a resistive and a capacitive screen knows that. It's precisely why e.g. Windows Mobile was so stylus-centric: with resistive screens, you pretty much had to use a stylus. Or at least something sharp, like your nails. Or cuss every time you tried to press a touch button and it didn't register.
Why should I have to carry around some expensive second piece of hardware when my finger does the trick just fine with a resistive touch screen?
Usually, if a device has a digitizer, it has a stylus in the box (so you don't pay extra for it), and it also has some place to stow it away.
Why should I be locked into Samsung? I just had a Samsung phone. It sucked badly; it locked up or slowed down almost constantly.
Who said anything about being locked into Samsung? There are plenty of manufacturers making digitizer-enabled phones and tablets these days, on various OSes.
Anyway, a digitizer is certainly the most popular way to achieve this, but it's not the only way. The other option is to make capacitive screens that let you use pencils and ballpoint pens (or something similarly sharp and conductive) on them without damaging the screen. An example of that approach is the Sony Xperia Z Ultra, and I suspect we'll see more of it now that the coating it uses is on the market.
At home, I want my technology to serve me, and not take up any of my limited time.
I don't screw with it unless I have a particular itch that I want scratched.
That said, my work environments and home environments couldn't be more different.
At work, I work on Visual Studio. So I have Windows machines with multiple Hyper-V guests in them, running nightly builds of the CLR and VS, plus others for IIS/SQL to host test apps on.
At home, I run
- 1 Windows workstation (turned off until I run a network drop over to it; we just moved),
- 1 Surface RT my wife and I share,
- 1 Mac mini in the living room, for when we want a bigger screen and keyboard,
- an Ubuntu media/utility server in my rack,
- a PC Engines Alix running OpenBSD as my edge device.
We use our smartphones at home a lot to check email and Facebook.
I just retired my ~6-year-old Windows Media Center machine for a WD TV Live. We also use an Xbox 360 for video games and DVDs.
I did a big batch of fiddling recently, as we moved to a rural property with multiple buildings. I learned about Ubiquiti hardware and retired my previous consumer-grade wireless gear in favor of UniFi APs; I also have a NanoStation link between my house and my shop building (with UniFi APs in both spots). Getting that set up was fun and easy, and unlike the consumer-grade APs I was used to, I haven't had to power cycle the Ubiquiti gear once since I've owned it. Solid reliability and astounding speeds.
Also, after I got the WD TV Live and decided I liked it, I packed up my HTPC machine.. only then did I realize that the WD box wouldn't play any of my Hi10P anime. So I spent some time the other evening learning how to build ffmpeg and x264 from source. My Ubuntu machine isn't fast enough to transcode Hi10P to 8-bit on the fly, and the binary distributions of ffmpeg and x264 on my old Ubuntu release didn't handle 10-bit either.
So I need to spend some more time here, but honestly, it might just be easier to use HandBrake on Windows..
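For what it's worth, once you have an ffmpeg whose libx264 does 8-bit output, the transcode step itself is short. A rough sketch wrapped in Python; the file names and quality settings here are placeholders of mine, not from the post:

```python
# Rough sketch: shell out to ffmpeg to knock Hi10P (10-bit H.264) down to
# 8-bit 4:2:0 so a hardware player like the WD TV Live can decode it.
# Assumes an ffmpeg on PATH whose libx264 was built for 8-bit output.
import subprocess

def transcode_to_8bit(src: str, dst: str) -> None:
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-c:v", "libx264",
            "-pix_fmt", "yuv420p",   # force 8-bit 4:2:0 output
            "-crf", "18",            # near-transparent quality
            "-preset", "slow",
            "-c:a", "copy",          # pass the audio through untouched
            dst,
        ],
        check=True,
    )

transcode_to_8bit("episode.mkv", "episode-8bit.mkv")
```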
I recall participating a few years ago, along with a lot of others, in a crowd-sourced effort to categorize fuzzy pictures of possible galaxies. I think it was Galaxy Zoo.
So is this the result of our effort? Would be nice to know...
Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn.
My point was that mathematics is a science. That it is a formal science instead of a natural science is a different matter altogether.
Laptop users I've worked with tend to use cloud backup, which I encourage.
Guess what a laptop user does when he runs into the cloud backup service's storage cap: he cuts down the set of folders that get backed up. Expanding offline backup capacity doesn't carry an annual per-GB fee like the ones iCloud, Dropbox, and SkyDrive charge.
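A back-of-envelope comparison shows the incentive. The prices below are invented round numbers for illustration, not quotes from any of those services:

```python
# Back-of-envelope: recurring per-GB cloud fees vs. a one-time external
# drive. All prices here are made-up round numbers for illustration.
CLOUD_DOLLARS_PER_GB_YEAR = 0.50   # assumed annual cloud fee per GB
DRIVE_DOLLARS = 100.0              # assumed one-time cost of a 2 TB drive
DRIVE_GB = 2000

def cloud_cost(gb: int, years: int) -> float:
    return gb * CLOUD_DOLLARS_PER_GB_YEAR * years

# Backing up 500 GB for three years:
print(cloud_cost(500, 3))   # -> 750.0, and the meter keeps running
print(DRIVE_DOLLARS)        # -> 100.0, paid once, with 2 TB of headroom
```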