I was thinking that there must be some folks over at MIT with both cognitive science and software background. I also know for a fact people in the aero/astro department do human factors research; I even helped run some of it when I worked at the Center for Space Research (now the McNair Building).
How well do you know your neighbor?
And if your neighbor is hit by a bus, how well do you know your neighbor's heirs?
How much do you trust your neighbor's physical security?
And can you count on your neighbor being home if you need your data fast?
If you can live without your data for a few weeks, and your backups are encrypted, then walking your data over might not be such a bad thing. Having *two* offsite friends you park encrypted backups with would be better, and if they lived at least a few miles apart that would be even better (e.g. in case of a tornado, flood or large neighborhood fire).
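For what it's worth, the encryption step is trivial to script. Here's a minimal sketch in Python using the third-party cryptography package (the file names are made up, and gpg or any other symmetric scheme would do just as well):

```python
# Minimal sketch: encrypt a backup archive before it leaves the house.
# Requires the third-party "cryptography" package (pip install cryptography).
# File names here are hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()              # keep the key at home, NOT with the neighbor
with open("backup.key", "wb") as f:
    f.write(key)

with open("backup.tar", "rb") as f:      # the plaintext backup archive
    ciphertext = Fernet(key).encrypt(f.read())

with open("backup.tar.enc", "wb") as f:  # this is the copy the neighbor gets
    f.write(ciphertext)
```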
I was OK with 8.0. I've reached the point where I hate the new versions of everything (except maybe XFCE, which is pretty much pitched toward people who hate the new versions of everything). The reason is that all this struggle to revolutionize the user experience seems to have left behind the goal of making common tasks convenient for the user. Impressive but pointless seems to be à la mode these days, and designers appear increasingly incapable of distinguishing creativity from novelty.
But given that virtually everybody has caught this disease, I've pretty much given up on insisting that things make sense. I ask for only a few things: that a user interface be stable and respond consistently (consistency was a huge problem with Vista), and that I can figure out how to do what I need done after a few days with the system. Windows 8.0 fit the bill. It didn't make much *sense*, but it didn't crash and responded in a stable way, so I simply adjusted to its quirks.
Windows 8.1 doesn't fit the bill, because it doesn't respond consistently; it brings back some of the Vista experience of having the OS throw unpredictable little delays into your work. The file manager windows are especially bad (e.g. when you're ejecting a drive). How in Hades' name could someone screw up something like that in 2013?
And the 8.0 to 8.1 upgrade process was terrible. It was so poorly designed from an HCI standpoint that I was actually tempted to believe it was an abusive prank by a disgruntled MS employee.
MS these days is looking ever more like Lotus Software in its declining days. Despite being located in a city crowded with world-class universities, Lotus seemed utterly incapable of addressing even basic user interface problems except by pasting cheery-looking wallpaper over them. I used to reflect, when I passed their headquarters on Land Boulevard, that if they put up a billboard saying "We'll pay $100,000 to anyone who can fix our stupid UI problems," probably twenty or thirty people a day who could take them up on the offer would see it.
Use your special system architecture x-ray vision, folks. This is not a simple, stand-alone site like Slashdot that just has to do some database queries and generate some XML, then use jQuery or something to asynchronously load some advertising into a DIV. This is a system that must orchestrate a complex *synchronous* process involving servers that belong to outside organizations.
Case in point: the system requirements say that the site must exclude illegal immigrants, so the system has to request and obtain proof of your status from Homeland Security's servers before it can proceed. Also, instead of issuing the same subsidy to everyone, the law specifies an income-dependent, means-tested subsidy, which means the system ALSO has to check your claims against the IRS's computers before continuing. That's before it actually gets to obtaining the marketplace data.
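Just to make that concrete, here's a rough sketch (Python, with entirely made-up endpoints and field names -- I have no idea what the real interfaces look like) of the shape the request path has to take. Every external agency sits squarely between the user and their results:

```python
# Rough sketch of the synchronous request path described above.
# All endpoints, parameters, and field names are hypothetical.
import requests

def process_application(applicant):
    # 1. Verify immigration status with DHS before anything else can happen.
    dhs = requests.get("https://dhs.example.gov/status",
                       params={"ssn": applicant["ssn"]}, timeout=30)
    dhs.raise_for_status()
    if not dhs.json()["eligible"]:
        return {"status": "ineligible"}

    # 2. Check the claimed income against IRS records, because the
    #    subsidy is means-tested rather than flat.
    irs = requests.get("https://irs.example.gov/income",
                       params={"ssn": applicant["ssn"]}, timeout=30)
    irs.raise_for_status()
    verified_income = irs.json()["agi"]

    # 3. Only now can the site fetch the actual marketplace data.
    plans = requests.get("https://marketplace.example.gov/plans",
                         params={"zip": applicant["zip"],
                                 "income": verified_income}, timeout=30)
    plans.raise_for_status()
    return {"status": "ok", "plans": plans.json()}
```

If either agency's server is slow or down, the whole chain stalls while the user's session sits there waiting.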
So the most complex aspect of this system is essentially untestable short of a near-full-scale roll-out. Hey, IRS, can I try hosing down your servers with JMeter? Even if you could orchestrate the non-functional testing you'd want to do, you won't know how the system works until it's handling real data. It's not like you can shove in a test load equivalent to a thousand applications per hour, then another equivalent to ten thousand, and then draw a straight line that tells you how the system will perform at twenty thousand. There are some serious discontinuities in performance lurking, and the actual data submitted is likely to change things.
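Even a toy queueing model shows why the straight-line extrapolation fails. A sketch, assuming a purely hypothetical capacity of 20,000 applications per hour and the textbook M/M/1 latency formula:

```python
# Why linear extrapolation of load tests fails: even in the simplest
# queueing model (M/M/1), latency blows up nonlinearly as the arrival
# rate approaches service capacity. All numbers are illustrative.
def avg_response_time(arrivals_per_hour, capacity_per_hour):
    """Mean time in system for an M/M/1 queue: 1 / (mu - lambda), in hours."""
    if arrivals_per_hour >= capacity_per_hour:
        return float("inf")  # the queue grows without bound
    return 1.0 / (capacity_per_hour - arrivals_per_hour)

capacity = 20_000  # hypothetical applications/hour the system can service
for load in (1_000, 10_000, 19_000, 19_900):
    t = avg_response_time(load, capacity) * 3600  # convert hours to seconds
    print(f"{load:>6} apps/hr -> mean response {t:8.1f} s")
```

Going from 1,000 to 10,000 applications per hour barely moves the needle (0.19 s to 0.36 s); going from 19,000 to 19,900 multiplies the response time by ten. And that's the *simplest* possible model, with none of the real system's discontinuities.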
I think if I were in charge of this, the extreme difficulty of realistic non-functional testing might have led me to isolate some of the data interchange into a post-processing step. That is, I'd let people apply and take them at their word about their immigration status and income, then tell them to check back in a day while we confirm the data they submitted. It's more bureaucratic, but a big part of user experience is predictability. If someone knows they can complete their application in half an hour and come back 24 hours later for confirmation, it's not so bad. But if the system is designed to give them the expectation that they can finish in a half hour, but sometimes takes so long their sessions expire, that's a disaster.
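Something like this, in outline (the names and timings are hypothetical; this is a sketch of the design, not an implementation):

```python
# Sketch of the "take them at their word, verify later" design described
# above: the application is accepted immediately and the slow external
# checks run in a background worker. All names are hypothetical.
import queue
import threading
import time

pending = queue.Queue()

def submit_application(applicant):
    """Fast path: record the application and return right away."""
    applicant["status"] = "pending verification"
    pending.put(applicant)
    return "Application received; check back in 24 hours."

def verify_with_external_agencies(applicant):
    """Stand-in for the slow DHS/IRS round trips."""
    time.sleep(1)
    return "confirmed"

def verification_worker():
    """Slow path: talk to the agencies offline, at whatever pace they allow."""
    while True:
        applicant = pending.get()
        applicant["status"] = verify_with_external_agencies(applicant)
        pending.task_done()

threading.Thread(target=verification_worker, daemon=True).start()
print(submit_application({"name": "Jane Doe", "claimed_income": 40_000}))
pending.join()  # in this demo, wait for the background check to finish
```

The user's critical path is now just "record the application"; the slow conversations with DHS and the IRS happen at whatever pace those agencies can actually sustain.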
"Begging the question" refers to a logical fallacy in which you pose a question that can't be answered without assuming something that really ought to be proved first. The common example is "Have you stopped beating your wife?" It seems to call for a yes or no answer, but such an answer is only possible until we all agree that you've been beating your wife.
"Begging the question" is sometimes a result of sloppy thinking, but it can sometimes be an "intentional fallacy" -- a dirty trick, as when a prosecutor asks you, "How long have you been stealing from your employer?"
The term "begging the question" has never been clear in English; it's a literal translation of the Latin term "petitio principii"; it might better be called "assuming the conclusion". "Begging the question" in the logical-fallacy sense was always jargon, and thus not a first class member of the English lexicon in my opinion.
Anyhow, your reaction proves my point. The majority of even reasonably educated people now think "begging the question" means "raising the question", and so we have to accept that's one of its meanings. It's more practical to reform the dictionary than it is to reform the language. If it helps, this is a case of what linguists call metonymy. The fallacy of petitio principii does indeed involve raising a question -- the question of the implicit premise's truth. Over time that meaning has been broadened to include *all* instances of raising questions.
Initially, those who used "begging the question" to mean "raising the question" were just ignorant people trying to ape the writing of their more educated betters. But that's gone on so long there's no practical alternative but to grant that sense of the phrase naturalized citizenship. Still, there are some who will never accept it, so it's best to accept the use of that sense by others but not to use it oneself.
It didn't affect me directly because I was working on System V Unix and we weren't directly connected to ARPANet.
I remember thinking, "Gee, someone actually *made* one of those?"
The idea had already popped up in some 70s sci-fi stories, and I remember in the late 70s pranking was already fairly common on timesharing systems. As soon as people began to share systems pranksters began to fool around with them, creating "fork bombs" and "chain jobs". It was annoying for sysadmins, but I think it wasn't malicious. The people who did this stuff were fascinated with the edge cases, the things a system could be made to do that it wasn't designed to do; and, let's just say they weren't necessarily the most attuned to the needs and desires of others.
Since the idea of network-vectored malware had cropped up shortly after the idea of a networked world became commonplace (this was still sci-fi stuff in the 70s), people had been talking about the real possibility of such a thing in the 80s; there were even some academic papers on the notion. But our forward thinking was more focused on the positive things that networked computers would do. In the end I think most of us fell short on both counts. Most of us underestimated just how useful and ubiquitous networking would become, at least in our lifetimes. And although we knew network-vectored malware was a theoretical possibility, we had no idea what a major feature of the networked world it would become -- at least in our lifetimes.
In retrospect, the Morris Worm wasn't so remarkable. We'd already seen pranksters on timesharing systems. I called them "doorknob twisters": people whose curiosity and distractibility meant they couldn't walk down a corridor without taking a peek behind the closed doors. Often these were the best people; Ken Thompson even described putting hidden hacks in the C compiler in his Turing Award speech. And people had been talking about the possibility of network worms, albeit in sci-fi terms. Again in retrospect, something like the Morris Worm was bound to happen, probably within the next two or three years.
The Morris Worm is remarkable because it was our introduction to the unpredictability inherent in the scale of the network world. Just a tiny miscalculation was enough to turn an intellectual curiosity into a widespread disaster.
Yeah, the word "data" is plural and "impact" is not a synonym for "affect".
When you insist that the vast majority of a language's speakers are speaking wrongly, you're just spitting into the wind. I've even given up on "begging the question". I use it "correctly" myself, but it's far past the point where only morons use it to mean "brings up the question".
Rather uncalled for. Here, have a virtual chocolate for your virtual PMS. Skype is a failure that Microsoft is trying to milk to death to justify the huge lump of cash they forked out for it. It's a brand name everyone knows, but one that now has no actual substance. Their prices for international calls are NOT competitive, their quality of service sucks, and their business model depends on the rest of the world keeping POTS instead of moving to VoIP (which everyone is doing), because Skype computer-to-computer calls are "free". Well, I can get that with any other VoIP app too; what makes Skype special here?
The only people who still use Skype are the ones who don't know any better, or the ones who are forced to by OS/app design or business requirements. I say it's only a matter of time before no one uses Skype. Thus, Microsoft killed the brand.
Factorials were someone's attempt to make math LOOK exciting.