I ended up switching to Opera after retrying it a few months ago -- it has a whole slew of formerly Firefox-only extensions now (AdBlock Plus, Stylish, Greasemonkey, LastPass, etc.), I haven't run across any site incompatibilities yet, and it's a hell of a lot faster than Firefox ever ran for me.
In most distros it's "open the GUI package manager, type your password, search for 'java', pick the one that says 'java plug-in', hit 'apply'" -- using the command line hasn't been required in at least a few years. That GUI method is no more difficult to learn than the method for installing it on Windows or OS X, and for some users, like my mother, it's a lot easier: they don't need to know where to download the file from, which file to get, where to save it, or remember where they saved it or what it's named.
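For comparison, the command-line route that supposedly scares people off amounts to a couple of commands on Debian/Ubuntu-family distros -- a hedged sketch, since the package providing the "java plug-in" entry varies by distro and era; `icedtea-plugin` was the common name at the time:

```shell
# Hypothetical Debian/Ubuntu example; the exact package name varies by distro.
sudo apt-get update
sudo apt-get install icedtea-plugin   # the package behind the "java plug-in" entry in GUI tools
```

Both the GUI path described above and these commands pull the same package from the same repositories; the GUI is just a front-end for them.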
The fact that government "debt" isn't the same as personal debt is one of the most crucial things that people need to be aware of, especially when it comes to voting -- yet precious few Slashdotters have a damned clue about it. It's pretty messed up given how many people here claim we should restrict the right to vote to "knowledgeable" citizens in order to ensure good results...
maybe microsoft should copy gnome-2 or gnome-3 or kde-3
at least i knew where everything was, and i wasn't searching all over menus and the control panel and right-clicking on everything trying to find where they hid some feature i wanted to change
I can see someone saying that about GNOME 2 (MATE) or KDE 3 (Trinity), but from everything I've heard, one of the reasons people are rebelling against G3 is because most of the great features of past versions are hidden or missing.
I think the KDE 4 team's attitude is probably the best thing Microsoft could copy. The team actually listened to the horde of users listing the massive flaws in the early KDE 4 releases, worked hard to address them while keeping the things people praised, and was back on track in under two years. That's all MS really has to do -- and with the resources & skilled employees it has at all levels, it should be able to hit that same point in six months at most. (Especially if they borrow the good parts of KDE 4 and Linux's other environments, as I've heard they did in creating Windows 7.)
Yet there is far more than enough anecdotal evidence to bring it into question, and enough industry push to hide the question.
People that believe in alien abductions, psychics, etc. say the exact same thing...
I've tried sugar-based Coke, but it's nauseatingly over-sweetened (like liquefied Sweet Tarts), and I've encountered quite a few other people who reacted the same way. (Quite a few who were 8-15 years old in 1985 have mentioned that they loved the hyper-sweet version as little kids, but were surprised to find it revolting as adults.) It all depends on how sensitive the person is to sweet things; back when Coke was still sugar-based, people like me drank diet sodas or were strong fans of a less-sweet cola like Pepsi.
There's a pretty decent chance that one of the reasons HFCS Coca-Cola resulted in a permanent surge in popularity is that it's palatable to a much wider range of tastes than the sugary type. My guess is that any switch to sugar-based Coke will be short-lived: they'll see sales drop as the more sensitive people promptly switch brands, even if they've been die-hard Coke fans for as long as they can remember.
Actually, ObamaCare's model dates to a Bush I/Clinton-era plan. Bush submitted the individual mandate as a proposal, then in 1993 congressional Republicans tried to pass a variant as the Health Equity and Access Reform Today Act, which was -- like ObamaCare -- an individual mandate with penalties for non-compliance. (See the third paragraph of this section in Wikipedia's ObamaCare article.) Romney's plan was considered important because it was the first time such a plan was actually enacted, demonstrating that something like it could be attempted without causing total disaster.
The plan itself has always been conservative-corporatarian in nature, though. The reason it has been picked up by the Democrats is essentially that politicians in general have become increasingly conservative & capitalist over time, so the Democrats of today tend to be very similar to average Republican politicians of ~2 decades earlier. When it comes to Romney, he seems to follow the same path that most politicians do regardless of party: aiming for what they perceive as centrism when elected at the local or state level, then shifting strongly toward conservative-corporatarian positions soon after being elected at the Federal level.
I don't think offline devices are nearly as useful as online ones, and by the time you've found a place that's capable of using them, you'd really be better off lobbying the government and local telcos to build a tower as well. I'm not just speculating about this, by the way -- I've spent the last decade working in the developing world on exactly this sort of problem.
I'm not quite clear on the above -- do you mean that:
A. it would be more reasonable to wait years for the telecom infrastructure to become available and then go straight to Internet-capable devices (as opposed to offline devices right away)
B. Internet-capable devices are preloaded (e.g. with Wikipedia), so it's better to get them now as it will eventually be possible to fully utilize their abilities, as opposed to spending on a wave of offline devices followed by online ones
C. Internet-capable devices aren't preloaded, but better to get them for the features they do have as they'll be more useful down the road
I defer to your experience, but I ask because in cases A & C it seems to me that any substantial delay would harm the educational & skills development of the kids left waiting, and "A" would leave some kids reaching adulthood without ever getting their chance.
My impression is that the TrueType guys obsessed about file size.
It'd make sense: hard drives & RAM were still very limited in size/speed, so most programmers tried hard to conserve space AFAIK.
Facebook actually solves a particularly tangible problem -- how to casually communicate with a broad set of people in an easy way.
Email had already solved that, plus users didn't have to all be on the same network, use a particular client, give up their privacy, and so forth.
*First world people* fear snakes, with the bible (Gen 3:1-5) being a significant cause IMHO.
The Bible might be a major cause in highly-religious areas, but not in the rest of the country; it would be extremely unusual out on the US West Coast where I live, for example. (I've never known anyone that took religion *that* seriously; the closest I can think of was a hardcore Irish Catholic great-aunt born in the 1920s that would have been insulted enough to call me an idiot if I even asked whether she found snakes scary due to the Bible.)
In countries where seriously venomous snakes exist, they are venerated as holy animals
We have a few snakes that are venomous enough to be deadly to adults if antivenin isn't administered, and that are extremely dangerous for kids or seniors -- for example, the Eastern Diamondback Rattlesnake has a 10-30% fatality rate if untreated. The reason the more dangerous snakes here don't kill very often is that good antivenin has been developed, and improvements in roads/vehicles mean it's usually possible to get a victim treated in time. AFAIK, everyone I know who's afraid of snakes is afraid primarily because we were warned about the deadly ones as kids and don't trust our ability to accurately distinguish the dangerous kinds from snakes that merely look similar.
But there have been $200 Android tablets for years; the challenge being discussed is to create a functional sub-$200 laptop.
...crappy screens (ye olde 800x480), tiny RAM (2GB or so if you're lucky), and minuscule hard drives (8GB SSDs)...hard drive is probably $50 for 500GB...
With a polished+supported OS and an 80GB drive, at $200 it'd work for a lot of people, either as a primary system if they're poor or a secondary/work-only one if they're not. I'm speaking firsthand: my single-core 2GHz ThinkPad T43, which I finally upgraded to 2GB of RAM today, has a 60GB hard drive and a 1024x768 14" screen, runs SimplyMepis 11 Linux (currently using 4.8G + 1G swap), and does everything I'd like it to do.
My laptop's specs give a good idea of what a manufacturer could get away with when creating a polished Linux-based laptop. The OS and most Linux programs don't take up much room, so even an 8-12GB SSD (or a 30GB HD to be generous) would be fine, and an SD/microSD card reader would let users take on the cost of additional storage based on their needs. If the timing's right, the company could take advantage of everyone else pushing toward super-high resolutions by buying WXGA or XGA screens at a huge discount.
I don't know the OS costs, so it's hard to comment much on them -- but there are at least a few computer repair/building services out there selling PCs they've set up with very newbie-friendly Linux distros, and they've had a lot of very satisfied repeat customers, which suggests it's possible to pull off; seeking out those successful geeks and finding out their "secrets" might be the wisest approach. The most important thing, I believe, would be to make sure customers know the computer won't run Windows, so there's no confusion/shock when they go to use it (as with the netbooks a few years ago). Hell, with word out now that Windows 8 is a giant clusterfuck, it shouldn't be hard to market the fact that the OS isn't Windows as a desirable thing.
The metaphor goes even further: HTML was modeled after HyperCard, which was designed as a programming environment geeks & non-geeks alike could use at their respective ability levels -- newbies/non-geeks could quickly learn to code the equivalent of a hand-written personal website (linked text+multimedia), while experts/geeks made professional standalone applications like today's major shopping sites or (literally) Myst. While HTML can't easily be used to hand-write a "good" site anymore, it's still very possible for a regular user to rapidly learn the basics, keeping it, on some level, a "language" potentially common to all human beings rather than one specific to the subset inclined towards programming.
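To make the "hand-written personal website" level concrete, here's a hedged sketch of the kind of complete page a newcomer could write after an afternoon with HTML basics (the page title, file names, and link are made up for illustration):

```html
<!-- A minimal, complete personal page: linked text plus multimedia,
     hand-written with no tooling beyond a text editor. -->
<html>
  <head>
    <title>My Home Page</title>
  </head>
  <body>
    <h1>Welcome to my page</h1>
    <p>Here is a photo of my cat:</p>
    <img src="cat.jpg" alt="my cat">
    <p>A site I like:</p>
    <a href="https://en.wikipedia.org/">Wikipedia</a>
  </body>
</html>
```

A dozen tags like these are the whole vocabulary needed at that level, which is what made the language accessible to non-programmers in the first place.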
Er, no. Witch hunts don't come from anonymity, or there wouldn't have been any back when everyone lived in little towns where just about anyone else could instantly & easily identify them.
Witch hunts often come about for the opposite reason, in fact: seeing others around them (particularly people they respect) target individuals or groups places immense psychological pressure on a person, and because the "us vs. them" attitude means that even appearing to sympathize or disagree risks being targeted as "one of them," the vast majority of people will go along with it. Even without the threat of physical harm, being socially ostracized or looked down upon by people who know them (particularly ones respected by the community) is stressful enough to impel most to cooperate, particularly if the group includes individuals the person wishes to be respected by and/or who are higher in the social hierarchy.
In person, people gossip and speculate about others they know with friends, co-workers, and others they're on a first-name basis with, often despite knowing that the "information"-sharing could or will cause grief for the person down the road; if a friend/co-worker starts speaking negatively about someone else, again, most people will refrain from speaking out against it for fear of either hurting their standing with that person or becoming the next target. Likewise, it's highly common for kids to gossip about, knowingly lie about, or even bully classmates as a way of bolstering their place in the social hierarchy, and very few kids will speak out against someone their age engaging in that kind of behavior.
The idea that anonymity leads to anti-social behavior online was never more than an untested theory the media picked up on and ran with, not a well-established psychological reality. The few studies that have been done (including examinations of forum data) indicate that requiring real names eliminates only a tiny percentage of vicious posts and trolls. That's because the vast majority of people who troll individuals or groups feel there's absolutely nothing wrong with their behavior -- very often they're even proud of it. They'll say the same things whether they're logged in under their real name, a pseudonym, or anonymously, which is why *Facebook* has a huge problem with bullying & trolling by people using their real names. (Think about all the times you've seen people express smug pride in being "politically incorrect" by needlessly using slurs, or by carrying on after somebody politely says it's hurtful to them & others -- the same callousness appears on Facebook and other places where people post under their real names.)