Want to know a big reason people have been getting Macs, one Apple doesn't like to admit? You can run Windows on them now. The Intel switch made it viable, natively if you want, and good virtualization tech means it runs fast inside OS X. That lets people get their shiny status symbol but still use the programs they need.
We've seen that at work (an engineering college). Prior to the Intel conversion, there were almost no Mac users. The thing is, engineering software just isn't written for the Mac. There is some stuff now, but even so the vast majority is Windows or Linux, and back in the PPC days there was almost nothing. So we had really only two stubborn faculty who used Macs: one because he did no research and just played around, and one because he wrote his own code and was stubborn. That was it; you just couldn't do your work on them.
Now? All kinds of faculty and students have Macs. PCs are still dominant, but we see a lot more Macs, and every one has Windows on it. For some, it is all they have. Seriously, we have two guys who buy Macs but have us install Windows on them; they don't use OS X at all, they just want the shiny toy. A number use Boot Camp, and many use VMware. Regardless, I've yet to see one, faculty, staff, or student, who didn't put Windows on it to be able to do the work they need to.
So that is no small part of how Intel helped Apple gain market share.
As most users have already defected to other distros, it was not worth the effort!
If any partition is more than 50% full, you had best have a plan for what to do next, even if it won't need to be done for two years. If you don't plan two years ahead, you should not be running a server. If you use SCSI disks on Unix, it's easy to add more hard disks. If you are not using SCSI hard disks, well, presumably the data was not very important anyway.
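The 50%-full check above is easy to automate. Here is a minimal sketch in Python using the standard library's `shutil.disk_usage`; the mount points listed are placeholders, only the 50% threshold comes from the comment above.

```python
import shutil

def usage_pct(path):
    """Return the percentage of the filesystem at `path` that is in use."""
    total, used, _free = shutil.disk_usage(path)
    return 100.0 * used / total

# Hypothetical mount list -- add /home, /var, etc. to match your server.
for mount in ("/",):
    pct = usage_pct(mount)
    if pct > 50:
        print(f"{mount}: {pct:.0f}% full -- time to plan the next step")
```

Run it from cron and you get your two-year warning for free.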
Did I mention the lawn?
Speed matters less with each step up. Going from a modem to broadband is amazing; going from something like 256 kbps DSL to 20 Mbps cable is pretty damn huge; going from 20 Mbps cable to 200 Mbps cable is nice but fairly minor; and going from a few hundred Mbps to gigabit is hardly noticeable.
I have 150 Mbps cable at home and get what I pay for: games from GOG and Steam download at 18-19 MB/s. It is fun being able to download a new game in minutes, but outside of that I notice little difference from the 30 Mbps connection I stepped up from. Streaming worked just as well before, web surfing was just as fast, etc. The extra speed matters little to none in day-to-day operations.
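The arithmetic behind those numbers checks out: 150 Mbps divided by 8 bits per byte is 18.75 MB/s, right in the observed 18-19 MB/s range. A quick sketch (the 20 GB game size is an assumed example, not from the post):

```python
def mbps_to_MBps(mbps):
    """Convert line rate in megabits/s to megabytes/s."""
    return mbps / 8

def download_minutes(size_GB, mbps):
    """Minutes to fetch size_GB gigabytes at the given line rate."""
    return size_GB * 1024 / mbps_to_MBps(mbps) / 60

print(mbps_to_MBps(150))          # 18.75 MB/s, matching the observed speed
print(download_minutes(20, 150))  # a hypothetical 20 GB game: ~18 minutes
print(download_minutes(20, 30))   # same game on the old 30 Mbps line: ~91 minutes
```

Which is exactly the point: the upgrade turns a wait into a shorter wait, not a different experience.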
Same thing at work. I'm on a campus with some pretty hardcore bandwidth, as campuses often have; so much that it is hard to measure, since the speed-test site is usually the limit. It is nice for downloading large stuff, though really not that much less time than at home, and I don't mind the difference between a 2-5 minute wait and a 15-20 minute wait for a program. Surfing, streaming, etc. are 100% the same, no difference at all; speed seems limited by waiting for all the DHTML crap on a site to render, not by the data download.
While geeks get all excited about bigger, better, more when it comes to bandwidth, for normal use what matters is just having "enough," and "enough" turns out to be not all that much. It'll grow with time, of course: higher-res streaming, larger programs, and so on will demand more bandwidth. But this idea that there is a difference between uber-fast Internet and just regular fast Internet is silly.
It will not create any meaningful divide.
You must be new here!
I particularly dislike the fact that it doesn't notice you've left the roundabout until you've already entered the next one. It is positively dangerous when you have to go round a roundabout twice for it to catch up! (In a 40-ton rig.)
And that's on a Note 3, when it used to work well on an HTC Desire Bravo!
Come on, Google - you need to test software before you release it - you are not Microsoft!
Even in the US, such an amount wouldn't be a tax in the sense of raising revenue but an attempt to stifle usage; that is a lot per GB even at US income levels. In Hungary, given the lower incomes, it is even more restrictive: for sure an attempt to stifle usage, not a legitimate revenue measure.
Oh, they don't leave the house! That explains it!
Most people who leave their mother's basement for more than the above-mentioned few hours!
I mean, sure, if you play heavy games a lot then maybe this matters, but most of your battery goes to standby and the cell radio. I've got my Note 3 lasting 3-4 days on a charge. How?
1) Turning off background services that slurp up battery. It just took some time with the battery monitor, then considering what I needed and didn't.
2) Turning off additional radios like Bluetooth and GPS when I don't need them. It doesn't take long to hit the button if I do, and even when they aren't doing things actively they can sip some juice.
3) Having it on WiFi whenever possible. In good implementations on modern phones it uses less power than the cell network. Work has WiFi and I have a nice AP at home so most of the time it is on WiFi.
4) Using WiFi calling. T-Mobile lets you route voice calls through WiFi. When you do that, it shuts down the cellular radio entirely (except occasionally to check on things) and does all data, text, and voice via WiFi. It uses very little juice, and an hours-long call only takes a bit of battery.
The WiFi calling thing has been really amazing. When you shut down the cellular radio, battery life goes way up, not just in idle but in use. Before the feature worked (T-Mobile was having trouble with it when I first got the phone), standby life was good, though not as good as it is now, but talk time seriously hit the battery; two to three hours could drain it almost completely. Now? I can do that, no issue, and still have plenty left.
Assuming I could pronounce it.
It used to work quite well for me, till I discovered where to buy a *BSD CD.
Unity killed Ubuntu. I am busy migrating the family from Ubuntu to Mint.
"People in the past were wrong about what is possible, so clearly the naysayers are wrong about my thing!" See how stupid that logic looks? Trying to argue that cold fusion must be possible because people have been wrong about things in the past is arguing crosseyed badger spit. It is a nonsensical argument used by con men to deflect from their BS.
Here's the thing: With all these technologies that actually exist (#4 doesn't) you see two important things:
1) They are actually available to look at, in a non-controlled environment. You can verify them yourself, without some "researchers" standing over your shoulder telling you what you can and can't see and what you can and can't touch. It is easy to verify that they are real.
2) You can have the theoretical basis for how they operate explained to you, and that is consistent with our understanding of physics, chemistry, and so on. There's no hand waving, there's just science.
So when cold fusion hits that point, call me. When someone can say "Here is how this device works on an atomic/quantum level and why it is actually a fusion process," and when these claims are examined and confirmed by reputable labs at universities, where the researchers are given a device and allowed to do what they please with it, then I'm interested. Until then, STFU.