Vote this AC up because this is EXACTLY the kind of crap Disney has pulled in the past.
I've worked with MaxMind's products for mobile IP geolocation - as the guy says, it's pretty useless. If the user is on wifi it's not too bad; at least the IPv4 data could pretty reliably get the state and often the city. I never had any luck with IPv6, although they claim to support it better now.
The big kicker is if the user is on cellular - at least in the US most cell networks are natively IPv6, and they tunnel connections through giant NAT devices. This leads to two interesting effects. First, the IPv4 address you see on the server is located at some random data center, usually on the other side of the country from the user. Second, the IP (and therefore the data center) keeps changing - sometimes multiple times within a few minutes. Doing any kind of tracking leads to a device which appears to keep hopping back and forth between California and Kansas.
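One way to at least detect that you're seeing carrier NAT exit points rather than real movement is a sanity check on the implied travel speed between successive geolocation fixes. This is a minimal sketch (not from the original post) - the coordinates and the 900 km/h threshold are illustrative assumptions, roughly "nothing short of an airliner":

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def implausible_jump(fix_a, fix_b, max_kmh=900.0):
    """fix_a/fix_b are (lat, lon, unix_seconds) tuples from successive
    IP-geolocation lookups. Returns True if the device would have had
    to travel faster than max_kmh - a strong hint you're looking at
    carrier NAT data centers, not the device itself."""
    lat1, lon1, t1 = fix_a
    lat2, lon2, t2 = fix_b
    hours = max(abs(t2 - t1), 1) / 3600.0
    return haversine_km(lat1, lon1, lat2, lon2) / hours > max_kmh

# A "device" that hops from California to Kansas in five minutes:
ca = (37.77, -122.42, 0)    # San Francisco
ks = (39.10, -94.58, 300)   # Kansas City, 5 minutes later
print(implausible_jump(ca, ks))  # → True
```

In practice you'd still need a policy for what to do with flagged fixes (discard, or fall back to the last plausible location), but filtering like this keeps the NAT hopping out of your tracking data.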
This Microsoft Research whitepaper talks more about these issues.
(and before anyone jumps on me for the privacy implications of trying to do this - in my specific case it was tracking devices in an enterprise environment for security purposes and everyone involved had given informed consent)
Why do they have to be exclusive options? I backup locally to a server under my desk, and remotely to the cloud. In the (more likely) event of an HDD failure I can restore as fast as my server can spit the data back out and be up and running in a few hours. In the (less likely) event of a catastrophe like a fire it might take a while to restore everything, but at least it's not gone forever (and if I'm willing to pay they'll FedEx me all my data on a drive). If the cloud provider goes bust I still have my local backup and I can switch to a new offsite provider.
FWIW I pay around $12 a month for unlimited offsite storage (and currently use maybe 4TB) - this is with CrashPlan. If you have anything remotely valuable it seems like an obvious thing to do for a little more peace of mind.
He wrote some software, you weren't charged for it, and its existence doesn't affect anyone. Your anger, if it exists, should be directed at those forcing you to use it - who are "no one" or "your distro maintainers" depending on your POV.
Our memory usage scales with load. Our load scales with usage. Predictions about growth in the popularity of our product are all very well, but they're no excuse for not monitoring for impending doom.
Of course. But testing will tell you something like "a single instance with a 32GB heap will support 9000 tx/sec with acceptable 99.9% latency". So you can monitor traffic levels and scale out as appropriate well before something monitoring GCs starts seeing problems. Where I work we deal with request rates in the 100k/s range and so if things go wrong they do so very fast - the trick is to know the limits and stay well away from them!
(especially since we have some legacy code that doesn't scale horizontally and so we have to keep throwing more memory at the problem for those services until we can fix that).
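The "know the limits and stay well away from them" approach boils down to simple arithmetic on load-test numbers. A minimal sketch, reusing the 9000 tx/sec-per-instance figure from above as an example (the 40% headroom fraction is my own illustrative choice, not from the post):

```python
def instances_needed(current_tps, per_instance_tps=9000, headroom=0.4):
    """How many instances to run so that measured traffic stays well
    below the load-tested per-instance limit. per_instance_tps is the
    figure load testing produced; headroom is the fraction of that
    capacity you refuse to use, so GC trouble never gets close."""
    usable = int(per_instance_tps * (1.0 - headroom))  # 5400 tx/sec here
    return max(1, -(-current_tps // usable))           # ceiling division

print(instances_needed(100_000))  # → 19 instances at 100k req/s
```

The point is that this check runs off a traffic counter you already have, long before any GC-level symptom appears.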
Tracking the frequency/duration of full collections is the usual approach. The GC has to work harder as heap space runs out: a system which is tight on memory will do frequent full GCs, versus one which is running with plenty of headroom. In particular if you're using G1, seeing full (single-threaded) GCs at all is a bad sign. I'd also do this out of process, either by monitoring via JMX or simply scanning GC logs. A process trying to monitor itself rarely works out well.
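A log-scanning monitor can be very dumb and still be useful. Here's a minimal sketch; the regex covers the unified-logging form ("Pause Full ...") used since JDK 9 and the older "[Full GC ..." form, but GC log formats vary with JVM version and flags, so treat the pattern as an assumption to verify against your own logs:

```python
import re

# "Pause Full" appears in JDK 9+ unified logging (-Xlog:gc); "Full GC"
# appears in pre-JDK-9 logs. Adjust for your JVM version and flags.
FULL_GC = re.compile(r"Pause Full|Full GC")

def count_full_gcs(log_lines):
    """Return the number of full (stop-the-world) collections seen in
    an iterable of GC log lines. With G1, any non-zero count over a
    monitoring window is worth an alert."""
    return sum(1 for line in log_lines if FULL_GC.search(line))

sample = [
    "[2.345s][info][gc] GC(3) Pause Young (Normal) (G1 Evacuation Pause) 24M->12M(256M) 4.2ms",
    "[9.871s][info][gc] GC(4) Pause Full (G1 Compaction Pause) 250M->180M(256M) 312.0ms",
]
print(count_full_gcs(sample))  # → 1
```

Run it from a separate process tailing the log file, consistent with the point above about not having the JVM monitor itself.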
The better garbage collector for servers (G1) never pauses the world to free everything it can, so it's not like you can look at post-collection heap size or anything.
It's an over simplification to call G1 "the better collector for servers", it's more complicated than that - and G1 certainly can do a stop the world, it just tries to avoid it.
I'd also say this - if you're capable of writing C++ without any resource leaks you're capable of writing Java without any resource leaks. In which case memory usage will be predictable and simple load testing will show you how big a heap you need to allocate.
Except that's entirely untrue. You may wish it were, but it is not. I don't have an HOA at my house but there are myriad laws (federal, state & local) which restrict what I can and can't do in and to my house.
I wish I had upvotes for you.
I am a power user. I'm currently surrounded by two very powerful PCs... rather, a high-end 'docked' Mac laptop dedicated to development work, and a Frankenstein's-monster BYOC dedicated to gaming, watching and converting video (anime junkie), and artwork.
I also own a little Samsung Android tablet. Despite the mobile development workstation, I use the ever-loving snot out of that tablet. I use it to watch video I've converted for it, read books and magazines, browse the web while seated in my nice club chair in the living room, have a reference site up while console gaming, and do art. Turns out that Autodesk has a VERY nice painting app for $6. Works beautifully with cheapy capacitive styluses.
I consume the vast majority of my Crunchyroll subscription on it (more anime and manga).
However, I don't use it at ALL for email.
So yeah, mobile matters.
Tetra-chromats are by definition female. You need two X chromosomes to carry the two different genes that code for the two slightly different pigments.
It's an axiom that there are no women on the internet.
Ergo, there are no tetra-chromats on Slashdot.
It's not about average usage, it's about instantaneous usage. Most of the time my connection is pretty idle, but when I want to download something big (e.g. multiple gigs) I don't really want to wait around for it. That's what I'm paying for - not having to wait.
Don't forget storage. Bandwidth is one thing, but image storage is a big deal for sites like FB. They often store multiple copies of each one (e.g. at different sizes) and then you also have copies cached on CDNs etc, which also costs money. 5% isn't going to make or break the company, but it's worth investigating.
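A back-of-the-envelope calculation shows why even 5% matters at that scale. Every number below is an illustrative assumption (average photo size, rendition count, replication factor, upload volume), not actual FB figures:

```python
def daily_storage_tb(uploads_per_day, avg_mb=2.0, variants=4, replicas=3):
    """Rough daily storage growth. Assumes each upload is kept at
    `variants` sizes (smaller renditions counted at half the original's
    size) and replicated `replicas` times for durability. All
    parameters are made-up illustrative defaults."""
    per_image_mb = avg_mb + (variants - 1) * (avg_mb / 2)
    return uploads_per_day * per_image_mb * replicas / 1_000_000  # MB -> TB

# Hypothetical 350M uploads/day: how many TB/day does a 5% cut save?
base = daily_storage_tb(350_000_000)
print(round(base - daily_storage_tb(350_000_000 * 0.95), 1))  # → 262.5
```

Hundreds of terabytes per day of avoided growth, before counting the CDN cache copies, is the kind of number that makes an investigation worthwhile even if it doesn't make or break the company.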
Which in turn would mean that, for the problem space it's capable of operating within, it's no faster than a normal computer. Which reduces down to "it's no faster than a normal computer."
Door opening: See above re: neighbor or friend, or hide a key somewhere.
A truly special reply suggesting mitigating a theoretical, limited, network security vulnerability by quite literally leaving the physical keys to the castle out in public. Please hand in your risk assessment credentials at the door.
Or you pay a couple of bucks and complain later. The fact that this scenario has never happened to me in years of riding the subway makes me quite happy to take the $2 charge every few years to avoid dealing with the police.
I get over 100 Mbps on FiOS right now. I've frequently maxed out my 150 Mbps connection pulling from a single server (well, single URL), particularly if I use a download manager which opens a few connections. It's true you don't usually see those traffic levels in normal browsing, but for large file downloads it's not hard.
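The multi-connection trick download managers use is just HTTP Range requests: split the file into byte ranges and fetch the pieces in parallel. A minimal sketch of the splitting step (the actual parallel fetching and reassembly are left out):

```python
def split_ranges(content_length, connections=4):
    """Split a file of content_length bytes into per-connection
    'Range: bytes=start-end' header values (ends are inclusive, per
    the HTTP spec). A download manager sends one such header on each
    of its parallel connections to the same URL."""
    chunk = content_length // connections
    ranges = []
    for i in range(connections):
        start = i * chunk
        # Last chunk absorbs any remainder from the integer division.
        end = content_length - 1 if i == connections - 1 else start + chunk - 1
        ranges.append(f"bytes={start}-{end}")
    return ranges

print(split_ranges(1000, 4))
# → ['bytes=0-249', 'bytes=250-499', 'bytes=500-749', 'bytes=750-999']
```

This only helps if the server advertises `Accept-Ranges: bytes` and answers with `206 Partial Content`; otherwise you're back to one connection per URL.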