Comment Re:Who has enough money to buy IBM GTS? (Score 1) 81
> That's the mainframes, power systems, research, and what else?
None of it is part of GTS, cringely doesn't even know how the company is currently organized.
The analyst said that Twitter's data quality is "horrible". Chowdhry said that many pollsters used Twitter data to predict a Hillary Clinton win in the U.S. election but the fact that Donald Trump won shows that data quality is poor. One reason for this is too many fake users on the platform, Chowdhry claims.
Twitter has had issues with monetization, but the idea that the platform is somehow flawed because some idiot used it as a source for polling is nuts. You can't determine an election from reading tweets.
Twitter differentiated itself from other social sites by embracing simplicity and mobile. That simplicity has also hurt it: the company keeps failing to expand the platform beyond tweets, and with user growth stagnant, it makes for a poor growth stock.
I also have some trouble with the assertion that "it is very common to use gzip at the HTTP level." For static assets, sure; for dynamic content, I expect the numbers to tell a much different story.
It's in fact very common for dynamic content.
It can't be in the header, since headers are not compressed in HTTP.
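One reason gzip is common even for dynamic content is that the typical dynamic payload (JSON, HTML) is highly repetitive text, which compresses very well. A minimal sketch with Python's standard library, using a made-up JSON API response as the "dynamic" payload:

```python
import gzip
import json

# Hypothetical dynamic payload: a JSON API response. The field names repeat
# for every record, which is exactly what gzip exploits.
payload = json.dumps(
    [{"id": i, "name": f"user{i}", "active": i % 2 == 0} for i in range(200)]
).encode("utf-8")

# This is what a server does before sending "Content-Encoding: gzip".
compressed = gzip.compress(payload)

# The body shrinks substantially; the headers themselves stay uncompressed.
print(f"raw: {len(payload)} bytes, gzipped: {len(compressed)} bytes")
```

Note this only compresses the body: the request/response headers go over the wire uncompressed in HTTP/1.x, which is why compression negotiation happens via the `Accept-Encoding` and `Content-Encoding` headers rather than inside them.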
Or he's just using the assumed units, like we always do in these discussions.
That's certainly possible, just like it's possible he tolerates 700 kilobits/second on what he thinks is 21 megabits per second and hasn't pursued it as some acute problem with his service, or just gone elsewhere.
However, you did seem to avoid the question of why the company should be able to sell a connection that they seem to never be able to come close to meeting.
TFA says they are able to come close for a random sampling of users, for what they claim to be a maximum speed for some class of connection. When I see someone post about "21 meg or some shit" and claim to be tolerating 700 kilobits/second in the early evening, I assume they're mistaken or haven't behaved rationally in resolving it.
I'd like to know where they tested Charter at. If you're in a relatively sparse area they're great, but here in Madison, WI, they fucking suck. I have "21 meg" or some shit and at most I pull down between 2 and 5. Between the hours of 5 and 7 or 8 o'clock in the evening, it's damn near unusable because everybody in the city comes home and starts streaming Hulu and Netflix and I'll be lucky to pull down 700k, and the latency spikes like you wouldn't believe. The techs themselves tell me never to expect to hit the speeds I'm told I'll get, because that's not "real-world use."
So if I'm never going to get that speed in practical application, why again are they allowed to advertise said speed?
Given the ambiguous units in your post, it sounds pretty likely that your expectations are inflated by a factor of 8 because you're misunderstanding the units of what's advertised vs. the units of what you're observing.
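The factor-of-8 point works out like this (my arithmetic, assuming the "21 meg" plan is advertised in megabits per second, as ISP plans typically are, while download meters usually show megabytes per second):

```python
# Advertised rate, in megabits per second (the usual unit for ISP plans).
advertised_mbit = 21
# The same rate expressed as the units a download meter shows: 8 bits = 1 byte.
advertised_mbyte_per_s = advertised_mbit / 8  # 2.625 MB/s ceiling

# If the observed "2 to 5" is in megabytes per second, convert back to
# megabits per second to compare against the advertised figure.
observed_mbyte_per_s = 5
observed_mbit = observed_mbyte_per_s * 8  # 40 Mbit/s

print(advertised_mbyte_per_s, observed_mbit)
```

So pulling "between 2 and 5" megabytes per second on a 21 megabit plan would actually be at or above the advertised rate, not a quarter of it.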
"The great question... which I have not been able to answer... is, `What does woman want?'" -- Sigmund Freud