Link to Original Source
But while that may not come as any great surprise, the map reveals a startling bigotry coursing beneath our preconceived notions of just where in the US hate is harbored most. Americans, it turns out, are racist and homophobic and ableist, and apparently vocal enough about it to spout off bigotry on social media, in no real discernible pattern; it's often where we least expect bigotry that we find it rearing its ugly head.
The visualization comes by way of Humboldt State University's Dr. Monica Stephens and the Floating Sheep--the same group that made a map of post-election Twitter hate speech. It comprises 150,000 geocoded hate tweets flagged between June 2012 and April 2013 for including the word "chink," "gook," "nigger," "wetback," "spick," "cripple," "dyke," "fag," "homo," or "queer." At first blush it's awfully depressing, a real day ruiner, or worse. Click around and most slurs--not all, but most--see the continental US pocked by deep reds, the research team's translation for "most hate." Jesus Christ. Is it 2013? It can't be 2013.
the admin who won't call a terrorist attack a terrorist attack simply because it goes against his political agenda?
You are likely referring to the Boston bombings. As I understood it, Obama didn't use the term "terrorist" specifically ON the day of the bombings, but has ever since. I'd say this is simply him doing his due diligence and not jumping to conclusions, as at the time no one knew whether the explosions weren't simply a gas line going up. If anything, I'd want more politicians and news stations taking a deliberate and thorough approach to things, rather than going all reddit on us, pointing fingers and making sensationalist claims. To each their own, eh?
Students have it good when it comes to matlab -- you can get a student version of matlab + simulink (with 10 or so toolboxes) for $99. The people who are really hurt by matlab's pricing scheme are the hobbyists who don't qualify for a student copy. There's a huge price dichotomy: while you're a student it's $99; after you graduate it's $5000+, and that's without any toolboxes.
However, for academic use it makes perfect sense for scientists to use matlab over the alternatives. At least in the UC (University of California) system, a department will have some (large) number of licenses for its faculty to use, so professors and their students (who commonly won't have much coding experience) get what's essentially free access to matlab and its associated toolboxes. From their perspective, they want to run their experiments and write their papers, not learn how to code and be a pseudo-sysadmin. They want the simplest environment that stays out of their way, without having to deal with installing various libraries. Say what you will about matlab, its support and documentation are very good.
This is a major improvement for GPGPU, not game playing. Memory throughput is often the bottleneck in applications, as improvements in computational throughput have greatly outstripped improvements in memory throughput. To give you an idea of the importance of memory bandwidth: if you have a GPU with a peak arithmetic throughput of 1170 GFLOPS (what a Tesla K20 delivers for double-precision floating point) performing FMAs (fused multiply-adds, i.e. 2 floating point operations on 3 operands), then to sustain that level of throughput you would need roughly 13 TiB/s*** of memory throughput (assuming 8-byte operands and that all 3 operands of each FMA are unique). Of course you can't reach those levels with global memory, but any sort of improvement helps.
*** required memory throughput = 1170 * 10^9 FLOPS * (24 bytes / 2 FLOPs) = 14040 * 10^9 bytes/s ~= 12.8 TiB/s
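The footnote's arithmetic can be sketched in a few lines (a back-of-the-envelope check, not GPU code; the K20 peak figure and the 3-unique-operands-per-FMA assumption come from the comment above):

```python
# Bandwidth needed to feed peak double-precision FMA throughput on a Tesla K20.
# Assumptions (from the comment): 8-byte operands, 3 unique operands per FMA,
# 2 floating point operations per FMA.
peak_flops = 1170e9          # K20 peak double-precision FLOPS
bytes_per_fma = 3 * 8        # 3 unique 8-byte operands per FMA
flops_per_fma = 2            # one fused multiply-add counts as 2 FLOPs

required_bytes_per_s = peak_flops * bytes_per_fma / flops_per_fma
print(required_bytes_per_s / 2**40)  # in TiB/s: ~12.77
```

For comparison, the K20's actual global memory bandwidth is around 0.2 TB/s, which is why operand reuse via caches and shared memory matters so much in GPU kernels.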
By the time remote-controlled robots are usable enough to carry around and install office equipment, it won't be long before we have robots that can do it without any remote control.
There's actually a huge gap here. Having robots perform tasks autonomously in anything other than a very narrow, constrained environment would require semantic understanding. We've had robots that can go through the motions while being controlled by humans for decades (telerobotics), but building machines with deep understanding on a semantic level has been something of a holy grail in AI, and it's still as far off as ever. While I agree with you that we'll have to undergo a dramatic cultural shift if we ever create a legitimate AI, that's still many, many years away.
I doubt he has a chance in hell of winning his suit.
I wouldn't be too sure. From the article: "Lecerf said that it wasn't the first time his speed dial had jammed but that Renault had looked at the car and assured him that it was fine" (emphasis mine). Whether or not this makes them liable is another matter, but they apparently did have some involvement with his vehicle.