I'm actually wondering whether this is about downranking sites that have no stripped-down "mobile" equivalent, or about downranking sites that don't use responsive CSS/HTML. The latter makes more sense, and it wouldn't leave mobile users forced onto crappy websites with most of the functionality (and often content) missing.
The former makes no sense in any world, and it would be terrible for Google to go there.
The first patent (which saw no attempt at commercialization) was filed in 1979. Most of the early research, with largely failed attempts to come up with a commercially viable product, was done in the mid 1980s. The tech has advanced slowly since then, and nowadays it is becoming rather mature.
I don't know why this is seen as a way to diss 3d printing. Some people's hatred of Makerbots and their ilk is so great that they can't accept that 3d printing broadly has developed into genuinely useful production processes in some fields. Rocketry is a great example: it's just silly to have to make (and warehouse) moulds or stamps for parts that you only need a couple dozen of, and which you may revise after just a couple of launches. Now that 3d printing technologies have advanced enough to produce high quality metal parts, it's properly taking off. It even pairs nicely with CNC; there are now hybrid 3d printing / CNC machines out there. CNC gets you the coarse, primary shape, and 3d printing adds the intricate and/or jutting-out features.
3d printing is a very useful technology for low volume or rapidly evolving part runs. No need to play it down just because Makerbots exist.
Agreed. XKCD covered it concerning apps, but it's usually not much better with mobile versions.
What happened to Google? It seems like every change they make these days makes things worse. And I say this as a person who's generally a big Google fan.
Now machines at call centers can be used to seamlessly generate spoken responses to customer inquiries, so that a single operator can handle multiple customers all at once.
No. HUMANS can be forced to read off a script, but MACHINES suck at anything more complex than "Did you say 'yes'?"
Well, they're already way better than merely "Did you say 'yes'?". And honestly, I'd rather have a machine than a barely English-speaking call center bozo in India reading off a script they know nothing about. At least with the machine, I know the script, and in some cases I can cut the time significantly by knowing the response sequence; I don't have to wait for the "question" to finish before putting in the next response.

My latest encounter was with my ISP, where the machine knew my previous calls' progress, picked up with the next steps across multiple calls, and, once all the bogus rebooting of components had been completed, automatically escalated me to a level 2 tech who knew enough to figure out within a couple of minutes that there was a hardware issue in their equipment. The entire process took about as long as just the wait time for a real person did previously. So that's progress. And what use was a script-reading, barely understandable human anyway? I guess you could release some frustration on them, and perhaps that call/transcript would wind up somewhere on the internet.
I think you've just illustrated why it is, in some ways, a "product" - you've latched onto one possible hole (bugs) rather than seeing the whole picture (security models, personal interaction, etc.)
Bugs are a problem, but even a well-written, bug-free system with a poor security model is always going to be an issue. Java has bugs. The Commodore Amiga operating system had (by the time it matured) relatively few, but the security model it relied upon was overly permissive. Given the choice between basing my new secure system on Java or running it on a code-audited AmigaOS-based appliance, I'd pick the former in a heartbeat (no offense).
And we haven't yet addressed the notion that user requirements also need to lead to secure outcomes.
Pfft. Then you just add another layer of NAT! You can make 4 million two-host networks out of 10.0.0.0/8. Then you can put 4 million two-host networks behind each of those, too. Now you've got support for 17 trillion end user devices!
Much like turtles, the Internet could be IPv4 NAT all the way down....
Honestly, without regulation or legislation, I suspect that's how we'll end up.
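For anyone checking the arithmetic in the NAT joke above, here's a quick sketch in Python, assuming each "two-host network" is a /30 carved out of the 10.0.0.0/8 private range (two usable hosts plus network and broadcast addresses):

```python
import ipaddress

# RFC 1918 private block 10.0.0.0/8 has 2**24 addresses.
block = ipaddress.ip_network("10.0.0.0/8")

# One /30 subnet consumes 4 addresses and serves 2 hosts.
two_host_nets = block.num_addresses // 4
print(two_host_nets)             # 4194304, i.e. ~4 million two-host networks

# Nest the trick once: each of those networks NATs its own 10.0.0.0/8.
nested_nets = two_host_nets ** 2
print(nested_nets)               # 17592186044416, i.e. ~17.6 trillion networks
```

So the "17 trillion" figure matches the count of nested two-host networks; with two hosts each, that's roughly 35 trillion devices hiding behind the turtles.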
Or maybe he's just a dickhead.
But I'm not sure he's all that substantially more evil than many others cruising at that altitude. The GOP hardly seems to miss an opportunity to miss an opportunity to set about reforming anything.
(2) It's really more her status as High Priestess of Cthulhu that worries me.