The hardest problem I've seen people have with Google Glass is how obvious it is you are wearing the glasses. People in public assume you are recording them and it bothers them.
Actually, I don't think that's the hardest problem. Our innovation team at work brought in a pair of Google Glass and let us try them out. Frankly, they are exceedingly underwhelming. The screen is really small, but worse, the resolution seems low and the colors aren't great, so it's actually really hard to read. And it's not really a HUD or anything like that. You have to take your attention away from everything else to read the screen, so in that respect it's not very immersive, and it feels like you are switching between two things: interacting with the real world or interacting with Glass (just like how you can either look at the world or look at your smartphone). The real potential would be if you could walk around and have immersive information show up around products, etc., without having to take your eyes completely off them.
And another design problem with them is that they get really hot: uncomfortably hot to the touch, like those old laptops always were when you set them on your lap.
So to me, privacy concerns matter, but I don't think the average citizen thinks about privacy all that much. I think to them, as well as myself, the big issue is an underwhelming design, combined with an exorbitant price ($1500) and really no practical application for it yet. That doesn't mean it won't ever succeed, of course. I just read an article reminding people that cars were around for about 40 years before they became actually decent, and PDAs have been around since the 80s but only really took off when the smartphone craze kicked off. Someday we may look back on this as the first step towards a technology everyone has, but for now, they really aren't that great, and there are many reasons they've failed so far.
.NET is slowly being weeded out of the enterprise though and that's a trend I don't want to see diminished by devs picking up
.NET because it's now "open source". It's OK to hate .NET, open source or not.
Lol, are you serious about that? That's not true at all! I work at a Fortune 500 company and it's the exact opposite: it's Java that everyone is trying to weed out. There are several reasons for this, but they include these three things: Java's performance is slower than
And one other thing about Java, and another reason enterprises are trying to weed it out... the various Java application servers sprawling all over the place are seriously annoying and make supporting Java a massive undertaking of training and manpower. In my organization, we have purchased Java applications from vendors based on all of these: Oracle WebLogic, IBM WebSphere, Apache Tomcat, Red Hat JBoss, and Apache Geronimo, and we have to figure out how to admin and support them all. And worse, none of these are as good as
Plus, there are other things about
So I really don't understand where this bashing of
Look, CO2 is like a blanket on the bed. Making it thicker makes you warmer. You wish to deny this?
Partially, yes, for three reasons:
- Your body is a heat source. Cover it with a blanket and you get warmer because the heat energy is trapped and cannot easily escape, and your body is constantly adding additional heat energy. By contrast, the Earth is not a heat source in that same way. Any heat it has comes from an external body: the sun. It's like a rock sitting next to a fireplace with a blanket over it. Take away the fire, and the rock is ice cold regardless of the blanket. Same with the Earth. This makes the CO2/blanket analogy very flawed, because the climate can be totally independent of the thickness of the blanket, and get much colder or much warmer based almost entirely on the current energy output of the sun.
- Secondly, CO2 is a tiny trace gas in our atmosphere. This is not Venus where it makes up the majority of the atmosphere. Our atmosphere is 78% nitrogen and 21% oxygen, and everything else is a trace gas. People like to claim there has been a dramatic rise in CO2, but zoom the scale of your graph out, and you see that the "big jump" is considerably less than a fart in a windstorm. Right now CO2 makes up 0.04% of our atmosphere. 100,000 years ago it is estimated that it was 0.03%. So even assuming humans are 100 percent responsible for the 0.01% increase, it is extremely tiny. In your blanket analogy, you claim that making the blanket thicker makes you warmer. I would dispute that and say that it does not make you warmer if the blanket is negligibly thin. If a human is covered by a blanket that is 0.03% the width of an average thread, and you "thicken" it to 0.04% the width of an average thread, I submit to you that that is so negligible that you do not, in fact, find yourself feeling warmer from the thickening of the blanket. We really do need to keep our perspective on CO2 percentage and not commit fallacies based on graphs of CO2 concentration that are far too zoomed in to show context.
- Thirdly, we do not fully understand the interacting, chaotic systems on our planet. We see clearly that CO2 percentage and temperature have both varied considerably over the course of the planet's history, but frankly, we really don't know why. Why should there be a difference between 100,000 years ago and 50,000 years ago? We certainly know humans didn't have anything to do with that. And because we can't say what the causes are, we can't say definitively that thickening the so-called blanket leads to warming. Historically, we know that CO2 increased only to find that in later eras it decreased. This would suggest the planet has some kind of feedback/absorption systems that can at times remove CO2 and thin the blanket. We also know temperature can increase or decrease by large amounts naturally with no involvement from humans, and that temperature does not always move in sync with CO2 concentrations historically. In short, we don't understand the relationships between CO2, temperature, and the systems on this planet, so even though a CO2 increase may lead to a temperature increase in an isolated system, we don't know that a CO2 increase leads to predictably higher temperatures (or even permanently higher CO2 levels) in the highly complex planetary system of Earth.
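The percentages in the second point above are easy to sanity-check with quick arithmetic. A minimal sketch in Python, using the figures from this comment; the 0.3 mm thread diameter is an assumed, illustrative number:

```python
# Back-of-the-envelope arithmetic for the CO2 percentages cited above.
# The 0.3 mm thread diameter is an assumed, illustrative figure.

co2_then = 0.0003   # 0.03% of the atmosphere, ~100,000 years ago (per the comment)
co2_now = 0.0004    # 0.04% of the atmosphere today

increase = co2_now - co2_then
print(f"Absolute increase: {increase:.4%} of the atmosphere")
print(f"Relative increase: {increase / co2_then:.0%}")

# The blanket-thread analogy: a blanket 0.03% vs 0.04% of one thread's width.
thread_mm = 0.3
print(f"Blanket goes from {co2_then * thread_mm * 1000:.2f} to "
      f"{co2_now * thread_mm * 1000:.2f} micrometers thick")
```

Whether you read the change as a negligible 0.01 percentage points of the atmosphere or as a one-third relative increase is exactly the perspective question being argued here.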
So yes, I wholeheartedly dispute your blanket analogy on the grounds that it is a flawed analogy, and that we don't know enough about our planet to make any intelligent predictions or models at this time. Indeed, every model we have, when fed historical temperature data, says we should be at much higher temperatures than we are now. Most assume some kind of blanket model, but since none match our measured results, we can conclude that a simple blanket model does not match the complex reality of the systems on Earth.
No, foo. It's called basic common sense -- keeping confidential medical records, SSNs, and personnel files in paper format only, and not allowing them to be scanned or placed in a system connected to the general business intranet, or "the cloud".
That really seems like unnecessary effort. Why go all the way back to paper when you could set up computer systems in a back room on an isolated network, which is not connected to any other network (especially the Internet)? Then it's air gapped pretty nearly as effectively as paper, and you could get all the advantages of computerization without having to deal with the pain of paper only records. And if you are really worried about physical security, like thumb drives walking off, just put good physical security around the room with multiple locks on the door, with the keys to each lock spread among multiple people so no one can be in there alone copying data.
To me, that seems like a lot more effort than most companies would be willing to go to. Certainly it's a lot more painful because employees can't go in and update their personal records on their own remotely (things like W4s, address changes, etc). But it's a far better option than going all the way back to paper.
This is just the sort of bug to get people to adopt Linux on the desktop, since it will be more similar to what they expect from Windows.
Not me! I refuse to use software as immature as version 3 of Linux. Mac is on OS version 10, Windows is about to release version 10, and by golly, I'm not wasting a second of my time on Linux until it catches up!
You obviously are a bit hazy on what ethical means. To me it is ethical to kill a retarded person under certain circumstances. To sum things up, morals are the values instilled by society, ethics however are the values you aspire to. Personally for me it is highly ethical to lie as much as I am being lied to, especially to the people who are lying to me. Of course always considering the risk and benefit ratio. You might find that is highly immoral but I think you guessed it by now I am a very ethical person albeit not a very moral one.
You are neither ethical nor moral, nor are you correct on your definitions. No one believes ethics are "the values you aspire to, completely uncoupled from morality". If you aspire to have the worst moral values possible, that's not considered ethical. Only aspiring to high moral values is considered ethical.
Aspiring to kill retarded people is not ethical, not moral, and your posturing fools no one. Frankly, you must work at an IT shop full of the lowest talent possible, because you'd never for a second get away with lying where I work. I had someone try that in an interview once: I'd ask him questions, and rather than saying "I don't know" he'd very calmly and matter-of-factly tell me wrong answers as though he knew them. Problem for him was, I knew the actual answers and knew he was lying... we have real IT people doing tech interviews, not HR. My immediate comment in the HR meeting afterwards was that he's a liar and he should never be on our team, and he never was. You'd never be allowed in the door.
There's now an entire generation of IS/IT managers, directors, and CIOs who not only prefer Microsoft technology but have an active dislike of anything related to Unix(tm)
I don't know how much the "actively dislike Unix" part is true, but yes, there are a lot of IT people that prefer Windows. And there are very good reasons for that. Microsoft makes some exceptionally good products in a number of areas. Here are some examples:
- Visual Studio, probably the best IDE known to exist. I've used it and competitors like Eclipse, and it is MUCH BETTER than Eclipse. This alone makes a lot of devs prefer Microsoft. And as of last week's announcement, it is now going open source.
- .Net and ASP
- PowerShell, which for management is really, really good. It's gotten to the point now where it is better than competitors like Bash. Having objects in the pipeline, rather than just text, is just so much better than any other shell.
- SQL Server, which is finally reaching performance/feature parity with Oracle, but has better management tools and is generally preferred by a lot of devs.
- IIS, which in its latest incarnation has better performance than Apache, is easier to manage, and is easier to get security isolation of websites out of. (I do web hosting for a living, and I can easily stack 350 sites onto IIS and have them all be completely isolated in different processes with different security accounts, and it's REALLY easy.)
- Windows Server, which admittedly is a tossup, but depending on what you want, may cause IT people to prefer it. It doesn't run on as wide a variety of hardware as Linux or scale up to supercomputers like Linux does, but it really is a very competent OS that is simple to manage and probably has the largest ecosystem of software written for it.
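The objects-in-the-pipeline point made about PowerShell above can be illustrated outside PowerShell as well. A minimal Python sketch with made-up process data (not a real `ps` call): a text pipeline has to split lines and re-parse fields by position, while an object pipeline filters typed records directly:

```python
# Contrast a text pipeline (parse columns out of strings) with an
# object pipeline (filter structured records directly).
# The process list below is made-up, illustrative data.

text_output = """PID  NAME      MEM_MB
101  chrome    950
102  sshd      12
103  postgres  480"""

# Text pipeline: skip the header, split each line, parse fields by position.
heavy_text = [
    line.split()[1]
    for line in text_output.splitlines()[1:]
    if int(line.split()[2]) > 100
]

# Object pipeline: each item already carries named, typed fields.
processes = [
    {"pid": 101, "name": "chrome", "mem_mb": 950},
    {"pid": 102, "name": "sshd", "mem_mb": 12},
    {"pid": 103, "name": "postgres", "mem_mb": 480},
]
heavy_objs = [p["name"] for p in processes if p["mem_mb"] > 100]

assert heavy_text == heavy_objs == ["chrome", "postgres"]
```

The text version silently breaks if a column moves or a name contains a space; the object version has no parsing step to break, which is the core of the argument for object pipelines.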
In summary, I don't get the bashing of Windows or all the "My Linux is teh best!" kind of comments. Linux has its strong points as an OS, but Microsoft does too, and they have some fantastic products out there that can handily beat some open source equivalents. Depending on your workload, it can be very appropriate to prefer Microsoft products. (Of course, I'll be the first to say Microsoft has its terrible products too... Network Load Balancer, anyone? Dedicated load balancers like F5's beat the pants off that thing.)
The OP said this:
500kph is moving towards the average speed of an airliner. Add the convenience of no boarding issues, and city-centre to city-centre travel, and the case for trains as mass-transport begins to look stronger.
Airliners routinely cruise at 550 mph, which is nearly 900 kph. So I guess trains are moving towards the speed of an airliner in a strictly technical sense, but in reality, even this train, which is faster than the norm, is still only just past 50% of airliner speed, so it's not even close yet.
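A quick sketch of that arithmetic (1 mile = 1.609344 km):

```python
# Compare the 500 km/h train figure with a typical airliner cruise speed.
MPH_TO_KPH = 1.609344

airliner_kph = 550 * MPH_TO_KPH   # about 885 km/h
train_kph = 500

print(f"Airliner cruise: {airliner_kph:.0f} km/h")
print(f"Train as a fraction of airliner speed: {train_kph / airliner_kph:.0%}")
```

That works out to roughly 56%, which is the "only just passing 50%" figure above.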
The OP also said this:
The Japanese Shinkansen is now running over 7 times as fast as the average U.S. express passenger train.
It should be noted that there are almost no US express passenger trains anywhere in the country, except within a few large East Coast cities. In the rest of the country, there are none city to city or coast to coast, except for one, maybe two Amtrak routes that appear to exist only for nostalgia, not routine travel.
Add the convenience of no boarding issues, and city-centre to city-centre travel, and the case for trains as mass-transport begins to look stronger.
Nope, not really. It only looks stronger if your cities are very densely populated AND very close together. Neither of those are true of the average US city. If I'm going from the city center of Minneapolis to the city center of Atlanta, that's 1815 km, and I'm not going to sit around for a whole day on a train getting there. And since the majority of the US population lives on the East and West coasts, what about going from the city center of New York to Los Angeles, a common route? That is about 4,500 km. So yeah, rail travel in the US continues to be a pipe dream that makes no sense. I don't understand why people are so hot on bringing the premier travel method of the 19th century back into the 21st century in the US, when we now have airliners for city to city travel and cars and buses for intra-city travel, both of which make far more sense and are far faster than rail. Rail in the US continues to be an expensive, money losing boondoggle almost everywhere.
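A rough sketch of the travel times for those two city pairs. The ~80 km/h conventional-train average, the 885 km/h airliner cruise, and the 2-hour airport overhead are all assumed, illustrative figures:

```python
# Rough travel-time arithmetic for the city pairs above.
# Train speeds, cruise speed, and airport overhead are assumed figures.

routes_km = {"Minneapolis-Atlanta": 1815, "New York-Los Angeles": 4500}
train_speeds = {"conventional (~80 km/h)": 80, "high-speed (500 km/h)": 500}
PLANE_KPH = 885
AIRPORT_OVERHEAD_H = 2.0  # boarding, security, etc.

for route, km in routes_km.items():
    plane_h = km / PLANE_KPH + AIRPORT_OVERHEAD_H
    trains = ", ".join(f"{name}: {km / kph:.1f} h"
                       for name, kph in train_speeds.items())
    print(f"{route}: {trains}; fly: {plane_h:.1f} h")
```

Even granting a hypothetical 500 km/h train, the coast-to-coast run still loses to flying, and at today's conventional speeds both pairs take roughly a day or more.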
The problem with North American rail travel has never been a technology barrier, it's always been about having any interest in doing better.
Or more precisely, the problem with North America is that most people there would never even benefit from having high speed rail.
The root cause of the lack of interest is that our nation's population is so spread out that you can't get rail to move you to your destination faster than a car, no matter how fast the train runs. It's not like densely populated areas of Europe or Japan where a million people all want to go from the same point A to the same point B. Americans are so spread out that you have many tiny groups wanting to go from thousands of different point As to different point Bs. You'd have to build hundreds of thousands of train lines, traveled by only a handful of people, and even then you'd have to switch lines so many times as you travel the sprawling cities and suburbs that you'd never beat the car anyway.
That's why most large American cities have bus lines instead of subways as well. Americans built their cities out, not up, and you can cheaply throw tons of small capacity buses on the roads going all kinds of different directions to move people about. It's really the only kind of transit other than a car that makes any sense in American cities like Houston, Minneapolis, Kansas City, etc. And even then, your car is going to easily beat the bus unless it's during rush hour when the bus drives in a dedicated lane. But at least the bus can go anywhere in any direction, so they still will easily beat rail in almost all scenarios, with the exception of a few densely populated East Coast cities like New York. They also do much more to relieve congestion, since more people can get where they want to go via bus than train, and are therefore more likely to take it.