Modern, best-practice C can be compiled with a C++ compiler. (There are a few gotchas moving in either direction - http://www.cprogramming.com/tu... - but it's not hard to avoid them.) For all its object-oriented impurity and spec-bloat, the one thing I love about C++ is that you can write relatively high-level code when that makes sense, but you always have the option to grapple with all the fine detail when that's useful.
The article doesn't seem to point out the obvious explanation, i.e. that H1B applications contain personal data (of the type Slashdotters are usually passionate about protecting), and that it is good practice not to keep such information hanging around once it has served its primary purpose. There are presumably solutions to the research concerns, such as aggregating the data before it is deleted, or collecting the specific data needed before the records are deleted.
This sounds great in theory but, in practice, it's going to be almost impossible to enforce (e.g. whose definition of "vulnerable"?) and it would promptly create several new Internet plagues, e.g. the "Your server has a vulnerability, pay us now to stop us reporting it" spam email.
The picture you paint of Europe is a little simplistic too. France has a few large cities, but the tenth-biggest one has fewer than half a million inhabitants. It has tens of thousands of villages with 1,000 or fewer inhabitants. And you get a choice of cheap ADSL providers in most of those small villages.
The answer to "Could someone else make this thing I just made" is always "yes", eventually. We have patents to slow the arrival of the "yes" answer enough so that the first person to do so gets to make a bit of money.
But in this case (and most other cases) there's more than one way to do it and a lot of relevant technology, a lot of which is general car technology. And in every case, sooner or later, the huge company with a huge patent portfolio and huge expertise in manufacturing is going to win the "lowest price point" game... if they want to.
At the moment, the big players don't think there's a big enough market to make it worth their while to compete aggressively. At some point that will change, and at that point GM and other huge companies will develop, licence or acquire whatever technology they need. At the moment, Tesla is selling a niche product. That's great, but it's hardly the same as producing electric cars for everyone.
Or, to put it the other way round, does anyone see Tesla scaling production up to anything like GM's level while GM quietly hands them market share and eventually gets out of the car business?
My experience is that I have to read 10 Stack Overflow responses to find one that gives me a clue to the right answer... and that this is still usually a faster way to find a solution than trying to work it all out myself. It's usually one of the "No, that's wrong because..." posts that turns the lights on for me.
Smart ISP routers in France come pre-configured with a unique, obscure SSID and a unique, long and obscure WPA key.
I can't see how this tells them anything useful about price points for retail sale. The people who pledged money are agreeing to buy an untested phone in a year's time. That's way beyond even "normal" "early adopters". To do that, you have to be really passionate about new technology AND be able to pay a premium price for a phone you can't use for 12 months.
I've spoken to several people who, like me, might well have paid if the phone were shipping today or in a couple of months. But with the timescales in the proposal, the "price points" are for venture capitalists plus people with money to spare who just want a slice of a neat idea.
None of this tells us anything about how much they could sell production phones with this spec for in a year's time, and it's pretty much certain that to achieve any kind of market share they'd have to drop prices compared with the ones they tried this month.
They make this claim in the first paragraph and then spend the next four pages pointing out that they didn't check lifestyle, didn't distinguish between caffeinated and decaf, and that half a dozen other studies have shown health benefits of drinking coffee, before concluding that health experts are not putting coffee on any lists for lack of hard evidence.
Exactly. (For younger readers, Ellison was all over the media 20 years ago announcing that Network Computers would be the nemesis of Microsoft in the very near future. I don't think waiting for the Chromebook was part of the game plan at the time.)
Does anyone else think this whole saga would be much simpler if the media consistently referred to people:dell or company:dell?
I'd be interested to see the answers broken down by age. It may well be that most of the people who love paper books will be dead in 20 years.
I suspect there's also a "fake good" effect, in that people feel they ought to be supporting their local bookshop and therefore say that they do, even if, in fact, they buy a book a year in an airport and every other book on Amazon.
Personally, I really like paper, even for technical books, but all my colleagues look at me like I'm wearing sabre-toothed tiger skins and wielding a club.
If we take this to its logical conclusion, ex-pats should lose the ability to speak the local language whenever they look at their spouse. And Chinese staff in a Chinese restaurant outside of China wouldn't have a hope. This has not been my experience. I suspect that the experiment is not demonstrating what the experimenters think it is demonstrating.
Maybe to avoid Titanic Syndrome ("A boat even God couldn't sink"). Not that I think God goes around sinking boats and blowing down data centres to win arguments. But if your data centre does get damaged in a storm, and you haven't claimed that it's indestructible, you don't end up being used as a moral cautionary tale about the perils of pride for the next 100 years.
I think that everyone should learn to code. Not because it will make them a programmer. Not because it will enable them to estimate how long something will take, not least because experienced programmers are legendarily bad at doing that anyway. Everyone should learn to program because programming makes the modern world go round, and it's good for everyone to have at least an inkling of what that involves.
We teach a lot of kids chemistry, without any expectation that they will invent a new compound that will change the world. We teach a lot of kids physics, without any expectation that they'll make a significant contribution to subatomic particle research. We teach most kids to do creative writing and poetry, without expecting the vast majority of them to produce fiction or poetry of publishable quality. I don't see why we wouldn't teach programming alongside all those other topics that most students never master and never "need".
One argument for teaching a lot of academic subjects widely is that the skills you learn along the way have wider application than the topic itself. And it seems to me that this argument holds at least as well for programming as for, say, pure math. As programmers keep saying, programming is about analysis, structure, models... is there really no application whatsoever for those skills outside of hardcore programming? Does no-one ever wish that their managers had a better grasp of "system"? Yes, of course, you can acquire these skills in other places. But the thing about programming, pretty much from the outset, is that your pious beliefs about the system will stop your code from performing correctly unless those beliefs are reasonably accurate. I sometimes tell people that I do executable philosophy - it's all about logic, but, unlike the philosopher's, my logic has to work.
No, a bit of Python won't enable people to produce estimates for projects. But it may enable managers to understand why writing code once to do something that needs doing often is often a good plan (and, also, why it sometimes isn't). It may enable managers to understand why "Can we just change this one assumption" at the end of a project may involve restarting the entire project.
Yes, a little knowledge is a dangerous thing. But the little knowledge is out there already on the TV station of your choice. I don't even like Python that much, but I'd still much rather deal with erroneous assumptions based on a bit of Python experience than deal with erroneous assumptions based on watching Mission Impossible and NCIS.