Putting aside any other consideration, you don't want that DC10 taken out of service during the fire season.
The Tate Modern, London, formerly Bankside Power Station.
If you don't understand memory management then you are not yet a C++ programmer, let alone an intermediate one.
In my experience of interviewing, I have spoken to many alleged C++ programmers who can give a textbook definition of the terms 'copy constructor' and 'assignment operator', but who have no idea of their purpose or of their role in resource management. Unfortunately, I do not need to imagine the sort of code they write, because I have seen it.
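To make the point concrete, here is a minimal sketch (the `Buffer` class and its names are my own illustration, not anything from the discussion) of why those two functions exist: a class that owns a raw heap buffer cannot rely on the compiler-generated copy operations, which would copy only the pointer and lead to a double delete.

```cpp
#include <cstddef>
#include <cstring>

// Illustrative only: a class that owns a heap resource, so the "rule of
// three" applies - copy constructor, assignment operator, and destructor
// must all manage that resource, or copies will share (and double-free)
// the same pointer.
class Buffer {
public:
    explicit Buffer(const char* text)
        : size_(std::strlen(text) + 1), data_(new char[size_]) {
        std::memcpy(data_, text, size_);
    }

    // Copy constructor: make a deep copy instead of sharing the pointer.
    Buffer(const Buffer& other)
        : size_(other.size_), data_(new char[other.size_]) {
        std::memcpy(data_, other.data_, size_);
    }

    // Assignment operator: allocate the new buffer first (exception
    // safety), then release the old one; guard against self-assignment.
    Buffer& operator=(const Buffer& other) {
        if (this != &other) {
            char* fresh = new char[other.size_];
            std::memcpy(fresh, other.data_, other.size_);
            delete[] data_;
            data_ = fresh;
            size_ = other.size_;
        }
        return *this;
    }

    // RAII: the destructor releases the resource, exactly once per object.
    ~Buffer() { delete[] data_; }

    const char* c_str() const { return data_; }

private:
    std::size_t size_;
    char* data_;
};
```

Someone who can recite the textbook definitions but omits any one of these three members has not understood resource management; the resulting code compiles and then fails mysteriously, which is exactly the kind of code I have seen.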
You will suffer many horrible consequences.
You have no idea what your code will produce.
You will never know that your code actually works.
As you spend a VAST amount of time debugging weird stuff, you will eventually realize it would be better if the program crashed when the error happened.
This person claims twenty years of programming experience; if so, then all but the last of these things have already come to pass.
Sadly, it is not hard to believe that this person not only has 20 years' experience, but has it in a professional capacity, and ditto for everyone who is supporting this idea. This is why there is so much bad software around: too much of it is written by people who do not really understand either what their own code is doing or what they are trying to do with their code.
Someone explain this techno-nerd obsession with replacing people with robots; I just don't get it.
There's a general and a specific answer to that question:
General: If it is going to happen, it is much better to be the person doing it than the person it is done to, so if it can happen, best assume it will.
Specific: Almost all of Uber's problems spring from the fact that cabs currently need to have drivers.
These points of view do not consider what effects this will have on society in general. Capitalism does not do that.
Of course, there are also people who simply like the technical challenge, but they are generally not financing themselves in this pursuit.
The hard part is indeed establishing what the right level of security is and how to evaluate companies against that.
That will be an issue, but I get the impression that many organizations will have to make significant improvements before it becomes a matter of immediate practical concern.
If a piece of code is well written...
This is not an article about what is possible, it is about what actually happens. I have seen incredible abuses of operator overloading, for example. I have also seen some highly confused Java, but its pedestrian syntax seems to make it a little harder to write cryptic bad code in.
Road wear is often estimated as the fourth power of axle weight. So I imagine the final regulation will include road wear as a factor.
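That fourth-power estimate is easy to express directly; this is a sketch of the rule of thumb only (the function name and the idea of normalizing against a reference axle weight are my own framing, not an engineering model), but it shows why heavy vehicles dominate road wear.

```cpp
#include <cmath>

// Rough rule of thumb: relative road wear scales as the fourth power of
// axle weight. Normalizing by a reference axle weight gives a unitless
// wear factor. This is an estimate, not an exact engineering model.
double relative_road_wear(double axle_weight, double reference_weight) {
    return std::pow(axle_weight / reference_weight, 4.0);
}
```

Under this rule, an axle twice the reference weight causes 2^4 = 16 times the wear, which is why a weight-sensitive component in the regulation would fall almost entirely on the heaviest vehicles.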
That is an admirably rational argument that I fear won't stand up to politicians' desire to pander to car manufacturers and dealers, oil companies, and that part of the electorate who feel entitled to drive a big vehicle that they have no use case for and can't really afford to run. I hope I am wrong.
They need critical reasoning skills and an understanding of how the world they live in actually works.
Waterfall is a straw man. It only continues to be mentioned because it is an easy target. Claims that X is better than waterfall are a waste of time; they tell us nothing about whether X is actually effective.
So Tiversa breached systems and took data from them in order to show the system owners that they needed Tiversa's services?
But if Tiversa did breach those systems, then they did need Tiversa's services, didn't they?
Yet the linked-to article says "If Wallace is telling the truth, the FTC aggressively prosecuted a company based on bogus evidence."
The only way I can see the evidence being bogus is if Wallace exploited a position of trust granted to him by the target company, and not even necessarily then. Whatever the truth is, the report is not self-consistent. Apparently, rational analysis and critical thinking are not employed at CNN - but we suspected that, anyway.
The 10x rule is that the best developers are 10x as productive as the worst developers. That's all it says. So you can replace them with 10 worse developers; that's the whole frickin' point of the 10x rule.
I am not aware of any study that shows that arbitrarily complex tasks (a distributed operating system, an AI project such as Watson, an airplane fly-by-wire system...) can be completed solely by a sufficient number of bottom-of-the-barrel programmers. I don't recall any such claim in The Mythical Man-Month, but it has been a while since I read it - perhaps you could cite the chapter that references such a study, so I can look it up in my copy?
It's not bimodal, but the 10x rule still applies.
There's no point in trying to assign a ratio to it, because the worst developers have zero or negative productivity, on account of the damage that they do to projects by checking in inordinately complicated code that fails mysteriously, or simply by messing up everyone else's schedule by failing to deliver their part. This is not a hypothetical argument; I have several specific people from my personal experience of corporate development in mind (all in the past, fortunately).
You would have to poll all of them to establish that.
Thank you for your highly informative comment about Goldman Sachs indemnifying its employees for legal fees.
It seems that Goldman Sachs may be hoist by its own petard here, if the intent behind this provision was, at least in part, to enable 'aggressive' behavior by its officers and employees (i.e. to flirt with arguably illegal behavior).