People are stupid, and news sites like to print the most horrible, twisted, eye-grabbing thing they possibly can. In that kind of environment, it only takes a single crash due to a software glitch to get the project outlawed statewide. I'd keep the information private too, and probably pay off the guilty party anyway just to keep them quiet.
When I was growing up, my sisters would spend a full 1-3 hours a day doing makeup, hair, clothing, and other prep work to look presentable in public. Society expected this of them, or rather, they felt it did. I spent 15-30 minutes a day on the same task.
When I was growing up, my sisters spent virtually no time at the computer, or building things, or learning about engineering. Society expected this of them, and my parents largely supported it. I spent 1-3 hours a day on these tasks.
The gender gap isn't due to lack of raw talent, or lack of ability. It's not due to the jobs not being "interesting" or "world changing". The gender gap is because women don't spend the same amount of time doing engineering that men do in their formative years, when it matters the most. We aren't going to fix the problem by trying to bring women into the picture after the damage has been done - the best we can do at that point is mitigate the issue. To really fix it, society is going to have to value engineering more highly than spending two hours on makeup and appearance.
Good luck with that.
I've looked at D before. It looks promising, and I've considered using it. The reason I don't is a bad reason, but it's the most common bad reason: legacy code.
I have 264,000 lines of code in my personal project/library archive (my own code, not counting custom versions of external packages like openssl and portaudio), all in C/C++, all with a unified build system, all ported to and debugged on several platforms. Every new project I start uses those core libraries and header files. When I think about switching to a new language, my biggest concerns are how new code will integrate with my existing code, how the new language will make use of my existing libraries, and how to remain productive in a dual-language environment. The long-term gain might eventually make it worthwhile - but it might also just cost me time should the new language die out or not support a platform I need.
I simply can't justify the gamble.
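For what it's worth, the usual bridge in this situation is to put a flat extern "C" facade over the existing C++ libraries, since nearly every newer language (D included, via its extern(C) declarations) can bind against the C ABI. A minimal sketch of the pattern - the names (AudioMixer, mixer_*) are hypothetical placeholders, not anyone's actual library:

```cpp
#include <cstddef>

// Existing C++ class (hypothetical stand-in for a legacy library type).
class AudioMixer {
public:
    void addSample(float s) { sum_ += s; ++count_; }
    float average() const { return count_ ? sum_ / count_ : 0.0f; }
private:
    float sum_ = 0.0f;
    std::size_t count_ = 0;
};

// Flat C API over the class, using an opaque handle.  This is the
// surface a foreign language binds to; the C++ stays untouched.
extern "C" {
    void* mixer_create()              { return new AudioMixer(); }
    void  mixer_destroy(void* m)      { delete static_cast<AudioMixer*>(m); }
    void  mixer_add(void* m, float s) { static_cast<AudioMixer*>(m)->addSample(s); }
    float mixer_average(void* m)      { return static_cast<AudioMixer*>(m)->average(); }
}
```

It doesn't remove the dual-language overhead, but it does mean the legacy code keeps working unmodified while new code accumulates on the other side of the boundary.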
Bennett, please just shut the fuck up about your improvements to burning man. For your traffic flow 'improvement', you notably didn't provide simulations, suggested a broken alternative, and didn't even bother to fully understand the situation before jumping in with both feet. For this one, you've done something similar.
In particular, the problem when the ice line is backed up is -not- that the ice wasn't prefetched. It's that half the time they can't get it out of the trucks any faster, and half the time they can't get the customers out of the way any faster. Adding prefetch to a throughput-bound system does not improve performance. If you had gone through the lines more than a few times yourself, you might have picked up on that before offering your advice.
I'm not going to say that traffic isn't a problem, or that ice queues aren't stupidly long at times. But these are hard problems, and they have been thought about extensively by smarter people than you, smarter people who have more information and experience than you. You insult all of them by discounting that so blatantly.
The FDA has -safety concerns- testing a drug that might be effective against a -contagious disease with a 90% fatality rate-?
Somebody's priorities are in the wrong order.
Don't be a dumbass. CCl4 isn't used in HVAC; CFCs are, and no, they aren't liquid at room temperature.
Oh yeah, (a) is definitely true, but that's no reason to blackhole it immediately, especially if it does something good at the parsing layer. There have been plenty of instances of 'deprecated' features being used for several years with compiler switches to allow it.
It would be better if you asked along the lines of "If you could change any aspect of C++ and have it immediately adopted/supported" - you never know, his preference might be to change template syntax or something similar instead of adding something new.
Personally, I do a lot of the following:
- learn the minimum aspects of the language needed to navigate your specific codebase, and learn them very well
- copy/paste of terrible syntax to avoid compiler failures
- avoid using templates if at all possible
- create strongly typed specific-use classes that export only the minimum functionality of the underlying libstdc++ classes without using templates
- keep all pointer casting to well defined, central locations
- all production code runs with asserts on all the time
- thou shalt never use multiple inheritance. Ever.
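A minimal sketch of the "strongly typed specific-use class" point above: wrap a libstdc++ container behind a small, non-template interface, so the rest of the codebase never touches template syntax, iterators, or raw casts. The names here (PacketQueue and friends) are illustrative, not from any real codebase:

```cpp
#include <deque>
#include <cstdint>
#include <cassert>

// Single-purpose wrapper: a FIFO of packet IDs.  The template type
// appears in exactly one place; callers see only plain functions.
class PacketQueue {
public:
    void push(uint32_t id) { q_.push_back(id); }
    uint32_t pop() {
        assert(!q_.empty());        // asserts stay enabled in production
        uint32_t id = q_.front();
        q_.pop_front();
        return id;
    }
    bool empty() const { return q_.empty(); }
private:
    std::deque<uint32_t> q_;        // the only line that knows the container
};
```

If the underlying container ever needs to change, only this one file is touched, and compiler errors from template misuse can't leak into the rest of the project.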
Regarding managing inheritance - figuring out inheritance trees and which classes own which functions where - I'm still basically at a loss, and I've been fighting that problem for years. The best advice I can give is to keep your inheritance trees as shallow as possible; but for real-world systems, that often doesn't make sense. I have a set of socket libraries that pretty much has to be 4-5 levels deep in spots to do what it needs to do properly, and it's disconcerting to have to look back to, say, layers 1 and 3 to figure out what's going on.
Even when they do fully implement the entire language, the 'compiler dependent' aspects of the language allow compilers to produce arbitrary results.
No. No no no no no. No.
Having lived through the embedded C/C++ debacle, I can't stress 'no' hard enough.
Hopefully, not for a long time. C++ doesn't need more complexity at the moment. It needs less.
The last thing C++ needs right now is another complex feature that adds more syntax and fundamentally new semantics.
Further, in my opinion, we should look at the most common coding conventions and consider adding pieces of them to the compiler and language specification where it makes sense to do so. Mandating function naming almost certainly doesn't make sense; but requiring braces after all conditionals very well might (and might make compiler parsing of other constructs easier to handle).
The most common coding styles almost always touch on what would be 'best practices' in the field. Adding those 'best practices' to the core language specification seems like something that should at least be considered.
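A concrete example of why "braces after all conditionals" keeps showing up in coding standards: the classic dangling-else binds to the wrong `if`, and the compiler accepts it (by default, often without any diagnostic). This snippet is my own illustration, not from the comment above:

```cpp
// The indentation says the else belongs to the outer if.
// The grammar says it belongs to the inner one.
int misleading(int a, int b) {
    int r = 0;
    if (a)
        if (b)
            r = 1;
    else            // actually the else of "if (b)"
        r = 2;
    return r;
}

// With mandatory braces, the ambiguity cannot be written at all.
int braced(int a, int b) {
    int r = 0;
    if (a) {
        if (b) { r = 1; }
    } else {
        r = 2;
    }
    return r;
}
```

With a = 0, `misleading` returns 0 while `braced` returns 2 - the two functions the programmer thought were identical silently disagree.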
This is a very good set of observations, and I also feel that C++ has become a 'niche language for insiders'. The syntax is difficult; it's remarkably easy to shoot yourself in the foot in non-obvious ways; and porting can be problematic, as no two compilers compile the same code the same way. Easy-to-make mistakes such as pointer aliasing violations are often compiled -silently and without error-, producing different results at different optimization levels, and -this is considered normal-. Over the years you learn these things and figure out what to avoid, but for people new to the language, it's a huge barrier to entry.
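The aliasing point deserves a concrete sketch, since it's exactly the kind of trap that compiles clean and then shifts behavior at -O2. Reading a float's bits through a uint32_t* violates the strict aliasing rules and is undefined behavior; the memcpy version is the well-defined equivalent, and mainstream compilers lower it to the same single move. This is my own minimal illustration of the class of bug being described:

```cpp
#include <cstdint>
#include <cstring>

// Undefined behavior: accesses a float object through an unrelated
// pointer type.  Compiles without error; the optimizer is free to
// reorder or cache the loads, so results can vary by optimization level.
uint32_t bits_ub(float f) {
    return *reinterpret_cast<uint32_t*>(&f);
}

// Well-defined: copy the representation instead of aliasing it.
uint32_t bits_ok(float f) {
    uint32_t u;
    std::memcpy(&u, &f, sizeof u);
    return u;
}
```

Nothing in the language forces a diagnostic for the first version, which is precisely the "silently and without error" problem: the program is wrong, the compiler knows the rule, and the default answer is neither a refusal nor a defined result.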
If the goal is to really get a lot more programmers to use it, the base syntax almost certainly needs to be improved. Rather than providing some obscure syntax to do some obscure library feature, make it easy to do simple things and make the language as idiotproof as possible. Make compilers either strictly produce well defined output as per the spec, or throw an error. Do -something- to improve template syntax.
Pretty much all of the new features I've seen in the C++11 spec are niche features, things used by high-end library writers with 20 years of experience to do complicated library things. That's good, libraries are important. But libraries will not translate into users if normal users cannot use them, and libraries will not translate into users if the language itself is the bulk of the learning curve.