No, it was really two separate, independent experiments, CMS and ATLAS. Both reported the discovery independently at the 5 sigma level (roughly a 3x10^-7 probability that it's a statistical fluke). Both discovered a particle at the same mass, within experimental error. There was no shared knowledge or data between the analysis teams. They were working on the same accelerator (the LHC), but at different points in the ring, so no cross-talk or anything is possible between the two experiments (several km of rock in the way). The only thing they have in common is that the same protons were whizzing around.
Default constructor, default destructor, default copy and assignment, constructors called implicitly for conversions... it's a mess. Half of all developers don't even know this is happening. Style guides suggest disabling this stuff.
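A minimal sketch of the usual style-guide fix (Histogram is a made-up class): make single-argument constructors explicit and delete the silently generated copy operations.

    class Histogram {
    public:
        // "explicit" blocks the implicit int -> Histogram conversion.
        explicit Histogram(int bins) : bins_(bins) {}

        // Delete the silently generated copy operations (C++11 onwards).
        Histogram(const Histogram&) = delete;
        Histogram& operator=(const Histogram&) = delete;

    private:
        int bins_;
    };

    int main() {
        Histogram h(64);       // fine: explicit construction
        // Histogram h2 = 64;  // error: constructor is explicit
        // Histogram h3 = h;   // error: copy constructor is deleted
    }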
It's impossible to assign anything - how do you initialise a std::vector with actual data? You have to make an array and copy it in! Ack! Fixed in C++11 with initializer lists, but then they introduced a whole load more implicit conversion problems. Ack!
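For the record, here is the pre-C++11 dance versus the C++11 initializer list, plus one of the new traps that came along with it (a minimal sketch):

    #include <vector>

    int main() {
        // Pre-C++11: build an array (or push_back in a loop) and copy it in.
        const int raw[] = {1, 2, 3};
        std::vector<int> old_way(raw, raw + 3);

        // C++11: initializer lists at last.
        std::vector<int> new_way = {1, 2, 3};

        // ...and a new trap: braces and parentheses now mean different things.
        std::vector<int> five_twos(5, 2);  // five elements, each equal to 2
        std::vector<int> two_items{5, 2};  // two elements: 5 and 2
    }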
I recently gave a similar talk at the UK Institute of Physics conference on High Energy Physics. The fact is that particle physics costs too much. The problem, in my view, is generated by particle physicists. We have underinvested in the basic technology of accelerator-driven HEP, namely superconducting magnets and, to a lesser extent, high-gradient RF cavities. This underinvestment has lasted for several decades.
For example, there are a bunch of folks working on HTS (High Temperature Superconductors) in the US with the potential to increase magnet field strengths by an order of magnitude - and hence, since beam energy scales with dipole field for a ring of fixed size, accelerator energies by a similar factor. But the program is poorly funded, if funded at all. In Europe there are similar programs, but they are disjoint (like so many things in Europe) between different countries.
Sadly, the SSC and LHC were both disastrous in this respect. They basically bankrupted the HEP community. Now the US is more or less withdrawing from HEP, and European accelerator-driven HEP seems to have nowhere to go after the LHC.
The impact on the HEP community is clear, but what about the impact on society? Where will we be in a world where we no longer have the capability to push back the fundamental frontiers of knowledge? Is that it?
Why is this even a discussion? I mean, how many people died in the tsunami compared to the power plant going pop? How many people will die from chemical poisoning due to all the conventional facilities that were destroyed? But somehow there is this discussion about the Fukushima nuclear power plant that is a complete distraction.
Put it like this - when the tsunami hit, do you remember all those oil refineries blowing up? How much crap came out of those huge black clouds, and how much is right now poisoning the poor people of Japan? But everyone has this crazy thing about Fukushima because it's nucular. Get over it!
Because of this complete misconception about the health risk of nuclear power compared to conventional facilities, thousands of people have been displaced - so why haven't equivalent numbers of people been displaced due to the health risks from conventional facilities?
"self-regulation...can be more effective than a regulatory approach in delivering flexible solutions that work for both industry and consumers.”
Translation from British into American: "Go screw yourself, you crazy old bat." Cf. Yes Minister/Yes Prime Minister for further examples of British English.
Well, as a C++/Python developer, I spend 90% of my time in C++ wrestling with the obscure syntax and the memory management just to tell the compiler what I want to do. In Python, that same 90% goes on writing tests and documentation and all the good things that squash bugs. A couple of days ago I spent 3 hours trying to figure out why the compiler was spitting my code back - it turned out I had forgotten to declare a member function const. Fine, that's an error, but it's not as if it actually changes what the code does. Better to spend 10 minutes writing a test that checks for constness, if that's what I want.
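A stripped-down sketch of the kind of thing that bites (Detector is hypothetical, not my actual code): the call at the bottom compiles only because the member function is declared const.

    #include <iostream>

    class Detector {
    public:
        explicit Detector(double gain) : gain_(gain) {}

        // Drop the trailing "const" here and report() below stops
        // compiling, usually with a several-screen error message.
        double gain() const { return gain_; }

    private:
        double gain_;
    };

    void report(const Detector& d) {
        // Only const member functions may be called through a const reference.
        std::cout << "gain = " << d.gain() << '\n';
    }

    int main() {
        report(Detector(2.5));
    }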
Shorter, prettier, easier code is less likely to be buggy. If I can read my code in 5 seconds, chances are I can spot the bugs. If it takes 5 minutes, well, that's a problem.
Decent error handling makes code robust. Ack, I ran over the end of my std::vector - better hope I tested for that. Ack, I ran over the end of my Python list - luckily Python throws an error for me. That's not static vs dynamic typing, just the general robustness of the language/libraries. But it sure as hell makes a difference.
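To be fair to C++, the bounds-checked option exists, you just have to ask for it. A minimal sketch of the contrast: operator[] does no checking (out-of-range access is undefined behaviour), while at() throws, Python-style.

    #include <iostream>
    #include <stdexcept>
    #include <vector>

    int main() {
        std::vector<int> v = {1, 2, 3};

        // v[10] would compile and do no bounds check: undefined behaviour,
        // which will probably "work" right up until it doesn't.

        // at() checks the index and throws instead:
        try {
            std::cout << v.at(10) << '\n';
        } catch (const std::out_of_range& e) {
            std::cout << "caught: " << e.what() << '\n';
        }
    }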
In every non-trivial program there is at least one bug.