$300? OK. I'll take all you have to offer at $300. Feel free to sell me Fort Knox, that's OK. I'll take it.
OK, I actually think you, me, and Theo all agree:
1) We don't think a specific technical change would have _prevented_ the issue.
2) We all agree that better software engineering practices would have found this bug sooner. Maybe even prevented it from ever getting checked in (e.g. suppose the codebase was using malloc primitives that static analysis tools could "see across", and that the code was analysis-clean. Could this bug have existed?)
Who has claimed that using the system allocator, all else being equal, would have prevented heartbleed?
Who has claimed that heartbleed was an allocation bug?
I understand what freelists are and do.
The point here is that rigorous software engineering practices -- including the use of evil allocators or static analyzers that could actually understand they were looking at heap routines -- would have pointed out that the code implicated in heartbleed was unreliable and incorrect.
If you read the link you pointed at: after a modification was made to OpenSSL so that Coverity could understand that the custom allocator was really just doing memory allocation, Coverity reported 173 additional "use after free" bugs.
There are bugs from years ago showing that OpenSSL fails with a system allocator.
Don't you suppose that in the process of fixing such bugs, it is likely that correctness issues like this one would have been caught?
Actually, it is you who are wrong.
Theo's point from the beginning is that a custom allocator was used here, which removed any beneficial effects of both good platform allocators AND "evil" allocator tools.
His response was a specific circumstance of the poor software engineering practices behind OpenSSL.
Furthermore, at some point, OpenSSL became behaviorally dependent on its own allocator -- that is, when you tried to use a system allocator, it broke -- because the system allocator wasn't handing back the unmodified memory contents you had just freed.
This dependency was known and documented. And not fixed.
IMO, using a custom allocator is a bit like doing your own crypto. "Normal people" shouldn't do it.
If you look at what OpenSSL is:
1) crypto software
2) that is on by default
3) that listens to the public internet
4) that accepts data under the control of attackers
I would say that "taking a hard dependency on my own custom allocator" and not investigating _why_ the platform allocator can no longer be used to give correct behavior is a _worst practice_. And it's especially damning given how critical and predisposed to exploitability something like OpenSSL is.
Yet that is what the OpenSSL team did. And they knew it. And they didn't care. And it caught up with them.
The point of Theo's remarks is not to say "using a system allocator would have prevented bad code from being exploitable". The point is "having an engineering culture that ran tests using a system allocator and a debugging allocator would have prevented this bad code from staying around as long as it did".
Let people swap the "fast" allocator back in at runtime, if you must. But make damn sure the code is correct enough to pass on "correctness-checking" allocators.
I never noticed it, at least not in the ranges as described. Sometimes the clay can bounce a bit, but even that is rather marginal if you are shooting correctly.
Shotgun pattern distribution is governed by several factors, including shot quality / material, wad design, barrel design, hull design, forcing cone length / shape, but most especially choke. Steel shot will rip up some chokes. Chokes can creep (particularly on a hot Illinois day). Wadding can foul a barrel.
I wonder if these were controlled for.
Today's mass-scale manufacturing will collapse, and needs will change, so my bet is that it will be very useful to be the guy who can design models to be fed to 3D printers.
This is going to become a useful skill anyway in the next few decades, so it's not a bad investment for a hobby today.
Will lawyers be useful? (I know many slashdotters will laugh and say we'll be better off without them, but the new forms of society will need new rules and a new justice system - and programmers would do this as badly as lawyers would program.)