> Make static analysis much more anal, forcing the programmer to express their intent up front - static types, constraints, etc. Make the compiler a totally pedantic Nazi. Sure, it's nice to be able to hack shit up in an afternoon in Python or whatever, but then it ships, and the bugs come in, and you end up adding a pile of asserts and whatnot that should have been caught way before the product shipped.
Sometimes, but other times I write a function that I call two or three times and that never gets used anywhere else. It worked fine when I wrote it and it always will. Why should I have to deal with the extra time it takes to get all the types and arguments and interfaces right? (And yes, those small bits of time really really really do add up.) Python is faster to code than Java. Period. And because it took me three times as long just to get a working prototype in front of the customer, and I'm now that much closer to being out of budget, we've just been pushed from iterative development right back to the waterfall model. It's all about striking a /balance/ between strict, compiler-enforced, future/change-proof coding and get-'er-done imperfect-and-fragile but cheap-and-useful software.
What we /really/ need is a language that fully supports strict typing but keeps it optional, letting us turn type checking on and off at compile time on a class-by-class, function-by-function basis. That way we start with quick-and-dirty working prototypes and turn on additional compile-time checks as we go.
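Python's gradual typing is already most of the way to this idea: annotations are optional, ignored at runtime, and only enforced when you point a separate checker like mypy at the code. A minimal sketch (the function names are made up for illustration):

```python
# Untyped prototype: runs fine, nothing is checked.
def total(prices):
    return sum(prices)

# Later, opt in to static checking on this one function by adding
# annotations; a checker like mypy verifies annotated code and, by
# default, leaves unannotated functions alone.
def total_checked(prices: list[float]) -> float:
    return sum(prices)

print(total([1, 2, 3]))           # works with no type ceremony
print(total_checked([1.5, 2.5]))  # same code, now checkable
```

The key property is that turning checks on is an incremental, per-function decision rather than an all-or-nothing language choice.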
> Make unit/integration testing a mandatory part of the build, i.e. the compiler/linker refuses to link with code that hasn't been marked as tested.
Even with a good unit test framework, writing unit tests takes me just as long as writing the original code. And yes, of course it /may/ save time later if my unit test catches breakage as things change, but half the time it's the unit test rather than the code that needs to change when a unit test fails. Whether unit testing has a net benefit is project-dependent. Who pays for the extra time unit testing takes if it's always mandatory?
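Here's a toy illustration of that failure mode: the test fails not because anything is broken, but because the requirement moved, so the test itself has to be rewritten along with the code (the function and test names are hypothetical):

```python
import unittest

def format_name(first, last):
    # Original spec: "Last, First"
    return f"{last}, {first}"

class TestFormatName(unittest.TestCase):
    def test_format(self):
        # If the spec later changes to "First Last", this assertion
        # fails even though the code correctly implements the old
        # requirement -- the maintenance cost lands on the test.
        self.assertEqual(format_name("Ada", "Lovelace"), "Lovelace, Ada")

if __name__ == "__main__":
    unittest.main()
```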
> If we learned to put the hard thinking and effort into designing APIs, and then reusing those same APIs across whole new classes of problem (because the language makes defining APIs such a hassle that we'd rather not dream up new ones left and right), I think things would improve massively.
So you're saying we should just think about our requirements a bit more up front before we go writing code? I know getting those requirements on paper before they're in code /always/ makes things go better. ;) Oh, wait, that's not right. Actually, in the history of software development, no one has ever developed an even remotely useful set of requirements before writing some code and putting it in front of users. You've got to code the API, use it, find where its design falls down, fix it, fix all the code that uses it, and eventually, over time, you end up with something that's both stable and good (not bug-free and perfect, but stable and good). It's called mature software, and short of a crystal ball, iterative development plus time is the only thing I've ever seen produce it.
> None of this would stop you from writing shitty code. But at least, to do so, you'd have to knowingly subvert the compiler in a bogus way, ignoring screeds of the compiler telling you that you and your code suck goats' balls.
Have you ever tried compiling anything remotely complicated with gcc? The compile logs are filled with warning-this and incompatible-type-that, yet the resulting software works quite well. Based on compiler complaints, I don't think I've ever seen code more complicated than Hello, World! that doesn't suck goats' balls. The /real/ solutions are things like the language-based buffer-overflow and garbage-collection fixes you mentioned above. It's simply syntactically impossible to write a buffer overflow or memory leak in Java/Ruby/Python/Perl/etc. (Other types of bad memory management are still possible, but not a proper textbook buffer overflow or leak.) The real question is: "What are common software problems (including time-sucks), and how do we change programming languages not to warn us or refuse to compile, but to eliminate those problems in a conceptual manner?"
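To make "syntactically impossible" concrete: in Python, an out-of-bounds write raises an exception instead of silently scribbling over adjacent memory, so the classic C buffer overflow has no way to happen at all.

```python
buf = [0] * 4
caught = False
try:
    buf[10] = 1  # out-of-bounds write: raises IndexError, never corrupts memory
except IndexError:
    caught = True
print("out-of-bounds write caught:", caught)
print("buffer untouched:", buf)
```

The bug class isn't warned about or linted away; the language's semantics leave no program text that expresses it.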
I find that I/O (shuffling stuff from disk/stream/database to objects and data structures in memory) is by far the most error-prone and time-consuming part of software development, even with ORMs like ActiveRecord and REST/yaml/json. I think we need better data persistence mechanisms that completely hide the underlying storage from the programmer. Eliminate things like open(), read(), write(), query('select'), obj->save(). Replace them with things like MyObj objs = find('some new query language here') and persistMemorySpace('some optional limits'). The missing piece is a way to pick and choose what gets saved and loaded when. JSON.stringify(myobj) is mostly like persistMemorySpace(), but you hit problems with recursion and a lack of intelligence in picking what to save. ActiveRecord's find() lets you load objects from a single table in one shot, but accessing related objects hits the database again on an object-by-object basis. And you're still stuck manually specifying relationships you've already specified in the database schema. We need a better query language that lets us select some subset of all objects in memory, both for saving and loading, and still auto-loads and auto-saves as necessary.
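None of the APIs named above exist; as a toy sketch of the idea, here's an in-memory "object space" in Python where a predicate stands in for the proposed query language, used for both querying and pick-and-choose persistence (ObjectSpace and all its methods are entirely hypothetical):

```python
import json

class ObjectSpace:
    """Toy sketch of the hypothetical persistence layer described above."""

    def __init__(self):
        self._objects = []

    def add(self, obj):
        self._objects.append(obj)
        return obj

    def find(self, predicate):
        # Stand-in for the proposed query language: select a subset
        # of all live objects, regardless of where they came from.
        return [o for o in self._objects if predicate(o)]

    def persist(self, predicate=lambda o: True):
        # Pick and choose what gets saved instead of blindly
        # stringifying the whole object graph.
        return json.dumps([o for o in self._objects if predicate(o)])

space = ObjectSpace()
space.add({"type": "user", "name": "ana"})
space.add({"type": "log", "msg": "boot"})

users = space.find(lambda o: o["type"] == "user")
saved = space.persist(lambda o: o["type"] == "user")
```

A real version would have to solve the hard parts this sketch dodges: cycles in the object graph, relationships already declared in the schema, and deciding when to auto-load and auto-save.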