Case in point: Peter William Lount has a rant about the subject of typing. He starts off critiquing a paper by Meijer/Drayton and tells us he likes Smalltalk. (the same paper gets a good roasting over at Lambda the Ultimate) So far so good. Then he starts to tell us why he's qualified to pontificate on statically typed languages...
[I've] learned C, Objective-C, Java, and other "typed" languages since then. Actually, like many professionals I speak - or is that write - many computer languages fluently and can read many more. I'm no stranger to typed languages, they work and get can usually be made to get a job done.
Well, of course that starts the eyes rolling into the back of my head. Note the scare quotes around "typed" and note that anything other than an Algol-derived language is lumped into the "other" category. Hmm, no mention of Eiffel or Ada, let alone a language with a modern type system (you know, something straight out of the 1970s) like ML or Haskell.
If you're arguing that C's type system isn't the end of type-system evolution, you might as well skip this blog entry and go straight to (Perl maven) Mark Jason Dominus's thoughts on the subject.
Mr Lount goes on...
Dynamicly [sic] untyped-languages work very well without types. Indesputable [sic] fact.
I don't know why I'm supposed to accept this statement as "indisputable". In fact, I'd argue that types are the meta-system that gives programs meaning. When you send bits to your video card, it has to take into account what "type" the bits are to display them properly. Some bytes are characters (when the display is in text mode). Other bytes represent red, green, or blue pixel intensities. The same physical representation leads to a whole different meaning depending on what type the representation has. Other examples from the real world are abundant. If I gave you the word "tea", you'd need to know what language it's in (i.e. its type) before you know its meaning. The bit pattern 01110010 could be a number or a character or a pointer, but it probably only makes sense to make it uppercase in one of those cases. Whether you like it or not, your programs are loaded with types, even if they're not explicitly specified in the source code or the language semantics.
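To make the 01110010 example concrete, here's a minimal sketch (Python, chosen purely for illustration): the same bit pattern reads as the number 114 or as the character 'r', and only one of those interpretations supports uppercasing.

```python
# The same 8-bit pattern means different things under different types.
bits = 0b01110010   # the bit pattern from the text

as_int = bits        # read as an unsigned integer: 114
as_char = chr(bits)  # read as an ASCII character: 'r'

print(as_int)            # 114
print(as_char)           # r
print(as_char.upper())   # 'R' -- uppercasing makes sense for the character...
print(as_int + 1)        # ...and arithmetic makes sense for the number
```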
For the life of me, I can't figure out a good interpretation for taking the square root of a string, so I generally want to catch these types of error sooner, rather than later. So I prefer the static solution. And I don't want to roll my own abstraction violation preventer, so I'll settle for the one that comes with the compiler. Hey, lookie there, code reuse. Hot damn!
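A quick sketch of the difference (again in Python, for illustration): a dynamically checked language rejects the nonsensical square root only when that line actually executes, while a static checker such as mypy would reject the same call before the program runs at all.

```python
import math

# Dynamically checked: the bad call is only caught when it runs.
try:
    math.sqrt("hello")           # nonsense: square root of a string
except TypeError as e:
    print("caught at runtime:", e)

# A static checker (e.g. mypy) would flag math.sqrt("hello") at
# check time, before the program ever runs -- the compiler-supplied
# "abstraction violation preventer" mentioned above.
```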
And for those people who claim that they rarely ever make type errors in their programs, I'd tend to agree: you probably don't. But you probably don't have very sophisticated algorithms or data structures either. It's as if someone told me they had never run into a bug in their programs caused by floating-point round-off. That either means you've never written a numerical program, or you've never tested one properly.
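For the record, the floating-point analogy takes two lines of Python to demonstrate:

```python
# 0.1 has no exact binary representation, so repeated addition drifts.
total = sum(0.1 for _ in range(10))
print(total)              # 0.9999999999999999, not 1.0

print(0.1 + 0.2 == 0.3)   # False: 0.1 + 0.2 is 0.30000000000000004
```

If you've never seen a bug like this, you probably haven't gone looking.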
But let's continue...
Some very large applications have been written in Smalltalk, the most well known of the dynamicly untyped-languages. In fact, Das Kapital, in use by JPMorgan is a huge application that is used by the company to manage vast sums of money.
The Java effort to replace Das Kapital failed, in part because the users had become accustomed to the power and leverage that Smalltalk enabled us to build into it.
The astute reader will note that he never actually mentions why "types" caused this failure. In fact, maybe Smalltalk is just better than Java, and maybe types have nothing to do with it. Nah, that wouldn't fit in with our preconceived notions. Also, I guess we're supposed to believe that we can reliably extrapolate from this one data point. Oh, and that the relative failure of Smalltalk compared to Java is a market failure of almost unspeakable proportions. And while we're at it, it seems like he's conflating typing with dynamism. I wonder if he considers Scheme typed or un-typed.
While many in the "typed" and "functional" world have for some non-rational reason rejected "runtime type inferencing" as not being valid "type inferencing" their argument is mute [sic] and without basis.
Here's one ("non-rational"?) reason: "runtime type inferencing" with unit tests is an ad-hoc, error-prone way to do it. With compile-time type inference you get a whole bunch of unit tests for free, which have guaranteed 100% coverage (e.g. both branches of a conditional are checked), cover a potentially infinite number of cases (e.g. all the Integers), and come at a very minimal price. That price is minimal because you're going to have to fix any type error whether you get it at run-time or at compile-time. And why build your own unit-test suite to catch type errors, when (again) you can leverage the one that came with your compiler?
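The contrast is easy to sketch (Python again, with a hypothetical `add` function): the hand-rolled runtime check below only fires on the inputs your tests happen to exercise, whereas a static checker run over the annotated version verifies the property for every int at once, without executing anything.

```python
# Hand-rolled runtime "type test": an ad-hoc check plus a few unit tests.
def add(a: int, b: int) -> int:
    assert isinstance(a, int) and isinstance(b, int), "add wants ints"
    return a + b

# Unit tests cover only the handful of cases we thought to write...
assert add(1, 2) == 3
assert add(-5, 5) == 0

# ...while a compile-time checker (mypy, or an ML/Haskell compiler)
# proves add is well-typed for *all* integer arguments, for free.
```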
While compile time distinctions are useful for certain aspects they are a tiny corner of the full potential of the computing space whose majority of solution space exists at runtime. The proof? You can't use a computer, other than as a boat anchor, without "running it", thus by it's very definition, it's all about runtime.
Well, don't try too hard to be rigorous. One thing I'd actually like to see from one of these Smalltalk advocates is an article explaining the theoretical and practical differences they see between types and classes.