There must be some great examples of 'too-clever APL' out there unless the mag tapes and IBM 2311s they were saved on have all Gone South by now.
Learning APL as my first high level programming language (at age 18) was one of the best things I ever did.
The vast majority of "cleverness" in APL one-liners involved exploiting deep mathematical identities which were completely obvious once mentally unpacked. If you used the language enough, you started to recognize certain chunks of symbols as idioms, and you could pretty quickly guess which mathematical mother sauce had been tarted up with carrots and pine nuts. You practically felt your IQ growing as you gained fluency.
The human mind is amazingly sophisticated at some very difficult tasks (ask anyone who has ever studied human vision), but at the same time we stumble over double negatives (to say nothing of triple negatives), which involve nothing more than determining the parity of a small integer.
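To make the point concrete, here is a minimal sketch (mine, not from the original; the function name is made up for illustration) of just how little computation a stack of negatives actually requires:

```python
# Resolving "not not (not ...) guilty" is just the parity of the negation count.
# A machine finds this trivial; human readers routinely stumble on it.
def resolve_negations(value: bool, negation_count: int) -> bool:
    # Even number of negations: the value survives; odd: it flips.
    return value if negation_count % 2 == 0 else not value

print(resolve_negations(True, 2))  # "not not guilty" -> True (guilty)
print(resolve_negations(True, 3))  # "not not not guilty" -> False
```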
Anyone who has ever assembled a BBQ knows just how badly the human brain handles the dihedral group D_4. If every piece in the kit were marked with its symmetry code, there would be far fewer BBQ-assembly nightmares (like failing to notice a small asymmetric drill hole in a strut that is otherwise completely reversible).
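A quick illustrative sketch of why that drill hole matters (my own example, assuming the part is modeled as a square centered at the origin): a centered feature is fixed by all eight symmetries of D_4, while an off-center one has eight distinct images, so every wrong flip of the strut is a distinguishable mistake.

```python
def d4_images(p):
    """All images of a point p under the dihedral group D_4:
    four rotations, plus the reflection of each rotation."""
    images = []
    q = p
    for _ in range(4):
        images.append(q)
        images.append((-q[0], q[1]))  # reflect across the vertical axis
        q = (-q[1], q[0])             # rotate 90 degrees counterclockwise
    return images

# A hole dead center has a single image: the strut is fully reversible.
print(len(set(d4_images((0, 0)))))  # -> 1
# An off-center hole breaks every symmetry: 8 distinguishable orientations.
print(len(set(d4_images((1, 3)))))  # -> 8
```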
And why is it possible to tell someone to head north, turn right, go a bit, turn left, go another bit, turn left again, pass the church, then turn right and look for your destination, without the person immediately knowing the direction of the final street? The task is vastly easier than many other mental functions we perform effortlessly.
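Tracking a heading through those directions is nothing more than arithmetic mod 4. A minimal sketch (my own, not from the original) of the route as given:

```python
# Compass headings in clockwise order, so "turn right" is +1 mod 4.
DIRS = ["north", "east", "south", "west"]

def final_heading(start: str, turns: str) -> str:
    h = DIRS.index(start)
    for t in turns.split():
        h = (h + (1 if t == "right" else -1)) % 4
    return DIRS[h]

# The route in the text: head north, then right, left, left, right.
print(final_heading("north", "right left left right"))  # -> north
```

Four additions mod 4, and yet most people giving or receiving those directions could not name the final street's orientation without acting it out.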
APL makes one small step of logical necessity which involves one giant leap of man's brain: that we really can be taught not to be ignoramuses about the orientation of a multidimensional array after it is transposed in three different ways as it works its way through an expression operating along different subsets of axes.
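The bookkeeping itself is just composition of permutations. A sketch in Python (an assumed convention borrowed from NumPy-style transposition, where `perm[i]` names which source axis lands in position `i`; the original names no specific language):

```python
def compose(perm_a, perm_b):
    """Net axis permutation after applying perm_a, then perm_b."""
    return tuple(perm_a[i] for i in perm_b)

# Three successive transposes of a rank-3 array:
p = (0, 1, 2)                                 # identity: nothing moved yet
for t in [(2, 0, 1), (0, 2, 1), (1, 0, 2)]:
    p = compose(p, t)
print(p)  # -> (1, 2, 0): original axis 1 is now first, axis 2 second, axis 0 last
```

Three tiny permutations to compose, yet keeping that orientation straight in one's head mid-expression is exactly the "giant leap" the fluent APL programmer had to train into.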
The bad kind of cleverness is relying on something that's kind of arbitrary about your data structure or its arrangement. "Well, in that case, it just falls through to the zero that terminates the following data object after walking a few extra bytes." (Don't suppose that assembly language coders working with just a few hundred bytes of RAM or ROM didn't regularly do exactly this kind of thing.)
You never get that kind of thing in APL, because that language is stripped down to mathematical essentials and almost completely devoid of weird-arbitrary-thingness.
If your application domain was full of weird arbitrary thingness, APL was a pretty poor match as a choice of programming language. You wanted to use APL to solve problems where some kind of deep order was in there screaming to rise to the surface.
You might think something was kind of too cleverish on first pass, but 90% of the time your final verdict escalated from clever to deep.
Everything I programmed in APL assumed that the vast majority of working data lived in system memory, like some kind of calculator on steroids. It was far from a great systems language.
It's because of my early experience with APL that I became permanently immune to certain kinds of fuzzy thinking. The only way to program a computer to pass a Turing test in which it's as bad as the human mind at simple parity, or at tracking compass directions, or at mapping BBQ parts onto D_4, is to use neural networks that mimic the construction of the human brain.
Do you really think that Robocop cares if he's holding a book upside down? There might be situations where a book balances better with the fat side on the right, regardless of text orientation. When he gets to the midpoint, he rotates the book 180 degrees and starts flipping the pages in the opposite direction.
Is he being too clever, or is he just being sensible, without the encumbrance of a ridiculous human perceptual asymmetry?
Human cognitive limitations are rather Byzantine. Our programming languages get so bound up with catering to our cognitive limitations that we succumb to writing algorithms that are less than pure or fully general, and we hardly notice.
Commander Data would not agree that APL was peculiarly amenable to excess cleverness. In truth, he would argue the complete opposite—unless he had his weird human cognitive limitations chip installed and fully activated.
This entire article is too silly to bother comprehending, so far as I can see, but one fundamental aspect of source text the authors clearly aren't thinking about (one that can be determined without even comprehending the text) is that program texts form an evolution space, highly suggestive of similar programs that might be constructed within a short edit distance. Part of the function of program text is to correctly specify what it doesn't yet do, but could reasonably be made to do. (Bad code bases fail at this essential function: you can never tell whether an apparently reasonable change will work as expected until you actually try it.)
APL was a tiny language best described as the anti-PHP.
That men do not learn very much from the lessons of history is the most important of all the lessons that history has to teach.
— Aldous Huxley