
Comment I don't get the public furor (Score 2) 658

I became interested in the history of code breaking and surveillance in the late 1970s, even before The Puzzle Palace permanently breached the NSA's public anonymity.

I don't get the public furor because there's nothing new here: what Snowden revealed is just a logical extension of how this program has always operated, as documented since way back for anyone who wanted to know. It has always been part of the anonymity construct that the NSA could purport (or purport by implication) that it operated within the groove of democratic principles, up to a point. The old relationship with the British (I'll watch yours, if you watch mine) was always a burden, but I guess that burden must have been manageable for a time.

Once COTS technology (Cisco, Nortel, Lucent, Alcatel, Juniper) begins to outpace the astrobuck edge, the NSA is forced by brutal practicalities to review and revise their anonymity construct. Just how much can be exchanged through a stiff-upper-lip tea service?

At this point, the NSA's democratic cloak is outright risible: any foreign person, anyone whose patterns of contact with such people are vaguely suspicious (there has never been a shortage of suspicion where suspicion greases operational desires), and anyone who crosses paths in any way with this substantial kernel of the vaguely suspicious, citizenship be damned. We're more than halfway along the spectrum of seven degrees.

Suppose we apply the principles of differential cryptanalysis to this interesting social network. Suppose there is some American citizen not yet trawled by this social graph of chance connection. What's the least amount of suspicion one must inject at some chosen suspicion-coloured node of this graph for a tentacle to slop out of the bucket and engulf the arbitrary citizen of the moment? Once engulfed, does this person ever escape the webbing again on principles of liberty and freedom, or is this person's only democratic salvation to fall beneath some metric of cost/benefit in keeping his or her node active in the vast suspicion graph? How much easier is it to be bumped back into this mesh once you've been on it before? Does that scarlet letter ever fall off?

I doubt there's anyone in America whose nose is so clean that ten minutes of brow-drenched pretext-manufacture by some nearby NSA staffer with any prospect of future promotion wouldn't serve to lasso this person onto the suspicion list by some ready-to-hand agency criterion (a clean nose for this purpose is mainly established by not getting out much except on Sunday morning, not using email, and never answering your telephone when pestered by a wrong number).

That's pretty much the minimal operational capability they would settle for, no matter which democratic cover story of the day hits the news cycle. I doubt they ever expected that a program as large as this could maintain cover of darkness indefinitely. So the real response and the public optics are mainly for consumption inside the Faraday cage: the Snowden meme is not one they wish to see take root among their own.

It's a basic tenet of military or police training to punish the group on the pretext of individual lapses, failures, or sloth until the group is conditioned to self-police. I wouldn't be surprised if everyone in the entire agency is working unpaid overtime on invented files (as in The Firm) until Snowden is brought to Faraday justice. I get the internal furor loud and clear.

Comment sarcasm as cognitive burden (Score 1) 167

This whole idea that sarcasm doesn't come through in text needs to be revisited.

I have a reputation in my work environment for being perceptive, thoughtful, and lucid. I also have a reputation for having near-perfect recall of anything previously discussed that could possibly go wrong, and for sometimes becoming extremely intense and hard to deter from constantly injecting these unhappy reminiscences into self-satisfied negotiations until everyone else glazes over. Others might characterize this as a geek loss-of-control thing. I prefer to characterize it as an obnoxious streak where I constantly remind people of just how lazy they are (cognitively).

Drucker says that if there's no conflict around a decision, you should cancel the meeting and come back better prepared. I have this weird capacity to internalize long lists of reasons why anything might possibly not work, and recall much of this years later, the way some people memorize lists of baseball players. Let's put it this way: on the first round of viewing, no one in the room was surprised that I could beat Sheldon to many of his lines. Nor would people have been surprised for me to comment (at an appropriate juncture) that K4 is the smallest complete graph with no Hamiltonian path. So for the purposes of linear screen-writing (essential to the joke), Spock (the punch line) needs his lizard. But why "lizard"? Well, some combination of phonology, meter, and a viable coin-flip vagary of superiorities and frailties, with a subtle invocation of cheesy Gorn action figures (universally possessed among the group, though left unstated in the script through the use of Hemingway silence).

First of all, dead-straight sarcasm mainly plays against personal memes established in the group. If you're widely known as the guy most likely to mention Hamiltonians and Hemingway in the same sentence, your remarks will be taken in a certain light when your sentence is uncharacteristically plain, and you can play off that. No single-utterance algorithm ever devised will detect this (though it might succeed in raising a flag that an otherwise bare statement likely plays off in-group dynamics).

Second, much intended sarcasm is simply bad writing, typically perpetrated by the participants who are better at jostling for attention than presenting a sustained perspective. These people want their successes to be more memorable than their failures and tend to be completely content receiving credit for a remark construed opposite to the spin attempted. These people are happiest when no particular valence sticks to their persona. Misconstrual is half the payload. That's the closest they ever get to a date. (Note that this verbal pun on "Miss Construal" leaves the word "menstrual" partially activated in the subconscious with nowhere to go, which is half of the humour but none of the joke. The other half of the humour, but not the joke, is the unsticky half payload of no particular valence. "There was some scattered clapping, but most of them were trying to work it out to see if it impugns virility." That's also funny on a second level: mocking with a formal nuance of the word "impugn" this entire business of establishing one's virility via assertional discourse. Oops, I did it again.) Summarizing: bad sarcasm is half-assed and, sadly, good sarcasm often lands to scattered applause (someone disaccord me a Kleenex).

Finally, sarcasm is not a quality of an utterance as a whole. The same paragraph can wend through sarcasm, mockery (sometimes self-ish), memes, cues, call-backs, verbal and visual double meanings with all the obviousness of Sheldon's didactic Hamiltonian. Koothrappali gets it, on the first take.

That's a joke itself (from the original script) in how it emphasizes one kind of cognition over another. More typical in human discourse is a litany of ten ambiguous statements. The people who get sarcasm most reliably are the ones who can maintain a larger cloud of ambiguity for longer durations. Rushing to accept the surface implication is a way of reducing cognitive burden for the lazy mind. Lazy minds indulging in an orgy of positive reinforcement is what we know and love as mob psychology. Sarcasm is a way of queering these feedback loops by bifurcating the coefficients. To a first approximation, many lost souls wish to become the least cool member of the coolest club that will take them, so the coolest members have to keep the ramparts slippery.

Comment blaming the government (Score 4, Funny) 287

I watched a Bill Maher video yesterday in which a conservative politician, one who clearly believed that cleanliness (and short hair) is next to godliness, claimed to believe in "adaptation" but not a certain fish story when confronted by a historically unelectable Canadian politician about whether he believed in antibiotic resistance (where the evolution of the resistance trait was greatly accelerated by careless overuse).

I actually cut the guy some slack. There's no reason why he can't logically believe in the special theory of evolution (local adaptation) without necessarily believing in the general theory of evolution (the ascent of complexity from primordial origins). To believe in one without the other requires a larger-than-average mental judgement in between. Unfortunately, he lamely fell back on invoking the missing link. Bzzzzt. Thanks for playing.

Clearly he hasn't checked in with the Out of Africa theory lately, which was speculative until we began to read DNA in the early 1980s with all the proficiency of a clever three-year-old. Right now we're at about year two of a ten-year post-graduate program in speed reading for lifeforms with facet eyes. Things have changed. If there were any region of the globe over the past 10,000 years (or 100,000 years) where the genetic lineage of any species of quadruped (Noah being the patron saint of charismatic megafauna) was constricted to a single breeding pair, we'll surely find it soon on the rising flood of sequence data. Dude groomed for rapture should be worrying about the missing crink, not the missing link.

I can't say I have a higher opinion of "blame the government". It's like blaming calcium for arthritis, on the grounds that sans calcium, arthritis as we know it would no longer exist. The problem here is that calcium is just the implementation. The specification is to have a load bearing structure nimble enough to evade and pursue (aka biosecurity). A large branch of the solution space descends from elbows and kneecaps.

One of the major functions of a large population is agreeing on the threat enough to achieve cohesion in the threat response. This is mirrored in the organism by how the fight/flight response is balanced on a knife edge, and how the hormones that prime this metabolic state also tamp down the immune response. Guess what, libertarians: that's a centralized response.

You can discard the implementation (government as we know it), but you can't discard the specification. Unfortunately, contrary to the most vociferous howls, the problems are actually rooted in the specification, not the implementation.

Just like replacing an aging software system, while it's absolutely certain that the worst points of friction in the existing system will go away, new points of friction are extremely likely to take their place, unless you stumble upon the "silver bullet" solution paradigm (social media won't let you down). I tend to be fairly reluctant to stick up my hand when a surgeon promises to cure my arthritic knee by lopping off my leg and grafting on a tentacle to replace it. I worry that might bring with it new problems every bit as annoying as the previous problem.

The present state of the NSA and the legislation around it is pretty much an unbroken story since the end of the First World War. (The Germans did not invent Enigma on a fall afternoon in 1939.) I vaguely recall reading in The Puzzle Palace (or something similar from the same era) that before the U.S. government passed a law preventing secret agencies from spying on American citizens, there was already a secret law on the books exempting a certain no-such-agency from being beholden to any such future law.

Democracy it turns out is a lot like the human immune system. It shuts down on a dime in the presence of an acute threat, as defined by the pulsed secretion of some small gland. Once you get to the place where the small gland sees a lion in every box of Cracker Jack, democracy is reduced to vestigial status, until we're all killed by a disease transmitted by infected telephones. Then the cycle repeats. The general theory of evolution seems to iterate on foolish overreaching. Perhaps evolving toward complexity is a hard problem where nothing useful is achieved by linear enlightenment.

This engineering problem of how to achieve group consensus on when to shut down group consensus remains to be solved. Software engineers who delight in replacing old cruft (whose pains are self-evident) with new cruft (whose pains can only be imagined) need not apply. Dweeb-thought: if only we replaced Oracle with MySQL, life would all be good. Deep-thought: mutex-free distributed integrity is a hard problem.

Politically, well over 90% of the negativity toward government I see expressed on the intertubes falls into the dweeb-thought bucket.

Comment hands in your pockets (Score 2) 778

Glenn Gould used to take a lot of flak for refusing to shake people's hands, even though we all know that you can't go through life refusing to shake hands. Perhaps he had a good reason?

Even if you're less of a sociopathic hypochondriac than Glenn Gould, there's still an issue concerning how automatically one reaches out. I'm a little more hesitant to offer my mitt to a vagrant who's just popped out of a discreet alleyway with flecks of an old newspaper stuck to his shoe. Colour me paranoid. And yet the default on the web is to arrive on every web page in full embrace, even the typosquatters with old newspaper stuck to their shoes.

On my FF I have things pretty locked down. If on first impression I haven't teleported into the worst bathroom in all of Scotland, I'm pretty quick to enable first party cookies. Tracking cookies from the social media paparazzi, not so quickly.

When I get a site coded to misbehave at the first whiff of the end user exercising prudence or discretion, I switch the URL into Chrome, where I have practically nothing locked down and visit nowhere important, and where the social media paparazzi will observe my click trail as that of an infrequently engaged user who exclusively visits the wrong side of town, but never, ever pulls his hands out of his pockets to engage the temptations.

What's in your wallet?

Comment elusive simplicity (Score 1) 381

Something like Ukkonen's algorithm is both hard to explain and a good idea, and that's just the first one to come to mind.

Suffix trees and suffix arrays make for a brilliant study of elusive simplicity.

Suffix array

A well-known recursive algorithm for integer alphabets is the DC3 / skew algorithm of Kärkkäinen & Sanders (2003).

The paper includes a 50 line reference implementation (excluding comments).

One of the first algorithms to achieve all goals is the SA-IS algorithm of Nong, Zhang & Chan (2009). The algorithm is also rather simple (< 100 LOC) and can be enhanced to simultaneously construct the LCP array. The SA-IS algorithm is one of the fastest known suffix array construction algorithms. A careful implementation by Yuta Mori outperforms most other linear or super-linear construction approaches.
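
For anyone who has never built one, here's a minimal sketch of what all these algorithms actually compute, done the naive way: sort the suffix start positions by comparing the suffixes directly, which is O(n^2 log n) in the worst case. The entire point of DC3 and SA-IS is to produce the same array in linear time; names and layout below are mine, not from the papers.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Naive suffix array construction: sort the suffix start positions by
   comparing the suffixes themselves. Worst case O(n^2 log n), which is
   exactly what the linear-time constructions were invented to avoid. */
static const char *g_text;

static int cmp_suffix(const void *a, const void *b)
{
    return strcmp(g_text + *(const int *)a, g_text + *(const int *)b);
}

int main(void)
{
    const char *text = "banana";
    int n = (int)strlen(text);
    int *sa = malloc(n * sizeof *sa);
    int i;

    g_text = text;
    for (i = 0; i < n; i++)
        sa[i] = i;
    qsort(sa, n, sizeof *sa, cmp_suffix);

    for (i = 0; i < n; i++)
        printf("%2d %s\n", sa[i], text + sa[i]);
    /* prints: 5 a, 3 ana, 1 anana, 0 banana, 4 na, 2 nana */
    free(sa);
    return 0;
}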

Why wasn't this algorithm discovered thirty years ago?

The concept was first introduced as a position tree by Weiner (1973), which Donald Knuth subsequently characterized as "Algorithm of the Year 1973". The construction was greatly simplified by McCreight (1976), and also by Ukkonen (1995).

It's not like people didn't recognize this algorithm as an important building block since way back. Thirty-six years to arrive at "rather simple". Amazing.

Yeah, and one more thing: Slashdot has been around for sixteen years and still can't render diacritics pasted in from Wikipedia. Who could have anticipated we'd wish to use those? Besides, it's a good American tradition. Right after being awestruck by the Statue of Liberty, Karkkainen steps off the boat and declares his name to the port authority.

Karkkainen? What kind of name is that? Umlaut schmoomlaut. You can have Kirkby or Kirklen. Kirlen it is then. What's that? I missed a K? Whatever, no point starting over. Next!

Comment Re:The power of love (Score 1) 204

It wasn't the work that got the result, it was the work + training + money, without any one of those ingredients he wouldn't have gotten the result he did.

You don't seem to grasp the asymmetry between life and death. I was configuring a FreeBSD jail the other day. The guide I consulted strongly recommended that you begin with a fully featured jail and then subtract until it breaks, rather than start with a bare jail and add until it works. There are usually about 100,000 ways you can yank out a coloured wire and cause something complicated to break. Which one is the God wire?

Sure he started 28,000 feet up the mountain. I've heard the last 1000 feet poses more difficulty than most humans wish to endure. The reason a paraplegic can haul himself arm over arm out of the Grand Canyon is because he has a T10 injury rather than something higher up. Your weird subtractive calculus totally misses the point.

Hardware

Quantum-Tunneling Electrons Could Make Semiconductors Obsolete 276

Nerval's Lobster writes "The powerful, reliable combination of transistors and semiconductors in computer processors could give way to systems built on the way electrons misbehave, all of it contained in circuits that warp even the most basic rules of physics. Rather than relying on a predictable flow of electrons that appear to know whether they are particles or waves, the new approach depends on quantum tunneling, in which electrons given the right incentive can travel faster than light, appear to arrive at a new location before having left the old one, and pass straight through barriers that should be able to hold them back. Quantum tunneling is one of a series of quantum-mechanics-related techniques being developed as possible replacements for transistors embedded in semiconducting materials such as silicon. Unlike traditional transistors, circuits built by creating pathways for electrons to travel across a bed of nanotubes are not limited by any size restriction relevant to current manufacturing methods, require far less power than even the tiniest transistors, and do not give off heat or leak electricity as waste products, according to Yoke Khin Yap of Michigan Technological University, lead author of a paper describing the technique, which was published in the journal Advanced Materials last week."

Comment alternate universe (Score 1) 98

Oh yes it did. I'm guessing you're just too young to remember. Thanks to massive os/2 tv campaigns, "normal" people suddenly wanted a computer, not just a console to play games on

I'm certainly not "too young to remember". I wish.

It was a different world then. There wasn't an internet to immediately find out that some marketing term was full of shit. If five percent of the population at the time could distinguish OS/2 from PS/2 I'd be shocked. The one thing people knew for certain is that IBM never went hungry. IBM was attempting to run the entire information technology industry as a centrally planned economy, with some success. When the PC division was finally cut loose from the rest of the Blue Machine, it was mainly to free it from the IBM culture of seven layers of internal review on every decision about capability, volume, or price.

The only reason IBM entered the PC business in the first place was to drain away the nimbleness of young legs. If IBM had allowed the PC industry to cannibalize the mid-range sooner and more aggressively, all their employees clinging to incentive clauses in their mid-range operations would have started to circulate their resumes, both within IBM and without. As my brother never ceases to repeat: the first rats off a sinking ship are the best swimmers. Loss of talent off the top would have been horrendous in some of their existing cash-cow business lines. Quarterly earnings reports would have ceased to glow and executives would have been spending more quality time with family.

Businesses really do paint themselves into a corner with their internal incentive structures. Tearing up all those employment contracts is disruptive. Clinging to the past is dangerous. Operating a company with different rules in different divisions can quickly gut your workforce at the high end, as the best swimmers stampede to opportunity unleashed. It's extraordinarily rare to gut the cash cow, no matter how rabid the skinny upstart across the street.

What IBM underestimated was the acceleration term: how much more quickly a person armed with a crappy PC was able to figure out they had been saddled with an over-built and over-priced tank capriciously constrained to lumber along with an insufficient engine for a decade or more.

The Intel 80286 had 134,000 transistors. A Cortex-M0 can be implemented in about 12K gates. At a rule of thumb of roughly 12 transistors for a general-purpose flip-flop, these designs are at about the same level of complexity. The 80286 runs 2.66 MIPS at 12.5 MHz. The M0 runs 0.9 MIPS/MHz (wider MIPS to boot). Now it might be the case that exploiting the Cortex instruction set back in the eighties was beyond the compiler technology of the day, but somehow I have my doubts that IBM was incapable of crossing that bridge had they chosen to do so.
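
Back-of-the-envelope, using nothing but the figures quoted above (a sketch, not a datasheet comparison):

#include <stdio.h>

int main(void)
{
    /* Figures quoted above: 80286 at 2.66 MIPS / 12.5 MHz, M0 at 0.9 MIPS/MHz. */
    double mips_per_mhz_286 = 2.66 / 12.5;   /* about 0.21 */
    double mips_per_mhz_m0  = 0.9;

    /* Per-clock advantage of the M0, before the memory-pressure caveats below. */
    printf("%.1fx\n", mips_per_mhz_m0 / mips_per_mhz_286);   /* about 4.2x */
    return 0;
}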

I'd be very curious to see someone figure out how well a Cortex M0 could have been implemented in the 80286 process technology. Three to one margin? It's certainly possible on the surface numbers. The downside of the Cortex is increasing memory pressure with wider native memory cycles and a more severe performance trade-off when byte-packing or bit-packing every important data structure. The wider off-chip memory path is a significant PCB fabrication cost.

As I correct one myopic IBM decision after another I wind up in an alternate universe where AT&T sues IBM instead of suing BSD/Cortex. Those of us who lived through this era spent a lot of time day-dreaming about alternate universes.

The Military

Fear of Thinking War Machines May Push U.S. To Exascale 192

dcblogs writes "Unlike China and Europe, the U.S. has yet to adopt and fund an exascale development program, and concerns about what that means to U.S. security are growing darker and more dire. If the U.S. falls behind in HPC, the consequences will be 'in a word, devastating,' Selmer Bringsford, chair of the Department. of Cognitive Science at Rensselaer Polytechnic Institute, said at a U.S. House forum this week. 'If we were to lose our capacity to build preeminently smart machines, that would be a very dark situation, because machines can serve as weapons.' The House is about to get a bill requiring the Dept. of Energy to establish an exascale program. But the expected funding level, about $200 million annually, 'is better than nothing, but compared to China and Europe it's at least 10 times too low,' said Earl Joseph, an HPC analyst at IDC. David McQueeney, vice president of IBM research, told lawmakers that HPC systems now have the ability to not only deal with large data sets but 'to draw insights out of them.' The new generation of machines are being programmed to understand what the data sources are telling them, he said."

Comment low Android sex drive (Score 1) 64

I'm also getting closer to ten days per charge mainly running the low power Big Time watchface and not receiving too many notifications.

First win: I've programmed my own watchface with a non-standard time coordinate that matters to me.
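
For the curious, a Pebble watchface in C is barely more than a skeleton. This is a minimal sketch from memory of the SDK style, handler names are mine, and the non-standard time coordinate goes wherever the strftime call sits.

#include <pebble.h>

static Window *s_window;
static TextLayer *s_time_layer;

/* Called once per minute; format whatever time coordinate matters to you. */
static void tick_handler(struct tm *tick_time, TimeUnits units_changed) {
  static char s_buf[16];
  strftime(s_buf, sizeof(s_buf), "%H:%M", tick_time);
  text_layer_set_text(s_time_layer, s_buf);
}

static void init(void) {
  s_window = window_create();
  s_time_layer = text_layer_create(GRect(0, 60, 144, 40));
  layer_add_child(window_get_root_layer(s_window),
                  text_layer_get_layer(s_time_layer));
  tick_timer_service_subscribe(MINUTE_UNIT, tick_handler);
  window_stack_push(s_window, true);
}

static void deinit(void) {
  tick_timer_service_unsubscribe();
  text_layer_destroy(s_time_layer);
  window_destroy(s_window);
}

int main(void) {
  init();
  app_event_loop();
  deinit();
  return 0;
}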

Second win: I used to take a medication daily that had to be taken at a precise time in the mid-afternoon for optimum effect. Even after more than a year of practice, I still missed one audible watch alarm every ten days to two weeks. I don't wear my phone on my belt (it gets set down across the room when I'm at home), so that wouldn't have been reliable either. I never miss Pebble's wrist buzzer if I'm wearing the watch. Even when I'm in the shower, if the watch is placed on a hard surface, it makes enough noise to hear over the splashing water. I could wear it in the shower, but I don't wish to expose it to my nasty medicated shampoo.

Fortunately I've been immune all my life to any concern over whether someone out there might think something is cool, preferring instead to seek out my own functionality. I like mine 20" square (in pairs) or small and unobtrusive. I find the 4" lifestyle most awkward of all: large enough to constantly notice you have it, too small to be completely effective. Likewise, I find Twitter completely ridiculous. Either the message should read "Beers 5 o'clock?" or it should be written with full sentences and paragraph units.

I watched a video on illicit cognitive enhancing drugs last night. I can see the appeal for the younger generation. They need to recover the 10% of their brain power they lose by the over-use of these ridiculous tweener form factors which specialize in mental fragments longer than a smoke signal and shorter than a completed thought.

Third win: This morning I received a phone call while I was still in bed. My watch rasped on my bed-side table so I opened one eye, determined it was a call I wanted that could wait for another hour, then rolled over and went right back to sleep. My phone was in the far corner of the house. I'm really surprised it works at all at that distance. (I've also missed a few from this distance. This might depend on charge status of one device or the other.)

Given that I don't actually sleep with my phone (low sex drive, I guess) my Pebble easily earns its keep.

Comment poppycops (Score 2) 476

This is ridiculous. No one would take a one-time one foot rise in global sea level seriously if it wasn't being construed as a canary in a coal mine with respect to a larger threat. They would just accept the city being built with insufficient surge margin as one of a thousand things done differently one hundred years ago.

Nor would people rush to conclude that a one-time one foot rise in sea level was a high price to pay with what humanity has achieved in the last one hundred years.

Building too close to unpredictable water is an ageless human tradition.

I think it's poppycock to tie an amorphous process such as global warming to any specific counterfactual. There are many environmental carcinogens where we know exposure doubles the base rate, but we can't point to any one specific person and say "you died because of this".

It's unscientific in attitude to dupe the public into thinking that science operates in these terms. One does not need a concrete case of cause and effect in order for a process to have real effects. Even if the sea level had declined by a foot, some storm somewhere would have been worse. I've never had much appetite for scientists drawn into PR.

Comment what's in a name (Score 1) 656

If you want a hard-core mathematical proof that your code fulfills certain post-conditions etc., there's a large body of knowledge about how to go about it when the problem is posed in a functional programming language. Doing it to an otherwise unconstrained piece of C code is much harder.

If you want a hard-core mathematical proof about how your code behaves in time and space (for a value of time and space that makes your software market competitive) often a procedural representation is better.

Look at what happened between ATM and IP networking: "Another key ATM concept involves the traffic contract." For TCP/IP over Ethernet, the "channel contract" was a reamed-out muzzle diameter.

Two viable business models:

* Usain Bolt with a water-resistant wristwatch
* Arnold Schwarzenegger with a waterproof wristwatch

One permits more formal math than the other. I'm guessing Bolt is cooling down before 'egger has finished filling in his entry form.

Comment Re:Sounds like a huge risk (Score 1) 94

I totally agree. Seven days is long enough for a vendor to formulate a sober verbal response and run it through channels when their customers are already being rooted due to egregious failings in their software products.

At the very least the customers can increase vigilance around the disclosed vulnerability.

Sure wouldn't hurt if this policy leads to fewer egregious and embarrassing software flaws in the first place.

Comment binary in 1972 (Score 1) 623

My father taught me binary in 1971 when I was eight years old. He showed up one evening with black marbles and the bottom half of an egg carton. He had learned this from one of the original APL greybeards who attended his church. My father, having himself dropped out of engineering to switch to theology, had an interest in these things. Binary itself was easy (easier than learning to read an analog clock face). What took another week or so was puzzling out that binary was just a representation of the abstract notion of the integers. I wanted to learn more about computers, but hardly any books existed. Two years later I had pestered my father enough that he brought home four books from the University of Calgary library. He said he had brought most of the books that seemed even vaguely accessible. Most of these were stupid books full of pictures of shiny IBM consoles. I pitched them onto my bedroom floor in disgust.

One actually taught some programming, mainly from the flowchart perspective. I tried to write a flowchart of getting up in the morning to go to school and all the decisions involved. This quickly got out of hand (I was fated to never become good at getting up in the morning). I concluded a month later that flowcharts were intellectually damaged: too bushy for the paltry logic they managed to encapsulate.

In 1976 I got my hands on 8008/8080 datasheets. The dumb thing took three power supplies and was far too expensive for me to ever own. I also soon acquired a TTL data book and realized I could design my own micro-controller from discrete logic. I designed such a thing on paper in the back of English Literature class. I like literature, but the teacher was very boring and she never told my parents when I didn't hand in my assignments, so as far as I was concerned this class was a spare.

My grade six math teacher had allowed four members of the class to work at our own speed, after testing us with arithmetic quizzes on a sequence of recorded tapes. I very nearly finished the last and fastest tape (very fast) but got ahead of myself trying to multitask the current question with a question I had missed. I didn't want less than a perfect score and wasn't mature enough to let that one question go. Then I jumbled five questions in a row trying to remember all five at the same time. I had never experienced not keeping up in math class before.

It was nice to be left to my own devices, but he compensated for his largess by making us write out in full nearly every darn exercise at the end of every chapter. I could pretty much read a Heinlein book on the side while doing 100 metric conversions long-hand. My progress was rate limited mainly by my pencil. By the end of the year I had completed the grade nine algebra textbook.

If my grade seven teacher had let me stay on the same track, I would have completed high school algebra by xmas. But he insisted that I stay with the rest of my classmates doing fractions again, or some rot. This bugged the hell out of me because the jocks with talent got special attention, and math is even worse than athletics as something where you can go a lot further if you start young. Just watched Proof the other night. Hopkins: "How many days did you lose? How many!" Days? I lost fucking years.

In 1978 I finally got my hands on more than a TI-30. My school bought a TRS-80 with 4kB of system RAM and 7/8kB of video RAM (16 rows of 64 characters by seven bits). This was to save ONE whole 1K x 1 memory chip. (The font ROM actually had lower case characters, but the memory bit that drove this pin wasn't there.) If the msb was 0, you got 64 different printable characters (not including lower case). If the msb was 1, the lower six bits controlled a 3x2 pixel block in the font ROM (making 48x128 pixels total, if you chose to treat this as a bit-mapped display).
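
For anyone who never poked that machine, here's roughly how the block graphics fell out of that arrangement, sketched in C; the exact bit-to-pixel layout is from memory, so treat it as approximate.

#include <stdint.h>

/* 64x16 character cells; setting the msb of a cell turns it into a 2-wide
   by 3-tall block of pixels selected by the low six bits, giving 128x48
   addressable pixels over the whole screen. */
static uint8_t video[16][64];   /* stand-in for the real video RAM */

void set_pixel(int x, int y)    /* x in 0..127, y in 0..47 */
{
    uint8_t *cell = &video[y / 3][x / 2];
    int bit = (y % 3) * 2 + (x % 2);   /* assumed order: left-to-right, top-to-bottom */

    if (!(*cell & 0x80))
        *cell = 0x80;           /* wipe any text character, start with all pixels off */
    *cell |= (uint8_t)(1 << bit);
}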

I was also given an SC/MP homebrew by a local electronics instructor. He taught me hex in five minutes (but neglected twos complement for negative numbers). This was nothing but toggle switches (ten, for the address) and eight buttons to set individual bits (one button to clear the location). I had an extremely frustrating night trying to puzzle out how the branch instruction worked (not immediately realizing that the twos-complement offset was relative to the address of the instruction that followed). The next year I had an APL account on the university computer system, having taken calculus early. By then I had disassembled much of the TRS-80 BASIC in ROM and found an undocumented cassette tape routine for loading programs coded in Z80 assembly language. I wrote a Galaga-style game in Z80 assembly language using a nasty assembler I whipped up in BASIC. Since I had memorized most of the opcodes by then, it only handled the label arithmetic. Putting in symbolic opcodes would only have made the program slower and less reliable to read off the horrible cassette tape drive, which usually took five passes for the simplest program.
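
For anyone who hasn't had that particular frustrating night, the convention amounts to this (a generic sketch of next-instruction-relative branching with the numbers made up for illustration, not a cycle-accurate SC/MP model):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* A two-byte branch at 0x020A wants to jump back to 0x0200. The offset
       byte is twos-complement, measured from the address of the instruction
       that follows the branch. */
    uint16_t branch_addr = 0x020A;
    uint16_t next_addr   = branch_addr + 2;
    uint16_t target      = 0x0200;
    int8_t   offset      = (int8_t)(target - next_addr);

    printf("offset byte: 0x%02X (%d)\n", (uint8_t)offset, offset);
    /* prints 0xF4 (-12), because 0x020C + (-12) == 0x0200 */
    return 0;
}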

At university they were forcing us to take COBOL and Fortran for the CO-OP job market. My roommate and I both had Z80-based systems of our own by then. His was a Heathkit. Mine was the Osborne. The IBM PC did not yet exist, and I had never fallen in love with Apple (that continues).

One day he hands me a zip-locked baggy with a floppy inside (actually floppy). It was a C compiler from the Software Toolworks. What a breath of fresh air compared to Pascal! I was hooked on C forever after, or at least until 1996 when I discovered the C++ STL and template metaprogramming. These days I mainly program in C/C++ and R (my APL heritage dies hard).

Two years later (still in the early 1980s) I actually programmed on a Xerox Dorado for an hour or so, when a classmate had a work term at Xerox PARC and I biked down the west coast to Stanford.

There weren't many major outside influences: Knuth, Dijkstra, Hoare, Wirth, Plauger, Iverson, Brooks, Bertrand Meyer, K&R, Walls, and one paper by Michael Jackson. One book I beat to death was an early book on writing portable C programs. Don't recall the author just now, but I had it around the time I purchased my first 386 based system from an unknown mail-order company named Gateway 2000. That was the book more than any other that taught me how to program professionally. The next large system I wrote was ported from MSDOS to QNX in a very short time (the glorious Watcom compiler presiding). Man that feels good after wading through so much shit code.

#define ISUCK 42-1

Good god man, get some sideview mirrors on that expression!

for (i = 0; i < ISUCK^2; ++i) never_get_there();

Amazingly, ISUCK is a fixed point under exponentiation.
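
To spell out the gag for the skimmers: read ^ as exponentiation, the way the macro's author evidently does, and the expansion 42-1^2 comes back as 42-1, i.e. ISUCK again, hence the fixed point. In real C, ^ is XOR and the subtraction binds tighter anyway, so the loop bound is (42-1)^2 = 43 and never_get_there() very much gets there. The sideview mirrors in question are just the usual macro hygiene:

#define ISUCK (42 - 1)   /* same macro, now wearing its sideview mirrors */

/* Without the parentheses, ISUCK*2 would expand to 42-1*2 == 40 rather than
   82; the ^2 above only escaped intact because '-' happens to bind tighter
   than '^'. */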

On a side note, my last completed program was a Pebble watch face written in C. Good times.
