
## Comment hands in your pockets (Score 2)

Glenn Gould used to take a lot of flak for refusing to shake people's hands, even though we all know that you can't go through life refusing to shake hands. Perhaps he had a good reason?

Even if you're less of a sociopathic hypochondriac than Glenn Gould, there's still an issue concerning how automatically one reaches out. I'm a little more hesitant to offer my mitt to a vagrant who's just popped out of a discreet alleyway with flecks of an old newspaper stuck to their shoe. Colour me paranoid. And yet the default on the web is to arrive on every web page in full embrace, even the typosquatters with old newspaper stuck to their shoes.

On my FF I have things pretty locked down. If on first impression I haven't teleported into the worst bathroom in all of Scotland, I'm pretty quick to enable first party cookies. Tracking cookies from the social media paparazzi, not so quickly.

When I get a site coded to misbehave at the first whiff of the end user exercising prudence or discretion, I switch the URL into Chrome, where I have practically nothing locked down and visit nowhere important, and where the social media paparazzi will observe my click trail as an infrequently engaged user who exclusively visits the wrong side of town but never, never pulls his hands out of his pockets to engage the temptations.

## Comment elusive simplicity (Score 1)

Something like Ukkonen's algorithm is both hard to explain and a good idea and that's just the first one to come to mind.

Suffix trees and suffix arrays make for a brilliant study of elusive simplicity.

A well-known recursive algorithm for integer alphabets is the DC3 / skew algorithm of Kärkkäinen & Sanders (2003).

The paper includes a 50 line reference implementation (excluding comments).

One of the first algorithms to achieve linear time, low memory, and practical speed all at once is the SA-IS algorithm of Nong, Zhang & Chan (2009). The algorithm is also rather simple (< 100 LOC) and can be enhanced to simultaneously construct the LCP array. The SA-IS algorithm is one of the fastest known suffix array construction algorithms. A careful implementation by Yuta Mori outperforms most other linear or super-linear construction approaches.

Why wasn't this algorithm discovered thirty years ago?

The concept was first introduced as a position tree by Weiner (1973), which Donald Knuth subsequently characterized as "Algorithm of the Year 1973". The construction was greatly simplified by McCreight (1976), and again by Ukkonen (1995).

It's not as though people failed to recognize this algorithm as an important building block way back. Thirty-six years to arrive at "rather simple". Amazing.

Yeah, and one more thing: Slashdot has been around for sixteen years and still can't render diacritics pasted in from Wikipedia. Who could have anticipated we'd wish to use those? Besides, it's a good American tradition. Right after being awestruck by the Statue of Liberty, Karkkainen steps off the boat and declares his name to the port authority.

Karkkainen? What kind of name is that? Umlaut schmoomlaut. You can have Kirkby or Kirklen. Kirlen it is then. What's that? I missed a K? Whatever, no point starting over. Next!

## Comment Re:The power of love (Score 1)

It wasn't the work that got the result, it was the work + training + money, without any one of those ingredients he wouldn't have gotten the result he did.

You don't seem to grasp the asymmetry between life and death. I was configuring a FreeBSD jail the other day. The guide I consulted strongly recommended that you begin with a fully featured jail and then subtract until it breaks, rather than start with a bare jail and add until it works. There are usually about 100,000 ways you can yank out a coloured wire and cause something complicated to break. Which one is the God wire?

Sure he started 28,000 feet up the mountain. I've heard the last 1000 feet poses more difficulty than most humans wish to endure. The reason a paraplegic can haul himself arm over arm out of the Grand Canyon is because he has a T10 injury rather than something higher up. Your weird subtractive calculus totally misses the point.

## Quantum-Tunneling Electrons Could Make Semiconductors Obsolete

Nerval's Lobster writes "The powerful, reliable combination of transistors and semiconductors in computer processors could give way to systems built on the way electrons misbehave, all of it contained in circuits that warp even the most basic rules of physics. Rather than relying on a predictable flow of electrons that appear to know whether they are particles or waves, the new approach depends on quantum tunneling, in which electrons given the right incentive can travel faster than light, appear to arrive at a new location before having left the old one, and pass straight through barriers that should be able to hold them back. Quantum tunneling is one of a series of quantum-mechanics-related techniques being developed as possible replacements for transistors embedded in semiconducting materials such as silicon. Unlike traditional transistors, circuits built by creating pathways for electrons to travel across a bed of nanotubes are not limited by any size restriction relevant to current manufacturing methods, require far less power than even the tiniest transistors, and do not give off heat or leak electricity as waste products, according to Yoke Khin Yap of Michigan Technological University, lead author of a paper describing the technique, which was published in the journal Advanced Materials last week."

## Comment alternate universe (Score 1)

Oh yes it did. I'm guessing you're just too young to remember. Thanks to massive os/2 tv campaigns, "normal" people suddenly wanted a computer, not just a console to play games on

I'm certainly not "too young to remember". I wish.

It was a different world then. There wasn't an internet to immediately find out that some marketing term was full of shit. If five percent of the population at the time could distinguish OS/2 from PS/2 I'd be shocked. The one thing people knew for certain is that IBM never went hungry. IBM was attempting to run the entire information technology industry as a centrally planned economy, with some success. When the PC division was finally cut loose from the rest of the Blue Machine, it was mainly to free it from the IBM culture of seven layers of internal review on every decision about capability, volume, or price.

The only reason IBM entered the PC business in the first place was to drain away the nimbleness of young legs. If IBM had allowed the PC industry to cannibalize the mid-range sooner and more aggressively, all their employees clinging to incentive clauses in their mid-range operations would have started to circulate their resumes, both within IBM and without. As my brother never ceases to repeat: the first rats off a sinking ship are the best swimmers. Loss of talent off the top would have been horrendous in some of their existing cash-cow business lines. Quarterly earnings reports would have ceased to glow and executives would be spending more quality time with family.

Businesses really do paint themselves into a corner with their internal incentive structures. Tearing up all those employment contracts is disruptive. Clinging to the past is dangerous. Operating a company with different rules in different divisions can quickly gut your workforce at the high end, as the best swimmers stampede to opportunity unleashed. It's extraordinarily rare to gut the cash cow, no matter how rabid the skinny upstart across the street.

What IBM underestimated was the acceleration term: how much more quickly a person armed with a crappy PC was able to figure out they had been saddled with an over-built and over-priced tank capriciously constrained to lumber along with an insufficient engine for a decade or more.

The Intel 80286 had 134,000 transistors. A Cortex M0 can be implemented in 12K gates. Going by gate-level estimates of roughly 12 transistors for a general-purpose flip-flop, these designs are at about the same level of complexity. The 80286 runs 2.66 MIPS at 12.5 MHz. The M0 runs 0.9 MIPS/MHz (wider MIPS to boot). Now it might be the case that exploiting the Cortex instruction set back in the eighties was beyond the compiler technology of the day, but somehow I have my doubts that IBM was incapable of crossing that bridge had they chosen to do so.

I'd be very curious to see someone figure out how well a Cortex M0 could have been implemented in the 80286 process technology. Three to one margin? It's certainly possible going by the surface numbers. The downside of the Cortex is increasing memory pressure with wider native memory cycles and a more severe performance trade-off when byte-packing or bit-packing every important data structure. The wider off-chip memory path is a significant PCB fabrication cost.

As I correct one myopic IBM decision after another I wind up in an alternate universe where AT&T sues IBM instead of suing BSD/Cortex. Those of us who lived through this era spent a lot of time day-dreaming about alternate universes.

## Fear of Thinking War Machines May Push U.S. To Exascale

dcblogs writes "Unlike China and Europe, the U.S. has yet to adopt and fund an exascale development program, and concerns about what that means to U.S. security are growing darker and more dire. If the U.S. falls behind in HPC, the consequences will be 'in a word, devastating,' Selmer Bringsjord, chair of the Department of Cognitive Science at Rensselaer Polytechnic Institute, said at a U.S. House forum this week. 'If we were to lose our capacity to build preeminently smart machines, that would be a very dark situation, because machines can serve as weapons.' The House is about to get a bill requiring the Dept. of Energy to establish an exascale program. But the expected funding level, about $200 million annually, 'is better than nothing, but compared to China and Europe it's at least 10 times too low,' said Earl Joseph, an HPC analyst at IDC. David McQueeney, vice president of IBM research, told lawmakers that HPC systems now have the ability to not only deal with large data sets but 'to draw insights out of them.' The new generation of machines are being programmed to understand what the data sources are telling them, he said."

## Comment low Android sex drive (Score 1)

I'm also getting closer to ten days per charge mainly running the low power Big Time watchface and not receiving too many notifications.

First win: I've programmed my own watchface with a non-standard time coordinate that matters to me.

Second win: I used to take a medication daily that had to be taken at a precise time in the mid-afternoon for optimum effect. Even after more than a year of practice, I still missed one audible watch alarm every ten days to two weeks. I don't wear my phone on my belt (it gets set down across the room when at home), so that wouldn't have been reliable either. I never miss Pebble's wrist buzzer if I'm wearing the watch. Even when I'm in the shower, if the watch is placed on a hard surface, it makes enough noise to hear over the splashing water. I could wear it in the shower, but I don't wish to expose it to my nasty medicated shampoo.

Fortunately I've been immune all my life to any concern over whether someone out there might think something is cool, preferring to seek out my own functionality. I like mine 20" square (in pairs) or small and unobtrusive. I find the 4" lifestyle most awkward of all: large enough to constantly notice you have it, too small to be completely effective. Likewise, I find Twitter completely ridiculous. Either the message should read "Beers 5 o'clock?" or it should be written with full sentences and paragraph units.

I watched a video on illicit cognitive enhancing drugs last night. I can see the appeal for the younger generation. They need to recover the 10% of their brain power they lose by the over-use of these ridiculous tweener form factors which specialize in mental fragments longer than a smoke signal and shorter than a completed thought.

Third win: This morning I received a phone call while I was still in bed. My watch rasped on my bed-side table so I opened one eye, determined it was a call I wanted that could wait for another hour, then rolled over and went right back to sleep. My phone was in the far corner of the house. I'm really surprised it works at all at that distance. (I've also missed a few from this distance. This might depend on charge status of one device or the other.)

Given that I don't actually sleep with my phone (low sex drive, I guess) my Pebble easily earns its keep.

## Comment poppycops (Score 2)

This is ridiculous. No one would take a one-time one foot rise in global sea level seriously if it wasn't being construed as a canary in a coal mine with respect to a larger threat. They would just accept the city being built with insufficient surge margin as one of a thousand things done differently one hundred years ago.

Nor would people rush to conclude that a one-time one foot rise in sea level was a high price to pay with what humanity has achieved in the last one hundred years.

Building too close to unpredictable water is an ageless human tradition.

I think it's poppycock to tie an amorphous process such as global warming to any specific counterfactual. There are many environmental carcinogens where we know exposure doubles the base rate, but we can't point to any one specific person and say "you died because of this".

It's unscientific in attitude to dupe the public into thinking that science operates in these terms. One does not need a concrete case of cause and effect in order for a process to have real effects. Even if the sea level had declined by a foot, some storm somewhere would have been worse. I've never had much appetite for scientists drawn into PR.

## Comment what's in a name (Score 1)

If you want a hard-core mathematical proof that your code fulfills certain post-conditions etc., there's a large body of knowledge about how to go about it when the problem is posed in a functional programming language. Doing it to an otherwise unconstrained piece of C code is much harder.

If you want a hard-core mathematical proof about how your code behaves in time and space (for a value of time and space that makes your software market competitive) often a procedural representation is better.

Look at what happened between ATM and IP networking: "Another key ATM concept involves the traffic contract." For TCP/IP over Ethernet, the "channel contract" was a reamed-out muzzle diameter.

* Usain Bolt with a water-resistant wristwatch
* Arnold Schwarzenegger with a waterproof wristwatch

One permits more formal math than the other. I'm guessing Bolt is cooling down before 'egger has finished filling in his entry form.

## Comment Re:Sounds like a huge risk (Score 1)

I totally agree. Seven days is long enough for a vendor to formulate a sober verbal response and run it through channels when their customers are already being rooted due to egregious failings in their software products.

At the very least the customers can increase vigilance around the disclosed vulnerability.

Sure wouldn't hurt if this policy leads to fewer egregious and embarrassing software flaws in the first place.

## Comment binary in 1972 (Score 1)

My father taught me binary in 1971 when I was eight years old. He showed up one evening with black marbles and the bottom half of an egg carton. He had learned this from one of the original APL greybeards who attended his church. My father, having himself dropped out of engineering to switch to theology, had an interest in these things. Binary itself was easy (easier than learning to read an analog clock face). What took another week or so was puzzling out that binary was just a representation of the abstract notion of the integers. I wanted to learn more about computers, but hardly any books existed. Two years later I had pestered my father enough that he brought home four books from the University of Calgary library. He said he had brought most of the books that seemed even vaguely accessible. Most of these were stupid books full of pictures of shiny IBM consoles. I pitched them onto my bedroom floor in disgust.

One actually taught some programming, mainly from the flowchart perspective. I tried to write a flowchart of getting up in the morning to go to school and all the decisions involved. This quickly got out of hand (I was fated to never become good at getting up in the morning). I concluded a month later that flowcharts were intellectually damaged: too bushy for the paltry logic they managed to encapsulate.

In 1976 I got my hands on 8008/8080 datasheets. The dumb thing took three power supplies and was far too expensive for me to ever own. I also soon acquired a TTL data book and realized I could design my own micro-controller from discrete logic. I designed such a thing on paper in the back of English Literature class. I like literature, but the teacher was very boring and she never told my parents when I didn't hand in my assignments, so as far as I was concerned this class was a spare.

My grade six math teacher had allowed four members of the class to work at our own speed, after testing us with arithmetic quizzes on a sequence of recorded tapes. I very nearly finished the last and fastest tape (very fast) but got ahead of myself trying to multitask the current question with a question I had missed. I didn't want less than a perfect score and wasn't mature enough to let that one question go. Then I jumbled five questions in a row trying to remember all five at the same time. I had never experienced not keeping up in math class before.

It was nice to be left to my own devices, but he compensated for his largess by making us write out in full nearly every darn exercise at the end of every chapter. I could pretty much read a Heinlein book on the side while doing 100 metric conversions long-hand. My progress was rate limited mainly by my pencil. By the end of the year I had completed the grade nine algebra textbook.

If my grade seven teacher had let me stay on the same track, I would have completed high school algebra by xmas. But he insisted that I stay with the rest of my classmates doing fractions again, or some rot. This bugged the hell out of me because the jocks with talent got special attention, and math is even worse than athletics as something where you can go a lot further if you start young. Just watched Proof the other night. Hopkins: "How many days did you lose? How many!" Days? I lost fucking years.

In 1978 I finally got my hands on more than a TI-30. My school bought a TRS-80 with 4kB of system RAM and 7/8kB of video RAM (16 rows of 64 characters by seven bits). This was to save ONE whole 1Kx1 memory chip. (The font ROM actually had lower case characters, but the memory bit that drove this pin wasn't there.) If the msb was 0, you got 64 different printable characters (not including lower case). If the msb was 1, the lower six bits controlled a 3x2 pixel block in the font ROM (making 48x128 pixels total, if you chose to treat this as a bit-mapped display).

I was also given an SC/MP homebrew by a local electronics instructor. He taught me hex in five minutes (but neglected twos complement for negative numbers). This was nothing but toggle switches (ten, for the address) and eight buttons to set individual bits (one button to clear the location). I had an extremely frustrating night trying to puzzle out how the branch instruction worked (not immediately realizing that twos-complement was based on the address of the instruction that followed). The next year I had an APL account on the university computer system, having taken calculus early. By then I had disassembled much of the TRS-80 BASIC in ROM and found an undocumented cassette tape routine for loading programs coded in Z80 assembly language. I wrote a Galaga-style game in Z80 assembly language using a nasty assembler I whipped up in BASIC. Since I had memorized most of the opcodes by then, it only handled the label arithmetic. Putting in symbolic opcodes would only have made the program slower and less reliable to read off the horrible cassette tape drive, which usually took five passes for the simplest program.

At university they were forcing us to take COBOL and Fortran for the CO-OP job market. My roommate and I both had Z80-based systems of our own by then. His was a Heathkit. Mine was the Osborne. The IBM PC did not yet exist, and I had never fallen in love with Apple (that continues).

One day he hands me a zip-locked baggy with a floppy inside (actually floppy). It was a C compiler from the Software Toolworks. What a breath of fresh air compared to Pascal! I was hooked on C forever after, or at least until 1996 when I discovered the C++ STL and template metaprogramming. These days I mainly program in C/C++ and R (my APL heritage dies hard).

Two years later (still in the early 1980s) I actually programmed on a Xerox Dorado for an hour or so when a classmate had a workterm at Xerox Parc and I biked to Stanford down the west coast.

There weren't many major outside influences: Knuth, Dijkstra, Hoare, Wirth, Plauger, Iverson, Brooks, Bertrand Meyer, K&R, Walls, and one paper by Michael Jackson. One book I beat to death was an early book on writing portable C programs. Don't recall the author just now, but I had it around the time I purchased my first 386 based system from an unknown mail-order company named Gateway 2000. That was the book more than any other that taught me how to program professionally. The next large system I wrote was ported from MSDOS to QNX in a very short time (the glorious Watcom compiler presiding). Man that feels good after wading through so much shit code.

#define ISUCK 42-1

Good god man, get some sideview mirrors on that expression!

for (i = 0; i < ISUCK^2; ++i) never_get_there();

Amazingly, ISUCK is a fixed point under exponentiation.

On a side note, my last completed program was a Pebble watch face written in C. Good times.

## Comment pixel pack rats (Score 1)

Boy will you be laughing at yourself in a couple of years when you look back on how you thought a few dozen TB of data a month was like, some big deal.

Boy will we all be laughing at you a decade from now for predicting that Windows would expand to fill any hard drive ever invented, unless you're the kind of person where no-one can see inside your house because your collection of yellowing newspapers has taken possession of every vertical surface.

There will come a day where rendering a ROTK tribute will be an afternoon school project. That decade is not this decade.

We're at the point where we should be measuring bandwidth in dBA where 10x energy is perceived as 2x loudness.

## Comment nurture white in teeth and paw (Score 1)

What does this story have to offer?

The world is a competitive place, except when it isn't. And why is that, exactly? Why do social insects exist? Why, for that matter, do social mammals exist? We wouldn't even have social networking if the roots of cooperation in our genetics and culture weren't nearly as deep (and indispensable) as nature red in tooth and claw.

Competition will never not be present, which provides an excellent enclosed gondola for all the slippery-slopers out there. How nice is that? You can never be entirely wrong arguing that competition will always exist. Safe! Secure! You'll never say anything insightful, either, about how competition self-regulates into ritualized displays of dominance/submission without goring every participant.

## Comment bring back the hereditary git tax (Score 1)

Who gets to decide how much is too much?

Point me to any country where you can identify any small group with sole authority for this kind of decision, and I'll wager they mainly discuss among themselves the problem of too much being not enough. In societies where decisions are reached by a process (in which many people can participate and where chance also plays a significant role) there's at least some potential for antitrust legislation to pass which enacts a ceiling low enough to echo-locate.

Really, America had it right before they repealed the estate tax. It should have been called the hereditary git tax, to remind Americans of what their forefathers were so intent on escaping in the first place. Since when did it become an American value for the children of privilege to cruise through life on daddy's deep pockets without earning it themselves, generation upon generation? Just wondering.
