Oh yes it did. I'm guessing you're just too young to remember. Thanks to massive OS/2 TV campaigns, "normal" people suddenly wanted a computer, not just a console to play games on.
I'm certainly not "too young to remember". I wish.
It was a different world then. There wasn't an internet to immediately find out that some marketing term was full of shit. If five percent of the population at the time could distinguish OS/2 from PS/2 I'd be shocked. The one thing people knew for certain is that IBM never went hungry. IBM was attempting to run the entire information technology industry as a centrally planned economy, with some success. When the PC division was finally cut loose from the rest of the Blue Machine, it was mainly to free it from the IBM culture of seven layers of internal review on every decision about capability, volume, or price.
The only reason IBM entered the PC business in the first place was to drain away the nimbleness of young legs. If IBM had allowed the PC industry to cannibalize the mid-range sooner and more aggressively, all their employees clinging to incentive clauses in their mid-range operations would have started to circulate their resumes, both within IBM and without. As my brother never ceases to repeat: the first rats off a sinking ship are the best swimmers. Loss of talent off the top would have been horrendous in some of their existing cash-cow business lines. Quarterly earnings reports would have ceased to glow and executives would have been spending more quality time with family.
Businesses really do paint themselves into a corner with their internal incentive structures. Tearing up all those employment contracts is disruptive. Clinging to the past is dangerous. Operating a company with different rules in different divisions can quickly gut your workforce at the high end, as the best swimmers stampede to opportunity unleashed. It's extraordinarily rare to gut the cash cow, no matter how rabid the skinny upstart across the street.
What IBM underestimated was the acceleration term: how much more quickly a person armed with a crappy PC was able to figure out they had been saddled with an over-built and over-priced tank capriciously constrained to lumber along with an insufficient engine for a decade or more.
The Intel 80286 had 134,000 transistors. A Cortex M0 can be implemented in 12K gates. Based on logic-function tables showing about 12 transistors for a general-purpose flip-flop, these designs are at about the same level of complexity. The 80286 runs 2.66 MIPS at 12.5 MHz. The M0 runs 0.9 MIPS/MHz (wider MIPS to boot). Now it might be the case that exploiting the Cortex instruction set back in the eighties was beyond the compiler technology of the day, but somehow I have my doubts that IBM was incapable of crossing that bridge had they chosen to do so.
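Back-of-envelope, taking that dozen-transistors-per-gate figure as a rough rule of thumb rather than gospel:

\[
12{,}000 \ \text{gates} \times 12 \ \tfrac{\text{transistors}}{\text{gate}} \approx 144{,}000 \ \text{transistors} \quad\text{vs.}\quad 134{,}000 \ \text{for the 80286}
\]
\[
\frac{2.66 \ \text{MIPS}}{12.5 \ \text{MHz}} \approx 0.21 \ \text{MIPS/MHz} \quad\text{vs.}\quad 0.9 \ \text{MIPS/MHz for the M0} \;\approx\; 4\times \ \text{per clock}
\]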
I'd be very curious to see someone figure out how well a Cortex M0 could have been implemented in the 80286 process technology. Three to one margin? It's certainly possible on the surface numbers. The downside of the Cortex is increasing memory pressure with wider native memory cycles and a more severe performance trade-off when byte-packing or bit-packing every important data structure. The wider off-chip memory path is a significant PCB fabrication cost.
As I correct one myopic IBM decision after another I wind up in an alternate universe where AT&T sues IBM instead of suing BSD/Cortex. Those of us who lived through this era spent a lot of time day-dreaming about alternate universes.
I'm also getting closer to ten days per charge mainly running the low power Big Time watchface and not receiving too many notifications.
First win: I've programmed my own watchface with a non-standard time coordinate that matters to me.
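For the curious, a watchface like that is not much code against the Pebble C SDK. This is a minimal sketch, not my actual source: MY_OFFSET_MIN and the layout are invented for illustration, and the SDK calls (window_create, text_layer_create, tick_timer_service_subscribe and friends) are the stock ones from the SDK examples.

#include <pebble.h>

#define MY_OFFSET_MIN 37  /* hypothetical stand-in for the non-standard coordinate */

static Window *s_window;
static TextLayer *s_time_layer;

static void tick_handler(struct tm *tick_time, TimeUnits units_changed) {
  static char s_buf[8];
  /* Fold a private offset into the displayed minutes, wrapping at midnight. */
  int total = (tick_time->tm_hour * 60 + tick_time->tm_min + MY_OFFSET_MIN) % (24 * 60);
  snprintf(s_buf, sizeof(s_buf), "%02d:%02d", total / 60, total % 60);
  text_layer_set_text(s_time_layer, s_buf);
}

static void window_load(Window *window) {
  Layer *root = window_get_root_layer(window);
  GRect bounds = layer_get_bounds(root);
  s_time_layer = text_layer_create(GRect(0, bounds.size.h / 2 - 21, bounds.size.w, 42));
  text_layer_set_font(s_time_layer, fonts_get_system_font(FONT_KEY_BITHAM_42_BOLD));
  text_layer_set_text_alignment(s_time_layer, GTextAlignmentCenter);
  layer_add_child(root, text_layer_get_layer(s_time_layer));
}

static void window_unload(Window *window) {
  text_layer_destroy(s_time_layer);
}

int main(void) {
  s_window = window_create();
  window_set_window_handlers(s_window, (WindowHandlers) {
    .load = window_load,
    .unload = window_unload,
  });
  window_stack_push(s_window, true);
  tick_timer_service_subscribe(MINUTE_UNIT, tick_handler);
  app_event_loop();
  window_destroy(s_window);
  return 0;
}

The only interesting line is the one folding the offset into the minutes; everything else is boilerplate shared by every Pebble watchface.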
Second win: I used to take a medication daily that had to be taken at a precise time in the mid-afternoon for optimum effect. Even after more than a year of practice, I still missed one audible watch alarm every ten days to two weeks. I don't wear my phone on my belt (it gets set down across the room when at home), so that wouldn't have been reliable either. I never miss Pebble's wrist buzzer if I'm wearing the watch. Even when I'm in the shower, if the watch is placed on a hard surface, it makes enough noise to hear over the splashing water. I could wear it in the shower, but I don't wish to expose it to my nasty medicated shampoo.
Fortunately I've been immune all my life to any concern over whether someone out there might think something is cool, preferring to seek out my own functionality. I like my screens 20" square (in pairs) or small and unobtrusive. I find the 4" lifestyle most awkward of all: large enough that you constantly notice you have it, too small to be completely effective. Likewise, I find Twitter completely ridiculous. Either the message should read "Beers 5 o'clock?" or it should be written with full sentences and paragraph units.
I watched a video on illicit cognitive enhancing drugs last night. I can see the appeal for the younger generation. They need to recover the 10% of their brain power they lose by the over-use of these ridiculous tweener form factors which specialize in mental fragments longer than a smoke signal and shorter than a completed thought.
Third win: This morning I received a phone call while I was still in bed. My watch rasped on my bed-side table so I opened one eye, determined it was a call I wanted that could wait for another hour, then rolled over and went right back to sleep. My phone was in the far corner of the house. I'm really surprised it works at all at that distance. (I've also missed a few from this distance. This might depend on charge status of one device or the other.)
Given that I don't actually sleep with my phone (low sex drive, I guess) my Pebble easily earns its keep.
This is ridiculous. No one would take a one-time one foot rise in global sea level seriously if it wasn't being construed as a canary in a coal mine with respect to a larger threat. They would just accept the city being built with insufficient surge margin as one of a thousand things done differently one hundred years ago.
Nor would people rush to conclude that a one-time one foot rise in sea level was a high price to pay with what humanity has achieved in the last one hundred years.
Building too close to unpredictable water is an ageless human tradition.
I think it's poppycock to tie an amorphous process such as global warming to any specific counterfactual. There are many environmental carcinogens that we know double the base rate, yet we can't point to any one specific person and say "you died because of this".
It's unscientific in attitude to dupe the public into thinking that science operates in these terms. One does not need a concrete case of cause and effect for a process to have real effects. Even if the sea level had declined by a foot, some storm somewhere would have been worse. I've never had much appetite for scientists drawn into PR.
If you want a hard-core mathematical proof that your code fulfills certain post-conditions etc., there's a large body of knowledge about how to go about it when the problem is posed in a functional programming language. Doing it to an otherwise unconstrained piece of C code is much harder.
If you want a hard-core mathematical proof about how your code behaves in time and space (for a value of time and space that makes your software market competitive) often a procedural representation is better.
Look at what happened between ATM and IP networking: "Another key ATM concept involves the traffic contract." For TCP/IP over Ethernet, the "channel contract" was a reamed-out muzzle diameter.
Two viable business models:
* Usain Bolt with a water-resistant wristwatch
* Arnold Schwarzenegger with a waterproof wristwatch
One permits more formal math than the other. I'm guessing Bolt is cooling down before 'egger has finished filling in his entry form.
I totally agree. Seven days is long enough for a vendor to formulate a sober verbal response and run it through channels when their customers are already being rooted due to egregious failings in their software products.
At the very least the customers can increase vigilance around the disclosed vulnerability.
Sure wouldn't hurt if this policy leads to fewer egregious and embarrassing software flaws in the first place.
My father taught me binary in 1971 when I was eight years old. He showed up one evening with black marbles and the bottom half of an egg carton. He had learned this from one of the original APL greybeards who attended his church. My father, having himself dropped out of engineering to switch to theology, had an interest in these things. Binary itself was easy (easier than learning to read an analog clock face). What took another week or so was puzzling out that binary was just a representation of the abstract notion of the integers. I wanted to learn more about computers, but hardly any books existed. Two years later I had pestered my father enough that he brought home four books from the University of Calgary library. He said he had brought most of the books that seemed even vaguely accessible. Most of these were stupid books full of pictures of shiny IBM consoles. I pitched them onto my bedroom floor in disgust.
One actually taught some programming, mainly from the flowchart perspective. I tried to write a flowchart of getting up in the morning to go to school and all the decisions involved. This quickly got out of hand (I was fated to never become good at getting up in the morning). I concluded a month later that flowcharts were intellectually damaged: too bushy for the paltry logic they managed to encapsulate.
In 1976 I got my hands on 8008/8080 datasheets. The dumb thing took three power supplies and was far too expensive for me to ever own. I also soon acquired a TTL data book and realized I could design my own micro-controller from discrete logic. I designed such a thing on paper in the back of English Literature class. I like literature, but the teacher was very boring and she never told my parents when I didn't hand in my assignments, so as far as I was concerned this class was a spare.
My grade six math teacher had allowed four members of the class to work at our own speed, after testing us with arithmetic quizzes on a sequence of recorded tapes. I very nearly finished the last and fastest tape (very fast) but got ahead of myself trying to multitask the current question with a question I had missed. I didn't want less than a perfect score and wasn't mature enough to let that one question go. Then I jumbled five questions in a row trying to remember all five at the same time. I had never experienced not keeping up in math class before.
It was nice to be left to my own devices, but he compensated for his largess by making us write out in full nearly every darn exercise at the end of every chapter. I could pretty much read a Heinlein book on the side while doing 100 metric conversions long-hand. My progress was rate limited mainly by my pencil. By the end of the year I had completed the grade nine algebra textbook.
If my grade seven teacher had let me stay on the same track, I would have completed high school algebra by xmas. But he insisted that I stay with the rest of my classmates doing fractions again, or some rot. This bugged the hell out of me because the jocks with talent got special attention, and math is even worse than athletics as something where you can go a lot further if you start young. Just watched Proof the other night. Hopkins: How many days did you lose? How many!! Days? I lost fucking years.
In 1978 I finally got my hands on more than a TI-30. My school bought a TRS-80 with 4kB of system RAM and 7/8kB of video RAM (16 rows of 64 characters by seven bits). This was to save ONE whole 1K x 1-bit memory chip. (The font ROM actually had lower case characters, but the memory bit that drove this pin wasn't there.) If the msb was 0, you got 64 different printable characters (not including lower case). If the msb was 1, the lower six bits controlled a 3x2 pixel block (making 48x128 pixels total, if you chose to treat this as a bit-mapped display).
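For anyone who never met the beast, the block-graphics addressing works out to something like the following. A modern C sketch purely for illustration (at the time this was BASIC SET/POKE or Z80, not C); the 0x3C00 video base and the exact bit ordering are from memory, so treat them as assumptions rather than a datasheet.

#include <stdint.h>

#define TRS80_VIDEO_BASE 0x3C00u  /* Model I video RAM, 16 rows x 64 columns (from memory) */

/* Light one pixel on the 128x48 grid. Each character cell covers a block
 * 2 pixels wide by 3 high; with the msb set, the low six bits select which
 * of those six blocks are lit (bit 0 top-left, bit 1 top-right, ...,
 * bit 5 bottom-right -- my recollection of the ordering, not gospel). */
static void set_pixel(volatile uint8_t *video, int x, int y) {
  volatile uint8_t *cell = video + (y / 3) * 64 + (x / 2);
  int bit = (y % 3) * 2 + (x % 2);
  if (!(*cell & 0x80))        /* cell still holds a text character */
    *cell = 0x80;             /* switch it to an all-dark graphics block */
  *cell |= (uint8_t)(1 << bit);
}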
I was also given an SC/MP homebrew by a local electronics instructor. He taught me hex in five minutes (but neglected twos complement for negative numbers). This was nothing but toggle switches (ten, for the address) and eight buttons to set individual bits (one button to clear the location). I had an extremely frustrating night trying to puzzle out how the branch instruction worked (not immediately realizing that the twos-complement displacement was relative to the address of the instruction that followed). The next year I had an APL account on the university computer system, having taken calculus early. By then I had disassembled much of the TRS-80 BASIC in ROM and found an undocumented cassette tape routine for loading programs coded in Z80 assembly language. I wrote a Galaga-style game in Z80 assembly language using a nasty assembler I whipped up in BASIC. Since I had memorized most of the opcodes by then, it only handled the label arithmetic. Putting in symbolic opcodes would only have made the program slower and less reliable to read off the horrible cassette tape drive, which usually took five passes for the simplest program.
At university they were forcing us to take COBOL and Fortran for the CO-OP job market. My roommate and I both had Z80-based systems of our own by then. His was a Heathkit. Mine was the Osborne. The IBM PC did not yet exist, and I had never fallen in love with Apple (that continues).
One day he hands me a zip-locked baggy with a floppy inside (actually floppy). It was a C compiler from the Software Toolworks. What a breath of fresh air compared to Pascal! I was hooked on C forever after, or at least until 1996 when I discovered the C++ STL and template metaprogramming. These days I mainly program in C/C++ and R (my APL heritage dies hard).
Two years later (still in the early 1980s) I actually programmed on a Xerox Dorado for an hour or so when a classmate had a work term at Xerox PARC and I biked down the west coast to Stanford.
There weren't many major outside influences: Knuth, Dijkstra, Hoare, Wirth, Plauger, Iverson, Brooks, Bertrand Meyer, K&R, Walls, and one paper by Michael Jackson. One book I beat to death was an early book on writing portable C programs. Don't recall the author just now, but I had it around the time I purchased my first 386-based system from an unknown mail-order company named Gateway 2000. That was the book, more than any other, that taught me how to program professionally. The next large system I wrote was ported from MS-DOS to QNX in a very short time (the glorious Watcom compiler presiding). Man, that felt good after wading through so much shit code.
#define ISUCK 42-1
Good god man, get some sideview mirrors on that expression!
for (int i = 0; i < ISUCK^2; ++i) never_get_there();
Amazingly, ISUCK is a fixed point under exponentiation.
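For the record: in C, ^ is bitwise XOR, not exponentiation, and an unparenthesized macro expands textually, so ISUCK*2 comes out as 42-1*2 == 40 rather than 82. The fixed-point gag only holds if you read ^ as exponentiation binding tighter than minus (42 - 1^2 == 42 - 1); what the compiler actually sees is (42-1)^2 == 41 XOR 2 == 43, so never_get_there() gets there 43 times. The side-view mirrors look like this:

#define ISUCK (42 - 1)  /* parenthesize macro bodies so neighbouring operators can't re-group the expansion */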
On a side note, my last completed program was a Pebble watch face written in C. Good times.
They always include a ludicrous^2 proposal to distract the discussion from ludicrous^(3/2) which itself exists to ensure that ludicrous^(5/4) is barely noticed.
Boy will you be laughing at yourself in a couple of years when you look back on how you thought a few dozen TB of data a month was like, some big deal.
Boy will we all be laughing at you a decade from now for predicting that Windows would expand to fill any hard drive ever invented, unless you're the kind of person where no-one can see inside your house because your collection of yellowing newspapers has taken possession of every vertical surface.
There will come a day where rendering a ROTK tribute will be an afternoon school project. That decade is not this decade.
We're at the point where we should be measuring bandwidth in dBA where 10x energy is perceived as 2x loudness.
What does this story have to offer?
The world is a competitive place, except when it isn't. And why is that, exactly? Why do social insects exist? Why, for that matter, do social mammals exist? We wouldn't even have social networking unless the roots of cooperation in our genetics and culture are nearly as deep (and indispensable) as nature red in tooth and claw.
Competition will never not be present, which provides an excellent enclosed gondola for all the slippery-slopers out there. How nice is that? You can never be entirely wrong arguing that competition will always exist. Safe! Secure! You'll never say anything insightful, either, about how competition self-regulates into ritualized displays of dominance/submission without goring every participant.
Who gets to decide how much is too much?
Point me to any country where you can identify any small group with sole authority for this kind of decision, and I'll wager they mainly discuss among themselves the problem of too much being not enough. In societies where decisions are reached by a process (in which many people can participate and where chance also plays a significant role), there's at least some potential for antitrust legislation to pass which enacts a ceiling low enough to echo-locate.
Really, America had it right before they repealed the estate tax. It should have been called the hereditary git tax, to remind Americans of what their forefathers were so intent on escaping in the first place. Since when did it become an American value for the children of privilege to cruise through life on daddy's deep pockets without earning it themselves, generation upon generation? Just wondering.
If the set of primes is finite, form the product of all primes and add 1, creating a number not divisible by any prime in the set; since every integer greater than 1 has a prime factor, that number would have to be prime itself, yet by construction it isn't in the set of all primes. At this point you can smoke some weed or you can begin to suspect that the set of primes is not finite.
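Spelled out a little more formally (the same argument, with the contradiction made explicit): suppose $p_1, p_2, \ldots, p_k$ were all of the primes, and let
\[
N = p_1 p_2 \cdots p_k + 1 .
\]
Then $N \equiv 1 \pmod{p_i}$ for every $i$, so no $p_i$ divides $N$; but every integer greater than 1 has a prime factor, so $N$ has a prime factor missing from the list. Contradiction.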
Let p = 10^-googolplex.
With enough patience, you can win this lottery 100 times in a row, and you can do that as many times as you like.
All this new result gives us is further evidence that, in the unextinguished coincidence of short spacings, the distribution of primes resembles a random process. There's a structural reason why N and N+1 are never both prime (2 and 3 aside, one of them is even). It appears, however, to be rather difficult to identify any other structure in the distribution of primes taking the form of permanently extinguished gap distances.
Our list of viable gaps grows thin. (I did wish momentarily to mark that up as <e>thin</e> for Elvish italic.)
First "I do", then 10,000 followers fill his shoe, then pussy-whipped and Zucker-punched. He should have ended his complaint by confessing that he feels so ruthlessly dis-empowered he hasn't had a decent erection in three weeks.
Life would be so much easier if we could just look at the source code. -- Dave Olson