
Comment Re:6502 orgasms (Score 1) 113

The Vectrex video game system also used the 6809.

Yeah, I saw that in the Wikipedia article.

And my guess as to why Moto didn't use the 6809 as the basis of the 6811/6812 is because they wanted to use microcode, and it would have been harder because of the post-byte index modes.

Interesting. I am not familiar enough with the internal circuit topology of the '09 to comment. Did the PLA approach take more, less, or about the same silicon as a microcoded approach would have? Because that was around the time when Mot. (and other) electronics salespeople started talking about "nanoacres of silicon", LOL! So, if the microcode-based designs were done in less silicon, and with the microcontroller price-wars heating up, I could see Mot. mgmt. axing the 6809 for MCUs.

And then there was the 68000... apparently the marketing guys back in the day were dead set on only selling thousands of 68000s for full-blown Unix-type systems, and against selling millions of 68000s as an embedded processor.

Well, if you are really talking about "embedded" applications, one cannot forget the MC68HC33x series, with that mega-cool TPU. And slightly OT, Frickin' Fords ended up with PowerPCs in them, FFS!!!

By the time of the Macintosh/Amiga/Atari ST when they finally wised up, it was too late, the IBM PC had already happened.

Yes, but that is an ENTIRELY different story...

My piecing-together of various legends about IBM choosing the 8088 is that IBM was interested in the 68008, Motorola didn't want to commit to IBM's deadline, IBM said never mind, and then Motorola ended up releasing it by that date after all. And that is how you lose a war that you didn't even know had started.

But the 68008 was dog-slow, so IBM probably would have jumped ship anyway.

In my opinion, the lack of a large flat address space in the leading architecture set the industry back by ten years.

Not for the Mac, Amiga, or Atari ST...

Comment Re:No Interlacing (Score 1) 113

You did preemptive multitasking on the Apple //? Way cool... Mine was strictly a cooperative multitasker, although it considered waiting for input (either from a remote connection or the keyboard) to be indicative that it was safe for the task requesting input to yield control. I had no hardware clock in my Apple, so I could not do fully preemptive multitasking.

Thanks for the props, LOL!

Looking back on it, it was actually pretty close to a true, modern RTOS, with semaphores and "mailboxes", "task-suspending", and the whole bit. I had 16 "slots" (threads) that could be managed at a time. I called the functions "TaskMaster", IIRC. It was born out of the need to have multiple asynchronous functions, such as crossfades and sequences (which I could even "nest" up to 8 levels deep!), and to manage the CHARACTER-BASED, OVERLAPPING "Windowing" system I created for it as well.

The hardware timer was part of the 16-channel self-refreshing "Channel" cards that were part of the system. Each card had two 7489 16x4 dual-ported "RAMs", and the 2 kHz oscillator ran a 4-bit counter that kept the on-board 16-channel sample/hold (this was before DMX!) refreshed without processor intervention. In this fashion, you could use up to 4 cards in an Apple ][ to have a total of 64 channels of analog-dimming C.V. output, all without CPU intervention for the refresh. Each card had its own 2 kHz "clock", but also had a switch, so you could enable just one of them to be the "time-slice" clock. Worked a treat!

In fact, I look back at that project (which eventually went exactly nowhere), and wonder how I did all that in 6502 Assembler, not knowing a damned thing about RTOS design, and very little about interrupt-driven systems, let alone windowed UIs (heck, hardly ANYONE knew much about that in 1982!!!) ...
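For the curious, the heart of the time-slicer was just a stack-swap inside the 2 kHz interrupt handler. Reconstructing it from 35-year-old memory (so the labels and addresses here are made up, and clearing the timer latch is omitted because that part was specific to my Channel cards), it looked something like this:

    CURTSK  = $D0           ; hypothetical: index of the running slot (0-15)
    SPTAB   = $0200         ; hypothetical: 16 saved stack pointers

    IRQ:    PHA             ; the IRQ sequence stacked P and PC; save A, X, Y too
            TXA
            PHA
            TYA
            PHA
            TSX             ; capture this slot's stack pointer...
            LDY CURTSK
            TXA
            STA SPTAB,Y     ; ...and file it in the task table
            INY             ; round-robin to the next of the 16 slots
            TYA
            AND #$0F
            STA CURTSK
            TAY
            LDA SPTAB,Y     ; fetch the next slot's saved stack pointer
            TAX
            TXS
            PLA             ; unwind that slot's registers...
            TAY
            PLA
            TAX
            PLA
            RTI             ; ...P and PC come off its stack, and it resumes

Since the 6502 only has the one stack page, the 16 slots each got a slice of page one, and a fresh task's slice had to be seeded with a fake interrupt frame (PC, P, A, X, Y) before its first time-slice.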

As I said, when I was writing it I didn't even know the word 'multitasking' would describe what I was doing... I always described the mechanism as having "swappable stacks and I/O", so that it was easy to write in BASIC, for example, a multi-user bulletin board where the main program didn't have to concern itself with coordinating input and output for all the other users (this was actually the specific purpose for which it was designed). With my OS extension taking care of which thread was talking to which I/O system (which modem, or the keyboard/screen), it actually was pretty cool.

Sounds like it!

And I know full well that ProDOS's BASIC.SYSTEM is not a language; it was the part of the OS that was loaded into regular RAM and interfaced with Applesoft BASIC, understanding such things as Ctrl-D on output being a DOS request. The mechanism I used replaced the BASIC.SYSTEM file entirely. At the time, it was the largest assembly project I had ever tackled, clocking in at about 6.5K after it was assembled and linked, and it sat at the top of RAM, where BASIC.SYSTEM ordinarily resided.

Yeah, what finally killed my enthusiasm for my lighting controller was the fact that, no matter how hard I tried, and no matter how much I modified the RWTS (Read/Write Track & Sector) floppy code, there was STILL a small sliver of time where Interrupts had to be DISABLED during the read of the "Next Song" in the "Set List", causing a momentary (around 1/2 second) glitch in any crossfade and/or sequence that was in progress.
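For anyone who never fought with it: the Disk II hardware has no buffering at all, so RWTS spends its time in a polling loop like this (X = slot * $10; this is the canonical read loop, not my code):

    RDNIB:  LDA $C08C,X     ; sample the controller's shift register (Q6L)
            BPL RDNIB       ; spin until bit 7 says a full nibble arrived

A fresh nibble lands roughly every 32 microseconds and overwrites the last one, which at 1 MHz leaves essentially zero cycles to spare for an interrupt handler anywhere in that loop. Hence the forced SEI, and hence the glitch.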

I investigated switching from an Apple ][ to a standalone product using an embedded Amiga 500 (because their OS was INCREDIBLE at multitasking!), and then to a dedicated design using 3 microprocessors (a 65816, a 6809, and a 6502), each running at 2 MHz, with a common memory-pool running at 6 MHz (so the CPUs would each "act" as if they had exclusive access to the memory, but could still do "message-passing" through agreed-upon "mailboxes"); but neither of those ever got past the drawing-board stage...
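The mailbox part was the easy bit, at least on paper: because the interleaved 6 MHz pool gave each CPU what looked like atomic, exclusive byte access, a one-byte flag protocol would have sufficed. A hypothetical sketch of the 6502-side version (all addresses made up; the 65816 flavor would differ slightly):

    MBFLAG  = $7F00         ; 0 = empty, nonzero = full
    MBDATA  = $7F01         ; the message byte
    MSG     = $FA           ; local scratch byte

    SEND:   LDA MBFLAG      ; producer: spin while the box is still full
            BNE SEND
            LDA MSG
            STA MBDATA      ; deposit the payload first...
            LDA #$01
            STA MBFLAG      ; ...then raise the flag to publish it
            RTS

    RECV:   LDA MBFLAG      ; consumer: spin until something arrives
            BEQ RECV
            LDA MBDATA
            STA MSG
            LDA #$00
            STA MBFLAG      ; mark the box empty for the next message
            RTS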

Good times, good times...

Comment Re:6502 orgasms (Score 2) 113

YES. 6809 was a truly elegant design that deserved to have far more success than it did.

Yeah, I always wondered about that. It never got transmogrified into an "HC" version, never ever became a Microcontroller (oh, how much more fun the HC11 would have been if it had been based on a 6809 instead of a 6801!), etc. This article incorrectly states that a modified version of the 6809 forms the CPU in the HC11; but I remember from the datasheet that it was called a modified 6800 (the 6801); so?

The other possibility is that Motorola was already investing heavily in 68k R&D, and didn't want to lose potential "sockets" to the less-expensive 6809. In fact, IIRC, the Mac was originally slated to have a 6809; but no one wanted to port QuickDraw (which was written in 68k for the Lisa) back to the 6809, so they went ahead and put a 68k into the first Macs (overall, IMHO, a VERY wise decision!)...

By the way, that Wikipedia article does contain a fairly long list of products (some very successful!) that employed the 6809 or Hitachi's improved 6309 version. I remember Williams used it in their video games and electronic-ified Pinball machines, and also that the Fairlight CMI used dual out-of-phase 6809s as its main CPU(s); but I never knew that Yamaha put the 6309 into the DX-7 synth.

So, are there any Slashdotters out there who might know the inside story on why the 6809 never went further? Was it because Motorola lost its way so badly in the mid-to-late '90s? Was it the 68k? Or what?

Comment Re:I wonder (Score 1) 113

I wonder how Woz feels about this kind of development. He has a /. account, so if you read this: did you ever think there would be computers powerful enough, and people interested enough, to implement your brainchild on a credit-card-sized machine with a different architecture within your lifetime? What do you think of the Arduino movement in comparison with the DIY computer movement from our time?

Well, considering that Apple pretty much did an Apple-//e-on-a-chip back in 1991, I'd say he'd be rather bemused.

But supportive, nonetheless...

Comment Re:No Interlacing (Score 1) 113

ProDOS

ProDOS was a Disk Operating System, not a Language.

Oh, and I wrote a preemptive RTOS in 6502 for my Apple ][-based Stage Lighting Controller back in 1982, using a hardware "time-slicer" (interrupt-generator) running at 2 kHz, long before I knew what an RTOS was.

And I also wrote a "virtual-memory" system for Applesoft BASIC programs that used ONERR GOTO and the "Ampersand Hook" to allow a programmer (we didn't call them "Developers" in those days!) to either write a program in "modular" form or, in the case of the program I actually wrote it for, slice up an already-existing program into "modules", which would then be loaded directly from Disk (even counting startup, I could load an 8 KB module in about a second, and MUCH faster with a Corvus Hard Drive!), and silently continue without disturbing the Variables or anything else.

As long as you didn't do something utterly stupid like try to break execution up across a Loop, it worked an absolute TREAT! It took an Applesoft program that was SO out-of-memory that it spent nearly all of its time doing Garbage Collection, and turned it into a completely usable system.
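For the young'uns: the "Ampersand Hook" is a real, documented extension point. When Applesoft hits the "&" token, it does a JSR to $3F5, where three bytes are reserved for a JMP to your own code. Installing a handler is trivial; what mine did after that is only sketched in the comments below, reconstructed from memory:

    AMPVEC  = $3F5          ; Applesoft does JSR $3F5 on the '&' token

    INSTALL LDA #$4C        ; $4C = JMP opcode
            STA AMPVEC
            LDA #<HANDLER
            STA AMPVEC+1
            LDA #>HANDLER
            STA AMPVEC+2
            RTS

    HANDLER                 ; hypothetical body: parse the module name that
                            ; follows the '&', load it from disk over the
                            ; program area, patch the Applesoft program
                            ; pointers, and return to the interpreter with
                            ; the variable space untouched
            RTS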

Comment Re:6502 orgasms (Score 2) 113

Forget the troll. I understand exactly how you felt: just like getting inside a flying saucer.

IMHO it's arguable that the 8080 was better than 6502, though. Each had their advantages.

The Z80 OTOH was clearly better than all the others. To this day I find it had an elegant design.

I disagree.

Although I absolutely love coding Assembly language on the 6502, and have written tens of thousands of lines of the same, I would rather the 6809 had taken off. The design of that processor was truly forward-thinking. It was a shame that the Radio Shack (RIP) Color Computer (CoCo) was the only home computer of note to employ that CPU; because it was closer to an 8/16-bit "miniature 68k" than it was to the 6800/6801/6501/6502 designs.

Among other things, it had A and B Accumulators, which you could concatenate into one 16-bit accumulator; a hardware 8 x 8 MUL instruction; a 16-bit X Index Register; and a USER Stack Pointer, which was PERFECT for implementing things like FORTH, or for using as an Interrupt Stack. Along with this, it had a nearly-orthogonal Instruction Set, and almost as many Addressing Modes as the 6502. And, IIRC, unlike the 8080 and especially the Z80, it was nearly as miserly with cycles-per-instruction as the 6502 (which was nearly an early RISC chip in its efficiency). In fact, it was so far advanced from anything else Motorola was making at the time that I wondered why the 6801 core was the one that got pulled forward into all the Mot. Microcontrollers of the '80s and '90s, instead of the 6809. It was almost like someone from outer space dropped in to Motorola, threw the 6809 designs on the table, and then beamed out again!

Comment Woz Sez He Regrets the Video Addressing Shortcut (Score 1) 113

I probably can't find the quote; but I distinctly remember reading an interview with Woz in which he said (among other things, and I'm paraphrasing), "If I had known how popular the Apple ][ was going to be, I would have gone ahead and included the two extra chips it would have taken to make the video memory addressed sequentially."

Instead, we had BASCALC and HBASCALC calls in the Apple Monitor ROM.

And we liked it!
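For anyone who never met BASCALC: because of that shortcut, the 40-column text rows land at base = $400 + $80 * (row mod 8) + $28 * (row div 8), so row 0 is at $400, but row 1 is at $480 and row 8 is back down at $428. A straight-line reimplementation of the math (PTR is a made-up zero-page pair; the real Monitor routine at $FBC1 returns its answer in BASL/BASH and is cleverer about it):

    BASCALC PHA             ; A = text row (0-23) on entry
            AND #$07        ; row mod 8
            LSR             ; halve it: quotient -> pages, odd bit -> carry
            STA PTR+1       ; page part of $80 * (row mod 8)
            LDA #$00
            ROR             ; the carry becomes $80 in the low byte
            STA PTR
            PLA
            LSR             ; row div 8 (0, 1, or 2)
            LSR
            LSR
            TAX
            LDA TAB28,X     ; $28 * (row div 8)
            CLC
            ADC PTR
            STA PTR
            LDA PTR+1
            ADC #$04        ; fold in the $0400 text-page base
            STA PTR+1
            RTS
    TAB28   .BYTE $00,$28,$50

And yes, that interleave is why the text page has the famous "screen holes" that peripheral cards got to use as scratch space.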

Comment Re:A less biased source please? (Score 1) 91

Yes, but Google gives you the choice; a simple settings parameter will allow you to step outside that walled garden (giving me hesitation to even term it as such) while Apple requires you to jailbreak your phone, which they forbid and actively try to prevent.

For the billionth fucking time.

WOOSH!

You were so filled with Apple Hate that you completely missed the point: that, even given the choice (which I bet most Android users don't actually take advantage of anyway), the number of times that going outside of the Garden would solve an actual need, for all but the most esoteric of uses, is so vanishingly small that staying inside wins by an overwhelming margin.

Typical egotistical Slashdotter: what's good for me must be good for the platform...

Comment Re:A less biased source please? (Score 1) 91

...have you ever administered users in a Mac environment? I have. It made me want to leave IT.

Why yes. Yes I have. In a mixed environment with Windows, so I know the difference.

In that particular case, the Windows:Mac ratio was only about 2:1, yet the trouble-ticket ratio was about 50:1, with Windows being the "50", of course. And no, it is not a reflection on my IT skills in either environment, nor on the relative complexity of the tasks to which the machines of each OS were put; but rather a testament to the stability and ease-of-use (or lack thereof) of each.

And judging from your responses and attitude, I would say that it is clear that you never bothered to learn how to Admin Macs, and so were stuck forever trying to Admin them like Windows machines; which obviously frustrated both you and your unfortunate users. No wonder you wanted out; hoping to death that your superiors wouldn't find out just how lame you were/are.

I have, at various times since Windows was DOS and Macs were Lisas, both used and administered both platforms in a huge variety of applications, including, but in no way limited to, such "swimming-uphill" ones as doing embedded software and hardware design on old-school Macs (it's a lot easier now) and desktop publishing on Windows, and absolutely everything in-between. I currently write Windows application software, and Admin several Windows servers (and the occasional Windows desktop) across several versions of both.

What I am saying is that you'd better improve your understanding of Macs and OS X, because they're here to stay (just like Windows). So, the next time you feel enraged at a "stupid Mac user" (and, just like stupid Windows users, they most certainly exist), you can stop, take a deep breath, and actually try to help them, rather than berate them, OK?

Or perhaps you should just listen to your heart and simply leave IT, because you aren't doing it right.

Comment Re:I lost it... (Score 1) 94

As for the scroll wheel, I wonder why it's there (instead of using gestures on the touch screen)

That was explained in the Keynote last Fall, when the Watch debuted.

Basically, Apple decided that the Watch screen was just too small to reasonably support Zoom and Scroll gestures with average-sized adult hands.
