Comment: Dogs Flew Spaceships! (Score 1) 140

by macs4all (#49496261) Attached to: Scientists Close To Solving the Mystery of Where Dogs Came From
Dogs Flew Spaceships!

The Aztecs Invented the Vacation!

Men and Women are the Same Sex!

Our Forefathers Took Drugs!

Your Brain Is Not the Boss!

Yes, that's Right: Everything You Know Is Wrong!!!

Hello seekers! Here we go again! And hello to the skeptic inside you who might still believe that pigs live in trees, and that faithful Rover is nothing more than a pet sleeping by the doggie door. Well, doggone it, he's smarter than you'll ever be! Yes, I've got proof here that his ancestors came from the Dog Star millions of years ago to rule the Earth! He's been there - and you probably don't even know where you are...

-- With much kudos to The Firesign Theatre, 1974

Comment: Re:Who wears a watch these days (Score 1) 289

Neither had I. Until last July, when my wife gave me an Android smartwatch for my birthday (suck it Apple ... you are not an innovator of smart watches).

It's actually very funny when a Fandroid accuses Apple of not inventing something, given that Android ripped off the entire concept of the original iPhone, ya know...

I trust everyone around here remembers that; or do I have to whip out the citations, Google internal memos, and the before and after pictures (yet again)?

Comment: Re:Hmm (Score 1) 197

by macs4all (#49430439) Attached to: Windows 10 Successor Codenamed 'Redstone,' Targeting 2016 Launch

And the fact you could get software from many publishers, rather than just Microsoft, is why Microsoft has the massive market share over Apple. Guess history is not a good lesson.

Um, that's not why.

There is one, and only one, reason why Microsoft once had massive market share over Apple (and everyone else) in the business world: Exchange and Outlook.

Period. Seriously. Period.

Further, Apple has never restricted the sources from which you could get software for MacOS or OS X. In recent years, it has made the user make a conscious decision to install software from outside the Mac App Store; but it has never disallowed Macintosh software from any source.

You're confusing OS X and iOS. Pretty lame for a Slashdotter.

Comment: Re:6502 orgasms (Score 1) 113

by macs4all (#49426317) Attached to: Turning the Arduino Uno Into an Apple ][

The Vectrex video game system also used the 6809.

Yeah, I saw that in the Wikipedia article.

And my guess as to why Moto didn't use the 6809 as the basis of the 6811/6812 is because they wanted to use microcode, and it would have been harder because of the post-byte index modes.

Interesting. I am not familiar enough with the internal circuit topology of the '09 to comment. Did the PLA approach take more, less, or about the same silicon as if the '09 had used a microcoded approach? Because that was around the time when Mot. (and other) electronics salespeople started talking about "nanoacres of silicon", LOL! So, if the microcode-based designs were done in less silicon, and with the microcontroller price-wars heating up, I could see Mot. management axing the 6809 for MCUs.

And then there was the 68000... apparently the marketing guys back in the day were dead set on only selling thousands of 68000s for full-blown Unix-type systems, and against selling millions of 68000s as an embedded processor.

Well, if you are really talking about "embedded" applications, one cannot forget the MC68HC33x series, with that mega-cool TPU. And slightly OT, Frickin' Fords ended up with PowerPCs in them, FFS!!!

By the time of the Macintosh/Amiga/Atari ST when they finally wised up, it was too late, the IBM PC had already happened.

Yes, but that is an ENTIRELY different story...

My piecing together of various legends about IBM choosing the 8088 was that they were interested in the 68008, Motorola didn't want to commit to IBM's deadline, IBM said never mind, then Motorola ended up releasing it by that date after all. And that is how you lose a war that you didn't even know had started.

But the 68008 was dog-slow, so IBM probably would have jumped-ship anyway.

In my opinion, the lack of a large flat address space in the leading architecture set the industry back by ten years.

Not for the Mac, Amiga, or Atari...

Comment: Re:No Interlacing (Score 1) 113

by macs4all (#49425399) Attached to: Turning the Arduino Uno Into an Apple ][

You did preemptive multitasking on the Apple //? Way cool.... mine was strictly a cooperative multitasker, although it considered waiting for input (either from a remote connection or the keyboard) to be indicative that it was safe for the task requesting input to yield control. I had no hardware clock in my apple, so I could not do full-preemptive multitasking.

Thanks for the props, LOL!

Looking back on it, it was actually pretty close to a true, modern RTOS, with semaphores and "mailboxes", "task-suspending", and the whole bit. I had 16 "slots" (threads) that could be managed at a time. I called the functions "TaskMaster", IIRC. It was born out of the need to have multiple asynchronous functions, such as crossfades, sequences (which I could even "nest" up to 8 levels deep!), and to manage the CHARACTER-BASED, OVERLAPPING "Windowing" system I created for it as well.

The hardware timer was part of the 16-channel self-refreshing "Channel" cards that were part of the system. Each card had two 7489 16x4 dual-ported "RAMs", and the 2 kHz oscillator ran a 4-bit counter that kept the on-board 16-channel sample/hold (this was before DMX!) refreshed without processor intervention. In this fashion, you could use up to 4 cards in an Apple ][ to have a total of 64 channels of analog-dimming C.V. output, all without CPU intervention for the refresh. Each card had its own 2 kHz "clock"; but also had a switch, so you could enable just one of them to be the "time-slice" clock. Worked a treat!

In fact, I look back at that project (which eventually went exactly nowhere), and wonder how I did all that in 6502 Assembler, not knowing a damned thing about RTOS design, and very little about interrupt-driven systems, let alone windowed UIs (heck, hardly ANYONE knew much about that in 1982!!!) ...
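The slot-based "TaskMaster" scheme described above can be sketched in modern terms as a tiny round-robin scheduler driven by a time-slice tick. This is purely an illustrative Python sketch: the class and function names are hypothetical, and the original was hand-written 6502 assembler, not Python.

```python
# Minimal round-robin scheduler sketch, loosely modeled on the
# 16-slot "TaskMaster" idea described above. All names are
# illustrative; the original was hand-written 6502 assembler.
from collections import deque

NUM_SLOTS = 16  # the original managed 16 task "slots" (threads)

class TaskMaster:
    def __init__(self):
        self.slots = deque()  # runnable tasks (Python generators)

    def spawn(self, task):
        if len(self.slots) >= NUM_SLOTS:
            raise RuntimeError("all 16 slots in use")
        self.slots.append(task)

    def tick(self):
        """One 'time-slice clock' interrupt: run the next task
        until it yields, then rotate it to the back of the queue."""
        if not self.slots:
            return
        task = self.slots.popleft()
        try:
            next(task)               # run until the task yields
            self.slots.append(task)  # still alive: requeue it
        except StopIteration:
            pass                     # task finished: free its slot

def crossfade(steps):
    """A toy asynchronous job, e.g. a lighting crossfade."""
    for _level in range(steps):
        yield  # give up the CPU each step, like a time-slice

tm = TaskMaster()
tm.spawn(crossfade(3))
tm.spawn(crossfade(2))
for _ in range(10):
    tm.tick()
```

Cooperative yields stand in here for the hardware interrupt; in the real thing the 2 kHz tick preempted whatever was running.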

As I said, when I was writing it I didn't even know the word 'multitasking' would describe what I was doing... I always described the mechanism as having "swappable stacks and I/O" so that it was easy to write in basic, for example, a multi-user bulletin board where the main program didn't have to concern itself with coordinating input and output for all other users (this was actually the specific purpose for which it was designed). With my OS extension taking care of which thread was talking to which I/O system (which modem, or the keyboard/screen) it actually was pretty cool.

Sounds like it!

And I know full well that ProDOS's BASIC.SYSTEM is not a language; it was the part of the OS that was loaded into regular RAM and interfaced with Applesoft BASIC, understanding such things as Ctrl-D on output being a DOS request. The mechanism I used replaced the BASIC.SYSTEM file entirely. At the time, it was the largest assembly project I had ever tackled, clocking in at about 6.5K after it was assembled and linked, and sat at the top of RAM, where BASIC.SYSTEM ordinarily resided.

Yeah, what finally killed my enthusiasm for my lighting controller was the fact that, no matter how hard I tried, and no matter how much I modified the RWTS (Read/Write Track & Sector) floppy code, there was STILL a small sliver of time where Interrupts had to be DISABLED during the read of the "Next Song" in the "Set List", causing a momentary (around 1/2 second) glitch in any crossfade and/or sequence that was in progress.

I investigated switching from an Apple ][ to a standalone product using an embedded Amiga 500 (because their OS was INCREDIBLE at multitasking!), and then to a dedicated design using 3 microcontrollers (a 65816, a 6809, and a 6502) each running at 2 MHz, with a common memory-pool running at 6 MHz (so the CPUs would each "act" as if they had exclusive access to the memory; but could still do "message-passing" through agreed-upon "mailboxes"); but neither of those ever got past the drawing-board stage...
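The "message-passing through agreed-upon mailboxes" idea in that three-CPU design can be sketched as fixed slots in a shared memory pool, each with a flag byte. Everything here (addresses, layout, function names) is hypothetical; the design never got past the drawing board.

```python
# Sketch of "message-passing through agreed-upon mailboxes" in a
# common memory pool, as in the three-CPU design described above.
# Addresses and layout are hypothetical, for illustration only.
shared_mem = bytearray(256)        # the common memory pool

# Each CPU has an agreed-upon mailbox address for incoming mail.
MAILBOXES = {"65816": 0x10, "6809": 0x20, "6502": 0x30}

def post(cpu, value):
    """Drop a byte in a peer CPU's fixed mailbox slot and set a
    flag byte so the peer knows mail is waiting."""
    addr = MAILBOXES[cpu]
    shared_mem[addr] = value & 0xFF
    shared_mem[addr + 1] = 1       # flag: mail waiting

def fetch(cpu):
    """The peer polls its flag, reads the byte, clears the flag.
    Returns None when the mailbox is empty."""
    addr = MAILBOXES[cpu]
    if shared_mem[addr + 1]:
        shared_mem[addr + 1] = 0
        return shared_mem[addr]
    return None

post("6809", 0x42)
msg = fetch("6809")   # reads the byte and empties the mailbox
```

The interleaved 6 MHz memory access in the original design is what would have let each CPU poll these slots without ever seeing contention.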

Good times, good times...

Comment: Re:6502 orgasms (Score 2) 113

by macs4all (#49425083) Attached to: Turning the Arduino Uno Into an Apple ][

YES. 6809 was a truly elegant design that deserved to have far more success than it did.

Yeah, I always wondered about that. It never got transmogrified into an "HC" version, never-ever became a Microcontroller (oh, how much more fun the HC11 would have been if it had been based on a 6809 instead of a 6801!), etc. This article incorrectly states that a modified version of the 6809 forms the CPU in the HC11; but I remember from the datasheet that it was called a modified 6800 (the 6801); so?

The other possibility is that Motorola was already investing heavily into the 68k R&D, and didn't want to lose potential "sockets" to the less-expensive 6809. In fact, IIRC, the Mac was originally slated to have a 6809; but no one wanted to port QuickDraw (which was written in 68k for the Lisa) back to 6809; so they went ahead and put a 68k into the first Macs (overall, IMHO, a VERY wise decision!)...

By the way, that Wikipedia article does contain a fairly long list of products (some very successful!) that employed the 6809 or Hitachi's improved 6309 version. I remember Williams used it in their video games and electronic-ified Pinball machines, and also that the Fairlight CMI used dual out-of-phase 6809s as its main CPU(s); but I never knew that Yamaha put the 6309 into the DX-7 synth.

So, are there any Slashdotters out there that might know the inside story on why the 6809 never went further? Was it because Motorola lost its way so bad in the mid-late 90s? Was it the 68k? Or what?

Comment: Re:I wonder (Score 1) 113

by macs4all (#49423403) Attached to: Turning the Arduino Uno Into an Apple ][

I wonder how Woz feels about this kind of development. He has a /. account so if you read this: did you ever think there would be computers powerful enough and people interested enough to implement your brainchild on a credit card sized machine with different architecture within your lifetime. What do you think of the arduino movement in comparison with the DIY computer movement from our time?

Well, considering that Apple pretty-much did an Apple-//e-on-a-chip back in 1991, I'd say he'd be rather bemused.

But supportive, nonetheless...

Comment: Re:No Interlacing (Score 1) 113

by macs4all (#49423321) Attached to: Turning the Arduino Uno Into an Apple ][

ProDOS

ProDOS was a Disk Operating System, not a Language.

Oh, and I wrote a preemptive RTOS in 6502 for my Apple ][-based Stage Lighting Controller back in 1982, using a hardware "time-slicer" (interrupt-generator) running at 2 KHz, long before I knew what an RTOS was.

And I also wrote a "virtual-memory" system for Applesoft BASIC programs that used ONERR GOTO and the "Ampersand Hook" to allow a programmer (we didn't call them "Developers" in those days!) to either write a program in "modular" form, or, in the case of the program I actually wrote it for, slice up an already-existing program into "modules". These would then be loaded directly from Disk (including startup, I could load an 8 KB module in about a second, and MUCH faster with a Corvus Hard Drive!), and execution would silently continue without disturbing the Variables or anything else. As long as you didn't do something utterly stupid like try to break execution up across a Loop, it worked an absolute TREAT! It took an Applesoft program that was SO out-of-memory that it spent nearly all of its time doing Garbage Collection, and turned it into a completely-usable system.
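The overlay trick described above — swapping program "modules" in from disk while leaving the variable space untouched — can be sketched like this. The module names and the registry are hypothetical stand-ins, not the original Applesoft/ampersand-hook code:

```python
# Overlay-loader sketch: program "modules" are swapped in and out
# while a shared variable space survives across swaps, loosely
# mirroring the Applesoft virtual-memory scheme described above.
# All names here are illustrative.

variables = {}  # persists across module swaps, like the Applesoft
                # variable space that the loader left undisturbed

modules = {     # stand-ins for 8 KB modules loaded from disk
    "STARTUP": lambda v: v.update(count=0),
    "STEP":    lambda v: v.update(count=v["count"] + 1),
}

def chain(name):
    """'Load' and run a module, leaving `variables` undisturbed,
    the way the ampersand hook chained to the next module."""
    modules[name](variables)

chain("STARTUP")
chain("STEP")
chain("STEP")
# the shared state has survived both "swaps"
```

The key property being illustrated is that only the program text is replaced on each chain; the state lives outside it, which is why execution could continue silently.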

Comment: Re:6502 orgasms (Score 2) 113

by macs4all (#49423217) Attached to: Turning the Arduino Uno Into an Apple ][

Forget the troll. I understand exactly how you felt: just like getting inside a flying saucer.

IMHO it's arguable that the 8080 was better than 6502, though. Each had their advantages.

The Z80 OTOH was clearly better than all the others. To this day I find it had an elegant design.

I disagree.

Although I absolutely love coding Assembly language on the 6502, and have written tens of thousands of lines of same; I would rather the 6809 had taken off. The design of that processor was truly forward-thinking. It was a shame that only the Radio Shack (RIP) Color Computer (CoCo) employed that CPU; because it was closer to an 8/16 bit "miniature 68k" than it was to the 6800/6801/6501/6502 designs.

Among other things, it had an A and B Accumulator, which you could concatenate into one 16-bit accumulator, a hardware 8 x 8 MUL instruction, a 16-bit X Index Register, and a USER Stack Pointer, which was PERFECT for implementing things like FORTH, or for using as an Interrupt Stack. Along with this, it had a nearly-orthogonal Instruction Set, and almost as many Addressing Modes as the 6502. And, IIRC, unlike the 8080 and especially the Z80, it was nearly as miserly with cycles-per-instruction as the 6502 (which was nearly an early RISC chip in its efficiency).

In fact, it was so far advanced from anything else Motorola was making at the time that I wondered why the 6801 core was the one that got pulled forward into all the Mot. Microcontrollers of the 80s and 90s, instead of the 6809. It was almost like someone from outer space dropped in to Motorola, threw the 6809 designs on the table, and then beamed out again!
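For readers who never touched a 6809: the 8 x 8 MUL mentioned above multiplies the A and B accumulators and leaves the 16-bit unsigned product in the concatenated D register (A holds the high byte, B the low byte). A quick sketch of that semantics:

```python
# Semantics of the 6809's MUL instruction: the 8-bit A and B
# accumulators are multiplied, and the 16-bit unsigned product
# lands in D (the A:B pair, A = high byte, B = low byte).
def mul_6809(a, b):
    assert 0 <= a <= 0xFF and 0 <= b <= 0xFF
    d = a * b                  # 16-bit unsigned product (fits in D)
    return d >> 8, d & 0xFF    # new (A, B) = (high, low) bytes

# Worked example: $FF * $FF = $FE01, so A = $FE and B = $01
a, b = mul_6809(0xFF, 0xFF)
```

On a 6502 the same operation takes a shift-and-add loop of dozens of instructions, which is exactly why a single-instruction hardware multiply felt so far ahead of its time.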

Comment: Woz Sez He Regrets the Video Addressing Shortcut (Score 1) 113

by macs4all (#49423045) Attached to: Turning the Arduino Uno Into an Apple ][
I probably can't find the quote; but I distinctly remember reading an interview with Woz, stating (among other things), paraphrasing, "If I had known how popular the Apple ][ was going to be, I would have gone ahead and included the two extra chips it would have taken to make the video memory addressed sequentially."

Instead, we had BASCALC and HBASCALC calls in the Apple Monitor ROM.

And we liked it!
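The non-sequential layout that BASCALC had to cope with is easy to show: the Apple ][ text screen interleaves its 24 rows in three groups of eight, so row N does not start at `0x400 + N*40`. This is a sketch of the well-known row-base formula, not the actual Monitor ROM listing:

```python
# Base address of each 40-column text row on the Apple ][ (page 1).
# Because Woz saved those two chips, the 24 rows are interleaved in
# three groups of eight rather than laid out sequentially.
def bascalc(row):
    assert 0 <= row < 24
    return 0x400 + (row % 8) * 0x80 + (row // 8) * 0x28

# Rows 0, 1, 2 start at 0x400, 0x480, 0x500 -- but row 8 starts at
# 0x428, packed right after row 0's 40 visible bytes.
```

The 8-byte "screen holes" left over at the end of each 128-byte group are the gaps that peripheral-card firmware famously used as scratch space.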

Comment: Re:A less biased source please? (Score 1) 91

Yes, but Google gives you the choice; a simple settings parameter will allow you to step outside that walled garden (giving me hesitation to even term it as such) while Apple requires you to jailbreak your phone, which they forbid and actively try to prevent.

For the billionth fucking time.

WOOSH!

You were so filled with Apple Hate that you completely missed the point: That, even given the choice (which I bet most Android users don't actually take advantage of anyway), the number of times that going outside of the Garden would solve an actual need, for all but the most esoteric of uses, is so vanishingly small that it just doesn't make sense by an overwhelming margin.

Typical egotistical slashdotter; what's good for me must be good for the platform...
