I hope you're just forgetting that "Atari BASIC" was pretty much synonymous with the 8-bit Atari 400/800/1200XL/600XL/800XL/65XE/130XE/XEGS line of computers. It was written by Shepardson Microsystems rather than Microsoft, but a lot of the same tricks could be applied, including multiple statements per line.
The Atari VCS BASIC cartridge is an aberration that should not be mentioned in the same breath.
I'm currently taking the Internet History, Technology, and Security course through Coursera. There was a written assignment, 200-400 words in length, in which you were to describe how people, technology, and information connected to create the Internet within the timeframe of 1930 to 1990.
It was hinted that I had plagiarized my answer, mostly because I didn't cite my sources. Well, let's see... I've been programming since 1979, online since 1982, and started using the web in 1994. Did I take my answers from Wikipedia? Did I take them from the web at all? No, because I love computers, have read thousands of books and magazines, talked to thousands of like-minded people, and lived through a decent chunk of that history. There isn't a clear source to cite, and I doubt that I could say something like: "Tonah, Lo (2012). "Crap I Remember," 'My Old Brain', 00(00)."
So in a 400 word essay, how is there enough room to write something self-created that a plagiarism tool won't trip over?
Did you eventually get paid for Synfile+? The wording of your sentence could lead someone to believe that either it took you three years to get paid, or that you did three years of work that you didn't get paid for. I sure hope you were paid.
I never know what to think or say when I think about the last decade of Atari. Warner had no idea how to run a tech company, and there were too many projects going nowhere. They took a look at a few months' worth of profits and decided that they could spend money like that forever. Then suddenly the videogame market, and after it the home computer market, tanked.
So Jack comes in, and has a lot of hard decisions to make. Cut here, slash there. Discontinue products. Write off factories and warehouses full of product that isn't moving. Kind of like how Steve Jobs came back to Apple and had to gut things fast.
He knew that selling the Atari 8 bits wasn't going to work for long--PC compatibles and Macintoshes were starting to make inroads into homes and smaller businesses. Game machines were dead. He knew what Amiga had cooking, and when Commodore got ahold of that he knew he needed a counter-product. So, like the IBM PC, Atari used off-the-shelf components and built something quick and dirty.
A lot of people took him to task for not advertising. There was advertising, just not in expensive publications; very little in Byte, for instance. There was a big campaign at first, but then it seemed like nothing. Atari turned inward, producing magazines like Atari Explorer instead. Besides, who is to say that spending $5 million a month on advertising actually is effective? BTW, I heard that it cost $1 million to do a full-page colour ad in Byte. So how much for Newsweek, Time, etc., and how effective is it?
I think Atari did a lot with the little money they had. I doubt Tramiel got any richer from his time there. He came into Atari with $40 million personal worth--how much did he leave with?
He is right, but he is wrong.
GP basically said that the C64 had the first graphics chips. Weeelll... yes, before IBM had them. But Atari was doing graphic chips before Commodore, so that's why he is wrong.
Okay, I'm forced to agree with this. Atari did very little to upgrade the ST, and it pissed me off too.
It's a bit of a damned-if-you-do, damned-if-you-don't situation. It was always said that Atari didn't want to confuse users by making them pay attention to hardware specs when buying software, but the rest of the industry was doing just fine with that dilemma.
I would have killed for an expansion slot on the ST. A cheaper Mega ST. Anything. I did some crazy upgrades to my ST (stereo sound, a 4096-colour video expansion, 4MB of memory). I got a 20MB hard drive (bought it from Bill Wilkinson himself). You shouldn't have to jump through the hoops that I did to expand a machine. I had the Magic Sac, then the Spectre 128, then the Spectre GCR.
But then Windows 3.0 came out, and I got a 286 with a mono VGA card, 2MB of RAM, and a 20MB internal hard drive. That's when Atari stopped being my main computer.
I think I would blame Irving Gould rather than Tramiel for sucking the money out. Tramiel wanted new designs; he wasn't afraid of trashing an architecture in order to move on.
Commodore had all sorts of 68000, 8080 and Z8000 designs (although most were co-processors to the 650x) that never saw the light of day. But because they saved so much money by using MOS Technology processors, they kept going back to that well. I think their biggest failure was not forcing MOS Technology to come up with the 65816 themselves.
Geez, Tramiel didn't come to your house and force you to buy it at gunpoint. The world couldn't keep you from being a dumbass Atari fanboi, so why blame him for smacking you back to reality?
Atari couldn't afford to take three years to design the next Amiga. They did great considering the time/money constraints.
If Commodore hadn't bought MOS Technology, the Apple
So...if Apple had to use the 6800, the price of their computers would likely be around $200 higher.
Jack *had* to decimate Atari. Atari was losing so much money that nothing short of gutting it could save it.
I think that he was referring to cards like those put out by Applied Engineering, that would give you higher resolutions, RGB output, etc. Apple didn't do much more than 80-column/memory cards for the
Well, I dunno if the 6800 was more powerful than the 6502 (wasn't the 6502 more or less a clone of the 6800?), but the 6809 sure was.
And the Z80 was heading off a cliff by 1982 (IBM PCs were starting to eat CP/M machines for lunch). I don't think I'd say the Z80 at 4MHz was more powerful than the 6502 at 1MHz. The 6502 could do more per cycle.
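To put some rough numbers on the "more per cycle" argument: here's a back-of-the-envelope sketch (the per-instruction cycle counts come from the two CPUs' datasheets; the 1MHz and 4MHz clocks are the ones being compared above — the operation names and the grouping of "comparable" instructions are my own simplification, not anything definitive):

```python
# Per-instruction timings (in cycles / T-states) from the CPU datasheets.
TIMINGS = {
    ("6502", "reg_transfer"): 2,   # e.g. TXA: 2 cycles
    ("6502", "zp_load"):      3,   # e.g. LDA $xx (zero page): 3 cycles
    ("z80",  "reg_transfer"): 4,   # e.g. LD A,B: 4 T-states
    ("z80",  "mem_load"):     7,   # e.g. LD A,(HL): 7 T-states
}

# The clock speeds under discussion.
CLOCK_HZ = {"6502": 1_000_000, "z80": 4_000_000}

def usec(cpu: str, op: str) -> float:
    """Microseconds one instruction of this kind takes at the CPU's clock."""
    return TIMINGS[(cpu, op)] / CLOCK_HZ[cpu] * 1e6

# The 6502 needs half the cycles for a comparable register move
# (2 vs 4), but the Z80's 4x clock still wins on wall-clock time:
print(usec("6502", "reg_transfer"))  # 2.0 us
print(usec("z80", "reg_transfer"))   # 1.0 us
print(usec("6502", "zp_load"))       # 3.0 us
print(usec("z80", "mem_load"))       # 1.75 us
```

So per cycle the 6502 really did do more work; whether a 1MHz 6502 kept pace with a 4MHz Z80 on real programs depended heavily on the instruction mix and memory access patterns, which is why this argument never quite dies.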
As far as business machines go, it was more about an operating system than about any one computer. I saw CP/M run on some pretty crappy hardware, and I would rather have had an Apple, PET or TRS-80 than some of those so-called business machines. Once VisiCalc came out, and then Lotus 1-2-3, it was game over for CP/M.
This is my favorite comment of the day. Thanks!
I love how people who weren't there try to pass themselves off as experts
Really? I must have been a time traveller then, because I recall using an IBM PC in 1982 (came out in '81).