They are called robots for the same reason that radio-controlled fighting toys on television are called robots. Remotely operated devices that mimic the movements of their operators were once called slave devices. The term robot comes from Czech, where it meant a slave laborer!
This technology isn't new. An Austin car dealer a few years back had something similar: the buyer would pay weekly and punch a code from the receipt into a PIN pad, or else the car wouldn't start.
Not new at all. This was well known in metro Detroit at the now defunct Mel Farr Ford.
Mel Farr bought an established dealership. Before Ford Motor forced him to close, Farr became a "superstar" in this type of lease.
You already have experience in something specialized. There may not be many jobs per se, but there aren't many people to fill them. Move into driver development or embedded systems programming.
Metro Detroit and the auto industry have started to come back from the dead. Look for real estate bargains in Oakland or western Wayne County. Ann Arbor is nice too.
The 6502 was an amazing processor; the Apple II also used a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800), the instruction set was reduced. It had only 2 data registers (A, B) and two 8-bit address registers (X, Y), and fewer complicated ways to branch. Instead it effectively memory-mapped the registers by using instructions like: offset Y by A, treat that as an address, and get the byte at that location. Because it could do all that in one clock cycle, this effectively gave it 256 memory-mapped registers. It also didn't have separate input lines for peripherals, and instead memory-mapped those.
Actually the 6502 only had one accumulator, the A register. The 6809 had A and B. It is correct that the 6502 had very nice addressing modes. Zero-page addresses acted more like machine registers. One commonly used addressing mode was z-page indirect indexed by Y: two consecutive locations on z-page acted like a 16-bit pointer and register. Either that pointer could be incremented OR Y could be incremented, so a block move of one 256-byte page was easy.
I don't think I *ever* used ($23,X) where X selects the z-page locs ("register pair") to use as a pointer.
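As a rough illustration of that indirect-indexed page copy, here is a sketch in Python (not 6502 assembly) of what the (zp),Y mode does: two consecutive zero-page bytes form a 16-bit pointer and Y supplies the offset. The zero-page locations and page addresses below are made up for the example, and this models only the addressing arithmetic, not real cycle timing:

```python
def copy_page(memory, src_zp, dst_zp):
    """Copy 256 bytes the way LDA (src),Y / STA (dst),Y / INY would."""
    # Fetch the 16-bit pointers from zero page (low byte first, as on the 6502).
    src = memory[src_zp] | (memory[src_zp + 1] << 8)
    dst = memory[dst_zp] | (memory[dst_zp + 1] << 8)
    for y in range(256):                     # Y wraps after 255, ending the loop
        memory[dst + y] = memory[src + y]    # LDA (src),Y ; STA (dst),Y

# Tiny demo: copy page $20xx to page $30xx via zero-page pointers at $FB and $FD.
mem = bytearray(65536)
mem[0x2000:0x2100] = bytes(range(256))
mem[0xFB], mem[0xFC] = 0x00, 0x20            # pointer at $FB -> $2000
mem[0xFD], mem[0xFE] = 0x00, 0x30            # pointer at $FD -> $3000
copy_page(mem, 0xFB, 0xFD)
assert mem[0x3000:0x3100] == bytes(range(256))
```

On the real chip you would bump Y in a loop rather than pass it as a Python index, but the pointer-plus-offset arithmetic is the same.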
At one time I had an Apple 2+ with a hardware accelerator board which ran at 3 MHz instead of the standard 1 MHz. For many tasks, my fast 2+ outran a contemporary PC-AT machine. For word processing, the Apple was much more responsive.
The conventional wisdom at the time was that the 65xx was, clock for clock, 4x more powerful. 3 MHz x 4 is effectively 12 MHz, which was faster than an AT. (Yes, I'm ignoring memory and disk....)
Almost all pre-loaded software on a major PC brand (excluding Apple) is crippleware.
Not just pre-loaded software. Today I saw a poster in an internet newsgroup wanting to translate very old computer language X to newer computer language Y. Y is often used in an educational setting, particularly in physics, astronomy and math. I'm fluent in X and have used Y in the past, with some major gotchas. The previous version of Y misbehaved on Win XP: it would regularly GPF, and graphic displays were mostly useless.

I remembered that they offered a program to convert X to Y. It was compiled, and it was missing two required DLLs. After rooting around on the internet, I found these libraries and ran the program. It created something that looked like it was written in Y, but did not look so great.

Just for fun, I asked myself if I could compile and run the program in Y, since the program in X compiled and ran as-is with more modern versions of X. The vendor offered two demo versions. The first choked on the source code, complaining about bad line numbers before it hung. Its interface reminded me of Win 3.1, so I was happy to remove it.

The second demo version was quite a bit larger. After starting its IDE, I loaded in the converted source code and tried to run it. No luck again. This time it "compiled" but left somewhat cryptic error messages. OK, I'll just try to fix the source code and re-compile. That turned unpleasant, because the editor in the time-limited demo version was crippled: it would not save edited source files. I did not feel like going to the bother of using an external text editor to make corrections.

I'm not happy about a time-locked demo. I'm even less happy about one that deliberately aggravates me. It's not fun at all to test and correct example programs in this fashion. Why would I want to try this demo one minute more? I gave up and uninstalled it. I then let the newsgroup know what had happened and that I would *not* be buying any products from the vendor of Y.
There was Beneath Apple DOS, a fabulous book from the time which was invaluable for figuring out what was going on. My understanding was that Don Worth and Peter Lechner disassembled the shipped code and sorted out how things worked, with great explanations.
One thing that made their task easier was a program supplied with Apple DOS called FID (File Developer). That program hooked into a mid level part of DOS called the File Manager. FID spent a lot of time populating a data structure called the "File Manager Parameter List" and then calling various lower level routines.
Worth and Lechner, however, did a wonderful job of explaining Apple DOS at all levels, from how the disk hardware works all the way up to the command processor.
Well, to be fair, Fortran has some pretty funky built-in file I/O constructs, and if you're doing binary data, the entire array must be read in at once.
Not modern Fortran. Most compilers now support stream I/O. List-directed I/O and internal reads and writes have been around for a long time.
BTW, part of my job is management of a small business. If I had a business card, it would list my title as JOAT (jack of all trades) or CCBW (chief cook and bottle washer).
This, ladies and gentlemen, is why we don't get any kind of respect in management. Because that's what they see in us: The computerized equivalent of plumbers and bricklayers.
That's what happened to me today. I got a frantic call from the front desk: "Help, I can't print a receipt - on any of 3 printers." Freeze. Deer in the headlights. Not a clue. The document in question was a PDF file displayed within a PDF reader. I quickly tried 2 printers and failed. The obvious solution, at least to me, was to save the file to the desktop, then to a USB drive, then sneaker-net it over to my machine and print from there. Rebooted and the problem went away. Would anyone else have thought to save the file and print it elsewhere? No way, not in a million years. It's not just technical knowledge, but a particular way of thinking about solving problems.
I guess I'm not the only one who is amazed again and again how simple, trivial concepts can be impossible to grasp for allegedly intelligent people. And of course I consider what I can do fairly trivial because, well, let's be honest, it is.
I am amazed, in a technical newsgroup, how much trouble some scientists and engineers (I *hope* they are students...) have with simple programming concepts or using command-line tools. One poster wrote a small novel of code to take an input file, extract a selected column of data and write it to an output file. That's a trivial one-line program in AWK. It's not bad in Fortran either, except that people often want to read the entire data file (of unknown length) into an array first instead of processing the input line by line. I guess the concept of INPUT-PROCESS-OUTPUT is too difficult. My experience with reading punched cards and generating line-printer reports *must* be worthless....
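The AWK version really is one line: `awk '{print $2}' input > output`. Here is the same line-by-line INPUT-PROCESS-OUTPUT idea sketched in Python; the column number and sample data are just illustrations, and a real script would read from a file or stdin instead:

```python
def extract_column(lines, col):
    """Yield whitespace-separated field `col` (1-based) from each input line."""
    for line in lines:                  # INPUT: one record at a time
        fields = line.split()           # PROCESS: split into fields
        if len(fields) >= col:          # skip short records instead of crashing
            yield fields[col - 1]       # OUTPUT: just the selected column

# Works the same on a list of strings or an open file handle -- the whole
# file never has to fit into an array.
sample = ["time  temp  pressure",
          "0.0   21.5  101.3",
          "1.0   22.1  101.2"]
print(list(extract_column(sample, 2)))  # -> ['temp', '21.5', '22.1']
```

Because `extract_column` is a generator, memory use stays constant no matter how long the input file is, which is exactly the point being made above.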
1. Performs better than Win 7 (for me).
2. Has been perhaps the most stable iteration of Windows (for me).
Classic Shell solved my problems with the UI.
So far Win 8 does one thing better than Win 7 (or Vista). It handles printers well.
1. I attached an older LJ 1020 to a new Win 8 laptop and installed the latest driver. PDF files would not print, so I rolled back to the latest *recommended* driver. Removing the printer and installing the new driver worked quickly and cleanly.
2. I have two networked HP printers on my LAN. Rather than install a full software suite, I chose universal printer drivers for PCL 6. Most of the time, when my network or router goes down, the printer assignments get lost. Printing sent to my B&W printer goes to my color laser, or just does not work at all. I finally gave up and installed printer specific drivers on all of my Vista and Win7 machines. Ick.
Win 8 auto detected both printers. It required *no* drivers to be loaded. I've had several power outages. Since then, my networked printers do not get lost on my Win 8 machines.
The big downside to pascal was that it took more effort to run on multiple platforms. Java had that going.
Not at all. UCSD Pascal was designed to be easy to port to various architectures. The bottom level was machine code. On top of that was P-code. Most everything including system utilities, editors, compilers was written in P-code.
Turbo Pascal, which I used both on CP/M and MS-DOS, introduced a nice, fast built-in IDE. Compiler, editor and programs shared a common runtime. MS-Windows was written with Pascal in mind. Once upon a time, STDCALL meant FAR PASCAL.
The main benefit of Pascal was that it was a good teaching language. It also was designed so that it could compile itself. Its design made 1 or 2 pass compilation easy.
Corn and sugarcane got nothing on the sugar beet.
As a Michigan native, I have always thought that sugar came from beets. This part of the state is the heart of sugar beet country. Growing more beets would solve several problems at once. It's time to plow under most of Detroit and plant beets. This would reclaim more of the city for productive use, create a tax base and possibly produce bio-fuels. At the same time, we can lower unemployment and empty the jails by teaching young people to farm. Imagine the historical irony of undoing the "great migration".
The CIA (Culinary Institute of America) used to be in New Haven. Maybe this explains why so many Bulldogs became spooks.
Matlab is obviously designed by someone who hates computers and FORTRAN was designed BEFORE the compiler was invented, meaning it was never meant to be used as what we would call a programming language.
Matlab was designed as a wrapper around library routines. Obviously you've never written in a modern Fortran. It has modern flow control, whole-array operations, generic functions, user-defined types, polymorphism and so on. AND a modern Fortran compiler will still compile the dustiest decks from the early '60s.
Us old farts are trainable. I've forced myself to use Fortran 77, 90/95/2000/2003 to replace the FORTRAN I learned back in '62.
So keep off my lawn!
I do a lot of engineering software and a lot of that is in Fortran. A few years ago I migrated a system with 400 thousand lines of VAX-Fortran code to Linux, using g77. Recently I had to install this system in a new computer, running Ubuntu Lucid. To my dismay, I learned that Lucid doesn't have the g77 package anymore, the gcc compiler suite has been "upgraded" to gfortran. And gfortran does not support the VAX extensions that g77 did.
Luckily there's still a way to install g77 in Lucid using the Hardy repositories, but how long will this last?
Had the old engineers said, "OK, Fortran is dead, let's just keep a legacy compiler to run old code" everything would have been fine. But no, they insist on "improving" Fortran by putting C language features, e.g. pointers, into it. Why can't they just learn to program in C and let the old compilers do what they are good for, which is running legacy code?
I once signed a petition to retire Fortran [fortranstatement.com].
1. Horse puckey. Find a distribution that contains g77. Or compile g77 from source for your target platform. If you have maintained a 400 K LOC program then building g77 should be no problem.
2. gfortran goes out of its way to compile old g77 code.
3. Don't knock Fortran 90/95/2000/2003 etc because YOU can't write standard conforming code. Learn to use the tools properly instead of bitching about them. Modern Fortran gives you the facilities to replace almost all VAXisms. The only thing I know that will not convert well are VAX extensions that involve various RMS files. BTDT. If you are using keyed or ISAM files on the VAX, then you deserve to be hosed.
4. I don't care how Atlas is optimized. For me, having true multidimensional arrays indexed the way I want them brings me more optimization than C could ever do. Plus the "write-only" code that C and C++ encourage creates horrors even worse than the oldest dusty program from FORTRAN IV days. Modules etc. catch all sorts of errors that C/C++ compilers happily allow.
5. None of this justifies paying junior developers more, but your level of CRI does.
6. I was programming in Fortran long before you were born. So get off my lawn!
Operator: Can I help you?
Skype: YES, all of our peer-to-peer servers just went down. We have 23 million users offline right now.
Actually it's more like this:
Skype customer service. This is Peggy.....