
Comment Nixon introduced Metrics in the 70's (Score 3, Interesting) 440

I was in the first grade in California when they started teaching us the metric system. That went on for a couple of years, but we returned to "English Measure" after Nixon left office. I didn't see Metrics again until I took trig.

Here's a paragraph from Nixon's letter to Congress:

5) An important step which could be of great significance in fostering technological innovations and enhancing our position in world trade is that of changing to the metric system of measurement. The Secretary of Commerce has submitted to the Congress legislation which would allow us to begin to develop a carefully coordinated national plan to bring about this change. The proposed legislation would bring together a broadly representative board of private citizens who would work with all sectors of our society in planning for such a transition. Should such a change be decided on, it would be implemented on a cooperative, voluntary basis.

Source: http://www.presidency.ucsb.edu...

Comment AWK SED GREP... (Score 2) 106

I was using SVR and BSD unixes from '88 forward. Just about everyone used awk(1), sed(1), and grep(1) to get non-dataplane work done. When Perl came along in the early 90's, the propeller heads who brought us awk(1) had improved awk's regular-expression handling, so you could accomplish everything with awk/sed/grep that perl could do. So very few of my coworkers switched to perl: it was yet another "flavor of the month" scripting language that you could easily avoid. Oh, and perl wasn't distributed with the money-making unixes of the time, and Linux hadn't even come out yet, so using perl meant you had to install it and manage it on every server in the farm, across many unix flavors. Talk about de-motivation.

I admit it: I am a developer, I worked in C in 1988, and I work in C today. Back in 1993 when perl first darkened my e-mail inbox, the guy in charge of administering ClearCase wrote a ton of perl scripts to make ClearCase do whatever trickery we needed. I've never quite understood why he snorted the kool-aid and went with perl, but I'm guessing he was bored... with ClearCase administration... and perl gave him something to poke a sharp stick at.

Fast forward to 2017. I'm still an engineer and my hands are dirty with C code. More than 50 languages have come and gone since the late 80's, but I still butter my bread with C, and for trivial scripting or shell work, I still use awk/sed/grep, and now python. Python is exceptional, but I don't want to go into that: I'm talking about perl.

I completely skipped perl, and I am glad: any time I have to debug someone else's perl code, it's usually because it broke in some mysterious way, and the author wasn't much of an engineer to begin with, doesn't work here anymore, or is stubbornly holding onto perl because they put so much effort into learning it and want to avoid responsibility for the old ratty perl code they wrote. The perl developer often hates python with a religious passion, and that's a little weird. And the perl developer doesn't like to use comments in their code, or adds snarky four-letter-word comments, or leaves dead comments around like little land-mines that distract the next poor sot who has to debug it.

So as far as perl goes, I'm not a fan: perl has brought me more time-wasting bugs than any other language I've done code-clerk mopping-up operations for. So please go away, perl, so I can get some work done. I'm too old to waste the time.

Comment Again with ReactOS (Score 1) 118

Improved graphics in ReactOS is like improving the quality of a jump off a cliff: the improvements may be nice for an observer, but as the person jumping, my patience for testing ReactOS is nearing an end. If you get it working and charge 10% of what Microsoft charges, you will retire early, and I'll ask you to STFU and take my money.

But so far I haven't found a version of ReactOS that gets past my first attempts at using the internet: something bizarre always goes wrong that I can't find documentation for, and I admit, I'm not going to try very hard when I've been a Linux user since 1992, where shit works.

But I must admit: get this working, and you are on to something really, hugely big. I offer to carry you through the ticker-tape parade that will thank you for eliminating Micro$oft.

Comment KIM-1 (Score 3, Interesting) 857

In 1975 my father brought home a KIM-1 that had been built by the guy who designed the IMSAI 8080. I eagerly typed in the 6502 instructions included in the HOWTO manual that came with it, and I got an idea of what Turing completeness was all about. Great fun. But at the time there was no way to save the instructions, so you lost everything when the power went down. I got over it: I was 10 years old, and it was a great way to learn about volatility and, as I mentioned, Turing completeness.

Then in 2005 I was working for a GPS company (which later became Garmin). One day my manager came to me with an SoC data sheet, and he said something like, "This is a really cheap part, but we need to program it to coordinate the 32-bit GPS part with the SSD part and the USB part." I read the datasheet and about screamed with joy when I saw that it was a 6502 (now owned by ST Micro). Once again, the 6502 taught me in an amazing way: the 6502 was bit-banging (I2C) the NMEA sentences out of the 32-bit part, controlling the SSD part, and handling the interface to the USB device. My job: write firmware (YES! FIRMWARE on a 6502! NO MORE POWER OUTAGES) so the high-speed USB part could power things and exchange NMEA sentences, and make the SSD hold the ephemeris and almanac for the 32-bit part. That little 6502 certainly did its job, and I had great fun re-learning the 6502 instruction set.
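
For anyone who's never bit-banged a bus: the idea is just driving two GPIO lines in software. Here's a minimal, hypothetical sketch of clocking one byte out MSB-first the way an I2C master would — the `BitBangBus` struct and its `gpio_*` hooks are stand-ins I made up for real register writes on the SoC, and a real driver would also handle start/stop conditions and the ACK bit.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical stand-in for the SoC's GPIO registers.  Instead of
// touching hardware, it records the SDA level at each SCL rising edge,
// which is when an I2C slave samples the data line.
struct BitBangBus {
    std::vector<int> sda_trace;           // SDA level seen at each clock
    int sda = 1;                          // I2C bus idles high
    void gpio_sda(int level) { sda = level; }
    void gpio_scl_pulse()    { sda_trace.push_back(sda); } // rising edge
};

// Shift one byte out MSB-first, as a bit-banged I2C write does:
// set SDA while SCL is low, then pulse SCL so the slave samples it.
void i2c_write_byte(BitBangBus& bus, std::uint8_t byte) {
    for (int bit = 7; bit >= 0; --bit) {
        bus.gpio_sda((byte >> bit) & 1);  // data changes while clock is low
        bus.gpio_scl_pulse();             // clock the bit out
    }
    // (a real driver would now release SDA and clock a 9th ACK bit)
}
```

On a 6502 the same loop is a handful of shifts and stores to memory-mapped I/O, which is why such a cheap part could glue the GPS, SSD, and USB blocks together.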

Comment Is this news? (Score 2) 508

Back in the 1940's, and even decades before that, this sort of thing was only muttered by sci-fi authors. Fast forward to our lofty present date, and people with letters after their names get attention for regurgitating, without adding one original quip, what was said almost a century ago. Why, oh why, is this news? Will it still be news in 30 years, after it STILL hasn't happened?

Comment Re:Math is fine! (Score 2) 218

I loved math until the 3rd grade (c. 1973); between 3rd and 12th grades, I was a math failure. When I hit the age of 21, I decided to go to college, and that meant starting from scratch with remedial math at the local junior college, with a plan to transfer to a 4-year university after that. Remedial math was wonderful: I had an instructor (Dr. Baum) who would keep explaining things until I understood. This was very different from grade school, where instructors would keep repeating the same thing that I didn't understand, just a bit louder each time (as if that was supposed to help me understand).

Five years later I was in my senior year after my transfer to UC Berkeley. I was an EECS major, I had completed the Calculus (for engineers) and Math Analysis, and I was sailing toward my EECS degree. I decided to take a trip back to my original junior college and look up Dr. Baum. To my extreme disappointment and sadness, I found out Dr. Baum had died. He never knew the door he had opened for me.

Here we are, 30 years into my EECS career (more of a CS career, as it turns out; but I get to beat up on oscilloscopes and logic analyzers once in a while), and it has been a lot of fun. Dr. Baum proved that it can come down to EXACTLY ONE instructor who reaches a student's mind, and without excellent instructors, we easily lose minds that could otherwise have been STEM participants. More's the pity. WE NEED MORE MATH TEACHERS who aren't crappy.

Comment Looks cool... but... yet another language tool? (Score 4, Insightful) 105

I remember back when Cobol was going out of style, and I was an early adopter of C++ (1987-ish). Ada was going to change the world, C++ was doomed to never go anywhere, and C was going to vanish. Yourdon wrote a book about the fall of the American programmer. I wept over my keyboard. I told everyone I was crying because my C++ compiler was so frigging slow. But I knew the world was going to change, that Ada was going to kill all the other languages, and I really loved working in C and C++. So I waited for the world to change.

Prolog was a big deal about the same time, and I didn't want to miss out, so I jumped on it for logic programming. And the "wow" thing of the day was Expert Systems. They were going to change the world. So I wrote some interesting diggers with Prolog. And I waited for the world to change.

In around 1992 I entered the CHICAGO beta with Microsoft in preparation for Windows NT (which was going to change the world). I even wrote a device driver for CHICAGO to operate a RHETOREX PCM telephony board, and a printer driver for an old ATARI thermal printer. Fun projects, actually. Didn't make a dime, though. OS/2 WARP came out around then. It was going to change the world.

It was 1994 when I first saw Java. It was going to change the world. I looked at the language, and it didn't interest me: I had C++, and C++ was starting to grow. And I couldn't even imagine not having pointers, not being able to talk to the CPU or devices directly (sans imported libraries). 1995 came along, a friend handed me a stack of floppies (I think about 20) and installed SLACKWARE LINUX over my Windows partition. "This is going to change the world," he said. It was funny, but I really and truly was convinced this time that the world would be changed, and I didn't wait. I jumped into Linux with everything I had, and I've been working in C and C++ in Linux ever since. I'm not trying to be funny or anything.

The truth of the matter is that I've listed only several languages here, but I've worked in at least two dozen others that most people have probably never heard of (e.g. SPL for MTM/32). I keep seeing languages come and go that are supposed to change the world. As a young engineer I'd jump on every new language that came out, but most of the time the language turned out to be raspy in some way: good at exactly one thing, and pretty sucky at everything else. And here we are. 2015. I still work in C and C++ every day of the work week, but I don't see Ada anywhere, I haven't cranked a line of FORTRAN since 1993, I never had to write RPG for a living, I've avoided Cobol altogether, Haskell never took off like it was supposed to (ditto Eiffel), MATLAB costs too much (even though it is a heck of a tool), I like Python and don't much care for Perl, and on and on and on. And I've debugged way more Java code than I ever wanted to, but I haven't written a single line of Java, yet.

And here's what I wanted to get to... I opened up Slashdot today, found the OP's article, and watched the video. And you know what? THIS ISN'T ANYTHING NEW! Not the features, not the tools, not the results. It is yet another language, yet another IDE, and I'm seeing the same kind of features I was using back in the 90's. Funny thing... I use gcc/g++ for my compilers, I use VIM for my editor, and I do quite well. I hate IDE's with a passion; and any time I've been sentenced to use a product with "code completion" or "intellisense", I feel like I've joined some kind of Commune of the Damned. I've quit jobs to escape the transition to the baloney world of IDE productivity. Maybe that means I'm out of touch or old fashioned or "stuck in the 80's". But I've never wanted for a job. And the kids we interview today mostly know the current "hip" language(s) and/or IDE (Hey! Let's write a web page, yah!), but if you ask them about superscalar architectures, or how to write a Fibonacci generator using C++ templates, or what a 3-way handshake is for, you get a deer-in-the-headlights stare.
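
For the curious, the Fibonacci-via-C++-templates interview question has a classic answer: let the compiler do the recursion through template specialization, so the value is a constant before the program even runs. One way to sketch it (certainly not the only way):

```cpp
#include <cstdint>

// Compile-time Fibonacci via template recursion: fib<N>::value is
// computed by the compiler instantiating fib<N-1> and fib<N-2>.
template <unsigned N>
struct fib {
    static constexpr std::uint64_t value = fib<N - 1>::value + fib<N - 2>::value;
};

// Explicit specializations terminate the recursion.
template <> struct fib<0> { static constexpr std::uint64_t value = 0; };
template <> struct fib<1> { static constexpr std::uint64_t value = 1; };

// No runtime cost: this is checked at compile time.
static_assert(fib<10>::value == 55, "fib(10) should be 55");
```

The point of the question isn't the sequence itself; it's whether the candidate understands that templates are a compile-time computation mechanism, not just generic containers.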
