Interview with Monte Davidoff

Motor writes: "The Register has an interview with Monte Davidoff, one of the men responsible (along with Gates and Allen) for the original Microsoft BASIC. So what does he think of Linux... CPRM... Python... RMS and GNU software? Great stuff."
  • by Anonymous Coward
    You might like to take a look at the Windows Scripting Host. It has been available for years, it's MS-produced, and it comes as standard with W2K (maybe others too). It's a free download from the MS website if it's not already there or on the CD. It's been a while since I used win9x, but I'm fairly sure it was also on the win98 CD.
  • by Anonymous Coward
    Let me get this straight. You write C++ code for 7 years without using any inheritance? I'm impressed. I suppose you use C without pointers too? Perl without regexp? HTML without tags? How about English without verbs? Do you use only the first gear on your automobile? Please, I want to know.
  • by Anonymous Coward
    You're my new hero.
  • Uhh, it's perfectly possible to live independently off $15k/yr. You may have to choose where ya live carefully (I wouldn't want to try it in the Bay Area), but I've lived comfortably off of $8k/yr before. Not that I'm not happier with my present (higher) pay -- but I had no complaints even then.
  • Uh, I'm 21.
  • That's pretty closed-minded. Most excellent programmers I know (yes, there are some better than me... sigh) cut their teeth programming BASIC. For someone curious about computers, in those days it was very easy to get into BASIC: it was bundled with DOS, taught in (lower) schools, there were help files for it, and it's an easy language to pick up. The alternative was to buy a 'real' compiler for a 'real' language, either from Borland or Microsoft (TurboXXX and QuickXXX respectively), unless one knew people who could get that stuff for you. And then a stack of books to be able to do anything useful. Now, I don't know how serious Dijkstra was when he made that statement, probably not very, I would hope. Otherwise it makes him seem arrogant: oh, look, lucky me, I never had to deal with BASIC, I started with a _man's_ programming language unlike those other losers!
  • by Andy Tai ( 1884 ) on Friday May 11, 2001 @06:54PM (#228942) Homepage
    For someone who was in the dark side at the very beginning, it is nice that he actually recognizes the contribution of GNU. The contrast between Bill Gates and Richard Stallman is worth noting, for this contrast is shaping what's to come in the software world.

    "Most of you steal your software... What hobbyist can put years into programming, finding all bugs, documenting his product and distribute for free?"----An Open Letter to Hobbyists, Bill Gates, Micro-soft, 1976

    "GNU... is the name for the complete Unix-compatible software system which I am writing so that I can give it away free... Once GNU is written, everyone will be able to obtain good system software free, just like air."----The GNU Manifesto, Richard Stallman, Free Software Foundation, 1985

    Microsoft Windows vs. GNU/Linux, 2000

  • VB supports interface inheritance which is a feature of COM.

    As one example, virtually every MTS component you write implements the ObjectControl methods Activate, Deactivate and CanBePooled.

    We also usually write our external component interfaces in IDL and then implement them in VB code.

    It's pretty standard. I honestly use inheritance all the time in VB.

  • I and most other geeks I know cut our teeth on BASIC back in 1982. Microsoft BASIC, Applesoft, Atari, Commodore, etc.

    10 print "stuff"
    20 gosub 50
    30 goto 10
    50 print "more stuff"
    60 return

    You could do some simple functions in MSBASIC, it supported a function definition which was really more of an inline macro.

    Other than that, yeah it pretty much sucked for trying to do anything structured.

    That all changed towards the mid 80's. There were a lot of more advanced BASIC compilers available for the Amiga, ST, PC, etc. that supported functions, subroutines, etc. without line numbers. :)
  • Was he in the famous Microsoft team photo from 1978 - the scary one where almost everyone has a terrifying arrangement of facial hair except for the young CEO - and could he please identify himself? As it happens, he wasn't there, but he does shed light on its origins:

    "That was taken after my second summer, working on the second BASIC for Microsoft. The story there is that Bob Greenberg (center right) had won a prize from a photo lab, which was a free photo shoot." So the first corporate publicity shot came about completely by chance.

    The stories of these people can be found here, for example.

  • Indeed - Monte is a cool guy. I know him through his Harvard roommate, whom I worked with 15 years or so ago. It's funny that the guy didn't ask the most obvious question, i.e. is Monte steamed about not being a billionaire too? Well - let's just say he's gotten over it ;-)

    It's kinda neat to see a friend actually show up on Slashdot! ;-)

  • So the next time you decide to put down VB, remember that you are deriding a compiled, object-oriented language. Java can't even claim to be that.
    Well, as you pointed out, VB lacks inheritance which is generally considered one of the big 3 features it takes to qualify a language as OO. As far as being compiled, any language can be compiled, dipshit.
  • VB supports interface inheritance which is a feature of COM.
    It's pretty standard. I honestly use inheritance all the time in VB.
    Hmm... I wouldn't mix the two words 'inheritance' and 'interfaces'. The 'interfaces' I've dealt with (as language features, e.g., Delphi, Java) usually lack an implementation (interfaces hide implementation), while inheritance generally refers to inheritance of attributes and/or behavior (implementation). I'm by no means a VB expert; however, saying that VB supports inheritance because it lets your classes (or whatever they are in VB) extend interfaces is not a compelling argument.
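    The distinction is easy to sketch in Python (hypothetical class names, nothing to do with VB or COM syntax): an interface only declares behavior, while implementation inheritance hands working code down to subclasses.

```python
from abc import ABC, abstractmethod

# Interface-style inheritance: Speaker declares *what*, never *how*.
class Speaker(ABC):
    @abstractmethod
    def speak(self) -> str: ...

# Implementation inheritance: Animal hands working code down to subclasses.
class Animal:
    def describe(self) -> str:
        return "a " + type(self).__name__.lower()

class Dog(Animal, Speaker):
    def speak(self) -> str:   # the interface carried no code, so this
        return "woof"         # body has to be written by hand

d = Dog()
print(d.speak())     # woof   -- satisfies the Speaker interface
print(d.describe())  # a dog  -- implementation inherited from Animal
```

    Note that Dog gets describe() for free but must supply speak() itself -- which is roughly the situation the parent poster describes for VB classes implementing COM interfaces.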
  • You're incorrect, Daisy. The brain is enormously pre-wired. Who you are is, on the whole, by nature, not nurture. Shyness, for instance, can be predicted with startling accuracy *before* the baby is born, by monitoring its reaction to stimulus. Baby pre-speech gurgles and noises are wired-in, and the brain is already configured to learn language.

    It's not a blank slate. There's quite a bit of structure built-in, and the entire thing is primed for learning and pattern recognition.

  • >Get over it. THere will never be another Bill
    >Gates in the software industry.

    Or another Rockefeller in the oil industry. Or another Henry Ford in the automobile industry. And (switching from money to fame), it'd be a touch hard to get another Linus Torvalds in the operating-system space.

    Hands up, everybody who's surprised that it's easier to get a 50% share of an industry when it's really, really tiny, and then hang on for the ride as it grows, than to try to de-commoditize an established one.


  • "...I have to say something positive about the open source and FSF stuff or the mob will come after me and mutilate me..."

  • Most people on slashdot are posers. They don't even use linux they just hang out here and pretend they are cool.

    Or they use Opera and leave it in its default "Pretend to be IE5" mode.

    I use Opera on both Linux and Windows out of preference, and while I prefer the Windows version at the moment (gesture support rocks, and scrolling works better than on Linux), I can see how leaving it with IE identification would mess up the stats.

    Remember this when reading your logs, webmasters!

  • Except that version 2 of BASIC was "more efficient". THAT'S certainly changed.

  • Yes, I especially enjoyed this bit near the top, where Gates is talking about using time-shared equipment back in junior high:

    You would type the programs off-line on this yellow paper tape and then put it into the tape reader, dial up the computer, and very quickly feedin [sic] the paper tape and run your program. They charged you not only for the connect time, but also for storage units and CPU time. So, if you had a program that had a loop in it of some type you could spend a lot of money very quickly. And so we went through the money that the Mothers Club had given very rapidly. It was a little awkward for the teachers, because it was just students sitting there and zoom! -- the money was gone.
    Rented software, anyone?
  • It is. My favourite quote from billg:

    In fact, they thought there wasn't enough work to go around, so they kicked me off. I said, "Look, if you want me to come back you have to let me be in charge. But this is a dangerous thing, because if you put me in charge this time, I'm going to want to be in charge forever after."

    I don't think that desire has diminished one smidgen with time...

  • Fascinating read. From the article:

    We went around the country in this big van, big blue van, they had, with these machines starting up user groups and demonstrating things. Actually, before we even shipped BASIC, somebody stole the demo copy out of the van and started copying it around and sending it to different computer clubs.

    I wonder if that was the first ever incident of software piracy? ;-)

  • I found it interesting that in the list of Intel processors preceding the 8088 was the 404. A coincidence, or a prediction of where it will all end up?

  • by spudnic ( 32107 ) on Friday May 11, 2001 @03:26PM (#228958)
    I really don't think that's the point. It appears that he was just a part time employee working on a project during two summers.

    We aren't told that he decided to take the high road with his career and not try to become filthy rich.

    He did his job and went on with his life.
  • The same thang can be said about perl.
  • That wasn't nearly as fun as changing a single variable to make Nuclear Gorilla!

    In my mind, having a language, even BASIC, plus example source code in the OS distribution was a very good thing and I was sad to see it disappear. It wasn't long before I was exclusively Linux.

    You can't even script in Windows without third-party tools. That's pretty pathetic.
  • by James Lanfear ( 34124 ) on Friday May 11, 2001 @09:26PM (#228961)
    The algorithmic approach to software construction is the primary reason why software sucks.

    Which is unfortunate, since algorithms are one of the fundamental concepts in computing. I'd love to hear how you intend to replace the whole of computer science with an algorithm-free alternative.

    And it's all because of the algorithm.

    And physics. If we could get rid of physics it would be a lot easier to keep planes in the air. Actually, the common element seems to be time -- physics supplies it and algorithms consume it. I suggest we stop using time immediately.

    Well consider this: The reliability of software is inversely proportional to its complexity while the reliability of the human brain improves as it gets more complex.

    When was the last time you found a worm with Alzheimer's, or schizophrenia, or Tourette Syndrome? I have yet to see a bug so depressed as to leap beneath a shoe to be squished. (You could -- and I might -- argue that those don't count as defects, since, e.g., schizophrenia could very well be the correct state for some people's brains, given their genetic composition, but I could just as easily say that Windows should crash given the crappy code that goes into it.)

    The most obvious difference between software and the brain is that the former uses sequential algorithms whereas the latter is based on parallel streams of signals.

    Which are provably equivalent to sequential and parallel algorithms, barring a gross violation of the laws of physics. In fact, if you accept ANNs as reasonable abstractions of real neural networks, I have a book on the topic right here.

    A signal-based system is more reliable because it makes it possible to have strict control over the timing of events. By contrast, one can never be sure when an algorithm will be done, and this creates all sorts of timing problems.

    I'm sure this would come as a surprise to hard realtime systems and neurons alike.

    It is no secret that hardware is inherently parallel and driven by signals.

    Try directly implementing an algorithm in your choice of fundamental fields. Note the reasons why this doesn't work.

    I just remembered who you are, and grew very tired, so I'm going to watch TV. Have fun changing the world.

  • clued in ones usually:
    [1] get bored easily.
    [2] don't care about the business aspect.
    [3] don't like to support the same product for n years after they write it.
    [4] detest marketing deadlines and budgets.
    [5] prefer working in small groups or alone.
    [6] move on to more interesting things. raking in money doing nothing is really boring. trust me on this one.
  • I've still got the original floppy, and the manual. I learned programming from the function dictionary in the manual. That manual will forever be placed in an esteemed position in my library (a 90-degree turn of my head is all it takes to view it right now), for it got me interested in programming. I lost my social life during middle school to that book. Ah, the memories....

  • Most people on slashdot are posers. They don't even use linux they just hang out here and pretend they are cool.
  • Guess what, sparky, money isn't everything. :) I would rather make my salary of ~$15,000/year and be very happy with what I do, than make insane amounts of money and not be happy. I am pretty sure he likes what he does.

    Why do we always assume that rich people are unhappy? Is it a way of expressing our envy, and since we aren't that rich, well, at least _I_ am happy, because he surely can't be! But then, you did mention you had the chance, and turned it down.

    Maybe rich people are happy making money, just in the pure pursuit of it. Having it doesn't do it, because these guys just keep making more, it seems. Maybe I'm not rich because making money doesn't make me happy, so I don't pursue it with gusto. And I can't imagine lots of money making me happy. I've seen lots of miserable poor folk, too. I can tell you one thing: I sure do like being comfortable a whole lot more than being poor, which I've been.

  • If you want to get Ogg Vorbis supported, I suggest you get it included by default in the xmms source release, or at least make the ./configure script stop if you don't have the Ogg Vorbis libs on your system. Also, getting it included in Red Hat and other distributions would be a good idea.
  • I must disagree with you here - you are discussing the topic at far too high a level. You're looking at the applications and trying to understand the hardware. There are two models of cognition prevalent in modern cognitive science: Connectionism and Symbolism. If you have formal education in the field of Cognitive Science, you already know you sound like a Connectionist. I, on the other hand, am a Symbolist. Let me explain: (and if there are any cognitive scientists reading, please forgive my constant analogies to computer graphics)

    The idea behind the Symbolist model of cognition is looking at the brain to determine the minimal amount of pre-wiring you need to build a device that thinks like a human. In the Symbolist model of cognition, this turns out to be quite a lot - there are many things you know how to do that you never learned and were never taught.

    Take vision for example. Your eyes are detectors of luminosity. They essentially record giant "arrays" of luminosity data and ship it over the optic nerve to the brain. Now, we should all know that a collection of "pixels" doesn't make a 3D scene. So here's what has to happen:
    1) The brain has to find the edges in the input. It does this by examining the second derivative of the input function. Where the second derivative of a luminosity function has a zero, there is contrast. The brain collects areas of contrast and forms edges. When you were born as a blank IBM RAM chip, who taught your eyes to differentiate a luminosity function and form edges?
    2) Edges are collected to form 2D "polygons". These polygons are collected by finding groups of closed edges. When you were a child, were you taught to find polygons by locating closed groups of edges? I don't think so.
    3) Now that the brain has polygons, they can be assembled into geometry, and certain predictions can be made. When you see a new object that has a certain geometry, you understand what that geometry is like - even on the side you've never seen! You've never seen the dark side of the moon, but I'll bet you can extrapolate what its geometry is. I'm not talking about craters and stuff like that - but I would be quite surprised to hear you thought the moon was spherical on the Earth side and cubic on the dark side. Where were you taught to construct generalized geometry from polygons?
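    Step 1 can be sketched numerically (a toy 1-D model of my own, not a claim about real neurons): take discrete second differences of a luminosity array and mark the sign changes (zero crossings) as edges.

```python
# Toy 1-D "retina": find edges as sign changes (zero crossings) in the
# discrete second derivative of a luminosity array.

def second_diff(lum):
    # f(x-1) - 2*f(x) + f(x+1), the discrete second derivative
    return [lum[i-1] - 2*lum[i] + lum[i+1] for i in range(1, len(lum)-1)]

def edges(lum):
    d2 = second_diff(lum)
    # an edge candidate wherever the second derivative crosses zero
    return [i for i in range(len(d2)-1) if d2[i] * d2[i+1] < 0]

# dark plateau, a brightness ramp, bright plateau
luminosity = [10, 11, 13, 40, 80, 95, 97, 98]
print(edges(luminosity))  # one zero crossing, in the middle of the ramp
```

    The single crossing lands in the middle of the ramp - exactly where a contrast boundary sits in the input.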

    The point is that there are literally thousands of things that the brain must be born knowing, and there are many more examples than I have given. If you're interested in further information, you should look into linguistics - that's where most of this theory comes from, because there are some very interesting properties of language (there is a Universal Grammar, common to every language! You weren't taught that when you were little, were you?)

    Now, I wouldn't stand to argue that a true AI would have the ability to develop a distasteful personality. This is obviously and plainly true. But it's not true because the brain starts out blank - quite simply because the brain can't start out blank.


    You could be correct, and I would tend to believe this. I didn't mean to imply that the wiring was "software" of some kind. Rather, it's emergent from the modular design of the brain. If the modules can be arranged in such a way that they can detect edges, then why not make the physical design spec reflect this? After all, if there isn't enough room in the genetic code for brain wiring, why would there be enough room for explicit physical design? Since all humans work the same, make the arrangement of items cause abilities to emerge. We can walk upright by virtue of the relative position of legs to torso. We can then save space in the code by allowing walking to be an emergent behavior.

  • So, you traded one of these for your ideals and integrity? Ludicrous!!

    Why didn't you just take over the family company, and promise you'd change it within 5 years, sort of like Al Pacino in The Godfather? Oh wait...

  • by dimator ( 71399 ) on Friday May 11, 2001 @04:36PM (#228970) Homepage Journal
    "It had to run in 4k. In fact the 8k version had algorithms that were more efficient but that took up more space. By the time the 4k BASIC was done, the 8k version was out."

    So, they began doubling memory requirements starting with their second ever software release, and they've continued until this very day!!

  • That whole interview was kind of weird.

    Reminded me of Obi-Wan talking about Anakin with that far-away look in his eye.

    "He wasn't always a hulking metal monster, you know?"

    Anyway, why is it the clued-in one who disappears from the scene?


  • That was supposed to be a 4004.

  • "Anyway, why is it the clued in one who dissapears from the scene? "

    Dunno. Ask Woz.

  • > Why do we always assume that rich people are
    > unhappy?

    On the contrary. I think that money can make people happy. However it seems it cannot make them satisfied.

    (I have heard this too: money cannot buy love, but it can rent a lot of sex.)

  • shamelessly stolen:

    Happiness is not getting what you want, Happiness is wanting what you get.

  • Actually, scratch that, it was more like:

    The secret to happiness is not getting what you want, it's wanting what you get.

    much better.

  • Guess what, sparky, money isn't everything. :) I would rather make my salary of ~$15,000/year and be very happy with what I do, than make insane amounts of money and not be happy. I am pretty sure he likes what he does.

    poor billy. he has billions and billions of dollars, but he's an unhappy boy.
  • The quote refers to BASIC, not Visual Basic. BASIC consisted of a bunch of numbered instructions. There were no functions or parameters - not to mention classes. Just numbered labels and goto statements. The closest thing it had to a function was gosub - which pushed the current line onto a stack so you could return to it. My memory is a bit fuzzy, but I am pretty sure that you didn't even have local variables. Everything was global. It did not lend itself to structured programming, and it took a change in thinking to switch to a language like C.
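    That GOSUB/RETURN mechanism can be sketched as a tiny interpreter (a toy model of my own, not real Microsoft BASIC): the only state is a program counter and a stack of return addresses.

```python
# toy model of GOSUB/RETURN: a program is {line_number: instruction}
program = {
    10: ("print", "stuff"),
    20: ("gosub", 50),
    30: ("end", None),
    50: ("print", "more stuff"),
    60: ("return", None),
}

def run(program):
    lines = sorted(program)            # execution order of line numbers
    out, stack = [], []
    pc = 0                             # index into the sorted line list
    while pc < len(lines):
        op, arg = program[lines[pc]]
        if op == "print":
            out.append(arg)
        elif op == "gosub":
            stack.append(pc + 1)       # remember where to come back to
            pc = lines.index(arg)      # jump to the target line
            continue
        elif op == "return":
            pc = stack.pop()           # pop the saved position
            continue
        elif op == "end":
            break
        pc += 1
    return out

print(run(program))  # ['stuff', 'more stuff']
```

    No parameters, no locals, no return value - just a saved position to jump back to, which is all gosub ever gave you.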
  • He's trashed Basic, APL, Cobol, Pascal...granted, at least the first three had it coming big time. Shoot, is there even one programming language that meets with Mr. Dijkstra's approval?
    Ooh, moderator points! Five more idjits go to Minus One Hell!
    Delenda est Windoze
  • In an ideal world there would be no billionaires :) This coming from someone coming from one of those countries with devalued currencies where most upper-middle class people are billionaires...
  • Hey dood, I really enjoyed those big monkeys throwing bananas at each other.

  • Exactly. It's possible. Maybe not in the valley or some other all-around-incredibly-expensive place. I make about $11k/year (including school loans as simulated income - just humor me) and I pay for school + a new computer a year + all food + all rent (an overpriced place, too). That's including school. Take that out of the equation, and I live pretty comfortably on $7k/year. If I was making $10k/yr (no school included), I could also be paying for health insurance (if my pt job didn't offer it) and a fat retirement fund.
  • Ahhh, yes, Microsoft's innovative version of Moore's Law.
  • I know lots of good programmers that started with Basic. If you can't learn to program after having learned Basic, you weren't cut out to be a good programmer anyway.

    A much more likely explanation for the phenomenon would probably be that stupid people are attracted towards Basic, while clever people soon realize they need to learn something else, and don't remain in the Basic camp for very long.

    Anyway, the Basic Dijkstra talked about has almost nothing in common with e.g. Visual Basic, which would, well, probably not make Dijkstra happy, but at least not make him physically ill (oh wait, that was another language).

    Edsger Dijkstra is a person who deserves respect. It is just too bad that this stupid quote is what people remember of him!

  • Ogg vorbis comes with redhat 7.1
  • Until AmigaBasic came along wheeeee.... (even if it was written by MS and very buggy - a forerunner of VB no doubt)
  • The funniest thing is the very well-written Bill Gates interview that is linked to at the end -
    Have lots of people read the code so that you don't end up with one person who is kind of hiding the fact that they can't solve a problem. Design speed in from the beginning. A lot of things have helped us, even as the project teams have become larger, and the company has become a lot larger than it was. It is not some methodology where there is a lot of funny documentation. Source code itself is where you should put all your thoughts, not in any other thing. So, our source codes, although there are a few exceptions, tend to be very well commented in a very structured way.
    - Bill Gates, Interview with David Allison of the Smithsonian.
  • I'm sorry, but your argument is pretty flimsy. If I had mod status I'd slap you with a "Flamebait" as this is the kind of silliness that characterizes Scott Nudds and other famous trolls. Software is a concrete representation of a mathematical formalism. I'm a Scheme hacker, and part of this language's beauty is the fact that it takes this concept to extremes, being closely associated with the lambda calculus (where EVERYTHING is a function call). Now, based on your conclusions here, this means that programs written in functional languages like Scheme, Haskell, etc. must be the most unreliable... they abstract the entire problem domain into a series of algorithms that invoke each other (and often themselves)! And talk about timing problems... many LISP/Scheme development environments support interpreted code intermingled with compiled code!

    In reality, you will discover the opposite: software that employs a functional approach is usually not only extremely reliable, but can reach levels of complexity that staggers most proponents of the Pascal school of thought.

    I suspect that you pulled this timing argument out of the same orifice from which Alex Chiu derived his famously stupid "all humans are magnets" nonsense. Algorithms are not concerned with the timing of events, only with the results of the operation. Software can wait around forever for the results, but fails when the results are not what were expected. That is the source of a lot of software reliability and compatibility issues. Timing is an issue when it comes to execution speed, but for most algorithms the time of execution is knowable, and there are algorithmic analysis procedures that software developers go through to improve the speed of their programs; i.e., if algorithm A produces the exact same results as algorithm B but with a tenfold speed increase, we can improve our speed by using algorithm A. (Again, algorithms are concerned with results, NOT timings!)
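    The A-versus-B point is easy to demonstrate (a hypothetical example of my own, not from the article): two algorithms with wildly different running times are interchangeable precisely because an algorithm is judged on its results.

```python
# algorithm B: sum 1..n by looping -- O(n)
def sum_loop(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

# algorithm A: Gauss's closed form -- O(1), far faster
def sum_formula(n):
    return n * (n + 1) // 2

# identical results for every input, so we may freely swap B for A
for n in (1, 10, 1000):
    assert sum_loop(n) == sum_formula(n)
print(sum_formula(1000))  # 500500
```

    The result is what the algorithm promises; how long it takes to arrive is a separate (and improvable) property.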

    As for space probes and air traffic control systems, that's what real-time systems are for. The code that runs these real-time applications is usually extremely reliable and coded to run in given time constraints.

    What you are describing exists to a small extent in systems like Smalltalk, but overall what you are proposing would introduce the same kinds of problems that led to the development of the all-purpose, stored-program computing machine in the first place, so we're back where we started.
  • And physics. If we could get rid of physics it would be a lot easier to keep planes in the air. Actually, the common element seems to be time -- physics supplies it and algorithms consume it. I suggest we stop using time immediately.
    Hit the link in his .sig; I wouldn't be surprised if he advocated exactly this!
  • He's a zealot, and sometimes I think we need guys like him in order to change our perspectives every now and then. Slashdot is full of free software softies. And when a Mozilla story runs, half of 'em talk about how they boot IE in VMware 'cause it's the only decent browser anymore...
  • by xp ( 146294 ) on Friday May 11, 2001 @03:04PM (#228991) Homepage Journal
    The Bill Gates interview linked at the bottom of the article is actually pretty good. I recommend it highly.

    Become a better stock trader with PeakTrader

  • You want to have some real fun, try Ron Nicholson's Chipmunk Basic. It has OO features; I even tried to build a class library with it once.

  • If anyone needs the old skool gwBasic for DOS, I have it :) Brings back so many memories. I can remember my first calculator program now...
  • Teehee.

    We need plug-compatible components and message-based communication between objects

    When you get right down to it, at least as I see it, computers process instruction after instruction. Now, here's what the 'Free On-Line Dictionary of Computing' calls an algorithm, in part:

    A detailed sequence of actions to perform to accomplish some task. Named after an Iranian mathematician, Al-Khawarizmi.

    So, even if we use these objects that communicate via message passing, it's just a facade over the real algorithmic execution that's occurring. But then, this isn't meant to be serious. Entertaining, though. Exactly how do you propose we would write software without languages? And message passing, how is that different from function calls? Oh yeah, it's not! Don't worry, you get a cookie...

  • well, you got me there. Yes everything is inherited from IUnknown.

    However, lil ol' me (who, incidentally, writes many COM components in C++) doesn't program giant component models (as of yet), having only done little 100,000-line web projects. I personally have never needed to use inheritance, and unfortunately, although you've shown me a good example of what, you don't say why.

  • yeah, well I just got done looking at a book - Interactive UML Development with VB 6.0 (sorry, no link). The foreword was written by Grady Booch, who actually praised VB as a language.

    Yes, in 1982 BASIC was pretty crippled; however, its current incarnation only lacks inheritance to be a full OO language (this can be worked around from what I understand; however, in my 7 years as a programmer I still haven't seen a truly valid reason to use inheritance). Inheritance will be a feature of the next version of VB.

    So the next time you decide to put down VB, remember that you are deriding a compiled, object-oriented language. Java can't even claim to be that.

  • by krappie ( 172561 ) on Friday May 11, 2001 @04:04PM (#228997)
    He mentions how Microsoft is spending a lot of money trying to fight open-source ideas. This is funny, because I just read the Halloween documents for the first time yesterday, and I would like to point out a piece of them from section one. I think it is a perfect indication of what Microsoft is doing, in the words of Microsoft themselves.
    Open Source Process

    Commercial software development processes are hallmarked by organization around economic goals. However, since money is often not the (primary) motivation behind Open Source Software, understanding the nature of the threat posed requires a deep understanding of the process and motivation of Open Source development teams.

    In other words, to understand how to compete against OSS, we must target a process rather than a company.

    { This is a very important insight, one I wish Microsoft had missed. The real battle isn't NT vs. Linux, or Microsoft vs. Red Hat/Caldera/S.u.S.E. -- it's closed-source development versus open-source. The cathedral versus the bazaar.

    This applies in reverse as well, which is why bashing Microsoft qua Microsoft misses the point -- they're a symptom, not the disease itself. I wish more Linux hackers understood this.

    On a practical level, this insight means we can expect Microsoft's propaganda machine to be directed against the process and culture of open source, rather than specific competitors. Brace for it... }

  • Umm, not even close. I have been living on my own for 6 years now. My GF makes another $20,000/year. So, the 2 incomes make it liveable. I don't have to worry about retirement; it's been taken care of since I was born ...
    until (succeed) try { again(); }
  • "I'm a programmer, not an English teacher"

    until (succeed) try { again(); }

  • VB programmer?! Where did you come up with that one?! I would recommend taking a look at the URL in my .sig
    until (succeed) try { again(); }
  • Because, in general they aren't. Sure, there are plenty of people who are very rich and very happy, and vice versa. I know of too many people whose wives are with them for one reason: because they have money. Anyone who says that doesn't happen needs to get their head checked. I used to work for a company where the boss had so much fscking money, he didn't know what to do with it all. His wife was trying to take everything from him; his friends were only around for one reason. It's pretty common.

    until (succeed) try { again(); }

  • Guess what sparky, money isn't everything. :) I would rather make my salary of ~$15,000/year and be very happy with what I do than make insane amounts of money and not be happy. I am pretty sure he likes what he does.

    until (succeed) try { again(); }

  • Also, another thing to look at is that money changes people, and the original ideas seem to get lost. I have witnessed this many times in the past.

    To be completely honest, my family owns the largest woodworking shop on the east coast. I had the option to take over the business, and I declined. My reason was that my family is very cut-throat, very greedy, and will do anything to screw over another family member. This is not a joke; it's very serious. Sure, I could have made boatloads of money, but I would have suffered every single day.

    I have had many people in the past disagree and say, "Man, I would have just done it!" You can't even begin to understand it unless you're put in that position. So, all in all, money isn't everything. Yet it's nice to have.

    until (succeed) try { again(); }
  • You are incorrect. The BASIC which you refer to is the version that was popular in the early eighties. The later versions were far more structured with local and remote functions and subs, libraries and includes. An example of BASIC code which you could run through BC.EXE (Microsoft's BASIC compiler) in probably about 1987 would be like this:

    DECLARE SUB SomeSub ()
    DECLARE FUNCTION Multiply% (fnum%, snum%)

    DIM SHARED SomeGlobal%

    CALL SomeSub

    SUB SomeSub
        DIM B%
        B% = Multiply%(2, 2)
        PRINT "The answer is " + STR$(B%)
    END SUB

    FUNCTION Multiply% (fnum%, snum%)
        Multiply% = fnum% * snum%
    END FUNCTION
    The problem with the development world is that people are fixated on the idea that the language hasn't evolved since BASIC in 1979. These are the same people who complained about using line numbers in QBASIC when they didn't need them at all (in fact, the above code will probably work in QBASIC.EXE). Even C didn't reach its current state in a day!

  • by mickwd ( 196449 ) on Friday May 11, 2001 @03:06PM (#229005)
    Monte on Python
  • You know, a lot of Microsoft idiots would love to say that Microsoft knows how to make great products, that the Windows 9.x instability had to do with its DOS origins, and that it really wasn't Bill Gates's fault. Xenix proves the exact opposite. Bill had access to probably the best OS at the time, and with only the first version MS frankly crippled it. I know you were being sarcastic, but Xenix was specifically rewritten to make porting between Unix systems with a simple recompile all but impossible. Frankly, MS made a proprietary version of C which was incompatible on purpose, but they could boast that it had all the features of Unix, so people would buy it thinking it was AT&T Unix. The good thing is that back then people didn't look at Gates as some sort of god, and most of them dumped Xenix. After SCO fixed it, it gained popularity, since AT&T raised the price of their Unix through the roof. However, it remained very unstable, and yet people stuck with it. In fact, even after SCO purchased it, they refused to enable memory protection, and it multitasked similarly to Win 3.x, with cooperative multitasking. Utter crap. They fixed it by providing some memory-protecting libs, but all the utilities that came with it and many older apps didn't use the libraries, so it was quite unstable for many years. I would actually rather live with DOS than Xenix.
  • The 1983/1984 version is there [].
  • "It is practically impossible to teach good programming style to students that have had prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."

    -- Edsger W. Dijkstra, 1982, Selected Writings on Computing: A Personal Perspective []

    ... A quote that didn't age one bit ;-)
  • This is interesting from the Gates interview:

    And it was really his insight that because of semi-conductor improvements, things would just keep getting better. I said to him, "Oh, exponential phenomena is pretty rare, pretty dramatic. Are you serious about this? Because this means, in effect, we can think of computing as free." (He is talking about Paul Allen.)

    But the interesting thing is that he makes it sound as though he and Paul realized Moore's Law first. Egomaniac.
  • Who you are is, on the whole, by nature, not nurture

    Uhhhh, no. The best thinking today is that some characteristics are there by nature, and those can in turn influence, along with external environmental factors, the majority of characteristics that we call personality and capability.

    I've never seen a child spontaneously do calculus. Someone has to teach them.

    ~~ the real world is much simpler ~~
  • Remarkable guy. He didn't try to sell us anything. Considering he was involved in a project that still exists in a rapidly enhanced form (BASIC), it was neat to see him treat his accomplishment there as historical, and get with the times in terms of modern day alternatives to learning languages (regardless of whether he's right or wrong in his choice of Python...).
  • Agreed. Interesting interview, but I would be interested to hear more about Davidoff's take on CPRM. CPRM is frightening as a technology, but I'm not surprised at its introduction.

    Davidoff touches on this only peripherally, but CPRM is another example of a society responding to technology, rather than adapting to it or making efficient use of it.

    Please bear with me as I rant for a moment:

    It's very interesting to watch as society (as seen through the legislation that defines that society) scrambles to catch up with technology, where a half century ago we dreamt of what it would be like in the 21st century, when we'd have flying cars and other astonishing technologies. I wonder if anyone - as part of the dream - envisioned tire manufacturers joining the environmental lobby to put together legislation to prevent the introduction of cars that didn't roll along on tires.

    The MPAA isn't the only industry association staunchly protecting a business model that doesn't apply in a new millennium. Look at how long it has taken for gasoline-electric cars to be introduced. Even today, there are only a few out there. The technology exists, and it works, but it hasn't been widely adopted. What oil company would be in favor of such a technology?

    According to Davidoff:
    CPRM is just the most notorious, or the most emblematic, of a number of schemes that make the open personal computer into a limited and tightly-controlled playback device. Controlled, effectively, by the entertainment industry


    "I don't think people are aware of it, in spite of what you and others are writing about. It hasn't made it into the public consciousness," he says. "I didn't hear about the DMCA until after it had been passed."
    This is yet another example of the same phenomenon. Most disturbing is that he's completely correct: the public is simply unaware of many of these issues.

    In the 1950s we were dreaming of new technologies, without concern for how society would react. Now we have - then unimaginable - new technologies (although no flying cars yet), but society is fighting the introduction of those technologies. New areas of law are created every day as new problems are created and addressed, and then others created. We need progressive lawmakers with insight into these technologies to make far more informed decisions. This, however, is the catch-22. There will not be lawmakers who can make informed decisions with regard to a technology unless that technology is widely available, such that they are familiar with it. And yet, if archaic law is what is preventing the technology from proliferating through society, we will have created for ourselves a techno-evolutionary cul-de-sac from which it will become increasingly difficult to extricate ourselves.


  • Support Open Source, remain a lowly software contractor. Demand payment for your product, become the world's richest man.

    We shall see how sustainable the software licensing model is-- apparently Microsoft must have some doubts about this because of their move towards a service oriented model.

    I think that this interview was interesting... I always admire mathematics coders because of the absolute beauty of mathematical computation (I use math functions whenever I can to solve seemingly non-mathematical tasks, because I have found these solutions to be more efficient and extensible).

    I think his comments about the impact of GCC are also interesting. I had known it was influential, but I was not familiar with its impact on the embedded world.

  • My Star Office 5.1 pretends to be IE 5.... OTOH, most of my Linux boxes are not convenient to hook up to the internet right now, so mostly I read Slashdot at my place of employment using IE. That being said, I still use Linux as a PDC (with Samba), and as my primary workstation at my secondary job.
  • After 20+ years, who cares how sustainable it is? Gates & company now have enough dough to get into any business they want.

    Sure. However, if you want to get into the industry now or in a couple of years, this is an issue. How can you build a stable company in an ever changing industry on non-sustainable business practices?

    Where does Microsoft make most of their money? OEM sales and upgrade cycles (both the same, mostly-- people upgrade their machines by buying a new one all too often). As hardware becomes more powerful, the upgrade cycle becomes longer. Can they sustain their profits?

    Obviously not-- hence the service oriented model they keep saying they are moving towards.

    Get over it. There will never be another Bill Gates in the software industry. You cannot be like him, nor can anyone else. The opportunity is over for that kind of success.

    I think that proprietary software will always be around but will probably be relegated to niche roles (don't expect an open-source version of OrCAD any time soon).

    Also note the emphasis in the interview on RMS's academic background. It doesn't surprise me that an obviously intelligent programmer here is supporting free software.

    Also note that the essence of making a fortune is not "licensing fees"; it is "other people's work." This is true regardless of what one sells. So yes, I think that it is possible to make a lot of money in the OSS market, and that opportunity is slowly beginning to surface, but it will be a little while before anyone does so.

  • I thought this comment from Bill Gates regarding the future of computing, in answer to the DA's question, was VERY ACCURATE and ahead of his time. I was impressed!!!

    The Future of Computing

    DA: You mentioned your vision of where the PC will be on every desk and in every home. You clearly have had a vision about the kinds of products that would come out and yet you said a minute ago, "This is just the beginning." What do you see as lying ahead in terms of further unfolding of the vision that you have held onto so continuously over the last 20 years?

    BG: Well, the PC will continue to evolve. In fact, you'll think of it simply as a flat screen that will range from a wallet size device to a notebook, to a desktop, to a wall. And besides the size of the screen, the only other characteristic will be whether it is wired to an optic fiber or operating over a wireless connection. And those computers will be everywhere. You can find other people who have things that are in common. You can post messages. You can watch shows. The flexibility that this will provide is really quite incredible. And already there is the mania in discussing this so called "Information Highway" which is the idea of connecting up these devices not only in business, but in home, and making sure that video feeds work very well across these new networks. So we've only come a small way. We haven't changed the way that markets are organized. We haven't changed the way people educate themselves, or socialize, or express their political opinions, in nearly the way that we will over the next ten years. And so the software is going to have to lead the way and provide the kind of ease of use, security, and richness that those applications demand.

  • This is a compelling argument, and one which can partly be addressed by supporting open source GPL technologies like Ogg Vorbis [], which affect both Windows and non-Windows users. It's interesting to me that Sonic Foundry [] is now supporting Vorbis in their music creation suite ACID 3.0. Probably because it doesn't cost them anything to do, but it will aid in the proliferation of Ogg Vorbis as a viable replacement for proprietary, patented file formats (MP3). CPRM scares me deeply at the hardware level; I can only do what's in my power to advise against the purchase of anything that comes close to CPRM in hard disks etc.
  • WRITE INTEL: ta ct.htm
  •'s closed-source development versus open-source. The cathedral versus the bazaar.

    It is that point which RMS seems to get, and hammer (and hammer, and hammer..) into our thick skulls every time he opens his mouth/text editor.

  • "Guess what sparky, money isn't everything. :) I would rather make my salary of ~$15,000/year and be very happy with what I do. Then make insaine ammounts of money and not be happy. I am pretty sure he likes what he does."

    Know what you mean... I'm working as an R&D tester for a company that is a BIG Linux supporter, in an area (Raleigh, NC) where there are TONS of high-paying IT jobs. I make less than I could, but I get to work with 4 different Linuxes (and both SCO Unixes), and it's really satisfying.

    By the end of summer, I hope to achieve my RHCE and take a purely Linux position as a network engineer/BOFH for someone.
  • You're incorrect, Daisy. The brain is enormously pre-wired.

    Since you are so knowledgeable on the subject, perhaps you can supply me with the wiring diagram?

    I thought not. Your attitude is symptomatic of why no progress is being made in the field. While there are a few tendencies which may be pre-programmed, which act as relatively poor predictors of future behavior, the vast array of data structures which make up our personalities are acquired through life. It's the only place they can come from -- regardless of your religious views on the subject (and IME most believers in social darwinism have a nearly religious fixation on the idea), there is simply no room in the genetic code for those properties to be coded in a non-emergent way.

    (P.S. Emergent means they emerge like the features of a fractal; small changes in the code result in large, often crippling, changes in the result, making them non-evolvable.)

  • Take vision for example.

    Nice starting point, actually everyone's starting point because it's the most easily studied via ablation and point-scan type studies.

    When you were born as a blank IBM RAM chip, who taught your eyes to differentiate a luminosity function and form edges?

    You get the gist: I didn't. I learned those things in the first hours/days after my eyes opened, or (alternately) after the wiring to my cerebral cortex became myelinated enough to permit the learning. How did I learn it? By looking at stuff.

    We are not born knowing how to, for example, detect short line segments, or connect them into shapes, or detect luminosity variations and take derivatives of them. This is the fundamental error of current AI research. How do I (O great swami, I hear you say) arrive at this startling conclusion?


    Please go back to CSCI 100, re-read Knuth or whatever, until you grok that last statement, because it is really, really important.

  • The problem with "signal-based systems" like the brain is their inherent chaos -- which I mean in the strict mathematical sense, as in Mandelbrot et al. While this can make them very useful it also makes them fundamentally unpredictable, a characteristic which bean counters don't like.

    One reason AI research isn't going anywhere is that we are failing to face up to an important truth about how brains develop. Since they program themselves, starting with really very little seed information, most of their observable properties are emergent. The same would be true of any artificial system that really mimics the brain. The reason we don't have good AI isn't that the hardware isn't good enough -- I think it is, at this point -- it's that nobody wants such a system. Imagine educating your self-driving car for six years only to find out it's become a chance-taking rebellious delinquent!

    You mention "message-based communication between objects" as if those objects will somehow know how to talk to one another. Clue time: They don't know how to talk to one another until they learn. And they learn through experience. Sometimes their learning is imperfect, and it can be very difficult to recognize the holes in that learning. We don't even know how to reliably program our own children, much less an air traffic control system that will differ significantly from all known types of brain, animal and human.

    The newly formed brain is every bit as blank as the newly powered-up dynamic RAM chip -- anybody with an ounce of objectivity can look at the 7 Gb genome vs. the 10^14 connectivity of the cerebral cortex and figure that out. Do you really want to go into the cyber day-care business, teaching your machine to speak English and habla the espanol and so forth, the same way human babies learn it? Of course not. Since that's what it takes to do it the way you mention, it won't be done that way.

    At least, not by most people (sly grin).

  • The non-GPL version of what you're talking about is called many names: COM, OLE, Javabeans, VB, Corba...

    No. They all suck, especially COM and OLE. They suck because they are all based on the algorithm. One of the problems with algorithmic components is that, even if they are tested and proven to work reliably in one environment, there is no guarantee that they'll work the same way in another environment. Why? Because that is the bane of algorithmic systems: event timing varies from one system to another. By contrast, all hardware chips retain their temporal signatures wherever they are used.
  • So, even if we use these objects that communicate via message passing, it's just a facade over the real algorithmic execution that's occurring.

    Parallelism can easily be simulated in software. I do it all the time. For example, this is how neural networks work. The trick is to hide the sequential nature of the processor by using two lists, one for input signals and one for output. Once you get to that level, you've got a signal-based system. Ideally, the only algorithm that should exist in a processor-based system is the single function that runs the message-passing operating system and processes the primitive objects. In the future, when we have fully reconfigurable memory, we'll get rid of that function altogether.
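    A minimal sketch of the two-list trick the parent describes (the network shape, names, and thresholding rule here are my own invention, purely for illustration). Because each unit reads only the previous tick's list and writes only to a fresh one, the order of the sequential loop within a tick cannot matter -- which is the whole point:

    ```python
    # Double-buffered signal propagation: every "neuron" reads only from
    # last tick's signal list and writes only to a fresh output list, so a
    # plain sequential loop behaves as if all units fired simultaneously.

    def tick(weights, signals):
        """Advance the network one step; returns the new signal list."""
        out = [0.0] * len(signals)
        for i, row in enumerate(weights):
            # Weighted sum of the *previous* tick's signals, thresholded.
            total = sum(w * s for w, s in zip(row, signals))
            out[i] = 1.0 if total >= 1.0 else 0.0
        return out  # becomes the input list for the next tick

    # Two units that excite each other: a pulse bounces back and forth.
    weights = [[0.0, 1.0],
               [1.0, 0.0]]
    signals = [1.0, 0.0]
    for _ in range(3):
        signals = tick(weights, signals)
    print(signals)  # → [0.0, 1.0] after an odd number of ticks
    ```

    Real neural-network code does the same double-buffering with arrays rather than lists, but the idea is identical.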
  • It's not just BASIC. All programming languages are based on 200 year-old ideas pioneered by Lady Ada Lovelace and Charles Babbage. They all have one thing in common: the algorithm.

    The algorithmic approach to software construction is the primary reason why software sucks. Software sucks because it is unreliable and takes too long to develop. The invention (again by Lady Ada) of the subroutine, although a great contribution when it was introduced to digital computers in the last century, did not prevent the current software crisis. Planes loaded with people are crashing, airports are shut down and Mars probes costing hundreds of millions of dollars are being lost. And it's all because of the algorithm.

    Why the algorithm you ask? Well consider this: The reliability of software is inversely proportional to its complexity while the reliability of the human brain improves as it gets more complex. There is an important lesson to be learned from this. The most obvious difference between software and the brain is that the former uses sequential algorithms whereas the latter is based on parallel streams of signals.

    A signal-based system is more reliable because it makes it possible to have strict control over the timing of events. By contrast, one can never be sure when an algorithm will be done, and this creates all sorts of timing problems. It is instructive to note that hardware is orders of magnitude more reliable than software. It is no secret that hardware is inherently parallel and driven by signals. I am convinced that a similar approach to software construction can improve reliability by several orders of magnitude.

    So there you have it. I call for the elimination of the algorithm as the basis of software construction. I call for a worldwide effort by geeks everywhere to contribute ideas for the establishment of signal-based software construction methods (GPLed, of course). We need plug-compatible components and message-based communication between objects. We need reliable, downloadable components that can snap together at the click of a mouse. No more function calls! No more languages!

    It is about time that software is changed from the cottage industry that it is today and moved into the 21st century. Let's face it, Lady Ada and Charles Babbage were true geniuses and we owe them a great deal, but they did not have to write code for interplanetary probes and air traffic control systems.
  • I'm sure your parents wouldn't be too happy.

    "You're 45 years old. Your father and I can't support you your whole life, you know!"

    "But mom... I'm so happy making $15,000 a year doing what I do. Can't you be happy for me?"

    Something tells me I wouldn't be bringing presents to your abode.

    Dancin Santa
  • Actually, I wasn't sure what he was talking about in regards to gcc and embedded systems either. It would have been nice if he had fleshed out what he thought the impacts of gcc were. *shrug*

    Dancin Santa
  • Gotcha! Thanks, +1 Informative, I'd mod. Can anyone spare a mod for Ayende?
  • in my 7 years as a programmer I still haven't seen a truly valid reason to use inheritance

    You like those controls and ActiveX and OLE objects that you can stick to your VB apps? Those REQUIRE inheritance. While it may not be visible to you, someone's using inheritance (and polymorphism and the rest of OO stuff) to bring these controls to you.

    Dancin Santa
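  • For anyone wondering what the parent means, here is a toy sketch of inheritance at work behind drop-in components (the class names are invented, and real COM/ActiveX plumbing is far more involved). The host code only knows about the base class; it is inheritance that lets any concrete control slot in:

    ```python
    # A toy "control" hierarchy: the host renders any Control subclass
    # through the shared base-class interface -- that's the inheritance
    # (and polymorphism) hiding underneath drop-in components.

    class Control:
        def draw(self):
            raise NotImplementedError

    class Button(Control):            # inherits the Control interface
        def draw(self):
            return "[ OK ]"

    class TextBox(Control):
        def draw(self):
            return "|_______|"

    def render(controls):
        """The 'host': works with any Control subclass, past or future."""
        return " ".join(c.draw() for c in controls)

    print(render([Button(), TextBox()]))  # → [ OK ] |_______|
    ```

    The host never changes when a new control is written; that substitutability is what the component systems mentioned above rely on.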
  • by Dancin_Santa ( 265275 ) <> on Friday May 11, 2001 @03:03PM (#229035) Journal
    Support Open Source, remain a lowly software contractor. Demand payment for your product, become the world's richest man.

    Decisions, decisions.

    Dancin Santa
  • by tuxlove ( 316502 ) on Friday May 11, 2001 @03:25PM (#229048)
    I worked with Monte in a previous life. We were never on a project together, but had lots of water cooler interaction and that sort of thing. Played him at checkers once and won, which is surprising because I suck.

    There were rumors about his past with good ol' Bill, but I never bothered to ask. It's funny, now that I haven't seen him for several years, to see his past highlighted in the article. The stories I heard about his past seemed unlikely for someone like him (i.e. a reasonably normal guy without obvious riches).

    Don't get me wrong. Monte is a cool guy. Nice, friendly, smart and all of that. But to imagine him as one of the first 10 or less at MS is weird to say the least. Obviously he never got the riches out of it that the rest of them did. He always drove around in an ancient Honda Civic with faded and peeling paint. He had a relatively isolated position (in charge of development tools) in our relatively obscure company. Don't know much about his personal life, but I think he took Karate lessons. You could always count on him to ask the pointed, annoying question of the speaker at company meetings. It was inevitable, and they would always look for him in the crowd to get the questions out of the way.

    Not the mover and shaker one would associate with the other founders of MS. I wonder if he's sorry he didn't stick around long enough to become a billionaire. If you're reading this Monte, "Hey."
  • We did that paper-tape-storage programming back when I was in high school, in about 1975. I don't remember there being any expensive time limits, but each job run did report how much CPU time it had consumed, and you COULD run your account out of time. The big expense in our terminal room (we had two teletypes, a CRT display and a 'fast' 300 baud Texas Instruments thermal paper terminal) was the terminal paper for the fast terminal. I can remember the instructor saying 'ten cents a foot! don't waste paper!' So we mostly settled for the 110 baud teletypes and fought over the CRT display (also 300 baud).
  • by WhtDaUWant ( 451060 ) on Friday May 11, 2001 @03:35PM (#229066)
    There is a link in the article that is even scarier -- everyone should see this and take note. It discusses "the end of the PC"; although a little pessimistic and morbid, it has a lot of relevant points, especially considering who it is coming from. Something needs to be done. If anyone has any ideas of how this could be prevented, please speak up, so that this does not happen.
  • by WhtDaUWant ( 451060 ) on Friday May 11, 2001 @03:00PM (#229067)
    I thought that this was an excellent interview. It brought to light some scary stuff as far as PCs go. One of the questions I had was: how will CPRM affect people who don't run Windows? Is it really going to end up being ignorable by programs? After all, the entertainment industry has gone to a lot of trouble to get it in the door. The next step, if it were ignored by programs, would be enforcing it in hardware instead of software. Digital fingerprints are already being implemented by Napster; it isn't that hard to take advantage of that at all. Really scary stuff.

"The one charm of marriage is that it makes a life of deception a neccessity." - Oscar Wilde