High-level Languages and Speed
nitsudima writes to tell us InformIT's David Chisnall takes a look at the 'myth' of high-level languages versus speed and why it might not be entirely accurate. From the article: "When C was created, it was very fast because it was almost trivial to turn C code into equivalent machine code. But this was only a short-term benefit; in the 30 years since C was created, processors have changed a lot. The task of mapping C code to a modern microprocessor has gradually become increasingly difficult. Since a lot of legacy C code is still around, however, a huge amount of research effort (and money) has been applied to the problem, so we can still get good performance from the language."
C is the 3vil (Score:4, Funny)
ahah now we know why my java program is so slow. damn C slowing it down.
Great article! (Score:5, Funny)
Gentoo Stage 1 FTW! (Score:2, Funny)
A stage 1 install will do the following:
1) Compile glibc from source using architecture specific optimizations
2) Compile gcc using the previously compiled optimized glibc and optimize gcc for the architecture
3) Compile everything else using said architecture specific optimized tools
4) ???
5) Profit!
And before all the trolls come in and say how long it takes to compile things, the Gentoo Handbook has several tricks like compiling from RAM, etc... to speed up compile times. I normally don't waste my time with Stage 1 because there are plenty of Stage 3 tarballs I can grab for whatever architecture I may be using at the time.
Back to the point, if your glibc was compiled to use the MMX registers for memcpy(), memset(), etc... it completely invalidates the point in the article about how those extra registers go unused. Additionally the point he makes about data structures, while valid, is a non-issue given that most serious programmers have taken a Data Structures and Algorithms course where you learn that O(n log n) is less than O(n^2), and will choose to use trees and hash tables where appropriate.
[sarcasm]Nevermind, he's dead on, no one ever implements a spanning tree in C code[/sarcasm]
I see your point about the vectorization of library code, but point me to a high-level language which does not suffer from that flaw given your assertion of closed-source libraries. That is true across the board regardless of the language used.
"The Truth about C++ Revealed" (Score:5, Funny)
From:
Subject: The truth about 'C++' revealed
Date: Tuesday, December 31, 2002 5:20 AM
On the 1st of January, 1998, Bjarne Stroustrup gave an interview to the IEEE's 'Computer' magazine.
Naturally, the editors thought he would be giving a retrospective view of seven years of object-oriented design, using the language he created.
By the end of the interview, the interviewer got more than he had bargained for and, subsequently, the editor decided to suppress its contents, 'for the good of the industry' but, as with many of these things, there was a leak.
Here is a complete transcript of what was said, unedited, and unrehearsed, so it isn't as neat as planned interviews.
You will find it interesting...
__________________________________________________________________
Interviewer: Well, it's been a few years since you changed the world of software design, how does it feel, looking back?
Stroustrup: Actually, I was thinking about those days, just before you arrived. Do you remember? Everyone was writing 'C' and, the trouble was, they were pretty damn good at it. Universities got pretty good at teaching it, too. They were turning out competent - I stress the word 'competent' - graduates at a phenomenal rate. That's what caused the problem.
Interviewer: Problem?
Stroustrup: Yes, problem. Remember when everyone wrote Cobol?
Interviewer: Of course, I did too
Stroustrup: Well, in the beginning, these guys were like demi-gods. Their salaries were high, and they were treated like royalty.
Interviewer: Those were the days, eh?
Stroustrup: Right. So what happened? IBM got sick of it, and invested millions in training programmers, till they were a dime a dozen.
Interviewer: That's why I got out. Salaries dropped within a year, to the point where being a journalist actually paid better.
Stroustrup: Exactly. Well, the same happened with 'C' programmers.
Interviewer: I see, but what's the point?
Stroustrup: Well, one day, when I was sitting in my office, I thought of this little scheme, which would redress the balance a little. I thought, 'I wonder what would happen if there were a language so complicated, so difficult to learn, that nobody would ever be able to swamp the market with programmers?' Actually, I got some of the ideas from X10, you know, X windows. That was such a bitch of a graphics system that it only just ran on those Sun 3/60 things. They had all the ingredients for what I wanted. A really ridiculously complex syntax, obscure functions, and pseudo-OO structure. Even now, nobody writes raw X-windows code. Motif is the only way to go if you want to retain your sanity.
[NJW Comment: That explains everything. Most of my thesis work was in raw X-windows. :)]
Interviewer: You're kidding...?
Stroustrup: Not a bit of it. In fact, there was another problem. Unix was written in 'C', which meant that any 'C' programmer could very easily become a systems programmer. Remember what a mainframe systems programmer used to earn?
Interviewer: You bet I do, that's what I used to do.
Stroustrup: OK, so this new language had to divorce itself from Unix, by hiding all the system calls that bound the two together so nicely. This would enable guys who only knew about DOS to earn a decent living too.
Interviewer: I don't believe you said that...
Stroustrup: Well, it's been long enough, now, and I believe most people have figured out for themselves that C++ is a waste of time but, I must say, it's taken them a lot longer than I thought it would.
Interviewer: So how exactly did you do it?
Stroustrup: It was only supposed to be a joke, I never thought people would take the book seriously.
Re:Old debate (Score:4, Funny)
Some things never change.