Comment Re:Well, they're a good indicator of intelligence (Score 1) 672

Whiteboarding shows me somebody who can rush through his thoughts, because most of the time we have a boss breathing down our necks waiting for us to give him a technical game plan for software that needs to be out ASAFP.

In other words, you want to hire people who can give you a passable solution in 20 minutes rather than someone who can give you a smartly architected solution tomorrow? No wonder most companies develop crap built on top of crap software, and no wonder large projects usually fail. Everything is built out of chewing gum and baling wire. Companies are selecting for precisely the wrong skills in a field where robust engineering is the only sustainable solution to the problems we face in developing and maintaining large programs.

Secondly, any business person in a position of hiring will tell you that it is always better to accidentally turn away Einstein than it is to accidentally hire a moron.

There are far more effective ways to avoid hiring morons; I presented one in my previous message. Very simple O(n) questions will also filter out morons: I can tell you from reviewing a great deal of submitted code that the typical "moron" programmer has absolutely no clue about the difference between, for instance, O(n) and O(n**2).
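
For the morbidly curious, here is a minimal sketch of exactly that difference (a made-up illustration, not one of our actual interview questions), using the classic "does this list contain a duplicate?" problem:

```python
# Hypothetical illustration only: two ways to ask "does this list contain a duplicate?"

def has_duplicates_quadratic(items):
    """O(n**2): compares every pair of elements."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    """O(n): one pass, using a set for constant-time membership tests."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

print(has_duplicates_quadratic([3, 1, 4, 1, 5]))  # True
print(has_duplicates_linear([2, 7, 1, 8]))        # False
```

Both give the same answer; only one of them will still give it before lunch when the list has a million elements.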

|>ouglas

Comment Re:Well, they're a good indicator of intelligence (Score 1) 672

Perhaps. The article says "looking at real code" is better. Again, perhaps. For example, the problem there is: did they really write the code, and if so, how long did it take? Did someone else suggest fixes, etc.? You don't know. I mean, 300 lines of beautiful C is all fine and dandy, but if it took you 3 months to write it and half of it is cut and pasted from the web, how good is it really?

Your worry is one of those armchair worries that sound compelling in theory. In my real-world experience, it is not a problem. Where I work, we send the specs for a small program along with some unit tests to an applicant, and give the applicant however long they'd like to complete it. We can tell that most people do not cheat, because their solutions typically suck completely -- or if they were cheating, they didn't know how to cheat effectively. The solutions usually all suck in their own unique ways, too. A few applicants submit passable solutions, and so we have them come in, and as part of the interview we have them walk through their code a bit and explain why they made certain design decisions, etc.

This process gives us a much better idea of how someone would perform in the real positions we have than any amount of coding on a whiteboard ever could. It has also proven effective: we haven't hired any clunkers.

|>ouglas

Comment Re:Well, they're a good indicator of intelligence (Score 1) 672

If a candidate is whiteboarding a process for me and he silently doodles on the board then that is a problem. You are supposed to talk through the problem primarily and cement your ideas in on the board so that everybody can see a visual summary of your explanation. [...] somebody who just wants to demonstrate skill by typing in a text editor tells me that this person doesn't care about communicating or discussing complex ideas, they just want to showcase their skill.

This is such complete and utter nonsense that it always amazes me that it has become the current orthodoxy. And a completely counterproductive orthodoxy at that.

Let me ask you a question: if this were the best way to evaluate whether someone can think critically, then why don't our finest engineering universities (e.g., MIT and Stanford) grade students by having them do all their exams at a whiteboard, while the professor or TA impatiently taps their pen on a desk waiting for the answer? We should just get rid of problem sets and papers and projects and programming assignments and exams, because none of these things demonstrates -- at least if one is to believe the orthodoxy -- that someone can think critically or communicate effectively unless they can do so in real time, in front of a whiteboard, on a problem that they've never seen before.

What unremitting nonsense!

The actual fact of the matter is that some of the world's greatest thinkers do their best thinking in the shower, or while they're asleep, or while cutting their toenails, and if you don't allow people the time and the space to think the way that they think best, you know absolutely NOTHING about them, other than that they're not so great at solving new problems while standing up in front of people in one particularly stressful situation. For all you know, Einstein would have failed your interview. And then companies bemoan the fact that there's not enough talent. Bah! They scared three quarters of the talent away.

|>ouglas

P.S. Yeah, sure, PhD qualifying exams are done orally. This no doubt stems from the fact that a PhD is academic training, and if you can't think on your feet in front of a classroom, then maybe academia is not where you should be. The current trend of requiring whiteboarding during a job interview is the revenge of the PhDs: apparently enough of them left academia and got into the real world that now they think everyone should have to suffer what they had to.

P.P.S. Where I work now, the interviewing process is much more civilized. We send the applicant the specs for a small program along with some unit tests, and give them however long they'd like to complete it. It should take a page or two of code and, at most, a couple of hours to complete. One might worry that people would cheat, but that hasn't been my experience. Most applicants never submit a solution at all; I suspect this is because they couldn't get their solution to pass the unit tests. (What we ask them to do is not difficult. Anyone who passed a college course in software engineering with a grade better than a C should have absolutely no problem completing the assignment.)
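
To give a flavor of what such a packet looks like (this is a made-up stand-in, not the actual assignment we send out; run_length_encode and its tests are invented for illustration), the spec might ask for a single function and ship with a handful of unit tests along these lines:

```python
# Hypothetical stand-in for the kind of spec-plus-tests packet described above.
# The applicant would receive run_length_encode()'s docstring and these tests,
# and be asked to make the tests pass.
import unittest

def run_length_encode(text):
    """Collapse runs of repeated characters into (char, count) pairs."""
    result = []
    for ch in text:
        if result and result[-1][0] == ch:
            result[-1] = (ch, result[-1][1] + 1)
        else:
            result.append((ch, 1))
    return result

class TestRunLengthEncode(unittest.TestCase):
    def test_empty(self):
        self.assertEqual(run_length_encode(""), [])

    def test_runs(self):
        self.assertEqual(run_length_encode("aaabcc"),
                         [("a", 3), ("b", 1), ("c", 2)])

if __name__ == "__main__":
    unittest.main()
```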

So, then, some fraction of applicants actually complete the assignment. These submissions show that the vast majority of so-called software engineers -- or at least those who are looking for jobs -- can't code their way out of a paper bag. That is, most of the submissions are grossly inefficient, and hugely over-engineered or under-engineered. Finally, a few of the submissions are passable (rarely do we get a truly excellent solution, which is kind of sad), so we have them come in, and as part of the interview we have them go over their code a bit and explain why they made certain design decisions, etc.

If you ask me, this process gives us a much better idea of how someone would perform in the real world of being a software engineer than any amount of coding on a whiteboard ever could.

Comment Re:Why BASIC? What for? (Score 1) 783

Indenting sub-blocks is what is proper. This is all that Python requires. ....and that's all it takes to make it a completely worthless language. Because guess what happens when one starts transferring files back and forth between different programmers and text editors, cutting and pasting, with tab indentation vs space indentation, or even code generation tools that add or subtract code. Somehow one space somewhere is off and now the pile of garbage won't compile.

Since you've apparently never worked a real job, I'll tell you what happens in the real world, as opposed to what happens in the imaginations of those who live in their moms' basements: everything works fine. And the reason everything works fine is that if people in the real world didn't already have this sort of thing well in hand, their Makefiles would also break, and so would all of their data files -- data files for which a single missed character can mean the difference between the rocket making its way to Mars and the rocket blowing up on the launch pad. Also, every programming language has string literals for which whitespace must be preserved properly, and if it isn't, bad things happen.
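
In practice, teams keep whitespace sane with the same kind of mechanical checks they already run on Makefiles and data files. As one small, hypothetical sketch of such a check (Python's standard library also ships a tabnanny module for exactly this purpose), a pre-commit style script can refuse any file whose indentation is ambiguous or broken:

```python
# Sketch of a pre-commit style whitespace check (one possible approach, not
# the only one).  Compiling a module is enough to surface TabError and
# IndentationError, the problems the parent post is worried about.
import sys

def indentation_is_clean(path):
    """Return True if the file compiles without tab/indentation complaints."""
    with open(path, "r", encoding="utf-8") as f:
        source = f.read()
    try:
        compile(source, path, "exec")
    except (TabError, IndentationError, SyntaxError) as err:
        print(f"{path}: {err}")
        return False
    return True

if __name__ == "__main__":
    bad = [p for p in sys.argv[1:] if not indentation_is_clean(p)]
    sys.exit(1 if bad else 0)
```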

Yes, this is one perfect example of how the brain dead design decisions behind Python killed the language's shot at becoming more popular.

Would you mind telling us all just what scripting languages you think are more popular than Python?

Only PHP is, if you consider that to be a scripting language. And PHP isn't more popular than Python for any reason other than the fact that PHP is tailored to a very specific problem domain for which there is a lot of demand, and Python is not tailored for that particular domain.

All to no avail it seems, because they completely forgot to realize that some people don't like being forced into one specific way of laying out their code.

What the designers of C completely forgot to realize is that some people don't like being forced into typing lots of unnecessary "{"s and "}"s and being forced to end each line with a semicolon. It should be up to the individual programmer to make their own little design decisions, like whether their lines should end with semicolons or not.

Face it: if Python had been invented and adopted in the '60s, and then later someone came out with a new programming language that forced you to type a lot of useless crap like "}", "{", and ";" when you didn't really have to, that person would be laughed out of the programming community. Just look at all of the hostility toward Lisp because it makes you type a few extra parentheses. How is being forced to type ";" in every place where Lisp would make you type ")" any better than that?

It is important to many people to be able to make these petty design decisions.

It's important to many people that they not have to type useless extra characters that serve no purpose except to aggravate their carpal tunnel syndrome.

This is why there are endless flame wars, or were, over different styles of source code formatting, and why it's such a contentious issue amongst passionate and independent programmers who come together to volunteer on the same project. Python's designers incorrectly concluded they could just "legislate" all this away by making it impossible to do anything other than their one way of doing things. Bzzzt....wrong.

First of all, in the real world, you don't get to make these petty design decisions. They get decided for you by the style guide that you and your coworkers use. And then when you get a different job, they use a different style guide, and it's difficult for you to read your own code, because now you're required to put all your braces in a different place from where you're used to seeing them.

Secondly, no one argues over how to indent code; they only argue about where the braces go. Since these braces are completely unnecessary -- they are only there to help the computer parse things a bit more easily; they serve no human purpose -- you can solve the problem of where to put the braces by eliminating them entirely. You've now killed many birds with one stone: (1) You end an endless, pointless debate. (2) Your code looks the same at every place you work. (3) You don't have to type extra characters that serve no purpose. (4) Your code is easier to read without extra noise in it. (5) You don't have to retire early after being disabled by CTS.
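
To make the point concrete, here is a small, made-up function written the only way Python will accept it; the indentation any sane programmer would use anyway is the entire block structure, so there is nothing left for a style guide to legislate:

```python
# Made-up example: the indentation *is* the block structure.  There is no
# brace placement to argue about and no semicolons to forget.
def clamp_all(values, lo, hi):
    """Clamp every value into the closed interval [lo, hi]."""
    clamped = []
    for v in values:
        if v < lo:
            clamped.append(lo)
        elif v > hi:
            clamped.append(hi)
        else:
            clamped.append(v)
    return clamped

print(clamp_all([-5, 3, 42], 0, 10))  # [0, 3, 10]
```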

This is one of several reasons why Python will never become much more than the niche language it already is.

I think you just have no clue about how popular Python actually is. I don't know of any place that doesn't use it these days. It has completely supplanted Perl in all the workplaces that I have seen, for instance. Python's lack of wide adoption outside of scripting comes down to two very important reasons: (1) it is slow for CPU-bound tasks, and (2) it is dynamically typed, and large projects almost always prefer statically typed languages.
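
As a hypothetical illustration of that second point (my own toy example, not anyone's real code), the kind of mistake a statically typed language rejects at compile time simply produces a silently wrong answer in Python:

```python
# Toy example of the dynamic-typing hazard mentioned above (hypothetical code).
def total_cost(quantity, unit_price):
    return quantity * unit_price

def report(order):
    # Nothing stops a caller from supplying the price as a string; a statically
    # typed language would reject that call at compile time.
    return total_cost(order["qty"], order["price"])

print(report({"qty": 3, "price": 19.99}))    # 59.97
print(report({"qty": 3, "price": "19.99"}))  # '19.9919.9919.99' -- silently wrong
```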

That sounds to me more like the attitude of someone who is unemployed and living in their mom's basement.

Wow, you sure told me.

I sure did.

Maybe one day you'll be mature enough to have a real conversation without having to call people names.

Maybe one day you'll learn not to be the pot calling the kettle black.

Also, for the record, I didn't insult you. I merely honestly stated what your words as posted to Slashdot sound like to me. Maybe one day you'll improve your reading comprehension.

|>ouglas

Comment Re:Why BASIC? What for? (Score 1) 783

1) What you consider "proper", I consider to be "fucking stupid, unreasonable, and unreadable."

Indenting sub-blocks is what is proper. This is all that Python requires. Furthermore, it does away with the requirement for braces or "begin" and "end" statements. By your own argument, it's the height of arrogance to believe that requiring braces is proper.

If you don't think that indenting sub-blocks is the right thing, then (1) you're unemployable as a programmer, and (2) you're nuts.

It is the height of arrogance to believe that one's preferred way of indentation is more "proper" than someone else's, because it's completely subjective.

No, it isn't. How the vast majority of humans process visual information has been studied to no small degree, objectively and scientifically.

Of course, your entire argument proves too much: why should anyone be bothered to use the correct syntax at all? One's preferred syntax is no more proper than someone else's, so we should all just use whatever syntax we want!

2) What makes you think I need "a job" from anyone? I'm self-employed... I create my own job. And no, my job won't ever involve having to put up with other people's arrogant and stupid design decisions.

That sounds to me more like the attitude of someone who is unemployed and living in their mom's basement.

|>ouglas

Comment Re:Why BASIC? What for? (Score 1) 783

Oops, I apologize then. Clearly you are a person of taste and refinement. I thought you were the same person that I was originally replying to, who was claiming that my reply was lame and that he or she had thereby won the game. There are far too many Anonymous Cowards around here taking pot-shots for me to keep proper track of them all. Me, I just let it all hang out and let my karma go where it will. So far, it is stuck on "Excellent".

Re the Special Olympics, you are right. I hereby apologize to the numerous Special Olympics athletes who participate here on Slashdot.

|>ouglas

Comment Re:Why BASIC? What for? (Score 1) 783

Certain things are true not because Newton said so, but because that's how Nature happens to work ;)

I didn't say that they are true because Newton said so. I said that I know that they are true because Newton (and other authorities) said so. It doesn't hurt that I can verify some of what he claimed for myself, but I'd have a much harder time doing that for General Relativity.

As for Pascal: it's pretty much C without syntax insanity. You can still shoot yourself in the foot with pointers, but you have to be more explicit about it. I think that using a language that completely obscures pointers is not a good choice for teaching at college level. You can do high order functions in both Pascal and C, you just need a library for it, and the syntax would be devoid of sugar since the language doesn't support it with built-in primitives.

So what you are describing is Greenspun's Tenth Rule in action. No thanks! To achieve a goal, you should use a tool that provides a natural solution, if such a tool exists, not an awkward and cumbersome one.

Also, when you sing the praises of Pascal, you are talking about a language that was so inflexible that for years it didn't even support functions that operate on arrays of unspecified dimensions. I'm sure it got better, as there were ultimately quite a few dialects of Pascal, but personally I'd much prefer to use a language that got off to a more auspicious start, and one that provides automatic memory management. Life is too short for explicit memory management, unless you are writing an OS. With automatic memory management, you can also do a much better job of teaching the concepts of encapsulation and modularity.

Re pointers: I got an extremely good CS education at MIT without our ever using a language with explicit pointers. I did, of course, implement an interpreter and a Modula-2 compiler, and designed and built my own CPU on a breadboard out of AND gates and adders, plus my very own microcode, etc., etc. These things are all significantly easier when you start off with powerful tools, rather than ones that make you waste your time counting beans.

Besides, IIRC, we were talking about replacing Basic, which is more appropriate for the junior high school level than the college level.

|>ouglas

Comment Re:Why BASIC? What for? (Score 1) 783

Are you saying that you know better than the entire MIT Electrical Engineering and Computer Science faculty?

An appeal to authority, I see.

There's this strange meme on Slashdot that appeals to authority are inherently bad. This is an absurd claim. I know that certain things are true because Isaac Newton said so, and I know that other things are true because Einstein said so. Unless I'm willing to devote quite a few years of my life to fully understanding General Relativity, it's going to have to remain this way.

Unlike with General Relativity, I could actually expound on and on about why Python is a great language for beginners, ad infinitum. But that would aggravate my carpal tunnel syndrome too much.

Well, they had a few choices, really: Pascal, C/C++, C#, Java, Python, and maybe assembly but for a sane architecture (I'd take Parallax Propeller over x86 anytime). Short of coming up with something brand new, anything they chose would have shortcomings one way or another. I would choose Pascal, but that's just me.

Yes, that's just you. No one teaches in Pascal anymore, and MIT never did. At MIT, before Python was used for introductory programming, Scheme was. I'm actually pretty sad that MIT doesn't still use Scheme, but Python is a decent alternative.

Btw, none of the languages that you mentioned, other than Python, would work at MIT, because MIT's introductory programming class makes heavy use of higher-order functions, which are a non-starter in Pascal and C. You can use them in Java, but doing so is extremely cumbersome. I know that C++ just added support for function literals in the latest standard, but I don't think this has been widely implemented yet, and it is sure to be very cumbersome as well.
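
For anyone unfamiliar with what "heavy use of higher-order functions" means in practice, here is a small sketch of the style I have in mind (my own toy example, not taken from any actual course materials), the sort of thing that is natural in Python and Scheme but painful in pre-lambda Java or in Pascal:

```python
# Toy example of the higher-order-function style referred to above
# (hypothetical code, not from any course).
def compose(f, g):
    """Return a new function computing f(g(x))."""
    return lambda x: f(g(x))

def derivative(f, dx=1e-6):
    """Return a function approximating the derivative of f."""
    return lambda x: (f(x + dx) - f(x)) / dx

square = lambda x: x * x
cube_then_square = compose(square, lambda x: x ** 3)

print(cube_then_square(2))              # 64
print(round(derivative(square)(3), 3))  # ~6.0
```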

|>ouglas

Comment Re:Why BASIC? What for? (Score 1) 783

Java is used in a lot of respectable universities in introduction (and/or higher level) courses, yet we all know it's shit and should not be used. I don't see how MIT is relevant here.

I probably have many more specific complaints about Java than you do. I've programmed in it professionally, and I continue to do so. We've largely switched from Java to Scala, however, because of our complaints with Java. All that being said, Java makes a decent language in which to teach software engineering.

|>ouglas

Comment Re:Why BASIC? What for? (Score 1) 783

2. An appeal to authority logical fallacy.

First let me state that you seem like a far more reasonable person than the vast majority of the denizens of Slashdot. For the most part, the right answer to any question posed on Slashdot is precisely the opposite of the consensus here. Since this property is so reliable, however, it actually makes Slashdot somewhat useful.

That being said, your claim that I committed an "appeal to authority logical fallacy" is absurd. I made no claim to be making a logically deductive argument. Logically deductive arguments are notoriously difficult to produce for any topic of substance, and even when you do manage to compose a good one, the logically valid argument is usually unsound anyway due to questionable premises.

An appeal to authority is a perfectly good prima facie reason to believe a proposition, given a fairly reliable authority, and I take it as a given that MIT as a whole is a fairly reliable authority; if it were not, it would be unlikely to have the reputation that it does. A collection of good reasons doesn't constitute a knock-down argument, but the more good reasons one has, the more solid the ground one is on. I may not always be right, but I'm always on solid ground. The majority of Slashdot is treading quicksand.

|>ouglas

Comment Re:first principles (Score 1) 237

This is completely wrong, of course. Programming is best taught by doing.

Where's the argument that "doing" necessarily means "learning how to use the latest toolkit"?

No such argument was provided, because no such assertion was made. I said that learning to program is best done by doing, which means the focus should be on the doing, first and foremost. At any given time, the best tool for accomplishing a task might be one that was invented many years ago, or it might be one that was invented yesterday, but it will, almost by definition, be the one that makes getting things done as easy as we know how to make it. People who argue that things should be done the same way they did them back in the day are the same sort who think that kids should have to walk to school uphill in the snow both ways, because that's how they did it when they were kids. More to the point, focusing on coordinate frame transformations, the theory of Phong shading, GPU optimizations, etc. is almost guaranteed to turn the typical high school student off from ever wanting to program again.

First get them interested, and later you can send them off for a Stanford education.

|>ouglas
