
Comment Re:Scrum (Score 3, Interesting) 221

I second the nomination of Scrum, which complements agile development practices.

Scrum is about managing development priorities. You can't work efficiently if you keep changing priorities every day because nothing will ever get finished. On the other hand, if you *never* change development priorities until you've finished everything you set out to do, developers are happy but they might not be working on things the business needs or wants.

The truth is that businesses have to respond to change. A rival announces a new feature; the price of some related product or service changes dramatically; regulators threaten to fine your company for some reason; a PR scandal forces your CEO to get up and make public promises you'd never imagined. Things like these can change a business's priorities, and if your employer's priorities change, yours ought to as well. Just not so often you never manage to finish anything.

Scrum strikes a sensible balance between changing direction so often you never finish anything, and putting your head down and finishing things but then finding out your employer actually needed something else. Don't get me wrong, if you *can* keep the same priorities for months on end, you should. But in many situations you don't have that luxury. You have to respond to business changes, while at the same time finishing what you set out to accomplish.

Comment Re:I think... (Score 1) 304

No problem on the TL;DR, but you raise an important point. You're absolutely right that a strongly typed language has some optimization advantages, but CPU is only one kind of resource. Optimizing CPU usage for a sequence of statements is a good thing, but that's simply not the bottleneck in scaling web services these days.

Node.js demonstrates this. Under the covers it uses polling (I presume) to ensure that the CPU keeps doing useful work as the load climbs, rather than spinning its wheels waiting for I/O. So instead of allocating a thread per request and stalling every single thread while it waits for the results of a database query, Node just goes down the list of queries with data returned and fires off a small event handler you write in JavaScript. I suppose it helps that the JavaScript engine Node uses is very efficient (for JavaScript), but there's more to gain in CPU efficiency by managing *other* resources efficiently than there is from compiler optimizations -- at least for *typical* web applications, where the task is to glue back-end resources like databases to front-end applications in HTML.
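
Here's a rough sketch of the pattern I mean, with a made-up fakeQuery() standing in for a real database driver (not any particular library's API): several slow queries are dispatched back to back, and their callbacks fire as the results come in, with nothing blocked in between.

    // fakeQuery() simulates an asynchronous back-end call: it returns
    // immediately and delivers its result via the event loop later.
    function fakeQuery(id, delayMs, callback) {
      setTimeout(() => callback(null, 'result of query ' + id), delayMs);
    }

    console.time('all queries');
    let pending = 3;

    for (const [id, delay] of [[1, 300], [2, 200], [3, 100]]) {
      fakeQuery(id, delay, (err, result) => {
        console.log(result); // arrives in completion order, not dispatch order
        if (--pending === 0) {
          // Elapsed time is about 300ms (the slowest query), not the 600ms
          // sum, because nothing blocked while the queries were in flight.
          console.timeEnd('all queries');
        }
      });
    }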

Comment Re:Poison fruit (Score 1) 1448

I dunno. I'm pretty liberal myself, and I've supported same sex marriage since even before it was legal here in Massachusetts. I see no reason to boycott an author *just because I disagree with him*.

Now I could perhaps see some point in it if it were ten years ago, and I'd be sending money to an anti-gay marriage activist who would turn around and spend it on perpetuating an injustice. But as Card says, it's a moot point now. Opposition to gay marriage has been defeated with stunning rapidity, and as the change is implemented people will discover that dire predictions for the institution of marriage won't come true. In twenty years young people will wonder what all the fuss was about.

So the only point of a boycott NOW is to punish Card for being wrong. I suppose there's something in that, but I can't get all that excited about it; it smacks of being a bad winner. And if we punish people on our right for being wrong, shouldn't we also punish people on our left? Shall we boycott Frederik Pohl for being a former communist? Granted, he's not really much to *my* left, but I've never advocated nationalizing private businesses; I think that's morally wrong.

Now I don't care a bit about the movie; it can sink without a trace as far as I'm concerned. But to be totally consistent in the Jihad Against Card we'd *also* have to target the book: to shame people who buy the book and stores that sell the book. Against the value of stoking our righteous indignation against Card for his past misdeeds, we have to set the loss to the public of what is a landmark literary work. It's hard to name a science fiction novel in the past thirty or forty years of greater literary importance. Perhaps THE DISPOSSESSED, GATEWAY, THE LEFT HAND OF DARKNESS, DO ANDROIDS DREAM OF ELECTRIC SHEEP, or TAU ZERO. Just a handful, and few of them are as accessible as ENDER'S GAME, which can be read as a straight-up adventure story or bildungsroman. Accessible it may be, but ENDER'S GAME does something very interesting and ambitious: it explores the very nature of moral responsibility.

If there is a moral imperative to make the ENDER'S GAME movie into a commercial failure, then the *same* imperative must apply equally to the novel. And if we forced ENDER'S GAME out of the bookstores, we'd be depriving those future people (who have no idea what the fuss about same sex marriage was about) of an important science fiction novel.

Comment JavaEE : Node.js :: Apples : Oranges (Score 2) 304

You're asking for the classic apples to oranges comparison here.

Node.js is all about scaling the number of requests/second -- about minimizing the number of boxen you need to serve thousands of requests per second. By using polling instead of threads (under the covers) and asynchronous event handling (above the covers), Node makes it simple to respond to high volumes of requests without allocating huge amounts of resources.

But requests/second is only one dimension of scalability. There is management of the infrastructure. There is security (the number of Node tutorials which completely omit this is shocking). There is complexity (much of which in Node.js is pushed to the client side). There are features (e.g. the messaging and timer services in Java EE).

The buzz over Node.js reminds me of the buzz over Ruby on Rails a few years back. RoR also introduced an elegant new programming paradigm -- convention over configuration. People were amazed that they could field a simple example app without having to write XML configuration files for the ORM layer. Look! It creates all the CRUD interfaces for me! But in the end those tasks really aren't that challenging for an expert programmer; they're more like sand in the gears when you're starting up a project. So while RoR remains a good tool for certain kinds of web apps, it's nowhere near as revolutionary as it seemed at the time, and it has little penetration in the enterprise market. It seems to me that most of the joy in Node.js is likely to be on the front end of the project, but in the long tail of the project you're still going to have a lot of drudge work, especially where you have to roll your own enterprise features.

Which is not to say Node.js isn't brilliant. It appeals to the old Unix man in me, because it does one thing really well. It's a superb piece of middleware glue. It makes exposing back-end services like databases to RESTful web clients a snap, even when you've got to do that on a massive scale, where by "massive scale" I mean by retail web standards: tens of thousands of simultaneous connections. For web applications where you don't have to integrate with a lot of back-end enterprise systems and where there's a heavy emphasis on a rich HTML/CSS UI, Node.js is an elegant solution that reduces the information overload on the development team by taking advantage of the JavaScript expertise they're bound to have.
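
To make the "middleware glue" point concrete, here's a minimal sketch of a Node JSON endpoint; lookupUser() below is a stand-in I made up for a real database or service call, not any particular driver's API.

    // A tiny RESTful front end over a (simulated) back-end lookup.
    const http = require('http');

    function lookupUser(id, callback) {
      // Simulated asynchronous back-end call
      setTimeout(() => callback(null, { id: id, name: 'user-' + id }), 20);
    }

    http.createServer((req, res) => {
      const match = req.url.match(/^\/users\/(\d+)$/);
      if (!match) {
        res.writeHead(404);
        return res.end();
      }
      lookupUser(match[1], (err, user) => {
        if (err) {
          res.writeHead(500);
          return res.end();
        }
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify(user));
      });
    }).listen(8080);
    // e.g. GET http://localhost:8080/users/42 -> {"id":"42","name":"user-42"}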

Comment Re:I think... (Score 1) 304

My opinion is that Javascript is not bad as a scripting language, but we are abusing it and twisting it beyond its original purpose. The main issue is actually that Javascript is too flexible. Untyped code has a habit of hiding mistakes in hard-to-debug ways. But once you add types to Javascript, it's not Javascript anymore.

A couple of points. First, the argument you're making is for static -- i.e. compile-time -- semantic checking. I was hearing the very same argument thirty years ago among people who advocated Pascal over C; on paper it's sound, but decades of practice have convinced me that static checking, while probably helpful, is not as efficacious as it seems like it should be. After all, you *still* run into mishandled exceptions in Java, and many Java programmers do an end-run around much of the compile-time restrictions by using runtime exceptions; in fact, wrapping low-level checked exceptions in runtime exceptions is a feature of some frameworks. It's hard to get programmers out of the mindset that a core dump, or an unhandled exception, is some kind of serious calamity. You don't want them in production code of course, but they're far less calamitous than continuing processing with bad data. The place to catch those problems is in testing.

As for the other kind of compile-time problem, handing an object to a routine that expects an interface the object does not support, what scripting languages like Python have shown is that it's not that big of a problem. Large, sophisticated systems programs have been written in Python, which lacks precisely this kind of checking.

I'm very comfortable with Java, but there is something about the design of the language and its core libraries which encourages over-engineering. For example, it is possible to do asynchronous HTTP handling in Java EE, as you do in Node.js, but in Node.js you simply create a non-blocking event handler for every case you want to handle. In Java EE, if memory serves, you create a ServletContextListener with a Queue member, and put that member in the servlet context. Then in your servlet you create an AsyncContext and put it in the Queue. None of this is as complicated as it sounds, but it only covers the case where you want to keep a socket open to the client, like when you're doing Comet. If you want to conserve resources by serving your I/O requests using polling rather than blocking threads (the primary Node.js use case), you've got to use java.nio, which can get complicated.
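
For contrast, here's roughly what the keep-a-socket-open (Comet/long-poll) case looks like on the Node.js side. This is only a sketch: the EventEmitter named "bus" is a stand-in for whatever actually produces events in your application.

    // Hold each response open until an application event arrives; no
    // listener/queue plumbing required.
    const http = require('http');
    const EventEmitter = require('events').EventEmitter;

    const bus = new EventEmitter();
    bus.setMaxListeners(0); // many clients may be waiting at once

    // Something elsewhere in the app publishes events; simulated here.
    setInterval(() => bus.emit('news', { ts: Date.now() }), 5000);

    http.createServer((req, res) => {
      // Register a one-shot handler; the connection stays open, but no
      // thread is tied up waiting for the event.
      bus.once('news', (msg) => {
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify(msg));
      });
    }).listen(8080);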

On paper, designing programs in Java is a cinch with Java's rich OO features. Any reasonably competent programmer can quickly learn enough to demonstrate facility with the *features* of the language. In practice, as a program's required feature set grows, the number of classes and interfaces explodes, unless you are a very talented software designer.

While it's true most JavaScript programmers write simple onload and onclick handlers, this doesn't make JavaScript a toy language. While its object-oriented features are relatively primitive, its functional programming features are quite sophisticated, and this turns out to be just the thing when you are writing sophisticated event handlers, as in Node.js. Expert JavaScript programming is about implementing higher-order functions which return closures, something that current versions of Java can't do (Java SE 8 is getting lambda expressions, so this might change). Functional programming is all about getting a single transformation of inputs to outputs correct; it feels *contained* -- that is to say you don't need to think a lot about the context in which your code runs. Programming in Java is often an exercise in information overload, as it typically involves mastering the nuances of multiple complex libraries and frameworks so you can fit them together properly.
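
A trivial sketch of the closure-returning style I'm talking about (the names here are made up for illustration):

    // A higher-order function that returns a closure: each handler carries
    // its own private state, with no class machinery around it.
    function makeCounter(label) {
      let count = 0; // captured by the returned function
      return function () {
        count += 1;
        console.log(label + ' fired ' + count + ' time(s)');
      };
    }

    const onSave = makeCounter('save');
    const onLoad = makeCounter('load');

    onSave(); // save fired 1 time(s)
    onSave(); // save fired 2 time(s)
    onLoad(); // load fired 1 time(s)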

Comment Re:Really?!? (Score 5, Insightful) 1448

He'd have been better off not saying anything. I'm sure I've read about him being a bigot in the past, but I'd actually forgotten about it. I can understand people not liking things that they feel are too "different", but I can't understand why he'd actively campaign against people who are different from him.

This is like some weird, modified version of the Streisand effect at work.

Comment Re:Had a bicyclist blow through a red-light today (Score 1) 413

So, I saw a *car* run a red light today. Strangely, that does not cause me to question the right of automobile drivers to use the roads.

I agree with most of your other points, although a turn signal isn't really practical. I use hand signals (although most drivers don't seem to understand them, so I signal a right turn with my right hand). I stop at the stop-line, I don't weave between cars, and in general obey all the rules that apply to slower moving vehicles. I do claim a lane when I travel in traffic, although I pull over regularly (when it is safe) to let trailing cars pass. That's not the law, it's just common courtesy.

Comment Maybe not so stupid (Score 1) 254

Security is hard. General-knowledge techies think they're much better at security than their masters, but I have my doubts. Techies don't always understand the value of assets and the nature of threats to those assets. And they often overestimate their knowledge of system vulnerabilities. For example, many techies think you can turn a computer into a blank slate by erasing the hard drive, but there have been demonstrations of firmware-based malware. Just last year a security researcher created a proof-of-concept worm that stores itself in a computer's BIOS and the flash memory of attached devices and PCI cards. It has stealth features that make it virtually undetectable, except by pulling the flash chips and dumping their contents. If you *were* infected by a worm like this, and you wanted to eradicate it, you would *have* to physically destroy any attached device which had its own flash memory, including cameras, optical drives, and possibly even printers. Eradicating all physical traces is probably more than is needed to deactivate the worm, but it's a subtle point.

Another subtle point is that if you are worried about almost non-detectable malware, you have no assurance that the new equipment you are buying to replace the old stuff isn't factory infected. What that probably means is that trying to ensure you have a 100% guaranteed clean slate isn't cost effective for agencies, unless perhaps they are high value targets (e.g. NSA, CIA, some of the DoD). What to do instead isn't obvious. The simplistic model is you start with a clean slate and you prevent bad stuff from being introduced to your systems. That model doesn't work if you can't ensure your stuff is clean from the start, and if malware can enter your systems through channels you'd never imagined (e.g. some kind of innocuous USB device).

Destroying the equipment is almost certainly overkill in this case, but I can see why this particular agency might have chosen to do so. Given their role in advancing American competitiveness, they're probably hypersensitive to issues of industrial espionage and Advanced Persistent Threats (APT). According to the article the agency's CIO thought he was dealing with some sort of Stuxnet-like attack, which in hindsight doesn't seem to be the case.

As usual the /. summary is garbage. The agency spent $2.7 million to respond to the threat, but they didn't spend $2.7 million on hammer-wielding contractors. Only $4,300 went to that, or roughly 0.15% of the total expenditure on the event. The bulk of the rest of the money went to obtaining replacement services while their servers were offline, paying security investigators to track down the infection they did have, and developing a long-term response to malware.

The physical destruction of the equipment was almost certainly overkill, as was bringing down their mail servers because they were transferring infected emails. But one thing you have to admit is that the agency's response was swift and decisive.

Comment Re:Bullshit (Score 5, Insightful) 423

Having led development teams with native-born Indian engineers on them, I can confirm that Indian cultural diversity notwithstanding, deference to superiors is a big deal with many people brought up there. That's neither good nor bad. It's just different. Where problems arise is when people don't recognize that there are differences and fail to take those differences into account.

As an American, I don't feel insulted when a subordinate questions my ideas; in fact I rely on them challenging me. What took me a while to figure out was that my Indian employees wouldn't stand up and contradict me, especially in public. In an American that would be cowardly, but that's because we communicate in what amounts to a different social language from Indians. I soon learned that you have to manage employees from deferential cultures differently; you've got to spend a lot of personal time together having quiet chats, maybe go out after work for a couple of beers. And you have to recalibrate your trouble sensors when dealing with deferential employees. If you give them something resembling an order and they do anything short of hopping right to it with open enthusiasm, it's time to have a quiet, tactfully executed one-on-one.

This is not a worse way of doing things, it's just different, and it has its advantages and disadvantages. For me the toughest thing was that I had to be careful about thinking out loud -- at least at first -- because my guys took everything that came out of my mouth so seriously. At first, I found my Indian subordinates to be frustratingly passive. They found me (no doubt) to be overbearing, insensitive, rash and pig-headed. This was all just miscommunication, because we were all acting and interpreting each others' actions through the lenses of different cultural conventions. In the end, we did what intelligent people of different cultures do when working with each other: we developed a way of doing things that combined what we felt was the best of both cultures.

And that's an important lesson: people aren't culturally programmed automatons. We are capable of thinking and adapting. People in an egalitarian culture are perfectly capable of coming together and working coherently as a team, although the process may look ugly and chaotic to outsiders. People in cultures with deference to elders are perfectly capable of reporting unwelcome news to a superior.

So if a junior pilot didn't communicate an emergency situation to a senior pilot, *then somebody on that team screwed up*. They weren't doomed to crash by cultural programming. There may be nuances of their culture which contributed to the disaster, but that's bound to be true of human error in every culture.

I won't go so far as to say that *all* cultural differences are superficial. But I think many differences are more superficial than a casual outsider might suspect. That outsider might look at something like the reluctance of a subordinate to question a superior's instructions and assume that the subordinate *can't*. That's simply not true. On one level, the shared cultural understanding of the subordinate and the boss provides them with ways of communicating that escape the outsider's understanding. But more importantly, people aren't mindless cultural automatons. If the boss is about to stall the plane on the approach to the runway, I don't think a Korean co-pilot is simply going to stand by silently. I suppose it is possible that he might be inclined to wait a few seconds longer than an American co-pilot, but if that endangers the plane then that is a mistake, period. A Korean airline is perfectly capable of training its co-pilots to report problems promptly, just as an American airline can train co-pilots to execute the commander's orders promptly without engaging in an impromptu debate.

Comment Re:At 48, I got an offer from FB, but... (Score 2) 432

Not really discrimination if there are reasons. Old people are in physical and mental decline. Old people also aren't a minority: just like it's OK for a female manager to prefer to hire women, or a black manager to prefer blacks, the young can prefer their own kind. Sorry, time to die.

I've got news for you, sonny -- we're *all* in physical and mental decline. If you think you are going to live forever, think again.

But the decline goes at different rates for each of us, it starts from different points, and is offset (in most cases) by gains in maturity, experience, and wisdom. So the bottom line is you can't make any useful generalization whatsoever about the ability of a fifty-year-old to do programming vs. the ability of a 25-year-old. It depends on all the things that add up to that unique person.

This is what's broken about bigoted thinking. It reduces people to some kind of ill-conceived average for their "group", when it ought to be evaluating them as individuals. Back in the 90s there was a controversial book called "The Bell Curve" which pointed out that there was a racial difference in IQs between blacks and whites, and made a number of (stupid) policy recommendations based on that difference. The inevitable shit-storm followed, in which the validity of IQ tests was questioned (in some cases with good reason), but lost in the shuffle was a simple mathematical fact: even if we assume that IQ tests are a perfect, unbiased measure of mental capability, and accept the racial differences in scores as measuring something real, those aggregate differences give almost no useful guidance in making decisions about *individuals*. That's because under those assumptions, something like 40% of blacks are smarter than 50% of whites. When you're looking for very high scoring individuals, they occur as statistical flukes in both groups.
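
If you want to see that arithmetic for yourself, here's a quick simulation with two overlapping bell curves. Every number in it (the means, the spread, the cutoff) is an arbitrary illustrative choice, not data about any real group; the point is only that a modest gap in averages still leaves a huge overlap, and top scorers on both sides.

    // Two overlapping normal distributions with a modest gap in means.
    function normalSamples(n, mean, sd) {
      const out = [];
      for (let i = 0; i < n; i++) {
        // Box-Muller transform for normally distributed samples
        const u = 1 - Math.random(), v = Math.random();
        out.push(mean + sd * Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v));
      }
      return out;
    }

    const groupA = normalSamples(100000, 100, 15); // higher-mean group
    const groupB = normalSamples(100000, 96, 15);  // lower-mean group

    const medianA = groupA.slice().sort((a, b) => a - b)[50000];
    const share = groupB.filter(x => x > medianA).length / groupB.length;
    console.log('Share of group B above group A\'s median: ' +
      (share * 100).toFixed(1) + '%'); // roughly 40% with these numbers

    // Top scorers (above an arbitrary cutoff) show up in both groups.
    const cutoff = 130;
    console.log('Top scorers in A: ' + groupA.filter(x => x > cutoff).length);
    console.log('Top scorers in B: ' + groupB.filter(x => x > cutoff).length);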

Where that leaves you is that when intelligence is an important factor in judging a candidate for something, *especially* if you're looking for high scoring individuals, you have to judge individuals on their own merits. Skin color is at best statistically useless as a selection filter, at worst self-defeating.

The analogy holds for age differences. Even if we grant that 25-year-olds are on average more capable programmers than 50-year-olds (which is doubtful), it nonetheless remains that the vast majority of 25-year-old programmers are mediocre. It may be true that mental decay has shifted some fifty-year-olds from the high-performer category to the mediocre category, but it remains true that high performers are statistical flukes in either group. So gray hair has no value as a filter if you are looking for *good* programmers. They're a fluke in any category.

By the way, about older people being "minorities" -- they are effectively so *for purposes of anti-discrimination laws*. The term of art you are looking for is "protected class". So the good news for all you young, white American males who resent the legal protections minorities get is that all you have to do is survive until you are forty and you'll be protected by the Age Discrimination in Employment Act of 1967.

Comment Re:Seize wallet or real coints? (Score 1) 198

Out of interest, how do deleted coins get replaced into circulation? If there is a finite supply of Bitcoin, and a slow de-circulation due to loss upon deletion, how does that get fixed?

In the real world, the government has statisticians who work out the approximate total loss due to destruction and re-mint coin to replace it. How would that work in the BTC world?
