Comment I'm mostly on the #noestimates side (Score 1) 299

I'm a big proponent of Agile (mostly XP, mostly anti-Scrum) and have contributed some to the #noestimates "movement".

I don't really mean that nobody should ever estimate anything. I mean that I've never seen useful (fine-grained) estimates anywhere. Here are some of the problems with estimates that I've seen frequently:

  1. We're not good at estimating how long things will take. We're usually optimistic about how quickly we can get things done, and we almost always overlook things that will take more time. I've never seen a case where a project was completed more quickly than estimated. I've only rarely seen fine-grained (story-level) tasks completed more quickly than estimated.
  2. Management asks for estimates and then treats them as deadlines. The team then learns to inflate their estimates. Then management learns to reduce the estimates they're given. Given fudge factors in each direction, the estimate no longer has much reliability. Even if you're using story points, the point inflation/deflation leads to less consistency and therefore reduced reliability.
  3. Estimates that are given are negotiated down, or simply reduced. This raises the question of why you'd ask for an estimate and then not take the answer provided. If you're not going to listen to the answer, why are you asking the question? This is probably the craziest one on the list -- given my first point, increasing an estimate would make sense. Reducing the estimate is just magical wishful thinking.
  4. Plans change and work is added, but the deadline (presumably based on the estimates) is not changed to correspond with the extra work involved. So again, you're not actually even using the estimates that were given.
  5. Management dictates deadlines arbitrarily, without speaking to the people who will be doing the work. Spending time estimating how long each task will take when the deadline is already set is completely pointless.
  6. Almost every deadline is complete bullshit, based on nothing. Often the excuse is that marketing needs to know when something will come out so that they can let people know about it. Why they need to know the exact release date far in advance, I've never been able to figure out. Many people intuitively know that the deadlines are bullshit and will likely be allowed to slip. The only exception to bullshit deadlines I've come across is regulatory deadlines.
  7. Estimation at a fine-grained level isn't necessary. Many Agile teams estimate using story points, and derive a conversion from story points to time from previous empirical data. This is fine, except that the time spent estimating each story is wasted -- simply counting the number of stories almost always gives the same predictive power (see the sketch after this list). Teams also tend to get better at breaking stories into consistently sized pieces, which makes a plain count even more reliable over time.
  8. The ultimate purpose of an estimate is to evaluate whether the proposed work will be profitable, and therefore worth doing. Or to compare the ROI (return on investment) between alternative projects. But to know that, you'll have to know what value that work will provide. I don't believe I've ever seen that done -- at least not at a fine-grained level. Usually by the time you're asked to estimate, the project has already gotten approval to proceed.
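
As a concrete illustration of point 7, here's a minimal sketch (in Python, with entirely made-up sprint numbers) of the check I'm suggesting: compare how variable story counts and story points are across past sprints, and if they're comparable, the cheaper measure is good enough.

    from statistics import mean

    # (stories completed, story points completed) per past sprint -- hypothetical data
    history = [(9, 21), (11, 24), (8, 19), (10, 23), (12, 26), (9, 22)]

    stories = [s for s, _ in history]
    points = [p for _, p in history]

    # Naive forecast for the next sprint: the trailing average of each measure.
    forecast_by_count = mean(stories)
    forecast_by_points = mean(points)

    # Relative spread as a rough stand-in for forecast error.
    def relative_spread(xs):
        return (max(xs) - min(xs)) / mean(xs)

    print(f"By story count:  {forecast_by_count:.1f} stories/sprint, spread {relative_spread(stories):.0%}")
    print(f"By story points: {forecast_by_points:.1f} points/sprint, spread {relative_spread(points):.0%}")

If the spreads are about the same, counting stories predicts just as well as pointing them, and you can drop the estimation meetings.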

I'll note that most of these practices pit management against the team, instead of having them work together toward a common cause. Most of them also seriously demoralize the team. And most of the time, the estimates aren't really taken into account much anyway.

My advice is to first understand the value of a project before you consider estimating the costs. Any estimate at this point will be very rough, so make sure there's a very wide margin between the expected value and the rough estimate of the cost. If you're fairly certain of the expected value, make sure the project would still be profitable even if it took 3 or 4 times as long to complete as the rough estimate. If the expected value itself is uncertain, leave an even wider margin.
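
To make the margin check concrete, here's a trivial sketch with invented numbers -- the only point is that the project should survive the rough estimate being badly wrong.

    # All figures below are hypothetical.
    expected_value = 500_000   # rough estimate of the business value delivered
    rough_cost = 100_000       # very rough cost estimate
    safety_factor = 4          # assume the work could take 4x longer than estimated

    worst_case_cost = rough_cost * safety_factor
    print(f"Still profitable at {safety_factor}x the estimate: {expected_value > worst_case_cost}")

    # If the expected value itself is uncertain, discount it or raise the
    # safety factor before deciding to proceed.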

Another way to mitigate the risk of throwing money at something that won't have positive ROI is to shorten the feedback loop. Rank the work by value. (Realistically, you'll also have task dependencies to worry about, and should consider the effort involved too.) Work on the most valuable feature first and get it into production as soon as possible. Once it's done, you can assess whether your ROI is positive. Keep iterating in this fashion, working on the features that provide the most value first. Keep assessing your ROI, and stop when it's no longer worth it compared to other projects the team could be working on.
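
Here's a rough sketch of that ordering and stopping rule, again with invented feature names, values, and costs (and ignoring dependencies):

    features = [
        # (name, estimated value, rough cost) -- hypothetical numbers
        ("checkout flow", 200_000, 40_000),
        ("search filters", 80_000, 30_000),
        ("dark mode", 10_000, 15_000),
        ("profile badges", 5_000, 20_000),
    ]

    # Work on the most valuable features first (value per unit of cost).
    features.sort(key=lambda f: f[1] / f[2], reverse=True)

    total_value = total_cost = 0
    for name, value, cost in features:
        # Stop before starting work whose return no longer justifies the spend.
        if value < cost:
            print(f"Stopping before '{name}': ROI is no longer positive.")
            break
        total_value += value
        total_cost += cost
        print(f"Shipped '{name}': total value {total_value}, total cost {total_cost}")

In practice the "stop" decision would also weigh the opportunity cost of whatever else the team could pick up instead.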

At a fine-grained level, if you're using story points, I'd ask you to do the math to see whether just counting the stories predicts how much will get done over time as well as the story points do. If so, you can save the time the team spends estimating stories. I'd still recommend spending time talking about stories so that everyone has a shared understanding of what needs to be done, and breaking stories up into smaller, more manageable pieces -- with one acceptance criterion per story. Also take a look at whether empirical average cycle time (how long a single story takes to move from start to finish) gives you the same predictive power as estimates. (I.e., is it bandwidth or latency that really provides the predictive power you're looking for?)
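
For the cycle-time angle, here's a minimal sketch (hypothetical numbers again) that turns observed cycle times into a forecast via Little's Law, with no estimates involved:

    from statistics import mean

    cycle_times_days = [2, 3, 2, 5, 3, 4, 2, 3]  # observed start-to-finish times per story
    work_in_progress = 3                          # stories typically in flight at once
    remaining_stories = 24                        # stories left in the backlog

    avg_cycle_time = mean(cycle_times_days)           # latency per story
    throughput = work_in_progress / avg_cycle_time    # stories finished per day (Little's Law)
    forecast_days = remaining_stories / throughput

    print(f"Average cycle time: {avg_cycle_time:.1f} days/story")
    print(f"Throughput: {throughput:.2f} stories/day")
    print(f"Forecast for the remaining backlog: {forecast_days:.0f} days")

Cycle time answers "when will this one story land?" (latency); throughput answers "how much will land per week?" (bandwidth). Pick whichever one your stakeholders are actually asking about.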

And don't forget Hofstadter's Law: It always takes longer than you expect, even when you take into account Hofstadter's Law.

Comment Re:In Theory - Thor (Score 1) 87

I'm an OIM implementer (10 years now). OIM is an excellent framework for a provisioning tool, but the connectors are terrible (fortunately it's easy to build your own against the API) and the UI is useless. The most successful OIM implementations I've come across (or built) have been ones that used a custom UI and/or just made everything scriptable. The API is really the saving grace of OIM: it's confusing, but it's powerful.

Sadly, I'm watching the product spiral downhill as of the last several versions.

Comment Rubber Ducking (Score 3, Interesting) 131

It's called Rubber Ducking. The idea is that by talking out loud, you have to form your thoughts into words, which requires you to organize your thoughts more completely. Think about all the times that you've gone to ask someone a question, and as soon as you ask them the question, you figure out the answer yourself. Whether you use a rubber duck, a live video audience, or another person doesn't matter much. This is one of the reasons that pair programming can be quite effective.

Why Some Developers Are Live-Streaming Their Coding Sessions 131

itwbennett writes: Adam Wulf recently spent two weeks live-streaming himself writing every line of code for a new mobile app. He originally started live-streaming as "a fun way to introduce the code to the community," but quickly learned that it helps him think differently than he does when coding without the camera on. "Usually when I work, so much of my thought process is internal monologue," he said, "but with live streaming I try to narrate my thought process out loud. This has forced me to think through problems a little differently than I otherwise would, which has been really beneficial for me."

Comment So do cars (Score 1) 228

Cars also help terrorists. Maybe we should consider restrictions on them too, to make sure they can't be used for terrorism. And guns help terrorists. I certainly don't see the Americans raising a fuss about that. Curiously, the UK doesn't seem to be raising a fuss about that either. Heck, western governments frequently help terrorists. Perhaps we should address that one first.
