Actually, you know what comes in handy for developing GUIs and interaction design? Cognitive Psychology. Linguistics. Graphic Design. All taught at universities, and all counting toward a degree in Computer Science or Cognitive Science.
Need to develop an object structure or database schema for your application? Most obviously, there's object-oriented design theory and database theory. Less obvious is analytic philosophy: symbolic logic, epistemology, ontology, and theory of language. They are directly applicable to knowledge representation, and help you think about abstraction, representation, and who "knows" what.
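To make the representation point concrete, here's a toy sketch (my own illustration, with a made-up Author/Book domain) of the same facts modeled two ways: as an object graph, where the relationship is an object reference, and as a relational schema, where the same relationship lives in a foreign key. Deciding which representation fits your problem is exactly the kind of question that theory trains you for.

```python
import sqlite3
from dataclasses import dataclass, field

# Object-oriented view: an Author "has" Books via an object reference.
@dataclass
class Book:
    title: str

@dataclass
class Author:
    name: str
    books: list = field(default_factory=list)

hof = Author("Hofstadter", [Book("GEB")])

# Relational view: the same knowledge, but the relationship is a
# foreign key, and "who owns what" is recovered with a JOIN.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE book   (id INTEGER PRIMARY KEY, title TEXT,
                         author_id INTEGER REFERENCES author(id));
""")
db.execute("INSERT INTO author (id, name) VALUES (1, 'Hofstadter')")
db.execute("INSERT INTO book (title, author_id) VALUES ('GEB', 1)")

rows = db.execute("""
    SELECT author.name, book.title
    FROM book JOIN author ON book.author_id = author.id
""").fetchall()
```

Both representations answer the same queries, but they make different operations cheap, different invariants easy to enforce, and different things easy to get wrong.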
More directly to your problem: hashtable or tree for that map? Linked list or array? If you don't know how those structures work, you don't know which one is appropriate for a given task. That's taught in Computer Algorithms, and the gotchas can be pretty tricky to pick out on your own.
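For instance, here's a small sketch (my own example, not from the post) of the classic trade-off: a hash table gives O(1) average point lookups, while a sorted structure gives O(log n) lookups but can also answer range queries, which a hash table can't do without scanning every key.

```python
import bisect

n = 100_000
keys = list(range(n))

table = {k: k * 2 for k in keys}   # hash map: O(1) average point lookup
sorted_keys = keys                  # sorted array: O(log n) via binary search

def hash_lookup(k):
    return table[k]                 # direct hit, no ordering available

def binary_search_lookup(k):
    i = bisect.bisect_left(sorted_keys, k)
    return sorted_keys[i] * 2

# Only the sorted structure can answer "all keys between 10 and 20"
# without touching all n entries:
lo = bisect.bisect_left(sorted_keys, 10)
hi = bisect.bisect_right(sorted_keys, 20)
range_hit = sorted_keys[lo:hi]      # keys 10 through 20, inclusive
```

The gotcha the class teaches you: "fastest lookup" is the wrong question if your workload also needs ordered traversal, range queries, or predictable worst-case behavior.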
Want to write a game AI? Better have taken Artificial Intelligence and Natural Computation (neural nets, genetic algorithms, etc.) courses, or be really, really good at predicting which algorithm to use in which case.
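As a taste of what such a course covers, here's a toy genetic algorithm (an illustrative sketch of my own, not anything from the post): it evolves bit strings toward all-ones, the "hello world" of GAs, using truncation selection, single-point crossover, and bit-flip mutation.

```python
import random

random.seed(42)

TARGET_LEN = 20   # genome length; fitness is maximized at all ones
POP_SIZE = 30

def fitness(genome):
    return sum(genome)  # count of 1-bits, so max fitness == TARGET_LEN

def mutate(genome, rate=0.05):
    # Flip each bit independently with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # Single-point crossover: splice a prefix of one parent onto a
    # suffix of the other.
    cut = random.randrange(1, TARGET_LEN)
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
       for _ in range(POP_SIZE)]

for generation in range(100):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == TARGET_LEN:
        break
    parents = pop[: POP_SIZE // 2]   # truncation selection: keep the top half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
```

The course is what tells you when a GA is the right hammer, and when hill climbing, minimax, or a plain lookup table would beat it.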
In all these cases, a class is usually pretty good at conferring the theory of the subject, which gives you a better understanding of why these techniques work and in what circumstances. And theory is usually hard to come by in practical books (learn to write a game in 24 hours! learn Hibernate in 12 days!). You'll learn the how, but not the why behind the how.
If you look at the industry 20 years ago, it looks nothing like it does today. However, what was "theory" then (functional languages, AI, data mining, natural language processing, test-driven design, parallel and distributed computing) is practice today. In 20 years, the "practical" IT aspects will be completely different, but the theoretical foundations will still matter. You're going to need to keep up with practice on your own anyway, as a matter of a) career maintenance and b) personal interest; from personal experience, it was much better to start on that early. Take classes in the things that won't change, and teach yourself the latest and greatest. You only get a degree once. Don't waste it on the flavor of the month.