This was first posted in a thread about whether or not computer science was dying.
How many computer science graduates are actually doing computer science?
Working as a programmer in an IT department doesn't usually have much to do with science -- it's about trying to create more standardized cogs from a growing number of previously coded cogs.
How much research is being done today on individual users benefiting from 32- to 128-core machines?
Assuming no magic happens and clock speeds don't suddenly start climbing or doubling again without frying the chip, it seems like manufacturers are just raising the aggregate GHz per chip by adding more cores at similar clock rates. How can this benefit the average PC user? If it can't be exploited by individual persons, it sounds like the Personal Computer may be a thing of the past.
Is that what people want? Right now, the only ways of using multiple cores are usually running separate programs at the same time (if you can even use that -- a home system doesn't usually need to serve web pages to thousands of users), or dividing your machine into VMs. Nice for test/develop/production/redundancy, but again -- not too helpful for the average Joe who just wants programs to run faster, smoother, and with more user friendliness.
What will it take to use parallel computing in the Personal Computer industry? If it can't be done, doesn't that imply, if not the death of the Personal Computer, at least a serious problem for the industry?
Do we have enough cores to start building some practical AIs? Can we develop special compilers and lightweight threads (i.e., not separate processes) that let an individual program use multiple cores dynamically? For example, iterations of a "for loop" that don't depend on previous iterations' results could all be parallelized, provided the cost of creating helper threads for a few to several iterations has low enough overhead to make it worth it (see the sketch below).
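To make the loop example concrete, here's a rough sketch using OpenMP, an existing compiler-directive approach for C; the array names, sizes, and per-iteration work are made up purely for illustration. The compiler/runtime hands chunks of the loop to lightweight worker threads:

/* Rough sketch: independent loop iterations split across cores with OpenMP.
   Compile with something like: gcc -fopenmp -O2 loop.c
   The data and problem size here are invented for illustration only. */
#include <stdio.h>

#define N 1000000   /* made-up problem size */

int main(void) {
    static double a[N], b[N], c[N];
    double sum = 0.0;
    int i;

    for (i = 0; i < N; i++) {          /* set up some dummy input data */
        a[i] = i * 0.5;
        b[i] = i * 2.0;
    }

    /* No iteration depends on an earlier one, so the runtime can farm
       them out to however many cores are available. */
    #pragma omp parallel for reduction(+:sum)
    for (i = 0; i < N; i++) {
        c[i] = a[i] * b[i];
        sum += c[i];
    }

    printf("sum = %f\n", sum);
    return 0;
}

Whether that actually speeds anything up on a home machine comes back to the overhead question: for tiny loops, setting up the helper threads can easily cost more than the work itself.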
AI and parallelization seem like at least two areas that genuinely need computer science, but that doesn't seem to be what most people are doing these days. They might get used in voice and face recognition, but again -- those aren't very general tasks.
What companies are doing Computer Science these days? Seems like most employers just need applications developed to run the current "paradigms". No "computer science" needed.
Most high-level developers at companies -- even in Open Source (or at least the Linux kernel) -- are awfully conservative when it comes to doing computer science. They want tried-and-true, step-wise development rather than large-scale "disruptive technologies" that could enable whole new ways of doing things. The people at the top of most large projects, commercial and open source alike, are too conservative to be doing real computer science.
Maybe research grants? Seems like the Bush-era idea of a research grant is something guaranteed to provide benefits in the immediate future (soon enough to be used in next year's theater of battle, for example). That's not the most flexible environment for developing new technologies that may not be directly applicable to the next new stealth whatchamacallit.
Is there any place for computer science research outside of getting your doctorate?