Comment are you sure there is no practical application (Score 4, Insightful) 479

You assert without proof that your research has no practical application. Were you researching how to implement LOGO in VAX assembly language or something?

More to the point, if your research was on the cutting edge of computer science, I assure you it has practical applications. Use the research skills you gained earning your PhD to identify companies whose business or research interests align with your own. Then, through LinkedIn or conference proceedings, identify researchers and engineers with similar interests and contact them. Ask to set up informational interviews. See if they "know anyone" looking for new researchers. Build a network tirelessly until you have a job.

You have a PhD. You're not a programmer anymore. Accept it and don't look for programming jobs. Most organizations pushing the state of the art need PhD-level people. Find them and find your niche.

Comment ask your advisor (Score 5, Insightful) 479

Surely your advisor has links to industry? Where does the funding come from? Industrial consortia? Federal sources (NSF, DOE, etc.)? Can you look at doing a postdoc at a National Lab so you can make some contacts? If you don't have contacts of your own, ask your advisor for help. It is the least he or she can do for you.

I don't think resume sites are good places for a newly minted PhD to look for work. You surely did some networking while you were a student. Did you present your research at conferences? Those are the people you should be talking to about work, rather than filling out online applications. At the PhD level you find work through a personal network, not web-based applications (although you will still need to fill those out for compliance).

Comment Re:Simply ignore studies ... (Score 1) 588

Hah? Weight loss can certainly be attained through exercise. Basically, you need to burn more calories than you take in. You can do that by reducing calories, or by increasing the burn rate. If you keep your calorie intake constant and increase your exercise, you will lose weight, all else being equal.

While this is technically true, in practice it is very, very hard to significantly increase your exercise while keeping your caloric intake constant.

This is simply because you get much hungrier when you're exercising. If you increase your exercise volume while keeping your eating constant, you'll feel miserable and hungry all the time. Just like dieting, except you'll feel worse for a given calorie deficit.

You can lose weight through diet, exercise, or a combination. For most people a combination works best but you have more leverage on the diet side than on the exercise side.
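As a back-of-the-envelope sketch of why the two routes are energetically equivalent, here's a toy calculation in Python. It uses the common ~3500 kcal-per-pound rule of thumb, and all the intake and burn numbers are made up for illustration; real results vary because appetite and metabolism adapt.

    # Rough energy-balance sketch: weight change from a sustained calorie deficit.
    # Assumes the common ~3500 kcal-per-pound-of-fat rule of thumb; the intake
    # and burn figures below are illustrative, not prescriptive.

    KCAL_PER_POUND = 3500.0

    def weekly_loss_lbs(intake_kcal, burn_kcal):
        """Pounds lost per week at a constant daily intake and daily burn."""
        daily_deficit = burn_kcal - intake_kcal
        return 7 * daily_deficit / KCAL_PER_POUND

    # Diet only: cut intake 500 kcal/day below a 2500 kcal/day burn.
    print(weekly_loss_lbs(2000, 2500))   # ~1.0 lb/week

    # Exercise only: keep eating 2500 kcal/day, add ~500 kcal/day of exercise.
    # Same deficit on paper -- but only if hunger doesn't win.
    print(weekly_loss_lbs(2500, 3000))   # ~1.0 lb/week

The arithmetic is symmetric; the asymmetry is entirely in how hard each side is to sustain.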

Comment Re:conflict (Score 1) 78

Indeed. I also find it strange that Matt is so adamant that Tesla was shafted by modern memory, when the SI unit of magnetic flux density is the tesla! How many people get units of measurement named after them? Why did Musk name his car company Tesla if nobody had ever heard of him? Why did a heavy metal band name themselves Tesla and use the electricity metaphor in their marketing? There are researchers who arguably contributed even more to the development of the modern world, such as Steinmetz, Heaviside, and Shannon, yet are far more obscure to the general public than Tesla.

Comment Re:What difference now does it make? :) Sunk costs (Score 4, Insightful) 364

You seem to misunderstand what sunk cost means. You're using the phrase as an argument to keep funding the project because "we can't reverse time and get the money back". In fact, the standard meaning of sunk cost is the opposite of your usage: because sunk costs are unrecoverable, only future costs and benefits should be relevant to an investment decision; otherwise you run into the danger of "throwing good money after bad". There is a lot of evidence that continued funding of the F-35 is in fact throwing good money after bad.
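To make that concrete, here's a toy decision calculation; every dollar figure is hypothetical. The point is just that money already spent appears identically in every branch of the decision and so cancels out of the comparison.

    # Toy sunk-cost illustration (all dollar figures hypothetical).
    # A rational go-forward decision compares only future costs and benefits;
    # the sunk amount is a constant offset and cannot change the ranking.

    sunk = 400e9   # already spent -- unrecoverable under every option

    def net_value(future_benefit, future_cost):
        # Subtracting the sunk cost shifts every option by the same amount.
        return future_benefit - future_cost - sunk

    continue_project = net_value(future_benefit=900e9, future_cost=600e9)
    switch_to_alternative = net_value(future_benefit=800e9, future_cost=300e9)

    # The alternative wins despite "wasting" the sunk 400e9, because the
    # comparison depends only on the future terms.
    print(continue_project < switch_to_alternative)   # True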

You also present a false dichotomy. One alternative to spending upwards of a trillion dollars on the F-35 is to manufacture more of the smaller, cheaper, proven fighters such as the F-18 or indeed the F-15. Keeping our current squadrons operable is less of an issue if we can build more aircraft at lower cost.

Comment Kilby & Noyce (Score 1, Informative) 76

While Kilby's chip with its bond-wire interconnect came first, it's notable that Noyce's concept at Fairchild, which used Hoerni's planar technology with all interconnect fabricated by the same photolithography as the devices, is pretty much how we do it today. Kilby's approach was a technological dead end.

Comment Re:It's the fundamentally wrong approach (Score 2) 47

"Like the brain" is a fundamentally wrong-headed approach in my opinion. Biological systems are notoriously inefficient in many ways. Rather than modelling AI systems after the way "the brain" works, I think they should be spending a lot more time talking to philosophers and meditation specialists about how we *think* about things.

What you're suggesting has been the dominant paradigm in AI research for much of the 60-70 odd years the field has existed. Some people have always thought we should model "thinking" processes, and others thought we should model neural networks. At various points one or the other approach has been dominant.

To me it makes no sense to structure a memory system as inefficiently as the brain's, for example, with all its tendency to forgetfulness, omission, and random irrelevant "correlations". It makes far more sense to structure purely synthetic "memories" using database technologies of various kinds.

I have to disagree that it makes no sense to structure a memory system as "inefficiently" as the brain's, because inefficiency can mean different things. The brain is extraordinarily power-efficient, and that is an important consideration.

It's most likely, in my opinion, that we will eventually find a happy medium between what computers do well, computing and storing information exactly, and what humans do well, processing efficiently and making quick associations and correlations.

Sure, biological systems employ some interesting shortcuts in their processing, but always at a sacrifice in accuracy. We should be striving for systems that are *better* than the biological ones, not just similar, but in silicon.

While I don't doubt silicon will be important for the foreseeable future, it does have its limitations, you know.

Comment Simulate a microprocessor. (Score 3, Insightful) 172

When I was in graduate school I had to write a C program to simulate the operation of a small custom microprocessor. It was a truly fascinating experience (and not terribly difficult). You can start with something really simple like a MIPS variant and go from there. I actually had to write several simulators at different levels of abstraction (one only simulated the instruction set, another simulated down to the microcode, etc). Just simulating a small instruction set is a great way to get started.
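To give a flavor of the instruction-set level, here's a minimal sketch in Python of a fetch-decode-execute loop for a made-up MIPS-flavored toy ISA. The opcodes and their encodings are invented for illustration, not taken from any real machine.

    # Minimal instruction-set-level simulator for a made-up MIPS-like toy ISA.
    # The instruction set (LOADI/ADD/SUB/JNZ/HALT) is invented for illustration.

    def run(program):
        regs = [0] * 8          # eight general-purpose registers
        pc = 0                  # program counter
        while True:
            op, *args = program[pc]
            pc += 1
            if op == "LOADI":   # rd <- immediate
                rd, imm = args
                regs[rd] = imm
            elif op == "ADD":   # rd <- rs + rt
                rd, rs, rt = args
                regs[rd] = regs[rs] + regs[rt]
            elif op == "SUB":   # rd <- rs - rt
                rd, rs, rt = args
                regs[rd] = regs[rs] - regs[rt]
            elif op == "JNZ":   # branch to target if rs != 0
                rs, target = args
                if regs[rs] != 0:
                    pc = target
            elif op == "HALT":
                return regs

    # Sum the integers 5 down to 1: r0 = accumulator, r1 = counter, r2 = one.
    program = [
        ("LOADI", 0, 0),
        ("LOADI", 1, 5),
        ("LOADI", 2, 1),
        ("ADD",   0, 0, 1),   # loop: r0 += r1
        ("SUB",   1, 1, 2),   #       r1 -= 1
        ("JNZ",   1, 3),      #       repeat while r1 != 0
        ("HALT",),
    ]
    print(run(program)[0])    # 15

The whole machine is just a register file, a program counter, and a dispatch on the opcode; everything else in a real simulator is elaboration of that loop.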

The cool part of this kind of project is that it gets you learning so many different things out of necessity. To run assembly code on my C-based microprocessor simulation I had to learn to write assembly-language programs. Then I had to learn how to write an assembler (I did it in C, but if I were doing it today I would use Perl or Python) to generate object code for the simulated microprocessor. Then, to debug the microprocessor, I needed to write a disassembler, and so on.
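A two-pass assembler for a toy ISA like the one sketched above can be surprisingly small in Python. The source syntax here is invented to match that made-up instruction set: pass 1 records label addresses, pass 2 emits tuples the simulator sketch can execute directly.

    # Tiny two-pass assembler sketch for the toy ISA above (syntax invented).

    def assemble(source):
        # Strip ';' comments and blank lines.
        lines = [ln.split(";")[0].strip() for ln in source.splitlines()]
        lines = [ln for ln in lines if ln]

        labels, instrs = {}, []
        for ln in lines:                        # pass 1: collect label addresses
            if ln.endswith(":"):
                labels[ln[:-1]] = len(instrs)   # label points at next instruction
            else:
                instrs.append(ln)

        program = []
        for ln in instrs:                       # pass 2: encode operands
            op, *ops = ln.replace(",", " ").split()
            decoded = []
            for tok in ops:
                if tok in labels:
                    decoded.append(labels[tok])     # branch target
                elif tok.startswith("r"):
                    decoded.append(int(tok[1:]))    # register number
                else:
                    decoded.append(int(tok))        # immediate value
            program.append((op.upper(), *decoded))
        return program

    source = """
        loadi r0, 0        ; accumulator
        loadi r1, 5        ; counter
        loadi r2, 1        ; constant one
    loop:
        add   r0, r0, r1
        sub   r1, r1, r2
        jnz   r1, loop
        halt
    """
    print(assemble(source))

A disassembler is essentially the second pass run backwards, which is why the two tools tend to get written together.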

The microprocessor was microcoded, so I also got to learn how to write microcode to verify fine details of its behavior. I got some great insight into computer arithmetic and really enjoyed it.

I can't tell you what a cool experience it is to see simple assembly code you wrote run on a microprocessor simulation you also wrote. This can lead to getting involved in emulation, though I didn't go that route. I'm in the chip design business now, so I write simulations and models of all kinds of analog and digital circuits, and it is a blast.

Comment Re:No mysteries solvable within a lifetime (Score 4, Insightful) 292

I think you can demolish his argument that the Nobel lag indicates science is slowing down much more easily than that.

Think of the Nobel Prize as an asynchronous FIFO. Every time a Nobel-worthy discovery is made, it gets pushed into the FIFO. Each year the Nobel committee awards a prize, popping one discovery off the FIFO.

What if science is speeding up? Then discoveries enter the FIFO faster than the one-per-year prizes can drain it. The FIFO gets longer, and the time between discovery and prize grows.

What if science is slowing down? Then the consumption rate exceeds the generation rate and the FIFO empties. Eventually a scientist would win the prize the same year the discovery is made.

I don't understand this guy's logic. It seems more parsimonious to me that there are so many great discoveries for the Nobel committee to choose from that they are starting to queue up.

So, I think his data indicate science is speeding up.
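A quick queueing simulation makes the point. The discovery rates and the 200-year horizon below are invented parameters; the only fixed fact is the one-prize-per-year drain.

    # Toy queue model of the Nobel "FIFO" argument (parameters invented).
    # Discoveries arrive at `discovery_rate` per year; the committee awards
    # one prize per year, always to the oldest discovery still in the queue.

    from collections import deque

    def average_lag(discovery_rate, years=200):
        """Average years between discovery and prize over the simulation."""
        fifo, lags = deque(), []
        pending = 0.0
        for year in range(years):
            pending += discovery_rate          # fractional arrivals accumulate
            while pending >= 1.0:
                fifo.append(year)              # enqueue a discovery
                pending -= 1.0
            if fifo:                           # one award per year
                lags.append(year - fifo.popleft())
        return sum(lags) / len(lags)

    print(average_lag(0.8))   # science "slowing": queue drains, lag stays near 0
    print(average_lag(1.5))   # science "speeding up": lag grows with the backlog

A growing discovery-to-prize lag falls out of the arrival rate exceeding the award rate, which is exactly the speeding-up case.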
