
Comment Re:conflict (Score 1) 78

Indeed. I also find it strange that Matt is so adamant that Tesla was shafted by modern memory, when the very unit of magnetic field strength is the tesla! How many people get units of measurement named after them? Why did Musk name his car company Tesla if nobody had ever heard of him? Why did a heavy metal band name themselves Tesla and use the electricity metaphor in their marketing? There are researchers, such as Steinmetz, Heaviside, and Shannon, who probably contributed even more to the development of the modern world, yet they are far more obscure to the general public than Tesla.

Comment Re:What difference now does it make? :) Sunk costs (Score 4, Insightful) 364

You seem to misunderstand what "sunk cost" means. You're using the phrase as an argument to keep funding the project because "we can't reverse time and get the money back." In fact, the standard treatment of sunk costs is the opposite of your use: generally only future costs should be relevant to an investment decision, otherwise you run the danger of "throwing good money after bad." There is a lot of evidence that continued funding of the F-35 is in fact throwing good money after bad.

You also present a false dichotomy. One alternative to spending upwards of a trillion dollars on the F-35 is to manufacture more of the smaller, cheaper, proven fighters such as the F-18 or indeed the F-15. Keeping our current squadrons operable is less of an issue if we build more aircraft at lower cost.

Comment Kilby & Noyce (Score 1, Informative) 76

While Kilby's chip with its bond-wire interconnect was first, it's interesting that Noyce's concept at Fairchild, which used Hoerni's planar technology with all interconnect fabricated by the same photolithography as the devices, is pretty much how we do it today. Kilby's concept was a technological dead end.

Comment Re:It's the fundamentally wrong approach (Score 2) 47

"Like the brain" is a fundamentally wrong-headed approach in my opinion. Biological systems are notoriously inefficient in many ways. Rather than modelling AI systems after the way "the brain" works, I think they should be spending a lot more time talking to philosophers and meditation specialists about how we *think* about things.

What you're suggesting has been the dominant paradigm for most of the 60-70-odd years of AI research. Some people have always thought we should model "thinking" processes, and others thought we should model neural networks. At various points one model or the other has been dominant.

To me it makes no sense to structure a memory system as inefficiently as the brain's, for example, with all its tendency toward forgetfulness, omission, and random irrelevant "correlations". It makes far more sense to structure purely synthetic "memories" using database technologies of various kinds.

I have to disagree that it makes no sense to structure a memory system as "inefficiently" as the brain's, because inefficiency can mean different things. The brain is extraordinarily power efficient, and that is an important consideration.

It's most likely, in my opinion, that we will eventually find a happy medium between the things computers do well, like computing and storing information exactly, and the things humans do well, like processing efficiently and making associations and correlations quickly.

Sure, biological systems employ some interesting shortcuts in their processing, but always at a sacrifice in accuracy. We should be striving for systems that are *better* than the biological ones, not just similar ones implemented in silicon.

While I don't doubt silicon will be important for the foreseeable future, it does have limitations, you know.

Comment Simulate a microprocessor. (Score 3, Insightful) 172

When I was in graduate school I had to write a C program to simulate the operation of a small custom microprocessor. It was a truly fascinating experience (and not terribly difficult). You can start with something really simple like a MIPS variant and go from there. I actually had to write several simulators at different levels of abstraction (one simulated only the instruction set, another simulated down to the microcode, and so on). Just simulating a small instruction set is a great way to get started.
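
To give a flavor of what "simulating a small instruction set" means, here is a minimal sketch in C of a fetch/decode/execute loop. The three opcodes and the 16-bit encoding are invented purely for illustration; they are not the MIPS encoding or the machine I actually simulated.

    /* Toy instruction-set simulator sketch: a hypothetical 3-opcode machine.
       16-bit instructions: 4-bit opcode, 4-bit rd, then either an 8-bit
       immediate or 4-bit rs/rt register fields. */
    #include <stdio.h>
    #include <stdint.h>

    enum { OP_LOADI = 0, OP_ADD = 1, OP_HALT = 2 };   /* invented opcodes */

    int main(void) {
        uint16_t prog[] = {                             /* hand-assembled demo */
            (OP_LOADI << 12) | (1 << 8) | 5,            /* r1 = 5              */
            (OP_LOADI << 12) | (2 << 8) | 7,            /* r2 = 7              */
            (OP_ADD   << 12) | (3 << 8) | (1 << 4) | 2, /* r3 = r1 + r2        */
            (OP_HALT  << 12)
        };
        uint16_t reg[16] = {0};
        unsigned pc = 0;

        for (;;) {                                    /* fetch/decode/execute  */
            uint16_t ins = prog[pc++];
            unsigned op = ins >> 12, rd = (ins >> 8) & 0xF;
            unsigned rs = (ins >> 4) & 0xF;
            if (op == OP_LOADI)      reg[rd] = ins & 0xFF;       /* immediate  */
            else if (op == OP_ADD)   reg[rd] = reg[rs] + reg[ins & 0xF];
            else break;                               /* OP_HALT stops the run */
        }
        printf("r3 = %u\n", reg[3]);                  /* prints 12             */
        return 0;
    }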

The cool part of this kind of project is that it gets you learning so many different things out of necessity. To run assembly code on my C-based microprocessor simulation I had to learn to write assembly language programs. Then I had to learn how to write an assembler (I did it in C, but if I were doing it today I would use Perl or Python) to generate object code for my microprocessor simulation. Then to debug the microprocessor I needed to write a disassembler, and so on.
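
The assembler for a toy machine doesn't need to be much bigger than the simulator. Here is a rough sketch in C that emits machine words for the invented encoding above; a real assembler would also need labels, a symbol table, and error handling.

    /* Toy assembler sketch for the invented encoding above: reads lines like
       "loadi r1, 5", "add r3, r1, r2", or "halt" and prints hex words. */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char line[128];
        unsigned a, b, c;
        while (fgets(line, sizeof line, stdin)) {
            if (sscanf(line, "loadi r%u, %u", &a, &b) == 2)
                printf("%04x\n", (0u << 12) | (a << 8) | (b & 0xFF));
            else if (sscanf(line, "add r%u, r%u, r%u", &a, &b, &c) == 3)
                printf("%04x\n", (1u << 12) | (a << 8) | (b << 4) | c);
            else if (strncmp(line, "halt", 4) == 0)
                printf("%04x\n", 2u << 12);
        }
        return 0;
    }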

The microprocessor was microcoded, so I also got to learn how to write microcode to verify the fine details of the design. I got some great insight into computer arithmetic and really enjoyed it.

I can't tell you what a cool experience it is to see simple assembly code you wrote run on a microprocessor simulation you also wrote. This can lead into emulation work, though I didn't go that way. I'm in the chip design business now, so I write simulations and models of all kinds of analog and digital circuits, and it is a blast.

Comment Re:No mysteries solvable within a lifetime (Score 4, Insightful) 292

I think you can demolish his argument that Nobel lag is indicative of science slowing down much more easily than that.

Think of the Nobel prize as an asynchronous FIFO. Every time a Nobel-worthy discovery is made, it gets put in the FIFO. Each year the Nobel committee awards a prize and removes one entry from the FIFO.

What if science is speeding up? Then discoveries are put into the FIFO faster than the annual prizes can drain it. So the FIFO gets longer, and the length of time between discovery and prize gets longer.

What if science is slowing down? Then the consumption rate is larger than the generation rate and the FIFO empties. Eventually a scientist would win a prize the same year the discovery is made.

I don't understand this guy's logic. It seems to me more parsimonious to conclude that there are so many great discoveries for the Nobel committee to choose from that they are starting to queue up.

So, I think his data indicate science is speeding up.
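
You can see the effect with a toy back-of-the-envelope model of the queue; the arrival rate below is made up purely to illustrate the argument, not an estimate.

    /* Toy model of the Nobel "FIFO": Nobel-worthy discoveries arrive at an
       assumed rate, and one prize drains the queue each year. */
    #include <stdio.h>

    int main(void) {
        double arrivals_per_year = 1.5;  /* assumed: science "speeding up"   */
        double backlog = 0.0;            /* discoveries waiting for a prize  */
        for (int year = 1; year <= 50; year++) {
            backlog += arrivals_per_year;        /* new discoveries queued   */
            if (backlog >= 1.0) backlog -= 1.0;  /* one prize awarded        */
            if (year % 10 == 0)                  /* lag keeps growing        */
                printf("year %2d: backlog %4.1f -> lag of ~%2.0f years\n",
                       year, backlog, backlog);
        }
        return 0;
    }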

Comment Re:Punctuated upheaval (Score 1) 292

In my opinion this is a bit like sitting in your backyard with a telescope opining that there are no new planets left to discover in the solar system while people are out paving the way to actually visit them.

We don't yet understand whether there are simple underlying principles in biology as there are in physics. Biology is so much more complex than physics, and we are still in the 19th century...

At some point someone is going to discover the biological equivalent of quantum mechanics and then the world will change again.

This discovery could be a way to harness computation to really get a handle on complexity, or it could be the discovery of the underlying principles themselves.

I can't wait to find out.

Comment what about Neuroscience and structural biology? (Score 1) 292

I looked at the article and the author is focused on advances in physics, where he may actually have a point.

He doesn't seem to be aware of some of the stuff being done in neuroscience, nanotechnology, and structural biology, to name a few.

We've come so far in just a few years in getting insight into the biological and electrical nature of the brain, and the idea of a connectome (one we can, in principle, actually map) is a huge breakthrough that will lead to fantastic new technologies.

When one field plateaus, another explodes. Look up epigenetics and CRISPR and prepare to have your mind blown. To say we are near the end of science is crazy.

Also, this author doesn't seem to know about Occam's Razor. There are many simpler explanations for why Nobel Prizes are taking longer to be awarded than any notion of science slowing down.

Comment Re:Level of public funding ? (Score 5, Insightful) 292

That's not necessarily a bad thing. Science is worthless if we don't use it in practical applications. But if we're looking for reasons why less basic research is getting done, this could play a role.

I think it's a bad thing. Most of our great advancements in consumer electronics, medicine, and computing are based on mining basic research (most of which was publicly funded). When that mine is played out, where will the raw material for new advances come from?
