Comment: Re:In other words... (Score 4, Insightful) 315

by ganv (#49779165) Attached to: Why PowerPoint Should Be Banned
Indeed, presentation tools can't compensate for poor skills in creating or giving presentations. Do people remember before PowerPoint? At the scientific conferences I attended, as often as not people were throwing unreadable transparencies onto the projector at a rate significantly faster than anyone in the audience except their collaborators could comprehend the concepts. Now they just flip through readable but incomprehensible PowerPoint slides. It's the humans you have to fix, not the technology.

Comment: Re:This again? (Score 1) 480

by ganv (#49597901) Attached to: New Test Supports NASA's Controversial EM Drive
It is very hard to believe that they are going to send a propulsion system into space without a clear understanding of how it works. They claim that they have a device that violates very basic physics. They shouldn't be thinking about space flight at all. They should be asking the best experimentalists in RF cavities to collaborate with them to win the Nobel Prize that will be given to anyone who demonstrates a human-scale object violating the known laws of physics. It would easily be the most important discovery of the last 50 years. But for that same reason, it is 99.9% likely to be a misinterpretation of their experimental results.

Comment: Re:With the best will in the world... (Score 1) 486

by ganv (#49560791) Attached to: Audi Creates "Fuel of the Future" Using Just Carbon Dioxide and Water
Yes, the original article combines three quite different processes: electrolysis, CO2 + H2 -> Diesel synthesis, and CO2 capture. If you look at the full-cycle efficiency including extracting the CO2 from the air, then the efficiency is likely very low. Electrolysis can be pretty efficient; I think it is straightforward to achieve 70% efficiency. Maybe the CO2 + H2 -> Diesel process can be made 50% efficient. Afidel above says 50-60% max, and that seems optimistic but not obviously wrong. But if you also have to extract the CO2 from the air, then your efficiency is going to be much lower. If this process is commercialized, I suspect it will use high-concentration CO2 from power plant emissions.

If you have to extract CO2 from the air, I would guess you would be happy to get 10% of the input electrical energy back as heat by burning the Diesel fuel, and of course the diesel engine will only be 30% or 40% efficient or so. On those (very rough) estimates, a vehicle driven on this fuel would require something like 25 times more renewable energy than the same vehicle powered directly by electricity.

That still may be useful. Sometimes there is excess renewable energy, and this could be a way to put some of it into long-term storage. And if you are on a nuclear aircraft carrier, you may be quite happy to obtain jet fuel from electricity even if the efficiency is low.
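The back-of-envelope chain above can be sketched in a few lines. Every figure here is a rough guess taken straight from the comment, not a measured value, and the `chain` helper is just for illustration:

```python
def chain(*stages):
    """End-to-end efficiency of a sequence of energy-conversion steps."""
    eff = 1.0
    for s in stages:
        eff *= s
    return eff

# Guessed efficiencies from the comment above
fuel_from_air = 0.10   # electricity -> diesel heat, with CO2 from ambient air
diesel_engine = 0.35   # typical 30-40% engine efficiency
ev_drivetrain = 0.90   # grid electricity -> wheels, assumed for comparison

wheel_eff_fuel = chain(fuel_from_air, diesel_engine)   # ~0.035
penalty = ev_drivetrain / wheel_eff_fuel               # ~26x
print(f"fuel path delivers {wheel_eff_fuel:.1%} of input electricity; "
      f"roughly {penalty:.0f}x more renewable energy needed than a direct EV")
```

That ~26x matches the "something like 25 times" estimate in the comment; small changes to any guessed stage efficiency move it substantially.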

Comment: Re:recent breakthrough. (Score 1) 197

by ganv (#49530639) Attached to: Concerns of an Artificial Intelligence Pioneer
No progress, you say? Seems like computers that drive cars and win at chess and Jeopardy are clearly progress. I guess if you define progress as the creation of a human-level intelligence, then there will not be any progress until one arrives. But that is a useless notion of progress. I guess you are really arguing that we have a long way to go, and I would agree. Do you have any evidence for the claim that the human brain is close to the maximum interconnect possible in this universe? The name-calling at the end suggests I probably should not bother to reply; we really do need to learn to be more civil online. It is a serious philosophical question whether you can distinguish between intelligence and the achievement of the complex goals that humans use intelligence to accomplish ('getting the job done'). Many of us think this distinction cannot be clearly made in the real world.

Comment: Re:recent breakthrough. (Score 1) 197

by ganv (#49524827) Attached to: Concerns of an Artificial Intelligence Pioneer
There has been dramatic progress. Call it a breakthrough or not, as you like, but 15 years ago we were struggling to get basic voice recognition to work without elaborate training and controlled environments, and now we all use it many times a day. Similarly, machine translation between human languages has made dramatic progress in the past decade. Many of us suspect that your distinction between machines that get the job done and machines that are sentient may not be a substantial distinction.

I haven't heard the serious AI researchers I know promising strong AI over the past couple of decades. It is easy to heap scorn on the early optimists who thought they were going to build a super-intellect with LISP and 1960s-era hardware. But it is much harder to explain to the phone call center workers, paralegals, and even surgeons that their jobs are being taken by machines that don't have intelligence. And we have only just begun to learn how to build machines that integrate a wide range of sensor data with large databases to do tasks like drive a car.

We are a long way from the kind of intelligence displayed by humans. But consider where we were 200 years ago, and it is pretty hard to argue that human-level artificial intelligence is not possible in the next 200 years. And if it is possible, then super-human intelligence will follow very quickly. Maybe they will be very different from human intelligence, but if they outcompete us, then they are better, even if they are different. And if you think that super-human intelligence 200 years from now is not something to be cautious about, then I suggest contemplating what humans will choose to do when they feel obsolete and out of control of their future.

Comment: Re:Way too many humanities majors (Score 3, Insightful) 397

by ganv (#49380049) Attached to: Why America's Obsession With STEM Education Is Dangerous
I think you are right that current trends are devaluing the STEM majors. There is a big push to make these majors less 'elitist', which is code for requiring less foundational mastery of basic math and science. We really need a way to advocate for attracting underrepresented groups into STEM that does not involve changing the preparation standards required.

Comment: Re:Way too many humanities majors (Score 5, Insightful) 397

by ganv (#49379961) Attached to: Why America's Obsession With STEM Education Is Dangerous
Here is a quote from the Zakaria article to think about: 'Critical thinking is, in the end, the only way to protect American jobs.' His implication is that the humanities are a bastion of critical thinking. But when an introductory student is asked to do actual critical thinking where they might be wrong (e.g., in introductory engineering, science, and math courses), they often conclude that they would rather go to the arts or humanities, where the requirements of critical thinking are not as high.

The fundamental idea is right...that it is understanding of the human condition that will be the biggest growth area in the next few decades. But he is wrong that this is an argument for training more students in the current curricula of anthropology or classics. The future belongs to people who can take the serious critical thinking characteristic of math, science, and engineering curricula and apply it in complex situations where technical details and human behavior are both important.

Comment: Re:The fallacy of labels (Score 1) 320

Yes, much of the problem lies in the difficulty of conceiving of the scientific enterprise. We inherited our labels from an era when science was just emerging as a human endeavor, and that was also a time of Enlightenment optimism about the ability of human rationality to attain reliable truth free from spin and political agendas. In our era the pendulum has swung to the opposite extreme, where all ideas are assumed to be deployed in pursuit of some political agenda or other. Reality is a very subtle combination of both. But anything simplified enough to be used in the media has to be black or white. So one set of stories digs out political agendas in scientific work and calls them scandalous, and another glorifies how close we are to a 'theory of everything'. The idea that a messy human scientific process might be able to achieve a patchwork of mostly self-consistent models of how most of our corner of the universe works is beyond description by the labels the media has available at present.

Comment: Color means many things (Score 2) 420

by ganv (#49153205) Attached to: Is That Dress White and Gold Or Blue and Black?
This reminds me of the great entries from the competition to explain 'What is Color': http://www.centerforcommunicat...

By the way, I see white/lavender and brown. It would be very interesting to know what lighting/image manipulation was done to get those colors out of a dark blue and black dress.

Comment: Is NIH unique here? (Score 1) 153

by ganv (#48778295) Attached to: Fewer Grants For Young Researchers Causing Brain Drain In Academia
In the physics and engineering proposals I have reviewed, it seems that young researchers still get a significant preference in the distribution of grants. But there is a problem: the proposals from young researchers are often much weaker. It is really hard to write a great grant proposal, and new faculty members usually struggle long and hard to get good at it. You have to have great ideas, preliminary work, and a great presentation. And you have to know how to market your ideas to the diverse set of people who will be reviewing the proposal. Maybe 30 years ago people could get research grants just by describing some potentially interesting research, but in the current environment you have to write a proposal that is better than 80 or 90% of the others, and that is hard for young people to do.

Maybe the bias toward younger researchers should be stronger. But I don't think it helps them to set a low bar and then watch them fail to get their grants renewed. I would recommend that grant agencies more aggressively limit the number of grants that can be accumulated by the big names. No one can effectively mentor 5 post-docs and 10 graduate students, and letting them suck up all the funding just because they are able to spit out a large number of strong proposals limits the number of new researchers who can be funded.

Comment: Re:A Word to the young bright kids out there (Score 2, Interesting) 153

by ganv (#48778101) Attached to: Fewer Grants For Young Researchers Causing Brain Drain In Academia
That advice makes sense, but it will be very hard to implement. Research grants pay graduate student stipends. I am not sure it is a subsidy; it is the way research work is paid for. The problem is that the work is done for depressed wages: the typical accomplishments of a grad student are much, much larger than you could get from a non-degree-seeking researcher offered a similar salary. The same goes for post-docs: they are paid less in the hope that they are preparing for a step up to a permanent position soon. So implementing your system is going to make research much more expensive to perform.

If there are fewer graduate students doing research, then research becomes even more winner-take-all, because only the top professors will be able to support graduate students and maintain active research programs. That means even fewer faculty positions (without research funding, universities hire fewer faculty who teach more, rather than more faculty who are also doing research).

I think a better fix is to adjust graduate programs so that they focus not on creating future researchers but on creating experts prepared for a wide range of technical jobs outside research. Research would become a smaller part of these programs, and only the top few students who wanted to pursue a research career would continue on to post-doctoral research.

Comment: Re:Why is he worried (Score 1) 583

by ganv (#48247209) Attached to: Elon Musk Warns Against Unleashing Artificial Intelligence "Demon"

I think Elon sees something that most of you do not. Artificial intelligence is not like anything else. We know very, very little about the kinds of intelligence that are possible. But if it is possible to build AI that is smarter and more capable than us, then it will by definition be better than us at building the next generation of itself. And at that point, humans are permanently obsolete, because we have no rapid methods for upgrading ourselves. It has nothing to do with who is 'using the AI' or 'who is doing the prescription'. There will be no person and no human moral intuitions in the loop at all. The intelligence that supersedes us will be doing what it wants to do. We'll be like the fish who debate how to control their bipedal relatives who have decided to start overfishing the oceans. It is simply out of their control. And if that doesn't scare you, then you don't understand.

We don't know whether or not artificial intelligence is possible. But it seems like a very reasonable possibility sometime in the next few centuries. And we know so little about intelligence that we have very little idea about whether it will share anything like the moral intuitions that undergird human society. Many of us suspect those evolved for survival in hunter-gatherer tribes and AI will evolve a very different set of criteria upon which it makes its choices.

Comment: Re:Wrong distance away (Score 3, Informative) 23

by ganv (#48208985) Attached to: Two Exocomet Families Found Around Baby Star System
That error jumped out at me also. It's like describing a city 93 miles away and instead saying it is 93 million miles away, which, instead of being a 1.5-hour drive, is the distance to the Sun. It is really useful to get a cosmic distance scale in your head: billions of light years is the size of the visible universe, millions of light years are distances to nearby galaxies, 30,000 light years is the distance to the center of our galaxy, and 4 light years is the distance to the nearest stars.
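The city analogy can be made concrete in a couple of lines; the 65 mph highway speed is my own assumption for illustration:

```python
MILES_PER_AU = 93_000_000   # Earth-Sun distance, ~93 million miles

city = 93                              # a nearby city, in miles
error_factor = MILES_PER_AU / city     # the same factor-of-a-million slip
drive_hours = city / 65                # ~1.4 hours at highway speed

print(f"off by a factor of {error_factor:,.0f}: "
      f"a {drive_hours:.1f}-hour drive vs. the distance to the Sun")
```

The same factor of a million separates the distance to a nearby star (light years) from the distance quoted in the summary (millions of light years).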

Comment: Re:Amusing (Score 1) 350

by ganv (#48174057) Attached to: The Physics of Why Cold Fusion Isn't Real
We discovered that we are made of atoms only a little over a century ago, and our ignorance is vast. But we should also be careful not to err on the side of blindly assuming that anything is possible. It is essential to think clearly about what might and might not be possible, based on what we know now, in order to direct our investigations. Will we discover new laws of physics that are relevant to releasing energy from nuclear reactions? I suspect the answer is probably no, and the reason is the high precision our current theories achieve in describing the behavior of atoms and nuclei. Careful measurements of nuclear excitation energies, fusion cross sections, etc. agree with theoretical calculations, often to many significant digits. There just isn't much place to hide new physics in this energy range. Of course new fundamental discoveries (dark matter, etc.) are very likely; they are just unlikely to change our predictions for nuclear phenomena by a quantitatively significant amount. Would it be better to stay open minded because one can never be sure? (See http://www.preposterousunivers...) Or is it better to make audacious, falsifiable hypotheses, such as the hypothesis that we already know the laws underlying the physics of everyday life?

That doesn't tell us much about how to engineer processes that obey the known laws of physics. Predicting what humans will be able to do is very, very difficult...and people regularly get it badly wrong, being both too optimistic and too pessimistic. In my mind, good hypotheses based on careful consideration of the best evidence are never premature. They just might be wrong.
