Honda Robot Controlled By Brain Waves 137

Dotnaught writes "Honda researchers have developed a way to control robots using human brain waves. Using brain signals read from a person in a magnetic resonance imaging scanner, a robotic hand mirrored the movement of the human controller, spreading its fingers and making a 'V' sign."
  • by spun ( 1352 ) <loverevolutionary@@@yahoo...com> on Thursday May 25, 2006 @02:47PM (#15403840) Journal
    Here ya go. [theonion.com]
  • hmmmm (Score:5, Informative)

    by venicebeach ( 702856 ) on Thursday May 25, 2006 @02:53PM (#15403890) Homepage Journal
    I've done some fMRI of motor movements... All these movements, the fist, the V-sign, would activate the hand area, premotor cortex, and some parietal areas... I am very skeptical that you could tell the difference between them. But if they can that is very impressive, especially to do it in real time...

    By the way, MRI does not measure "brain waves". It measures blood oxygenation changes, which are related to the firing of neurons.
  • by PIPBoy3000 ( 619296 ) on Thursday May 25, 2006 @03:05PM (#15404025)
    You have to remember how fMRI works [wikipedia.org]:
    It has been known for over 100 years (Roy and Sherrington 1890) that changes in blood flow and blood oxygenation in the brain (collectively known as hemodynamics) are closely linked to neural activity. When nerve cells are active they consume oxygen carried by hemoglobin in red blood cells from local capillaries. The local response to this oxygen utilisation is an increase in blood flow to regions of increased neural activity, occurring after a delay of approximately 1-5 seconds.
    Add in some computing time to process the image and you've got your latency right there.
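A back-of-the-envelope sketch of that latency argument. Only the 1-5 second hemodynamic delay comes from the quoted passage; the scan repetition time and processing time below are illustrative placeholder values, not figures from the article:

```python
# Rough end-to-end latency for an fMRI-driven control loop.
# Only HEMODYNAMIC_DELAY_S is from the quoted Wikipedia passage;
# the other two numbers are hypothetical placeholders.
HEMODYNAMIC_DELAY_S = (1.0, 5.0)  # BOLD response lags neural activity by ~1-5 s
SCAN_TR_S = 2.0                   # assumed fMRI repetition time (one volume)
PROCESSING_S = 1.0                # assumed image-analysis/classification time

latency_lo = HEMODYNAMIC_DELAY_S[0] + SCAN_TR_S + PROCESSING_S
latency_hi = HEMODYNAMIC_DELAY_S[1] + SCAN_TR_S + PROCESSING_S
print(f"rough end-to-end latency: {latency_lo:.0f}-{latency_hi:.0f} s")
```

Even with optimistic assumptions, the blood-flow lag alone keeps this kind of interface seconds behind the user's intent.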
  • More Links (Score:3, Informative)

    by vertinox ( 846076 ) on Thursday May 25, 2006 @03:07PM (#15404045)
    As per the discussion on Digg [digg.com] here is a video of the robot in action with the MRI:

    http://www.newlaunches.com/archives/honda_develops_bmi_robot_hand.php [newlaunches.com]

    And all the other links that were related:

    http://www.engadget.com/2006/05/24/hondas-asimo-gets-mind-control-interface/ [engadget.com]

    http://www.japancorp.net/Article.Asp?Art_ID=12565 [japancorp.net]

    The Japancorp article has more information than either the Engadget or the Yahoo coverage.
  • Re:universality? (Score:3, Informative)

    by FleaPlus ( 6935 ) on Thursday May 25, 2006 @04:22PM (#15404706) Journal
    For gross things, it can be quite obvious what the person is doing. I can tell by looking at the activations in your brain if you are looking at something versus hearing something. But looking at a duck versus looking at a cow? Much harder. Making a V-sign versus making a fist? I've never seen a paper where someone reported being able to do this. It is theoretically possible, but difficult with a blurry MRI signal that aggregates over populations of neurons.

    I think this research is a follow-up to a study Kamitani & Tong published last year in Nature Neuroscience, where they decoded the orientation of edges a subject was looking at. Here's the abstract:

    Decoding the visual and subjective contents of the human brain

    The potential for human neuroimaging to read out the detailed contents of a person's mental state has yet to be fully explored. We investigated whether the perception of edge orientation, a fundamental visual feature, can be decoded from human brain activity measured with functional magnetic resonance imaging (fMRI). Using statistical algorithms to classify brain states, we found that ensemble fMRI signals in early visual areas could reliably predict on individual trials which of eight stimulus orientations the subject was seeing. Moreover, when subjects had to attend to one of two overlapping orthogonal gratings, feature-based attention strongly biased ensemble activity toward the attended orientation. These results demonstrate that fMRI activity patterns in early visual areas, including primary visual cortex (V1), contain detailed orientation information that can reliably predict subjective perception. Our approach provides a framework for the readout of fine-tuned representations in the human brain and their subjective contents.
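The "statistical algorithms to classify brain states" in that abstract can be illustrated with a toy decoder. This is a hypothetical sketch on synthetic data, not the paper's actual method (which used more sophisticated linear ensemble decoding over real voxel responses): each simulated trial is a noisy voxel activity pattern, and a nearest-centroid rule assigns it to one of two "brain states."

```python
# Toy version of multi-voxel pattern classification: decode which of two
# simulated "brain states" (e.g. two stimulus orientations) produced a
# noisy voxel activity pattern. All data here are synthetic.
import random

random.seed(0)
N_VOXELS = 50

def make_pattern(template, noise=0.5):
    """One simulated fMRI trial: template activity plus Gaussian noise."""
    return [v + random.gauss(0, noise) for v in template]

# Two hypothetical states, each with its own characteristic voxel template.
template_a = [random.random() for _ in range(N_VOXELS)]
template_b = [random.random() for _ in range(N_VOXELS)]

def centroid(trials):
    """Voxel-wise mean across training trials."""
    return [sum(vals) / len(vals) for vals in zip(*trials)]

# "Training": average 20 noisy trials per state.
centroid_a = centroid([make_pattern(template_a) for _ in range(20)])
centroid_b = centroid([make_pattern(template_b) for _ in range(20)])

def decode(trial):
    """Classify a trial by squared Euclidean distance to each centroid."""
    def dist(c):
        return sum((x - y) ** 2 for x, y in zip(trial, c))
    return "A" if dist(centroid_a) < dist(centroid_b) else "B"

# Evaluate on fresh noisy trials.
correct = sum(decode(make_pattern(template_a)) == "A" for _ in range(50))
correct += sum(decode(make_pattern(template_b)) == "B" for _ in range(50))
accuracy = correct / 100
print(f"decoding accuracy on synthetic trials: {accuracy:.0%}")
```

The point of the toy example is the one venicebeach raised above: decoding works when the two states have reliably different activation patterns across the voxel ensemble, and gets hard when the patterns overlap (V-sign versus fist) relative to the noise.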
