Old Scientist :: Robot

Recent work reported in Scientific American brings together several disparate fields (brain research, robotics and communication technology) in a very interesting and exciting way. The work is led by a group at Duke University in Durham, North Carolina, in the US, and includes colleagues in Europe, Japan and South America.

Reading electrical signals from the brain has a long history, and the electroencephalograph (EEG) has been a research and diagnostic tool for many decades.

The EEG reads signals from transducers taped to the scalp. In the 1960s researchers started reading internal signals from electrodes implanted inside the brain itself. This work culminated in the production of a cubic sensor carrying 1,000 microwires, which can read the output of several thousand surrounding neurons. The cubes are small and biocompatible enough that several can be embedded in different parts of a monkey’s brain.

Neurochips have been developed which take the output from the cubes and convert and rebroadcast the neuronal data as a 128-channel data stream.
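To give a rough flavour of what such a stream might look like, here is a sketch in Python. Only the 128-channel figure comes from the article; the frame layout, field sizes and byte order are my own assumptions for illustration, not the actual Duke hardware protocol.

```python
import struct
import time

# Illustrative only: a made-up frame layout for a 128-channel neural
# data stream. The channel count comes from the article; field names,
# sizes and byte order are assumptions for this sketch.
NUM_CHANNELS = 128

def pack_frame(timestamp_us, spike_counts):
    """Pack one time bin of per-channel spike counts into a binary frame."""
    assert len(spike_counts) == NUM_CHANNELS
    # 8-byte little-endian timestamp, then one unsigned byte per channel.
    return struct.pack(f"<Q{NUM_CHANNELS}B", timestamp_us, *spike_counts)

def unpack_frame(frame):
    """Inverse of pack_frame: recover the timestamp and channel counts."""
    fields = struct.unpack(f"<Q{NUM_CHANNELS}B", frame)
    return fields[0], fields[1:]

# One frame of (empty) activity stamped with the current time.
frame = pack_frame(int(time.time() * 1e6), [0] * NUM_CHANNELS)
```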

The data streams are analysed by software to produce motor commands (presumably random at first).
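The article doesn’t describe the decoding algorithm, but a common baseline in this field is a linear decoder fitted by least squares. The sketch below shows that general idea with made-up data; it is not the group’s actual software.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: firing rates (time bins x 128 channels)
# recorded while the true 2-D hand velocity was known.
rates = rng.poisson(lam=5.0, size=(1000, 128)).astype(float)
velocity = rng.normal(size=(1000, 2))

# Least-squares fit of weights W so that rates @ W approximates velocity.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

def decode(rate_bin):
    """Map one bin of firing rates to a predicted (vx, vy) motor command."""
    return rate_bin @ W

command = decode(rates[0])  # a 2-vector: the decoded cursor velocity
```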

When these motor commands were coupled to an on-screen avatar, the monkeys eventually learned to control it, making it move and grasp objects. Further programming was able to return tactile signals to the monkeys, so they could ‘feel’ objects the avatar was touching. This I find amazing, as the monkeys had no real incentive to carry out what must have been an intensive learning process.
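Putting the pieces together, the closed loop looks something like the toy below: decoded commands move the avatar, and contact with an object triggers a feedback signal back to the animal. In the published experiments that signal went back into the brain itself (by stimulating somatosensory cortex); here it is just a hypothetical placeholder function.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(scale=0.01, size=(128, 2))  # stand-in decoder weights

def decode(rate_bin):
    """Stand-in linear decoder: firing rates -> (vx, vy) command."""
    return rate_bin @ W

def tactile_feedback(strength):
    """Hypothetical placeholder for the return path to the brain."""
    print(f"tactile feedback, strength={strength:.2f}")

position = np.zeros(2)          # avatar position on screen
target = np.array([5.0, 3.0])   # position of a touchable object
TOUCH_RADIUS, DT = 0.5, 0.1     # contact distance, seconds per bin

for _ in range(50):                              # 50 time bins
    rate_bin = rng.poisson(5.0, size=128).astype(float)
    position += decode(rate_bin) * DT            # command moves the avatar
    if np.linalg.norm(position - target) < TOUCH_RADIUS:
        tactile_feedback(1.0)                    # the monkey 'feels' it
```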

Similar, although much less detailed, work has been demonstrated with humans using EEG output.

Robotics, meanwhile, was progressing, developing both the mechanics of articulated limbs and the data protocols to control them, culminating in manufacturing robots and robots that simulate human movement.

The monkeys moved on to learning to control walking robots.

In an astonishing demonstration, a rhesus monkey named Idoya was trained to walk upright on a treadmill in Durham. Her brain output was transmitted to a walking robot, CB1, in Kyoto, Japan, which then followed Idoya’s movements in the US. A live video feed of the robot was provided to Idoya, and she continued to control CB1 even after she stopped walking on the treadmill.
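The article gives no detail on the Durham-to-Kyoto link, so the following is purely illustrative of the shape of the idea: decoded gait commands serialised and streamed to a remote robot controller. The address and message fields are invented.

```python
import json
import socket
import time

# Stand-in address for the remote robot controller; the real endpoint
# and wire format are not described in the article.
ROBOT_ADDR = ("127.0.0.1", 9000)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_gait_command(phase, speed):
    """Serialise one decoded gait command and send it to the robot."""
    packet = json.dumps({
        "t": time.time(),   # timestamp, so stale packets can be dropped
        "phase": phase,     # position within the stride cycle (0..1)
        "speed": speed,     # decoded walking speed
    }).encode()
    sock.sendto(packet, ROBOT_ADDR)

send_gait_command(phase=0.25, speed=1.2)
```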

Parallel development of prostheses has been able to incorporate joints and controls developed by robotics. This has led to a further combination of technologies, in which an exoskeleton is being produced to support and/or supplement human limbs.

This work is being coordinated by the Walk Again consortium, which aims to use a neurally controlled exoskeleton to enable paralysed people to walk, and which could also help with the motor problems of Parkinson’s disease.

The group has filed for permission in Brazil to use this technology by implanting the electronics in a human brain, the aim being to have a paralysed teenager deliver the first kick to open the 2014 World Cup. (It will probably be the most interesting incident in the entire competition.)

References
http://www.scientificamerican.com/article.cfm?id=artificial-limbs-controlled-by-thought
http://walkagainproject.org/
