A new brain-machine interface allows the mind to literally feel the texture of computer-generated objects, a recent paper reports.
The interface not only allows a monkey to remotely control a virtual hand simply by willing it to move; it also routes feedback on textures and vibrations to the somatosensory cortex, where those signals are processed as sensations of touch.
Though mind-controlled robotic hands aren’t exactly breaking news anymore, most of those devices provide only visual feedback – in other words, the users of those robotic hands can’t actually feel the objects the hands touch. One recent project did use vibration feedback to help subjects sense the placement of a cursor, but that’s about as far as the idea has been taken.
But now, as the journal Nature reports, a team led by Duke University’s Miguel Nicolelis has created a brain–machine–brain interface (BMBI) that routes movement impulses from a monkey’s brain directly to a virtual hand, and routes tactile (touch) sensations from that hand directly into touch-processing regions of the monkey’s brain:
Here we report the operation of a brain–machine–brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and allows signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex.
At the risk of sounding repetitive (I can’t help it; I’m so awestruck by this), the BMBI doesn’t involve any robotic hands – the entire interface takes place between the monkey’s brain and a virtual world created within a computer:
Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object.
The computer receives movement commands from 50 to 200 neurons in the monkey’s motor cortex, translating them into a variety of movements for a virtual “avatar” hand (which I’m picturing, of course, as huge and blue and stripey). As the virtual hand feels virtual objects, the system sends electrical signals down wires implanted into the monkey’s somatosensory cortex, where those signals are processed as touch sensations.
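To make that loop concrete, here’s a toy sketch of the two halves described above: decoding motor-cortex activity into avatar movement, and turning each virtual texture into a distinct temporal pattern of stimulation pulses. Everything here is illustrative – the decoder, the pulse frequencies, and all the names are my own stand-ins, not details from the paper (which used far more sophisticated decoding and ICMS patterning).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two halves of the loop.
N_NEURONS = 100          # motor-cortex units feeding the decoder (paper: 50-200)
DECODE_WINDOW = 0.1      # seconds of spiking pooled per position update

# Toy linear decoder: maps a firing-rate vector to a 2-D hand velocity.
decoder_weights = rng.normal(scale=0.01, size=(2, N_NEURONS))

def decode_velocity(firing_rates):
    """Turn one window of motor-cortex firing rates into avatar-hand velocity."""
    return decoder_weights @ firing_rates

# Invented "artificial textures": each virtual object maps to a distinct
# temporal pattern of ICMS pulses (here, just a pulse frequency in Hz).
TEXTURES = {"rewarded_object": 200.0, "unrewarded_object": 50.0}

def icms_pulse_times(texture, duration=0.5):
    """Pulse train delivered to somatosensory cortex while touching an object."""
    freq = TEXTURES[texture]
    return np.arange(0.0, duration, 1.0 / freq)

# One simulated control step: spikes in -> hand moves -> touch -> feedback out.
hand_pos = np.zeros(2)
rates = rng.poisson(lam=20.0, size=N_NEURONS).astype(float)  # fake spike counts
hand_pos += decode_velocity(rates) * DECODE_WINDOW

touching = "rewarded_object"   # pretend the hand just contacted this object
pulses = icms_pulse_times(touching)
print(f"hand at {hand_pos}, delivering {len(pulses)} ICMS pulses")
```

The point of the sketch is the closed loop: the only thing distinguishing one virtual object from another, from the brain’s point of view, is the temporal pattern of the pulses it receives back.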
The researchers rewarded the monkeys for choosing virtual objects with specific textures. In trials, it took the monkeys only a few tries to learn how to feel using the BMBI – one monkey got proficient after nine attempts; another picked it up in four.
The researchers hope this technology can be used to create touch-sensitive prostheses for people with amputated or paralyzed limbs. Which sounds awesome – but why stop there? Why not create entirely new bodies from computer-generated touch sensations? Why not place our consciousnesses into virtual birds, or fish, or swarms of bees?
Maybe it’s just me, but I feel like Sgt. Pepper’s must be playing on continual repeat in a lot of these neuroscience labs.