Man and Machine
13 October, 2011 | Cathryn Denney
Given a difficult or dangerous task, or one in an environment not easily accessible to humans, we can usually find a robot or machine to do the job instead. Robotic surgery and space and underwater research are all examples of this. Likewise, if we lose a limb it can be replaced with a high-tech prosthetic. The more complex these scenarios become, the better the interactions between man and machine need to be. Dr Sirko Straube of Bremen University, Germany, suggests using techniques well established in neuroscience and computer science, such as virtual immersion, psychophysics and electrophysiology (poster), to investigate and improve such interactions through a complex scenario involving telemanipulation of a robotic arm with the aid of a virtual environment.
Straube notes that in everyday life we’re used to assuming that when perceptual errors occur, “such as seeing a snake when we are actually looking at a stick” (or that stranger in your room that is, in fact, your dressing gown hanging on your door), the error is ours, because this is what corresponds to our experience. But what about when we are dealing with a virtual environment, faced with the possibility that this environment could ‘make mistakes’? Although the brain may compensate to a certain extent, we might have to deal with a perceptual gap between man and machine. “I think that cognitive neuroscience has some answers to these questions on the one hand, and that we can systematically quantify errors of the artificial system on the other,” Straube says.
In their scenario, the operator of the one-arm exoskeleton is hooked up to an electroencephalography (EEG) machine in order to measure the electrical activity of the brain and give insight into the operator’s cognitive processes. Data from the EEG can be processed and classified by the robotic system, meaning that, for example, movement preparation detected in the brain can be used to simultaneously prepare the exoskeleton for movement, thus creating a finer level of interaction in the virtual scenario – clever stuff. Straube also proposes psychophysics as a framework for evaluating perception, so that manipulations in robot control produce the intended perceptual changes in the virtual environment.
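To give a concrete sense of how EEG output might feed into such a system, here is a minimal sketch of movement-preparation detection: band-power features are extracted from short EEG epochs and passed to a linear classifier. This is not Straube’s actual pipeline; the frequency band, window length, classifier choice and synthetic data are all illustrative assumptions.

```python
# Minimal sketch of EEG-based movement-preparation detection.
# NOT the Bremen lab's actual pipeline: the frequency band, window length,
# classifier and synthetic data below are illustrative assumptions only.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250            # assumed sampling rate in Hz
BAND = (8.0, 30.0)  # mu/beta band, often associated with motor preparation

def bandpower_features(epochs, fs=FS, band=BAND):
    """epochs: array (n_epochs, n_channels, n_samples) -> log band power per channel."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    power = np.mean(filtered ** 2, axis=-1)   # mean power per channel
    return np.log(power + 1e-12)              # log scale stabilises the features

# Synthetic stand-in data: 200 one-second epochs, 8 channels.
rng = np.random.default_rng(0)
n_epochs, n_channels, n_samples = 200, 8, FS
X_raw = rng.standard_normal((n_epochs, n_channels, n_samples))
y = rng.integers(0, 2, n_epochs)              # 1 = movement preparation, 0 = rest
X_raw[y == 1] *= 1.3                          # crude class difference for the demo

X = bandpower_features(X_raw)
clf = LinearDiscriminantAnalysis().fit(X[:150], y[:150])

# In an online setting, each incoming epoch would be classified like this, and a
# "movement preparation" prediction could be used to pre-arm the exoskeleton.
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```

In a real system the classifier output would presumably be smoothed over several epochs before anything is triggered on the robot side, so that a single noisy window does not prepare the exoskeleton by mistake.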
Straube hopes that the work done in his lab will encourage other researchers to take a closer look at methods in neuroscience to evaluate their man-machine interfaces. As for future plans, Straube says: “I really hope to significantly improve our system! By doing this, I also hope to better understand our way of perceiving the world.”
Maybe man and machine can live in harmony!