Hooray for Science...
Researchers have created technology that allows electrodes placed on the brain to control a computer. Apparently, placing electrodes ON the brain is SOP for epileptics. The researchers basically took four people who already had electrodes in place and taught them to play a game on the computer. The thing that really surprised me was the speed with which they learned to manipulate the cursor. After six minutes of training and no more than 24 minutes of practice, the patients were able to move the cursor accurately enough to fire upon some kind of target (the article doesn't go into specifics, but it sounds like a simple, point-and-click, target-shooting game). Accuracy ranged from 74 to 100%.
Think about that for a minute. These electrodes were not placed strategically, nor were they even placed based on their importance to the experiment. Yet the human mind was able to establish control over what amounts to a new muscle within 30 minutes. Physical therapy generally takes months, but that's because muscles have to be retaught and thousands upon thousands of individual nerves and muscles are involved. Computer input, by contrast, can be distilled down to a very select group of inputs. Think about the typical computer. A mouse has only two input variables (direction and speed) plus 2-5 buttons, and a keyboard has at most 100 characters, probably fewer once you simplify. So you're talking about somewhere south of 100 input "channels," and that's with one channel for each character on the keyboard.
The article goes on to say that brain-based electrodes are highly impractical right now, but the researchers believe there is a real possibility that the electrodes could be remote, or at the very least that they could be permanent implants that transmit to a receiver. The whole notion is kind of creepy but at the same time kind of exciting. One of the major drawbacks of computers right now is the lack of a highly versatile input device. The human brain is exactly the complex, versatile device they've been looking for. Think for a moment about the scene in Minority Report where Tom Cruise sifts through images, poking at the air to rewind, zoom, and whatnot. We're not that far away from that kind of visual imaging technology being available. The true holdup is the input device you'd need. Hands poking into the air, twisting and turning to make different shapes and signals, is a nice science fiction dream, but it's not very practical. Consider how exact an input a computer generally needs. For example, highlight some text on this page. Go ahead. Now try to make the highlighting end after a word rather than after the space that follows it. You have a very small margin of error. Now imagine trying to do that with your hands while making complex gestures. That's why touch screens haven't become all the rage. This kind of technology, however, has the potential to solve many of those problems.
We're still many years away from anything like those scenes in Minority Report (or Paycheck, which I saw the other day and was surprised by its lack of crappiness), but this new avenue of research has some potentially groundbreaking areas to explore.
1 Comment:
I think it's the other way around (you speak of inputs allowing us to use more 'technical' computer programs). I think that manufacturers, and more importantly inventors, will come out with a higher level of input device only when the amount of data we are processing reaches critical mass... 5 years ago (I am guessing), when the first 3- and 4-button mice were coming out, it made for massive improvements in control in games (the most input-intensive computer programs right now... outside of maybe 3d modeling or hardcore video editing...)
The only programs that might need input improvements are 3d modeling and other very sophisticated programs that are so far from the mainstream that it's impractical for a business to invest in them...
Just my thoughts... that input comes after data thresholds... not the other way around...
Andrew