I'm a PhD student in a brain-machine interface lab at Stanford. This particular demonstration is not all that interesting, since EEG control is extremely low-bandwidth and extremely noisy.
Getting one or two degrees of freedom of control to turn a quadrotor is possible, but will never become a robust or fast method of control. The information content required from these EEG signals simply isn't there, and what is there is frequently swamped by any muscle movements like blinking, turning your head, etc.
It's possible to get much higher bandwidth and robustness with a cortical implant [1]. These provide single-neuron sensitivity and make it possible to record from several hundred neurons simultaneously, and achieve bitrates of 6-7 bps.
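For context, BCI bitrates like these are usually quoted via Wolpaw's information transfer rate, which combines the number of targets, the decoding accuracy, and the selection speed. A minimal sketch (the example numbers are illustrative, not taken from any particular study):

```python
import math

def wolpaw_itr(n_targets, accuracy, trials_per_min):
    """Bits per second for an N-target BCI at a given accuracy (Wolpaw's ITR)."""
    p, n = accuracy, n_targets
    bits = math.log2(n)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * trials_per_min / 60.0

# Hypothetical implant-grade decoder: 8 targets, 95% accuracy, 60 selections/min
print(round(wolpaw_itr(8, 0.95, 60), 2))  # -> 2.57 bps
```

Plugging in realistic EEG numbers (fewer targets, lower accuracy, slower selections) drops the rate well below 1 bps, which is the bandwidth gap being described here.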
I am also a PhD student working on brain-computer interfaces.
I agree that this particular EEG system (it looks like an Emotiv EPOC) is fundamentally limited in bandwidth and quality of electrodes, and so I am surprised to see this level of control from it.
I would argue that other, higher-quality EEG recording systems are much more practical than cortical implants for the foreseeable future, especially from the perspective of us hackers.
Check out the Wolpaw EEG lab. Good EEG brain-computer interfaces can, surprisingly, achieve bitrates comparable to cortical implants - without having to crack the skull open.
Are you kidding? I'd estimate there is a small army of cyberpunk fans out there who would jump at the chance to use a mind-computer interface to control a robot.
Couldn't agree more. People do the craziest things for the sake of body art, so there should be at least a couple of cyberpunk freaks out there with equal eagerness for modifying their bodies.
After some digging, I can confirm that they appear to use the Emotiv headset, as munin hints. While I never dove too deep into my own Emotiv headset, one thing I did notice during profile training is that I wouldn't get precise results without forcing some sort of neuromuscular response (pulling my head slightly back to move the block closer).
I noticed the person in the wheelchair showed control from at least the neck up, so I assume there was some use of muscles to get that sort of response out of the quadcopter. Thinking, without muscles, is actually really HARD! I couldn't get past training without some assistance, and that was with maybe 2-3 actions (imagine ~10 different actions, each needing a unique thought).
It's still amazing work, and I'm glad to see some research in the field; I will definitely try to get a copy of their paper. I'd love the next step to be either controlling the wheelchair with good response time, or increasing the number of possible actions.
Looks like an EEG neural device. Lots of interesting stuff on this in terms of neural prosthesis. As suspect as the source may be, scholarly literature on the subject suggests this is well within the realm of possibility.
Anyway, while the EEG angle on brain-computer interfaces (BCI) is interesting, bandwidth limits are a major issue.
Kuiken and his staff have had some success with reinnervation, which also restores some "tactile sense" capacity, but it is still limited by the differing commands available for input (in this case, free muscle groups).
While this is certainly exciting, it is most likely the end of the road for EEG, due to the bandwidth limits you describe as well as EMG contamination from muscle activity.
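To make the EMG-contamination point concrete: eye blinks and other muscle artifacts are an order of magnitude larger than cortical EEG, so even a crude amplitude threshold flags them. A toy sketch on synthetic data (the 75 µV threshold, sample rate, and noise levels are illustrative, not calibrated to any real headset):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 128                            # Hz, typical for a consumer headset
eeg = rng.normal(0, 10, fs * 5)     # 5 s of ~10 uV-scale background activity
eeg[200:230] += 120                 # inject a blink-like ~120 uV deflection

# Flag samples whose amplitude is implausible for cortical EEG
blink_mask = np.abs(eeg) > 75       # threshold in uV; tune per electrode
print(blink_mask.any())             # True: the artifact dwarfs the real signal
```

Real pipelines use regression against EOG channels or ICA rather than a bare threshold, but the scale difference is the point: any head or face movement swamps the signal you actually want.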
The prosthetics community is on track to deliver some amazing new capabilities in the relatively near future, but this will almost certainly require microelectrodes or potentially ECoG.
> The source of the submitted article, rt.com, is not known for careful journalism.
Understatement of the year.
Just now, searching for other sources, I see that all the stories on this issue so far are based on a press release, in very similar language, with the New Scientist blog write-up perhaps being the most cautious in relaying it. So far no independent journalist has done any reporting from the scene in China.
Dude, the video in your second link is awesome! To be honest, I was sorta impressed with the floating robot alone, never mind the fact that he was controlling it with his brain.
I am peripherally familiar with the technology used to relay thoughts to computers. I obviously can't comment on the specifics of the technology described in the article, but the degree of control displayed in the video is unlikely to be possible with current widely known methods of detecting thought.
This doesn't mean that such a level is not ultimately achievable, or that they don't have some "secret sauce".
Does anyone have any experience with an EEG headset like the one described in the article that they can recommend? Do they work as audio input devices, or are they independent input devices? If the latter, do they usually come with open-source drivers or libraries? Seems like an interesting thing to tinker with.
There's a high school science project where you make your own EEG (not that hard, really!) and then use it to drive an RC blimp.
It would be even easier with the Emotiv, which also gives you face-muscle sensors so you can do the eye-blink stuff. With a homebuilt EEG you can also get a lot of data, but as I understand it there are few signals that can be easily "user influenced".
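One classic "user-influenceable" signal even a homebuilt rig can pick up is occipital alpha (roughly 8-12 Hz), which jumps when you close your eyes. A rough sketch with synthetic signals (the sampling rate and band edges are just typical values, and the "recordings" here are simulated):

```python
import numpy as np

def alpha_power(signal, fs):
    """Fraction of spectral power in the 8-12 Hz alpha band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].sum() / spectrum.sum()

fs = 256
t = np.arange(fs * 2) / fs                 # 2 s of samples
eyes_closed = np.sin(2 * np.pi * 10 * t)   # strong 10 Hz alpha rhythm
rng = np.random.default_rng(1)
eyes_open = rng.normal(0, 1, len(t))       # broadband noise, no alpha peak

print(alpha_power(eyes_closed, fs) > alpha_power(eyes_open, fs))  # True
```

Thresholding that band-power ratio gives you a simple one-bit on/off switch, which is about the level of control a homebrew setup can reliably deliver.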
[1] http://www.blackrockmicro.com/