Brain-computer interfaces creep closer to bionic, mecha dream (extremetech.com)
31 points by ukdm on Nov 2, 2011 | 11 comments



I'm doing a master thesis on EEG classification. Some thoughts:

1) The term Brain-Computer Interface was not made up for this article; it is the accepted name for this kind of thing in academia.

2) Their 85% accuracy is not classification accuracy but the proportion of rings the user was able to get through. It's impossible to tell what that says about the accuracy of their classifier.

3) From skimming their paper, their classification approach seems fairly simple. They do not learn a spatial filter, for instance (see the sketch below).

4) As far as I know, the article is correct in saying that this is the first BCI to allow navigation in 3D space. This is very cool, and I hope they release the code for it.

To sum up: this is a very cool practical application. From what I can tell it does nothing groundbreaking on the classification side, and I bet a more computationally intensive approach to classification could perform substantially better.
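
For reference, "learning a spatial filter" usually means something like Common Spatial Patterns (CSP). A minimal NumPy sketch of the idea -- the array shapes, names, and two-class setup are illustrative assumptions on my part, not anything from their paper:

    import numpy as np
    from scipy.linalg import eigh

    def csp_filters(trials_a, trials_b, n_filters=4):
        # trials_*: (n_trials, n_channels, n_samples) band-pass filtered
        # EEG, one array per class (hypothetical input format).
        def mean_cov(trials):
            # Average trace-normalized spatial covariance across trials.
            return np.mean([t @ t.T / np.trace(t @ t.T) for t in trials],
                           axis=0)

        cov_a, cov_b = mean_cov(trials_a), mean_cov(trials_b)
        # Generalized eigenproblem cov_a w = lambda (cov_a + cov_b) w:
        # filters at the extreme eigenvalues maximize variance for one
        # class while minimizing it for the other.
        eigvals, eigvecs = eigh(cov_a, cov_a + cov_b)
        n = len(eigvals)
        picks = np.concatenate([np.arange(n_filters // 2),
                                np.arange(n - n_filters // 2, n)])
        return eigvecs[:, picks].T  # (n_filters, n_channels)

    def csp_features(trials, filters):
        # Log-variance of the spatially filtered signals -- the usual
        # CSP feature vector.
        return np.log([np.var(filters @ t, axis=1) for t in trials])

The log-variance features then typically go into something simple like LDA, and even that alone would probably be a step up from raw channel features.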


The claim that this is the first time an EEG array was used to control something directly with human thoughts is a very bold lie. Also, "true" BCI does not generally mean what they think it means. While it is correct that electrode arrays are a form of brain-computer interface, they are a very crude tool and have nothing in common with the imagery being invoked here for the sake of publicity. A real BCI is supposed to be bidirectional and to interface with individual neurons or cortical columns.


I'd also doubt this is the "first brain-computer interface that allows the human user to navigate a 3D space". Note: a "real BCI" does not need to be bidirectional and may be accomplished via read-only EEG (as one's eyes and ears will be receiving information from the machine).

One problem with neurofeedback has been that the visualizations are built on proprietary software toolkits, which inhibits designers' ability to work with neurofeedback specialists to create more engaging and intuitive imagery for the brain. I've been working on a small toolkit that lets you control events in a browser using canvas and websockets: http://vimeo.com/mindhead.
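
The rough shape of the server side is just "push band-power numbers at the page over a websocket and let canvas do the drawing". A toy Python sketch of that idea -- not the toolkit's actual code, and the EEG reading is faked here:

    import asyncio, json, random
    import websockets  # pip install websockets

    async def stream(ws):
        while True:
            # A real setup would read this from the EEG amplifier; here
            # we just fake an alpha-band power value in [0, 1].
            await ws.send(json.dumps({"alpha": random.random()}))
            await asyncio.sleep(0.1)  # ~10 updates a second

    async def main():
        # The browser opens ws://localhost:8765 and maps the incoming
        # values onto canvas animation parameters.
        async with websockets.serve(stream, "localhost", 8765):
            await asyncio.Future()  # run forever

    asyncio.run(main())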

Would love feedback and/or development help!


> The 'first' claim here is moving in 3D space. Controlling pixels on a screen has been done before -- and in many ways, this development is basically the same thing. The researchers make it sound like 'continuous and real-time' are also significant developments, along with the 3D thing. <

Mrsebastian, in case you didn't know: you appear to be hellbanned so I can't reply to you directly.

You are correct: the newness claim is centered on the 3D thing, but I'm very skeptical of that claim as well. Using cortical electrodes to control things is such an old method that you can't really slap a marketing label on it ("look, this time we control XYZ! it's NEW!1!") and call it new. In any case, it's grossly misleading to hype this as groundbreaking research.


Hrm, the hellbanning is news to me -- I thought an administrator would send me an email if that was the case! Thanks for the heads-up, though.

Again, I don't think the fact that they've controlled something via trained brain waves is the news here -- as you say, that's fairly old hat. The fact that it's being done in real time and very accurately is what I found impressive, though.


Strange -- this post of yours seems normal; the previous one was auto-dead.

I stand by my opinion that the way this research is being presented is misleading and blown out of proportion. That doesn't mean, however, that controlling stuff with EEG isn't awesome. It is.

I must admit I don't understand what the realtime claim is supposed to mean in this case, though. While there is always overhead due to signal processing, the reaction time is also tied to how much data you need before a state change can be reflected with enough confidence to trigger whatever the output device is. There is a lot of inherent inaccuracy in this measurement process, which makes sense when you consider the electrode's distance from the EM emitter and the tissue in between. So what do realtime and accuracy actually mean in this case?
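
To put numbers on that (illustrative values, nothing from the paper): with a one-second analysis window you simply cannot reflect a state change in much under a second, no matter how fast the rest of the pipeline is.

    import numpy as np

    fs = 256           # sampling rate in Hz (illustrative)
    window_s = 1.0     # window needed for a confident power estimate
    step_s = 0.125     # hop between successive classifications

    def band_power(x, fs, lo=8.0, hi=12.0):
        # Power in the lo-hi Hz band (e.g. mu/alpha) via the periodogram.
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(x)) ** 2
        return psd[(freqs >= lo) & (freqs <= hi)].sum()

    sig = np.random.randn(10 * fs)          # stand-in for one EEG channel
    win, hop = int(window_s * fs), int(step_s * fs)
    powers = [band_power(sig[i:i + win], fs)
              for i in range(0, len(sig) - win, hop)]

    # Decisions update every step_s (8 Hz here), but each one is based
    # on the trailing window_s of data, so a mental state change only
    # shows up after roughly a full window has elapsed.

So at best "realtime" can only mean the update rate, which says nothing about how much trailing data each decision depends on.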


The point of a hellban is (traditionally) that you don't even know you've been hellbanned--from your point of view, nobody ever comments on your stuff, even though it looks quite normal to you. Then eventually you get sick of not getting feedback, and leave.

For content, hell, I moved a cube around 3-d space using an Emotiv Epoc headset back in college. It wasn't "think of moving your hands forward", though--you trained it by saying "Ok, now imagine the cube moving backwards". It was actually pretty responsive after a few minutes of training, and it sure looked a lot better than this jerky helicopter demo.


"Surprizingly fast and mind-blowingly accurate"? I would not, in a million years, describe the flight in that video using those words. I'm glad the research is being done, but there's no reason to sensationalize something that's clearly performing worse than, say, a joystick. Or a laptop touchpad. Or four hard-to-press buttons.


Does the video remind anyone of the secret level in Super Mario 64 triggered by looking up into the sun? Flying through rings and all...

This one: http://www.youtube.com/watch?v=d0NXA7uJj58


Off topic, but speaking of hellbanning and auto-dead, I'm also not able to reply to any comments along Udo's thread (i.e. the thread starting with comment http://news.ycombinator.com/item?id=3186917). Why is that?


Looking at your post time, it would appear to be the cooldown timer. To prevent deeply nested back-and-forth, each level of nesting in a thread introduces a delay before you can reply. Come back later, and the reply link will probably come back.



