Obama Seeking to Boost Study of Human Brain (nytimes.com)
119 points by robg on Feb 18, 2013 | 75 comments



I was stunned reading this. It looks like the group pushing this project is just taking advantage of the president's advisers' ignorance. Cognitive neuroscientists, psychologists and neurologists have been "mapping" the human brain for well over 20 years now. There's even a conference dedicated to the pursuit (http://www.humanbrainmapping.org) that attracts over 2,500 attendees each year. Unless this project is focusing on some aspect of the brain that has yet to be studied (not very likely), it looks like this is just going to take NIH funds away from researchers who are already working in these areas. It doesn't surprise me, though. Francis Collins (head of NIH) was the leader of the publicly funded half of the Human Genome Project. I guess he's trying to pull the same trick twice. It will be very interesting to see what the final proposal looks like.

EDIT:

Also, claims that "we'll be able to cure Alzheimer's!" are pretty much part of every grant proposal submitted to NIMH in one way or another. It's just an easy way to get your "impact on public health" covered. I can't believe they're falling for it here.


Right, and people had been playing with rockets for 30 years before the Apollo project. There is something to be said for large, concerted effort toward a singular goal. There are still enough massive gaps in our understanding of the brain to justify further research.

That said, this press speculation is just fluff and it's unclear to me whether it is possible to define such a focused goal.

As an aside, I don't quite get all the cynicism in this thread. President: we will spend more money on science! HN: Meh?


  President: we will spend more money on science! HN: Meh?
The cynicism is that all "science" is not equal. Correct or not, cantastoria's criticism is that giving money to charlatans takes that money away from basic research. A good counterargument would outline why "mapping the brain" is possible and a good use of resources. Personally, I'm still waiting for that from someone.


  Correct or not, cantastoria's criticism is that giving money to 
  charlatans takes that money away from basic research.
I don't understand who the supposed charlatans are. The basic scientists lauded by cantastoria for doing existing research are almost certainly going to get the lion's share of this new money.

  outline why "mapping the brain" is possible and a good use of 
  resources.
http://www.sciencedirect.com/science/article/pii/S0896627312...


  ...are almost certainly going to get the lion's share of this new money.
Not sure why you think that.

Here's the pdf for your link (I can only guess that you find this paper convincing): http://bit.ly/Y2AXtz


  Not sure why you think that.
Because that's how grant peer-review works? Who will compose the review panels for this funding? Mostly the same people who already compose the review panels for existing NIH/NINDS/etc. funding mechanisms.

  I can only guess that you find this paper convincing.
The nanoprobe and "complex emergent properties" stuff at the beginning is a bit hand-wavy, but the concrete 5- and 10-year goals are sufficiently ambitious while certainly not outlandish.


It's more like if people had been already going to the moon for 30 years in this case.

  There is something to be said for large, concerted effort toward a singular goal.

As I said, that concerted effort has been going on for quite some time now; it just hasn't been funded by a narrow project that will benefit very few scientists (and apparently Google, Microsoft and Qualcomm).

It's not cynicism, it's healthy skepticism. "More money on science" in this case looks more like a boondoggle that will benefit a few select scientists and corporations and will probably end up taking money away from a larger group of scientists already looking into these areas.


It's a positive goal not related to the military. I applaud that.


The "mapping" of the brain, while accelerating, has been growing at a rather slow pace. In fact one of the few large 'mapping' projects is conducted by a non-academic entity, the Allen (of microsoft fame) institute (http://www.brain-map.org/). The reason for this slow progress is the small scale and scope of most investigations. Thousands of studies are being repeated (and mice sacrificed) because of the lack of both data sharing and concerted planning among neuroscientists. The academic pressure for publishing fast unfortunately limits the scope of most scientists to projects that can bring results easily.

This new project sounds like it is the neuroscientific analog of LHC or the Human Genome Project. The truth is that we wouldn't be able to replace the LHC with 1000 synchrotrons.


Maybe, but one of the reasons why there is such slow growth is that there's so much disagreement among researchers about how to interpret results.

I would be much happier if this project focused on creating the LHC equivalent in brain mapping w.r.t. instrumentation and methods: for instance, a cheaper imaging technology that offered 2-3x the resolution of current techniques. That would be a much more focused goal and would have clear benefits for all scientists working in these areas.


The LHC (and even the HGP) were largely engineering projects. Basic science is much harder to focus, but there is an overarching goal nevertheless: "figure out how the brain works by any means". I presume imaging methods are part of this.


The ADNI initiative shares data, and most researchers are part of it. NIH is HUGE on sharing.


Interesting, I've never been able to find, e.g., spike recordings from a published study. ADNI seems to be only about Alzheimer's.


Why do you imply that the Human Genome Project was a scam?


While there has been a huge growth in brain research over the last decade or so, the methods and analysis are still immature, and results are often quite fuzzy (even if researchers would like their reviewers to believe otherwise). There also doesn't seem to be a clear goal here, other than to throw a bunch of money at some interesting questions. The Genome Project had the very clear goal of mapping the human genome. The European project aims to simulate a human brain. This just seems to be directing a lot of money at research which is already being done.

That being said, I think there might be some very interesting opportunities here for talented developers. When I worked in the field a few years ago, the software used for this stuff was generally a lot of MATLAB scripts with C subroutines held together by some scripting language duct tape (Python/Bash). It was slow, it was buggy, we were basically writing documentation as we figured it out ourselves, different labs had different methods and scripts and in general it was kind of a mess. It's actually bad enough that there are research grants out there to develop better analysis software. Most of the researchers are trained in statistics, neuroscience, and psychology but have very little programming experience. If you are interested in some of the software that is out there right now, below is a partial list. Also, if you are interested in this kind of research but can't contribute code-wise, contact your local research university. They are always looking for test subjects for this kind of stuff, you usually get paid pretty well ($50-100/hour for imaging studies) and you get some cool images of your own brain out of it if you ask.

http://www.fil.ion.ucl.ac.uk/spm/

http://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FSL

http://afni.nimh.nih.gov/afni

http://surfer.nmr.mgh.harvard.edu/

http://nipy.sourceforge.net/

http://www.trackvis.org/


I've thought a lot about why fMRI data analysis tools are so terrible, and I've come up with a few reasons:

- It's impossible to implement the algorithms for fMRI data analysis efficiently in most "dynamic" programming languages because of the performance hit they impose. (It might actually be possible in Julia, NumPyPy, or Python with Numba, but these languages are not yet well established.) On the other hand, dynamic programming languages are much better suited to exploratory data analysis than C is, so essentially all fMRI data analysis ends up being a mixture of C code and glue code in some other language. In this regard, I don't think SPM (MATLAB with C MEX files) is really that bad. It's fast and it avoids having to read the data from disk multiple times.

- People use the tools they know, not the tools that are best for the job. FreeSurfer is a mess of C, C shell, and Tcl/Tk, but there's nothing else that can visualize fMRI data with comparable ease and accuracy. Most people in neuroimaging only know MATLAB, which is pretty terrible for analyzing large data sets because it can't mmap files (and it doesn't have the language features necessary to make this possible, and it's closed source); a sketch of the mmap approach follows this list.

- Related to the above, it's easier to get funding to develop a novel algorithm than to implement an existing algorithm in a way that makes it more useful/accessible to researchers. I believe this is slowly changing.

- There are a lot of different algorithms used for analyzing fMRI data, and no single package implements all of them. The choice of algorithms differs by lab and researcher, according to scientific need, personal preference, or the conventions of their subfield. People end up writing their own code to glue together methods from different analysis packages, which is, again, often written using the wrong tools.

- We graduate students who know how to code well need to publish papers. There is comparatively little incentive to publish code.
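
As a concrete illustration of the mmap point above, here is a minimal sketch, not any package's actual code: it assumes nibabel is installed, an uncompressed NIfTI-1 file, and a made-up filename.

  import numpy as np
  import nibabel as nib

  img = nib.load("run1.nii")      # reads only the header; data stays on disk
  hdr = img.header
  data = np.memmap("run1.nii",
                   dtype=hdr.get_data_dtype(),     # e.g. int16 or float32
                   mode="r",
                   offset=int(hdr["vox_offset"]),  # data begins after the header
                   shape=img.shape,
                   order="F")                      # NIfTI voxels are Fortran-ordered

  vol0 = np.array(data[..., 0])   # only this volume's pages are read from disk

Note this only works on uncompressed .nii files; as discussed below, everyone gzips their NIfTIs, which defeats mmap entirely.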


Check out http://nipy.sourceforge.net/nipype/ It's basically a pipeline for plugging different packages' outputs and inputs into each other; wrappers already exist for SPM, FSL, and other well-known programs, and it's easy to write your own if you know some Python. It speaks to exactly this approach, as it is algorithm/implementation agnostic: it just defines the process and tries to formalize getting data from one place to the next.
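
For a flavor of what that looks like, here's a minimal sketch (assuming nipype and FSL are installed; the filename is hypothetical) that motion-corrects a functional run with FSL's MCFLIRT and then smooths it, with nipype handling the plumbing in between:

  from nipype import Node, Workflow
  from nipype.interfaces import fsl

  moco = Node(fsl.MCFLIRT(in_file="func.nii.gz"), name="moco")
  smooth = Node(fsl.Smooth(fwhm=6.0), name="smooth")

  wf = Workflow(name="preproc", base_dir="/tmp/nipype_work")
  wf.connect(moco, "out_file", smooth, "in_file")  # wire one output to the next input
  wf.run()

Swapping FSL's smoother for SPM's is a one-line change to the Node definition, which is exactly the agnosticism described above.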


Matlab can mmap. SPM may not use it because they have limited themselves to Matlab 7 (2004) compatibility (also no toolboxes).

http://www.mathworks.com/help/matlab/memory-mapping.html

It's somewhat of a moot point though, because everyone likes to gzip their NIfTIs, and unfortunately the file formats don't have a uniformly accepted way to leverage the huge disk savings we can get from masked data without applying compression (which is stupid from the point of view of the types of analysis we do). Even filesystem-level compression doesn't help. If you can fit all your data into available RAM, you're fine. If not...


I wasn't aware of this, and it's surprisingly hard to Google. Thanks for informing me.


Let's add to that that fMRI data is next to useless for actually studying the brain. It can map large-scale organisation, but it's like trying to measure the economy based on which power stations are in use at any given time.


You could learn quite a bit about the organization and function of a country based on the spatiotemporal patterns of its energy consumption, particularly when you're given the power to control external stimuli. But yeah, it could be pretty useless for studying something on the scale of a subdivision. It depends entirely on which spatial scales you are interested in studying.


Following your analogy, trying to understand the brain by putting a few dozen electrodes in it is like trying to understand the economy based on watching people interact in a few dozen rooms.

Like you, I am skeptical of the explosion of human neuroimaging, but I think that, as a technique for determining where to drop your electrodes, fMRI can be a very powerful tool.


Agreed about the horrible tools. Unfortunately, this is an extremely difficult market to break into. Most academics want free and open source tools and don't care how many research assistants or post-docs they have to torture to use them. Compounding the problem, the market is so small that any commercial product is going to have to be very expensive (>$1000.00) to make it worthwhile re: development and support costs. For instance, something like BrainVoyager (http://www.brainvoyager.com/), a commercial competitor to the tools you listed, starts at $4,000, which is a pretty good chunk of any equipment budget on a grant.


Closed-source tools are also problematic for research purposes, since at some point in your career you will almost certainly want to run an analysis that they can't perform.


BrainVoyager's approaches are from the ice age and not flexible enough to handle modern protocols. Only a moron could defend using it for research. It's translational at best (a toy you give those clinicians who can't even figure out SPM). But it does make pretty pictures.


I'm a clinician and it admittedly took me ages to figure out SPM. I still hate it, and think it is terrible to work with but useful. The state of the software is actively hampering contributions from clinicians. The problem is, as you have pointed out, that if BrainVoyager is the alternative, that's no alternative at all.


I'm not defending BrainVoyager as a product. It was just an example of the type of business model the academic market requires.


Well said! I agree, and you are correct that the software used for this stuff pretty much sucks. The thing is that a lot of scientists don't know that the software could be better, and they are reluctant to try new software considering it is so expensive (we just spent $5k on a licence for software that is subpar; I've seen this 91203091283 times and it's disappointing). New software will change the game.

To add to that list, http://www.brain-map.org/ is a reallllllllly nice tool. I love what this company is doing in terms of automating the mapping of gene expression within the brain, and I look forward to their software going through some revisions.


Add to those NEURON (http://www.neuron.yale.edu/neuron/), which is the most popular tool for compartmental neuron simulations. It's a terribly old program (1990) with horrible limitations, yet people still use it.
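
For those who haven't seen it, NEURON is scriptable from Python these days. A minimal single-compartment sketch (assuming NEURON with Python support is installed; the parameters are illustrative, not fit to any real cell):

  from neuron import h
  h.load_file("stdrun.hoc")          # standard run system

  soma = h.Section(name="soma")
  soma.L = soma.diam = 20.0          # microns
  soma.insert("hh")                  # classic Hodgkin-Huxley channels

  stim = h.IClamp(soma(0.5))         # current injection at the midpoint
  stim.delay, stim.dur, stim.amp = 1.0, 10.0, 0.3   # ms, ms, nA

  v = h.Vector().record(soma(0.5)._ref_v)           # membrane potential trace
  h.finitialize(-65)
  h.continuerun(20)                  # simulate 20 ms
  print(max(v))                      # peaks well above 0 mV: a spike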


Terribly old, but it is being updated constantly: they are adding the capability to explicitly specify chemical reaction schemes (in hoc or Python). Some interesting research is being worked on in that regard at the moment.


If there are horrible limitations, email Dr. Mike Hines or write some code that will fix it. It is open source.


From what I see in the EU project, while its stated goal is to simulate an "e-brain", a big portion of its money and manpower is still going to (a) the mapping of biological brains, which seems to overlap with this proposal, and (b) basic research on biochemistry and neurotransmitters.

I mean, a mandatory prerequisite in any serious 'brain project' is some progress in almost all of the related fields, and in a sufficiently big project you try to get some leading scientists from all these areas under your umbrella.


I went to a fantastic talk by Microsoft Research's Stephen Emmott recently in which he referred to fMRI as "phrenology for the 21st century" and advocated the building of computational models of the brain. Mixed feelings about this announcement - some of the newer methods do look promising.


The proposed research is not focused on fMRI...


This sounds like a reactionary move after Europe announced its own project. Can't complain, though. It's great to see they are taking this sort of research seriously. Now if they started funding international anti-asteroid systems too (as Russia suggested), that would be great.


Sounds like snake oil. Too early to build a complete human brain model. Simpler brain models (cat, lizard) haven't been done yet (?) but that's not headline-worthy.

How about "President announces program to solve all human suffering"? About as meaningless, but would actually be more useful if by some astronomical fluke they succeeded.


It's on the same timescale as the European Blue Brain Project (and also Kurzweil's predictions, if you put any faith in those). If you believe that this is an information technology with exponential growth, then "we're far off now" is not a sufficient objection. The same objections were raised against the Human Genome Project, which was far from completion for most of its duration, but it followed an exponential growth curve and completed on time.


All over the net I'm reading comments from naysayers who claim this will never happen, or it won't happen this century.

Even many neuroscientists are saying it's impossible. But although they might know about neuroscience, they don't seem to have grasped the concept of exponential technology growth yet.

I'm so glad Obama is not a naysayer.


The genome is a couple of megabytes. The brain is terabytes. A billion times more complex. And it's not all about the chemistry, it's about the physical wiring. So the same techniques won't apply.


Yes, but you only have to double a megabyte 30 times to get a terabyte. That's a better perspective to take when dealing with exponential growth. Of course, the genome and the brain are different things, but the point remains that if you believe this will have an exponential growth, being far away from solving it is exactly where you expect us to be right now. And even now, we do have quite good computational models for certain small parts of the brain.


> you only have to double a megabyte 30 times to get a terabyte.

It seems to be only 19.93 times according to Wolfram Alpha?

http://www.wolframalpha.com/input/?i=log%281+terabyte%2F1+me...


You're right. I assumed the billion-times figure was correct; it should be a million.
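
For the record, here's the arithmetic (the small discrepancy with Wolfram Alpha is just SI vs. binary units):

  import math

  # SI units, as Wolfram Alpha computes it: 1 TB / 1 MB = 1e12 / 1e6
  math.log2(1e12 / 1e6)     # ~19.93 doublings

  # Binary units: 1 TiB / 1 MiB = 2**40 / 2**20 = 2**20
  math.log2(2**40 / 2**20)  # exactly 20 doublings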


National security would be improved by a thorough mapping of the 'lizard brain': http://www.lookfordiagnosis.com/videos.php?title=What+is+the...


This research program is not about human brain modeling like the EU Blue Brain project. Instead, it's trying to do exactly what you suggest: understanding simpler brain models and building up from that to more complex organisms.


It's clear that the software we are using is not sufficient, and a lot of that comes down to a lack of quality control. I've been in science now for close to 10 years, and the software that we use looks like it was built in 1995.

This is a serious problem with scientific software. The features are there (for the most part), but it seems clear that many of the software engineers are not working with the users. There is a severe lack of thought when it comes to UI, and as such the learning curve is pretty steep.

With that said, my first YC application was to build software (web based) to tackle some of these issues. I'm actually looking for coders to help build this product, so if you have any interest in jumping into this field, send me an email (in profile). I intend on applying to YC this summer and would love to work with a few programmers to give this project a real shot.


I've spent some time making web-based software while sitting across the lab from the scientists who use it. This looks like an interesting project, I'll email you for more info.


I may be way out of my depth on this, but isn't this project assuming that there is a 'normative' human brain, and that we all have the same neuron structure?

I know we all have the same general brain regions, and in most of us they function approximately the same. But going down to the neuronal level, will any single brain be able to give a map that would be useful to the population as a whole? Or are they planning on making some type of composite map from multiple sources?


I used to work in functional brain imaging (fMRI), so perhaps I can shed some light on this. Generally what happens is you take a group of subjects and get "structural" (anatomic) scans of their brains as well as functional scans showing brain activity for a given task. The structural scans are then run through a statistical "normalization" procedure, which basically does some fancy bending/warping of everyone's brains such that they all line up. Once this is done, you apply that normalization function to the functional scans so you can compare everyone. It is also important to note that neuroimaging typically does not go down to the neuronal level; the resolution is just not that high. Typically you are looking for brain regions or sub-regions that correlate with a given activity, something on the order of millimeters to centimeters. However, there are other ways of investigating brain activity, and some of those can go down to the neuronal level.

Here is an example of a common software package used to do such normalization and analysis: http://www.fil.ion.ucl.ac.uk/spm/
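
If it helps, here's a toy sketch of the resampling step only. The matrix and offset are made up, and real normalization (e.g. SPM's) also estimates nonlinear warps; this just shows the mechanics of pushing one brain onto another's grid:

  import numpy as np
  from scipy.ndimage import affine_transform

  subject = np.random.rand(91, 109, 91)   # stand-in for a structural scan
  matrix = np.diag([1.1, 0.95, 1.05])     # hypothetical scaling component
  offset = np.array([-4.0, 2.0, 0.0])     # hypothetical translation (voxels)

  # Resample the subject volume onto the template grid (91x109x91 is the
  # MNI152 2mm grid) under the given affine transform.
  normalized = affine_transform(subject, matrix, offset=offset,
                                output_shape=(91, 109, 91), order=1)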


I'm a PhD candidate in neuroscience, I'll take a stab at that.

You highlighted a critically important question in the field. We do all have the same brain structures, and neurons within those structures have similar shape and function, but the exact wiring of the circuits will not be the same for two individuals. Understanding how the structure of circuits leads to behavior, memory, cognition, etc. is the high-level goal that the whole field is striving for.

This project seeks to answer quite a bit more than just mapping areas in the brain. I think the name is more of a way to connote that the project seeks to be comprehensive.


  isn't this project assuming that there is a 'normative' human brain, and that we all have the same neuron structure?

At a certain resolution the structure of the brain is certainly close to normative, as demonstrated by decades of dissection studies. But when you get down to the axon level, there is a huge variability in connectivity. The open question is: at what level of connectivity (and, importantly, temporal synchronicity) does cognition emerge (or otherwise, depending on your philosophical school).


I'm currently working to develop neural prosthetics (computer cursor, robotic arm, etc.), and it's clear to me that the president's proposed research will have real clinical impact over a timescale of 5-10 years. Our lab and others already have paralyzed human patients controlling computers in clinical trials via a brain-machine interface, but it is hard to overstate the potential that remains hidden due to our lack of understanding of low-level circuits.

To clear up some misconceptions in this thread, this new work is NOT focused on top-down cognition using hand-wavy tools. Instead, it's focused on understanding neural circuits, starting with simple organisms and working up to primates/humans, by developing new technologies that allow us to radically scale up the scope of questions we can ask.


I'm skeptical, but if it's reasonable to spend over a trillion dollars on a manned fighter project (the F-35), considering the pathetic state of that project, then the few billion for this is worthwhile a million times over.


How about neither.

The Paul Allen Brain Atlas, among many other private initiatives, is capable of pursuing this goal better than the federal government, in the same way and for the exact same reasons that Craig Venter & Co. embarrassed the Human Genome Project.


Actually, UC Santa Cruz and the NHGRI/NIH embarrassed Celera by publishing the human genome and placing the whole thing in the public domain several days before Celera could publish and then patent their mappings. And anyway, the only reason Venter went private was to draw a line in the sand about shotgun sequencing, for which he was having trouble finding public funding/support.

Besides, no one got into space faster than governments, no one has managed to develop cures and vaccines for large-scale and deadly diseases faster than labs directly funded by the NIH and other governmental agencies, and no one but governments has had the infrastructure to build large-scale networks like the Internet.

I won't even mention interstate/intercontinental highways, sewer systems, healthcare outside of the US, etc. There are simply some things that cannot be done by anyone but the uppercase-P People.


http://bluebrain.epfl.ch/cms/lang/en/pid/56882 Here are more details on the European project (the Blue Brain Project) that other posters have mentioned in this thread. There is also this TED talk http://www.ted.com/talks/henry_markram_supercomputing_the_br... by Henry Markram, the project director of the Blue Brain Project.


For anyone still curious after watching the TED talk, also see the Almaden Cognitive Computing 2006 series - these presentations are technically intense (cellular biology, beyond-pop neuroscience) but(/and) enjoyable. The second one in the series is a lecture/presentation by the same Henry Markram: http://www.youtube.com/watch?v=9gFI7o69VJM

NB: intros (by those other people at the start of each video) tend to be long (especially for the first presentation (which is by the way very interesting as well)), I'd maybe skip them.

There are a couple of other videos from the series on youtube, but the original collection of twelve was removed from google video once that service became defunct. They are still available via somewhat obscure means from the original source:

webpage (original now gone from the source for some reason) with links to original AVIs and PPTs: http://kostas.mkj.lt/almaden2006/agenda.shtml ; I'm redownloading those AVIs now just in case... (www.almaden.ibm.com/institute/resources/2006/Disk[1-12].avi (replace integer interval with a single integer.))


To be clear, there are not many similarities between Markram's project and this program. Critics of bluebrain should be very interested in the proposed approaches and technologies outlined here. Rather than starting by simulating a whole brain before understanding a functional circuit-level description of different brain areas, this program proposes to start by studying simple and accessible circuits and build up to larger systems.


Found these three documentaries on the progress of the project so far. Worth a watch! http://bluebrainfilm.com/bb/


On the claimed $800 billion dollar return on the human genome project: http://www.nature.com/news/2011/110511/full/news.2011.281.ht... (It's skeptical but vague; it looked like the rest of the top hits just parroted the press release.)

Anyone have access to the brain-mapping proposal? http://www.cell.com/neuron/abstract/S0896-6273%2812%2900518-... "The function of neural circuits is an emergent property that arises from the coordinated activity of large numbers of neurons. To capture this, we propose launching a large-scale, international public effort, the Brain Activity Map Project, aimed at reconstructing the full record of neural activity across complete neural circuits. This technological challenge could prove to be an invaluable step toward understanding fundamental and pathological brain processes."



Thanks!

On skimming, it sounds well thought out as research on its own terms, though the connection to curing diseases is just as hand-wavy.


This sounds awfully familiar (Europe's a step ahead, this time, at least in the press):

http://uk.news.yahoo.com/billion-euro-supercomputer-to--simu...


Not exactly.

  The Obama initiative is markedly different from a recently 
  announced European project that will invest 1 billion euros 
  in a Swiss-led effort to build a silicon-based “brain.” The 
  project seeks to construct a supercomputer simulation using 
  the best research about the inner workings of the brain.

  Critics, however, say the simulation will be built on 
  knowledge that is still theoretical, incomplete or 
  inaccurate.


This is great news. As skeptical as I am about the prospects for actually simulating the human brain in the near future, the benefits that are likely to arise from devoting more resources to understanding it are enormous, both in terms of technology and human health. This is probably one of the best things the government could be spending research money on. I only hope that this proposal is funded at the level it deserves.


Anders Sandberg and Nick Bostrom's "Whole Brain Emulation Roadmap" might interest some people here:

http://www.philosophy.ox.ac.uk/__data/assets/pdf_file/0019/3...

Most of it is over my head, but it was an interesting read and shows the kind of techniques we'd need to be able to do this.


How will this help "pave the way for artificial intelligence"? We already have neural networks as a computational metaphor for the brain. I doubt we will find new metaphors. I doubt we will learn new computational tricks. Please enlighten me on this issue.


ANNs were built on the assumption of point neurons as nodes in a network, and many of their training algorithms are biophysically unrealistic. While ANNs can be good function approximators, they are a very inaccurate representation of biological neurons and networks (which have intricate dendrites and biophysics). We don't yet know how fundamental processes like memory storage work in the brain; it's one of the great mysteries of neuroscience.
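
To make the "point neuron" abstraction concrete, here's a minimal leaky integrate-and-fire simulation (the parameters are illustrative, not fit to any real cell). Everything a real neuron's dendrites and channels do is collapsed into a single equation:

  # Leaky integrate-and-fire point neuron, forward-Euler integration.
  dt, T = 1e-4, 0.5                   # 0.1 ms steps, 0.5 s of simulation
  tau, v_rest = 0.02, -70e-3          # membrane time constant (s), rest (V)
  v_thresh, v_reset = -54e-3, -70e-3  # spike threshold and reset (V)
  r_m, i_ext = 1e7, 2e-9              # membrane resistance (ohm), input (A)

  v, spikes = v_rest, []
  for step in range(int(T / dt)):
      v += (-(v - v_rest) + r_m * i_ext) / tau * dt
      if v >= v_thresh:               # threshold crossing counts as a "spike"
          spikes.append(step * dt)
          v = v_reset
  print(f"{len(spikes)} spikes in {T} s")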


I agree with the skepticism in the other comments, but I'm still happy to hear the news! We need more research on the human brain; the more, the better! How the human brain works is still pretty much a mystery.


Do not let this man fool you. Our president has every intention of developing a My Little Pony MMORPG once research is finished.


I bet you really wish HN comments allowed images.


How about a 10 year project to pay off the National Debt?

Quit wasting money on science projects that aren't needed (this is already being done).


3% interest times $20 trillion = $600 billion per year in interest. (interest rates will not stay this low indefinitely, and even 3% would be crazy low given the actual default risk via inflation that the US poses)

That interest basis alone would effectively wipe out the ability to continue Social Security (or almost the entire US military, take your pick).

The national debt can never be repaid under any circumstances. We can't afford the real interest cost right now, which is why the Fed is paying for it via debt monetization (aka QE). Throw in just a trillion per year in principal, and it becomes a sad joke.

We will never pay off the national debt, and we have no plans to. There's no scenario under which the math works out, unless our government suddenly becomes hyper-disciplined and initiates an uncompromising 50-year payback plan that takes a hatchet to the entire welfare state (corporate, social, military).
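
A back-of-the-envelope check of those numbers (all assumptions are from above: $20T principal, a 3% rate, and the 50-year term):

  principal, rate, years = 20e12, 0.03, 50

  interest_only = principal * rate     # the $600B/yr figure above
  # Standard amortization formula for a level annual payment:
  payoff = principal * rate / (1 - (1 + rate) ** -years)
  print(f"interest only:  ${interest_only / 1e9:,.0f}B per year")
  print(f"50-year payoff: ${payoff / 1e9:,.0f}B per year")   # ~$777B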


Politicians gonna pander. In unrelated news, the sun rose as expected this morning.


Maybe this is the science project to unite the US and push STEM forward.


1984: We're behind schedule.




