
Monday, August 15, 2011

Neuroscience is Computer Science, Part II

Labrigger noted that the NIH has posted a request for information on what tools would speed up neuroscience research.  As I've written before, neuroscience research now depends heavily on computing tools and on the programming skills of researchers.  So I wrote to them about the tools I'd like to see developed:
I would like to see a wide variety of software tools developed.
I have performed neuroscience analyses using either expensive proprietary programs or custom software written by amateurs. For example, in my previous lab we analyzed data using Vector NTI, which costs $1000/year and has an awful, time-wasting user interface.  In my current lab, we analyze multielectrode data using software written by the Buzsáki lab. This software works, but it is slow: a high-powered computer must run overnight to analyze a single dataset.
The NIH has already developed one nice piece of software, ImageJ, which is used extensively throughout neuroscience. The programs I would like to see (from my own experience) are an updated ImageJ, a DNA analysis tool (primer design, sequence alignment, etc.), and tools for spike identification and spike clustering. Given how computationally intensive some of these analyses are, the programs should also take advantage of the parallelism of modern graphics processors.
The current software tools we use are expensive, slow, and generally inadequate. A small amount of money spent by the NIH on developing and standardizing these tools would save researchers money, the time spent implementing solutions themselves, and the time spent actually running the analyses.
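
To make the "spike identification" item above concrete: the first step in most spike-sorting pipelines is simple amplitude thresholding of a band-pass-filtered extracellular trace, after which waveforms are extracted and clustered into units. Below is a minimal Python sketch of that thresholding step. It is not the Buzsáki lab pipeline; the function name, parameters, and fake data are just illustrative.

    import numpy as np

    def detect_spikes(trace, fs, thresh_sd=4.0, refractory_ms=1.0):
        """Return putative spike times (s) from a filtered extracellular trace."""
        # Robust noise estimate via the median absolute deviation,
        # so large spikes don't inflate the threshold.
        noise_sd = np.median(np.abs(trace)) / 0.6745
        threshold = thresh_sd * noise_sd

        # Samples where the (negative-going) signal crosses threshold.
        crossings = np.where(trace < -threshold)[0]

        # Enforce a refractory period so each spike is counted only once.
        min_gap = int(refractory_ms * fs / 1000)
        spikes, last = [], -min_gap
        for idx in crossings:
            if idx - last >= min_gap:
                spikes.append(idx)
                last = idx
        return np.array(spikes) / fs

    # Usage on fake data: 10 s of noise sampled at 20 kHz.
    fs = 20000
    trace = np.random.default_rng(0).normal(0, 10, size=10 * fs)
    print(detect_spikes(trace, fs))

A real tool would follow this with waveform extraction and clustering (the "spike clustering" step above), and the per-sample work is exactly the sort of thing a GPU could parallelize.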

2 comments:

  1. It's hard to swallow the idea that this will increase scientific productivity, but it will certainly make doing science more cost-effective; I'm sure lots of NIH grant money is wasted on expensive software licenses. This brings up a much larger problem: the cost of doing even basic science, like the cost of healthcare, is getting ridiculously high. As the NIH is increasing funding lines left and right, where is all this speedy research going to happen? In academia? Certainly not.

  2. You're right that this is more focused on cost-effectiveness than sheer speed, but I think there are productivity gains too. For example, whenever I ran Vector NTI, I had to fight with the display windows, which made everything take a couple of minutes longer. That was pure waste, and summed over all the labs everywhere, it adds up to years of wasted man-hours.

    In my work now, some data-crunching programs run for hours, so we usually run them overnight. If they ran faster, I could analyze data the same day I acquire it, churn through multiple data sets, and save significant time.

    And just like you say, more cost-efficient research lets us do more of it.

