Category: Software

Open Ephys

March 17, 2017

Jakob Voigts, from the Massachusetts Institute of Technology, has shared the following regarding www.open-ephys.org. Open Ephys aims to distribute reliable open-source software and hardware tools for extracellular recording and stimulation.


“Open Ephys is a collaborative effort to develop, document, and distribute open-source tools for systems neuroscience. Since the spring of 2011, our main focus has been on creating a multichannel data acquisition system optimized for recordings in freely behaving rodents. However, most of our tools are general enough to be used in applications involving other model organisms and electrode types.

We believe that open-source tools can improve basic scientific research in a variety of ways. They are often less expensive than their closed-source counterparts, making it more affordable to scale up one’s experiments. They are readily modifiable, giving scientists a degree of flexibility that is not usually provided by commercial systems. They are more transparent, which leads to a better understanding of how one’s data is being generated. Finally, by encouraging researchers to document and share tools they would otherwise keep to themselves, the open-source community reduces redundant development efforts, thereby increasing overall scientific productivity.”
– Jakob Voigts

Open Ephys features devices such as the flexDrive, a “chronic drive implant for extracellular electrophysiology”, as well as an Arduino-based tetrode twister and the Pulse Pal, which generates precise voltage pulses. Also featured on Open Ephys is software such as Symphony, a MATLAB-based data acquisition system for electrophysiology.
The project’s code is hosted on the Open Ephys GitHub.

Eco-HAB

February 12, 2017 

Dr. Ewelina Knapska from the Nencki Institute of Experimental Biology in Warsaw, Poland, has shared the following regarding Eco-HAB, an RFID-based system for automated tracking:


Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.
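To make the sociability measure concrete, here is a minimal Python sketch, not taken from Eco-HAB’s own software: it assumes each mouse’s RFID readings have already been reduced to a per-second list of compartment labels, and it compares the observed time a pair of mice spends together against what their independent occupancy patterns would predict.

from collections import Counter

def time_together(pos_a, pos_b):
    # pos_a, pos_b: per-second compartment labels for two mice
    return sum(a == b for a, b in zip(pos_a, pos_b))

def expected_together(pos_a, pos_b):
    # chance co-occupancy if the two mice moved independently
    n = min(len(pos_a), len(pos_b))
    freq_a, freq_b = Counter(pos_a[:n]), Counter(pos_b[:n])
    return sum(freq_a[c] * freq_b[c] for c in freq_a) / n

def sociability(pos_a, pos_b):
    # observed minus expected fraction of time spent together
    n = min(len(pos_a), len(pos_b))
    return (time_together(pos_a[:n], pos_b[:n]) - expected_together(pos_a, pos_b)) / n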


Janelia Automatic Animal Behavior Annotator (JAABA)

December 30, 2016

Mayank Kabra has shared the following about JAABA, a machine learning-based behavior detection system developed by the Branson Lab at the HHMI Janelia Research Campus.


The Janelia Automatic Animal Behavior Annotator (JAABA) is a machine learning-based system that enables researchers to automatically detect behaviors in video of behaving animals. Users begin by labeling the behavior of the animal, e.g. walking, grooming, or following, in a few of the video frames. Using machine learning, JAABA learns behavior detectors from these labels that can then be used to automatically classify the behaviors of animals in other videos. JAABA has a fast and intuitive graphical user interface and offers multiple ways for users without a machine learning background to visualize and understand the classifiers. JAABA has enabled the extraction of detailed, scientifically meaningful measurements of behavioral effects in large experiments.
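JAABA’s own detectors are built with boosting over trajectory “window features”; the generic Python sketch below (scikit-learn stands in for JAABA’s learner, and the features are random stand-ins) only illustrates the label-a-few-frames, train, then auto-annotate workflow described above.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 20))       # per-frame features (random stand-ins)
labels = np.full(5000, -1)                   # -1 marks unlabeled frames
labels[:200] = rng.integers(0, 2, size=200)  # a few frames labeled by the user

labeled = labels >= 0
clf = GradientBoostingClassifier().fit(features[labeled], labels[labeled])

predicted = clf.predict(features[~labeled])  # auto-annotate the remaining frames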

Behavioral Observation Research Interactive Software (BORIS)

Olivier Friard of the University of Turin has generously shared the following about BORIS:


BORIS is an open-source, multiplatform, standalone program that provides a user-specific coding environment for computer-based review of previously recorded videos or for live observations. The program lets the user define a highly customised, project-based ethogram that can then be shared with collaborators, imported, or modified. Once the coding process is completed, the program can automatically extract a time budget from single or grouped observations and present an at-a-glance summary of the main behavioural features. The observation data and analyses can be exported in various formats (e.g., CSV or XLS). The behavioural events can also be plotted and exported in various graphic formats (e.g., PNG, EPS, and PDF).
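As an illustration of what a time budget is, the short Python sketch below totals the duration of each behaviour per subject from an exported table of timed START/STOP events; the column names are assumptions made for the example, not BORIS’s exact export schema.

import csv
from collections import defaultdict

def time_budget(path):
    starts, totals = {}, defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):          # column names are assumptions
            key = (row["Subject"], row["Behavior"])
            if row["Status"] == "START":
                starts[key] = float(row["Time"])
            elif row["Status"] == "STOP":
                totals[key] += float(row["Time"]) - starts.pop(key)
    return dict(totals)                        # seconds per (subject, behaviour)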

BORIS is currently used around the world for strikingly different approaches to the study of animal and human behaviour.

The latest version of BORIS can be downloaded from www.boris.unito.it, where the manual and some tutorials are also available.



Friard, O. and Gamba, M. (2016). BORIS: a free, versatile open-source event-logging software for video/audio coding and live observations. Methods in Ecology and Evolution, 7: 1325–1330. doi:10.1111/2041-210X.12584

Bonsai

Gonçalo Lopes of the Champalimaud Neuroscience Programme has shared the following about Bonsai:

Bonsai is an open-source visual programming language for composing reactive and asynchronous data streams coming from video cameras, microphones, electrophysiology systems or data acquisition boards. It gives the experimenter an easy way not only to collect data from all these devices simultaneously, but also to quickly and flexibly chain together real-time processing pipelines that can be fed back to actuator systems for closed-loop stimulation using sound, images or other kinds of digital and analog output.
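Bonsai pipelines are composed graphically rather than in code, but the following Python/OpenCV loop gives a rough feel for the kind of closed-loop dataflow described above: a source produces frames, transforms process them, and a condition gates an output. The camera index and brightness threshold here are arbitrary assumptions, not anything from Bonsai itself.

import cv2

cap = cv2.VideoCapture(0)                          # source: first attached camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # transform: grayscale
    if gray.mean() > 100:                          # condition: arbitrary brightness threshold
        pass                                       # sink: a real pipeline would trigger sound or analog output here
    cv2.imshow("preview", frame)                   # visualizer
    if cv2.waitKey(1) == 27:                       # press Esc to stop
        break
cap.release()
cv2.destroyAllWindows()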
Bonsai is being used around the world to design all kinds of behavioral experiments from eye-tracking to virtual reality, spanning multiple model organisms from fish to humans.

The latest version of Bonsai can be downloaded from BitBucket, along with instructions for quickly setting up a working system.
A public user forum is also available where you can leave your questions and feedback about how to use Bonsai.

Scintillate

Ian Dublon, a former postdoctoral researcher, has shared the following about Scintillate:



The program itself is relatively simple and originated from a real need to rapidly appraise 2D calcium imaging datasets, and from frustration with existing acquisition software that relies exclusively on defined regions of interest (ROIs). There is, of course, nothing wrong with ROIs, but it is often useful to rapidly appraise the whole image matrix; it really depends on the biological question being asked.

At its simplest, when you open a stacked TIFF, Scintillate uses MATLAB’s imabsdiff to look at the absolute difference across the image matrix for successive frames in the stack. Several other more powerful tools, including pre-stimulus background subtraction and ICA (using the excellent FastICA), are provided to help the user ascertain the value of the studied preparation. This helps to confirm and pinpoint areas of signal change, allowing the imager to adjust image acquisition parameters or simply start afresh.
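Scintillate itself is MATLAB code, but the core step translates directly; this Python sketch (using the tifffile package and an example file name, both assumptions for illustration) computes the same successive-frame absolute difference that imabsdiff provides, plus a simple pre-stimulus background subtraction.

import numpy as np
import tifffile

stack = tifffile.imread("recording.tif").astype(np.int32)  # shape: (frames, height, width)

# absolute difference between successive frames, as with MATLAB's imabsdiff
diffs = np.abs(np.diff(stack, axis=0))

# pre-stimulus background subtraction, assuming the first 10 frames are baseline
baseline = stack[:10].mean(axis=0)
delta_f = stack - baseline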

Written as a GUI in MATLAB, with source and GUIDE files provided alongside a getting-started manual, it is designed both for image acquisition people and for MATLAB coders. If the MATLAB Compiler toolbox is present, it is possible to package a version of Scintillate that will run without a MATLAB installation, making it ideal for running on the image acquisition workstation. Simply provide it with an acquired stacked TIFF and, within three or so clicks and with no command-line syntax, you are ready to go.

Code is hosted on GitHub at github.com/dublon/scintillate. Scintillate uses some freely available third-party code, which is gladly acknowledged throughout. Scintillate is not designed to replace traditional analysis methods but rather to provide an open-source means of rapid evaluation during the pre-processing stage. It is hoped that, by hosting it on GitHub, it may be developed further and adjusted to suit each person’s individual imaging needs.

Oculomatic Eye-Tracking

Jan Zimmermann, a postdoctoral fellow in the Glimcher Lab at New York University, has shared the following about Oculomatic:

Video-based noninvasive eye trackers are an extremely useful tool for many areas of research. Many open-source eye trackers are available, but current open-source systems are not designed to track eye movements with the temporal resolution required to investigate the mechanisms of oculomotor behavior. Commercial systems are available but employ closed-source hardware and software and are relatively expensive, thereby limiting widespread use.

Here we present Oculomatic, an open-source software and modular hardware solution for eye tracking in humans and non-human primates. Our fully modular hardware implementation relies on machine-vision USB3 camera systems paired with affordable lens options from the surveillance sector. Our cross-platform software implementation (C++) relies on openFrameworks and OpenCV, and uses binary image statistics (following Green’s theorem) to compute pupil location and diameter. Oculomatic makes use of data acquisition devices to output eye position as calibrated analog voltages for direct integration into the electrophysiological acquisition chain.
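The binary-image-statistics step can be sketched in a few lines. The Python/OpenCV version below is a simplification, not the Oculomatic C++ source, and the threshold value is an arbitrary assumption: it thresholds the dark pupil, then derives its centroid and circle-equivalent diameter from image moments.

import cv2
import numpy as np

def pupil_from_frame(gray, thresh=40):     # threshold value is an assumption
    # dark pupil becomes the foreground of a binary image
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(binary, binaryImage=True)
    if m["m00"] == 0:                      # no pupil pixels found
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # centroid from moments
    area = cv2.countNonZero(binary)
    diameter = 2.0 * np.sqrt(area / np.pi)             # circle-equivalent diameter
    return cx, cy, diameter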

Oculomatic features high temporal resolution (up to 600 Hz), real-time eye tracking with high spatial accuracy (< 0.5°), and low system latency (< 1.8 ms) at a relatively low cost (< 1,000 USD). We tested Oculomatic during regular monkey training and task performance in two independent laboratories and compared its performance to existing scleral search coils. While fully non-invasive, Oculomatic compared favorably to the gold-standard search coils, with only a slight decrease in spatial accuracy.

We propose that this system can support a range of research into the properties and neural mechanisms of oculomotor behavior, as well as provide an affordable tool for further scaling non-human primate electrophysiology.



The most recent version of the software can be found at: https://github.com/oculomatic/oculomatic-release.


Zimmermann, J., Vazquez, Y., Glimcher, P. W., Pesaran, B., & Louie, K. (2016). Oculomatic: High speed, reliable, and accurate open-source eye tracking for humans and non-human primates. Journal of Neuroscience Methods, 270, 138-146.

Laubach Lab GitHub Repository

The Laubach Lab at American University investigates executive control and decision making, focusing on the role of the prefrontal cortex. Through their GitHub repository, these researchers provide 3D print files for many of the behavioral devices used in their lab, including a nosepoke and a lickometer designed for rats. The repository also includes a script that reads MedPC data files into Python in a usable form.
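Their script is the reference implementation; as a rough idea of the task, the minimal Python sketch below pulls the lettered array variables out of a MedPC-style session file (the exact file layout handled by the lab’s script may differ).

def read_medpc(path):
    """Parse array variables (A:, B:, ...) from a MedPC-style session file."""
    arrays, current = {}, None
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if len(line) == 2 and line[0].isalpha() and line[1] == ":":
                current = line[0]                # start of an array block, e.g. "A:"
                arrays[current] = []
            elif current and line and line[0].isdigit() and ":" in line:
                # continuation rows look like "0:   5.000   3.000 ..."
                arrays[current].extend(
                    float(v) for v in line.split(":", 1)[1].split())
            else:
                current = None                   # metadata or blank line ends the block
    return arrays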

WaveSurfer

WaveSurfer is an open-source application for neurophysiology data acquisition and analysis. The program is written in MATLAB and evolved from an earlier open-source software package called Ephus. WaveSurfer is currently in pre-release, but it can be downloaded from the WaveSurfer webpage or the WaveSurfer GitHub repository.

The project was initiated by the Svoboda Lab, and developed as a collaborative effort between several research groups at the Howard Hughes Medical Institute’s Janelia Research Campus. Janelia is a major proponent of collaboration and open-science, providing documentation for dozens of tools and innovations developed on their campus through their webpage, including several tools specific to behavioral neuroscience research.