
The Future is Open

August 16, 2019

This week’s post is about the current state of OpenBehavior (OB) and ongoing efforts within the open source neuroscience community. Next week, we will resume posting about new tools.

Samantha White, Linda Amarante, Lex Kravitz, and Mark Laubach published a commentary in eNeuro last week about how open-source tools are being used in neuroscience. In it, we reported on our experiences running OB since the summer of 2016, the many wonderful projects that we have posted about over the past three years, two surveys that we conducted on our site and on open-source tool use in general, and some observations on the mindset that comes from making and using open-source tools. The paper is available at https://www.eneuro.org/content/6/4/ENEURO.0223-19.2019.

The timing of our commentary, and the social media attention it generated (e.g. https://twitter.com/samantha6rose/status/1159913815393341440), was especially nice, as we have been working to expand OB to better serve the research community and hope to find external support for the project. We would like to address an outstanding problem: it is not currently possible to systematically track the development and use of open-source hardware and software in neuroscience research. To address this issue, we would like to create a database of existing open-source projects, characterize them using a newly developed “taxonomy” based on their functions (video analysis, behavioral control systems, hardware for measuring or controlling behavior), and register projects using the SciCrunch RRID registry.

If you haven’t heard of SciCrunch, you should check it out: https://scicrunch.org/. It’s an awesome project that tracks the usage of research tools such as antibodies. RRIDs are citable and, if developed for open-source hardware and software, would allow developers to track how their tools are used in neuroscience publications. This could provide both an incentive for sharing and a metric (RRID citations) of tool use across publications.

We are also planning to work with the Society for Neuroscience (SfN) to increase public awareness of neuroscience research by participating in SfN-sponsored advocacy and outreach events, facilitating discussions of open source tools through a new discussion topic in the Neuronline forums (more news on that soon), and continuing to provide curated itineraries on open source tools for attendees of the annual SfN meeting.

 

Pathfinder

August 8, 2019

Matthew Cooke and colleagues from Jason Snyder’s lab at the University of British Columbia recently developed Pathfinder, open-source software for detecting spatial navigation behaviors in animals:


Spatial navigation is studied in animals across several different paradigms and for different purposes; by analyzing spatial behavior we can gain insight into how an animal learns a task, how it changes its approach strategy, and how it performs goal-directed behaviors more generally. Pathfinder is open-source software for analyzing rodent navigation. It automatically classifies patterns of navigation as a rodent performs a task, picking up subtle patterns in spatial behavior that simpler analysis measures may miss. Specifically, many water maze analyses use escape latency or path length as the measure of performance, but as the authors point out, the time it takes to reach the platform may not differ even when the strategy does, so latency is not always the best measure of an animal’s strategy, and experimenters may miss key differences in behavior. Pathfinder therefore analyzes more subtle aspects of each trial to determine differences in spatial navigation and strategy.

Originally intended for water maze navigation, Pathfinder can also be used to analyze many other spatial behaviors across different tasks, mazes, and species. The software takes x-y coordinates from behavior tracking software (for example, it can open files from Noldus EthoVision, Actimetrics’ WaterMaze, Stoelting’s ANY-maze, and the open-source project ezTrack from Denise Cai’s lab) and then calculates the best-fit search strategy for each rodent’s trial. For the Morris water maze task, trials are fit into several categories: Direct Swim, Directed Search, Focal Search, Spatial Indirect, Chaining, Scanning, Thigmotaxis, and Random Search.

Pathfinder runs in Python and has an easy-to-use GUI; many aspects and parameters can be adjusted to analyze different tasks or behaviors.
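
To give a concrete sense of the kind of input Pathfinder works from, here is a minimal Python sketch that loads an x-y trajectory and computes two simple water maze measures (path length and mean distance to the platform) of the sort that strategy classification builds on. The file name, column names, and platform coordinates are made up for illustration; Pathfinder’s GUI and strategy classifier go well beyond this.

```python
# Illustrative only: a quick look at the x-y trajectory data Pathfinder consumes,
# plus two simple water-maze measures. File name, column names, and platform
# coordinates are hypothetical.
import numpy as np
import pandas as pd

trial = pd.read_csv("trial_01_xy.csv")          # hypothetical export with Time, X, Y columns
x, y = trial["X"].to_numpy(), trial["Y"].to_numpy()

platform = np.array([25.0, -15.0])              # hypothetical platform location (cm)

steps = np.hypot(np.diff(x), np.diff(y))        # distance moved between consecutive frames
path_length = steps.sum()                       # total swim path length
dist_to_platform = np.hypot(x - platform[0], y - platform[1])

print(f"path length: {path_length:.1f} cm")
print(f"mean distance to platform: {dist_to_platform.mean():.1f} cm")
```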

For more details, check out their bioRxiv preprint here.

There’s a nice (humorous!) writeup of the project on the Snyder Lab website.

You can also download the project and view more details on their GitHub:
https://matthewbcooke.github.io/Pathfinder/

https://github.com/MatthewBCooke/Pathfinder/


RAD

August 1, 2019

In a recent eNeuro article, Bridget Matikainen-Ankney and colleagues from the Kravitz Lab describe the Rodent Activity Detector (RAD), a low-cost device they developed and shared for tracking and recording activity in rodent home cages.


Physical activity is an important determinant of human health and a common measure in many research studies. Current methods for measuring physical activity in laboratory rodents have limitations, including high expense, specialized caging and equipment, and high computational overhead. To address these limitations, Matikainen-Ankney et al. designed an open-source and cost-effective device for measuring rodent activity.

In their new manuscript, they describe the design and implementation of RAD. The system allows for high-throughput installation, minimal investigator intervention, and circadian monitoring. The design includes a battery-powered passive infrared (PIR) sensor, a microcontroller, a microSD card logger, and an OLED screen for displaying data. All of the build instructions for RAD manufacture and programming, including the Arduino code, are provided on the project’s website.

The system records the number of PIR active bouts and the total duration the PIR is active each minute. The authors report that RAD is most useful for quantifying changes across minutes rather than on a second-to-second timescale, so the default data-logging frequency is set to one minute. The resulting CSV files can be viewed and the data visualized using the provided Python scripts. Validation against video monitoring showed that the PIR data correlated strongly with speed and that the device captured place-to-place locomotion but not slow or in-place movements. To verify the device’s utility, RAD was used to collect physical activity data from 40 animals for 10 weeks; it detected high-fat diet (HFD)-induced changes in activity and quantified individual animals’ circadian rhythms. Several major advantages of this tool are that the PIR sensor is not triggered by activity in other cages, that it can detect and quantify within-mouse activity changes over time, and that little investigator intervention is needed beyond infrequent battery replacement. Although the design was optimized for the lab’s specific caging, the open-source nature of the project makes it easy to modify.
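
To give a sense of what working with RAD’s output looks like, here is a minimal Python sketch in the spirit of the provided analysis scripts: it loads a one-row-per-minute activity log and plots an hourly activity profile. The file name and column names are assumptions rather than RAD’s actual output format.

```python
# A minimal sketch of the kind of analysis run on RAD's one-row-per-minute CSV logs.
# File name and column names (timestamp, bouts, active_sec) are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

log = pd.read_csv("rad_cage01.csv", parse_dates=["timestamp"])
log = log.set_index("timestamp")

hourly = log["active_sec"].resample("1H").sum()   # collapse minute bins into hourly activity

hourly.plot()
plt.xlabel("time")
plt.ylabel("PIR active seconds per hour")
plt.title("Home-cage activity (circadian profile)")
plt.tight_layout()
plt.show()
```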

More details on RAD can be found in their eNeuro manuscript here, and all documentation can also be found on the project’s Hackaday.io page.


Matikainen-Ankney, B. A., Garmendia-Cedillos, M., Ali, M., Krynitsky, J., Salem, G., Miyazaki, N. L., … Kravitz, A. V. (2019). Rodent Activity Detector (RAD), an Open Source Device for Measuring Activity in Rodent Home Cages. eNeuro, 6(4). https://doi.org/10.1523/ENEURO.0160-19.2019

MouseMove

July 18, 2019

In a 2015 Scientific Reports article, Andre Samson and colleagues shared their project MouseMove, open-source software for quantifying movement in the open field test:


The Open Field (OF) test is a commonly used assay for monitoring exploratory behavior and locomotion in rodents. Most research groups use commercial systems for recording and analyzing behavior in the OF test, but these systems can be expensive and lack flexibility. A few open-source OF systems have been developed, but they are limited in the movement parameters they can collect and analyze. MouseMove is the first open-source software capable of providing qualitative and quantitative information on mouse locomotion in a semi-automated, high-throughput manner. With the aim of providing a freely available program for analyzing OF test data, the researchers developed software that accurately quantifies numerous parameters of movement.

In their manuscript, Samson et al. describe the design and implementation of MouseMove. Their OF system measures distance, speed, and laterality with >96% accuracy. They use MouseMove to analyze the OF behavior of mice after experimental stroke, demonstrating reduced locomotor activity and quantifying laterality deficits. The system works in combination with the open-source program ImageJ and its MTrack2 plugin to analyze pre-recorded OF test video.

The system has two downloadable components: an ImageJ macro and a separate program with the custom-built MouseMove GUI. ImageJ is used to subtract the background from the experimental video and create an image of the animal’s total trajectory. The MouseMove GUI then performs a detailed analysis of the movement patterns, measuring the fraction of time spent stationary, the distance traveled, the mean speed, and various measures of laterality. The results are presented both in visual/graphical form and as a saveable text file. The manuscript provides step-by-step instructions on how to use MouseMove. The authors additionally highlight the software’s region-of-interest (ROI) capability, which makes it suitable for analyzing cognitive tests such as novel object recognition. The tool offers relatively fast video processing of motor and cognitive behaviors and has many applications for studying altered locomotion in rodent models of brain injury or stimulation.
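
As an illustration of the movement measures MouseMove reports, the short NumPy sketch below computes distance, mean speed, fraction of time stationary, and a simple left/right turn bias from a generic x-y trajectory. This is not MouseMove’s own code (which runs as an ImageJ macro plus a standalone GUI), and the frame rate, file format, and stationarity threshold are assumptions.

```python
# Generic movement measures from an x-y trajectory; frame rate, file format,
# and the stationarity threshold are hypothetical.
import numpy as np

fps = 30.0                                          # assumed camera frame rate
xy = np.loadtxt("trajectory.csv", delimiter=",")    # hypothetical N x 2 file of x, y positions (cm)

steps = np.diff(xy, axis=0)
step_len = np.hypot(steps[:, 0], steps[:, 1])

distance = step_len.sum()
speed = step_len * fps
stationary_frac = np.mean(speed < 0.5)              # assumed threshold: < 0.5 cm/s counts as stationary

# Laterality: sign of the heading change between successive steps (+ left, - right).
heading = np.arctan2(steps[:, 1], steps[:, 0])
turns = np.angle(np.exp(1j * np.diff(heading)))     # wrap heading changes to (-pi, pi]
left_bias = np.mean(turns > 0)

print(f"distance {distance:.1f} cm, mean speed {speed.mean():.1f} cm/s, "
      f"stationary {stationary_frac:.0%}, left turns {left_bias:.0%}")
```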

 

More information on MouseMove can be found in their manuscript here.


Samson, A. L., Ju, L., Ah Kim, H., Zhang, S. R., Lee, J. A. A., Sturgeon, S. A., … Schoenwaelder, S. M. (2015). MouseMove: an open source program for semi-automated analysis of movement and cognitive testing in rodents. Scientific Reports, 5, 16171.  doi: 10.1038/srep16171

HOPE

July 12, 2019

Sebastien Delcasso from the Graybiel lab at MIT published a method for building a brain implant called HOPE, which combines optogenetics, pharmacology, and electrophysiology:


HOPE (hybrid-drive combining optogenetics, pharmacology, and electrophysiology) is a method that simplifies the construction of a drivable, multi-purpose recording implant. HOPE is a new type of implant that can support up to 16 tetrodes and allows recordings from two different brain areas in a mouse at the same time, along with simultaneous optogenetic or pharmacological manipulation. The HOPE implants are open-source: they can be recreated in CAD software and then 3D printed, drastically lowering the cost of an electrophysiological implant. Additionally, instead of waiting months for a custom-made implant, these can be printed within a few hours.

The manuscript provides detailed instructions for constructing the implant and allows users to modify it for their own needs (for example, it can be adapted for use in rats or non-human primates). HOPE is meant for experiments that pair electrophysiological recordings with either optogenetic or pharmacological manipulations, which will open the door to many more experiments. The implant is intended for microdrive recordings and is made up of only six 3D-printed parts, an electrode interface board (EIB), and five screws.

The authors validate the implant by recording striatal neurons, by using transgenic PV-Cre mice to optogenetically inhibit parvalbumin interneurons, and by infusing muscimol into the striatum in a head-fixed mouse preparation. HOPE is a novel open-source neural implant that can be paired with multiple methods (recordings, optogenetics, and pharmacology) to manipulate and then record brain activity.

 

 

More details of their implant can be found on their project site and on the project GitHub.


Delcasso, S., Denagamage, S., Britton, Z., & Graybiel, A. M. (2018). HOPE: Hybrid-Drive Combining Optogenetics, Pharmacology and Electrophysiology. Frontiers in Neural Circuits, 12, 41.

 

Teensy-based Interface

July 3, 2019

Michael Romano and colleagues from the Han Lab at Boston University recently published a project that uses a Teensy microcontroller to control an sCMOS camera with high temporal precision during behavioral experiments:


Teensy microcontrollers are becoming increasingly popular and widespread in the neuroscience community. One benefit of using a Teensy is its ease of programming for those with little programming experience, since it is programmed in the Arduino/C++ language. Another benefit is that it can receive and send time-precise signals. Romano et al. developed a flexible Teensy 3.2-based interface for data acquisition and for delivery of analog and digital signals during a rodent locomotion-tracking experiment and a trace eyeblink conditioning experiment. The group also shows how the interface can be paired with optical calcium imaging. The setup integrates an sCMOS camera with behavioral experiments, and the interface is rather user-friendly.

The Teensy interface ensures that the data are temporally precise, and it can deliver digital signals with microsecond precision to trigger image capture on a paired sCMOS camera. Calcium imaging can be performed during the eyeblink conditioning experiment: pulses sent from the Teensy to the camera captured calcium activity in the hippocampus at 20 Hz. Additionally, the group shows that the Teensy interface can generate analog sound waveforms to drive speakers for the eyeblink experiment. The study shows how an inexpensive piece of lab equipment, like a simple Teensy microcontroller, can drive multiple aspects of a neuroscience experiment, and it provides inspiration for future experiments that use microcontrollers to control behavior.
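
As a small analysis-side illustration of the temporal precision being described (not the Teensy firmware itself), here is a Python sketch that checks logged camera-trigger timestamps against the expected 20 Hz rate. The log file name and its format (one microsecond timestamp per line) are assumptions.

```python
# Sanity-check the timing of logged camera-trigger pulses against a 20 Hz target.
# The log file and its format are hypothetical.
import numpy as np

t_us = np.loadtxt("frame_trigger_times_us.txt")   # hypothetical microsecond timestamps, one per line
ipi_ms = np.diff(t_us) / 1000.0                   # inter-pulse intervals in milliseconds

print(f"mean interval: {ipi_ms.mean():.3f} ms (50 ms expected for 20 Hz)")
print(f"jitter (s.d.): {ipi_ms.std():.3f} ms")
print(f"worst deviation from 50 ms: {np.abs(ipi_ms - 50.0).max():.3f} ms")
```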

 

For more details on the project, check out the project’s GitHub here.

 

Romano, M., Bucklin, M., Gritton, H., Mehrotra, D., Kessel, R., & Han, X. (2019). A Teensy microcontroller-based interface for optical imaging camera control during behavioral experiments. Journal of Neuroscience Methods, 320, 107-115.

 

optoPAD

June 27, 2019

Carlos Ribeiro’s lab at Champalimaud recently published their new project called optoPAD in eLife:


Analyses of behavior and of neural activity both need to be time-precise in order to correlate or compare the two. The analysis of behavior can be done through many methods (as seen in the many projects featured on this site!). The Ribeiro lab previously published flyPAD (Itskov et al., 2014), a system for automated analysis of feeding behavior in Drosophila with high temporal precision. However, in order to manipulate specific feeding behaviors, the group wanted to go one step further and perturb neural activity during feeding, and they needed a method precise enough to compare with behavior.

In their new manuscript, Moreira et al. describe the design and implementation of a high-throughput system for closed-loop optogenetic manipulation of neurons in Drosophila during feeding behavior. Named optoPAD, the system allows for perturbation of specific groups of neurons. They use optoPAD to induce appetitive and aversive effects on feeding by activating or inhibiting gustatory neurons in a closed-loop manner. optoPAD combines the previous flyPAD system with an additional method for driving LEDs for optogenetic perturbation. They also combined their system with Bonsai, an existing open-source framework for behavioral analysis.

The system first uses flyPAD to measure the fly’s interaction with the food presented in an experiment. Bonsai detects when the fly interacts with a food electrode and then sends a signal to a microcontroller, which turns on an LED for optogenetic perturbation of neurons in the fly. The authors additionally highlight the flexibility and expandability of the optoPAD system. They describe how flyPAD, once published and then implemented in an optogenetics framework by their group, was successfully adapted by another group, a great example of the benefit of open-source sharing of projects.
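
For readers unfamiliar with closed-loop designs, here is a conceptual Python sketch of the rule optoPAD implements. In the real system this logic runs in Bonsai with the flyPAD hardware, so the serial port, command bytes, and the simulated touch signal below are purely illustrative.

```python
# Conceptual closed-loop rule: LED on while the fly contacts the food electrode.
# Serial port, baud rate, command bytes, and the simulated contact signal are assumptions.
import time
import serial   # pyserial

# Stand-in for the capacitive touch signal flyPAD provides (True = fly on electrode).
simulated_contact = [False, False, True, True, True, False, True, False]

with serial.Serial("/dev/ttyACM0", 115200, timeout=0.01) as mcu:
    led_on = False
    for touching in simulated_contact:
        if touching and not led_on:
            mcu.write(b"L1\n")   # hypothetical 'LED on' command to the microcontroller
            led_on = True
        elif not touching and led_on:
            mcu.write(b"L0\n")   # hypothetical 'LED off' command
            led_on = False
        time.sleep(0.01)         # poll at ~100 Hz
```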

 

Details on the hardware and software can be found on the Ribeiro lab GitHub. More details on flyPAD, the original project, can be found on its GitHub as well.

Information on flyPAD can also be found on the flyPAD website and in the flyPAD paper.


Moreira, J. M., Itskov, P. M., Goldschmidt, D., Steck, K., Walker, S. J., & Ribeiro, C. (2019). optoPAD: a closed-loop optogenetics system to study the circuit basis of feeding behaviors. eLife, doi: 10.7554/eLife.43924

DeepBehavior

June 20, 2019

Ahmet Arac from Peyman Golshani’s lab at UCLA recently developed DeepBehavior, a deep-learning toolbox with post-processing methods for video analysis of behavior:


Recently, there has been a major push for more fine-grained and detailed behavioral analysis in neuroscience. While there are methods for capturing high-speed, high-quality video to track behavior, the data still need to be processed and analyzed. DeepBehavior is a deep learning toolbox that automates this process; its main purpose is to analyze and track behavior in rodents and humans.

The authors provide three different convolutional neural network models (TensorBox, YOLOv3, and OpenPose), which were chosen for their ease of use, and the user can decide which model to implement based on the experiment and the kind of data they aim to collect and analyze. The article provides methods and tips on how to train neural networks with this type of data and describes methods for post-processing of image data.

In the manuscript, the authors give examples of using DeepBehavior in five behavioral tasks in both animals and humans. For rodents, they use a food-pellet reaching task, a three-chamber test, and social interaction between two mice. In humans, they use a reaching task and a supination/pronation task. They provide 3D kinematic analysis in all tasks and show that a transfer-learning approach accelerates network training when images from the behavior videos are used. A major benefit of this tool is that it can be modified and generalized across behaviors, tasks, and species. Additionally, DeepBehavior uses several different neural network architectures and, unlike previously published toolboxes for video analysis of behavior, provides post-processing methods for 3D kinematic analysis. Finally, the authors emphasize the potential for using the toolbox in clinical settings to analyze human motor function.
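
To illustrate the kind of post-processing step that 3D kinematic analysis involves, here is a generic linear (DLT) triangulation sketch in NumPy that recovers a 3D point from the same body part detected in two calibrated cameras. The projection matrices and pixel coordinates are placeholders, and this is not code from the DeepBehavior toolbox itself.

```python
# Generic two-camera triangulation of a tracked body part (linear DLT method).
# Calibration matrices and detections below are placeholders for illustration.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Least-squares 3D point from two 3x4 camera projection matrices and
    matching (u, v) image coordinates in each view."""
    A = np.stack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # homogeneous -> Euclidean coordinates

# Placeholder calibration and detections.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                  # camera 1 at the origin
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])  # camera 2 offset by 10 cm
paw_cam1, paw_cam2 = (0.42, 0.31), (0.38, 0.31)                # normalized image coordinates

print("paw position (x, y, z):", triangulate(P1, P2, paw_cam1, paw_cam2))
```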

 

For more details, take a look at their project’s Github.

All three models used in the paper also have their own GitHub repositories: TensorBox, YOLOv3, and OpenPose.


Arac, A., Zhao, P., Dobkin, B. H., Carmichael, S. T., & Golshani, P. (2019). DeepBehavior: A deep learning toolbox for automated analysis of animal and human behavior imaging data. Frontiers in Systems Neuroscience, 13.

 

ezTrack

June 13, 2019

Zach Pennington from Denise Cai’s lab at Mt. Sinai recently posted a preprint describing their latest open-source project called ezTrack:


ezTrack is an open-source, platform-independent set of behavior analysis pipelines built on interactive Python (iPython/Jupyter Notebook) that researchers with no prior programming experience can use. It is a sigh of relief for researchers with little to no computer programming background: behavioral tracking analysis shouldn’t be limited to those with extensive programming knowledge, and ezTrack is a nice alternative to currently available software that requires a bit more programming experience. The manuscript and Jupyter notebooks are written in the style of a tutorial and are meant to provide straightforward instructions on implementing ezTrack. ezTrack also differs from other recent video analysis toolboxes in that it does not use deep learning algorithms and thus does not require training sets for transfer learning.

ezTrack can be used to analyze behavior videos of a single animal in different settings, and the authors provide examples of positional analysis across several tasks (place preference, water maze, open field, elevated plus maze, light-dark box, etc.), as well as analysis of freezing behavior. ezTrack provides frame-by-frame data output in .csv files, and users can crop the video frames to avoid issues with cables from optogenetic or electrophysiology experiments. ezTrack accepts multiple video formats, such as mpg1, wmv, avi, and more.

Aside from the benefit of being open-source, ezTrack has several major advantages. Notably, the tool is user-friendly in that it is accessible to researchers with little to no programming background. The user does not need to adjust many parameters of the toolbox, the data can be processed into interactive visualizations, and results are easily exported as .csv files. ezTrack is both operating-system and hardware independent and can be used across multiple platforms. Utilizing iPython/Jupyter Notebook also allows researchers to easily replicate their analyses.
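
As a rough illustration of the non-deep-learning approach ezTrack takes, here is a short OpenCV/NumPy sketch that builds a reference frame and locates a single animal on each frame as the center of the pixels that differ from it. The video file name, reference-frame sampling, and intensity threshold are assumptions, and the actual ezTrack notebooks add cropping, interactive visualization, and freezing analysis on top of this basic idea.

```python
# Background-reference tracking of a single animal, in the spirit of ezTrack's
# location-tracking notebooks. File name and threshold are hypothetical.
import cv2
import numpy as np

cap = cv2.VideoCapture("openfield.avi")        # hypothetical single-animal video
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()

reference = np.median(frames[::10], axis=0)    # median of every 10th frame approximates the empty arena

centroids = []
for gray in frames:
    diff = np.abs(gray.astype(float) - reference)
    mask = diff > 40                           # assumed intensity threshold
    ys, xs = np.nonzero(mask)
    if len(xs):
        centroids.append((xs.mean(), ys.mean()))  # animal position on this frame

print(f"tracked {len(centroids)} of {len(frames)} frames")
```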

Check out their GitHub with more details on how to use ezTrack: https://github.com/denisecailab/ezTrack


Pennington, Z. T., Dong, Z., Bowler, R., Feng, Y., Vetere, L. M., Shuman, T., & Cai, D. J. (2019). ezTrack: An open-source video analysis pipeline for the investigation of animal behavior. BioRxiv, 592592.