
Teensy-based Interface

July 3, 2019

Michael Romano and colleagues from the Han Lab at Boston University recently published their project, which uses a Teensy microcontroller to control an sCMOS camera with high temporal precision during behavioral experiments:


Teensy microcontrollers are becoming increasingly popular and widespread in the neuroscience community. One benefit of the Teensy is its ease of programming for those with little programming experience, since it is programmed in the Arduino/C++ language. Another benefit is that it can receive and send time-precise signals. Romano et al. developed a flexible Teensy 3.2-based interface for data acquisition and for delivery of analog and digital signals during a rodent locomotion-tracking experiment and a trace eyeblink conditioning experiment. The group also shows how the interface can be paired with optical calcium imaging. The setup integrates an sCMOS camera with behavioral experiments, and the interface is rather user-friendly.

The Teensy interface ensures that the behavioral data are temporally precise, and it can also deliver digital signals with microsecond precision to trigger image capture on a paired sCMOS camera. Calcium imaging can be performed during the eyeblink conditioning experiment: the Teensy sends pulses to the camera so that calcium activity in the hippocampus is captured at 20 Hz. Additionally, the group shows that the Teensy interface can generate analog sound waveforms to drive speakers for the eyeblink experiment. The study shows how an inexpensive piece of lab equipment, like a simple Teensy microcontroller, can drive multiple aspects of a neuroscience experiment, and it provides inspiration for future experiments that use microcontrollers to control behavior.
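To give a concrete sense of the signal generation involved, here is a minimal sketch in Python (the published interface itself runs on the Teensy in Arduino/C++) of a 20 Hz camera-trigger schedule and a pure-tone waveform of the kind used as an auditory stimulus. The sample rate, tone frequency, and duration below are placeholder values, not parameters from the paper.

```python
# Illustrative sketch of 20 Hz camera-trigger timing and a pure-tone waveform.
# Not the authors' Teensy firmware (which is written in Arduino/C++); all
# parameter values are placeholders chosen for illustration.
import numpy as np

FRAME_RATE_HZ = 20                                # imaging rate reported in the paper
TRIGGER_PERIOD_US = int(1e6 / FRAME_RATE_HZ)      # 50,000 microseconds between pulses

# Timestamps (in microseconds) of the trigger pulses in the first second.
trigger_times_us = np.arange(0, 1_000_000, TRIGGER_PERIOD_US)

# A pure tone such as might be played through a speaker as a conditioning stimulus.
SAMPLE_RATE_HZ = 44_100      # placeholder output sample rate
TONE_HZ = 5_000              # placeholder tone frequency
DURATION_S = 0.25            # placeholder tone duration
t = np.arange(0, DURATION_S, 1.0 / SAMPLE_RATE_HZ)
tone = np.sin(2 * np.pi * TONE_HZ * t)            # values in [-1, 1] for a DAC

print(f"{len(trigger_times_us)} trigger pulses in 1 s, {len(tone)} tone samples")
```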

 

For more details on the project, check out the project’s GitHub here.

 

Romano, M., Bucklin, M., Gritton, H., Mehrotra, D., Kessel, R., & Han, X. (2019). A Teensy microcontroller-based interface for optical imaging camera control during behavioral experiments. Journal of Neuroscience Methods, 320, 107-115.

 

optoPAD

June 27, 2019

Carlos Ribeiro’s lab at Champalimaud recently published their new project called optoPAD in eLife:


Analyses of behavior and of neural activity both need to be temporally precise in order to correlate or compare the two. The analysis of behavior can be done through many methods (as seen in many projects featured on this site!). The Ribeiro lab previously published flyPAD (Itskov et al., 2014), a system for automated analysis of feeding behavior in Drosophila with high temporal precision. To manipulate specific feeding behaviors, however, the group wanted to go one step further and perturb neural activity during feeding, and they needed a method precise enough to compare with behavior.

In their new manuscript, Moreira et al. describe the design and implementation of a high-throughput system for closed-loop optogenetic manipulation of neurons in Drosophila during feeding behavior. Named optoPAD, the system allows targeted perturbation of specific groups of neurons. The authors use optoPAD to induce appetitive and aversive effects on feeding by activating or inhibiting gustatory neurons in a closed-loop manner. OptoPAD combines the previous flyPAD system with an additional method for driving LEDs for optogenetic perturbation, and it works together with Bonsai, an open-source framework for processing behavioral data streams.

The system first uses flyPAD to measure the fly's interaction with the food offered in an experiment. Bonsai then detects when the fly touches a food electrode and sends a signal to a microcontroller, which turns on an LED to optogenetically perturb neurons in the fly. The authors additionally highlight the flexibility and expandability of the optoPAD system. They describe how flyPAD, once published and then extended into an optogenetics framework by their group, was successfully adapted by another group, which is a great example of the benefit of open-source sharing of projects.
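As a rough sketch of the closed-loop idea (not the published implementation, which runs in Bonsai with the optoPAD hardware), the Python snippet below monitors a touch/capacitance signal and commands an LED pulse over a serial link whenever a feeding interaction crosses a threshold. The serial port, threshold, and one-byte command protocol are illustrative assumptions.

```python
# Conceptual sketch of optoPAD-style closed-loop stimulation.
# NOT the published implementation (which uses Bonsai plus custom hardware);
# the serial port, threshold, and command byte are illustrative assumptions.
import time
import serial  # pyserial

PORT = "/dev/ttyACM0"     # hypothetical microcontroller port
THRESHOLD = 0.5           # hypothetical contact threshold (a.u.)
PULSE_MS = 50             # LED pulse duration requested from the microcontroller

def read_touch_signal(link):
    """Read one capacitance/touch sample streamed by the acquisition board."""
    line = link.readline().decode(errors="ignore").strip()
    try:
        return float(line)
    except ValueError:
        return 0.0

link = serial.Serial(PORT, 115200, timeout=0.1)
try:
    while True:
        sample = read_touch_signal(link)
        if sample > THRESHOLD:
            # A feeding interaction was detected: ask the board to pulse the LED.
            link.write(b"P")             # 'P' = pulse command (illustrative)
            time.sleep(PULSE_MS / 1000.0)
finally:
    link.close()
```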

 

Details on the hardware and software can be found at the Ribeiro lab GitHub. More details on flyPAD, the original project, can be found on its GitHub as well.

Information on flyPAD can also be found on the flyPAD website and in the flyPAD paper.


Moreira, J. M., Itskov, P. M., Goldschmidt, D., Steck, K., Walker, S. J., & Ribeiro, C. (2019). optoPAD: a closed-loop optogenetics system to study the circuit basis of feeding behaviors. eLife, doi: 10.7554/eLife.43924

DeepBehavior

June 20, 2019

Ahmet Arac from Peyman Golshani’s lab at UCLA recently developed DeepBehavior, a deep learning toolbox with post-processing methods for video analysis of behavior:


Recently, there has been a major push for more fine-grained and detailed behavioral analysis in neuroscience. While there are methods for capturing high-speed, high-quality video of behavior, the data still need to be processed and analyzed. DeepBehavior is a deep learning toolbox that automates this process; its main purpose is to analyze and track behavior in rodents and humans.

The authors provide three different convolutional neural network models (TensorBox, YOLOv3, and OpenPose), chosen for their ease of use, and the user can decide which model to implement based on the experiment and the kind of data they aim to collect and analyze. The article provides methods and tips on how to train neural networks with this type of data and gives methods for post-processing of image data.

In the manuscript, the authors give examples of applying DeepBehavior to five behavioral tasks in animals and humans. For rodents, they use a food-pellet reaching task, a three-chamber test, and social interaction between two mice. In humans, they use a reaching task and a supination/pronation task. They provide 3D kinematic analysis in all tasks and show that a transfer learning approach accelerates network training when images from the behavior videos are used. A major benefit of this tool is that it can be modified and generalized across behaviors, tasks, and species. Additionally, DeepBehavior uses several different neural network architectures and uniquely provides post-processing methods for 3D kinematic analysis, which sets it apart from previously published toolboxes for video-based behavioral analysis. Finally, the authors emphasize the potential of the toolbox for analyzing human motor function in clinical settings.
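To illustrate the kind of post-processing behind 3D kinematic analysis, here is a generic sketch of direct linear transform (DLT) triangulation, which recovers a 3D landmark position from matched 2D detections in two calibrated camera views. It is not code from the DeepBehavior toolbox, and the projection matrices below are placeholders.

```python
# Generic DLT triangulation of one landmark from two calibrated camera views.
# Illustrative only -- not code from the DeepBehavior toolbox.
import numpy as np

def triangulate_dlt(P1, P2, xy1, xy2):
    """Recover a 3D point from its 2D projections in two views.

    P1, P2 : 3x4 camera projection matrices (from calibration).
    xy1, xy2 : (x, y) pixel coordinates of the same landmark in each view.
    """
    x1, y1 = xy1
    x2, y2 = xy2
    A = np.vstack([
        x1 * P1[2] - P1[0],
        y1 * P1[2] - P1[1],
        x2 * P2[2] - P2[0],
        y2 * P2[2] - P2[1],
    ])
    # Solve A @ X = 0 in a least-squares sense via SVD; X is homogeneous.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Example with placeholder projection matrices (two unit cameras offset along x).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
print(triangulate_dlt(P1, P2, (0.25, 0.1), (0.0, 0.1)))   # recovers (1, 0.4, 4)
```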

 

For more details, take a look at their project’s Github.

All three models used in the paper also have their own GitHub repositories: TensorBox, YOLOv3, and OpenPose.


Arac, A., Zhao, P., Dobkin, B. H., Carmichael, S. T., & Golshani, P. (2019). DeepBehavior: A deep learning toolbox for automated analysis of animal and human behavior imaging data. Frontiers in Systems Neuroscience, 13.

 

ezTrack

June 13, 2019

Zach Pennington from Denise Cai’s lab at Mt. Sinai recently posted a preprint describing their latest open-source project called ezTrack:


ezTrack is an open-source, platform-independent set of behavioral analysis pipelines built on interactive Python (iPython/Jupyter Notebook) that researchers with no prior programming experience can use, and it comes as a sigh of relief for anyone with little or no programming background. Behavioral tracking analysis shouldn’t be limited to those with extensive programming knowledge, and ezTrack is a nice alternative to currently available software that may require a bit more programming experience. The manuscript and Jupyter notebooks are written in the style of a tutorial and are meant to give the user straightforward instructions for implementing ezTrack. ezTrack differs from other recent video analysis toolboxes in that it does not use deep learning algorithms and thus does not require training sets or transfer learning.

ezTrack can be used to analyze videos of a single animal in different settings, and the authors provide examples of positional analysis across several tasks (place preference, water maze, open field, elevated plus maze, light-dark box, etc.), as well as analysis of freezing behavior. ezTrack provides frame-by-frame data output in .csv files, and users can crop the video frames to remove issues such as cables from optogenetic or electrophysiology experiments. ezTrack accepts multiple video formats, such as mpg1, wmv, avi, and more.
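For a sense of what tracking without deep learning can look like, here is a minimal sketch (not ezTrack's actual code) that locates an animal in each frame by differencing against a reference frame and writes frame-by-frame coordinates to a .csv file. The video path, reference-frame choice, and threshold are placeholder assumptions to tune per video.

```python
# Minimal location-tracking sketch in the spirit of ezTrack (not its actual code).
# The video path, reference-frame choice, and threshold are illustrative assumptions.
import csv
import cv2

VIDEO = "open_field.avi"   # placeholder path
THRESH = 30                # pixel-difference threshold (0-255), tune per video

cap = cv2.VideoCapture(VIDEO)
ok, reference = cap.read()                        # use the first frame as the background
reference = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)

with open("tracking_output.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "x", "y"])
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, reference)       # where the animal differs from background
        _, mask = cv2.threshold(diff, THRESH, 255, cv2.THRESH_BINARY)
        m = cv2.moments(mask)
        if m["m00"] > 0:                          # centroid of the difference blob
            writer.writerow([frame_idx, m["m10"] / m["m00"], m["m01"] / m["m00"]])
        frame_idx += 1

cap.release()
```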

Aside from being open-source, ezTrack has several major advantages. Notably, it is user-friendly and accessible to researchers with little to no programming background. The user does not need to adjust many parameters, and the data can be processed into interactive visualizations and easily exported as .csv files. ezTrack is both operating-system and hardware independent and can be used across multiple platforms. Using iPython/Jupyter Notebook also allows researchers to easily replicate their analyses.

Check out their GitHub with more details on how to use ezTrack: https://github.com/denisecailab/ezTrack


Pennington, Z. T., Dong, Z., Bowler, R., Feng, Y., Vetere, L. M., Shuman, T., & Cai, D. J. (2019). ezTrack: An open-source video analysis pipeline for the investigation of animal behavior. bioRxiv, 592592.

3D Printed Headstage Implant

June 6, 2019

Richard Pinnell from Ulrich Hofmann’s lab has three publications centered on open-source, 3D-printed methods for protecting headstage implants and on portable, waterproof DBS and EEG systems to pair with water maze activity. We share details on the three studies below:


Most researchers opt to single-house rodents after surgery. This helps the wound heal and prevents damage to the implant. However, there are substantial benefits to social housing, as social isolation can be a stressor for rodents. As a way to continue socially housing rats, Pinnell et al. (2016a) created a novel 3D-printed headstage socket that surrounds the electrode connector. Rats with these implants and their protective caps could be successfully pair-housed.

The polyamide headcap socket itself is 3D printed, and a stainless-steel thimble screws into it; unscrewing the thimble reveals the electrode connector. The implant not only improves the rodent's well-being after surgery but also protects the electrode implant from damage during experiments and keeps it clean.

The 3D-printed headcap was used in a second study (Pinnell et al., 2016b) for wireless EEG recording in rats during a water maze task. The headstage socket housed the PCB electrode connector, to which the waterproof wireless system was attached. Under normal housing conditions, this waterproof attachment was replaced with a standard 18 × 9 mm stainless-steel sewing thimble, with 1.2 mm holes drilled at either end for attachment to the headstage socket. A PCB connector was manufactured to fit inside the socket; it contains an 18-pin ZIF connector, two DIP connectors, and an 18-pin Omnetics electrode connector that together provide the interface between the implanted electrodes and the wireless recording system.

Finally, the implant was used in a third study (Pinnell et al., 2018), in which the same group created a miniaturized, programmable deep-brain stimulation (DBS) device for use in the water maze. The portable stimulator was built around a custom PCB and paired with the 3D-printed headcap, which was modified from its use in Pinnell et al. (2016a) to completely cover the implant and protect the PCB. The device, including its battery and housing, weighs 2.7 g, offers protection from both the environment and other rats, and can be used for DBS during water maze behavior.

The portable stimulator design, the 3D-printed cap .stl files, and other files from the publications can be found at https://figshare.com/s/31122e0263c47fa5dabd.


Pinnell, R. C., Almajidy, R. K., & Hofmann, U. G. (2016a). Versatile 3D-printed headstage implant for group housing of rodents. Journal of Neuroscience Methods, 257, 134-138.

Pinnell, R. C., Almajidy, R. K., Kirch, R. D., Cassel, J. C., & Hofmann, U. G. (2016b). A wireless EEG recording method for rat use inside the water maze. PLoS ONE, 11(2), e0147730.

Low Cost Open Source Eye Tracking

May 30, 2019

On Hackaday, John Evans and colleagues have shared a design and build for an open-source eye-tracking system for human research.


We’ve wanted to expand our coverage of behavioral tools to include those used in human research. To get this rolling, we’d like to highlight an eye-tracking project that might be helpful to many labs, especially those without grant funds to collect pilot data. Check out Low Cost Open Source Eye Tracking: it uses open-source code, available from GitHub, and a pair of cheap USB cameras.
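As a generic illustration of how a cheap USB camera can be pressed into service for eye tracking (this is not the Hackaday project's code), the sketch below thresholds each frame for the dark pupil region and marks its center with OpenCV. The camera index and threshold value are assumptions to tune for your setup.

```python
# Minimal pupil-center sketch using a USB camera and OpenCV.
# Generic illustration only -- not the code from the Hackaday project.
import cv2

cap = cv2.VideoCapture(0)          # first USB camera (index is an assumption)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is usually the darkest region of the eye image; keep dark pixels.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        pupil = max(contours, key=cv2.contourArea)       # largest dark blob
        (x, y), r = cv2.minEnclosingCircle(pupil)
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
    cv2.imshow("eye", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```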

Check out the details on Hackaday.io and GitHub!


Evans, J. (2018). Low Cost Open Source Eye Tracking. Retrieved from https://hackaday.io/project/153293-low-cost-open-source-eye-tracking

OpenBehavior: Progress, user survey, future directions

May 23, 2019

The OpenBehavior Project launched in 2016 with the goal of accelerating research through the promotion of collaboration and sharing open-source tools for behavioral neuroscience research. The project is 100% non-commercial and all content has been generated by volunteer efforts. Prior to the launch of our site, access to design files and build instructions relied on word of mouth and isolated blogs and posts on social media. Last week, we posted for the 100th time on our blog site about open-source tools for behavioral neuroscience research.

A word cloud generated from survey respondents’ answers when asked about the strengths of OpenBehavior and why they might recommend it.

This week, we would like to report on the most active posts on the site and also on a survey on user experiences with OpenBehavior. Thanks to those who took part in the survey!

To date, the top five posts on OpenBehavior (based on web hits from unique URLs) have been on the UCLA Miniscope and four projects for video tracking and analysis: Automated Rodent Tracker (ART), FaceMap: Unsupervised analysis of rodent behaviors, Janelia Automatic Animal Behavior Annotator (JAABA), and Behavioral Observation Research Interactive Software (BORIS). We are preparing a more detailed report on these projects and their impact on current neuroscience research, which will be submitted for publication very soon.

Seventy-two people completed the survey. All users had very positive comments about the site, and most said that they learned about us through Twitter. It’s quite satisfying to find that an open media platform like Twitter has helped promote the free exchange of ideas on open-source tools and designs.

Three aspects of the responses were most interesting. First, all participants described their field as behavioral neuroscience. We were a bit surprised not to find folks from other fields, such as electronics or software development, and we acknowledge that we need to do more to bring in researchers focused on tools for human neuroscience research, computational models, and data analysis methods for understanding behavior. To this end, we are going to include posts on these topics at least once a month. Our next post will be about some open-source options for eye tracking in humans. Other upcoming posts will feature open-source software for drift diffusion models and algorithms for reinforcement learning.

Second, OpenBehavior posts are followed by people at various career stages. Roughly half of the participants identified as postdoctoral researchers or principal investigators. Twenty of the 72 participants indicated that they had used tools featured on the site that were not developed by their own labs, and 38 others indicated that they follow the site with plans to incorporate some of the devices and software we have profiled into their research programs. The experiences of these website users and Twitter followers indicate that we have had strong initial success in our overall mission to accelerate research through promotion of collaboration and sharing.

Third, and most satisfying to us, the site is used by researchers all around the world. Several survey participants suggested adding a forum to the website to encourage interactions between developers and users, and we are exploring that idea. We will post more details on this soon.

Our survey was the first step in assessing how open-source tools are impacting behavioral studies in neuroscience. To further assess the interests and experiences of our community, we will be running another survey next week to assess knowledge and use of software, microcontrollers, and options for 3D and PCB printing. Once again, we would greatly appreciate input from the community on these issues, and will reach out via Twitter.

By the way, OpenBehavior would not be possible without the efforts of founders Lex Kravitz, Mark Laubach, and MJ Preston, current curators Linda Amarante and Samantha White, and you, our readers.

Automated classification of self-grooming in mice

May 16, 2019

In the Journal of Neuroscience Methods, Bastijn van den Boom and colleagues have shared their ‘how-to’ instructions for implementing behavioral classification with JAABA, featuring Bonsai and motr!


In honor of our 100th post on OpenBehavior, we wanted to feature a project that exemplifies how multiple open-source tools can be combined to address a common theme in behavioral neuroscience: tracking and classifying complex behaviors! The protocol from van den Boom et al. implements JAABA, an open-source machine-learning-based behavior detection system; motr, open-source mouse trajectory tracking software; and Bonsai, an open-source system capable of streaming and recording video. Together, these tools are used to process videos of mice performing grooming behaviors in a variety of behavioral setups.

They then compare multiple tools for analyzing grooming behavior sequences in wild-type mice and in genetic knockout mice with a tendency to over-groom. The JAABA-trained classifier outperforms commercially available behavior analysis software and aligns more closely with manual scoring by expert observers. This offers a novel, cost-effective, and easy-to-use method for assessing grooming behavior in mice that is comparable to expert scoring, with the added advantage of being automatic. Instructions for training your own JAABA classifier can be found in their paper!

Read more in their publication here!


Stytra

May 03, 2019

Vilim Štih has shared Stytra, a new project from the Portugues lab that was recently published in PLOS Computational Biology (Štih, Petrucco et al., 2019):


“Stytra is a flexible open-source software package written in Python and designed to cover all the general requirements involved in larval zebrafish behavioral experiments. It provides timed stimulus presentation, interfacing with external devices and simultaneous real-time tracking of behavioral parameters such as position, orientation, tail and eye motion in both freely-swimming and head-restrained preparations. Stytra logs all recorded quantities, metadata, and code version in standardized formats to allow full provenance tracking, from data acquisition through analysis to publication. The package is modular and expandable for different experimental protocols and setups. Current releases can be found at https://github.com/portugueslab/stytra. We also provide complete documentation with examples for extending the package to new stimuli and hardware, as well as a schema and parts list for behavioral setups. We showcase Stytra by reproducing previously published behavioral protocols in both head-restrained and freely-swimming larvae. We also demonstrate the use of the software in the context of a calcium imaging experiment, where it interfaces with other acquisition devices. Our aims are to enable more laboratories to easily implement behavioral experiments, as well as to provide a platform for sharing stimulus protocols that permits easy reproduction of experiments and straightforward validation. Finally, we demonstrate how Stytra can serve as a platform to design behavioral experiments involving tracking or visual stimulation with other animals and provide an example integration with the DeepLabCut neural network-based tracking method.”
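For readers curious about what a Stytra experiment looks like in code, here is a minimal protocol in the spirit of the examples in the package's documentation. It is adapted from memory rather than copied from the docs, so the class names and arguments used here (Protocol, get_stim_sequence, Pause, FullFieldVisualStimulus, and Stytra(protocol=...)) should be treated as assumptions and checked against the current Stytra release.

```python
# Minimal Stytra-style protocol: a 4 s pause followed by a 1 s full-field flash.
# Adapted from memory of the package's documented examples -- verify names and
# arguments against the current Stytra release before use.
from stytra import Stytra, Protocol
from stytra.stimulation.stimuli import Pause, FullFieldVisualStimulus


class FlashProtocol(Protocol):
    name = "flash_protocol"

    def get_stim_sequence(self):
        # The experiment is defined simply as a list of stimulus objects.
        return [
            Pause(duration=4.0),
            FullFieldVisualStimulus(duration=1.0, color=(255, 255, 255)),
        ]


if __name__ == "__main__":
    # Launches the Stytra GUI with this protocol loaded.
    Stytra(protocol=FlashProtocol())
```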

Check out the enhanced version of the paper with full documentation at www.portugueslab.com/stytra, or the PDF at PLOS Computational Biology.

 

Phenopy

April 17, 2019

In a recent Nature Protocols article, Edoardo Balzani and colleagues from Valter Tucci’s lab have developed and shared Phenopy, a Python-based open-source analytical platform for behavioral phenotyping.


Behavioral phenotyping of mice using classic methods can be a long process and is susceptible to high variability, leading to inconsistent results. To reduce variance and speed up the process of behavioral analysis, Balzani et al. developed Phenopy, an open-source software platform for recording and analyzing behavioral data for phenotyping. The software allows components of a behavioral task to be recorded in combination with electrophysiology data. It is capable of performing online analysis as well as large-scale analysis of recorded data, all within a user-friendly interface. Information about the software is available in their publication in Nature Protocols.*

Check out the full article from Nature Protocols!


(*alternatively available on ResearchGate)