Category: Data Analysis

Autopilot

DECEMBER 12, 2019

Jonny Saunders from Michael Wehr’s lab at the University of Oregon recently posted a preprint documenting their project Autopilot, a Python framework for running behavioral experiments:


Autopilot is a Python framework for behavioral experiments built around Raspberry Pi single-board computers. Autopilot incorporates all aspects of an experiment, including the hardware, stimuli, behavioral task paradigm, data management, data visualization, and a user interface. The authors propose that Autopilot is the fastest, least expensive, and most flexible behavioral system currently available.

The benefit of using Autopilot is its experimental flexibility, which lets researchers optimize it for their specific experimental needs. Additionally, this project exemplifies how useful a Raspberry Pi can be for running experiments and recording data. The preprint discusses many benefits of Raspberry Pis, including their speed, precision, and proper data logging, and the fact that they cost only $35 (!!). Ultimately, the authors developed Autopilot to encourage users to write reusable, portable experiments that can be contributed to a public central library, promoting replication and reproducibility.
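To give a flavor of why a $35 Raspberry Pi is so handy for this kind of work, here is a minimal Python sketch of driving a reward solenoid and logging event timestamps with the standard RPi.GPIO library. This is not Autopilot’s API (see the project documentation for that); the pin number, pulse duration, and file name are illustrative assumptions.

```python
# Minimal illustration of Raspberry Pi I/O for a behavior rig.
# NOT Autopilot's API; just the kind of low-level control a Pi makes cheap.
# Pin number, pulse duration, and file path are hypothetical.
import csv
import time

import RPi.GPIO as GPIO  # standard Raspberry Pi GPIO library

REWARD_PIN = 17          # BCM pin wired to a solenoid driver (hypothetical)

GPIO.setmode(GPIO.BCM)
GPIO.setup(REWARD_PIN, GPIO.OUT, initial=GPIO.LOW)

def deliver_reward(duration_s=0.05):
    """Open the solenoid briefly and return the event timestamp."""
    t = time.time()
    GPIO.output(REWARD_PIN, GPIO.HIGH)
    time.sleep(duration_s)
    GPIO.output(REWARD_PIN, GPIO.LOW)
    return t

try:
    with open("reward_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["event", "timestamp"])
        for _ in range(10):                 # ten demo rewards
            writer.writerow(["reward", deliver_reward()])
            time.sleep(1.0)
finally:
    GPIO.cleanup()
```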

 

For more information, check out their presentation or the Autopilot website here.

Additionally, documentation is here, along with a GitHub repo, and a link to their preprint is here.


Oat: online animal tracker

December 5, 2019

Jonathan Newman of the Wilson Lab at Massachusetts Institute of Technology has developed and shared a set of programs for processing video.


The only thing you’ll enjoy more than an oat milk latte from your favorite coffeeshop is the Oat collection of video processing components designed for use with Linux! Developed by Jonathan Newman of Open Ephys, this set of programs is useful for processing video, extracting object position information, and streaming data. While it was designed for real-time animal position tracking, it can be used in any application that requires real-time object tracking. The individual Oat components each share a standard interface and can be chained together to create complex dataflow networks for capturing, processing, and recording video streams.
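Oat’s components are standalone programs that pass frames and positions between each other, so the following Python sketch is only a conceptual analogy of that chained, standard-interface design rather than Oat’s actual interface; all of the function names here are made up for illustration.

```python
# Conceptual analogy of a chained dataflow: each stage exposes the same
# frame-in / result-out interface, so stages can be composed freely.
# This is NOT Oat's actual interface (Oat components are separate programs).
import numpy as np

def capture(n_frames=5, shape=(480, 640)):
    """Stand-in frame source (would be a camera in practice)."""
    for _ in range(n_frames):
        yield (np.random.rand(*shape) * 255).astype(np.uint8)

def threshold(frames, level=200):
    """Detection stage: pass along a binary mask per frame."""
    for frame in frames:
        yield frame > level

def centroid(masks):
    """Position stage: reduce each mask to an (x, y) estimate."""
    for mask in masks:
        ys, xs = np.nonzero(mask)
        yield (xs.mean(), ys.mean()) if xs.size else (None, None)

# Chain the stages, much like piping components together on the command line.
for position in centroid(threshold(capture())):
    print(position)
```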

Read more about Oat on the Open Ephys website, or check out more on Github!



Touchscreen Cognition and MouseBytes

NOVEMBER 21, 2019

Tim Bussey and Lisa Saksida from Western University and the BrainsCAN group developed touchscreen-based chambers that can be used to measure rodent behavior. While the touchscreens themselves are not open-source devices, we appreciate the open-science push to create a user community, run workshops and tutorials, and share data. Most notably, their sister project, MouseBytes, is an open-access database for all cognitive data collected from touchscreen-related tasks:


Touchscreen History:

In an effort to develop a cognitive testing method for rodents that closely parallels touchscreen testing in humans, Bussey et al. (1994, 1997a,b) developed a touchscreen apparatus for rats, which was subsequently adapted for mice as well. In short, the touchscreens allow computer-generated graphics to be presented to a rodent, and the rodent makes choices in a task based on which stimuli appear. The group published a “tutorial” paper detailing the behavior and proper training methods needed to get rats to perform optimally with these devices (Bussey et al., 2008). Additionally, in 2013, the group published three separate Nature Protocols articles detailing how to use the touchscreens in tasks assessing executive function, learning and memory, and working memory and pattern separation in rodents (Horner et al., 2013; Mar et al., 2013; Oomen et al., 2013).

Most recently, the group has developed https://touchscreencognition.org/, a hub for user forums, discussion, training information, and more. The group also runs live training sessions for anyone interested in using touchscreens in their tasks, and their Twitter account, @TouchScreenCog, highlights recent trainings. Because the tests are automated and standardized, data can be compared across labs and tasks.


MouseBytes:

Additionally, MouseBytes is an open-access database where scientists can upload their own data or analyze data already collected by other groups. Not only does this reduce redundancy of experiments, it also allows for transparency and reproducibility in the community. The site performs data comparison and interactive data visualization for any data uploaded to it, and it provides guidelines and video tutorials as well.


Nature Protocols Tutorials:

Horner, A. E., Heath, C. J., Hvoslef-Eide, M., Kent, B. A., Kim, C. H., Nilsson, S. R., … & Bussey, T. J. (2013). The touchscreen operant platform for testing learning and memory in rats and mice. Nature Protocols, 8(10), 1961.

Mar, A. C., Horner, A. E., Nilsson, S. R., Alsiö, J., Kent, B. A., Kim, C. H., … & Bussey, T. J. (2013). The touchscreen operant platform for assessing executive function in rats and mice. Nature Protocols, 8(10), 1985.

Oomen, C. A., Hvoslef-Eide, M., Heath, C. J., Mar, A. C., Horner, A. E., Bussey, T. J., & Saksida, L. M. (2013). The touchscreen operant platform for testing working memory and pattern separation in rats and mice. Nature Protocols, 8(10), 2006.

Original Touchscreen Articles:

Bussey, T. J., Muir, J. L., & Robbins, T. W. (1994). A novel automated touchscreen procedure for assessing learning in the rat using computer graphic stimuli. Neuroscience Research Communications, 15(2), 103-110.

Bussey, T. J., Padain, T. L., Skillings, E. A., Winters, B. D., Morton, A. J., & Saksida, L. M. (2008). The touchscreen cognitive testing method for rodents: how to get the best out of your rat. Learning & Memory, 15(7), 516-523.

 

You can buy the touchscreens here.

 

Editor’s Note: We understand that Nature Protocols is not an open-access journal and that the touchscreens must be purchased from a commercial company and are not technically open-source. However, we appreciate the group’s ongoing effort to streamline data across labs, to put on training workshops, and to provide an open-access data repository for this type of data.

Updates on LocoWhisk and ART

OCTOBER 3, 2019

Dr Robyn Grant from Manchester Metropolitan University in Manchester, UK, has shared her group’s most recent project, LocoWhisk, a hardware and software solution for measuring rodent exploratory, sensory and motor behaviours:


In describing the project, Dr Grant writes, “Previous studies from our lab have shown that analysing whisker movements and locomotion allows us to quantify the behavioural consequences of sensory, motor and cognitive deficits in rodents. Independent whisker and feet trackers existed, but there was no fully-automated, open-source software and hardware solution that could measure both whisker movements and gait.

We developed the LocoWhisk arena and new accompanying software that allow the automatic detection and measurement of both whisker and gait information from high-speed video footage. The arena can easily be made from low-cost materials; it is portable and incorporates both gait analysis (using a pedobarograph) and whisker movement tracking (using a high-speed video camera and an infrared light source).

The software, ARTv2, is freely available and open source. ARTv2 is also fully automated and was developed from our previous ART software (Automated Rodent Tracker).

ARTv2 contains new whisker and foot detector algorithms. On high-speed video footage of freely moving small mammals (including rat, mouse and opossum), we have found that ARTv2 is comparable in accuracy to, and in some cases significantly better than, readily available software and manual trackers.

The LocoWhisk system enables the collection of quantitative data from whisker movements and locomotion in freely behaving rodents. The software automatically records both whisker and gait information and provides added statistical tools to analyse the data. We hope the LocoWhisk system and software will serve as a solid foundation from which to support future research in whisker and gait analysis.”

For more details on the ARTv2 software, check out the github page here.

Check out the paper that describes LocoWhisk and ARTv2, which has recently been published in the Journal of Neuroscience Methods.

LocoWhisk was initially shared and developed through the NC3Rs CRACK IT website here.


SpikeGadgets

AUGUST 22, 2019

We’d like to highlight groups and companies that take an open-source approach to their software and/or hardware for behavioral neuroscience. One of these groups is SpikeGadgets, a company co-founded by Mattias Karlsson and Magnus Karlsson.


SpikeGadgets is a group of electrophysiologists and engineers working to develop neuroscience hardware and software tools. Their open-source software, Trodes, is a cross-platform suite for neuroscience data acquisition and experimental control, made up of modules that communicate with a centralized GUI to visualize and save electrophysiological data. Trodes includes a camera module and a StateScript module; StateScript is a state-based scripting language that can be used to program behavioral tasks using lights, levers, beam breaks, lasers, stimulation sources, audio, solenoids, and more. The camera module can acquire video synchronized to neural recordings and can track the animal’s position in real time or during playback after the experiment; it works with USB webcams or GigE cameras.
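As a rough illustration of what “state-based” task programming means, here is a small Python sketch of a two-state nose-poke task. It is plain Python, not StateScript syntax, and the inputs and outputs are simulated placeholders.

```python
# Conceptual sketch of a state-based behavioral task (the kind of logic a
# StateScript-style program expresses). Plain Python, NOT StateScript syntax;
# the poke input and outputs are simulated stand-ins.
import random
import time

def poke_detected():
    """Placeholder for a beam-break or nose-poke input."""
    return random.random() < 0.1

state = "wait_for_poke"
deadline = 0.0

for _ in range(200):                      # short demo session
    if state == "wait_for_poke":
        if poke_detected():
            print("cue light on")         # would drive a digital output
            deadline = time.time() + 2.0  # 2 s response window
            state = "response_window"
    elif state == "response_window":
        if poke_detected():
            print("reward delivered")     # would trigger a solenoid
            state = "wait_for_poke"
        elif time.time() > deadline:
            print("timeout")
            state = "wait_for_poke"
    time.sleep(0.05)
```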

Paired with the Trodes software and StateScript language is SpikeGadgets hardware, which can be purchased on their website. The hardware handles data acquisition (the Main Control Unit, used for electrophysiology) and behavioral control (the Environmental Control Unit). SpikeGadgets also provides MATLAB and Python toolboxes on their site that can be used to analyze both behavioral and electrophysiological data. Trodes runs on Windows, Linux, or Mac, and there are step-by-step instructions for installing and using Trodes on the group’s Bitbucket page.

SpikeGadgets’ mission is “to develop the most advanced neuroscience tools on the market, while preserving ease of use and science-driven customization.”

 


For more information on SpikeGadgets or to download or purchase their software or hardware, check out their website here.

There is additional documentation on their BitBucket Wiki, with a user manual, instructions for installation, and FAQ.

Check out their entire list of collaborators, contributors, and developers here.

optoPAD

June 27, 2019

Carlos Ribeiro’s lab at Champalimaud recently published their new project called optoPAD in eLife:


Analyses of behavior and of neural activity both need to be precisely timed for the two to be correlated or compared. Behavior itself can be analyzed with many methods (as seen in the many projects featured on this site!). The Ribeiro lab previously published flyPAD (Itskov et al., 2014), a system for automated analysis of feeding behavior in Drosophila with high temporal precision. To manipulate specific feeding behaviors, however, the group wanted to go one step further and manipulate neural activity during feeding, which required a method precise enough to be related directly to behavior.

In their new manuscript, Moreira et al. describe the design and implementation of a high-throughput system for closed-loop optogenetic manipulation of neurons in Drosophila during feeding behavior. Named optoPAD, the system allows targeted perturbation of specific groups of neurons. The authors use optoPAD to induce appetitive and aversive effects on feeding by activating or inhibiting gustatory neurons in closed loop. OptoPAD combines the previous flyPAD system with additional hardware for driving LEDs for optogenetic perturbation, and it is used together with Bonsai, an open-source framework for behavioral analysis.

The system first uses flyPAD to measure the fly’s interaction with the food presented in an experiment. Bonsai then detects when the fly interacts with a food electrode and sends a signal to a microcontroller, which turns on an LED for optogenetic perturbation of neurons in the fly. The authors additionally highlight the flexibility and expandability of the optoPAD system, describing how flyPAD, once published and later implemented in an optogenetics framework by their group, was successfully adapted by another group, a great example of the benefit of openly sharing projects.
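As a rough sketch of the closed-loop idea (detect a feeding event, then trigger light for optogenetic stimulation), the Python snippet below polls a detection function and commands a microcontroller over a serial port. This is not the authors’ implementation, which uses flyPAD hardware together with Bonsai; the serial port, command strings, and detection function here are hypothetical.

```python
# Minimal sketch of the closed-loop idea behind optoPAD: when a food-contact
# event is detected, command a microcontroller to switch an LED on.
# NOT the authors' implementation (they use flyPAD hardware with Bonsai);
# the serial port, command protocol, and detection function are hypothetical.
import time

import serial  # pyserial

board = serial.Serial("/dev/ttyACM0", 115200, timeout=0.01)  # hypothetical port

def food_contact_detected():
    """Placeholder for the capacitance-based touch detection done by flyPAD."""
    return False  # replace with a real signal source

try:
    while True:
        if food_contact_detected():
            board.write(b"LED_ON\n")    # hypothetical command
            time.sleep(0.5)             # stimulation duration (arbitrary)
            board.write(b"LED_OFF\n")
        time.sleep(0.001)               # ~1 kHz polling loop
except KeyboardInterrupt:
    board.close()
```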

 

Details on the hardware and software can be found on the Ribeiro lab GitHub. More details on flyPAD, the original project, can be found on its GitHub as well.

Information on flyPAD can also be found on the flyPAD website and in the flyPAD paper.


Moreira, J. M., Itskov, P. M., Goldschmidt, D., Steck, K., Walker, S. J., & Ribeiro, C. (2019). optoPAD: a closed-loop optogenetics system to study the circuit basis of feeding behaviors. eLife, doi: 10.7554/eLife.43924

ezTrack

June 13, 2019

Zach Pennington from Denise Cai’s lab at Mt. Sinai recently posted a preprint describing their latest open-source project called ezTrack:


ezTrack is an open-source, platform-independent set of behavior analysis pipelines built on interactive Python (iPython/Jupyter Notebook) that researchers with no prior programming experience can use. It is a sigh of relief for researchers with little to no computer programming experience: behavioral tracking analysis shouldn’t be limited to those with extensive programming knowledge, and ezTrack is a nice alternative to currently available software that may require more of it. The manuscript and Jupyter notebooks are written in the style of a tutorial and are meant to provide straightforward instructions for implementing ezTrack. ezTrack differs from other recent video analysis toolboxes in that it does not use deep learning algorithms and therefore does not require labeled training sets or transfer learning.

ezTrack can be used to analyze videos of a single animal in a variety of settings, and the authors provide examples of positional analysis across several tasks (place preference, water maze, open field, elevated plus maze, light-dark box, etc.), as well as analysis of freezing behavior. ezTrack provides frame-by-frame output in .csv files, and users can crop video frames to avoid problems caused by cables from optogenetic or electrophysiology experiments. ezTrack accepts many different video formats, such as mpg1, wmv, avi, and more.
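For readers curious what this style of (non-deep-learning) tracking involves, here is a simplified Python sketch of background-subtraction centroid tracking with frame-by-frame .csv output. It is not ezTrack’s own code (ezTrack is run from its Jupyter notebooks), and the video file name and threshold value are arbitrary assumptions.

```python
# Simplified background-subtraction tracker in the spirit of ezTrack's
# location tracking; NOT ezTrack's own code. The video file name and
# threshold are hypothetical. Loads the whole clip into memory, so it is
# only suitable for short videos.
import cv2
import numpy as np
import pandas as pd

cap = cv2.VideoCapture("session.avi")          # hypothetical video file
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()

# Median across frames approximates the empty arena (reference image).
reference = np.median(np.stack(frames), axis=0).astype(np.uint8)

rows = []
for i, gray in enumerate(frames):
    diff = cv2.absdiff(gray, reference)        # animal stands out from background
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    ys, xs = np.nonzero(mask)
    if xs.size:
        rows.append({"frame": i, "x": xs.mean(), "y": ys.mean()})

pd.DataFrame(rows).to_csv("position.csv", index=False)  # frame-by-frame output
```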

Aside from being open source, ezTrack has several major advantages. Notably, it is accessible to researchers with little to no programming background. The user does not need to adjust many parameters, the data can be processed into interactive visualizations, and results are easily exported as .csv files. ezTrack is operating-system and hardware independent and can be used across multiple platforms. Using iPython/Jupyter Notebook also allows researchers to easily replicate their analyses.

Check out their GitHub with more details on how to use ezTrack: https://github.com/denisecailab/ezTrack


Pennington, Z. T., Dong, Z., Bowler, R., Feng, Y., Vetere, L. M., Shuman, T., & Cai, D. J. (2019). ezTrack: An open-source video analysis pipeline for the investigation of animal behavior. BioRxiv, 592592. 

Automated classification of self-grooming in mice

May 16, 2019

In the Journal of Neuroscience Methods, Bastijn van den Boom and colleagues have shared their ‘how-to’ instructions for implementing behavioral classification with JAABA, featuring Bonsai and motr!


In honor of our 100th post on OpenBehavior, we wanted to feature a project that exemplifies how multiple open-source projects can be combined to address a common theme in behavioral neuroscience: tracking and classifying complex behaviors! The protocol from van den Boom et al. implements JAABA, an open-source, machine-learning-based behavior detection system; motr, open-source mouse trajectory tracking software; and Bonsai, an open-source system for streaming and recording video. Together, these tools are used to process videos of mice performing grooming behaviors in a variety of behavioral setups.

They then compare multiple tools for analyzing grooming behavior sequences in both wild-type mice and genetic knockout mice with a tendency to over-groom. The JAABA-trained classifier outperforms commercially available behavior analysis software and more closely matches manual scoring of behavior by expert observers. This offers a novel, cost-effective, and easy-to-use method for assessing grooming behavior in mice that is comparable to an expert observer, with the added advantage of being automatic. Instructions for training your own JAABA classifier can be found in their paper!
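For a sense of what training a per-frame behavior classifier involves, here is a minimal Python sketch using scikit-learn. JAABA itself is a MATLAB tool that uses boosted classifiers on trajectory features, so this is only a conceptual stand-in; the feature matrix and labels below are random placeholders for per-frame measurements and expert annotations.

```python
# Conceptual illustration of supervised, per-frame behavior classification
# (NOT JAABA's implementation). Features and labels are random stand-ins for
# per-frame trajectory measurements and expert "grooming / not grooming" labels.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 12))      # per-frame features (speed, pose, etc.)
y = rng.integers(0, 2, size=5000)    # expert labels: grooming (1) or not (0)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```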

Read more in their publication here!


AutonoMouse

May 10, 2019

In a recently published article (Erskine et al., 2019), the Schaefer lab at the Francis Crick Institute introduced their new open-source project called AutonoMouse.


AutonoMouse is a fully automated, high-throughput system for self-initiated conditioning and behavior tracking in mice. Many aspects of behavior can be analyzed by having rodents perform operant conditioning tasks. However, in operant experiments many variables can potentially alter or confound results (experimenter presence, picking up and handling animals, altered physiological states through water restriction, and the fact that rodents often need to be individually housed to keep track of their individual performance). This was the main motivation for the authors to completely automate operant conditioning.

AutonoMouse can track large numbers (over 25) of socially housed mice via RFID chips implanted in the mice. With the RFID trackers and other analyses, the behavior of mice can be tracked as they train, and are subsequently tested (or self-initiate testing), in an odor discrimination task over months, with thousands of trials performed every day. The novelty in this study is the fully automated nature of the entire system (training, experiments, water delivery, and weighing of the animals are all automated) and the ability to keep mice socially housed 24/7, all while still training them and tracking their performance in an olfactory operant conditioning task. The modular setup makes it possible for AutonoMouse to be used to study other sensory modalities, such as vision, or decision-making tasks. The authors provide a components list, layouts, construction drawings, and step-by-step instructions for the construction and use of AutonoMouse in their publication and on the project’s GitHub.
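As a rough sketch of the RFID-gating idea, the Python snippet below reads tag IDs from a serial-connected RFID reader and attributes trials to whichever animal is present. This is not the AutonoMouse control software (that lives in the RoboDoig repositories linked below); the serial port, tag format, and trial function are hypothetical.

```python
# Sketch of RFID-gated, self-initiated trials in the spirit of AutonoMouse:
# read a tag at the task port and attribute the upcoming trial to that animal.
# NOT the AutonoMouse control software; serial port, tag format, and the
# trial function are hypothetical.
import csv
import time

import serial  # pyserial

reader = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)   # hypothetical RFID reader

def run_trial(animal_id):
    """Placeholder for one odor-discrimination trial."""
    return {"animal": animal_id, "time": time.time(), "correct": True}

with open("trials.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["animal", "time", "correct"])
    writer.writeheader()
    while True:
        tag = reader.readline().strip().decode(errors="ignore")
        if tag:                                  # a mouse is at the port
            writer.writerow(run_trial(tag))
            f.flush()
```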


For more details, check out this YouTube clip interview with Andreas Schaefer, PI on the project.

 

The GitHub repo for the project’s control software is located here: https://github.com/RoboDoig/autonomouse-control, the design and hardware instructions are here: https://github.com/RoboDoig/autonomouse-design, and the schedule generation program is located here: https://github.com/RoboDoig/schedule-generator


Phenopy

April 17, 2019

In a recent Nature Protocols article, Edoardo Balzani and colleagues from Valter Tucci’s lab have developed and shared Phenopy, a Python-based open-source analytical platform for behavioral phenotyping.


Behavioral phenotyping of mice using classic methods can be a long process and is susceptible to high variability, leading to inconsistent results. To reduce variance and speed up the process of behavioral analysis, Balzani et al. developed Phenopy, an open-source software platform for recording and analyzing behavioral data for phenotyping. The software allows components of a behavioral task to be recorded in combination with electrophysiology data. It is capable of performing online analysis as well as large-scale analysis of recorded data, all within a user-friendly interface. Information about the software is available in their publication in Nature Protocols.*

Check out the full article from Nature Protocols!


(*alternatively available on ResearchGate)