Category: Video Analysis

An inexpensive, scalable Picamera system for tracking rats in large spaces

August 15, 2018

In the Journal of Neurophysiology, Sachin S. Deshmukh and colleagues share their design for a Picamera system that allows tracking of animals in large behavioral arenas.

Studies of spatial navigation and its neural correlates have historically been limited by the reach of recording cables and by tracking systems designed for small behavioral arenas. With long-range wireless neural recording systems, researchers are now able to expand the size of their behavioral arenas, but they still need a way to accurately track animals in these larger spaces. The Picamera system is a low-cost, open-source, scalable multi-camera tracking system that can be used alongside wireless recording systems. It comprises eight overhead Raspberry Pi camera modules, capable of recording at a high frame rate over a large field of view, each recording video independently on its own Raspberry Pi microcomputer, with the video processed using the Picamera Python library. Compared with commercial tracking software used for the same purpose, the Picamera system reportedly performed better, with lower inter-frame interval jitter and higher temporal accuracy, which improved the ability to relate recorded neural activity to video. The Picamera system is an affordable, efficient solution for tracking animals in large spaces.
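To make the inter-frame interval (IFI) jitter metric concrete, here is a minimal Python sketch (not the authors' code) that measures how much recorded frame timestamps deviate from the nominal interval implied by the target frame rate; the function name and timestamps are illustrative.

```python
# Quantify inter-frame interval jitter from frame timestamps, as one might
# do when validating a camera system's temporal accuracy.

def ifi_jitter(timestamps_s, target_fps):
    """Mean absolute deviation (in ms) of inter-frame intervals from the
    nominal interval implied by target_fps."""
    nominal = 1.0 / target_fps
    intervals = [b - a for a, b in zip(timestamps_s, timestamps_s[1:])]
    return 1000.0 * sum(abs(i - nominal) for i in intervals) / len(intervals)

# Illustrative timestamps: frames captured at ~30 fps with small timing noise
ts = [0.0, 0.0333, 0.0667, 0.1001, 0.1333]
print(round(ifi_jitter(ts, 30), 3))
```

Lower values mean more regular frame timing, which is what matters when aligning video frames to electrophysiology timestamps.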

Read more here!

Or check out their GitHub!

Saxena, R., Barde, W., and Deshmukh, S.S. (2018). An inexpensive, scalable camera system for tracking rats in large spaces. Journal of Neurophysiology.

Collaboration between OpenBehavior and Hackaday.io

July 23, 2018

OpenBehavior has been covering open-source neuroscience projects for a few years, and we are always thrilled to see projects that are well documented and can be easily reproduced by others.  To further this goal, we have formed a collaboration with Hackaday.io, who have provided a home for OpenBehavior on their site.  Our page there currently has 36 projects listed, ranging from electrophysiology to robotics to behavior.  We are excited about this collaboration because it provides a straightforward way for people to document their projects with instructions, videos, images, data, etc.  Check it out, see what’s there, and if you want your project linked to the OpenBehavior page simply tag it as “OPENBEHAVIOR” or drop us a line at the Hackaday page.

Note: This collaboration between OpenBehavior and Hackaday.io is completely non-commercial, meaning that we don’t pay for anything, nor do we receive any payments from them.  It’s simply a way to further our goal of promoting open-source neuroscience tools and their goal of growing their science and engineering community.


Open source modules for tracking animal behavior and closed-loop stimulation based on Open Ephys and Bonsai

June 15, 2018

In a recent preprint on BioRxiv, Alessio Buccino and colleagues from the University of Oslo provide a step-by-step guide for setting up an open-source, low-cost, and adaptable system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Their setup integrates Bonsai and Open Ephys with several modules they have developed for robust real-time tracking and behavior-based closed-loop stimulation. As examples of what the system is capable of, they describe using it to record place cell activity in the hippocampus and medial entorhinal cortex, and present a case in which they used it for closed-loop optogenetic stimulation of grid cells in the entorhinal cortex. Extending the Open Ephys system to include animal tracking and behavior-based closed-loop stimulation makes a high-quality, low-cost experimental setup, with standardized data formats, available to more labs.
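The core of behavior-based closed-loop stimulation is a position test that runs on every tracked frame. The sketch below is an illustration of that idea only, not the authors' code: in their system the tracking runs in Bonsai and the trigger goes to Open Ephys, and the names `StimZone` and `should_stimulate` are invented here.

```python
# Minimal closed-loop trigger: stimulate whenever the tracked animal's
# (x, y) position enters a circular region of the arena.

from dataclasses import dataclass

@dataclass
class StimZone:
    x: float       # zone centre, arena coordinates
    y: float
    radius: float

def should_stimulate(pos, zone):
    """True when the tracked (x, y) position falls inside the zone."""
    dx, dy = pos[0] - zone.x, pos[1] - zone.y
    return dx * dx + dy * dy <= zone.radius ** 2

zone = StimZone(x=0.5, y=0.5, radius=0.1)
print(should_stimulate((0.55, 0.48), zone))   # position inside the zone
print(should_stimulate((0.9, 0.9), zone))     # position outside the zone
```

In a real experiment this test would run per frame, with the boolean output gating an optogenetic stimulation line.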

Read more on BioRxiv, or on GitHub!

Buccino A, Lepperød M, Dragly S, Häfliger P, Fyhn M, Hafting T (2018). Open Source Modules for Tracking Animal Behavior and Closed-loop Stimulation Based on Open Ephys and Bonsai. BioRxiv.

Head-Fixed Setup for Combined Behavior, Electrophysiology, and Optogenetics

June 12, 2018

In a recent publication in Frontiers in Systems Neuroscience, Solari and colleagues of the Hungarian Academy of Sciences and Semmelweis University have shared a behavioral setup for temporally controlled rodent behavior. The arrangement allows training of head-fixed animals with calibrated sound stimuli and precisely timed fluid and air-puff presentations as reinforcers. It combines microcontroller-based behavior control with a sound delivery system for acoustic stimuli, fast solenoid valves for reinforcement delivery, and a custom-built sound-attenuated chamber, and is shown to be suitable for combined behavior, electrophysiology, and optogenetics experiments. The system is built entirely on open-source hardware and software, using Bonsai, Bpod, and Open Ephys.
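The kind of trial logic such a microcontroller-based system runs can be sketched as a simple timed event sequence: play a sound cue, wait a delay, then open a solenoid valve for a fixed duration to deliver fluid reward. This is a hedged illustration only; the published setup implements its state machines in Bpod, and the durations and event names below are invented for the example.

```python
# Simulate one trial's event timeline: cue -> delay -> valve pulse.

def run_trial(cue_ms=500, delay_ms=1000, valve_ms=40):
    """Return the trial's event timeline as (time_ms, event) pairs."""
    t = 0
    events = [(t, "cue_on")]
    t += cue_ms
    events.append((t, "cue_off"))
    t += delay_ms
    events.append((t, "valve_open"))
    t += valve_ms
    events.append((t, "valve_close"))
    return events

for time_ms, event in run_trial():
    print(time_ms, event)
```

On a microcontroller the same sequence would be driven by hardware timers, which is what gives the setup its millisecond-level timing precision.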

Read more here!


Solari N, Sviatkó K, Laszlovszky T, Hegedüs P and Hangya B (2018). Open Source Tools for Temporally Controlled Rodent Behavior Suitable for Electrophysiology and Optogenetic Manipulations. Front. Syst. Neurosci. 12:18. doi: 10.3389/fnsys.2018.00018

ToxTrac: A fast and robust software for tracking organisms

June 8, 2018

OpenBehavior has shared a variety of popular open-source tracking software, and there’s another to add to the list: ToxTrac!

Alvaro Rodriguez and colleagues from Umeå University in Umeå, Sweden, have developed ToxTrac, an open-source Windows program optimized for high-speed tracking of animals. It uses an advanced tracking algorithm that requires no specific knowledge of the geometry of the tracked bodies, so it can be used with a variety of species. ToxTrac can also track multiple bodies in multiple arenas simultaneously while maintaining individual identification. The software is fast, operating at over 25 frames per second, and robust against false positives. ToxTrac generates useful statistics and heat maps in real scale, which can be exported in image, text, and Excel formats to provide information about locomotor activity in rodents, insects, fish, etc.
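An occupancy heat map "in real scale" of the kind ToxTrac exports boils down to binning tracked positions (in real-world units) into a spatial grid and counting samples per cell. The sketch below illustrates that computation; it is not ToxTrac's code, and the arena dimensions and bin size are illustrative.

```python
# Bin tracked (x, y) positions (e.g. in cm) into a grid of occupancy counts.

def occupancy_map(positions_cm, arena_w_cm, arena_h_cm, bin_cm):
    """Return a row-major grid of sample counts per spatial bin."""
    nx = int(arena_w_cm // bin_cm)
    ny = int(arena_h_cm // bin_cm)
    grid = [[0] * nx for _ in range(ny)]
    for x, y in positions_cm:
        i = min(int(y // bin_cm), ny - 1)   # clamp edge positions
        j = min(int(x // bin_cm), nx - 1)
        grid[i][j] += 1
    return grid

# Illustrative track in a 20 x 10 cm arena with 10 cm bins
track = [(5.0, 5.0), (5.5, 5.2), (14.0, 5.0), (5.1, 4.9)]
print(occupancy_map(track, arena_w_cm=20, arena_h_cm=10, bin_cm=10))
```

Dividing each count by the frame rate converts the map from samples to seconds of occupancy.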

Learn more about ToxTrac here:

Or download the ToxTrac software here:

Rodriguez A, Zhang H, Klaminder J, Brodin T, Andersson PL, Andersson M. ToxTrac: A fast and robust software for tracking organisms. Methods Ecol Evol. 2018;9:460–464.

FaceMap: Unsupervised analysis of rodent behaviors

May 15, 2018

We developed a toolbox for videographic processing of head-fixed rodent behaviors. It extracts the principal components of the animal’s behavior, either from a single movie or across movies recorded simultaneously. Several regions of interest in the movie (such as the whisker pad or the nose) can be processed simultaneously. We found that the behavioral components from the full face of the mouse predicted up to half of the explainable neural activity across the brain.

In addition to extracting movement components, the toolbox can compute the pupil area of the rodent using a center-of-mass estimation. In experiments in which the mouse sits on a surface with texture, the software can also estimate the running speed of the mouse. The software is freely available online.
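A center-of-mass pupil estimate of the sort described above can be sketched in a few lines: threshold dark pixels in a grayscale eye-region image, take the pixel count as area and the mean pixel coordinate as the pupil centre. This is an illustration assuming NumPy, not the toolbox's actual code, and the threshold value is arbitrary.

```python
import numpy as np

def pupil_com(frame, threshold):
    """frame: 2-D grayscale array; pixels darker than threshold = pupil.
    Returns (area_in_pixels, (x_centre, y_centre) or None)."""
    mask = frame < threshold
    area = int(mask.sum())
    if area == 0:
        return area, None
    ys, xs = np.nonzero(mask)
    return area, (float(xs.mean()), float(ys.mean()))

# Toy 5x5 "eye image" with a 2x2 dark pupil
frame = np.full((5, 5), 255)
frame[1:3, 2:4] = 10
area, centre = pupil_com(frame, threshold=50)
print(area, centre)
```

Tracking the area frame by frame yields a pupil-dilation trace, and the centre gives gaze-related eye position.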

LocoWhisk: Quantifying rodent exploration and locomotion behaviours

March 8, 2018

Robyn A. Grant, from Manchester Metropolitan University, has shared the following on Twitter regarding the development of the LocoWhisk arena:

“Come help me develop my new arena. Happy to hear from anyone looking to test it or help me develop it further.”

The LocoWhisk system is a new, portable behavioural set-up that incorporates both gait analysis (using a pedobarograph) and whisker movements (using high-speed video camera and infrared light source). The system has so far been successfully piloted on many rodent models, and would benefit from further validation and commercialisation opportunities.

Learn more here:

Article in Nature on monitoring behavior in rodents

An interesting summary of recent methods for monitoring behavior in rodents was published this week in Nature. The article mentions Lex Kravitz and his lab’s efforts on the Feeding Experimentation Device (FED), as well as OpenBehavior. Check it out:


ZebraTrack: automated tracking of adult zebrafish behaviour

December 18, 2017

ZebraTrack is a cost-effective imaging setup for distraction-free behavioral acquisition, with automated tracking in open-source ImageJ and a workflow for extracting behavioral endpoints in zebrafish. The ImageJ workflow gives users control at key steps while keeping the tracking itself automated, and requires no external plugins.
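The thresholding-and-centroid step that underlies many ImageJ-based trackers like ZebraTrack can be illustrated in a short sketch: subtract a static background image, find the pixels that changed, and take their centroid as the animal's position. ZebraTrack itself runs as an ImageJ workflow; the Python/NumPy code below is an illustration of the principle, not its implementation.

```python
import numpy as np

def track_frame(frame, background, diff_threshold):
    """Return the (x, y) centroid of pixels that differ from the static
    background by more than diff_threshold, or None if nothing moved."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    ys, xs = np.nonzero(diff > diff_threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Toy example: a uniform background and one frame where a single pixel
# changed (standing in for the fish)
bg = np.zeros((4, 4), dtype=np.uint8)
frame = bg.copy()
frame[2, 1] = 200
print(track_frame(frame, bg, diff_threshold=50))
```

Running this per frame yields the position trace from which behavioral endpoints such as distance travelled or thigmotaxis are computed.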

Nema, S., Hasan, W., Bhargava, A., & Bhargava, Y. (2016). A novel method for automated tracking and quantification of adult zebrafish behaviour during anxiety. Journal of Neuroscience Methods, 271, 65-75. doi:10.1016/j.jneumeth.2016.07.004