Category: Video Analysis

KineMouse Wheel

October 10, 2018

On Hackaday, Richard Warren of the Sawtell Lab at Columbia University has shared his design for the KineMouse Wheel, a lightweight running wheel for head-fixed locomotion that allows for 3D positioning of mice with a single camera.


Locomotor behavior is a common behavioral readout in neuroscience research, and running wheels are a great tool for assessing motor function in head-fixed mice. KineMouse Wheel takes this tool a step further. Constructed from lightweight, transparent polycarbonate with an angled mirror mounted inside, this innovative device allows a single camera to capture two views of locomotion simultaneously. When combined with DeepLabCut, a deep-learning-based tracking package, the locomotion of head-fixed mice can be captured in three dimensions, allowing for a more complete assessment of motor behavior. The wheel can also be further customized to fit the needs of a lab by using different materials for the build. More details about the KineMouse Wheel, including a full list of parts and build instructions, are available at hackaday.io.
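To give a sense of the two-view idea, below is a minimal sketch of how the direct and mirrored views from a single camera might be combined into 3D coordinates using OpenCV triangulation. This is not the KineMouse/DeepLabCut pipeline itself; the projection matrices, file names, and keypoint arrays are hypothetical placeholders that would come from a separate calibration and tracking step.

```python
# Sketch: triangulate 3D paw positions from the direct and mirror views
# captured by one camera. Assumes two 3x4 projection matrices obtained from
# a prior calibration step (hypothetical .npy files).
import numpy as np
import cv2

P_direct = np.load("calib_direct_view.npy")   # 3x4 projection matrix, direct view
P_mirror = np.load("calib_mirror_view.npy")   # 3x4 projection matrix, mirrored view

# 2D keypoints (e.g., exported from a tracking tool) for one body part across
# N frames, shaped (2, N) as required by cv2.triangulatePoints.
pts_direct = np.load("paw_direct_xy.npy").T
pts_mirror = np.load("paw_mirror_xy.npy").T

points_4d = cv2.triangulatePoints(P_direct, P_mirror, pts_direct, pts_mirror)
points_3d = (points_4d[:3] / points_4d[3]).T  # convert from homogeneous coordinates

print(points_3d.shape)  # (N, 3): x, y, z per frame
```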

Read more about KineMouse Wheel on Hackaday, and check out other awesome open-source tools on the OpenBehavior Hackaday list!



OpenBehavior Feedback Survey

We are looking for your feedback to understand how we can better serve the community! We’re also interested to know if/how you’ve implemented some of the open-source tools from our site in your own research.

We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.

https://american.co1.qualtrics.com/jfe/form/SV_0BqSEKvXWtMagqp

Thanks!

EthoWatcher: a tool for behavioral and video-tracking analysis in laboratory animals

September 26, 2018

In Computers in Biology and Medicine, Carlos Fernando Crispim Junior and colleagues share their software EthoWatcher: a computational tool that supports video-tracking, detailed ethography, and extraction of kinematic variables from video files of laboratory animals.


The freely available EthoWatcher software has two modules: a tracking module and an ethography module. The tracking module permits controlled separation of the target from its background and extraction of image attributes used to calculate distance traveled, orientation, length, area, and a path graph of the target. The ethography module allows catalog-based behaviors to be recorded frame-by-frame, either from video files or directly from the environment. The output reports the latency, frequency, and duration of each behavior, as well as the sequence of events in a time-segmented format set by the user. EthoWatcher was validated by conducting tests on the detection of known behavioral effects of drugs and on kinematic measurements.
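As a rough illustration of the kinematic variables such a tracking module reports, here is a minimal Python sketch that derives distance traveled, speed, and heading from a tracked centroid trajectory. This is not EthoWatcher's own code; the file name, calibration factor, and frame rate are placeholders.

```python
# Sketch: kinematic variables of the kind EthoWatcher reports, computed from
# a tracked centroid trajectory (x, y per frame). Placeholder file name,
# pixel-to-cm scale, and frame rate.
import numpy as np

xy = np.loadtxt("centroid_track.csv", delimiter=",")  # shape (n_frames, 2), in pixels
px_to_cm = 0.05          # hypothetical calibration factor
fps = 30.0               # hypothetical frame rate

steps = np.diff(xy, axis=0) * px_to_cm
step_lengths = np.linalg.norm(steps, axis=1)

distance_traveled = step_lengths.sum()                            # total path length (cm)
speed = step_lengths * fps                                        # instantaneous speed (cm/s)
orientation = np.degrees(np.arctan2(steps[:, 1], steps[:, 0]))    # heading per step (deg)

print(f"distance: {distance_traveled:.1f} cm, mean speed: {speed.mean():.1f} cm/s")
```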

Read more in their paper or download the software from the EthoWatcher webpage!


Crispim Junior, C. F., Pederiva, C. N., Bose, R. C., Garcia, V. A., Lino-de-Oliveira, C., & Marino-Neto, J. (2012). ETHOWATCHER: Validation of a tool for behavioral and video-tracking analysis in laboratory animals. Computers in Biology and Medicine, 42(2), 257-264. doi:10.1016/j.compbiomed.2011.12.002

Argus

September 5, 2018

In a recent Behavior Research Methods article, Soaleha Shams and colleagues share Argus, a data extraction and analysis tool built in the open-source R language for tracking zebrafish behavior.


Based on previously developed custom software for zebrafish behavior tracking, Argus was developed with behavioral researchers in mind. It includes a new, user-friendly, and efficient graphical user interface and offers simplicity and flexibility in measuring complex zebrafish behavior through customizable parameters set by the researcher. The program was validated against two commercially available programs for zebrafish behavior analysis and measures up in its ability to track speed, freezing, erratic movement, and interindividual distance. In summary, Argus is a novel, cost-effective, and customizable method for the analysis and quantification of both single and socially interacting zebrafish.
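To illustrate the kinds of measures Argus reports, here is a minimal sketch of how speed and freezing bouts might be derived from a single fish's trajectory. The sketch is in Python purely for illustration (Argus itself is written in R), and the threshold values and file name are placeholders, not Argus defaults.

```python
# Sketch: speed and freezing-bout detection of the kind Argus quantifies,
# computed from one fish's trajectory. Illustrative only; thresholds and
# file name are placeholders.
import numpy as np

xy = np.load("fish_track.npy")      # (n_frames, 2) positions in cm
fps = 30.0
freeze_speed_cm_s = 0.5             # hypothetical speed cutoff for "freezing"
min_freeze_s = 2.0                  # hypothetical minimum bout duration

speed = np.linalg.norm(np.diff(xy, axis=0), axis=1) * fps
is_slow = speed < freeze_speed_cm_s

# Count freezing bouts: runs of below-threshold speed lasting long enough.
bouts, run = 0, 0
for slow in is_slow:
    run = run + 1 if slow else 0
    if run == int(min_freeze_s * fps):   # bout counted once, when it reaches the minimum length
        bouts += 1

print(f"mean speed: {speed.mean():.2f} cm/s, freezing bouts: {bouts}")
```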

Read more here!


Q&A with Dr. Mackenzie Mathis on her experience with developing DeepLabCut

August 22, 2018

Dr. Mackenzie Mathis, Principal Investigator of the Adaptive Motor Control Lab (Rowland Institute at Harvard University), has shared the following responses to a short Q&A about the inspiration behind, development of, and sharing of DeepLabCut, a toolbox for animal tracking using deep learning.


What inspired you and your colleagues to create this toolbox as opposed to using previously developed commercial software?

Alexander Mathis and I both worked on behaviors where we wanted to track particular features, and they proved to be unreliably tracked with the methods we tried. Specifically, Alexander has an odor-guided navigation task that he works on in the lab of Prof. Venkatesh Murthy at Harvard, where the mice are placed in a very large “endless” paper trail and he inkjet-prints odors for them to follow to get rewards (chocolate milk). The position of the snout is very important to measure accurately, so background subtraction or other heuristics didn’t work when the nose crossed the trail and when the droplet was right in front of the snout. I worked on a skilled joystick behavior for mice, and I wanted to track joints accurately and non-invasively – a challenging problem for little hands. So, we teamed up with Prof. Matthias Bethge at the University of Tuebingen to work on a new approach. He suggested we start looking into the rapidly advancing human pose estimation literature, and we looked at several before deciding to seriously benchmark DeeperCut, a top-performing algorithm on the large MPII dataset. Those authors did something very clever, namely, they used a deep neural network (ResNet) that was pre-trained on a large image set called ImageNet. This gives the ResNet a chance to learn natural scene statistics first. Remarkably, we found that we could use only a few frames to very accurately track the snout in the odor-guided navigation task, so we next tried videos from my joystick task, and to flex DeepLabCut’s muscles, we teamed up with Kevin Cury (who, like myself, is an alumnus of Prof. Nao Uchida’s group) to track fruit flies in the 3D chamber. After all this benchmarking, we built a toolbox that implements a complete pipeline to extract and label frames, train and evaluate the deep neural nets, as well as analyze new experimental videos. We call this toolbox DeepLabCut, as a nod to DeeperCut.

What was the motivation for immediately sharing your work as an open source tool, thus making it accessible to the broader neuroscience community?

Some of the options we first tried to track with were very expensive commercial systems, and they failed quite badly. On the other hand, deep learning has revolutionized computer vision in the last few years, so we were eager to try some new approaches to solve the problem. So, in addition to being advocates of open science, we really wanted to make a toolbox that someone with minimal to no coding experience could, absolutely for free, track whatever they wanted.

We also know peer review can be slow, so as soon as we had the toolbox in place, we wrote up the arXiv paper and released the code base immediately. Honestly, it has been one of my most rewarding papers – the feedback from our peers, and seeing what people have used the code for, has been a very rewarding experience. This was my first preprint, and especially for methods manuscripts, I now cannot imagine another way to share our future work too.

How do you think open source tools, such as yours, will continue to impact the progress of scientific research?

Open source code and preprints have been the norm in some fields for decades (such as math and physics), and I am really excited to see it come of age in biology and neuroscience. I am excited to see how tools will continue to improve as the community gets behind them, just as we could build on DeeperCut, which was open source. Also, at least in my experience, many individuals write their own code, which leads to a lot of duplicated effort. Moreover, datasets are becoming increasingly complicated, and code to work with such data needs to be robust and shared. My expectation is that open source code will become the norm in the future, which can only help science become more robust.

Even before formal publication this week (see Nature Neuroscience), we estimate that about 100 labs are actively using DeepLabCut, so releasing the code before publication, we hope, has really allowed for rapid progress to be made. We were also very happy that The Atlantic could highlight some of the early adopters, as it’s one thing to say you made something, but it’s another to hear others saying it is actually ‘something.’


DeepLabCut provides an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. Read more on the website, or in Nature Neuroscience.
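The workflow Dr. Mathis describes (extract and label frames, train and evaluate the network, analyze new videos) maps onto a handful of calls in the DeepLabCut Python toolbox. Below is a minimal sketch assuming a recent release of the package; the project name, experimenter, and video paths are placeholders.

```python
# Sketch of the DeepLabCut workflow using the toolbox's Python API
# (function names from the released package; project details are placeholders).
import deeplabcut

config = deeplabcut.create_new_project(
    "reaching", "experimenter", ["videos/session1.mp4"], copy_videos=True
)

deeplabcut.extract_frames(config)            # select a small set of frames to label
deeplabcut.label_frames(config)              # opens the labeling GUI
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)             # transfer learning from a pretrained ResNet
deeplabcut.evaluate_network(config)
deeplabcut.analyze_videos(config, ["videos/session2.mp4"])  # track new experimental videos
```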



An inexpensive, scalable Picamera system for tracking rats in large spaces

August 15, 2018

In the Journal of Neurophysiology, Sachin S. Deshmukh and colleagues share their design for a Picamera system that allows for tracking of animals in large behavioral arenas.


Studies of spatial navigation and its neural correlates have in the past been limited by the reach of recording cables and by tracking ability in small behavioral arenas. With the implementation of long-range, wireless neural recording systems, researchers are now able to expand the size of their behavioral arenas to study spatial navigation, but a way to accurately track animals in these larger arenas is needed. The Picamera system is a low-cost, open-source, scalable multi-camera tracking system that can be used to track behavior in combination with wireless recording systems. The design comprises eight overhead Raspberry Pi cameras (capable of recording at a high frame rate over a large field of view), each recording video independently on its own Raspberry Pi microcomputer, with video processed using the Picamera Python library. When compared with commercial tracking software for the same purpose, the Picamera system reportedly performed better, with improvements in inter-frame-interval jitter and temporal accuracy, which improved the ability to establish relationships between recorded neural activity and video. The Picamera system is an affordable, efficient solution for tracking animals in large spaces.
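For readers unfamiliar with the picamera library, here is a minimal sketch of recording hardware-encoded, high-frame-rate video on a single Raspberry Pi. The resolution, frame rate, and duration are placeholder values, not the settings reported in the paper, and the multi-camera synchronization the authors describe is not shown.

```python
# Sketch: record high-frame-rate video on one Raspberry Pi with the picamera
# library. Resolution, frame rate, and duration are placeholders.
import picamera

with picamera.PiCamera() as camera:
    camera.resolution = (640, 480)
    camera.framerate = 90                      # high frame rates are possible at reduced resolution
    camera.start_recording("arena_cam1.h264")  # hardware-encoded H.264 output
    camera.wait_recording(600)                 # record for 10 minutes
    camera.stop_recording()
```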

Read more here!

Or check out their GitHub!


Saxena, R., Barde, W., & Deshmukh, S. S. (2018). An inexpensive, scalable camera system for tracking rats in large spaces. Journal of Neurophysiology. https://doi.org/10.1152/jn.00215.2018

Collaboration between OpenBehavior and Hackaday.io

July 23, 2018

OpenBehavior has been covering open-source neuroscience projects for a few years, and we are always thrilled to see projects that are well documented and can be easily reproduced by others.  To further this goal, we have formed a collaboration with Hackaday.io, who have provided a home for OpenBehavior on their site.  This can be found at: https://hackaday.io/OpenBehavior, where we currently have 36 projects listed ranging from electrophysiology to robotics to behavior.  We are excited about this collaboration because it provides a straightforward way for people to document their projects with instructions, videos, images, data, etc.  Check it out, see what’s there, and if you want your project linked to the OpenBehavior page simply tag it as “OPENBEHAVIOR” or drop us a line at the Hackaday page.

Note: This collaboration between OpenBehavior and Hackaday.io is completely non-commercial, meaning that we don’t pay Hackaday.io for anything, nor do we receive any payments from them.  It’s simply a way to further our goal of promoting open-source neuroscience tools and their goal of growing their science and engineering community.


https://hackaday.io/OpenBehavior


Open source modules for tracking animal behavior and closed-loop stimulation based on Open Ephys and Bonsai

June 15, 2018

In a recent preprint on BioRxiv, Alessio Buccino and colleagues from the University of Oslo provide a step-by-step guide for setting up an open-source, low-cost, and adaptable system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Their setup integrates Bonsai and Open Ephys with multiple modules they have developed for robust real-time tracking and behavior-based closed-loop stimulation. In the preprint, they describe using the system to record place cell activity in the hippocampus and medial entorhinal cortex, and they present a case in which the system drove closed-loop optogenetic stimulation of grid cells in the entorhinal cortex, as examples of what the system is capable of. Expanding the Open Ephys system to include animal tracking and behavior-based closed-loop stimulation extends the availability of high-quality, low-cost experimental setups that use standardized data formats.
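The closed-loop idea itself is simple: tracked position comes in, and a stimulation trigger goes out whenever the animal enters a region of interest. The sketch below is a schematic illustration of that loop, not the authors' Bonsai/Open Ephys modules; the region bounds and the get_position() and trigger_stimulation() functions are hypothetical stand-ins for the tracking input and the stimulation output (e.g., a TTL pulse).

```python
# Schematic sketch of behavior-based closed-loop stimulation: poll the tracked
# position and trigger stimulation whenever the animal is inside a region of
# interest. Hypothetical placeholders throughout.
import time

ROI = {"x": (0.2, 0.4), "y": (0.6, 0.8)}   # placeholder region bounds (normalized coordinates)

def in_roi(x, y):
    return ROI["x"][0] <= x <= ROI["x"][1] and ROI["y"][0] <= y <= ROI["y"][1]

def closed_loop(get_position, trigger_stimulation, rate_hz=30):
    while True:
        x, y = get_position()               # latest tracked position
        if in_roi(x, y):
            trigger_stimulation()           # e.g., send a TTL pulse to the light source
        time.sleep(1.0 / rate_hz)           # poll at roughly the tracking frame rate
```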

Read more on BioRxiv, or on GitHub!


Buccino A, Lepperød M, Dragly S, Häfliger P, Fyhn M, Hafting T (2018). Open Source Modules for Tracking Animal Behavior and Closed-loop Stimulation Based on Open Ephys and Bonsai. BioRxiv. http://dx.doi.org/10.1101/340141

Head-Fixed Setup for Combined Behavior, Electrophysiology, and Optogenetics

June 12, 2018

In a recent publication in Frontiers in Systems Neuroscience, Solari and colleagues of the Hungarian Academy of Sciences and Semmelweis University share a behavioral setup for temporally controlled rodent behavior. The arrangement allows training of head-fixed animals with calibrated sound stimuli and precisely timed fluid and air-puff presentations as reinforcers. It combines microcontroller-based behavior control with a sound delivery system for acoustic stimuli, fast solenoid valves for reinforcement delivery, and a custom-built sound-attenuated chamber, and it is shown to be suitable for combined behavior, electrophysiology, and optogenetics experiments. The system is built from an open-source hardware and software stack that uses Bonsai, Bpod, and Open Ephys.
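As a schematic illustration of the trial structure such a setup runs (calibrated tone, response window, solenoid-gated reward), here is a minimal sketch. It is not the authors' Bpod/Bonsai implementation; play_tone(), lick_detected(), and open_valve() are hypothetical stand-ins for the sound delivery and solenoid-control hardware interfaces, and the timing values are placeholders.

```python
# Schematic sketch of one trial: calibrated tone, response window, and a
# precisely timed fluid reward through a solenoid valve. Hardware interfaces
# are hypothetical stand-ins; timing values are placeholders.
import time

def run_trial(play_tone, lick_detected, open_valve,
              tone_hz=5000, response_window_s=2.0, reward_valve_s=0.05):
    play_tone(tone_hz)                      # calibrated sound stimulus
    t_start = time.monotonic()
    while time.monotonic() - t_start < response_window_s:
        if lick_detected():                 # animal responds within the window
            open_valve(reward_valve_s)      # brief valve opening delivers the fluid reward
            return True
        time.sleep(0.001)
    return False                            # miss: no reward this trial
```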

Read more here!

GitHub


Solari N, Sviatkó K, Laszlovszky T, Hegedüs P and Hangya B (2018). Open Source Tools for Temporally Controlled Rodent Behavior Suitable for Electrophysiology and Optogenetic Manipulations. Front. Syst. Neurosci. 12:18. doi: 10.3389/fnsys.2018.00018

ToxTrac: A fast and robust software for tracking organisms

June 8, 2018

OpenBehavior has shared a variety of popular open-source tracking software, and there’s another to add to the list: ToxTrac!


Alvaro Rodriguez and colleagues from Umeå University in Umeå, Sweden, have developed ToxTrac, an open-source Windows program optimized for high-speed tracking of animals. It uses an advanced tracking algorithm that requires no specific knowledge of the geometry of the tracked bodies and can therefore be used with a variety of species. ToxTrac can also track multiple bodies in multiple arenas simultaneously while maintaining individual identification. The software is fast, operating at more than 25 frames per second, and robust against false positives. ToxTrac generates useful statistics and heat maps in real scale that can be exported in image, text, and Excel formats, providing useful information about locomotor activity in rodents, insects, fish, and other animals.
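To show what such exported summaries look like, here is a minimal sketch that computes an occupancy heat map and basic locomotion statistics from a tracked trajectory. This is not ToxTrac's own code; the file name, units, frame rate, and bin count are placeholders.

```python
# Sketch: occupancy heat map and basic locomotion statistics of the kind
# ToxTrac exports, computed from an exported trajectory. Placeholders throughout.
import numpy as np

xy = np.loadtxt("track_arena1.txt")          # (n_frames, 2) positions, e.g., in mm
fps = 25.0

speed = np.linalg.norm(np.diff(xy, axis=0), axis=1) * fps    # mm/s per frame
heatmap, xedges, yedges = np.histogram2d(xy[:, 0], xy[:, 1], bins=50)

print(f"total distance: {speed.sum() / fps:.0f} mm")
print(f"mean speed: {speed.mean():.1f} mm/s")
print(f"most occupied bin: {np.unravel_index(heatmap.argmax(), heatmap.shape)}")
```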

Learn more about ToxTrac here: https://doi.org/10.1111/2041-210X.12874

Or download the ToxTrac software here: https://toxtrac.sourceforge.io


Rodriguez A, Zhang H, Klaminder J, Brodin T, Andersson PL, Andersson M. ToxTrac: A fast and robust software for tracking organisms. Methods Ecol Evol. 2018;9:460–464. https://doi.org/10.1111/2041-210X.12874