Robyn Grant, from Manchester Metropolitan University, has shared the following with Open Behavior regarding the development of an automated rodent tracking (ART) program:
We have developed a program (ART, available from: http://mwa.bretthewitt.net/downloads.php) that can automatically track rodent position and movement. It tracks head movements, body movements and aspects of body size, and it can also identify certain behaviours from video footage, such as rotations, moving forwards, interacting with objects and staying still. Our program is very flexible, so additional modules can easily be "plugged in". For example, it currently has a manual tracker module, which allows your automatic tracking to be validated against manual tracking points (using MWA: Hewitt, Yap & Grant 2016, Journal of Open Research Software). This versatility means that in the future other modules might be added, such as additional behaviour identifiers, or trackers for other features such as feet or whiskers.
Our program, ART, is also highly automated. It requires minimal user input, yet performs as well as other trackers that demand extensive manual processing. It can automatically find the video frames in which the mouse is present and track only those frames; or you can specify that it track only when the mouse is locomoting or rotating, for example. We hope that this tracker will form a solid basis from which to investigate rodent behaviour further.
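ART's internal frame-selection logic is not published in this post, but the idea of tracking only frames where an animal is present is easy to sketch. Below is a minimal Python example that uses OpenCV background subtraction as a stand-in for ART's detector; the video filename and pixel threshold are placeholders to be tuned per setup.

```python
# Minimal sketch of presence-gated tracking, in the spirit of ART's
# "only track frames where the mouse is present" behaviour. This is a
# stand-in, not ART's actual implementation: "video.avi" and
# MIN_FOREGROUND_PIXELS are placeholder values.
import cv2

MIN_FOREGROUND_PIXELS = 2000  # assumed presence threshold, tune per setup

cap = cv2.VideoCapture("video.avi")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

frames_with_mouse = []
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    # Count foreground pixels; a large blob suggests an animal in view.
    if cv2.countNonZero(mask) > MIN_FOREGROUND_PIXELS:
        frames_with_mouse.append(frame_idx)
    frame_idx += 1
cap.release()
print(f"{len(frames_with_mouse)} frames flagged for tracking")
```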
Robert Sachdev, from the NeuroCure Cluster of Excellence, Humboldt-Universität zu Berlin, Germany, has generously shared the following regarding automated optical tracking of animal movement:
“We have developed a method for tracking the motion of whiskers, limbs and whole animals in real-time. We show how to use a plug-and-play Pixy camera to monitor the real-time motion of multiple colored objects, and how to apply the same tools for post-hoc analysis of high-speed video. Our method has major advantages over currently available methods: we can track the motion of multiple adjacent whiskers in real-time, and apply the same methods post-hoc to “recapture” the same motion at high temporal resolution. Our method is flexible; it can track similarly shaped objects, such as two adjacent whiskers, forepaws, or even two freely moving animals. With this method it becomes possible to use the phase of movement of particular whiskers or a limb to perform closed-loop experiments.”
Figure: Using Pixy to track two adjacent whiskers. (A) Setup. Head-fixed mice are acclimatized to whisker painting and trained to use their whiskers to contact a piezo-film touch sensor. A Pixy camera tracks whiskers in real-time (left), while a high-speed color camera simultaneously acquires data. (B) Paradigm for the whisker task. A sound cue initiates the trial. The animal whisks one of the two painted whiskers into contact with a piezo-film sensor; if contact reaches threshold, the animal obtains a liquid reward. There is a minimum inter-trial interval of 10 seconds. (C) Capturing whisker motion in real-time. The movement and location of the D1 and D2 whiskers are shown at two consecutive time points (20 ms apart, left and right images). Lines corresponding to the locations of the two whiskers (middle panel) were acquired with Spike2 software. The waveform of the whisker data reflects the spatial location and the dimensions of the tracked box around the whisker, both of which can change as the whisker moves.
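Pixy itself tracks user-taught color signatures in hardware, but the post-hoc side of this approach can be approximated with standard color thresholding. The sketch below, in Python with OpenCV, finds the centroid of one painted whisker per frame; the HSV bounds and filename are hypothetical and not part of the authors' toolchain.

```python
# Post-hoc colour-tracking sketch: find the centroid of a painted whisker
# in each frame via HSV thresholding, loosely analogous to Pixy's
# colour-signature tracking. The HSV bounds are hypothetical and would
# need tuning to the actual paint colours.
import cv2
import numpy as np

LOWER_GREEN = np.array([40, 80, 80])    # assumed paint colour range
UPPER_GREEN = np.array([80, 255, 255])

def whisker_centroid(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_GREEN, UPPER_GREEN)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None  # whisker not detected in this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

cap = cv2.VideoCapture("highspeed_trial.avi")  # placeholder filename
trajectory = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    trajectory.append(whisker_centroid(frame))
cap.release()
```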
Jakob Voigts, from the Massachusetts Institute of Technology, has shared the following regarding www.open-ephys.org. Open Ephys aims to distribute reliable open-source software as well as tools for extracellular recording and stimulation.
“Open Ephys is a collaborative effort to develop, document, and distribute open-source tools for systems neuroscience. Since the spring of 2011, our main focus has been on creating a multichannel data acquisition system optimized for recordings in freely behaving rodents. However, most of our tools are general enough to be used in applications involving other model organisms and electrode types.
We believe that open-source tools can improve basic scientific research in a variety of ways. They are often less expensive than their closed-source counterparts, making it more affordable to scale up one’s experiments. They are readily modifiable, giving scientists a degree of flexibility that is not usually provided by commercial systems. They are more transparent, which leads to a better understanding of how one’s data is being generated. Finally, by encouraging researchers to document and share tools they would otherwise keep to themselves, the open-source community reduces redundant development efforts, thereby increasing overall scientific productivity.” – Jakob Voigts
Open Ephys features devices such as the flexDrive, a “chronic drive implant for extracellular electrophysiology”, as well as an Arduino-based tetrode twister and the Pulse Pal, which generates precise voltage pulses. Open Ephys also features software such as Symphony, a MATLAB-based data acquisition system for electrophysiology.
Dr. Ewelina Knapska, from the Nencki Institute of Experimental Biology in Warsaw, Poland, has shared the following regarding Eco-HAB, an RFID-based system for automated tracking:
Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.
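Eco-HAB's analysis code is custom-made, so the following is only an illustrative Python sketch of one measure it reports: the time each pair of mice voluntarily spends in the same compartment, reconstructed from RFID antenna-crossing events. The event format used here (timestamp, mouse ID, compartment entered) is an assumption, not Eco-HAB's actual log format.

```python
# Hypothetical sketch of an Eco-HAB-style sociability measure: total time
# each pair of mice spends in the same compartment, from RFID crossings.
# The (time_s, mouse_id, compartment) event format is assumed.
from itertools import combinations

events = [
    (0.0, "m1", 1), (0.0, "m2", 2), (5.0, "m2", 1), (20.0, "m1", 3),
]

def pairwise_cooccupancy(events, t_end):
    location = {}   # current compartment of each mouse
    together = {}   # accumulated co-occupancy time per pair
    last_t = 0.0
    for t, mouse, comp in sorted(events):
        # Credit the interval since the last event to co-located pairs.
        for a, b in combinations(sorted(location), 2):
            if location[a] == location[b]:
                together[(a, b)] = together.get((a, b), 0.0) + (t - last_t)
        location[mouse] = comp
        last_t = t
    # Credit the tail interval up to the end of the recording.
    for a, b in combinations(sorted(location), 2):
        if location[a] == location[b]:
            together[(a, b)] = together.get((a, b), 0.0) + (t_end - last_t)
    return together

print(pairwise_cooccupancy(events, t_end=60.0))  # {('m1', 'm2'): 15.0}
```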
The Janelia Automatic Animal Behavior Annotator (JAABA) is a machine-learning-based system that enables researchers to automatically detect behaviors in video of behaving animals. Through our system, users begin by labeling the behavior of the animal, e.g. walking, grooming, or following, in a few of the video frames. Using machine learning, JAABA learns behavior detectors from these labels that can then be used to automatically classify the behaviors of animals in other videos. JAABA has a fast and intuitive graphical user interface, and offers multiple ways for a non-machine-learning user to visualize and understand the classifier. JAABA has enabled the extraction of detailed, scientifically meaningful measurements of behavioral effects in large experiments.
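JAABA itself is a MATLAB application with its own interactive learners, but its label-then-learn workflow can be sketched with scikit-learn as a stand-in. In the toy example below, the per-frame features and sparse user labels are synthetic placeholders.

```python
# Sketch of JAABA's label-then-learn workflow, using scikit-learn as a
# stand-in for JAABA's own learners. X holds per-frame trajectory
# features (e.g. speed, turn rate); y holds the user's frame labels.
# All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_labeled = rng.normal(size=(200, 4))          # features of labeled frames
y_labeled = (X_labeled[:, 0] > 0).astype(int)  # 1 = "walking", 0 = "not"

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_labeled, y_labeled)

# Apply the learned detector to every frame of a new video.
X_new_video = rng.normal(size=(10_000, 4))
predicted = clf.predict(X_new_video)
print(f"frames classified as walking: {predicted.sum()}")
```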
Olivier Friard of the University of Turin has generously shared the following about BORIS:
BORIS is an open-source, multiplatform standalone program that provides a user-specific coding environment for computer-based review of previously recorded videos or for live observations. The program allows the user to define a highly customised, project-based ethogram that can then be shared with collaborators, imported, or modified. Once the coding process is completed, the program can automatically extract a time budget from single or grouped observations and present an at-a-glance summary of the main behavioural features. The observation data and analyses can be exported in various formats (e.g., CSV or XLS). The behavioural events can also be plotted and exported in various graphic formats (e.g., PNG, EPS, and PDF).
BORIS is currently used around the world for strikingly different approaches to the study of animal and human behaviour.
The latest version of BORIS can be downloaded from www.boris.unito.it, where the manual and some tutorials are also available.
Friard, O. and Gamba, M. (2016). BORIS: a free, versatile open-source event-logging software for video/audio coding and live observations. Methods in Ecology and Evolution, 7: 1325–1330. doi:10.1111/2041-210X.12584
Gonçalo Lopes of the Champalimaud Neuroscience Programme has shared the following about Bonsai:
Bonsai is an open-source visual programming language for composing reactive and asynchronous data streams coming from video cameras, microphones, electrophysiology systems or data acquisition boards. It gives the experimenter an easy way not only to collect data from all these devices simultaneously, but also to quickly and flexibly chain together real-time processing pipelines that can be fed back to actuator systems for closed-loop stimulation using sound, images or other kinds of digital and analog output.
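Bonsai pipelines are composed graphically rather than in text, but the underlying idea of chaining operators over a data stream and closing the loop on an actuator can be sketched with plain Python generators. Everything below (the fake frame source, smoothing constant, and trigger level) is illustrative, not Bonsai's API.

```python
# Conceptual sketch of a Bonsai-style reactive pipeline in Python:
# a source stream is piped through processing operators and the result
# drives a (pretend) actuator. Not Bonsai's actual API.
import random

def frame_source(n_frames=100):
    """Stand-in for a camera: yields fake brightness samples."""
    for _ in range(n_frames):
        yield random.random()

def smooth(stream, alpha=0.2):
    """Exponential-smoothing operator over the stream."""
    value = None
    for x in stream:
        value = x if value is None else alpha * x + (1 - alpha) * value
        yield value

def threshold_trigger(stream, level=0.7):
    """Emit True whenever the smoothed signal exceeds the level."""
    for x in stream:
        yield x > level

# Compose the pipeline, Bonsai-style: source -> smooth -> trigger -> sink.
n_triggers = 0
for stimulate in threshold_trigger(smooth(frame_source())):
    if stimulate:
        n_triggers += 1  # here one would drive an actuator (TTL, sound, ...)
print(f"trigger condition met on {n_triggers} frames")
```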
Bonsai is being used around the world to design all kinds of behavioral experiments from eye-tracking to virtual reality, spanning multiple model organisms from fish to humans.
Ian Dublon, former postdoctoral researcher at the Swedish University of Agricultural Sciences, has shared the following about Scintillate, an open-source graphical viewer for the evaluation and pre-processing of time-series calcium imaging data:
The program itself is relatively simple and originated out of a real need to rapidly appraise 2D calcium imaging datasets, and out of frustration with existing acquisition software that relies exclusively on defined regions of interest (ROIs). There is of course nothing wrong with ROIs, but it is often useful to rapidly appraise the whole matrix; it really depends on the biological question being asked.
At its simplest, when you open a stacked TIFF, Scintillate uses MATLAB's imabsdiff to compute the absolute difference between successive frames in the stack. Several other, more powerful tools, including pre-stimulus background subtraction and ICA (using the excellent FastICA), are provided to help the user ascertain the value of the studied preparation. This helps to confirm and pinpoint areas of signal change, allowing the imager to adjust image acquisition parameters or simply start afresh.
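Scintillate is written in MATLAB, but that first-pass operation, the absolute difference between successive frames of a stack, maps directly onto NumPy. A minimal sketch, assuming the tifffile package for reading the stack and a placeholder filename:

```python
# Reproducing Scintillate's first-pass check (MATLAB's imabsdiff across
# successive frames) with NumPy. "stack.tif" is a placeholder path and
# tifffile is an assumed dependency for reading multi-page TIFFs.
import numpy as np
import tifffile

stack = tifffile.imread("stack.tif").astype(np.int32)  # (frames, h, w);
                                                       # int32 avoids uint wraparound
diffs = np.abs(np.diff(stack, axis=0))                 # frame-to-frame change

# Frames with large total change highlight candidate response periods.
activity = diffs.sum(axis=(1, 2))
print("frame of peak change:", int(activity.argmax()) + 1)
```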
Written as a MATLAB GUI, with source and GUIDE files provided alongside a getting-started manual, it is designed both for image acquisition people and for MATLAB coders. If the MATLAB Compiler toolbox is present, it is possible to package a version of Scintillate that runs without a MATLAB installation, making it ideal for the image acquisition workstation. Simply provide it with an acquired stacked TIFF and, within three or so clicks and no command-line syntax, you are ready to go.
Code is hosted on GitHub at github.com/dublon/scintillate, and Scintillate uses some freely available third-party code, which is gladly acknowledged throughout. Scintillate is not designed to replace traditional analysis methods but rather to provide an open-source means of rapid evaluation during the pre-processing stage. It is hoped that, by hosting it on GitHub, it may be developed further and thus adjusted to suit each person's individual imaging needs.
Jan Zimmermann, a postdoctoral fellow in the Glimcher Lab at New York University, has shared the following about Oculomatic:
Video-based noninvasive eye trackers are an extremely useful tool for many areas of research. Many open-source eye trackers are available, but current open-source systems are not designed to track eye movements with the temporal resolution required to investigate the mechanisms of oculomotor behavior. Commercial systems are available but employ closed-source hardware and software and are relatively expensive, limiting widespread use.
Here we present Oculomatic, an open-source software and modular hardware solution for eye tracking in humans and non-human primates. Our fully modular hardware implementation relies on machine-vision USB3 camera systems paired with affordable lens options from the surveillance sector. Our cross-platform software implementation (C++) relies on openFrameworks and OpenCV, and uses binary image statistics (following Green's theorem) to compute pupil location and diameter. Oculomatic makes use of data acquisition devices to output eye position as calibrated analog voltages for direct integration into the electrophysiological acquisition chain.
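As a rough illustration of that binary-image-statistics step, the sketch below thresholds the dark pupil and uses image moments to recover a centroid and an equal-area-circle diameter. Oculomatic's real pipeline is C++ on openFrameworks/OpenCV; this Python translation and the threshold value are assumptions.

```python
# Sketch of pupil localization from binary image statistics: threshold
# the dark pupil, then use image moments for centroid and an equivalent
# diameter. Illustrative only; the threshold is a placeholder.
import cv2
import numpy as np

def pupil_center_and_diameter(gray_frame, thresh=40):
    # The pupil is the darkest region; invert so it becomes foreground.
    _, binary = cv2.threshold(gray_frame, thresh, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(binary, binaryImage=True)
    if m["m00"] == 0:
        return None  # no pupil-sized dark region found
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    area = m["m00"]                         # pixel count of the binary blob
    diameter = 2.0 * np.sqrt(area / np.pi)  # diameter of an equal-area circle
    return (cx, cy), diameter
```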
Oculomatic features high temporal resolution (up to 600 Hz), real-time eye tracking with high spatial accuracy (< 0.5°), and low system latency (< 1.8 ms) at relatively low cost (< 1000 USD). We tested Oculomatic during regular monkey training and task performance in two independent laboratories and compared its performance to existing scleral search coils. While fully non-invasive, Oculomatic compared favorably to the gold-standard search coils, with only a slight decrease in spatial accuracy.
We propose that this system can support a range of research into the properties and neural mechanisms of oculomotor behavior, as well as provide an affordable tool to further scale non-human primate electrophysiology.
Zimmermann, J., Vazquez, Y., Glimcher, P. W., Pesaran, B., & Louie, K. (2016). Oculomatic: High speed, reliable, and accurate open-source eye tracking for humans and non-human primates. Journal of Neuroscience Methods, 270, 138-146.
The Laubach Lab at American University investigates executive control and decision making, focusing on the role of the prefrontal cortex. Through their GitHub repository, these researchers provide 3D print files for many of the behavioral devices used in their lab, including a Nosepoke and a Lickometer designed for rats. The repository also includes a script that reads MedPC files into Python in a usable way.
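The repository ships its own reader, so the sketch below is only an independent illustration of the general shape of a MedPC data file: header lines such as "Start Date:", followed by single-letter arrays ("A:") whose values continue on indented, index-prefixed rows. The filename is a placeholder.

```python
# Illustrative MedPC-format reader (not the Laubach lab's script).
# Assumes the common raw layout: "Start Date: ..." header fields, then
# single-letter array names followed by index-prefixed value rows.
def read_medpc(path):
    data, current = {}, None
    with open(path) as f:
        for line in f:
            line = line.rstrip()
            if not line:
                continue
            key, _, rest = line.partition(":")
            key, rest = key.strip(), rest.strip()
            if len(key) == 1 and key.isalpha():  # start of an array, e.g. "A:"
                current = key
                data[current] = [float(v) for v in rest.split()] if rest else []
            elif key.isdigit() and current:      # continuation row, e.g. "5: ..."
                data[current].extend(float(v) for v in rest.split())
            else:                                # header field, kept as a string
                current = None
                data[key] = rest
    return data

session = read_medpc("session.txt")  # placeholder filename
```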