June 15, 2018
In a recent preprint on bioRxiv, Alessio Buccino and colleagues from the University of Oslo provide a step-by-step guide for setting up an open-source, low-cost, and adaptable system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Their setup integrates Bonsai and Open Ephys with multiple modules they have developed for robust real-time tracking and behavior-based closed-loop stimulation. In the preprint, they describe using the system to record place cell activity in the hippocampus and medial entorhinal cortex, and present a case where they used it for closed-loop optogenetic stimulation of grid cells in the entorhinal cortex, as examples of what the system can do. Expanding the Open Ephys system to include animal tracking and behavior-based closed-loop stimulation extends the availability of high-quality, low-cost experimental setups that produce data in standardized formats.
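To make the closed-loop idea concrete, here is a minimal sketch of position-triggered stimulation logic. This is an illustration of the general approach, not the authors' actual Bonsai/Open Ephys code; the function names and the rectangular-region trigger are assumptions for the example.

```python
# Hypothetical sketch of behavior-based closed-loop stimulation:
# fire a stimulation pulse whenever the tracked animal enters a
# target region. Names and region format are illustrative only.

def in_region(x, y, region):
    """Return True if the tracked position (x, y) falls inside a
    rectangular target region given as (x_min, x_max, y_min, y_max)."""
    x_min, x_max, y_min, y_max = region
    return x_min <= x <= x_max and y_min <= y <= y_max

def closed_loop_step(position, region, send_ttl):
    """One loop iteration: stimulate when the animal is inside the region.

    position: (x, y) from the real-time tracker.
    send_ttl: callback that pulses the stimulation hardware (assumed).
    Returns True if a pulse was sent.
    """
    x, y = position
    if in_region(x, y, region):
        send_ttl()  # e.g., a digital TTL pulse to the laser driver
        return True
    return False
```

In the real system this decision runs inside Bonsai on each tracked frame; the sketch only shows the trigger logic itself.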
Read more on bioRxiv, or on GitHub!
Buccino A, Lepperød M, Dragly S, Häfliger P, Fyhn M, Hafting T (2018). Open Source Modules for Tracking Animal Behavior and Closed-loop Stimulation Based on Open Ephys and Bonsai. bioRxiv. http://dx.doi.org/10.1101/340141
June 12, 2018
In a recent publication in Frontiers in Systems Neuroscience, Solari and colleagues of the Hungarian Academy of Sciences and Semmelweis University describe a behavioral setup for temporally controlled rodent behavior. The arrangement allows training of head-fixed animals with calibrated sound stimuli and precisely timed fluid and air-puff presentations as reinforcers. It combines microcontroller-based behavior control with a sound delivery system for acoustic stimuli, fast solenoid valves for reinforcement delivery, and a custom-built sound-attenuated chamber, and is shown to be suitable for combined behavior, electrophysiology, and optogenetics experiments. The system is built entirely from open-source hardware and software, using Bonsai, Bpod, and Open Ephys.
Read more here!
Solari N, Sviatkó K, Laszlovszky T, Hegedüs P and Hangya B (2018). Open Source Tools for Temporally Controlled Rodent Behavior Suitable for Electrophysiology and Optogenetic Manipulations. Front. Syst. Neurosci. 12:18. doi: 10.3389/fnsys.2018.00018
June 8, 2018
OpenBehavior has shared a variety of popular open-source tracking software, and there’s another to add to the list: ToxTrac!
Alvaro Rodriguez and colleagues from Umeå University in Umeå, Sweden, have developed ToxTrac, an open-source Windows program optimized for high-speed tracking of animals. It uses an advanced tracking algorithm that requires no specific knowledge of the geometry of the tracked bodies and can therefore be used for a variety of species. ToxTrac can also track multiple bodies in multiple arenas simultaneously while maintaining individual identification. The software is fast, operating at >25 frames per second, and robust against false positives. ToxTrac generates useful statistics and heat maps in real scale that can be exported in image, text, and Excel formats, providing useful information about locomotor activity in rodents, insects, fish, etc.
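Because ToxTrac exports trajectories as plain text, downstream statistics such as occupancy heat maps in real-world units are easy to reproduce yourself. A minimal sketch of the idea, assuming you already have tracked pixel coordinates and a pixel-to-millimeter calibration factor (this is illustrative numpy code, not ToxTrac's own implementation):

```python
import numpy as np

def occupancy_heatmap(x_px, y_px, mm_per_px, bin_mm=10.0):
    """Bin tracked positions into an occupancy heat map in real (mm) scale.

    x_px, y_px: arrays of tracked pixel coordinates from the exported file.
    mm_per_px: calibration factor from the arena geometry (assumed known).
    bin_mm: spatial bin size in millimeters.
    Returns (counts, x_edges, y_edges) where counts[i, j] is the number
    of samples in spatial bin (i, j).
    """
    x_mm = np.asarray(x_px) * mm_per_px
    y_mm = np.asarray(y_px) * mm_per_px
    x_edges = np.arange(0.0, x_mm.max() + bin_mm, bin_mm)
    y_edges = np.arange(0.0, y_mm.max() + bin_mm, bin_mm)
    counts, _, _ = np.histogram2d(x_mm, y_mm, bins=[x_edges, y_edges])
    return counts, x_edges, y_edges
```

Dividing each bin count by the frame rate turns the map into time spent per location.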
Learn more about ToxTrac here: https://doi.org/10.1111/2041-210X.12874
Or Download ToxTrac software here: https://toxtrac.sourceforge.io
Rodriguez A, Zhang H, Klaminder J, Brodin T, Andersson PL, Andersson M. ToxTrac: A fast and robust software for tracking organisms. Methods Ecol Evol. 2018;9:460–464. https://doi.org/10.1111/2041-210X.12874
June 6, 2018
This post is relevant for MedPC users who also use MATLAB or Python for data analysis.
We recently became aware that many MedPC users are not saving precise times for behavioral events from their experiments. A method called time-event codes was worked out around 2000 by Russ Church and his group at Brown, working with MedAssociates. Marcelo Caetano, a former postdoc in the Laubach Lab at Yale, incorporated this approach into an existing MATLAB function (MedParse, written by Kumar Narayanan during his PhD training in the Laubach Lab at Yale). More recently, the code was ported to Python by Kyra Swanson, a PhD student in the Laubach Lab at American University. It is available at https://github.com/LaubachLab/MedParse. The repository provides MedPC code for saving precise times of behavioral events (an example is in the MedPC template), along with MATLAB and Python functions that convert MedPC data (see the template) into “time-event codes,” i.e., a two-column matrix with times (column 1) and events (column 2).
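The time-event code format itself is simple to work with. Here is a hedged sketch of the unpacking step; the exact packed encoding below (time × 10⁴ plus event code in one number) is an assumption for illustration, so check it against your own MedPC template and the functions in the LaubachLab/MedParse repository:

```python
import numpy as np

def to_time_event_codes(packed, code_digits=4):
    """Split packed MedPC values into a two-column (time, event) matrix.

    ASSUMED encoding (verify against your MedPC template): each stored
    value packs time * 10**code_digits + event_code, so event code 23
    occurring at time 125 would be stored as 1250023.
    """
    packed = np.asarray(packed, dtype=np.int64)
    times = packed // 10**code_digits   # column 1: event times
    events = packed % 10**code_digits   # column 2: event codes
    return np.column_stack([times, events])
```

The resulting two-column matrix is exactly the structure described above: times in column 1, events in column 2.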
May 21, 2018
Meaghan Creed has developed a novel device for assessing how mice choose among fluids in their home cages, i.e., the two-bottle choice test. She shared the design on http://hackaday.io and contributed the summary below.
Often in behavioral neuroscience, we need to measure how often and how much a mouse will consume multiple liquids in its home cage. Examples include sucrose preference tasks in models of depression, or oral drug self-administration (e.g., morphine or other opiates) in the context of addiction. Classically, two bottles are filled with liquids and volumes are manually recorded at a single time point. Here, we present a low-cost, two-sipper apparatus that mounts on the inside of a standard mouse cage. Interactions are detected using photointerrupters at the base of each sipper and logged to an SD card by a standard Arduino. Sippers are constructed from 15 mL conical tubes, which also allows volumetric measurements; the rest of the holding apparatus is 3D printed, and the device is assembled from parts available from Arduino and Sparkfun. This automated approach yields high temporal resolution over 24 hours, allowing measurement of intake patterns in addition to volumes. Because bottles no longer need to be weighed manually, the design supports high-throughput studies with much larger cohorts.
As designed, each set of two sippers uses its own Arduino and SD card. With a bit of modification to the code, one Arduino Uno can be programmed to log from 6 cages onto the same SD card, and Arduino-compatible boards with more GPIOs (such as the Arduino Mega) can log from up to 56 sippers on one board.
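Once the interaction timestamps are on the SD card, a preference index can be computed offline. A minimal sketch, assuming the log is a sequence of (timestamp, sipper ID) records; this format and the function are illustrative, not Creed's actual analysis code:

```python
def two_bottle_preference(events, target="A"):
    """Compute a simple preference index from logged sipper interactions.

    events: iterable of (timestamp_s, sipper_id) tuples, as an Arduino
    might write them to the SD card (the log format is an assumption).
    Returns the fraction of beam-break events at the target sipper,
    or NaN if nothing was logged.
    """
    total = 0
    hits = 0
    for _t, sipper in events:
        total += 1
        if sipper == target:
            hits += 1
    return hits / total if total else float("nan")
```

Binning the same events by hour instead of pooling them gives the 24-hour intake pattern mentioned above.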
May 15, 2018
We developed a toolbox for videographic processing of head-fixed rodent behaviors. It extracts the principal components of the animal’s behavior, either in a single movie, or across movies recorded simultaneously. Several regions of interest in the movie can be processed simultaneously (such as the whisker pad or the nose). We found that the behavioral components from the full face of the mouse predicted up to half of the explainable neural activity across the brain (see https://www.biorxiv.org/content/early/2018/04/22/306019).
In addition to extracting movement components, it can compute the pupil area of the rodent using a center-of-mass estimation. Also, in experiments in which the mouse sits on a surface with texture, the software can estimate the running speed of the mouse. The software is available here (https://github.com/carsen-stringer/FaceMap).
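The core of the movement-component extraction can be sketched in a few lines: compute frame-to-frame motion energy in an ROI, then take its top principal components via SVD. This is a simplified illustration of the idea behind the toolbox, not FaceMap's actual implementation (which processes large movies in chunks):

```python
import numpy as np

def motion_pcs(frames, n_pcs=3):
    """Extract principal components of motion from a movie ROI.

    frames: array of shape (n_frames, height, width).
    Computes the absolute frame-to-frame difference ("motion energy"),
    centers it per pixel, and returns the top n_pcs PC time courses
    (shape (n_frames - 1, n_pcs)) via SVD.
    """
    frames = np.asarray(frames, dtype=float)
    motion = np.abs(np.diff(frames, axis=0))   # (n_frames - 1, h, w)
    flat = motion.reshape(motion.shape[0], -1) # one row per frame pair
    flat = flat - flat.mean(axis=0)            # center each pixel
    u, s, vt = np.linalg.svd(flat, full_matrices=False)
    return u[:, :n_pcs] * s[:n_pcs]            # weighted PC time courses
```

The corresponding rows of `vt` (not returned here) are the spatial masks showing which pixels drive each component, e.g., the whisker pad or nose.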
April 2, 2018
Check out the Ethoscopes platform!
Ethoscopes enable high-throughput analysis of behavior in Drosophila and other animals for <$100 per unit. The system is capable of real-time video tracking, is based on the Raspberry Pi, and even has its own R package for data analysis. All software and build specifications are available at http://lab.gilest.ro/ethoscope.
March 9, 2018
O’Leary and colleagues describe an open-source touch-screen for rodent behavioral testing. The manuscript is well documented and includes all of the parts needed to build the system on your own. The methods are very useful for testing cognitive function and for relating findings across species (rodents, primates, humans). Congrats to the authors on setting a high standard for open-source neuroscience!
O’Leary, J.D., O’Leary, O.F., Cryan, J.F. et al. Behav Res (2018). https://doi.org/10.3758/s13428-018-1030-y
March 8, 2018
Robyn A. Grant, from Manchester Metropolitan University, has shared the following on Twitter regarding the development of the LocoWhisk arena:
“Come help me develop my new arena. Happy to hear from anyone looking to test it or help me develop it further.”
The LocoWhisk system is a new, portable behavioural set-up that incorporates both gait analysis (using a pedobarograph) and whisker movements (using high-speed video camera and infrared light source). The system has so far been successfully piloted on many rodent models, and would benefit from further validation and commercialisation opportunities.
Learn more here: https://crackit.org.uk/locowhisk-quantifying-rodent-exploration-and-locomotion-behaviours
March 1, 2018
From the Kravitz lab at the NIH comes a simple device for dispensing pre-measured quantities of food at regular intervals throughout the day. Affectionately known as “SnackClock”, this device uses a 24-hour clock movement to rotate a dispenser wheel one revolution per day. The wheel contains 12 compartments, allowing the device to dispense 12 pre-measured “snacks” at regular 2-hour intervals. The Kravitz lab has used this device to dispense high-fat diet throughout the day, rather than giving mice one big piece once per day. The device is very simple to build and use, requiring just two 3D-printed parts and a ~$10 clock movement. No microcontroller or coding is required, and it runs on one AA battery for >1 year. The 3D files are supplied and can be edited to fit SnackClock in different brands of caging, or to adjust the number of snack compartments. With additional effort, the clock movement could be replaced by a stepper motor to allow dispensing at irregular or less frequent intervals.