April 2, 2018
Check out the Ethoscopes platform!
Ethoscopes enable high-throughput analysis of behavior in Drosophila and other animals for under $100 per unit. The system performs real-time video tracking, is built on the Raspberry Pi, and even has its own R package for data analysis. All software and build specifications are available at http://lab.gilest.ro/ethoscope.
March 8, 2018
Robyn A. Grant, from Manchester Metropolitan University, has shared the following on Twitter regarding the development of the LocoWhisk arena:
“Come help me develop my new arena. Happy to hear from anyone looking to test it or help me develop it further.”
The LocoWhisk system is a new, portable behavioural set-up that captures both gait (using a pedobarograph) and whisker movements (using a high-speed video camera and an infrared light source). The system has so far been successfully piloted on many rodent models, and would benefit from further validation and commercialisation opportunities.
Learn more here: https://crackit.org.uk/locowhisk-quantifying-rodent-exploration-and-locomotion-behaviours
An interesting summary of recent methods for monitoring behavior in rodents was published this week in Nature. The article mentions Lex Kravitz and his lab’s efforts on the Feeding Experimentation Device (FED) and also OpenBehavior. Check it out: https://www.nature.com/articles/d41586-018-02403-5
February 6, 2018
Brian Isett, who is now at Carnegie Mellon, has kindly shared the following tutorial regarding the creation and implementation of a Rodent Running Disk he designed while at the University of California, Berkeley.
“Awake, naturalistic behavior is the gold standard for many neuroscience experiments. Increasingly, researchers using the mouse model system strive to achieve this standard while also having more control than a freely moving animal. Using head-fixation, a mouse can be positioned very precisely relative to ongoing stimuli, but often at the cost of naturalism. One way to overcome this problem is to use the natural running of the mouse to control stimulus presentation in a closed-loop “virtual navigation” environment. This combination allows for awake, naturalistic behavior, with the added control of head-fixation. A key element of this paradigm is to have a very fast way of decoding mouse locomotion.
In this tutorial, we describe using an acrylic disk mounted to an optical encoder to achieve fast locomotion decoding. Using an Arduino to decode the TTL pulses coming from the optical encoder, real-time, closed-loop stimuli can be easily presented to a head-fixed mouse. This ultimately allowed us to present tactile gratings to a mouse performing a whisker-mediated texture discrimination task as a “virtual foraging task” — tactile stimuli moved past the whiskers synchronously with mouse locomotion. But the design is equally useful for measuring mouse running position and speed in a very precise way.”
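For readers who want a feel for the locomotion-decoding step described above, here is a minimal host-side sketch in Python (not the authors' published code) that converts encoder counts streamed by an Arduino into disk position and running speed. The serial port, message format, encoder resolution, and disk radius are all assumptions to be adjusted for your own setup.

```python
# Minimal host-side sketch (not the authors' published code): convert optical-
# encoder counts streamed by an Arduino into disk position and running speed.
# Assumed protocol: the Arduino prints "<millis>,<cumulative_count>" per sample.
import math
import serial  # pyserial

PORT = "/dev/ttyACM0"      # assumption: serial port of the Arduino
COUNTS_PER_REV = 1024      # assumption: encoder resolution
DISK_RADIUS_CM = 10.0      # assumption: running-disk radius

CM_PER_COUNT = 2 * math.pi * DISK_RADIUS_CM / COUNTS_PER_REV

def stream_speed(port=PORT):
    """Yield (time_s, position_cm, speed_cm_per_s) from the encoder stream."""
    with serial.Serial(port, 115200, timeout=1) as ser:
        last_t_ms, last_count = None, None
        while True:
            line = ser.readline().decode(errors="ignore").strip()
            if not line:
                continue
            try:
                t_ms, count = (int(x) for x in line.split(","))
            except ValueError:
                continue  # skip malformed lines
            if last_t_ms is not None and t_ms > last_t_ms:
                dt = (t_ms - last_t_ms) / 1000.0
                dx = (count - last_count) * CM_PER_COUNT
                yield t_ms / 1000.0, count * CM_PER_COUNT, dx / dt
            last_t_ms, last_count = t_ms, count

if __name__ == "__main__":
    for t, pos, speed in stream_speed():
        print(f"t={t:.3f} s  pos={pos:.1f} cm  speed={speed:.1f} cm/s")
```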
The tutorial may be found here.
Isett, B.R., Feasel, S.H., Lane, M.A., and Feldman, D.E. (2018). Slip-Based Coding of Local Shape and Texture in Mouse S1. Neuron 97, 418–433.e5.
January 3, 2018
The following behavioral platform was developed and published by Xinfeng Chen and Haohong Li, from Huazhong University of Science and Technology, Wuhan, China.
ArControl: Arduino Control Platform is a comprehensive behavioral platform developed to deliver stimuli and monitor responses. This easy-to-use, high-performance system uses an Arduino UNO board and a simple drive circuit along with a stand-alone GUI application. Experimental data are automatically recorded with the built-in data acquisition function, and the entire behavioral schedule is stored on the Arduino chip. Collectively, this makes ArControl a “genuine, real-time system with high temporal resolution”. Chen and Li have tested ArControl using a Go/No-Go task and a probabilistic switching behavior task. The results of their work show that ArControl is a reliable system for behavioral research.
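As a rough illustration of what a behavioral schedule expressed as a state machine looks like (the core idea behind storing the whole schedule on the Arduino), here is a short Python sketch. It is not ArControl code; the state names, durations, and trial structure are invented for illustration.

```python
# Illustrative sketch only (not ArControl source code): a Go/No-Go behavioral
# schedule expressed as a list of timed states, the general idea behind running
# the whole trial structure as a state machine. Names and durations are invented.
import random
import time

GO_TRIAL = [("cue_go", 0.5), ("response_window", 1.0), ("reward", 0.2), ("iti", 3.0)]
NOGO_TRIAL = [("cue_nogo", 0.5), ("response_window", 1.0), ("iti", 3.0)]

def run_trial(schedule, log):
    """Step through one trial, time-stamping every state transition."""
    for state, duration in schedule:
        log.append((time.time(), state))  # stands in for on-chip event logging
        time.sleep(duration)              # stands in for driving stimuli/valves

log = []
for _ in range(5):
    run_trial(GO_TRIAL if random.random() < 0.5 else NOGO_TRIAL, log)
print(log)
```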
Source code and PCB drafts may be found here: ArControl GitHub
November 28, 2017
Pyper was developed by the Margrie Laboratory.
Pyper tracks the motion of a specimen in an open field, either in real time or from pre-recorded video, and can send TTL pulses when the specimen is detected within user-defined regions of interest. The software can be used through the command line or through a built-in graphical user interface, and the live feed can be provided by a USB or Raspberry Pi camera.
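The general tracking-plus-trigger logic can be sketched in a few lines of Python with OpenCV. This is not Pyper's own code; the ROI coordinates, camera index, and TTL function are placeholders.

```python
# Sketch of the general tracking-plus-trigger idea (not Pyper's own code):
# segment the specimen from the background, take its centroid, and fire a
# trigger when the centroid falls inside a user-defined region of interest.
import cv2

ROI = (200, 150, 120, 120)      # assumption: x, y, width, height in pixels

def send_ttl():
    print("TTL pulse")          # placeholder: replace with GPIO/DAQ output

cap = cv2.VideoCapture(0)        # USB camera at index 0 (assumption)
bg = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                       # foreground mask of the specimen
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] > 0:
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        x, y, w, h = ROI
        if x <= cx <= x + w and y <= cy <= y + h:
            send_ttl()                           # centroid is inside the ROI
cap.release()
```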
Find more information here.
Manual for Pyper.
November 8, 2017
Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3DTracker, a 3D video tracking system for animal behavior.
3DTracker-FAB is open-source software for 3D video-based, markerless behavioral analysis of laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct a full 3D image of the animals and fits skeletal models to that image to estimate the animals’ 3D poses.
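Conceptually, the reconstruction step amounts to transforming each camera's point cloud into a common world frame and concatenating the results. The brief NumPy sketch below illustrates that idea only; it is not 3DTracker-FAB code, and the calibration matrices and point clouds are stand-ins.

```python
# Conceptual sketch only (not 3DTracker-FAB code): merge point clouds from
# several depth cameras into one reconstruction by applying each camera's
# extrinsic calibration (rotation R, translation t) before concatenating.
import numpy as np

def to_world(points_cam, R, t):
    """Transform an (N, 3) point cloud from camera coordinates to world coordinates."""
    return points_cam @ R.T + t

# Two hypothetical cameras; in practice R and t come from calibration.
cloud_a = np.random.rand(1000, 3)                 # stand-in for depth-camera output
cloud_b = np.random.rand(1000, 3)
R_a, t_a = np.eye(3), np.zeros(3)
R_b, t_b = np.eye(3), np.array([0.5, 0.0, 0.0])

merged = np.vstack([to_world(cloud_a, R_a, t_a), to_world(cloud_b, R_b, t_b)])
print(merged.shape)  # the full 3D image that the skeletal model is then fit to
```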
More information on 3D tracker may be found on the system’s website, www.3dtracker.org
Additionally, a dynamic poster on the system was presented on November 12, 2017 at the Society for Neuroscience annual meeting. Click here for more information.
September 7, 2017
Researchers at the National Eye Institute and the University of Oldenburg, Germany, have developed the OMR-arena for measuring visual acuity in mice.
The OMR-arena is an automated measurement and stimulation system developed to determine visual thresholds in mice. The system uses an optometer to characterize the visual performance of freely moving mice: a video-tracking system monitors the head movement of the mice while appropriate 360° stimuli are presented, and the head tracker is used to adjust the stimulus to the head position and to automatically calculate visual acuity. In addition to being open-source and affordable, this device offers researchers an objective way to measure the visual performance of freely moving mice.
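To make the acuity calculation concrete, here is a deliberately simplified Python sketch that reports the highest spatial frequency at which an optomotor tracking index still exceeds a criterion. It is not the published OMR-Arena algorithm, and the tracking values are fabricated for illustration.

```python
# Deliberately simplified sketch (not the published OMR-Arena algorithm):
# report the highest spatial frequency at which an optomotor tracking index
# still exceeds a criterion. All values below are fabricated for illustration.
spatial_freqs = [0.1, 0.2, 0.3, 0.4, 0.5]                 # cycles/degree
tracking_index = {0.1: 0.9, 0.2: 0.8, 0.3: 0.6, 0.4: 0.3, 0.5: 0.1}
CRITERION = 0.5                          # assumption: minimum index counted as tracking

def acuity_threshold(freqs, index, criterion=CRITERION):
    """Return the highest frequency whose tracking index meets the criterion."""
    above = [f for f in freqs if index[f] >= criterion]
    return max(above) if above else None

print(acuity_threshold(spatial_freqs, tracking_index))    # -> 0.3 cycles/degree
```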
Kretschmer F, Kretschmer V, Kunze VP, Kretzberg J (2013) OMR-Arena: Automated Measurement and Stimulation System to Determine Mouse Visual Thresholds Based on Optomotor Responses. PLoS ONE 8(11): e78058.
Lucy Palmer and Andrew Micallef, of the Florey Institute of Neuroscience and Mental Health, University of Melbourne, Melbourne, VIC, Australia, have shared the following Arduino- and Python-based platform for Go/No-Go tasks in an article published in Frontiers in Cellular Neuroscience.
The Go/No-Go sensory task requires an animal to report a decision in response to a stimulus. In “Go” trials, the subject must respond to a target stimulus with an action, while in “No-Go” trials, the subject withholds a response. To execute this task, a behavioral platform was created which consists of three main components: 1) a water reward delivery system, 2) a lick sensor, and 3) a sensory stimulation apparatus. The water reward is administered by a gravity flow water system, controlled by a solenoid pinch valve, while licking is monitored by a custom-made piezo-based sensor. An Arduino Uno Rev3 simultaneously controls stimulus and reward delivery. In addition, the Arduino records lick frequency and timing through the piezo sensor. A Python script, employing the pyserial library, aids communication between the Arduino and a host computer.
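A minimal example of the host-side pyserial communication might look like the sketch below. The serial port, command strings, and message format are assumptions for illustration rather than the authors' actual protocol.

```python
# Minimal sketch of host-Arduino communication with pyserial, in the spirit of
# the setup described above. The serial port, command strings, and message
# format are assumptions for illustration, not the authors' actual protocol.
import time
import serial  # pyserial

with serial.Serial("/dev/ttyACM0", 9600, timeout=0.1) as ard:
    time.sleep(2)                    # allow the Arduino to reset after the port opens
    ard.write(b"GO\n")               # hypothetical command: start a Go trial
    licks = []
    t_end = time.time() + 5.0        # 5 s response window (assumption)
    while time.time() < t_end:
        line = ard.readline().decode(errors="ignore").strip()
        if line.startswith("LICK"):  # hypothetical lick-event message from the piezo sensor
            licks.append(time.time())
    print(f"{len(licks)} licks recorded in this trial")
```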
The GitHub for the project may be found here.
August 23, 2017
In Frontiers in Neuroscience, Tyler Libey and Eberhard E. Fetz share their open-source device for recording neural activity from freely behaving non-human primates in their home cages and administering rewards.
The device documents bodily movement and neural activity and delivers rewards to monkeys behaving freely in their home cages. It allows researchers to explore behaviors in freely moving non-human primates rather than relying on rigid, tightly controlled movements, which supports a fuller understanding of movement, reward, and the neural signals involved in these behaviors. Studying freely moving animals may offer essential insight into the neural signals associated with reward-guided movement, which in turn may guide the development of more accurate brain-machine interfaces. The behavior monitoring system incorporates existing untethered recording equipment (Neurochip) and a custom hub that controls a cage-mounted feeder to deliver short-latency rewards. A depth camera provides gross movement data streams from the home cage in addition to the recorded neural activity.
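As a purely schematic illustration of activity-contingent, short-latency reward delivery (not the Neurochip or hub firmware), the Python sketch below triggers a feeder whenever an estimated firing rate crosses a criterion. Every number and function in it is a placeholder.

```python
# Purely schematic sketch (not Neurochip or hub firmware): deliver a feeder
# reward with short latency whenever an estimated firing rate crosses a
# criterion. Every number and function here is a placeholder.
import random
import time

RATE_CRITERION = 30.0   # spikes/s (assumption)
REFRACTORY_S = 5.0      # minimum interval between rewards (assumption)

def estimate_rate():
    return random.uniform(0.0, 60.0)   # stand-in for a rate computed from recorded spikes

def trigger_feeder():
    print("feeder pulse")              # placeholder: replace with the hub/feeder command

last_reward = 0.0
t_stop = time.time() + 60.0            # run for one minute in this demo
while time.time() < t_stop:
    if estimate_rate() >= RATE_CRITERION and time.time() - last_reward > REFRACTORY_S:
        trigger_feeder()
        last_reward = time.time()
    time.sleep(0.05)                   # 20 Hz polling keeps reward latency short
```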
Libey T and Fetz EE (2017) Open-Source, Low Cost, Free-Behavior Monitoring, and Reward System for Neuroscience Research in Non-human Primates. Front. Neurosci. 11:265. doi: 10.3389/fnins.2017.00265