Category: Behavioral Chambers

An inexpensive, scalable Picamera system for tracking rats in large spaces

August 15, 2018

In the Journal of Neurophysiology, Sachin S. Deshmukh and colleagues share their design for a Picamera system that allows for tracking of animals in large behavioral arenas.


Studies of spatial navigation and its neural correlates have historically been limited by the reach of recording cables and by tracking systems designed for small behavioral arenas. With the advent of long-range wireless neural recording systems, researchers are now able to expand the size of their behavioral arenas, but they still need a way to track animals accurately in these larger spaces. The Picamera system is a low-cost, open-source, scalable multi-camera tracking system that can be used alongside wireless recording systems. The design consists of eight overhead Raspberry Pi cameras, each capable of recording a large field of view at a high frame rate; each camera records video independently on its own Raspberry Pi microcomputer, and the videos are processed using the Picamera Python library. When compared with commercial tracking software performing the same task, the Picamera system reportedly performed better, with reduced inter-frame-interval jitter and improved temporal accuracy, making it easier to establish relationships between recorded neural activity and video. The Picamera system is an affordable, efficient solution for tracking animals in large spaces.
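For a sense of how lightweight the per-camera code can be, here is a minimal sketch of one Pi recording its portion of the arena with the Picamera library; the resolution, frame rate, duration, and filename are illustrative values, not the settings reported in the paper.

```python
import picamera

# One of the eight Raspberry Pis, each recording its own field of view.
# All parameters below are illustrative, not the authors' settings.
with picamera.PiCamera() as camera:
    camera.resolution = (640, 480)
    camera.framerate = 90                 # V2 camera module supports 90 fps at 640x480
    camera.start_recording('arena_cam0.h264')
    camera.wait_recording(600)            # record for 10 minutes
    camera.stop_recording()
```

Because each Pi records independently, scaling the system to a larger arena is largely a matter of adding more camera/Pi pairs.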

Read more here!

Or check out their GitHub!


Saxena, R., Barde, W., & Deshmukh, S.S. (2018). An inexpensive, scalable camera system for tracking rats in large spaces. Journal of Neurophysiology. https://doi.org/10.1152/jn.00215.2018

An open-source lickometer and microstructure analysis program

August 8, 2018

In HardwareX, an open access journal for designing, building, and customizing open-source scientific hardware, Martin A. Raymond and colleagues share their design for a user-constructed, low-cost lickometer.


Researchers interested in the ingestive behaviors of rodents commonly use licking as a readout of fluid consumption, recorded by a lickometer. Commercially available lickometers are powerful tools for measuring this behavior, but they can be expensive and often require further customization. The authors offer their own design for an open-source lickometer that uses readily available or customizable components, such as a PC sound card and a 3D-printed drinking-bottle holder. Data from the device are collected with Audacity, an open-source audio program, exported to .csv format, and analyzed with an R script made available by the authors to assess various features of licking microstructure. A full bill of materials, assembly instructions, and links to design files are available in the paper.
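The analysis step is easy to adapt: the authors provide an R script, but the same idea can be sketched in a few lines of Python. The threshold and debounce values below are illustrative, and the sketch assumes a mono WAV file exported from Audacity rather than the authors' exact pipeline.

```python
import numpy as np
from scipy.io import wavfile

# Load the Audacity recording (filename illustrative; assumes mono audio).
rate, signal = wavfile.read('licks.wav')
signal = np.abs(signal.astype(float))

# Rising threshold crossings mark candidate lick contacts.
threshold = 0.5 * signal.max()            # illustrative threshold
above = signal > threshold
onsets = np.flatnonzero(above[1:] & ~above[:-1]) / rate

# Debounce: ignore crossings within 50 ms of the previous lick.
licks = []
for t in onsets:
    if not licks or t - licks[-1] > 0.05:
        licks.append(t)

# Inter-lick intervals are the basis of microstructure measures
# such as burst number and burst size.
ili = np.diff(licks)
print(f"{len(licks)} licks, median ILI {np.median(ili) * 1000:.0f} ms")
```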

Check out the full publication here!


Raymond, M. A., Mast, T. G., & Breza, J. M. (2018). An open-source lickometer and microstructure analysis program. HardwareX, 4. doi:10.1016/j.ohx.2018.e00035

Collaboration between OpenBehavior and Hackaday.io

July 23, 2018

OpenBehavior has been covering open-source neuroscience projects for a few years, and we are always thrilled to see projects that are well documented and can be easily reproduced by others.  To further this goal, we have formed a collaboration with Hackaday.io, who have provided a home for OpenBehavior on their site.  This can be found at: https://hackaday.io/OpenBehavior, where we currently have 36 projects listed ranging from electrophysiology to robotics to behavior.  We are excited about this collaboration because it provides a straightforward way for people to document their projects with instructions, videos, images, data, etc.  Check it out, see what’s there, and if you want your project linked to the OpenBehavior page simply tag it as “OPENBEHAVIOR” or drop us a line at the Hackaday page.

Note: This collaboration between OpenBehavior and Hackaday.io is completely non-commercial, meaning that we don’t pay Hackaday.io for anything, nor do we receive any payments from them.  It’s simply a way to further our goal of promoting open-source neuroscience tools and their goal of growing their science and engineering community.



Head-Fixed Setup for Combined Behavior, Electrophysiology, and Optogenetics

June 12, 2018

In a recent publication in Frontiers in Systems Neuroscience, Solari and colleagues of the Hungarian Academy of Sciences and Semmelweis University share a behavioral setup for temporally controlled rodent behavior. The arrangement allows head-fixed animals to be trained with calibrated sound stimuli and precisely timed fluid and air-puff reinforcers. It combines microcontroller-based behavior control with a sound delivery system for acoustic stimuli, fast solenoid valves for reinforcement delivery, and a custom-built sound-attenuated chamber, and it is shown to be suitable for combined behavior, electrophysiology, and optogenetics experiments. The setup is built from open-source hardware and software throughout, using Bonsai, Bpod, and Open Ephys.
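To make the trial logic concrete, here is a schematic sketch of a single trial in plain Python. The hardware functions are print-only stubs standing in for the sound card and solenoid valves, and the task structure and timings are illustrative; this is not the authors' paradigm or Bpod code.

```python
import time

# Print-only stubs standing in for the real sound and valve hardware.
def play_tone(freq_hz, dur_s):
    print(f"tone: {freq_hz} Hz for {dur_s} s")

def pulse_valve(name, dur_s):
    print(f"valve: {name} open for {dur_s} s")
    time.sleep(dur_s)

def response_detected(window_s):
    time.sleep(window_s)      # stand-in for polling a lick/response sensor
    return False

# One schematic trial: calibrated tone, response window, then a fluid
# reward or an air puff delivered through a fast solenoid valve.
def run_trial():
    play_tone(5000, 0.5)                  # illustrative stimulus
    if response_detected(window_s=1.5):
        pulse_valve('water', 0.04)        # illustrative valve timings
    else:
        pulse_valve('air puff', 0.1)

run_trial()
```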

Read more here!

GitHub


Solari N, Sviatkó K, Laszlovszky T, Hegedüs P and Hangya B (2018). Open Source Tools for Temporally Controlled Rodent Behavior Suitable for Electrophysiology and Optogenetic Manipulations. Front. Syst. Neurosci. 12:18. doi: 10.3389/fnsys.2018.00018

LocoWhisk: Quantifying rodent exploration and locomotion behaviours

March 8, 2018

Robyn A. Grant, from Manchester Metropolitan University, has shared the following on Twitter regarding the development of the LocoWhisk arena:

“Come help me develop my new arena. Happy to hear from anyone looking to test it or help me develop it further.”

The LocoWhisk system is a new, portable behavioural set-up that incorporates both gait analysis (using a pedobarograph) and whisker-movement measurement (using a high-speed video camera and an infrared light source). The system has so far been successfully piloted on many rodent models and would benefit from further validation and commercialisation opportunities.

Learn more here: https://crackit.org.uk/locowhisk-quantifying-rodent-exploration-and-locomotion-behaviours

MAPLE: a Modular Automated Platform for Large-Scale Experiments

January 8th, 2018

The de Bivort lab and FlySorter, LLC are happy to share on OpenBehavior their open-source Drosophila handling platform, called MAPLE: Modular Automated Platform for Large-Scale Experiments.

Drosophila melanogaster has proven a valuable genetic model organism thanks to the species' rapid reproduction, low maintenance requirements, and extensive genetic documentation. However, the tedious chore of handling and manually phenotyping flies remains a bottleneck for data collection. MAPLE, a Modular Automated Platform for Large-Scale Experiments, provides a solution to this limitation.

MAPLE is a Drosophila-handling robot with a modular design, allowing the platform both to automate diverse phenotyping assays and to aid with lab chores (e.g., collecting virgin female flies). MAPLE permits a small-part manipulator, a USB digital camera, and a fly manipulator to work simultaneously over a platform of flies. Failsafe mechanisms allow users to leave MAPLE unattended without risking damage to the robot or the modules.

The physical platform integrates phenotyping and animal husbandry to allow end-to-end experimental protocols. MAPLE features a large, physically open workspace for user convenience, and its sides, top, and bottom are made of clear acrylic to allow optical phenotyping at all time points other than when the end-effector carriages are above the modules. Finally, its low cost and scalability make large-scale experiments feasible ($3500, versus hundreds of thousands of dollars for a comparable “fly-flipping” robot).

MAPLE’s utility and versatility were demonstrated through the execution of two tasks: collection of virgin female flies, and a large-scale longitudinal measurement of fly social networks and behavior.
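To give a flavor of what a modular assay script might look like, here is a hypothetical sketch of a virgin-collection loop. The class and method names are stand-ins invented for illustration; MAPLE's real interfaces are in the Control Software repository linked below.

```python
# Hypothetical stand-in for MAPLE's control software, for illustration only.
class MapleStub:
    def image_well(self, well):
        print(f"imaging well {well}")
        return 'virgin_female'            # stand-in for a classification step

    def move_fly(self, src, dst):
        print(f"moving fly: {src} -> {dst}")

robot = MapleStub()
for well in range(96):                    # illustrative plate size
    if robot.image_well(well) == 'virgin_female':
        robot.move_fly(well, 'collection_plate')
```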

Links to materials:

CAD files

Control Software

Raw data and analysis scripts 

De Bivort Lab Site 



Airtrack

November 28, 2017

Airtrack was developed in the Larkum Lab by Mostafa Nashaat, Hatem Oraby, Robert Sachdev, York Winter, and Matthew Larkum. Alexander Schill, an engineer at the Charité workshop (CWW), contributed significantly to the design of the platform and the airtrack table.


Airtrack is a head-fixed behavioral environment in which a lightweight physical maze floats on an air table and moves around the animal's body under the direct control of the animal itself, solving many of the problems associated with using virtual reality for head-fixed animals.

Illustrative Image of the Airtrack


More information can be found at http://www.neuro-airtrack.com/

Nashaat, M.A., Oraby, H., Sachdev, R.N.S., Winter, Y., & Larkum, M.E. (2016). Air-Track: a real-world floating environment for active sensing in head-fixed mice. Journal of Neurophysiology, 116(4), 1542-1553. DOI: 10.1152/jn.00088.2016

Autonomous Training of a Forelimb Motor Task

November 3, 2017

Greg Silasi, from the University of Ottawa, has kindly contributed the following to OpenBehavior.


“Silasi et al developed a low-cost system for fully autonomous training of group housed mice on a forelimb motor task. We demonstrate the feasibility of tracking both end-point as well as kinematic performance of individual mice, each performing thousands of trials over 2.5 months. The task is run and controlled by a Raspberry Pi microcomputer, which allows for cages to be monitored remotely through an active internet connection.”
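As a rough illustration of the remote-monitoring idea (a generic sketch, not the authors' code), a Raspberry Pi can serve its trial log over HTTP so that cage performance can be checked from any browser with access to the connection:

```python
import http.server
import socketserver

# Serve the working directory (e.g. a trials.csv log written by the
# task program) over HTTP. Port and filenames are illustrative.
PORT = 8000
with socketserver.TCPServer(('', PORT), http.server.SimpleHTTPRequestHandler) as httpd:
    print(f"serving trial logs on port {PORT}")
    httpd.serve_forever()
```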

Click here to submit a piece of open-source software or hardware to OpenBehavior.

Moving Wall Box (MWB)

October 26th, 2017

Andreas Genewsky, from the Max-Planck Institute of Psychiatry, has generously shared the following regarding his Moving Wall Box task and associated apparatus.


“Typically, behavioral paradigms which aim to assess active vs. passive fear responses involve the repeated application of noxious stimuli like electric foot shocks (step-down avoidance, step-through avoidance, shuttle-box). Alternative methods to motivate the animals and ultimately induce a conflict situation which needs to be overcome often involve food and/or water deprivation.

In order to repeatedly assess fear-coping strategies in an emotionally challenging situation without foot shocks or food or water deprivation (complying with the Replace, Reduce & Refine 3R principles), we devised a novel testing strategy, henceforward called the Moving Wall Box (MWB) task. In short, during the MWB task a mouse is repeatedly forced to jump over a small ice-filled box (10 trials, 1 min inter-trial interval (ITI)) by slowly moving walls (2.3 mm/s, over 60 s). The presence of the animal is automatically sensed via balances and analyzed by a microcontroller board, which in turn controls the movements of the walls. The behavioral readouts are (1) the latency to reach the other compartment (high levels of behavioral inhibition lead to high latencies) and (2) the number of inter-trial shuttles per trial (low levels of behavioral inhibition lead to high numbers of shuttles during the ITI).

The MWB offers the possibility of conducting simultaneous in vivo electrophysiological recordings, which can later be aligned with the behavioral responses (escapes). The MWB task therefore fosters the study of activity patterns in, e.g., optogenetically identified neurons with respect to escape responses in a highly controlled setting. To our knowledge, no comparable behavioral paradigm is currently available.”
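To make those two readouts concrete, here is a small illustrative sketch (the event times are made up) of how escape latency and inter-trial shuttles might be computed from the balance timestamps:

```python
# Illustrative event times for one trial, in seconds on the trial clock.
wall_start = 0.0                   # walls begin moving
crossings = [12.4]                 # balance first registers the mouse
                                   # in the other compartment
iti_crossings = [3.1, 31.2, 44.9]  # crossings during the 60 s ITI

latency = crossings[0] - wall_start          # readout 1: escape latency
n_shuttles = len(iti_crossings)              # readout 2: ITI shuttles
print(f"latency {latency:.1f} s, {n_shuttles} inter-trial shuttles")
```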