Category: Sensors

Microwave-based Homecage Motion Detector

June 25, 2018

Andreas Genewsky and colleagues from the Max Planck Institute of Psychiatry have shared the design, construction, and validation of a simplified, low-cost, radar-based motion detector for home cage activity monitoring in mice. This simple, open-source device detects motion without requiring visual contact with the animal and can be used with various cage types. It features a custom printed circuit board and a motion detector shield for Arduino, which saves raw activity data with timestamps to CSV files on an SD card; the authors also provide a Python script for data analysis and generation of actograms. This device offers a cost-effective, DIY alternative to optical imaging of home-cage activity.
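The actogram workflow boils down to binning timestamped activity counts by time of day. The sketch below is illustrative only and is not the authors' script; the CSV column names `timestamp_s` and `activity` are assumptions about the log format.

```python
# Hypothetical sketch: bin timestamped activity counts (as logged to the
# SD card in CSV form) into hour-of-day totals, the raw material for an
# actogram row. Column names are assumed, not taken from the publication.
import csv
from collections import defaultdict

def hourly_activity(csv_path):
    """Sum activity counts per hour of day from a timestamped CSV log."""
    totals = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            hour = (int(float(row["timestamp_s"])) // 3600) % 24
            totals[hour] += int(row["activity"])
    return [totals[h] for h in range(24)]
```

Plotting the resulting 24-element list for each recording day, stacked day over day, yields a conventional actogram.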

Read more from the Journal of Biological Engineering publication!


Genewsky, A., Heinz, D. E., Kaplick, P. M., Kilonzo, K., & Wotjak, C. T. (2017). A simplified microwave-based motion detector for home cage activity monitoring in mice. Journal of Biological Engineering, 11(1). doi:10.1186/s13036-017-0079-y

Open source modules for tracking animal behavior and closed-loop stimulation based on Open Ephys and Bonsai

June 15, 2018

In a recent preprint on BioRxiv, Alessio Buccino and colleagues from the University of Oslo provide a step-by-step guide for setting up an open source, low cost, and adaptable system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Their setup integrates Bonsai and Open Ephys with multiple modules they have developed for robust real-time tracking and behavior-based closed-loop stimulation. In the preprint, they describe using the system to record place cell activity in the hippocampus and medial entorhinal cortex, and, as an example of the system's capabilities, present a case of closed-loop optogenetic stimulation of grid cells in the entorhinal cortex. Expanding the Open Ephys system to include animal tracking and behavior-based closed-loop stimulation makes high-quality, low-cost experimental setups with standardized data formats more widely available.
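A core analysis that synchronized tracking and electrophysiology enables is the occupancy-normalized firing-rate ("place field") map. The sketch below is not from the preprint; it is a minimal illustration of the idea, assuming position samples and per-sample spike counts that share a common clock.

```python
# Illustrative sketch only: build an occupancy-normalized rate map from
# synchronized (x, y) position samples and spike counts per sample.
# Bin size and data layout are assumptions for the example.
from collections import defaultdict

def rate_map(positions, spike_counts, bin_size=5.0):
    """Mean spikes per position sample, binned on a spatial grid."""
    spikes = defaultdict(float)
    occupancy = defaultdict(int)
    for (x, y), n in zip(positions, spike_counts):
        key = (int(x // bin_size), int(y // bin_size))
        spikes[key] += n       # total spikes observed in this spatial bin
        occupancy[key] += 1    # number of samples spent in this bin
    return {k: spikes[k] / occupancy[k] for k in occupancy}
```

Bins with elevated mean spike counts relative to the rest of the arena are candidate place or grid fields.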

Read more on BioRxiv, or on GitHub!


Buccino A, Lepperød M, Dragly S, Häfliger P, Fyhn M, Hafting T (2018). Open Source Modules for Tracking Animal Behavior and Closed-loop Stimulation Based on Open Ephys and Bonsai. BioRxiv. http://dx.doi.org/10.1101/340141

Open-source touch-screen for rodent behavioral testing

March 9, 2018

O’Leary and colleagues describe an open-source touch-screen for rodent behavioral testing. The manuscript is well documented and includes all of the parts needed to build the system on your own. These methods are very useful for testing cognitive function and relating findings across species (rodents, primates, humans). Congrats to the authors on setting a high standard for open-source neuroscience!

O’Leary, J.D., O’Leary, O.F., Cryan, J.F. et al. Behav Res (2018). https://doi.org/10.3758/s13428-018-1030-y

DIY Rodent Running Disk

February 6, 2018 

Brian Isett, who is now at Carnegie Mellon, has kindly shared the following tutorial regarding the creation and implementation of a Rodent Running Disk he designed while at University of California, Berkeley.


“Awake, naturalistic behavior is the gold standard for many neuroscience experiments.  Increasingly, researchers using the mouse model system strive to achieve this standard while also having more control than a freely moving animal. Using head-fixation, a mouse can be positioned very precisely relative to ongoing stimuli, but often at the cost of naturalism. One way to overcome this problem is to use the natural running of the mouse to control stimulus presentation in a closed-loop “virtual navigation” environment. This combination allows for awake, naturalistic behavior, with the added control of head-fixation. A key element of this paradigm is to have a very fast way of decoding mouse locomotion.
In this tutorial, we describe using an acrylic disk mounted to an optical encoder to achieve fast locomotion decoding. Using an Arduino to decode the TTL pulses coming from the optical encoder, real-time, closed-loop stimuli can be easily presented to a head-fixed mouse. This ultimately allowed us to present tactile gratings to a mouse performing a whisker-mediated texture discrimination task as a “virtual foraging task” — tactile stimuli moved past the whiskers synchronously with mouse locomotion. But the design is equally useful for measuring mouse running position and speed in a very precise way.”
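The decoding step the tutorial describes is standard quadrature decoding: the encoder's two TTL channels (A and B) form a 2-bit Gray code, and comparing consecutive states gives the direction of each tick. The sketch below models that logic in Python for clarity; on the actual rig it runs in the Arduino's interrupt handlers.

```python
# Software model of quadrature decoding from an optical encoder's two
# TTL channels. States pack channels A/B into two bits; consecutive
# states follow Gray-code order in one direction and reverse order in
# the other.
_SEQ = [0b00, 0b01, 0b11, 0b10]  # forward Gray-code sequence

def decode(states):
    """Accumulate a signed tick count from a sequence of 2-bit A/B states."""
    position = 0
    prev = states[0]
    for state in states[1:]:
        d = (_SEQ.index(state) - _SEQ.index(prev)) % 4
        if d == 1:
            position += 1   # one step forward
        elif d == 3:
            position -= 1   # one step backward
        prev = state
    return position
```

Multiplying the tick count by the disk circumference over the encoder's ticks-per-revolution converts position to run distance, and differencing over time gives speed.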

The tutorial may be found here.


Isett, B.R., Feasel, S.H., Lane, M.A., and Feldman, D.E. (2018). Slip-Based Coding of Local Shape and Texture in Mouse S1. Neuron 97, 418–433.e5.

MAPLE: a Modular Automated Platform for Large-Scale Experiments

January 8th, 2018

The de Bivort lab and FlySorter, LLC are happy to share on OpenBehavior their open-source Drosophila handling platform, called MAPLE: Modular Automated Platform for Large-Scale Experiments.

Drosophila melanogaster has proven a valuable genetic model organism due to the species’ rapid reproduction, low maintenance, and extensive genetic documentation. However, the tedious chore of handling and manually phenotyping flies remains a limitation on data collection. MAPLE: a Modular Automated Platform for Large-Scale Experiments provides a solution to this limitation.

MAPLE is a Drosophila-handling robot that boasts a modular design, allowing the platform both to automate diverse phenotyping assays and to aid with lab chores (e.g., collecting virgin female flies). MAPLE permits a small-part manipulator, a USB digital camera, and a fly manipulator to work simultaneously over a platform of flies. Failsafe mechanisms allow users to leave MAPLE unattended without risking damage to MAPLE or the modules.

The physical platform integrates phenotyping and animal husbandry to allow end-to-end experimental protocols. MAPLE features a large, physically-open workspace for user convenience. The sides, top, and bottom are made of clear acrylic to allow optical phenotyping at all time points other than when the end-effector carriages are above the modules. Finally, the low cost and scalability allow large-scale experiments ($3500 vs hundreds of thousands for a “fly-flipping” robot).

MAPLE’s utility and versatility were demonstrated through the execution of two tasks: collection of virgin female flies, and a large-scale longitudinal measurement of fly social networks and behavior.

Links to materials:

CAD files

Control Software

Raw data and analysis scripts 

De Bivort Lab Site 


ArControl: Arduino Control Platform

January 3rd, 2018

The following behavioral platform was developed and published by Xinfeng Chen and Haohong Li, from Huazhong University of Science and Technology, Wuhan, China.


ArControl: Arduino Control Platform is a comprehensive behavioral platform developed to deliver stimuli and monitor responses. This easy-to-use, high-performance system uses an Arduino UNO board and a simple drive circuit along with a stand-alone GUI application. Experimental data are automatically recorded with the built-in data acquisition function, and the entire behavioral schedule is stored within the Arduino chip. Collectively, this makes ArControl a “genuine, real-time system with high temporal resolution”. Chen and Li have tested ArControl using a Go/No-Go task and a probabilistic switching behavior task. The results of their work show that ArControl is a reliable system for behavioral research.
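The outcome logic of a Go/No-Go task, which the behavioral schedule on the Arduino ultimately implements, can be summarized compactly. The sketch below is a generic illustration of that task structure, not code from ArControl or its schedule format.

```python
# Generic Go/No-Go outcome logic, for illustration only; ArControl's
# actual schedules are state machines stored on the Arduino itself.

def run_go_nogo_trial(is_go, licked):
    """Classify one trial.

    is_go:  whether the Go stimulus was presented this trial
    licked: whether a response occurred during the response window
    """
    if is_go:
        return "hit" if licked else "miss"
    return "false_alarm" if licked else "correct_rejection"
```

Hit and false-alarm rates over many such trials give the discrimination measures used to assess task performance.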

Source code and PCB drafts may be found here: ArControl GitHub


Pyper

November 28, 2017

Pyper is developed by The Margrie Laboratory.


Pyper provides real-time or pre-recorded motion tracking of a specimen in an open field. Pyper can send TTL pulses upon detection of the specimen within user-defined regions of interest. The software can be used through the command line or through a built-in graphical user interface. The live feed can be provided by a USB or Raspberry Pi camera.
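The ROI-triggering step amounts to a point-in-region test on each tracked frame. The sketch below is an assumption-laden illustration, not Pyper's code: the ROI is modeled as a rectangle `(x0, y0, x1, y1)` and the TTL pulse as a callback (on a Raspberry Pi this would toggle a GPIO pin).

```python
# Sketch of ROI-based triggering: fire a TTL callback whenever the
# tracked centroid falls inside a user-defined rectangular region.
# The rectangle format and callback are illustrative assumptions.

def in_roi(point, roi):
    """True if point (x, y) lies inside the rectangle (x0, y0, x1, y1)."""
    x, y = point
    x0, y0, x1, y1 = roi
    return x0 <= x <= x1 and y0 <= y <= y1

def track_frame(centroid, roi, send_ttl):
    """Process one frame: pulse if the centroid is inside the ROI."""
    if in_roi(centroid, roi):
        send_ttl()
        return True
    return False
```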

Example of Pyper tracking a mouse in an open field


Find more information here.

Manual for Pyper.

3DTracker – 3D video tracking system for animal behavior

November 8th, 2017

Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3DTracker, a 3D video tracking system for animal behavior.


3DTracker-FAB is open-source software for 3D video-based markerless behavioral analysis in laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct a full 3D image of the animals and fits skeletal models to the 3D image to estimate the animals’ 3D poses.


More information on 3D tracker may be found on the system’s website, www.3dtracker.org

Additionally, a dynamic poster on the system was presented on November 12, 2017 at the Society for Neuroscience annual meeting. Click here for more information.

Autonomous Training of a Forelimb Motor Task

November 3, 2017

Greg Silasi, from the University of Ottawa, has kindly contributed the following to OpenBehavior.


“Silasi et al. developed a low-cost system for fully autonomous training of group-housed mice on a forelimb motor task. We demonstrate the feasibility of tracking both end-point as well as kinematic performance of individual mice, each performing thousands of trials over 2.5 months. The task is run and controlled by a Raspberry Pi microcomputer, which allows for cages to be monitored remotely through an active internet connection.”

Click here to submit a piece of open-source software or hardware to OpenBehavior.

Moving Wall Box (MWB)

October 26th, 2017

Andreas Genewsky, from the Max-Planck Institute of Psychiatry, has generously shared the following regarding his Moving Wall Box task and associated apparatus.


“Typically, behavioral paradigms which aim to assess active vs. passive fear responses involve the repeated application of noxious stimuli like electric foot shocks (step-down avoidance, step-through avoidance, shuttle-box). Alternative methods to motivate the animals and ultimately induce a conflict situation which needs to be overcome often involve food and/or water deprivation.

In order to repeatedly assess fear coping strategies in an emotionally challenging situation without foot shocks, food deprivation, or water deprivation (complying with the Reduce, Refine, Replace 3R principles), we devised a novel testing strategy, henceforward called the Moving Wall Box (MWB) task. In short, during the MWB task a mouse is repeatedly forced to jump over a small ice-filled box (10 trials, 1 min inter-trial intervals, ITI) by slowly moving walls (2.3 mm/s, over 60 s), whereby the presence of the animal is automatically sensed via balances and analyzed by a microcontroller board which in turn controls the movements of the walls. The behavioral readouts are (1) the latency to reach the other compartment (high levels of behavioral inhibition lead to high latencies) and (2) the number of inter-trial shuttles per trial (low levels of behavioral inhibition lead to high numbers of shuttles during the ITI).

The MWB offers the possibility to conduct simultaneous in vivo electrophysiological recordings, which can later be aligned to the behavioral responses (escapes). Therefore, the MWB task fosters the study of activity patterns in, e.g., optogenetically identified neurons with respect to escape responses in a highly controlled setting. To our knowledge there is no other comparable behavioral paradigm available.”
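Both MWB readouts can be recovered from a per-trial trace of which balance senses the mouse at each sample. The sketch below is a hypothetical illustration of that computation; the sample format (0 = start compartment, 1 = other compartment) and sampling interval are assumptions, not the actual microcontroller implementation.

```python
# Illustrative only: derive the two MWB readouts from a trace of
# compartment occupancy samples. 0 = start side, 1 = other side;
# the trace format and sample rate are assumed for the example.

def mwb_readouts(samples, sample_dt=0.1):
    """Return (latency_s, n_crossings) for one trial trace.

    latency_s:   time of the first arrival in the other compartment
    n_crossings: number of compartment changes (shuttling)
    """
    latency = None
    crossings = 0
    prev = samples[0]
    for i, side in enumerate(samples):
        if latency is None and side == 1:
            latency = i * sample_dt   # first arrival on the other side
        if side != prev:
            crossings += 1
            prev = side
    return latency, crossings
```

High latencies indicate strong behavioral inhibition during wall movement, while frequent crossings in the inter-trial interval indicate low inhibition, matching the two readouts described above.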