Category: Recording Devices

3DTracker – 3D video tracking system for animal behavior

November 8th, 2017

Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3DTracker, a 3D video tracking system for animal behavior.


3DTracker-FAB is an open source software for 3D-video based markerless computerized behavioral analysis for laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct full 3D images of animals and fit skeletal models to the 3D image to estimate 3D pose of the animals.
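To give a sense of the reconstruction step, the sketch below merges point clouds from two depth cameras into a single world-coordinate cloud. It is a minimal Python illustration with toy data and made-up extrinsic matrices, not the actual 3DTracker-FAB implementation.

```python
import numpy as np

def to_world(points_cam, extrinsic):
    """Map an (N, 3) point cloud from camera to world coordinates."""
    homogeneous = np.hstack([points_cam, np.ones((points_cam.shape[0], 1))])
    return (homogeneous @ extrinsic.T)[:, :3]

# Toy data: two cameras, each seeing 100 random points.
rng = np.random.default_rng(0)
camera_clouds = [rng.random((100, 3)) for _ in range(2)]

# Stand-ins for real calibrated extrinsics: identity pose for camera 0,
# camera 1 shifted 0.5 m along x.
T0 = np.eye(4)
T1 = np.eye(4)
T1[0, 3] = 0.5

merged = np.vstack([to_world(c, T) for c, T in zip(camera_clouds, [T0, T1])])
print(merged.shape)  # (200, 3): one fused cloud, ready for skeleton fitting
```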

More information on 3DTracker may be found on the system’s website, www.3dtracker.org.

Additionally, a dynamic poster on the system will be presented on November 12th at the Society for Neuroscience annual meeting. Click here for more information.

OMR Arena

September 7, 2017

Researchers at the National Eye Institute and the University of Oldenburg, Germany, have developed the OMR-Arena for measuring visual acuity in mice.


The OMR-Arena is an automated measurement and stimulation system developed to determine visual thresholds in mice. It works as an optometer for freely moving animals: a video-tracking system monitors the head movement of the mouse while appropriate 360° stimuli are presented around it. The head tracker is used to adjust the stimulus to the current head position and to calculate visual acuity automatically. In addition to being open source and affordable, the device offers researchers an objective way to measure the visual performance of freely moving mice.
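As a rough illustration of why the stimulus must follow the head, the sketch below computes how long one grating cycle must be on the arena wall so that the mouse perceives a requested spatial frequency (in cycles per degree) from its current position. The geometry and names are illustrative assumptions, not the authors’ implementation.

```python
import math

def cycle_length_on_wall(spatial_freq_cpd, head_to_wall_m):
    """On-wall length (m) of one grating cycle so the mouse sees the
    requested spatial frequency (cycles/degree) from its current position."""
    period_deg = 1.0 / spatial_freq_cpd        # angular size of one cycle
    period_rad = math.radians(period_deg)
    return 2.0 * head_to_wall_m * math.tan(period_rad / 2.0)

# Example: a 0.4 cyc/deg grating viewed with the head 0.10 m from the wall.
print(f"{cycle_length_on_wall(0.4, 0.10) * 1000:.2f} mm per cycle")
```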


Kretschmer F, Kretschmer V, Kunze VP, Kretzberg J (2013) OMR-Arena: Automated Measurement and Stimulation System to Determine Mouse Visual Thresholds Based on Optomotor Responses. PLoS ONE 8(11): e78058. https://doi.org/10.1371/journal.pone.0078058

Open Source Platform for Sensory Tasks

August, 2017

Lucy Palmer and Andrew Micallef, of the Florey Institute of Neuroscience and Mental Health, University of Melbourne, Melbourne, VIC, Australia, have shared the following Arduino- and Python-based platform for Go/No-Go tasks in an article published in Frontiers in Cellular Neuroscience.


The Go/No-Go sensory task requires an animal to report a decision in response to a stimulus. In “Go” trials, the subject must respond to a target stimulus with an action, while in “No-Go” trials, the subject withholds a response. To execute this task, a behavioral platform was created that consists of three main components: 1) a water reward delivery system, 2) a lick sensor, and 3) a sensory stimulation apparatus. The water reward is administered by a gravity-fed water system controlled by a solenoid pinch valve, while licking is monitored by a custom-made piezo-based sensor. An Arduino Uno Rev3 simultaneously controls stimulus and reward delivery, and records lick frequency and timing through the piezo sensor. A Python script, employing the pyserial library, handles communication between the Arduino and a host computer.
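As a sketch of how the host side of such a platform can look, the snippet below uses pyserial to collect lick timestamps streamed by the Arduino. The port name and the line-based message format ("LICK,<ms>") are assumptions for illustration, not the authors’ actual protocol.

```python
import serial

# Port name is an assumption; on Windows it would be e.g. "COM3".
with serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=1.0) as arduino:
    lick_times_ms = []
    while True:
        line = arduino.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue                      # read timed out; keep polling
        if line.startswith("LICK,"):      # piezo sensor event from the Uno
            lick_times_ms.append(int(line.split(",")[1]))
        elif line == "TRIAL_END":
            print(f"{len(lick_times_ms)} licks this trial")
            lick_times_ms.clear()
```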


The GitHub for the project may be found here.

Free-Behavior Monitoring and Reward System for Non-human Primates

August, 2017

In Frontiers in Neuroscience, Tyler Libey and Eberhard E. Fetz share their open-source device for recording neural activity from freely behaving non-human primates in their home cages and administering rewards.


This device is designed to document bodily movement and neural activity, and to deliver rewards, to monkeys behaving freely in their home cages. It allows researchers to study behavior in freely moving non-human primates rather than relying on rigid, tightly controlled movement tasks, opening a window onto movement, reward, and the neural signals underlying both. Studying freely moving animals may offer essential insight into the neural signals associated with reward-guided movement, which in turn may guide the development of more accurate brain-machine interfaces. The behavior monitoring system combines existing untethered recording equipment, the Neurochip, with a custom hub that controls a cage-mounted feeder to deliver short-latency rewards. A depth camera provides gross movement data streams from the home cage alongside the recorded neural activity.
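The closed-loop idea, neural activity crossing a criterion triggering a short-latency reward, can be pictured with the sketch below. Both functions are hypothetical stand-ins; in the published system the Neurochip and the custom hub perform these roles.

```python
import random
import time

def read_spike_rate():
    """Stand-in for a firing-rate estimate from the untethered recorder."""
    return random.gauss(20.0, 5.0)        # spikes/s, simulated

def trigger_feeder():
    print("reward delivered")             # stand-in for the cage-mounted feeder

THRESHOLD_HZ = 30.0
REFRACTORY_S = 5.0                        # avoid rewarding the same burst twice
last_reward = 0.0

for _ in range(1000):
    if read_spike_rate() > THRESHOLD_HZ and time.time() - last_reward > REFRACTORY_S:
        trigger_feeder()
        last_reward = time.time()
    time.sleep(0.01)                      # ~10 ms polling keeps reward latency short
```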


Libey T and Fetz EE (2017) Open-Source, Low Cost, Free-Behavior Monitoring, and Reward System for Neuroscience Research in Non-human Primates. Front. Neurosci. 11:265. doi: 10.3389/fnins.2017.00265

FinchScope

May 19th, 2017 

William Liberti, of the Gardner Lab at Boston University, has shared the following with OpenBehavior regarding ‘FinchScope’. Although originally designed for finches, the 3D printed single-photon fluorescent imaging microscope has since been adapted for rodents and other avian species.


The FinchScope project aims to provide a modular in-vivo optophysiology rig for awake, freely behaving animals, with a transparent acquisition and analysis pipeline. The goal is to produce a customizable and scalable single-photon fluorescent imaging microscope system that takes advantage of developing open-source analysis platforms. These tools are built from easily procured off-the-shelf components and 3D printed parts.
We provide designs for a 3D printed, lightweight, wireless-capable microscope and motorized commutator, designed for multi-month monitoring of neural activity (via genetically encoded calcium indicators) in zebra finches while they sing their courtship songs. It has since been adapted for rodents and for other birds such as canaries.
The Github project page can be found here.

Attys

January 28, 2017 

Dr. Bernd Porr has shared the following regarding Attys, an open source bioamplifier:


Attys is an open source wearable data acquisition device with a special focus on biomedical signals such as heart activity (ECG), muscle activity (EMG), and brain activity (EEG). In contrast to many neurogadgets, the Attys transmits the data as it is being recorded, without any compression or pre-filtering and at its full precision of 24 bits, to a mobile phone, tablet, or PC. This guarantees the maximum possible openness, so that the raw data can be published alongside the processed data, as many research councils now require.
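To make the 24-bit claim concrete, the snippet below shows how signed values are recovered from raw 24-bit ADC samples in general. It is a generic illustration of two’s-complement decoding, not the Attys wire format, which its open-source software already handles.

```python
def int24(b):
    """Decode one big-endian 24-bit two's-complement sample."""
    value = int.from_bytes(b, "big")
    return value - (1 << 24) if value & (1 << 23) else value

# Two example samples: the maximum and minimum representable values.
raw = bytes([0x7F, 0xFF, 0xFF]) + bytes([0x80, 0x00, 0x00])
samples = [int24(raw[i:i + 3]) for i in range(0, len(raw), 3)]
print(samples)  # [8388607, -8388608]: the full signed 24-bit range
```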

All software for the Attys is open source which includes the firmware of the Attys.

The story of the Attys started four years ago, when Dr. Bernd Porr filmed numerous YouTube clips to educate the public about the possibilities and limits of biosignal measurement (see BPM Biosignals). The site has been very popular ever since, and visitors have been asking if a ready-made bio-amp could be made available. This was the birth of Attys.


Ultrasonic Vocalization (USV) Detector

December 21, 2016

David Barker from the National Institute on Drug Abuse Intramural Research Program has shared the following regarding a device designed to allow the automatic detection of 50-kHz ultrasonic vocalizations.


Ultrasonic vocalizations (USVs) have been utilized to infer animals’ affective states in multiple research paradigms, including animal models of drug abuse, depression, fear or anxiety disorders, and Parkinson’s disease, and in studying the neural substrates of reward processing. Currently, the analysis of USV data is performed manually and is thus time-consuming.

The present method was developed to allow automated detection of 50-kHz ultrasonic vocalizations using a template detection procedure. The detector runs in XBAT, a MATLAB extension developed by the Bioacoustics Research Program at Cornell University. The specific template detection procedure for ultrasonic vocalizations, along with a number of companion tools, was developed and tested by our laboratory. Details of the detector’s performance can be found in our published work, and a detailed readme file is published along with the MATLAB package on our GitHub.
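To convey the idea of template detection, the sketch below cross-correlates a spectrogram against a 50-kHz template on synthetic audio. It is written in Python purely for illustration; the actual detector runs in MATLAB/XBAT, and its templates come from recorded vocalizations rather than synthetic tones.

```python
import numpy as np
from scipy.signal import spectrogram, correlate2d

fs = 250_000                                   # sample rate high enough for 50 kHz
t = np.arange(0, 1.0, 1 / fs)
call = np.sin(2 * np.pi * 50_000 * t) * ((t > 0.40) & (t < 0.45))
audio = call + 0.1 * np.random.default_rng(0).standard_normal(t.size)

f, times, S = spectrogram(audio, fs=fs, nperseg=512, noverlap=256)
band = (f > 40_000) & (f < 60_000)             # restrict to the 50-kHz band

# Build a toy template from a short synthetic 50-kHz tone.
tone = np.sin(2 * np.pi * 50_000 * np.arange(0, 0.05, 1 / fs))
_, _, T = spectrogram(tone, fs=fs, nperseg=512, noverlap=256)

# Slide the template across the recording's spectrogram and score each offset.
score = correlate2d(S[band], T[band], mode="valid").ravel()
print(f"strongest match near t = {times[np.argmax(score)]:.3f} s")  # ~0.400 s
```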

Our detector was designed to be freely shared with the USV research community with the hope that all members of the community might benefit from its use. We have included instructions for getting started with XBAT, running the detector, and developing new analysis tools. We encourage users who are familiar with MATLAB to develop and share new analysis tools. To facilitate this type of collaboration, all files have been shared as part of a GitHub repository, allowing suggested changes or novel contributions to be made to the software package. I would happily integrate novel analysis tools created by others into future releases of the detector.

Work on a detector for 22-kHz vocalizations is ongoing; detecting these calls is technically more difficult because they lie nearer to audible noise. Those interested in contributing can email me at djamesbarker@gmail-dot-com or find me on twitter (@DavidBarker_PhD).


UCLA Miniscope Project

Daniel Aharoni of the Golshani, Silva, and Khakh labs at UCLA has shared the following about Miniscope:


This open source miniature fluorescence microscope uses wide-field fluorescence imaging to record neural activity in awake, freely behaving mice. The Miniscope has a mass of 3 grams and uses a single, flexible coaxial cable (0.3 mm to 1.5 mm in diameter) to carry power, control signals, and imaging data to open source data acquisition (DAQ) hardware and software. Miniscope.org provides a centralized location for sharing design files, source code, and other relevant information so that a community of users can share ideas and developments related to this important imaging technique. Our goal is to help disseminate this technology to the larger neuroscience community and build a foundation of users that will continue advancing it and contribute back to the project. While the Miniscope system described here is not an off-the-shelf commercial solution, we have focused on making it as easy as possible for the average neuroscience lab to build and modify, requiring minimal soldering and hands-on assembly.
[Video: GCaMP6f imaging in CA1 using the UCLA Miniscope]
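A minimal acquisition loop might look like the following, assuming the DAQ presents itself to the host as a standard video device; the device index here is a guess, and the official Miniscope DAQ software is the supported route.

```python
import cv2

cap = cv2.VideoCapture(1)                 # DAQ hardware as a camera source (index is a guess)
frames = []
while len(frames) < 300:                  # grab ~300 frames of activity
    ok, frame = cap.read()
    if not ok:
        break                             # no device or stream ended
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()
print(f"captured {len(frames)} frames")
```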

Automated Home-Cage Functional Imaging

Timothy Murphy and his colleagues at the University of British Columbia have developed an automated system for mesoscopic functional imaging that allows subjects to self-initiate head-fixation and imaging within the home-cage. In their 2016 paper, “High-throughput automated home-cage mesoscopic functional imaging of mouse cortex,” Dr. Murphy and his colleagues present this device and demonstrate its use with a group of calcium indicator transgenic mice. The supplementary material to this paper includes a diagram of the hardware, a graphic representation of the training cage, several videos of subjects interacting with the device, and sample imaging data. The Python source code and 3D print files can be found on Dr. Murphy’s UBC webpage.
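The self-initiated workflow can be pictured as a simple polling loop like the sketch below. Every sensor and actuator call is a hypothetical stand-in (printed messages, a random port check), not the actual Python source available from the Murphy lab.

```python
import random
import time

def mouse_at_port():
    """Hypothetical stand-in for the check that detects a mouse
    entering the head-fixation port."""
    return random.random() < 0.01

def run_session(duration_s=30.0):
    print("head-fixation engaged")                    # stand-in actuator
    print(f"mesoscopic imaging for {duration_s} s")   # stand-in camera trigger
    print("head-fixation released")

for _ in range(500):              # poll the cage; sessions are mouse-initiated
    if mouse_at_port():
        run_session()
    time.sleep(0.01)
```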

Murphy, T. H., Boyd, J. D., Bolaños, F., Vanni, M. P., Silasi, G., Haupt, D., & LeDue, J. M. (2016). High-throughput automated home-cage mesoscopic functional imaging of mouse cortex. Nature Communications, 7, 11611.

Feeding Experimentation Device (FED)

Feeding Experimentation Device (FED) is a home cage-compatible feeding system that measures food intake with high accuracy and temporal resolution. FED offers a low-cost alternative (~$350) to commercial feeders, with the convenience of use in traditional colony rack caging.

In their 2016 paper, “Feeding Experimentation Device (FED): A flexible open-source device for measuring feeding behavior,” Katrina P. Nguyen, Timothy J. O’Neal, Olurotimi A. Bolonduro, Elecia White, and Alexxai V. Kravitz validate FED’s reliable food delivery and precise measurement of feeding behavior, and demonstrate its application in one experiment examining light- and dark-cycle feeding trends and another measuring optogenetically evoked feeding.


The Kravitz Lab has shared the Arduino scripts for controlling FED, as well as the Python code used to analyze the feeding data it collects, on the KravitzLab GitHub. Additionally, build instructions and power considerations are detailed on the FED wiki page, and 3D design files are provided through TinkerCAD.
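As an example of the kind of analysis such timestamp data supports, the sketch below groups pellet-retrieval times into meals. The 60-second meal criterion and the toy timestamps are illustrative assumptions, not the Kravitz Lab’s actual analysis code.

```python
import numpy as np

pellet_times_s = np.array([10, 15, 18, 400, 405, 1000])  # toy retrieval times
MEAL_GAP_S = 60                        # gaps longer than this end a meal

gaps = np.diff(pellet_times_s)
meal_breaks = np.where(gaps > MEAL_GAP_S)[0]
meals = np.split(pellet_times_s, meal_breaks + 1)
for i, meal in enumerate(meals, 1):
    print(f"meal {i}: {meal.size} pellets starting at t = {meal[0]} s")
```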


Nguyen KP, O’Neal TJ, Bolonduro OA, White E, Kravitz AV (2016). Feeding Experimentation Device (FED): A flexible open-source device for measuring feeding behavior. J Neurosci Methods, 267:108-114.