Category: Recording Devices

ArControl: Arduino Control Platform

January 3rd, 2018

The following behavioral platform was developed and published by Xinfeng Chen and Haohong Li of Huazhong University of Science and Technology, Wuhan, China.


ArControl: Arduino Control Platform is a comprehensive behavioral platform developed to deliver stimuli and monitor responses. This easy-to-use, high-performance system uses an Arduino UNO board and a simple drive circuit along with a stand-alone GUI application. Experimental data are automatically recorded with the built-in data acquisition function, and the entire behavioral schedule is stored on the Arduino chip. Collectively, this makes ArControl a “genuine, real-time system with high temporal resolution”. Chen and Li have tested ArControl using a Go/No-Go task and a probabilistic switching behavior task. The results of their work show that ArControl is a reliable system for behavioral research.
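For readers new to Arduino-based behavioral control, the sketch below shows, in rough terms, how a PC can log timestamped event messages streamed from an Arduino over USB serial. It is only an illustration written with Python and pyserial; it is not ArControl's acquisition code, and the port name and message strings are assumptions.

import time
import serial  # pyserial

PORT = "/dev/ttyACM0"  # assumed port; on Windows this might be "COM3"

# Log event messages streamed by the Arduino, timestamped on the PC side.
with serial.Serial(PORT, baudrate=115200, timeout=1) as ser, open("session_log.csv", "w") as log:
    log.write("pc_time_s,message\n")
    start = time.time()
    end = start + 600  # log for 10 minutes, then stop
    while time.time() < end:
        line = ser.readline().decode(errors="ignore").strip()
        if line:  # e.g. "TRIAL_START" or "LICK,1234" (hypothetical messages)
            log.write(f"{time.time() - start:.4f},{line}\n")
            log.flush()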

Source code and PCB drafts may be found here: ArControl GitHub


Pyper

November 28, 2017

Pyper is developed by The Margrie Laboratory.


Pyper provides real-time or pre-recorded motion tracking of a specimen in an open field. Pyper can send TTL pulses based on detection of the specimen within user-defined regions of interest. The software can be used through the command line or through a built-in graphical user interface. The live feed can be provided by a USB or Raspberry Pi camera.
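As a rough illustration of what ROI-triggered tracking involves, the Python sketch below uses OpenCV background subtraction to locate the animal's centroid in each frame and calls a placeholder trigger when the centroid enters a region of interest. This is not Pyper's own code; the camera index, ROI coordinates, and trigger function are assumptions.

import cv2

ROI = (200, 150, 120, 120)  # assumed (x, y, width, height) region of interest, in pixels

def send_ttl():
    # Placeholder: Pyper raises a TTL line (e.g. a Raspberry Pi GPIO pin) here.
    print("TTL pulse")

cap = cv2.VideoCapture(0)  # live feed from a USB camera
bg = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Foreground mask of the moving animal, then its centroid from image moments.
    mask = bg.apply(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] > 0:
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        x, y, w, h = ROI
        if x <= cx <= x + w and y <= cy <= y + h:
            send_ttl()
cap.release()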

Example of Pyper tracking a mouse in an open field


Find more information here.

Manual for Pyper.

3DTracker – 3D video tracking system for animal behavior

November 8th, 2017

Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3DTracker, a 3D video tracking system for animal behavior.


3DTracker-FAB is open-source software for 3D video-based, markerless behavioral analysis of laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct a full 3D image of the animals and fits skeletal models to this image to estimate the animals' 3D poses.
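One core step in this kind of pipeline is merging the point clouds from several calibrated depth cameras into a single cloud in a common reference frame. The short Python/NumPy sketch below illustrates that step conceptually; it is not 3DTracker-FAB's actual code, and the extrinsics and point-cloud arrays are stand-ins.

import numpy as np

def to_world(points_cam, R, t):
    """Transform an (N, 3) point cloud from camera coordinates to world coordinates."""
    return points_cam @ R.T + t

# Hypothetical extrinsics (rotation, translation) for two calibrated depth cameras.
extrinsics = [
    (np.eye(3), np.zeros(3)),
    (np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]]), np.array([0.5, 0.0, 0.5])),
]
# Stand-in point clouds; in practice these come from the depth cameras.
clouds_cam = [np.random.rand(1000, 3), np.random.rand(1000, 3)]

merged = np.vstack([to_world(c, R, t) for c, (R, t) in zip(clouds_cam, extrinsics)])
print(merged.shape)  # (2000, 3): one combined 3D cloud, ready for skeletal model fitting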


More information on 3DTracker may be found on the system’s website, www.3dtracker.org.

Additionally, a dynamic poster on the system was presented on November 12, 2017 at the Society for Neuroscience annual meeting. Click here for more information.

OMR Arena

September 7, 2017

Researchers at the National Eye Institute and the University of Oldenburg, Germany, have developed the OMR-arena for measuring visual acuity in mice.


The OMR-arena is an automated measurement and stimulation system developed to determine visual thresholds in mice. The system serves as an optometer, characterizing the visual performance of freely moving mice. A video-tracking system monitors the head movements of the mice while appropriate 360° stimuli are presented; the head tracker is used to adjust the stimulus to the head position and to automatically calculate visual acuity. This device, in addition to being open source and affordable, offers an objective way for researchers to measure the visual performance of freely moving mice.
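Conceptually, an acuity threshold can be read out by asking, for each spatial frequency of the rotating stimulus, whether the head still follows it. The Python sketch below illustrates that logic with a simple tracking index and stand-in data; it is not the OMR-arena's published algorithm, and the criterion, spatial frequencies, and velocity traces are assumptions.

import numpy as np

def tracking_index(head_vel, stim_vel):
    """Fraction of samples in which the head rotates in the stimulus direction."""
    return np.mean(np.sign(head_vel) == np.sign(stim_vel))

spatial_freqs = np.array([0.1, 0.2, 0.3, 0.4, 0.5])  # cycles/degree (hypothetical test values)
criterion = 0.6  # assumed minimum tracking index counted as "following the stimulus"

rng = np.random.default_rng(0)
indices = []
for sf in spatial_freqs:
    stim_vel = np.full(1000, 12.0)  # deg/s: constant grating drift (stand-in trace)
    head_vel = stim_vel + rng.normal(0, 10 + 60 * sf, 1000)  # tracking degrades with sf
    indices.append(tracking_index(head_vel, stim_vel))

tracked = spatial_freqs[np.array(indices) >= criterion]
acuity = tracked.max() if tracked.size else None  # highest spatial frequency still tracked
print("estimated acuity (cycles/degree):", acuity)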


Kretschmer F, Kretschmer V, Kunze VP, Kretzberg J (2013) OMR-Arena: Automated Measurement and Stimulation System to Determine Mouse Visual Thresholds Based on Optomotor Responses. PLoS ONE 8(11): e78058. 

Open-Source Platform for Sensory Tasks

August 2017

Lucy Palmer and Andrew Micallef, of the Florey Institute of Neuroscience and Mental Health, University of Melbourne, Melbourne, VIC, Australia, have shared the following Arduino- and Python-based platform for Go/No-Go tasks in an article published in Frontiers in Cellular Neuroscience.


The Go/No-Go sensory task requires an animal to report a decision in response to a stimulus. In “Go” trials, the subject must respond to a target stimulus with an action, while in “No-Go” trials, the subject withholds a response. To execute this task, a behavioral platform was created that consists of three main components: 1) a water reward delivery system, 2) a lick sensor, and 3) a sensory stimulation apparatus. The water reward is administered by a gravity-flow water system controlled by a solenoid pinch valve, while licking is monitored by a custom-made piezo-based sensor. An Arduino Uno Rev3 simultaneously controls stimulus and reward delivery. In addition, the Arduino records lick frequency and timing through the piezo sensor. A Python script, employing the pyserial library, handles communication between the Arduino and a host computer.
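As a rough sketch of the host-side role described above, the Python snippet below uses pyserial to start trials on the Arduino and print the lick events it reports back. The port name, commands, and message format are assumptions rather than the platform's exact protocol.

import time
import serial  # pyserial

with serial.Serial("/dev/ttyACM0", baudrate=9600, timeout=1) as arduino:
    time.sleep(2)  # allow the Arduino to reset after the port is opened
    for trial in range(10):
        trial_type = "GO" if trial % 2 == 0 else "NOGO"
        arduino.write(f"START,{trial_type}\n".encode())  # hypothetical trial command
        window_end = time.time() + 3.0  # 3 s response window
        while time.time() < window_end:
            msg = arduino.readline().decode(errors="ignore").strip()
            if msg:  # e.g. "LICK,<time_ms>" from the piezo sensor (hypothetical format)
                print(trial, trial_type, msg)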


The GitHub for the project may be found here.

Free-Behavior Monitoring and Reward System for Non-human Primates

August 23, 2017

In Frontiers in Neuroscience, Tyler Libey and Eberhard E. Fetz share their open-source device for recording neural activity from freely behaving non-human primates in their home cages and for administering rewards.


This device is designed to document bodily movement and neural activity and to deliver rewards to monkeys behaving freely in their home cages. It allows researchers to explore behaviors in freely moving non-human primates rather than relying solely on rigid, tightly controlled movements, which lends itself to a deeper understanding of movement, reward, and the neural signals involved in these behaviors. Studying freely moving animals may offer essential insight into the neural signals associated with reward-guided movement, which in turn may guide the development of more accurate brain-machine interfaces. The behavior monitoring system incorporates existing untethered recording equipment (the Neurochip) and a custom hub that controls a cage-mounted feeder to deliver short-latency rewards. A depth camera provides gross movement data streams from the home cage in addition to the neural activity that is recorded.
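As an illustration of how a gross movement signal can be derived from a depth stream, the Python sketch below computes a simple frame-differencing "movement energy" measure on stand-in depth frames. It is not the authors' code; the frame source, resolution, and noise floor are assumptions.

import numpy as np

def movement_energy(prev_frame, frame, noise_floor=10):
    """Sum of absolute depth changes, ignoring differences below a small noise floor."""
    diff = np.abs(frame.astype(np.int32) - prev_frame.astype(np.int32))
    diff[diff < noise_floor] = 0
    return int(diff.sum())

# Stand-ins for two consecutive 16-bit depth frames from a cage-mounted camera.
rng = np.random.default_rng(1)
frame_a = rng.integers(500, 4000, size=(240, 320), dtype=np.uint16)
frame_b = frame_a.copy()
frame_b[100:140, 150:200] += 300  # simulated movement of the animal

print("movement energy:", movement_energy(frame_a, frame_b))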


Libey T and Fetz EE (2017) Open-Source, Low Cost, Free-Behavior Monitoring, and Reward System for Neuroscience Research in Non-human Primates. Front. Neurosci. 11:265. doi: 10.3389/fnins.2017.00265

FinchScope

May 19th, 2017 

William Liberti, from the Gardner Lab at Boston University, has shared the following with OpenBehavior regarding ‘FinchScope’. Although originally designed for finches, the 3D-printed single-photon fluorescence imaging microscope has since been adapted for rodents and other avian species.


The FinchScope project aims to provide a modular in vivo optophysiology rig for awake, freely behaving animals, with a transparent acquisition and analysis pipeline. The goal is to produce a customizable and scalable single-photon fluorescence imaging microscope system that takes advantage of developing open-source analysis platforms. These tools are built from easily procured off-the-shelf components and 3D-printed parts.
We provide designs for a 3D-printed, lightweight, wireless-capable microscope and motorized commutator, designed for multi-month monitoring of neural activity (via genetically encoded calcium indicators) in zebra finches while they sing their courtship songs. The system has since been adapted for rodents and for other birds such as canaries.

The GitHub project page can be found here.

Attys

January 28, 2017 

Dr. Bernd Porr has also shared the following open-source bioamplifier:


“Attys is an open source wearable data acquisition device with a special focus on biomedical signals such as heart activity (ECG), muscle activity (EMG) and brain activity (EEG). In contrast to many neurogadgets, the Attys transmits the data as it’s being recorded without any compression or pre-filtering, and at its full precision of 24 bits, to a mobile phone, tablet or PC. This guarantees maximum possible openness so that the raw data can be published alongside the processed data, as required now by many research councils.

All software for the Attys is open source which includes the firmware of the Attys.

The story of the Attys started four years ago, when Dr. Bernd Porr filmed numerous YouTube clips to educate the public about the possibilities and limits of biosignal measurement (see BPM Biosignals). The site has been very popular ever since and visitors have been asking if a ready made bio-amp could be made available. This was the birth of Attys.”

Find more about Attys here. 

Ultrasonic Vocalization (USV) Detector

December 21, 2016

David Barker from the National Institute on Drug Abuse Intramural Research Program has shared the following regarding the development of a detector designed to allow the automatic detection of 50-kHz ultrasonic vocalizations.


Ultrasonic vocalizations (USVs) have been utilized to infer animals’ affective states in multiple research paradigms, including animal models of drug abuse, depression, fear or anxiety disorders, and Parkinson’s disease, and in studying the neural substrates of reward processing. Currently, the analysis of USV data is performed manually and is thus time-consuming.

The present method was developed to allow automated detection of 50-kHz ultrasonic vocalizations using a template detection procedure. The detector runs in XBAT, a MATLAB extension developed by the Bioacoustics Research Program at Cornell University. The specific template detection procedure for ultrasonic vocalizations, along with a number of companion tools, was developed and tested by our laboratory. Details related to the detector’s performance can be found in our published work, and a detailed readme file is published along with the MATLAB package on our GitHub.

Our detector was designed to be freely shared with the USV research community with the hope that all members of the community might benefit from its use. We have included instructions for getting started with XBAT, running the detector, and developing new analysis tools. We encourage users who are familiar with MATLAB to develop and share new analysis tools. To facilitate this type of collaboration, all files have been shared as part of a GitHub repository, allowing suggested changes or novel contributions to be made to the software package. I would happily integrate novel analysis tools created by others into future releases of the detector.

Work on a detector for 22-kHz vocalizations is ongoing; detecting 22-kHz vocalizations, which are nearer to audible noise, poses greater technical challenges. Those interested in contributing can email me at djamesbarker@gmail-dot-com or find me on Twitter (@DavidBarker_PhD).
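For readers who do not use MATLAB/XBAT, the Python sketch below illustrates the general template-matching idea behind this kind of detector: compute a spectrogram, cross-correlate the 50-kHz band with a call template, and flag peaks above a threshold. It is a conceptual illustration only, not the published detector; the sampling rate, template, and threshold are assumptions.

import numpy as np
from scipy.signal import spectrogram, correlate2d

fs = 250_000  # Hz: assumed sampling rate, high enough to capture 50-kHz calls
rng = np.random.default_rng(2)
audio = rng.normal(0, 0.01, fs)  # stand-in recording (1 s of noise)

# Spectrogram, restricted to the band around 50 kHz where these calls occur.
f, t, S = spectrogram(audio, fs=fs, nperseg=512, noverlap=256)
band = (f > 35_000) & (f < 75_000)
S_band = S[band]

# Cross-correlate with a call template (here a stand-in; in practice an averaged call).
template = np.ones((5, 8))
corr = correlate2d(S_band, template, mode="valid")
score = corr.max(axis=0)  # best template match at each time bin
threshold = score.mean() + 4 * score.std()  # assumed detection threshold
detections = t[: score.size][score > threshold]
print("candidate call times (s):", detections)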


The GitHub can be found here.

UCLA Miniscope Project

October 7, 2016

Daniel Aharoni of the Golshani, Silva, and Khakh labs at UCLA has shared the following about the Miniscope:


This open-source miniature fluorescence microscope uses wide-field fluorescence imaging to record neural activity in awake, freely behaving mice. The Miniscope has a mass of 3 grams and uses a single, flexible coaxial cable (0.3 mm to 1.5 mm diameter) to carry power, control signals, and imaging data to open-source data acquisition (DAQ) hardware and software. Our goal is to help disseminate this technology to the larger neuroscience community and build a foundation of users that will continue advancing this technology and contribute back to the project. While the Miniscope system described here is not an off-the-shelf commercial solution, we have focused on making it as easy as possible for the average neuroscience lab to build and modify, requiring minimal soldering and hands-on assembly.


Miniscope.org provides a centralized location for sharing design files, source code, and other relevant information so that a community of users can share ideas and developments related to this important imaging technique.
Video demonstrating GCaMP6f imaging in CA1 using the UCLA Miniscope