Category: Behavioral Apparatus

Ratcave

AUGUST 29, 2019

Nicholas A. Del Grosso and Anton Sirota at the Bernstein Centre for Computational Neuroscience recently published their new project called Ratcave, a Python 3D graphics library that allows researchers to create and present 3D stimuli in their experiments:


Neuroscience experiments often require software to present stimuli to a subject and record the subject's responses. Many current libraries lack the 3D graphics support necessary for psychophysics experiments. While Python and other programming languages have 3D graphics libraries, these are hard to integrate into psychophysics libraries without modification. To make 3D graphics programming fit the existing ecosystem of Python stimulus software, the authors developed Ratcave.

Ratcave is an open-source, cross-platform Python library that adds 3D stimulus support to OpenGL-based 2D Python stimulus libraries, including VisionEgg, PsychoPy, Pyglet, and Pygame. Ratcave comes with resources including basic 3D object primitives and a wide range of 3D lighting effects. Ratcave’s intuitive object-oriented interface allows all objects, including meshes, lights, and cameras, to be repositioned, rotated, and scaled. Objects can also be parented to one another to specify complex relationships between objects. By sending the data as a single array using OpenGL’s VAO (Vertex Array Object) functionality, the drawing process becomes much more efficient; this approach allows over 30,000 vertices to be rendered at a performance level surpassing the needs of most behavioral research studies.
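The object-parenting idea described above can be illustrated with a minimal scene-graph sketch in plain Python. This is a conceptual sketch only, not Ratcave's actual API; the `Node` class and its fields are hypothetical:

```python
class Node:
    """Minimal scene-graph node: position is stored relative to the parent."""
    def __init__(self, name, position=(0.0, 0.0, 0.0)):
        self.name = name
        self.position = position   # local (parent-relative) position
        self.parent = None

    def world_position(self):
        """Accumulate positions up the parent chain to get absolute coordinates."""
        x, y, z = self.position
        if self.parent is not None:
            px, py, pz = self.parent.world_position()
            x, y, z = x + px, y + py, z + pz
        return (x, y, z)

# A light parented to a mesh moves with it, as in Ratcave's object hierarchy.
mesh = Node("mesh", position=(1.0, 0.0, -3.0))
light = Node("light", position=(0.0, 2.0, 0.0))
light.parent = mesh

print(light.world_position())  # (1.0, 2.0, -3.0)
```

Repositioning the mesh automatically carries the parented light with it, which is the convenience the parenting interface provides.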

An advantage of Ratcave is that it lets researchers continue using their preferred libraries: because Ratcave supplements existing Python stimulus libraries, adding 3D stimuli to current experiments is straightforward. The manuscript also reports that Ratcave has been tested and implemented in others’ research, demonstrating reproducibility across labs and experiments.

Details on the hardware and software can be found at https://github.com/ratcave/ratcave.

Documentation for Ratcave can also be found at https://ratcave.readthedocs.org.


SpikeGadgets

AUGUST 22, 2019

We’d like to highlight groups and companies that bring an open-source framework to their software and/or hardware in behavioral neuroscience. One of these groups is SpikeGadgets, a company co-founded by Mattias Karlsson and Magnus Karlsson.


SpikeGadgets is a group of electrophysiologists and engineers working to develop neuroscience hardware and software tools. Their open-source software, Trodes, is a cross-platform suite for neuroscience data acquisition and experimental control, made up of modules that communicate with a centralized GUI to visualize and save electrophysiological data. Trodes includes a camera module and a StateScript module; StateScript is a state-based scripting language that can be used to program behavioral tasks using lights, levers, beam breaks, lasers, stimulation sources, audio, solenoids, and more. The camera module can acquire video synchronized to neural recordings, track the animal’s position in real time or during playback after the experiment, and work with either USB webcams or GigE cameras.
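StateScript itself is SpikeGadgets' own language, but the state-based style of task it programs can be sketched in Python. This is a conceptual sketch, not StateScript syntax; the event names, the 5 s response window, and the trial structure are illustrative assumptions:

```python
# Sketch of a state-based trial: a beam break triggers a light cue,
# and a lever press within the cue window triggers reward delivery.
def run_trial(events):
    """events: ordered list of (time_ms, name) tuples from the rig."""
    state = "WAIT_BEAM"
    log = []
    deadline = None
    for t, name in events:
        if state == "WAIT_BEAM" and name == "beam_break":
            log.append((t, "light_on"))          # cue the animal
            state = "WAIT_PRESS"
            deadline = t + 5000                  # 5 s response window
        elif state == "WAIT_PRESS":
            if name == "lever_press" and t <= deadline:
                log.append((t, "solenoid_open"))  # deliver reward
                state = "DONE"
            elif t > deadline:
                state = "DONE"                    # window elapsed, no reward
    return log

print(run_trial([(100, "beam_break"), (1200, "lever_press")]))
# [(100, 'light_on'), (1200, 'solenoid_open')]
```

On real hardware the state transitions would be driven by digital I/O callbacks rather than a pre-recorded event list, but the state-machine structure is the same.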

Paired with the Trodes software and StateScript language is SpikeGadgets hardware, which can be purchased on their website. The hardware is used for data acquisition (the Main Control Unit, for electrophysiology) and behavioral control (the Environmental Control Unit). SpikeGadgets also provides MATLAB and Python toolboxes on their site for analyzing both behavioral and electrophysiological data. Trodes runs on Windows, Linux, or Mac, and there are step-by-step instructions for installing and using Trodes on the group’s Bitbucket page.

SpikeGadgets’ mission is “to develop the most advanced neuroscience tools on the market, while preserving ease of use and science-driven customization.”

 


For more information on SpikeGadgets or to download or purchase their software or hardware, check out their website here.

There is additional documentation on their BitBucket Wiki, with a user manual, instructions for installation, and FAQ.

Check out their entire list of collaborators, contributors, and developers here.

RAD

August 1, 2019

In their recent eNeuro article, Bridget Matikainen-Ankney and colleagues from the Kravitz Lab developed and shared their device, the rodent activity detector (RAD), a low-cost system that can track and record activity in rodent home cages.


Physical activity is a measure used in many research studies and an important determinant of human health. Current methods for measuring physical activity in laboratory rodents have limitations, including high expense, specialized caging and equipment, and high computational overhead. To address these limitations, Matikainen-Ankney et al. designed an open-source and cost-effective device for measuring rodent behavior.

In their manuscript, they describe the design and implementation of RAD. The system allows for high-throughput installation, minimal investigator intervention, and circadian monitoring. The design includes a battery-powered passive infrared (PIR) sensor, a microcontroller, a microSD card logger, and an OLED screen for displaying data. All of the instructions for building and programming RAD, including the Arduino code, are provided on the project’s website.

The system records the number of active PIR bouts and the total duration the PIR is active each minute. The authors report that RAD is most useful for quantifying changes across minutes rather than on a second-to-second timescale, so the default logging frequency is set to one minute. The resulting CSV files can be viewed and the data visualized using the provided Python scripts. Validation against video monitoring showed that PIR data correlated strongly with movement speed and that the device captured place-to-place locomotion but not slow or in-place movements. To verify the device’s utility, RAD was used to collect physical activity data from 40 animals over 10 weeks; it detected high-fat-diet (HFD)-induced changes in activity and quantified individual animals’ circadian rhythms. Major advantages of this tool are that the PIR sensor is not triggered by activity in neighboring cages, that it can detect and quantify within-mouse activity changes over time, and that little investigator intervention is needed beyond infrequent battery replacement. Although the design was optimized for the lab’s specific caging, the open-source nature of the project makes it easy to modify.
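The per-minute summary RAD logs can be sketched in Python. This is a conceptual sketch of the bout-counting logic only (the actual device runs Arduino code), and the one-sample-per-second assumption is illustrative:

```python
def summarize_minute(samples):
    """samples: PIR readings (0/1) taken once per second over one minute.
    Returns (n_bouts, active_seconds), mirroring RAD's two per-minute measures."""
    n_bouts = 0
    active_seconds = 0
    prev = 0
    for s in samples:
        if s:
            active_seconds += 1
            if not prev:
                n_bouts += 1   # a 0 -> 1 transition starts a new active bout
        prev = s
    return n_bouts, active_seconds

# Two bouts in this short example window: samples 1-2 and samples 4-6.
print(summarize_minute([0, 1, 1, 0, 1, 1, 1, 0]))  # (2, 5)
```

Writing one such (bouts, duration) pair per minute to the microSD card yields the compact CSV files the provided scripts visualize.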

More details on RAD can be found in their eNeuro manuscript here, and all documentation can also be found on the project’s Hackaday.io page.


Matikainen-Ankney, B. A., Garmendia-Cedillos, M., Ali, M., Krynitsky, J., Salem, G., Miyazaki, N. L., … Kravitz, A. V. (2019). Rodent Activity Detector (RAD), an Open Source Device for Measuring Activity in Rodent Home Cages. eNeuro, 6(4). https://doi.org/10.1523/ENEURO.0160-19.2019

Teensy-based Interface

July 3, 2019

Michael Romano and colleagues from the Han Lab at Boston University recently published their project using a Teensy microcontroller to control an sCMOS camera with high temporal precision during behavioral experiments:


Teensy microcontrollers are becoming increasingly popular and widespread in the neuroscience community. One benefit of using a Teensy is its ease of programming for those with little programming experience, since it uses the Arduino/C++ language. Another benefit is that it can receive and send time-precise signals. Romano et al. developed a flexible Teensy 3.2-based interface for data acquisition and delivery of analog and digital signals in a rodent locomotion-tracking experiment and in a trace eyeblink conditioning experiment. The group also shows how the interface can be paired with optical calcium imaging. The setup integrates an sCMOS camera with behavioral experiments, and the interface is user-friendly.

The Teensy interface ensures that the data are temporally precise, and it can deliver digital signals with microsecond precision to trigger image capture on a paired sCMOS camera. Calcium imaging can be performed during the eyeblink conditioning experiment; this was done through pulses sent from the Teensy to the camera to capture calcium activity in the hippocampus at 20 Hz. Additionally, the group shows that the interface can generate analog sound waveforms to drive speakers for the eyeblink experiment. The study shows how an inexpensive piece of lab equipment, like a simple Teensy microcontroller, can drive multiple aspects of a neuroscience experiment, and it provides inspiration for future experiments that use microcontrollers to control behavior.
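The analog sound-waveform generation mentioned above can be sketched by computing the sample values a DAC update loop would output. This is a conceptual sketch in Python (the actual interface runs Arduino/C++ on the Teensy), and the 5 kHz tone and 40 kHz update rate are illustrative assumptions, not the paper's parameters:

```python
import math

def tone_samples(freq_hz, duration_s, sample_rate_hz, amplitude=1.0):
    """Compute sine-wave samples for a pure tone, as a DAC update loop would."""
    n = int(round(duration_s * sample_rate_hz))
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
            for i in range(n)]

# One cycle of a 5 kHz tone at a 40 kHz update rate spans 8 samples.
cycle = tone_samples(5000, 1 / 5000, 40000)
print(len(cycle))  # 8
```

On the microcontroller, the same arithmetic runs inside a timer interrupt that writes each sample to the DAC at a fixed, microsecond-precise interval.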

 

For more details on the project, check out the project’s GitHub here.

 

Romano, M., Bucklin, M., Gritton, H., Mehrotra, D., Kessel, R., & Han, X. (2019). A Teensy microcontroller-based interface for optical imaging camera control during behavioral experiments. Journal of Neuroscience Methods, 320, 107-115.

 

AutonoMouse

May 10, 2019

In a recently published article (Erskine et al., 2019), the Schaefer lab at the Francis Crick Institute introduced their new open-source project called AutonoMouse.


AutonoMouse is a fully automated, high-throughput system for self-initiated conditioning and behavior tracking in mice. Many aspects of behavior can be analyzed by having rodents perform operant conditioning tasks. In operant experiments, however, many variables can alter or confound results: experimenter presence, picking up and handling animals, altered physiological states through water restriction, and the fact that rodents often need to be individually housed to track individual performance. This was the main motivation for the authors to completely automate operant conditioning. AutonoMouse is a fully automated system that can track large numbers (over 25) of socially housed mice via RFID chips implanted in each animal. With the RFID trackers and other analyses, the behavior of mice can be followed as they train and are subsequently tested on (or self-initiate testing in) an odor discrimination task over months, with thousands of trials performed every day. The novelty of this study is the fully automated nature of the entire system (training, experiments, water delivery, and weighing of the animals are all automated) and the ability to keep mice socially housed 24/7 while still training them and tracking their performance in an olfactory operant conditioning task. The modular setup makes it possible to use AutonoMouse to study other sensory modalities, such as vision, or decision-making tasks. The authors provide a components list, layouts, construction drawings, and step-by-step instructions for the construction and use of AutonoMouse in their publication and on the project’s GitHub.
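The RFID-based per-animal bookkeeping at the heart of such a system can be sketched in Python. This is a conceptual sketch only, not the authors' control software; the tag IDs and data layout are made up:

```python
# Sketch of RFID-gated trial logging: each self-initiated trial is attributed
# to the animal whose implanted RFID tag was read at the port.
performance = {}

def on_rfid_trial(tag_id, correct):
    """Record one self-initiated trial outcome for the detected animal."""
    trials = performance.setdefault(tag_id, {"n": 0, "correct": 0})
    trials["n"] += 1
    trials["correct"] += int(correct)

on_rfid_trial("A1B2", True)
on_rfid_trial("A1B2", False)
on_rfid_trial("C3D4", True)
print(performance["A1B2"])  # {'n': 2, 'correct': 1}
```

Because every trial is keyed to a tag read, individual performance can be tracked even though the mice live together in one group-housed cage.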


For more details, check out this YouTube clip interview with Andreas Schaefer, PI on the project.

 

The GitHub repository for the project’s control software is located at https://github.com/RoboDoig/autonomouse-control, and the project’s design and hardware instructions are at https://github.com/RoboDoig/autonomouse-design. The schedule generation program is located at https://github.com/RoboDoig/schedule-generator.


Actifield

March 21, 2019

Victor Wumbor-Apin Kumbol and colleagues have developed and shared Actifield, an automated open-source actimeter for rodents, in a recent HardwareX publication.


Measuring locomotor activity can be a useful readout for understanding the effects of a number of experimental manipulations in neuroscience research. Commercially available locomotor activity recording devices can be cost-prohibitive and often lack the ability to be customized to fit a specific lab’s needs. Kumbol et al. offer an open-source alternative that uses infrared motion detection and an Arduino to record activity in a variety of chamber setups. A full list of build materials, links to 3D-print and laser-cut files, and assembly instructions are available in their publication.

Read more from HardwareX!


Dual-port Lick Detector

January 16, 2019

In the Journal of Neurophysiology, Brice Williams and colleagues have shared their design for a novel dual-port lick detector. This device can be used for both real-time measurement and manipulation of licking behavior in head-fixed mice.


Measuring licking behavior in mice provides a valuable metric of sensory-motor processing and can be nicely paired with simultaneous neural recordings. Williams and colleagues have developed their own device for precisely measuring licking behavior as well as for manipulating it in real time. To address the limitations of many available lick sensors, the authors designed their device to be smaller (appropriate for mice), contactless (to diminish electrical artifacts in neural recordings), and precise to a submillisecond timescale. The dual-port detector can detect directional licking behavior during sensory tasks and can be used in combination with neural recording. Further, given its submillisecond precision, it can be used in a closed-loop system to perturb licking behavior via neural inhibition. Overall, this dual-port lick detector is a cost-effective, replicable solution that can be used in a variety of applications.
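The directional event classification such a detector performs can be sketched in Python. This is a conceptual sketch of the software side only, not the authors' firmware; the 20 ms debounce interval and port names are illustrative assumptions:

```python
def classify_licks(events, min_interval_ms=20.0):
    """events: (timestamp_ms, port) tuples from the left/right sensors.
    Collapses rapid duplicate detections and tallies licks per port."""
    counts = {"left": 0, "right": 0}
    last = {"left": float("-inf"), "right": float("-inf")}
    for t, port in events:
        if t - last[port] >= min_interval_ms:   # debounce each port separately
            counts[port] += 1
        last[port] = t
    return counts

events = [(0.0, "left"), (5.5, "left"), (130.2, "left"), (260.8, "right")]
print(classify_licks(events))  # {'left': 2, 'right': 1}
```

In a closed-loop application, the same per-event logic would run as each timestamped detection arrives, so a perturbation can be triggered within a millisecond of the lick.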

Learn how to build your own here!

And be sure to check out their Github.


ELOPTA: a novel microcontroller-based operant device

December 19, 2018

In 2007, Adam Hoffman and colleagues shared their design for an Electric Operant Testing Apparatus (ELOPTA) in Behavior Research Methods.


Operant behavior is commonly studied in behavioral neuroscience; therefore, there is a need for devices to train animals and collect data in operant procedures. Commercially available systems often require training to program and use, and they can be expensive. Hoffman and colleagues developed a system that can automatically control operant procedures and record behavioral outputs. The system is intended to be easy to use because it is easily programmable, portable, and durable.

Read more here!


Hoffman, A. M., Song, J., & Tuttle, E. M. (2007). Behavior Research Methods, 39, 776. https://doi.org/10.3758/BF03192968

TRIO Platform

December 12, 2018

Vladislav Voziyanov and colleagues have developed and shared the TRIO Platform, a low-profile in vivo imaging support and restraint system for mice.


In vivo optical imaging methods are common tools for understanding neural function in mice. The technique is often performed in head-fixed, anesthetized animals, which requires monitoring anesthesia level and body temperature while stabilizing the head. Fitting all of the components necessary for these experiments on a standard microscope stage can be rather difficult. Voziyanov and colleagues have shared their design for the TRIO (Three-In-One) Platform, a compact system that provides sturdy head fixation, a gas anesthesia mask, and a warm-water bed. While the design is compact enough to work with a variety of microscope stages, the use of 3D-printed components makes it customizable.


Read more about the TRIO Platform in Frontiers in Neuroscience!

The design files and list of commercially available build components are provided here.


PsiBox: Automated Operant Conditioning in the Mouse Home Cage

November 30, 2018

Nikolas Francis and Patrick Kanold of the University of Maryland share their design for Psibox, a platform for automated operant conditioning in the mouse home cage, in Frontiers in Neural Circuits.


The ability to collect behavioral data from large populations of subjects is advantageous for advancing behavioral neuroscience research. However, few cost-effective options are available for collecting large amounts of data, especially for operant behaviors. Francis and Kanold have developed and shared Psibox, an automated operant conditioning system. It incorporates three modules, for central control, water delivery, and the home cage interface, all of which can be customized with different parts. The system was validated by training mice in a positive-reinforcement auditory task and can be adapted to other tasks as well. The full, low-cost system allows groups of mice to be trained quickly in an operant task with little day-to-day experimenter involvement.

Learn how to set up your own Psibox system here!


Francis, N. A., & Kanold, P. O. (2017). Automated operant conditioning in the mouse home cage. Frontiers in Neural Circuits.