Home » rat

Tag: rat

An Open Source Automated Bar Test for Measuring Catalepsy in Rats

August 6, 2020

Researchers at the University of Guelph have created a low-cost automated apparatus for measuring catalepsy that increases measurement accuracy and reduces observer bias.


Catalepsy is a measure of muscular rigidity that can result from several factors, including Parkinson’s disease or pharmacological exposure to antipsychotics or cannabis. Catalepsy bar tests are widely used to measure this rigidity. The test consists of placing the arms of a rodent on a horizontal bar raised off the ground and measuring the time it takes for the subject to remove itself from this imposed posture. Traditionally, this has been measured by an experimenter with a stopwatch, or with prohibitively expensive commercial apparatus that have issues of their own. The automated bar test described here uses a 3D printed base and an Arduino-operated design to keep the device simple and affordable. It sets itself apart by using extremely low-cost beam break sensors that avoid the pitfalls of the traditional “complete the circuit” approach, where changes in the rat’s grip can result in false measurements. The beam break sensors determine whether the rat is on the bar, and the device automatically measures the time the rat takes to remove itself and stores it on an SD card for later retrieval. The device has been validated in rats; however, the bar height is adjustable, so there is no reason it cannot be used with other rodents as well. This bar test thus makes catalepsy measurement easy and accurate, and limits experimenter bias from manual measurements.
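The core timing logic is simple enough to sketch in a few lines. Below is a hedged Python sketch of the measurement loop (the actual device implements this on an Arduino; the function name and sample format here are hypothetical):

```python
def catalepsy_latency(samples):
    """Given (time_s, on_bar) readings from a beam break sensor,
    return seconds from the first on-bar reading until the rat
    removes itself from the bar, or None if it never does."""
    start = None
    for t, on_bar in samples:
        if on_bar and start is None:
            start = t  # rat placed on the bar
        elif start is not None and not on_bar:
            return t - start  # rat stepped down
    return None
```

On the device itself, the equivalent Arduino code would poll the sensor and write the elapsed time to the SD card.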

Learn more about this project in the recent paper!

Or check out the hackaday project page


Luciani, K. R., Frie, J. A., & Khokhar, J. Y. (2020). An Open Source Automated Bar Test for Measuring Catalepsy in Rats. ENeuro, 7(3). https://doi.org/10.1523/ENEURO.0488-19.2020

CerebraLux

June 25, 2020

This week we want to shed some light on a project from Robel Dagnew and colleagues from UCLA called CerebraLux, a wireless system for optogenetic stimulation.


Optogenetic methods have been a crucial tool for understanding the role that certain neural cell populations play in modulating or maintaining a variety of behaviors. This tool requires a light source to be passed through a fiber optic probe, and in many experimental setups this is achieved by attaching the light source to the probe through a long fiber optic cable. This cable can impose limitations on experiments where animals behave freely in behavior chambers or mazes. One obvious solution is to deliver light via a wireless controller communicating with a head-mounted light source, but existing systems can be cost-prohibitive or require access to specialized manufacturing equipment to build in a lab. To address the need for a low-cost wireless optogenetic probe, Dagnew and colleagues developed CerebraLux, which is built from off-the-shelf and accessible custom parts. The device consists of two major components: the optic component, which features a milled baseplate capable of holding and connecting an optic fiber and LED (part of the electronic portion); and the electronic component, which features a custom-printed circuit board (PCB), lithium battery, IR receiver, LED, and magnets to align and connect the two components. The device is controlled via a custom GUI (built with the TkInter Python 2.7 library) that sends pulses to the device via an Arduino Uno. More details about the build of these components and the process for communicating with the device via the GUI are available in Dagnew et al. The CerebraLux design and operations manual, which includes the 3D design files for the milled parts, the print design for the PCB, and code for communicating with the device, is available in the appendix of the paper, while the code for the GUI is available from the Walwyn Lab website. Be sure to check out the paper for information about how they validated the device in vivo.
The cost of all the component parts (as of 2017) comes in just under $200, proving to be a cost-effective solution for labs seeking a wireless optogenetic probe.
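For a sense of the kind of pulse-train parameters such a GUI might send, here is a minimal Python sketch (not the authors' code; the function and its parameters are hypothetical) that converts a stimulation frequency and pulse width into on/off timings:

```python
def pulse_schedule(freq_hz, pulse_ms, duration_s):
    """Return (on_ms, off_ms, n_pulses) for a square pulse train
    of the given frequency, pulse width, and total duration."""
    period_ms = 1000.0 / freq_hz
    if pulse_ms >= period_ms:
        raise ValueError("pulse width exceeds the stimulation period")
    n_pulses = int(duration_s * freq_hz)
    return pulse_ms, period_ms - pulse_ms, n_pulses
```

In a real setup these values would be serialized and sent over the Arduino's serial port to drive the IR transmitter.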

Read more about CerebraLux here!


Dagnew, R., Lin, Y., Agatep, J., Cheng, M., Jann, A., Quach, V., . . . Walwyn, W. (2017). CerebraLux: A low-cost, open-source, wireless probe for optogenetic stimulation. Neurophotonics, 4(04), 1. doi:10.1117/1.nph.4.4.045001

Rodent Arena Tracker (RAT)

June 18, 2020

Jonathan Krynitsky and colleagues from the Kravitz lab at Washington University have constructed and shared RAT, a closed loop system for machine vision rodent tracking and task control.


The Rodent Arena Tracker, or RAT, is a low-cost wireless position tracker for automatically tracking mice in high-contrast arenas. The device can use subject position information to control other devices in real time, allowing closed-loop control of various tasks based on positional behavior data. The device is based on the OpenMV Cam M7 (openmv.io), an open-source machine vision camera equipped with onboard processing for real-time analysis, which reduces data storage requirements and removes the need for an external computer. The authors optimized the control code for tracking mice and created a custom circuit board that runs the device off a battery and includes a real-time clock for synchronization, a BNC input/output port, and a push button for starting the device. The build instructions for RAT, as well as validation data highlighting the effectiveness and potential uses of the device, are available in their recent publication. Further, all the design files, such as the PCB design, 3D printer files, Python code, etc., are available on hackaday.io.
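To illustrate the tracking idea, here is a minimal Python sketch (not the OpenMV MicroPython code the authors use; the names are hypothetical) of dark-pixel centroid tracking in a high-contrast arena, plus a zone test of the kind that could gate a closed-loop output such as the BNC port:

```python
def track_mouse(frame, threshold=60):
    """Centroid of dark (mouse) pixels in a grayscale frame,
    given as a list of rows of 0-255 intensity values."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v < threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no dark blob found
    return sum(xs) / len(xs), sum(ys) / len(ys)

def in_zone(pos, zone):
    """True if position (x, y) lies inside zone = (x0, y0, x1, y1);
    this boolean could trigger a TTL pulse in a closed-loop task."""
    if pos is None:
        return False
    x, y = pos
    x0, y0, x1, y1 = zone
    return x0 <= x <= x1 and y0 <= y <= y1
```

The on-camera implementation works on the live sensor stream instead of stored frames, which is what removes the need for an external computer.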

Read the full article here!

Or check out the project on hackaday.io!


Krynitsky, J., Legaria, A. A., Pai, J. J., Garmendia-Cedillos, M., Salem, G., Pohida, T., & Kravitz, A. V. (2020). Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System. eNeuro, 7(3). doi:10.1523/eneuro.0485-19.2020

 

ACRoBaT

May 14, 2020

David A. Bjånes and Chet T. Moritz from the University of Washington in Seattle have developed and published their device for training rats to perform a modified center out task.


As neuroscience tools for studying rodent brains have improved in the 21st century, researchers have started to utilize increasingly complex tasks to study behavior, sometimes adapting tasks commonly used with primates. One such task used for studying motor behavior, the center-out reaching task, has been modified for use in rodents. Bjånes and Moritz have further contributed to the adaptation of this task by creating ACRoBaT, the Automated Center-out Rodent Behavioral Trainer. The device features two custom-printed PCBs, a 3D printed housing unit, an Arduino microcontroller, and other commercially available parts, and can be mounted outside a behavioral arena. It also provides a fully automated algorithm that trains rats based on behavioral feedback fed into the device through various sensors. The authors show the effectiveness of the device with data from 18 rats across different conditions, used to find the optimal training procedure for the task. Information on how to build the device is available in their publication, as well as on GitHub.
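As an illustration of feedback-driven automated training, here is a hedged Python sketch (hypothetical; the authors' actual algorithm runs on the Arduino and differs in detail) of a rule that advances a training stage once a rolling success rate crosses a criterion:

```python
def update_stage(stage, recent_outcomes, advance_at=0.8, window=20, max_stage=5):
    """Advance the training stage when the success rate over the last
    `window` trials (1 = success, 0 = failure) reaches `advance_at`."""
    if len(recent_outcomes) < window:
        return stage  # not enough trials yet to judge performance
    rate = sum(recent_outcomes[-window:]) / window
    if rate >= advance_at and stage < max_stage:
        return stage + 1
    return stage
```

A rule like this lets the device shape behavior overnight without an experimenter deciding when to make the task harder.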

Read the full publication here, or check out the files on GitHub!


Visual stimulator with customizable light spectra

May 7, 2020

Katrin Franke, Andre Maia Chagas and colleagues have developed and shared a spatial visual stimulator with an arbitrary-spectrum of light for visual neuroscientists.


Vision research, quite obviously, relies on control of visual stimuli in an experiment. A great number of commercially available devices are used to present visual stimuli to humans and other species; however, these devices are predominantly developed for the human visual spectrum. Other species, such as Drosophila, zebrafish, and rodents, have visual spectra that include UV, and the devices used in studies often fail to present this range of stimuli, which limits our understanding of the visual systems of these organisms. To address this, Franke, Chagas and colleagues developed an open-source, generally low-cost visual stimulator that can be customized with up to 6 chromatic channels. Given the components used to build the device, the spectrum of light is arbitrary and can be adapted to different animal models based on their visual spectrum. The details of this device, including the parts list and information on a custom Python library for generating visual stimuli (QDSpy), can be found in the eLife publication. The device was tested and shown to work stimulating the mouse retina and in in vivo zebrafish studies; details on these experiments can also be found in the publication.
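One core calculation behind a multi-channel stimulator is choosing LED channel weights that produce target photoreceptor activations. Here is a minimal two-channel Python sketch (an illustrative closed-form solve, not QDSpy code; the function and matrix layout are hypothetical):

```python
def solve_led_weights(sensitivity, target):
    """Solve a 2x2 linear system. sensitivity[i][j] is the activation
    of photoreceptor i by LED channel j at full power; returns the
    LED weights producing the target receptor activations."""
    (a, b), (c, d) = sensitivity
    det = a * d - b * c
    if abs(det) < 1e-12:
        raise ValueError("channel spectra are degenerate")
    t0, t1 = target
    return ((d * t0 - b * t1) / det, (a * t1 - c * t0) / det)
```

With more channels than receptor types, the same idea generalizes to a least-squares solve, which is what makes extra chromatic channels useful.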

Check out the eLife article here!


Franke, K., Chagas, A. M., Zhao, Z., Zimmermann, M. J., Bartel, P., Qiu, Y., . . . Euler, T. (2019). An arbitrary-spectrum spatial visual stimulator for vision research. eLife, 8. doi:10.7554/elife.48779

neurotic

April 30, 2020

Jeffrey P. Gill and colleagues have developed and shared a new toolbox for synchronizing video and neural signals, cleverly named neurotic!


Collecting neural and behavioral data is fundamental to behavioral neuroscience, and the ability to synchronize these data streams is just as important as collecting the information in the first place. To make this process a little simpler, Gill et al. developed an open-source option called neurotic, a NEUROscience Tool for Interactive Characterization. The tool is programmed in Python and includes a simple GUI, which makes it accessible to users with little coding experience. Users can read in a variety of file formats for neural data and video, which they can then process, filter, analyze, annotate, and plot. To show its effectiveness across species and signal types, the authors tested the software on Aplysia feeding behavior and human beam walking. Given its open-source nature and strong integration with other popular open-source packages, this software should continue to develop and improve as the community uses it.
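A central piece of any video/neural synchronization is mapping video frame times onto neural sample indices. Here is a minimal Python sketch of that mapping (illustrative only; neurotic's actual implementation is more general and handles many file formats):

```python
def frame_to_sample(frame_idx, fps, neural_rate, offset_s=0.0):
    """Neural sample index corresponding to a video frame, assuming
    the two recordings started offset_s seconds apart and neither
    stream drifts."""
    t = frame_idx / fps + offset_s
    return round(t * neural_rate)
```

Real recordings drift, so tools like neurotic also let users adjust the alignment interactively against visible events.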

Read more about neurotic here!

Check out the documentation here.


Gill, J. P., Garcia, S., Ting, L. H., Wu, M., & Chiel, H. J. (2020). neurotic: NEUROscience Tool for Interactive Characterization. eNeuro. doi:10.1523/eneuro.0085-20.2020

PASTA

April 16, 2020

Thanks to Jan Homolak from the Department of Pharmacology, University of Zagreb School of Medicine, Zagreb, Croatia for sharing the following about repurposing a digital kitchen scale for neuroscience research: a complete hardware and software cookbook for PASTA (Platform for Acoustic STArtle).


“As we were starving for a solution on how to obtain relevant and reliable information from a kitchen scale (sometimes used in very creative ways in neuroscience research), we decided to cut the waiting and cook something ourselves. Here we introduce a complete hardware and software cookbook for PASTA, a guide on how to demolish your regular kitchen scale and use the parts to turn it into a beautiful multifunctional neurobehavioral platform. This project is still medium raw, as it is a work in progress; however, we hope you will still find it well done.
PASTA comes in various flavors such as:
– complete hardware design for PASTA
– PASTA data acquisition software codes (C++/Arduino)
– PASTA Chef: An automatic experimental protocol execution Python script for data acquisition and storage
– ratPASTA (R-based Awesome Toolbox for PASTA): An R-package for PASTA data analysis and visualization

…and all can be found on bioRxiv here: https://www.biorxiv.org/content/10.1101/2020.04.10.035766v1.supplementary-material

bon appétit!”
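As an illustration of the kind of analysis ratPASTA supports, here is a hedged Python sketch (a hypothetical function; ratPASTA itself is an R package) that quantifies startle amplitude from a load-cell force trace:

```python
def startle_amplitude(trace, stim_idx, baseline_n=50, window_n=100):
    """Peak deviation from the pre-stimulus baseline in a load-cell
    force trace; trace is a list of force samples, stim_idx the
    sample at which the acoustic stimulus was delivered."""
    baseline = sum(trace[stim_idx - baseline_n:stim_idx]) / baseline_n
    window = trace[stim_idx:stim_idx + window_n]
    return max(abs(v - baseline) for v in window)
```

The metric is the size of the animal's flinch against the scale platform, which is exactly what the repurposed load cell measures.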


Toolboxes for Spike and LFP Analysis

April 9, 2020

There are a number of open-source toolboxes available for neural data analysis, especially for spike and local field potential (LFP) data. With more options, though, comes a harder decision about which toolbox is right for your data. Fortunately, Valentina Unakafova and Alexander Gail have compared several toolboxes for spike and LFP analysis, connectivity analysis, dimensionality reduction, and generalized linear modeling. They discuss the major features of software available for Python and MATLAB (Octave), including Brainstorm, Chronux, Elephant, FieldTrip, gramm, Spike Viewer, and SPIKY. They include succinct tables for assessing system and program requirements, quality of documentation and support, and the data types accepted by each toolbox. Using an open-access dataset, they assess the functionality of the programs and finish their comparison by highlighting the advantages of each toolbox to consider when trying to find the one that works best for your data. The files they used to compare the toolboxes are available from GitHub to supplement their paper.
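Most of the compared toolboxes implement the same core primitives; a peri-stimulus time histogram (PSTH), for example, can be sketched in a few lines of plain Python (illustrative only, not taken from any of the toolboxes):

```python
def psth(spike_times, event_times, window=(-0.5, 1.0), bin_s=0.1):
    """Peri-stimulus time histogram: spike counts binned relative to
    each event, averaged over events and converted to firing rate (Hz)."""
    lo, hi = window
    n_bins = round((hi - lo) / bin_s)
    counts = [0] * n_bins
    for ev in event_times:
        for s in spike_times:
            rel = s - ev  # spike time relative to this event
            if lo <= rel < hi:
                counts[int((rel - lo) / bin_s)] += 1
    return [c / (len(event_times) * bin_s) for c in counts]
```

The toolboxes differ mainly in how they wrap primitives like this with I/O, statistics, and plotting, which is what the comparison tables help you weigh.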


Read their full comparison here.

Check out their GitHub for the project here.


Open Source Whisking Video Database

April 2, 2020

Alireza Azarfar and colleagues at the Donders Institute for Brain, Cognition and Behaviour at Radboud University have published an open-source database of high-speed videos of whisking behavior in freely moving rodents.


As responsible citizens, it’s possible you are missing the lab and working from home. Maybe you have plenty to do, or maybe you’re looking for some new data to analyze to increase your knowledge of active sensing in rodents! Well, in case you don’t have that data at hand and can’t collect it yourself for a few more weeks, we have a treat for you! Azarfar et al. have shared a database of 6,642 high-quality videos featuring juvenile and adult male rats and adult male mice exploring stationary objects. The dataset includes a wide variety of experimental conditions, including genetic, pharmacological, and sensory deprivation interventions, to explore how active sensing behavior is modulated by different factors. Information about the interventions and experimental conditions is available as a supplementary Excel file.

The videos are available as mp4 files as well as MATLAB matrices that can be converted into a variety of data formats. All videos underwent quality control and feature different amounts of noise and background, which makes them a great tool for mastering video analysis. A toolbox covering a variety of whisker analysis methods, including nose and whisker tracking, is available from this group on GitHub. This database is a great resource for studying sensorimotor computation, top-down mechanisms of sensory navigation, cross-species comparison of active sensing, and the effects of pharmacological intervention on whisking behaviors.

Read more about active sensing behavior, data collection methods, and rationale here.

The database is available through GigaScience DB. 

You can also try out their MATLAB Whisker Tracker from GitHub!


Phenopy

April 17, 2019

In a recent Nature Protocols article, Edoardo Balzani and colleagues from Valter Tucci’s lab have developed and shared Phenopy, a Python-based open-source analytical platform for behavioral phenotyping.


Behavioral phenotyping of mice using classic methods can be a long process and is susceptible to high variability, leading to inconsistent results. To reduce variance and speed up the process of behavioral analysis, Balzani et al. developed Phenopy, an open-source software platform for recording and analyzing behavioral data for phenotyping. The software allows components of a behavioral task to be recorded in combination with electrophysiology data. It is capable of performing online analysis as well as large-scale analysis of recorded data, all within a user-friendly interface. Information about the software is available in their publication in Nature Protocols.*

Check out the full article from Nature Protocols!


(*alternatively available on ResearchGate)