Tag: mouse

CerebraLux

June 25, 2020

This week we want to shed some light on a project from Robel Dagnew and colleagues from UCLA called CerebraLux, a wireless system for optogenetic stimulation.


Optogenetic methods have been a crucial tool for understanding the role that certain neural cell populations play in modulating or maintaining a variety of behaviors. These methods require a light source to be delivered through a fiber optic probe, and in many experimental setups this is achieved by tethering the implanted probe to the light source with a long fiber optic cable. That cable can impose limitations on experiments in which animals behave freely in behavior chambers or mazes. One obvious solution is to deliver light via a wireless controller communicating with a head-mounted light source, but existing systems can be cost-prohibitive, or building them in a lab requires access to specialized manufacturing equipment.

To address the need for a low-cost wireless optogenetic probe, Dagnew and colleagues developed CerebraLux, which is built from off-the-shelf parts and accessible custom components. The device consists of two major components: an optic component, featuring a milled baseplate that holds and connects an optic fiber to the LED (part of the electronic portion), and an electronic component, featuring a custom-printed circuit board (PCB), lithium battery, IR receiver, LED, and magnets that align and connect the two halves of the device. The device is controlled via a custom GUI (built with the Tkinter Python 2.7 library), which sends pulses to the device via an Arduino Uno.

More details about the build of these components and the process for communicating with the device via the GUI are available in Dagnew et al. The CerebraLux design and operations manual, which includes the 3D design files for the milled parts, the print design for the PCB, and the code for communicating with the device, is available in the appendix of the paper, while the code for the GUI is available from the Walwyn Lab website. Be sure to check out the paper for information about how they validated the device in vivo. The cost of all the component parts (as of 2017) comes in just under $200, making this a cost-effective solution for labs seeking a wireless optogenetic probe.
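For a concrete sense of what that control path can look like, here is a minimal sketch, assuming a hypothetical setup rather than the published CerebraLux code: a Tkinter button that sends a one-byte pulse command to an Arduino over USB serial with pyserial. The port name, baud rate, and command byte are illustrative assumptions, and the snippet targets Python 3 rather than the Python 2.7 used for the original GUI.

```python
# Minimal sketch (hypothetical): a Tkinter button that asks an Arduino to emit a
# light pulse over USB serial. The serial port, baud rate, and single-byte
# "pulse" command are illustrative assumptions, not the CerebraLux protocol.
import tkinter as tk
import serial

PORT = "/dev/ttyACM0"   # adjust to your Arduino Uno's port (e.g. "COM3" on Windows)
BAUD = 9600

arduino = serial.Serial(PORT, BAUD, timeout=1)

def send_pulse():
    """Send a one-byte command; the Arduino firmware would translate it into
    the IR-transmitted pulse timing for the head-mounted LED."""
    arduino.write(b"P")

root = tk.Tk()
root.title("Optogenetic pulse control (sketch)")
tk.Button(root, text="Send pulse train", command=send_pulse).pack(padx=20, pady=20)
root.mainloop()
```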

Read more about CerebraLux here!


Dagnew, R., Lin, Y., Agatep, J., Cheng, M., Jann, A., Quach, V., . . . Walwyn, W. (2017). CerebraLux: A low-cost, open-source, wireless probe for optogenetic stimulation. Neurophotonics, 4(04), 1. doi:10.1117/1.nph.4.4.045001

Rodent Arena Tracker (RAT)

June 18, 2020

Jonathan Krynitsky and colleagues from the Kravitz lab at Washington University have constructed and shared RAT, a closed-loop system for machine vision rodent tracking and task control.


The Rodent Arena Tracker, or RAT, is a low-cost, wireless position tracker for automatically tracking mice in high-contrast arenas. The device can use subject position information to control other devices in real time, allowing for closed-loop control of various tasks based on positional behavior data. The device is based on the OpenMV Cam M7 (openmv.io), an open-source machine vision camera equipped with onboard processing for real-time analysis, which reduces data storage requirements and removes the need for an external computer. The authors optimized the control code for tracking mice and created a custom circuit board that runs the device off a battery and adds a real-time clock for synchronization, a BNC input/output port, and a push button for starting the device. The build instructions for RAT, as well as validation data highlighting the device's effectiveness and potential uses, are available in their recent publication. Further, all the design files, such as the PCB design, 3D printer files, Python code, etc., are available on hackaday.io.
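To illustrate the general approach of onboard tracking with closed-loop output (not the published RAT firmware), here is a rough MicroPython-style sketch for an OpenMV camera: it finds the largest dark blob in a bright arena with find_blobs() and drives a digital output pin whenever the blob's centroid sits in the left half of the frame. The grayscale threshold, pin name, and position rule are placeholders.

```python
# Rough sketch of OpenMV-style closed-loop tracking (not the published RAT firmware).
# Tracks the largest dark blob (mouse on a light floor) and drives a digital pin
# when its centroid is in the left half of the frame. Threshold and pin are placeholders.
import sensor
from pyb import Pin

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)        # 320x240
sensor.skip_frames(time=2000)            # let the camera settle

ttl_out = Pin("P0", Pin.OUT_PP)          # example output pin (e.g. wired to a BNC)
DARK = [(0, 60)]                         # grayscale range treated as "mouse" (assumed)

while True:
    img = sensor.snapshot()
    blobs = img.find_blobs(DARK, pixels_threshold=200, area_threshold=200, merge=True)
    if blobs:
        mouse = max(blobs, key=lambda b: b.pixels())
        # Closed-loop rule (placeholder): signal whenever the mouse is in the left half.
        ttl_out.value(1 if mouse.cx() < img.width() // 2 else 0)
```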

Read the full article here!

Or check out the project on hackaday.io!


Krynitsky, J., Legaria, A. A., Pai, J. J., Garmendia-Cedillos, M., Salem, G., Pohida, T., & Kravitz, A. V. (2020). Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System. eNeuro, 7(3). doi:10.1523/eneuro.0485-19.2020

 

Visual stimulator with customizable light spectra

May 7, 2020

Katrin Franke, Andre Maia Chagas and colleagues have developed and shared a spatial visual stimulator with an arbitrary spectrum of light for visual neuroscientists.


Vision research, quite obviously, relies on precise control of the visual stimuli presented during an experiment. A great number of commercial devices and hardware setups are available for presenting visual stimuli to humans and other species; however, these devices are predominantly developed for the human visual spectrum. For other species, such as Drosophila, zebrafish, and rodents, the visual spectrum extends into the UV, and the devices used in studies sometimes fail to present this range of stimuli, which often limits our understanding of the visual systems of other organisms. To address this, Franke, Chagas and colleagues developed an open-source, generally low-cost visual stimulator that can be customized with up to six chromatic channels. Given the components used to build the device, the spectrum of light is arbitrary and customizable, and can be adapted to different animal models based on their visual spectra. The details of this device, including the parts list and information about a custom Python library for generating visual stimuli (QDSpy), can be found in the eLife publication. The device has been tested and shown to work for stimulating the mouse retina and in in vivo zebrafish studies; details on these experiments can also be found in the publication.
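As a conceptual aside (not a method from the paper), one way to think about an "arbitrary spectrum" is as a weighted sum of the LED channels' emission spectra; the sketch below uses SciPy's non-negative least squares to pick channel weights that approximate a made-up target spectrum. All spectra here are fabricated Gaussians purely for illustration.

```python
# Conceptual sketch (not from Franke et al.): choose non-negative LED channel weights
# whose summed spectra approximate a target spectrum. LED peaks/widths are made up.
import numpy as np
from scipy.optimize import nnls

wavelengths = np.arange(300, 701)                      # nm, covering UV through red

def gaussian(peak, width):
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

# Hypothetical emission spectra for a 6-channel stimulator (columns = channels)
led_peaks = [365, 420, 470, 525, 590, 630]
A = np.column_stack([gaussian(p, 15) for p in led_peaks])

# Hypothetical target spectrum, e.g. weighted toward a species' UV sensitivity
target = 0.8 * gaussian(360, 25) + 1.0 * gaussian(510, 40)

weights, residual = nnls(A, target)
print("channel weights:", np.round(weights, 3), "residual:", round(residual, 3))
```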

Check out the eLife article here!


Franke, K., Chagas, A. M., Zhao, Z., Zimmermann, M. J., Bartel, P., Qiu, Y., . . . Euler, T. (2019). An arbitrary-spectrum spatial visual stimulator for vision research. eLife, 8. doi:10.7554/elife.48779

neurotic

April 30, 2020

Jeffrey P. Gill and colleagues have developed and shared a new toolbox for synchronizing video and neural signals, cleverly named neurotic!


Collecting neural and behavioral data is fundamental to behavioral neuroscience, and the ability to synchronize these data streams is just as important as collecting the information in the first place. To make this process a little simpler, Gill et al. developed an open-source option called neurotic, a NEUROscience Tool for Interactive Characterization. This tool is programmed in Python and includes a simple GUI, which makes it accessible for users with little coding experience. Users can read in a variety of file formats for neural data and video, which they can then process, filter, analyze, annotate, and plot. To show its effectiveness across species and signal types, the authors tested the software with Aplysia feeding behavior and human beam walking. Given its open-source nature and strong integration with other popular open-source packages, this software will continue to develop and improve as the community uses it.

Read more about neurotic here!

Check out the documentation here.


Gill, J. P., Garcia, S., Ting, L. H., Wu, M., & Chiel, H. J. (2020). Neurotic: Neuroscience Tool for Interactive Characterization. eNeuro. doi:10.1523/eneuro.0085-20.2020

3DOC: 3D Operant Conditioning

April 23, 2020

Raffaele Mazziotti from the Istituto di Neuroscienze CNR di Pisa has generously shared the following about 3DOC, a recently developed and published project from their team.


“Operant conditioning is a classical paradigm and a standard technique used in experimental psychology in which animals learn to perform an action in order to achieve a reward. Using this paradigm, it is possible to extract learning curves and accurately measure reaction times. Both of these measurements are proxies for cognitive capabilities and can be used to evaluate the effectiveness of therapeutic interventions in mouse models of disease. Recently in our lab, we constructed a fully 3D-printable chamber able to perform operant conditioning using off-the-shelf, low-cost optical and electronic components, which can be reproduced rigorously in any laboratory equipped with a 3D printer for a total cost of around 160€. Requirements include a 3D-printable filament (e.g. polylactic acid, PLA), a low-cost microcontroller (e.g. Arduino UNO), and a single-board computer (e.g. Raspberry Pi). We designed the chamber entirely using 3D modelling for several reasons: first, it has a high degree of reproducibility, since the model is standardized and can be downloaded to print the same structure with the same materials in different laboratories. Second, it can be easily customized in relation to specific experimental needs. Lastly, it can be shared through online repositories (Github: https://github.com/raffaelemazziotti/oc_chamber). With these cost-efficient and accessible components, we assessed the possibility of performing two-alternative forced-choice operant conditioning using audio-visual cues while tracking mouse position in real time. As a proof of principle of customizability, we added a version of the OC chamber that is able to show more complex visual stimuli (e.g. images). This version includes a modified frontal wall that can host a TFT monitor and code that runs on PsychoPy2 on the Raspberry Pi. This tool can be employed to test learning and memory in models of disease. We expect that the open design of the chamber will be useful for scientific teaching and research as well as for further improvements from the open hardware community.”
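Since the chamber pairs an Arduino UNO with a Raspberry Pi running PsychoPy2 for stimulus display, here is a rough, hypothetical sketch of a two-alternative forced-choice trial loop in PsychoPy (illustrative only, not the 3DOC code); the keyboard responses, cue geometry, and timings are placeholders standing in for the chamber's sensors and stimuli.

```python
# Hypothetical sketch of a two-alternative forced-choice trial in PsychoPy
# (illustrative only; not the 3DOC task code). Keys and timings are placeholders.
import random
from psychopy import visual, core, event

win = visual.Window(size=(800, 480), color="black", units="norm")
cue = visual.Rect(win, width=0.4, height=0.8, fillColor="white")

clock = core.Clock()
for trial in range(10):
    side = random.choice(["left", "right"])
    cue.pos = (-0.5, 0) if side == "left" else (0.5, 0)
    cue.draw()
    win.flip()
    clock.reset()
    keys = event.waitKeys(maxWait=5.0, keyList=["left", "right"], timeStamped=clock)
    if keys:
        key, rt = keys[0]
        print(trial, side, key, "correct" if key == side else "wrong", round(rt, 3))
    else:
        print(trial, side, "no response")
    win.flip()                 # clear the screen between trials
    core.wait(1.0)             # inter-trial interval

win.close()
core.quit()
```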

Check out the full publication here.

Or take a peek at the GitHub for this project.


PASTA

April 16, 2020

Thanks to Jan Homolak from the Department of Pharmacology, University of Zagreb School of Medicine, Zagreb, Croatia for sharing the following about repurposing a digital kitchen scale for neuroscience research: a complete hardware and software cookbook for PASTA (Platform for Acoustic STArtle).


“As we were starving for a solution on how to obtain relevant and reliable information from a kitchen scale, sometimes used in very creative ways in neuroscience research, we decided to cut the waiting and cook something ourselves. Here we introduce a complete hardware and software cookbook for PASTA, a guide on how to demolish your regular kitchen scale and use the parts to turn it into a beautiful multifunctional neurobehavioral platform. This project is still medium raw, as it's a work in progress; however, we hope you will still find it well done.
PASTA comes in various flavors such as:
– complete hardware design for PASTA
– PASTA data acquisition software codes (C++/Arduino)
– PASTA Chef: An automatic experimental protocol execution Python script for data acquisition and storage
– ratPASTA (R-based Awesome Toolbox for PASTA): An R-package for PASTA data analysis and visualization

…and all can be found on bioRxiv here: https://www.biorxiv.org/content/10.1101/2020.04.10.035766v1.supplementary-material

bon appétit!”
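To give a flavor of the acquisition step that PASTA Chef automates (this is a generic sketch, not the published script), the snippet below reads load-cell samples streamed by an Arduino over serial with pyserial and appends them to a CSV file. The port, baud rate, and "timestamp_ms,weight" line format are assumptions.

```python
# Hypothetical sketch of serial data acquisition from an Arduino-based load cell
# (illustrative only; not the PASTA Chef script). Port, baud rate, and the
# "timestamp_ms,weight" line format are assumptions.
import csv
import serial

PORT = "/dev/ttyACM0"      # e.g. "COM3" on Windows
BAUD = 115200

with serial.Serial(PORT, BAUD, timeout=1) as ser, \
        open("pasta_session.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp_ms", "weight"])
    try:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            parts = line.split(",")
            if len(parts) == 2:          # keep only well-formed samples
                writer.writerow(parts)
    except KeyboardInterrupt:
        pass                             # stop acquisition with Ctrl+C
```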


Toolboxes for Spike and LFP Analysis

April 9, 2020

There are a number of open-source toolboxes available for neural data analysis, especially for spike and local field potential (LFP) data. With more options, though, comes a more difficult decision when selecting the toolbox that’s right for your data. Fortunately, Valentina Unakafova and Alexander Gail have compared several toolboxes for spike and LFP analysis, connectivity analysis, dimensionality reduction, and generalized linear modeling. They discuss the major features of software available for Python and MATLAB (Octave), including Brainstorm, Chronux, Elephant, FieldTrip, gramm, Spike Viewer, and SPIKY. They include succinct tables for assessing system and program requirements, quality of documentation and support, and the data types accepted by each toolbox. Using an open-access dataset, they assess the functionality of the programs and finish their comparison by highlighting advantages of each toolbox to consider when trying to find the one that works best for your data. The files they used to compare toolboxes are all available from GitHub to supplement their paper.


Read their full comparison here.

Check out their GitHub for the project here.


Open Source Whisking Video Database

April 2, 2020

Alireza Azarfar and colleagues at the Donders Institute for Brain, Cognition and Behaviour at Radboud University have published an open-source database of high-speed videos of whisking behavior in freely moving rodents.


As responsible citizens, it’s possible you are missing the lab and working from home. Maybe you have plenty to do, or maybe you’re looking for some new data to analyze to increase your knowledge of active sensing in rodents! Well, in case you don’t have that data at hand and can’t collect it yourself for a few more weeks, we have a treat for you! Azarfar et al. have shared a database of 6,642 high-quality videos featuring juvenile and adult male rats and adult male mice exploring stationary objects. This dataset includes a wide variety of experimental conditions, including genetic, pharmacological, and sensory deprivation interventions, to explore how active sensing behavior is modulated by different factors. Information about interventions and experimental conditions is available as a supplementary Excel file.

The videos are available as mp4 files as well as MATLAB matrices that can be converted into a variety of data formats. All videos underwent quality control and feature different amounts of noise and background, which makes the database a great tool for mastering video analysis. A toolbox for a variety of whisker analysis methods, including nose and whisker tracking, is available from this group on GitHub. This database is a great resource for studying sensorimotor computation, top-down mechanisms of sensory navigation, cross-species comparison of active sensing, and the effects of pharmacological interventions on whisking behavior.
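If you want to work with the MATLAB-format videos in Python, a minimal sketch might look like the following; the file name and the variable key are placeholders, so inspect the .mat contents (or the database documentation) for the actual names.

```python
# Hypothetical example of loading one of the MATLAB-format videos in Python.
# File name and variable key are placeholders; inspect the .mat contents first.
# (MATLAB v7.3 files would instead need an HDF5 reader such as h5py.)
import numpy as np
from scipy.io import loadmat

mat = loadmat("example_whisking_video.mat")
print([k for k in mat.keys() if not k.startswith("__")])  # list stored variables

frames = np.asarray(mat["frames"])     # assumed key; e.g. shape (height, width, n_frames)
print("video array shape:", frames.shape)

# Reorder to a (n_frames, height, width) stack for frame-by-frame analysis
video = np.moveaxis(frames, -1, 0)
```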

Read more about active sensing behavior, data collection methods, and rationale here.

The database is available through GigaScience DB. 

You can also try out their Matlab Whisker Tracker from Github!


Camera Control

February 6, 2020

The Adaptive Motor Control Lab at Harvard recently posted their project, Camera Control, a Python-based camera software GUI, to GitHub.


Camera Control is an open-source software package written by postdoctoral fellow Gary Kane that allows video to be recorded in sync with behavior. The Python GUI and scripts allow investigators to record from multiple Imaging Source camera feeds with associated timestamps for each frame. When used in combination with a NIDAQ card, timestamps from a behavioral task can also be recorded on the falling edge of a TTL signal. This allows video analysis to be paired with physiological recording, which can be beneficial in assessing behavioral results. The package requires Windows 10, Anaconda, and Git, and is compatible with Imaging Source USB3 cameras. The software package is available for download from the lab's GitHub, and instructions for installation and video recording are provided.
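As a conceptual illustration of why those shared timestamps are useful (this is not code from the package), the snippet below matches each behavioral TTL event to the nearest subsequent video frame with NumPy, assuming both timestamp streams were recorded on the same clock; the example arrays are made up.

```python
# Conceptual sketch (not from Camera Control): align behavioral TTL event times
# to video frame timestamps recorded on the same clock. Example data are made up.
import numpy as np

frame_times = np.arange(0.0, 10.0, 1 / 30)           # e.g. 30 fps video timestamps (s)
ttl_times = np.array([1.234, 4.017, 7.862])           # e.g. falling-edge event times (s)

# Index of the first frame at or after each event
frame_idx = np.searchsorted(frame_times, ttl_times)

for t, i in zip(ttl_times, frame_idx):
    print(f"TTL at {t:.3f} s -> frame {i} (frame time {frame_times[i]:.3f} s)")
```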

Find more on Github.


Kane, G. & Mathis, M. (2019). Camera Control: record video and system timestamps from Imaging Source USB3 cameras. GitHub. https://zenodo.org/badge/latestdoi/200101590

MouseMove

July 18, 2019

In a 2015 Scientific Reports article, Andre Samson and colleagues shared their project MouseMove, an open-source software for quantifying movement in the open field test:


The Open Field (OF) test is a commonly used assay for monitoring exploratory behavior and locomotion in rodents. Most research groups use commercial systems for recording and analyzing behavior in the OF test, but these commercial systems can be expensive and lack flexibility. A few open-source OF systems have been developed, but they are limited in the movement parameters that can be collected and analyzed. MouseMove is the first open-source software capable of providing qualitative and quantitative information on mouse locomotion in a semi-automated, high-throughput manner. With the aim of providing a freely available program for analyzing OF test data, these researchers developed software that accurately quantifies numerous parameters of movement.

In their manuscript, Samson et al. describe the design and implementation of MouseMove. Their OF system allows for the measurement of distance, speed, and laterality with >96% accuracy. They use MouseMove to analyze the OF behavior of mice after experimental stroke, demonstrating reduced locomotor activity and quantifying laterality deficits. The system is used in combination with the open-source program ImageJ and the MTrack2 plugin to analyze pre-recorded OF test video.

The system has two downloadable components: the ImageJ macro and a separate program with the custom-built MouseMove GUI. ImageJ is used to subtract the background from the experimental video and create an image of the animal's total trajectory. The MouseMove GUI then completes a detailed analysis of the movement patterns, measuring the fraction of time spent stationary, the distance traveled, mean speed, and various measures of laterality. The results are presented both in visual/graphical form and as a saveable text file. In the manuscript, they provide step-by-step instructions for using MouseMove. The authors additionally highlight the software's region-of-interest (ROI) capability, which makes it suitable for analyzing cognitive tests such as Novel Object Recognition. This tool offers relatively fast video processing of motor and cognitive behaviors and has many applications for the study of rodent models of brain injury/stimulation to measure altered locomotion.
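As a conceptual illustration of the kinds of quantities MouseMove reports (not its actual implementation, which is built on ImageJ/MTrack2 and a custom GUI), the sketch below computes total distance, mean speed, and the fraction of time spent stationary from an x/y trajectory with NumPy; the trajectory, frame rate, and stationary threshold are made up.

```python
# Conceptual sketch (not MouseMove's code): basic locomotion metrics from a
# tracked x/y trajectory. The trajectory, frame rate, and stationary threshold
# are made-up placeholders.
import numpy as np

fps = 30.0
rng = np.random.default_rng(0)
xy = np.cumsum(rng.normal(scale=0.3, size=(900, 2)), axis=0)   # fake 30 s trajectory (cm)

step = np.linalg.norm(np.diff(xy, axis=0), axis=1)             # cm moved per frame
total_distance = step.sum()
speed = step * fps                                             # instantaneous speed, cm/s
stationary = speed < 1.0                                       # assumed threshold (cm/s)

print(f"total distance: {total_distance:.1f} cm")
print(f"mean speed: {speed.mean():.2f} cm/s")
print(f"fraction of time stationary: {stationary.mean():.2%}")
```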

 

More information on MouseMove can be found in their manuscript here.


Samson, A. L., Ju, L., Ah Kim, H., Zhang, S. R., Lee, J. A. A., Sturgeon, S. A., … Schoenwaelder, S. M. (2015). MouseMove: an open source program for semi-automated analysis of movement and cognitive testing in rodents. Scientific Reports, 5, 16171.  doi: 10.1038/srep16171