
Multi-channel Fiber Photometry

October 24, 2018

Qingchun Guo and colleagues share their cost-effective, multi-channel fiber photometry system in Biomedical Optics Express.


Fiber photometry is a viable tool for recording in vivo calcium activity in freely behaving animals. In combination with genetically encoded calcium indicators, it can be used to measure single-neuron and population activity from a genetically defined subset of neurons. Guo and colleagues have developed a setup that uses a galvanometer-mirror system to record from multiple brain regions, or multiple animals, simultaneously. This simple, creative solution reduces the number of detectors needed for multi-channel data collection and lets researchers collect calcium imaging data from many subjects cost-effectively.
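
Because the galvanometer mirror steers the light path from one fiber to the next, a single detector effectively time-shares across channels, and the acquired trace is demultiplexed back into per-channel signals. The sketch below illustrates that round-robin demultiplexing step only; the channel count, visiting order, and array layout are assumptions for illustration rather than details of the published system.

```python
import numpy as np

def demultiplex(samples, n_channels):
    """Split an interleaved detector trace into per-channel traces.

    Assumes the galvo visits fibers in a fixed round-robin order,
    so sample i belongs to channel i % n_channels.
    """
    usable = len(samples) - (len(samples) % n_channels)
    return samples[:usable].reshape(-1, n_channels).T

# Example: a detector trace acquired while the galvo cycles over 3 fibers
detector_trace = np.random.rand(3000)
per_channel = demultiplex(detector_trace, n_channels=3)
print(per_channel.shape)  # (3, 1000): one row of samples per fiber
```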

Read more here!


OpenFeeder

October 17, 2018

In the journal HardwareX, Jinook Oh and colleagues share their design for OpenFeeder, an automatic feeder for animal experiments.


Automatic delivery of precisely measured amounts of food is important when studying reward and feeding behavior. Commercially available devices are often designed with specific species and food types in mind, limiting how they can be used. This open-source automatic feeder can easily be customized for food types ranging from seeds to pellets to fit the needs of any species. OpenFeeder combines plexiglass tubes, an Arduino Uno, a motor driver, and a piezo sensor to reliably deliver accurate amounts of food, and it can also be built using 3D-printed parts.
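
The delivery logic itself is simple: advance the motor until the piezo sensor registers a food item dropping, with a timeout to catch jams or an empty hopper. The sketch below illustrates that control loop in Python with placeholder hardware functions; the actual device runs this logic on the Arduino, and the threshold and timeout values here are arbitrary.

```python
import time

PIEZO_THRESHOLD = 0.5   # illustrative sensor threshold, not a calibrated value
STEP_TIMEOUT_S = 5.0    # give up if nothing falls within this window

def read_piezo():
    """Placeholder for an analog read of the piezo sensor."""
    raise NotImplementedError

def step_motor():
    """Placeholder for advancing the dispensing motor one increment."""
    raise NotImplementedError

def dispense_one_item():
    """Step the motor until the piezo detects a falling food item."""
    start = time.monotonic()
    while time.monotonic() - start < STEP_TIMEOUT_S:
        step_motor()
        if read_piezo() > PIEZO_THRESHOLD:
            return True       # item detected: delivery confirmed
    return False              # timeout: flag a possible jam or empty hopper
```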

Read more from HardwareX.

Or check out the device on Open Science Framework and Github.

 

KineMouse Wheel

October 10, 2018

On Hackaday, Richard Warren of the Sawtell Lab at Columbia University has shared his design for the KineMouse Wheel, a lightweight running wheel for head-fixed locomotion that allows 3D tracking of mice with a single camera.


Locomotor behavior is a common readout in neuroscience research, and running wheels are a great tool for assessing motor function in head-fixed mice. The KineMouse Wheel takes this tool a step further. Constructed from lightweight, transparent polycarbonate with an angled mirror mounted inside, the device lets a single camera capture two views of locomotion simultaneously. When combined with DeepLabCut, a deep-learning-based tracking package, locomotion of head-fixed mice can be captured in three dimensions, allowing a more complete assessment of motor behavior. The wheel can be further customized to fit the needs of a lab by using different build materials. More details about the KineMouse Wheel, along with a full parts list and build instructions, are available at hackaday.io.
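
Because the mirror adds a second, roughly orthogonal view within the same camera frame, a body part tracked in both views yields a 3D coordinate: the direct view supplies two axes and the mirrored view supplies the third. The toy sketch below illustrates that combination; it assumes idealized orthogonal views and ignores the camera calibration a real reconstruction would require.

```python
def combine_views(direct_xy, mirror_xy):
    """Combine 2D tracked points from the direct and mirrored views
    into a 3D coordinate, assuming idealized orthogonal views.

    direct_xy: (x, y) of a body part in the direct (side) view
    mirror_xy: (x, y) of the same body part in the mirrored view,
               whose second axis maps to the missing depth axis
    """
    x, y = direct_xy
    _, z = mirror_xy       # depth recovered from the mirrored view
    return (x, y, z)

# Example: a paw seen at (120, 85) directly and at (118, 40) in the mirror
print(combine_views((120.0, 85.0), (118.0, 40.0)))  # (120.0, 85.0, 40.0)
```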

Read more about KineMouse Wheel on Hackaday,

and check out other awesome open-source tools on the OpenBehavior Hackaday list!


 

Q&A with Dr. Mackenzie Mathis on her experience with developing DeepLabCut

August 22, 2018

Dr. Mackenzie Mathis, Principal Investigator of the Adaptive Motor Control Lab (Rowland Institute at Harvard University), has shared the following responses to a short Q&A about the inspiration behind, development of, and sharing of DeepLabCut — a toolbox for animal tracking using deep learning.


What inspired you and your colleagues to create this toolbox as opposed to using previously developed commercial software?

Alexander Mathis and I both worked on behaviors where we wanted to track particular features, and they proved to be unreliably tracked with the methods we tried. Specifically, Alexander has an odor-guided navigation task that he works on in the lab of Prof. Venkatesh Murthy at Harvard, where the mice are placed in a very large “endless” paper trail and he inkjet prints odors for them to follow to get rewards (chocolate milk). The position of the snout is very important to measure accurately, so background subtraction or other heuristics didn’t work when the nose crossed the trail and when the droplet was right in front of the snout. I worked on a skilled joystick behavior for mice, and I wanted to track joints accurately and non-invasively – a challenging problem for little hands. So, we teamed up with Prof. Matthias Bethge at the University of Tuebingen to work on a new approach. He suggested we start looking into the rapidly advancing human pose estimation literature, and we looked at several before deciding to seriously benchmark DeeperCut, a top-performing algorithm on the large MPII dataset. Those authors did something very clever, namely, they used a deep neural network (ResNet) that was pre-trained on a large image set called ImageNet. This gives the ResNet a chance to learn natural scene statistics first. Remarkably, we found that we could use only a few frames to very accurately track the snout in the odor-guided navigation task, so we next tried videos from my joystick task, and to flex DeepLabCut’s muscles, we teamed up with Kevin Cury (who, like myself, is an alumnus of Prof. Nao Uchida’s group) to track fruit flies in the 3D chamber. After all this benchmarking, we built a toolbox that implements a complete pipeline to extract and label frames, train and evaluate the deep neural nets, and analyze new experimental videos. We call this toolbox DeepLabCut, as a nod to DeeperCut.

What was the motivation for immediately sharing your work as an open source tool, thus making it accessible to the broader neuroscience community?

Some of the options we first tried to track with were very expensive commercial systems, and they failed quite badly. On the other hand, deep learning has revolutionized computer vision in the last few years, so we were eager to try some new approaches to solve the problem. So, in addition to being advocates of open science, we really wanted to make a toolbox that someone with minimal to no coding experience could, absolutely for free, track whatever they wanted.

We also know peer review can be slow, so as soon as we had the toolbox in place, we wrote up the arXiv paper and released the code base immediately. Honestly, it has been one of my most rewarding papers – the feedback from our peers, and seeing what people have used the code for, has been a very rewarding experience. This was my first preprint, and especially for methods manuscripts, I now cannot imagine another way to share our future work.

How do you think open source tools, such as yours, will continue to impact the progress of scientific research?

Open source code and preprints have been the norm in some fields for decades (such as math and physics), and I am really excited to see them come of age in biology and neuroscience. I am excited to see how tools will continue to improve as the community gets behind them, just as we could build on DeeperCut, which was open source. Also, at least in my experience, many individuals write their own code, which leads to a lot of duplicated effort. Moreover, datasets are becoming increasingly complex, and the code needed to work with such data should be robust and shared. My expectation is that open source code will become the norm in the future, which can only help science become more robust.

Even before formal publication this week (see Nature Neuroscience), we estimate that about 100 labs are actively using DeepLabCut, so releasing the code before publication, we hope,  has really allowed for rapid progress to be made. We were also very happy that The Atlantic could highlight some of the early adopters, as it’s one thing to say you made something, but it’s another to hear others saying it is actually ‘something.’


DeepLabCut provides an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. Read more on the website, or in Nature Neuroscience.
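
For readers curious what using the toolbox looks like in practice, a typical project follows the pipeline described above: create a project, extract and label frames, train and evaluate the network, then analyze new videos. A minimal sketch using DeepLabCut's Python interface is below; exact function names and defaults may differ between versions, so consult the project documentation.

```python
import deeplabcut

# Create a project from one or more experimental videos
config = deeplabcut.create_new_project(
    "reaching-task", "experimenter", ["videos/session1.mp4"]
)

deeplabcut.extract_frames(config)           # pick frames to annotate
deeplabcut.label_frames(config)             # open the GUI to label body parts
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)            # fine-tune the pretrained ResNet
deeplabcut.evaluate_network(config)
deeplabcut.analyze_videos(config, ["videos/session2.mp4"])
```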

 


An open-source lickometer and microstructure analysis program

August 8, 2018

In HardwareX, an open access journal for designing, building, and customizing open-source scientific hardware, Martin A. Raymond and colleagues share their design for a user-constructed, low-cost lickometer.


Researchers interested in ingestive behavior in rodents commonly use licking as a readout of how much fluid a subject consumes, recorded by a lickometer. Commercially available lickometers are powerful tools for measuring this behavior, but they can be expensive and often require further customization. The authors offer their own design for an open-source lickometer that uses readily available or customizable components such as a PC sound card and a 3D-printed drinking-bottle holder. Data are collected with Audacity, an open-source audio program, exported to .csv format, and analyzed with an R script provided by the authors to assess various features of licking microstructure. A full bill of materials, assembly instructions, and links to design files are available in the paper.
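
Once lick events have been extracted from the recording as timestamps, microstructure measures such as burst number and burst size follow from the inter-lick intervals. The authors provide an R script for this analysis; the following is a rough Python analogue of one common approach, with the 1 s burst-pause criterion chosen purely for illustration.

```python
import numpy as np

def lick_microstructure(lick_times, burst_pause=1.0):
    """Compute basic lick microstructure from lick timestamps (seconds).

    Licks separated by more than `burst_pause` seconds start a new burst.
    """
    lick_times = np.asarray(lick_times)
    ili = np.diff(lick_times)                      # inter-lick intervals
    burst_breaks = np.where(ili > burst_pause)[0]  # index of last lick in a burst
    burst_sizes = np.diff(np.concatenate(([0], burst_breaks + 1, [len(lick_times)])))
    return {
        "total_licks": len(lick_times),
        "n_bursts": len(burst_sizes),
        "mean_burst_size": float(np.mean(burst_sizes)),
        "mean_ili": float(np.mean(ili)) if len(ili) else float("nan"),
    }

# Example: two bursts of licking separated by a long pause
print(lick_microstructure([0.0, 0.15, 0.31, 0.47, 3.2, 3.35, 3.51]))
```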

Check out the full publication here!


Raymond, M. A., Mast, T. G., & Breza, J. M. (2018). An open-source lickometer and microstructure analysis program. HardwareX, 4. doi:10.1016/j.ohx.2018.e00035

NeRD: an open-source neural recording device

July 16, 2018

In a special issue of the Journal of Neural Engineering, Dominique Martinez and colleagues share their design for NeRD, an open-source neural recording device for wireless transmission of local field potential (LFP) data in freely behaving animals.


Electrophysiological recording of local field potentials in freely behaving animals is a prominent tool for assessing the neural basis of behavior. These recordings typically use cables to transmit data to the recording equipment, which tethers the animal and can interfere with natural behavior. Wireless transmission of LFP data removes the cable between the animal and the recording equipment, but is hampered by the large amount of data that must be transmitted at a relatively high rate.
To reduce transmission bandwidth, Martinez et al. propose an encoder/decoder algorithm based on adaptive non-uniform quantization. As a proof of concept, they developed a NeRD prototype that digitally transmits eight channels encoded at 10 kHz with 2 bits per sample. The lightweight device occupies a small volume and is powered by a small battery allowing 2 h 40 min of autonomy. Power dissipation is 59.4 mW for a communication range of 8 m with transmission losses below 0.1%. The small weight and low power consumption make it possible to mount the entire device on the head of a rodent without a separate head-stage and battery backpack. The adaptive quantization lets the wireless implant use a lower transmission bandwidth while preserving high signal fidelity and the fundamental frequencies of the LFP, all in a compact and lightweight device.
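
The bandwidth saving comes from the encoding: each sample is reduced to 2 bits whose meaning adapts to the recent signal, and the decoder applies the same adaptation rule so no side information needs to be sent. The sketch below is a generic delta-style adaptive 2-bit quantizer in Python, included only to illustrate the principle; it is not the authors' algorithm, and the levels and adaptation gains are arbitrary choices.

```python
import numpy as np

# 2-bit codebook: each code maps to a multiple of the current step size
LEVELS = np.array([-1.5, -0.5, 0.5, 1.5])
STEP_GAIN = {0: 1.3, 1: 0.9, 2: 0.9, 3: 1.3}  # widen on large codes, shrink on small

def encode(signal, step=1.0):
    """Encode samples as 2-bit codes relative to a running prediction."""
    codes, pred = [], 0.0
    for x in signal:
        code = int(np.argmin(np.abs((x - pred) - LEVELS * step)))
        codes.append(code)
        pred += LEVELS[code] * step     # the decoder reproduces this prediction
        step *= STEP_GAIN[code]         # adapt step size to signal dynamics
    return codes

def decode(codes, step=1.0):
    """Reconstruct an approximate signal from the 2-bit codes."""
    out, pred = [], 0.0
    for code in codes:
        pred += LEVELS[code] * step
        step *= STEP_GAIN[code]
        out.append(pred)
    return np.array(out)

signal = np.sin(np.linspace(0, 4 * np.pi, 200)) * 100  # toy LFP-like trace
reconstructed = decode(encode(signal))
```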
Read more here!

GitHub


CHEndoscope: A Compact Head-Mounted Endoscope for In Vivo Calcium Imaging in Freely Behaving Mice

July 2, 2018

In Current Protocols in Neuroscience, Alexander Jacob and colleagues share their open source compact head-mounted endoscope (CHEndoscope) for imaging in the awake behaving mouse.


This miniature microscope is designed to provide an accessible set of calcium imaging tools for investigating the relationship between behavior and neuronal population activity in vivo in rodents. The CHEndoscope is open source, flexible, and consists of only four plastic components that can be 3D printed. It uses an implanted gradient-index (GRIN) lens together with the genetically encoded calcium indicator GCaMP6 to image calcium transients from hundreds of neurons simultaneously in awake, behaving mice. The aim of the open-source model is to give the neuroscience research community an accessible and flexible set of calcium imaging tools. The linked article describes in depth the assembly, surgical implantation, data collection, and processing of calcium signals with the CHEndoscope.
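
After cellular signals are extracted from the endoscope movie, calcium transients are typically expressed as ΔF/F relative to an estimate of baseline fluorescence. A minimal sketch of that step is below; the percentile baseline is an assumption for illustration, and the CHEndoscope pipeline's actual processing is described in the paper and repository.

```python
import numpy as np

def delta_f_over_f(trace, baseline_percentile=10):
    """Convert a raw fluorescence trace to dF/F using a percentile baseline."""
    f0 = np.percentile(trace, baseline_percentile)
    return (trace - f0) / f0

# Example: a toy trace with one transient riding on a baseline of ~100
trace = np.full(500, 100.0)
trace[200:220] += 40.0           # simulated calcium transient
dff = delta_f_over_f(trace)
print(dff.max())                 # ~0.4 at the peak of the transient
```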

Link to paper: https://currentprotocols.onlinelibrary.wiley.com/doi/abs/10.1002/cpns.51

GitHub: https://github.com/jf-lab/chendoscope


Jacob, A. D., Ramsaran, A. I., Mocle, A. J., Tran, L. M., Yan, C., Frankland, P. W., & Josselyn, S. A. (2018). A compact head‐mounted endoscope for in vivo calcium imaging in freely behaving mice. Current Protocols in Neuroscience, 84, e51. doi: 10.1002/cpns.51

Microwave-based Homecage Motion Detector

June 25, 2018

Andreas Genewsky and colleagues from the Max Planck Institute of Psychiatry have shared the design, construction and validation of a simplified, low-cost, radar-based motion detector for home cage activity monitoring in mice. This simple, open-source device allows for motion detection without visual contact to the animal and can be used with various cage types. It features a custom printed circuit board and motion detector shield for Arduino, which saves raw activity and timestamped data in CSV files onto an SD card; the authors also provide a Python script for data analysis and generation of actograms. This device offers a cost-effective, DIY alternative to optical imaging of home-cage activity.
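
Because the device logs timestamped activity to CSV files, producing an actogram amounts to binning activity by time of day across days. A rough Python sketch of that step follows; the file name and column names are assumptions for illustration, and the authors supply their own analysis script.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed columns: 'timestamp' (ISO datetime) and 'activity' (counts per sample)
df = pd.read_csv("homecage_activity.csv", parse_dates=["timestamp"])
df["date"] = df["timestamp"].dt.date
df["hour"] = df["timestamp"].dt.hour

# One row per day, one column per hour: the matrix behind a simple actogram
daily = df.pivot_table(index="date", columns="hour", values="activity", aggfunc="sum")

plt.imshow(daily.values, aspect="auto", cmap="Greys")
plt.xlabel("Hour of day")
plt.ylabel("Day")
plt.title("Home cage activity")
plt.show()
```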

Read more from the Journal of Biological Engineering publication!


Genewsky, A., Heinz, D. E., Kaplick, P. M., Kilonzo, K., & Wotjak, C. T. (2017). A simplified microwave-based motion detector for home cage activity monitoring in mice. Journal of Biological Engineering,11(1). doi:10.1186/s13036-017-0079-y

Open source modules for tracking animal behavior and closed-loop stimulation based on Open Ephys and Bonsai

June 15, 2018

In a recent preprint on bioRxiv, Alessio Buccino and colleagues from the University of Oslo provide a step-by-step guide for setting up an open-source, low-cost, and adaptable system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Their setup integrates Bonsai and Open Ephys with several modules they developed for robust real-time tracking and behavior-based closed-loop stimulation. In the preprint, they describe using the system to record place cell activity in the hippocampus and medial entorhinal cortex, and present a case in which they used it for closed-loop optogenetic stimulation of grid cells in the entorhinal cortex, as examples of what the system can do. Extending the Open Ephys system to include animal tracking and behavior-based closed-loop stimulation makes high-quality, low-cost experimental setups with standardized data formats more widely available.
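
In outline, the closed-loop logic is: stream the animal's position from the tracking module and send a stimulation trigger whenever the position enters a defined region. The authors implement this with Bonsai and Open Ephys modules; the sketch below is only a generic Python illustration of that decision loop, with placeholder functions standing in for the real tracking and stimulation interfaces.

```python
import math
import time

STIM_CENTER = (0.25, 0.75)   # region of interest in normalized arena coordinates
STIM_RADIUS = 0.10

def get_position():
    """Placeholder for a position sample from the tracking pipeline."""
    raise NotImplementedError

def trigger_stimulation():
    """Placeholder for sending a TTL/stimulation command."""
    raise NotImplementedError

def closed_loop(poll_hz=100):
    """Trigger stimulation whenever the animal enters the region of interest."""
    while True:
        x, y = get_position()
        if math.hypot(x - STIM_CENTER[0], y - STIM_CENTER[1]) < STIM_RADIUS:
            trigger_stimulation()
        time.sleep(1.0 / poll_hz)
```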

Read more on bioRxiv, or on GitHub!


Buccino A, Lepperød M, Dragly S, Häfliger P, Fyhn M, Hafting T (2018). Open Source Modules for Tracking Animal Behavior and Closed-loop Stimulation Based on Open Ephys and Bonsai. BioRxiv. http://dx.doi.org/10.1101/340141

Head-Fixed Setup for Combined Behavior, Electrophysiology, and Optogenetics

June 12, 2018

In a recent publication in Frontiers in Systems Neuroscience, Solari and colleagues of the Hungarian Academy of Sciences and Semmelweis University share a behavioral setup for temporally controlled rodent behavior. The arrangement allows head-fixed animals to be trained with calibrated sound stimuli and precisely timed fluid and air-puff reinforcers. It combines microcontroller-based behavior control with a sound delivery system for acoustic stimuli, fast solenoid valves for reinforcement delivery, and a custom-built sound-attenuated chamber, and is shown to be suitable for combined behavior, electrophysiology, and optogenetics experiments. The system is built on an open-source hardware and software stack that includes Bonsai, Bpod, and Open Ephys.
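
To give a flavor of what precisely timed fluid reinforcement involves, reward delivery typically maps a desired volume to a solenoid valve opening time determined by calibration. The sketch below is a generic Python illustration with placeholder hardware calls and an assumed calibration constant; it is not the Bpod/Bonsai implementation used in the paper.

```python
import time

# Assumed calibration: valve-open time (s) per microliter, measured empirically
SECONDS_PER_UL = 0.004

def open_valve():
    """Placeholder for the digital line that opens the solenoid valve."""
    raise NotImplementedError

def close_valve():
    """Placeholder for the digital line that closes the solenoid valve."""
    raise NotImplementedError

def deliver_reward(volume_ul):
    """Open the valve just long enough to deliver the requested volume."""
    open_valve()
    time.sleep(volume_ul * SECONDS_PER_UL)
    close_valve()
```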

Read more here!

GitHub


Solari N, Sviatkó K, Laszlovszky T, Hegedüs P and Hangya B (2018). Open Source Tools for Temporally Controlled Rodent Behavior Suitable for Electrophysiology and Optogenetic Manipulations. Front. Syst. Neurosci. 12:18. doi: 10.3389/fnsys.2018.00018