Tag: rodent

Actifield

March 21, 2019

Victor Wumbor-Apin Kumbol and colleagues have developed and shared Actifield, an automated open-source actimeter for rodents, in a recent HardwareX publication.


Measuring locomotor activity can be a useful readout for understanding the effects of a number of experimental manipulations in neuroscience research. Commercially available locomotor activity recording devices can be cost-prohibitive and often lack the ability to be customized to fit a specific lab’s needs. Kumbol et al. offer an open-source alternative that uses infrared motion detection and an Arduino to record activity in a variety of chamber setups. A full list of build materials, links to 3D-print and laser-cut files, and assembly instructions are available in their publication.
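For readers planning to work with the recorded data, here is a minimal Python sketch (not part of Actifield’s firmware or published code) of how motion-event timestamps logged by a microcontroller might be binned into per-minute activity counts; the function and variable names are illustrative only.

```python
import numpy as np

def activity_counts(event_times_s, session_length_s, bin_size_s=60):
    """Bin motion-event timestamps (seconds) into activity counts per time bin."""
    edges = np.arange(0, session_length_s + bin_size_s, bin_size_s)
    counts, _ = np.histogram(event_times_s, bins=edges)
    return edges[:-1], counts

# Hypothetical events logged over a 10-minute session
events = [3.2, 3.9, 15.0, 61.7, 62.1, 300.4, 305.8, 599.9]
bin_starts, counts = activity_counts(events, session_length_s=600)
print(dict(zip(bin_starts.tolist(), counts.tolist())))
```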

Read more from HardwareX!


idtracker.ai

February 20, 2019

Francisco Romero Ferrero and colleagues have developed idtracker.ai, an algorithm and software for tracking individuals in large collectives of unmarked animals, recently described in Nature Methods.


Tracking individual animals in large collective groups can give interesting insights into behavior, but has proven to be an analysis challenge. With advances in artificial intelligence and tracking software, it has become increasingly easy to collect such information from video data. Romero Ferrero et al. have developed an algorithm and tracking software built around two deep networks: one identifies individual animals, and the other detects when animals touch or cross paths in front of one another. The software has been validated to track individuals with high accuracy in groups of up to 100 animals across diverse species, from rodents to zebrafish to ants. The software is free, fully documented, and available online with additional Jupyter notebooks for data analysis.
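As a rough illustration of what can be done with tracked output, the sketch below computes per-individual distance traveled, assuming trajectories are available as a NumPy array of shape (frames, individuals, 2) with NaNs for unresolved frames. This is not idtracker.ai’s own API; consult their documentation and notebooks for the actual file formats.

```python
import numpy as np

def distance_travelled(trajectories_px, px_per_cm=1.0):
    """Per-individual path length from an array of shape (frames, individuals, 2)."""
    n_frames, n_ind, _ = trajectories_px.shape
    dist = np.zeros(n_ind)
    for i in range(n_ind):
        xy = trajectories_px[:, i, :]
        xy = xy[~np.isnan(xy).any(axis=1)]  # drop frames without a resolved position
        steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
        dist[i] = steps.sum() / px_per_cm
    return dist

# Synthetic example: 3 individuals, 100 frames of random-walk positions
traj = np.cumsum(np.random.randn(100, 3, 2), axis=0)
print(distance_travelled(traj, px_per_cm=10.0))
```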

Check out their website with full documentation, the recent Nature Methods article, bioRxiv preprint, and a great video of idtracker.ai tracking 100 zebrafish!


Dual-port Lick Detector

January 16, 2019

In the Journal of Neurophysiology, Brice Williams and colleagues have shared their design for a novel dual-port lick detector. This device can be used for both real-time measurement and manipulation of licking behavior in head-fixed mice.


Measuring licking behavior in mice provides a valuable metric of sensory-motor processing and can be nicely paired with simultaneous neural recordings. Williams and colleagues have developed their own device for precisely measuring licking behavior and for manipulating it in real time. To address the limitations of many available lick sensors, the authors designed their device to be small (appropriate for mice), contactless (to reduce electrical artifacts during neural recording), and precise on a submillisecond timescale. The dual-port detector can measure directional licking behavior during sensory tasks and can be combined with neural recording. Further, given its submillisecond precision, the device can be used in a closed-loop system to perturb licking behaviors via neural inhibition. Overall, this dual-port lick detector is a cost-effective, replicable solution that can be used in a variety of applications.
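As a hedged illustration of the kind of downstream analysis such a detector enables (not code from the publication), the Python sketch below summarizes directional licking from a hypothetical event log of timestamps and port labels.

```python
import numpy as np

def lick_summary(timestamps_ms, ports):
    """Count licks and compute the median inter-lick interval for each port."""
    timestamps_ms = np.asarray(timestamps_ms, dtype=float)
    ports = np.asarray(ports)
    summary = {}
    for port in np.unique(ports):
        t = np.sort(timestamps_ms[ports == port])
        ili = np.diff(t)  # inter-lick intervals in ms
        summary[str(port)] = {
            "n_licks": int(t.size),
            "median_ili_ms": float(np.median(ili)) if ili.size else None,
        }
    return summary

# Hypothetical event log: (time in ms, port label)
times = [12.4, 150.9, 288.1, 430.6, 1200.0, 1345.2]
ports = ["left", "left", "left", "left", "right", "right"]
print(lick_summary(times, ports))
```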

Learn how to build your own here!

And be sure to check out their GitHub.


DeepSqueak

January 9, 2019

Kevin Coffey has shared the following about DeepSqueak, a deep learning-based system for detection and analysis of ultrasonic vocalizations, which he developed with Russell Marx.


Rodents engage in social communication through a rich repertoire of ultrasonic vocalizations (USVs). Recording and analysis of USVs can be performed noninvasively in almost any rodent behavioral model and provides rich insight into emotional state and motor function. Despite strong evidence that USVs serve an array of communicative functions, technical and financial limitations have inhibited widespread adoption of vocalization analysis. Manual USV analysis is slow and laborious, while existing automated analysis software is vulnerable to the broad-spectrum noise routinely encountered in the testing environment.

To promote accessible and accurate USV research, we present “DeepSqueak”, a fully graphical MATLAB package for high-throughput USV detection, classification, and analysis. DeepSqueak applies a state-of-the-art regional object detection neural network (Faster R-CNN) to detect USVs. This dramatically reduces the false-positive rate and facilitates reliable analysis under standard experimental conditions. DeepSqueak includes pre-trained detection networks for mouse USVs and for 50 kHz and 22 kHz rat USVs. After detection, USVs can be clustered by k-means models or classified by trainable neural networks.
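DeepSqueak itself is a MATLAB package; purely as a conceptual illustration of the clustering step described above, the Python sketch below groups synthetic per-call features with k-means using scikit-learn. The feature names and values are hypothetical, not DeepSqueak’s output format.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-call features (one row per detected USV):
# duration (s), mean principal frequency (kHz), frequency slope (kHz/s)
rng = np.random.default_rng(0)
features = np.vstack([
    rng.normal([0.03, 65, 10], [0.01, 5, 3], size=(40, 3)),      # short, high-frequency calls
    rng.normal([0.30, 22, 0.5], [0.05, 2, 0.5], size=(40, 3)),   # long, 22 kHz-like calls
])

# Standardize features, then cluster into two groups
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(features)
)
print(np.bincount(labels))  # number of calls assigned to each cluster
```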

Read more in their recent publication and check out DeepSqueak on GitHub!


TRIO Platform

December 12, 2018

Vladislav Voziyanov and colleagues have developed and shared the TRIO Platform, a low-profile in vivo imaging support and restraint system for mice.


In vivo optical imaging methods are common tools for understanding neural function in mice. These experiments are often performed in head-fixed, anesthetized animals, which requires monitoring anesthesia level and body temperature while stabilizing the head. Fitting all of the components necessary for these experiments onto a standard microscope stage can be rather difficult. Voziyanov and colleagues have shared their design for the TRIO (Three-In-One) Platform. The system is compact and provides sturdy head fixation, a gas anesthesia mask, and a warm-water bed. While the design is compact enough to work with a variety of microscope stages, the use of 3D-printed components makes it customizable.


Read more about the TRIO Platform in Frontiers in Neuroscience!

The design files and list of commercially available build components are provided here.


Live Mouse Tracker

December 5, 2018

In a recent preprint, Fabrice de Chaumont and colleagues share Live Mouse Tracker, a real-time behavioral analysis system for groups of mice.


Monitoring social interactions of mice is important for understanding preclinical models of various psychiatric disorders; however, gathering data on social behaviors can be time-consuming and is often limited to a few subjects at a time. With advances in computer vision, machine learning, and individual-identification methods, gathering social behavior data from many mice is now easier. de Chaumont and colleagues have developed Live Mouse Tracker, which tracks up to four mice at a time using RFID sensors, while an infrared/depth (RGBD) camera allows tracking of animal shape and posture. The system automatically labels behaviors at the individual, dyadic, and group levels. Live Mouse Tracker can be used to assess complex social behavioral differences between mice.
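As a simple, hypothetical illustration of dyadic-level labeling (not Live Mouse Tracker’s actual method), the sketch below flags frames in which two tracked mice come within a chosen contact distance of one another.

```python
import numpy as np

def contact_events(pos_a, pos_b, contact_dist_cm=5.0):
    """Return a boolean mask of frames where two mice are within contact distance.

    pos_a, pos_b: arrays of shape (n_frames, 2) with body-center coordinates in cm.
    """
    d = np.linalg.norm(np.asarray(pos_a) - np.asarray(pos_b), axis=1)
    return d < contact_dist_cm

# Synthetic example: two mice drifting toward each other over 50 frames
a = np.column_stack([np.linspace(0, 20, 50), np.full(50, 10.0)])
b = np.column_stack([np.linspace(25, 15, 50), np.full(50, 10.0)])
print(contact_events(a, b).sum(), "contact frames")
```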

Learn more on bioRxiv, or check out the Live Mouse Tracker website!


Psibox: Automated Operant Conditioning in the Mouse Home Cage

November 30, 2018

Nikolas Francis and Patrick Kanold of the University of Maryland share their design for Psibox, a platform for automated operant conditioning in the mouse home cage, in Frontiers in Neural Circuits.


The ability to collect behavioral data from large populations of subjects is advantageous for advancing behavioral neuroscience research. However, few cost-effective options are available for collecting large amounts of data, especially for operant behaviors. Francis and Kanold have developed and shared Psibox, an automated operant conditioning system. It incorporates three modules for central control, water delivery, and home cage interface, all of which can be customized with different parts. The system was validated by training mice in a positive-reinforcement auditory task and can be adapted for other tasks as well. The full, low-cost system allows for quick training of groups of mice in an operant task with little day-to-day experimenter involvement.

Learn how to set up your own Psibox system here!


Francis, N. A., &amp; Kanold, P. O. (2017). Automated operant conditioning in the mouse home cage. Frontiers in Neural Circuits.

FreemoVR: virtual reality for freely moving animals

November 14, 2018

John Stowers and colleagues from the Straw Lab at the University of Freiburg have developed and shared FreemoVR, a virtual reality setup for unrestrained animals.


Virtual reality (VR) systems can help mimic natural settings in behavioral paradigms, which helps us understand behavior and brain function. Typical VR systems require that animals be movement-restricted, which limits natural responses. The FreemoVR system was developed to address these issues and allows virtual reality to be integrated with freely moving behavior. The system can be used with a number of different species, including mice, zebrafish, and Drosophila. FreemoVR has been validated in tests of height aversion, social interaction, and visuomotor responses in unrestrained animals.

 

Read more on the Straw Lab site or in their Nature Methods paper, or access the software on GitHub.


Q&A with Dr. Mackenzie Mathis on her experience with developing DeepLabCut

August 22, 2018

Dr. Mackenzie Mathis, Principal Investigator of the Adaptive Motor Control Lab (Rowland Institute at Harvard University), has shared the following responses to a short Q&A about the inspiration behind, development of, and sharing of DeepLabCut, a toolbox for animal tracking using deep learning.


What inspired you and your colleagues to create this toolbox as opposed to using previously developed commercial software?

Alexander Mathis and I both worked on behaviors where we wanted to track particular features, and they proved to be unreliably tracked with the methods we tried. Specifically, Alexander has an odor-guided navigation task that he works on in the lab of Prof. Venkatesh Murthy at Harvard, where the mice are placed in a very large “endless” paper trail and he inkjet prints odors for them to follow to get rewards (chocolate milk). The position of the snout is very important to measure accurately, so background subtraction or other heuristics didn’t work when the nose crossed the trail and when the droplet was right in front of the snout. I worked on a skilled joystick behavior for mice, and I wanted to track joints accurately and non-invasively – a challenging problem for little hands. So, we teamed up with Prof. Matthias Bethge at the University of Tuebingen to work on a new approach. He suggested we start looking into the rapidly advancing human pose estimation literature, and we looked at several before deciding to seriously benchmark DeeperCut, a top-performing algorithm on the large MPII dataset. Those authors did something very clever, namely, they used a deep neural network (ResNet) that was pre-trained on a large image set called ImageNet. This gives the ResNet a chance to learn natural scene statistics first. Remarkably, we found that we could use only a few frames to very accurately track the snout in the odor-guided navigation task, so we next tried videos from my joystick task, and to flex DeepLabCut’s muscles, we teamed up with Kevin Cury (who, like myself, is an alumnus of Prof. Nao Uchida’s group) to track fruit flies in the 3D chamber. After all this benchmarking, we built a toolbox that implements a complete pipeline to extract and label frames, train and evaluate the deep neural nets, as well as analyze new experimental videos. We call this toolbox DeepLabCut, as a nod to DeeperCut.

What was the motivation for immediately sharing your work as an open source tool, thus making it accessible to the broader neuroscience community?

Some of the options we first tried to track with were very expensive commercial systems, and they failed quite badly. On the other hand, deep learning has revolutionized computer vision in the last few years, so we were eager to try some new approaches to solve the problem. So, in addition to being advocates of open science, we really wanted to make a toolbox that someone with minimal to no coding experience could, absolutely for free, track whatever they wanted.

We also know peer review can be slow, so as soon as we had the toolbox in place, we wrote up the arXiv paper and released the code base immediately. Honestly, it has been one of my most rewarding papers – the feedback from our peers, and seeing what people have used the code for, has been a very rewarding experience. This was my first preprint, and especially for methods manuscripts, I now cannot imagine another way to share our future work too.

How do you think open source tools, such as yours, will continue to impact the progress of scientific research?

Open source code and preprints have been the norm in some fields for decades (such as math and physics), and I am really excited to see them come of age in biology and neuroscience. I am excited to see how tools will continue to improve as the community gets behind them, just as we could build on DeeperCut, which was open source. Also, at least in my experience, many individuals write their own code, which leads to a lot of duplicated effort. Moreover, datasets are becoming increasingly complicated, and code to work with such data needs to be robust and shared. My expectation is that open source code will become the norm in the future, which can only help science become more robust.

Even before formal publication this week (see Nature Neuroscience), we estimate that about 100 labs are actively using DeepLabCut, so releasing the code before publication, we hope, has really allowed for rapid progress to be made. We were also very happy that The Atlantic could highlight some of the early adopters, as it’s one thing to say you made something, but it’s another to hear others saying it is actually ‘something.’


DeepLabCut provides an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. Read more on the website, or in Nature Neuroscience.
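For readers who want to try it, here is a minimal sketch of the pipeline described above using the DeepLabCut Python toolbox. Function names follow the released toolbox but may differ across versions, and the project name and video paths are placeholders, so check the DeepLabCut documentation for the version you install.

```python
# Sketch of the DeepLabCut workflow: create a project, label a few frames,
# train and evaluate the network, then analyze new videos.
import deeplabcut

config = deeplabcut.create_new_project(
    "reach-task", "experimenter", ["/data/videos/session1.mp4"], copy_videos=False
)
deeplabcut.extract_frames(config)           # select a small set of frames to label
deeplabcut.label_frames(config)             # GUI for manually labeling body parts
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)            # fine-tunes the ImageNet-pretrained ResNet
deeplabcut.evaluate_network(config)
deeplabcut.analyze_videos(config, ["/data/videos/session2.mp4"])
```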

 


An open-source lickometer and microstructure analysis program

August 8, 2018

In HardwareX, an open-access journal for designing, building, and customizing open-source scientific hardware, Martin A. Raymond and colleagues share their design for a user-constructed, low-cost lickometer.


Researchers interested in the ingestive behaviors of rodents commonly use licking as a readout for the amount of fluid a subject consumes, as recorded by a lickometer. Commercially available lickometers are powerful tools to measure this behavior, but they can be expensive and often require further customization. The authors offer their own design for an open-source lickometer that uses readily available or customizable components, such as a PC sound card and a 3D-printed drinking bottle holder. Data from the device are collected with Audacity, an open-source audio program, and then converted to .csv format for analysis with an R script made available by the authors to assess various features of licking microstructure. A full bill of materials, assembly instructions, and links to design files are available in the paper.
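The authors provide an R script for this analysis; purely as an illustrative alternative, the Python sketch below computes a few common microstructure measures (inter-lick intervals and bursts) from lick timestamps, using a hypothetical 1 s pause criterion to define bursts.

```python
import numpy as np

def microstructure(lick_times_s, burst_pause_s=1.0):
    """Basic licking microstructure from lick timestamps (seconds).

    A burst is a run of licks separated by less than `burst_pause_s`
    (a common, but adjustable, pause criterion).
    """
    t = np.sort(np.asarray(lick_times_s, dtype=float))
    ili = np.diff(t)                       # inter-lick intervals, s
    n_bursts = 1 + int((ili >= burst_pause_s).sum()) if t.size else 0
    return {
        "n_licks": int(t.size),
        "mean_ili_s": float(ili.mean()) if ili.size else None,
        "n_bursts": n_bursts,
        "mean_licks_per_burst": t.size / n_bursts if n_bursts else 0.0,
    }

# Hypothetical timestamps exported from the .csv
print(microstructure([0.10, 0.25, 0.41, 0.58, 2.30, 2.46, 2.62]))
```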

Check out the full publication here!


Raymond, M. A., Mast, T. G., & Breza, J. M. (2018). An open-source lickometer and microstructure analysis program. HardwareX, 4. doi:10.1016/j.ohx.2018.e00035