Tag: rodent

Touchscreen Cognition and MouseBytes

November 21, 2019

Tim Bussey and Lisa Saksida from Western University and the BrainsCAN group developed touchscreen-based chambers that can be used to measure rodent behavior. While the touchscreens themselves are not open-source devices, we appreciate the group's open-science push: building a user community, running workshops and tutorials, and sharing data. Most notably, their sister project, MouseBytes, is an open-access database for cognitive data collected from touchscreen-based tasks.


Touchscreen History:

In an effort to develop a cognitive testing method for rodents that would parallel touchscreen-based testing in humans, Bussey et al. (1994, 1997a,b) developed a touchscreen apparatus for rats, which was subsequently adapted for mice as well. In short, the touchscreens allow computer-generated stimuli to be presented to a rodent, and the rodent makes choices in a task based on which stimuli appear. The group published a “tutorial” paper detailing the behavior and the training methods needed to get rats to perform optimally on these devices (Bussey et al., 2008). Additionally, in 2013, the group published three separate Nature Protocols articles detailing how to use the touchscreens in tasks assessing learning and memory (Horner et al., 2013), executive function (Mar et al., 2013), and working memory and pattern separation (Oomen et al., 2013).

Most recently, the group has launched https://touchscreencognition.org/, a hub for user forums, discussion, training information, and more. The group also runs live training sessions for anyone interested in using touchscreens in their research, and their Twitter account, @TouchScreenCog, highlights recent trainings. Because the tests are automated and standardized, data can be compared across labs and tasks.


MouseBytes:

Additionally, MouseBytes is an open-access database where scientists can upload their data or analyze data already collected by other groups. Not only does this reduce experimental redundancy, but it also promotes transparency and reproducibility in the community. The site offers data comparison and interactive visualization for uploaded datasets, along with guidelines and video tutorials.


Nature Protocols Tutorials:

Horner, A. E., Heath, C. J., Hvoslef-Eide, M., Kent, B. A., Kim, C. H., Nilsson, S. R., … & Bussey, T. J. (2013). The touchscreen operant platform for testing learning and memory in rats and mice. Nature Protocols, 8(10), 1961.

Mar, A. C., Horner, A. E., Nilsson, S. R., Alsiö, J., Kent, B. A., Kim, C. H., … & Bussey, T. J. (2013). The touchscreen operant platform for assessing executive function in rats and mice. Nature Protocols, 8(10), 1985.

Oomen, C. A., Hvoslef-Eide, M., Heath, C. J., Mar, A. C., Horner, A. E., Bussey, T. J., & Saksida, L. M. (2013). The touchscreen operant platform for testing working memory and pattern separation in rats and mice. Nature Protocols, 8(10), 2006.

Original Touchscreen Articles:

Bussey, T. J., Muir, J. L., & Robbins, T. W. (1994). A novel automated touchscreen procedure for assessing learning in the rat using computer graphic stimuli. Neuroscience Research Communications, 15(2), 103-110.

Bussey, T. J., Padain, T. L., Skillings, E. A., Winters, B. D., Morton, A. J., & Saksida, L. M. (2008). The touchscreen cognitive testing method for rodents: how to get the best out of your rat. Learning & Memory, 15(7), 516-523.


You can buy the touchscreens here.


Editor’s Note: We understand that Nature Protocols is not an open-access journal and that the touchscreens must be purchased from a commercial company, so they are not technically open-source. However, we appreciate the group’s ongoing efforts to streamline data across labs, put on training workshops, and provide an open-access repository for this type of data.

Updates on LocoWhisk and ART

October 3, 2019

Dr Robyn Grant from Manchester Metropolitan University in Manchester, UK has shared her group’s most recent project, LocoWhisk, a hardware and software solution for measuring rodent exploratory, sensory and motor behaviours:


In describing the project, Dr Grant writes, “Previous studies from our lab have shown that analysing whisker movements and locomotion allows us to quantify the behavioural consequences of sensory, motor and cognitive deficits in rodents. Independent whisker and foot trackers existed, but there was no fully-automated, open-source software and hardware solution that could measure both whisker movements and gait.

We developed the LocoWhisk arena and new accompanying software that allow the automatic detection and measurement of both whisker and gait information from high-speed video footage. The arena can easily be made from low-cost materials; it is portable and incorporates both gait analysis (using a pedobarograph) and whisker analysis (using a high-speed video camera and an infrared light source).

The software, ARTv2, is freely available and open source. ARTv2 is fully automated and was developed from our previous ART software (Automated Rodent Tracker).

ARTv2 contains new whisker and foot detection algorithms. On high-speed video footage of freely moving small mammals (including rat, mouse and opossum), we have found that ARTv2 is comparable in accuracy to, and in some cases significantly better than, readily available software and manual trackers.

The LocoWhisk system enables the collection of quantitative data from whisker movements and locomotion in freely behaving rodents. The software automatically records both whisker and gait information and provides added statistical tools to analyse the data. We hope the LocoWhisk system and software will serve as a solid foundation from which to support future research in whisker and gait analysis.”
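To give a feel for the kinds of measures this style of whisker tracking produces, here is a minimal sketch that summarizes whisking from per-frame whisker angles. The file name, single-column CSV layout, and frame rate are assumptions for illustration, not ARTv2's actual export format:

```python
# Minimal sketch: summarize whisking from per-frame mean whisker angles.
# Assumes a hypothetical one-column CSV of angles in degrees, one row per
# high-speed video frame; see the ARTv2 documentation for real outputs.
import numpy as np

fps = 500.0                                # hypothetical camera frame rate
angles = np.loadtxt("whisker_angles.csv")  # degrees, one value per frame

amplitude = angles.max() - angles.min()           # whisk amplitude, degrees
angular_velocity = np.abs(np.diff(angles)) * fps  # deg/s between frames
print(f"amplitude: {amplitude:.1f} deg, "
      f"mean angular velocity: {angular_velocity.mean():.1f} deg/s")
```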

For more details on the ARTv2 software, check out the GitHub page here.

Check out the paper that describes LocoWhisk and ARTv2, which has recently been published in the Journal of Neuroscience Methods.

LocoWhisk was initially shared and developed through the NC3Rs CRACK IT website here.


RAD

August 1, 2019

In their recent eNeuro article, Bridget Matikainen-Ankney and colleagues from the Kravitz Lab have developed and shared the Rodent Activity Detector (RAD), a low-cost system that can track and record activity in rodent home cages.


Physical activity is a common measure in many research studies and an important determinant of health. Current methods for measuring physical activity in laboratory rodents have limitations, including high expense, specialized caging/equipment, and high computational overhead. To address these limitations, Matikainen-Ankney et al. designed an open-source, cost-effective device for measuring rodent activity.

In their new manuscript, they describe the design and implementation of RAD. The system allows for high-throughput installation, minimal investigator intervention, and circadian monitoring. The design includes a battery-powered passive infrared (PIR) sensor, a microcontroller, a microSD card logger, and an OLED screen for displaying data. All of the build instructions for RAD manufacture and programming, including the Arduino code, are provided on the project’s website.

The system records the number of PIR active bouts and the total duration the PIR is active each minute. The authors report that RAD is most useful for quantifying changes across minutes rather than on a second-to-second timescale, so the default logging frequency is set to one minute. The resulting CSV files can be viewed and visualized using the provided Python scripts. Validation against video monitoring showed that PIR data correlated strongly with movement speed and that the device captured place-to-place locomotion but not slow or in-place movements. To verify the device’s utility, RAD was used to collect physical activity data from 40 animals for 10 weeks; it detected high-fat-diet (HFD)-induced changes in activity and quantified individual animals’ circadian rhythms. Major advantages of this tool are that the PIR sensor is not triggered by activity in other cages, that it can detect within-mouse activity changes over time, and that little investigator intervention is needed beyond infrequent battery replacement. Although the design was optimized for the lab’s specific caging, the open-source nature of the project makes it easy to modify.
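As an illustration of working with this kind of per-minute activity log, here is a minimal sketch that builds an hourly circadian profile. The file name and column names are hypothetical; the actual format is documented alongside the project's own scripts:

```python
# Minimal sketch: summarize per-minute activity logs into an hourly profile.
# Assumes a hypothetical CSV with columns "minute" (minutes since midnight)
# and "active_seconds"; see the project's own Python scripts for real files.
import pandas as pd
import matplotlib.pyplot as plt

log = pd.read_csv("rad_log.csv")          # one row per logged minute
log["hour"] = (log["minute"] // 60) % 24  # hour of day for each row
hourly = log.groupby("hour")["active_seconds"].mean()

hourly.plot(kind="bar", xlabel="Hour of day", ylabel="Mean active s/min",
            title="Circadian activity profile")
plt.tight_layout()
plt.show()
```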

More details on RAD can be found in their eNeuro manuscript here, and all documentation can also be found on the project’s Hackaday.io page.


Matikainen-Ankney, B. A., Garmendia-Cedillos, M., Ali, M., Krynitsky, J., Salem, G., Miyazaki, N. L., … Kravitz, A. V. (2019). Rodent Activity Detector (RAD), an Open Source Device for Measuring Activity in Rodent Home Cages. eNeuro, 6(4). https://doi.org/10.1523/ENEURO.0160-19.2019

DeepBehavior

June 20, 2019

Ahmet Arac from Peyman Golshani’s lab at UCLA recently developed DeepBehavior, a deep-learning toolbox with post-processing methods for video analysis of behavior:


Recently, there has been a major push for more fine-grained and detailed behavioral analysis in neuroscience. While methods exist for capturing high-speed, high-quality video of behavior, the data still need to be processed and analyzed. DeepBehavior is a deep-learning toolbox that automates this process; its main purpose is to track and analyze behavior in rodents and humans.

The authors provide three convolutional neural network models (TensorBox, YOLOv3, and OpenPose), chosen for their ease of use; the user can decide which model to implement based on the experiment and the kind of data they aim to collect and analyze. The article provides methods and tips for training neural networks on this type of data, along with methods for post-processing the resulting image data.

In the manuscript, the authors give examples of using DeepBehavior in five behavioral tasks in animals and humans. For rodents, they use a food-pellet reaching task, a three-chamber test, and social interaction between two mice. For humans, they use a reaching task and a supination/pronation task. They provide 3D kinematic analysis in all tasks and show that a transfer-learning approach accelerates network training when images from the behavior videos are used. A major benefit of this tool is that it can be modified and generalized across behaviors, tasks, and species. Additionally, DeepBehavior uses several different neural network architectures and, uniquely, provides post-processing methods for 3D kinematic analysis, which separates it from previously published toolboxes for video-based behavioral analysis. Finally, the authors emphasize the toolbox’s potential for analyzing human motor function in clinical settings.
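As a flavor of the post-processing step, here is a minimal sketch that turns tracked coordinates into a simple kinematic measure. The file name, array shape, and frame rate are assumptions for illustration; this is not DeepBehavior's own API:

```python
# Minimal sketch: compute the speed of one tracked point (e.g., a paw) from
# per-frame x, y coordinates produced by a pose-tracking network. The file
# name, array shape, and frame rate below are hypothetical placeholders.
import numpy as np

fps = 100.0                        # hypothetical video frame rate
xy = np.load("paw_positions.npy")  # hypothetical array, shape (n_frames, 2)

displacement = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # px per frame
speed = displacement * fps                                  # px per second
print(f"mean speed: {speed.mean():.1f} px/s, peak: {speed.max():.1f} px/s")
```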


For more details, take a look at their project’s GitHub.

All three models used in the paper also have their own GitHub repositories: TensorBox, YOLOv3, and OpenPose.


Arac, A., Zhao, P., Dobkin, B. H., Carmichael, S. T., & Golshani, P. (2019). DeepBehavior: A deep learning toolbox for automated analysis of animal and human behavior imaging data. Frontiers in Systems Neuroscience, 13.


Automated classification of self-grooming in mice

May 16, 2019

In the Journal of Neuroscience Methods, Bastijn van den Boom and colleagues have shared their ‘how-to’ instructions for implementing behavioral classification with JAABA, featuring bonsai and motr!


In honor of our 100th post on OpenBehavior, we wanted to feature a project that exemplifies how multiple open-source projects can be combined to address a common theme in behavioral neuroscience: tracking and classifying complex behaviors! The protocol from van den Boom et al. implements JAABA, an open-source machine-learning-based behavior detection system; motr, open-source mouse trajectory tracking software; and bonsai, an open-source system for streaming and recording video. Together, these tools are used to process videos of mice performing grooming behaviors in a variety of behavioral setups.

They then compare multiple tools for analyzing grooming behavior sequences in both wild-type mice and genetic knockout mice with a tendency to over-groom. The JAABA-trained classifier outperforms commercially available behavior analysis software and aligns more closely with manual scoring by expert observers. This offers a novel, cost-effective, easy-to-use method for assessing grooming behavior in mice that is comparable to an expert observer, with the added advantage of being automatic. Instructions for training your own JAABA classifier can be found in their paper!
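To illustrate how a trained classifier can be benchmarked against an expert observer, here is a minimal sketch computing frame-by-frame precision, recall, and F1. The file names and label format are hypothetical, not JAABA's own outputs:

```python
# Minimal sketch: frame-by-frame agreement between an automatic grooming
# classifier and an expert observer. Assumes two hypothetical boolean arrays
# with one entry per video frame (True = grooming).
import numpy as np

classifier = np.load("classifier_labels.npy").astype(bool)  # hypothetical
expert = np.load("expert_labels.npy").astype(bool)          # hypothetical

tp = np.sum(classifier & expert)   # frames both call grooming
fp = np.sum(classifier & ~expert)  # classifier-only frames
fn = np.sum(~classifier & expert)  # expert-only frames

precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(f"precision={precision:.2f} recall={recall:.2f} F1={f1:.2f}")
```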

Read more in their publication here!


Phenopy

April 17, 2019

In a recent Nature Protocols article, Edoardo Balzani and colleagues from Valter Tucci’s lab have developed and shared Phenopy, a Python-based open-source analytical platform for behavioral phenotyping.


Behavioral phenotyping of mice using classic methods can be a long process and is susceptible to high variability, leading to inconsistent results. To reduce variance and speed up the process of behavioral analysis, Balzani et al. developed Phenopy, an open-source software platform for recording and analyzing behavioral data for phenotyping. The software allows recording of behavioral-task events in combination with electrophysiology data, and it can perform online analysis as well as large-scale analysis of recorded data, all within a user-friendly interface. Information about the software is available in their publication in Nature Protocols.*

Check out the full article from Nature Protocols!


(*alternatively available on ResearchGate)

Telemetry System for Recording EEG

March 29, 2019

In a 2011 Journal of Neuroscience Methods article, Pishan Chang and colleagues shared their design for a novel open-source telemetry system for recording EEG in small animals.


EEG monitoring in freely behaving small animals is a useful technique for observing natural fluctuations in neural activity over time, but continuously monitoring frequencies above 80 Hz over a period of weeks can be a challenge. Chang et al. have shared their design for a system that combines an implantable telemetric sensor, radio-frequency transmission, and open-source data acquisition software to collect EEG data for up to 8 weeks. Various modifications to the system have increased the longevity of the device and reduced transmission noise, providing continuous and reliable data. Schematics of the device and transmission system, along with validation results in a population of epileptic rodents, are available in their publication.
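For readers curious what analyzing such a recording might look like, here is a minimal sketch that inspects the >80 Hz content of an EEG trace with SciPy. The sampling rate, file name, and format are assumptions; the actual acquisition format is described in the paper:

```python
# Minimal sketch: high-frequency (>80 Hz) power over time from a single EEG
# channel. The sampling rate and file below are hypothetical placeholders.
import numpy as np
from scipy import signal

fs = 500.0                        # hypothetical sampling rate, Hz
eeg = np.load("eeg_channel.npy")  # hypothetical 1-D voltage trace

f, t, Sxx = signal.spectrogram(eeg, fs=fs, nperseg=1024)
hf_power = Sxx[f > 80].sum(axis=0)  # summed power above 80 Hz per time bin
print(f"mean >80 Hz power: {hf_power.mean():.3e} (arbitrary units)")
```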


Learn more from the Journal of Neuroscience Methods!


Actifield

March 21, 2019

Victor Wumbor-Apin Kumbol and colleagues have developed and shared Actifield, an automated open-source actimeter for rodents, in a recent HardwareX publication.


Measuring locomotor activity can be a useful readout for understanding the effects of a number of experimental manipulations in neuroscience research. Commercially available locomotor activity recording devices can be cost-prohibitive and often cannot be customized to fit a specific lab’s needs. Kumbol et al. offer an open-source alternative that uses infrared motion detection and an Arduino to record activity in a variety of chamber setups. A full list of build materials, links to 3D-print and laser-cut files, and assembly instructions are available in their publication.
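As a sketch of how raw infrared motion events become an activity readout, the snippet below bins event timestamps into counts per minute. The log format here, a hypothetical one-timestamp-per-line text file, is for illustration and is not Actifield's actual output:

```python
# Minimal sketch: convert a log of infrared motion-event timestamps into
# activity counts per minute. Assumes a hypothetical text file with one
# event time (seconds since session start) per line.
import numpy as np

events = np.loadtxt("motion_events.txt")     # event times in seconds
edges = np.arange(0, events.max() + 60, 60)  # one-minute bin edges
counts, _ = np.histogram(events, bins=edges)

print(f"total events: {events.size}, peak rate: {counts.max()} events/min")
```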

Read more from HardwareX!


idtracker.ai

February 20, 2019

Francisco Romero Ferrero and colleagues have developed idtracker.ai, an algorithm and software for tracking individuals in large collectives of unmarked animals, recently described in Nature Methods.


Tracking individual animals in large collective groups can give interesting insights into behavior but has proven challenging for analysis. With advances in artificial intelligence and tracking software, it has become increasingly easy to extract such information from video data. Romero Ferrero et al. have developed an algorithm and tracking software built around two deep networks: the first identifies individual animals, and the second detects when animals touch or cross paths in front of one another. The software has been validated to track individuals with high accuracy in collectives of up to 100 animals across diverse species, from rodents to zebrafish to ants. It is free, fully documented, and available online, with additional Jupyter notebooks for data analysis.
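As an example of what downstream analysis of such tracking output can look like, here is a minimal sketch computing per-individual speeds from a trajectories array. The file name, array shape, and frame rate are assumptions for illustration; consult the idtracker.ai documentation for its real output format:

```python
# Minimal sketch: per-individual mean speeds from tracked trajectories.
# Assumes a hypothetical array of shape (n_frames, n_individuals, 2) holding
# x, y positions, with NaNs marking frames where an animal was occluded.
import numpy as np

fps = 30.0                          # hypothetical video frame rate
traj = np.load("trajectories.npy")  # hypothetical (frames, individuals, 2)

step = np.diff(traj, axis=0)                # per-frame displacement vectors
speed = np.linalg.norm(step, axis=2) * fps  # px/s, (frames-1, individuals)
for i, s in enumerate(np.nanmean(speed, axis=0)):
    print(f"individual {i}: mean speed {s:.1f} px/s")
```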

Check out their website with full documentation, the recent Nature Methods article, the bioRxiv preprint, and a great video of idtracker.ai tracking 100 zebrafish!


Dual-port Lick Detector

January 16, 2019

In the Journal of Neurophysiology, Brice Williams and colleagues have shared their design for a novel dual-port lick detector. This device can be used for both real-time measurement and manipulation of licking behavior in head-fixed mice.


Measuring licking behavior in mice provides a valuable metric of sensorimotor processing and pairs nicely with simultaneous neural recordings. Williams and colleagues have developed a device for precisely measuring licking behavior and for manipulating it in real time. To address the limitations of many available lick sensors, the authors designed their device to be small (appropriate for mice), contactless (to diminish electrical artifacts in neural recordings), and precise to a submillisecond timescale. The dual-port detector can measure directional licking behavior during sensory tasks and can be used in combination with neural recording. Further, given its submillisecond precision, it can be used in a closed-loop system to perturb licking behaviors via neural inhibition. Overall, this dual-port lick detector is a cost-effective, replicable solution that can be used in a variety of applications.
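To make the detection logic concrete, here is a minimal sketch of threshold-plus-refractory lick detection on digitized sensor traces, followed by a simple side-preference measure. The sampling rate, threshold, and file names are hypothetical, not the authors' implementation:

```python
# Minimal sketch: detect lick events from a digitized sensor trace using an
# upward threshold crossing plus a refractory period, then compare ports.
# All names, rates, and thresholds below are hypothetical placeholders.
import numpy as np

FS = 10000.0  # hypothetical sampling rate, Hz

def detect_licks(trace, threshold=0.5, refractory_s=0.02):
    """Indices of upward threshold crossings, ignoring crossings that fall
    within the refractory window of the previous accepted event."""
    crossings = np.flatnonzero((trace[1:] >= threshold) & (trace[:-1] < threshold))
    events, last = [], -np.inf
    for idx in crossings:
        if idx - last >= refractory_s * FS:
            events.append(idx)
            last = idx
    return np.asarray(events)

left = detect_licks(np.load("left_port.npy"))   # hypothetical traces
right = detect_licks(np.load("right_port.npy"))
total = len(left) + len(right)
print(f"left: {len(left)}, right: {len(right)}, "
      f"left preference: {len(left) / max(total, 1):.2f}")
```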

Learn how to build your own here!

And be sure to check out their Github.