Category: Behavior Tracking

Autopilot

DECEMBER 12, 2019

Jonny Saunders from Michael Wehr’s lab at the University of Oregon recently posted a preprint documenting their project Autopilot, a Python framework for running behavioral experiments:


Autopilot is a Python framework for behavioral experiments built on Raspberry Pi single-board computers. Autopilot incorporates all aspects of an experiment, including the hardware, stimuli, behavioral task paradigm, data management, data visualization, and a user interface. The authors propose that Autopilot is the fastest, least expensive, and most flexible behavioral system currently available.

The benefit of using Autopilot is its experimental flexibility, which lets researchers optimize it for their specific experimental needs. Additionally, this project exemplifies how useful a Raspberry Pi can be for running experiments and recording data. The preprint discusses many benefits of Raspberry Pis, including their speed, precision, and proper data logging, and they only cost $35 (!!). Ultimately, the authors developed Autopilot in an effort to encourage users to write reusable, portable experiments that can be deposited in a public central library to promote replication and reproducibility.
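To give a flavor of what writing a reusable experiment in a framework like this looks like, here is a hypothetical sketch of a task definition. The class layout, attribute names, and GPIO pins are illustrative assumptions, not Autopilot’s actual API; see the documentation linked below for the real thing.

```python
# Hypothetical sketch of a task definition in a framework like Autopilot.
# HARDWARE, PARAMS, and the stage methods are illustrative assumptions only;
# consult the Autopilot docs for the real API.
class TwoAlternativeChoice:
    # hardware the task needs, mapped to Raspberry Pi GPIO pins (assumed)
    HARDWARE = {
        "left_poke": "GPIO_17",
        "right_poke": "GPIO_27",
        "reward_solenoid": "GPIO_22",
    }
    # parameters the experimenter can set from the user interface (assumed)
    PARAMS = {"reward_ms": 50, "stim_frequency_hz": 10000}

    def __init__(self, reward_ms, stim_frequency_hz):
        self.reward_ms = reward_ms
        self.stim_frequency_hz = stim_frequency_hz
        # a trial is a fixed sequence of stages, run in a loop by the framework
        self.stages = [self.request, self.discrimination, self.reinforcement]

    def request(self):
        """Wait for the subject to initiate a trial."""

    def discrimination(self):
        """Present the stimulus and collect the left/right choice."""

    def reinforcement(self):
        """Deliver reward (or a timeout) and log the trial data."""
```

Because a task written this way is an ordinary Python class, it can be version-controlled and shared, which is the portability the authors are aiming for.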

 

For more information, check out their presentation or the Autopilot website here.

Additionally, documentation is here, along with a GitHub repo, and a link to their preprint is here.


Oat: online animal tracker

December 5, 2019

Jonathan Newman of the Wilson Lab at Massachusetts Institute of Technology has developed and shared a set of programs for processing video.


The only thing you’ll enjoy more than an oat milk latte from your favorite coffeeshop is the Oat collection of video processing components designed for use with Linux! Developed by Jonathan Newman of Open Ephys, this set of programs is useful for processing video, extracting object position information, and streaming data. While it was designed for real-time animal position tracking, it can be used in any application that requires real-time object tracking. The individual Oat components each expose a standard interface and can be chained together to create complex dataflow networks for capturing, processing, and recording video streams.
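Because each component attaches to named data streams, a dataflow network is just several small programs launched side by side, much like a Unix pipeline. The sketch below illustrates the idea from Python; the `oat` subcommand names and flags are assumptions for illustration, so check the README on GitHub for the exact syntax.

```python
# Illustrative sketch of Oat's chained-component design: each program reads
# and/or writes named streams ("raw", "pos"), so a capture -> detect -> record
# network is three cooperating processes. Subcommands/flags are assumptions;
# see the Oat README for the real CLI.
import subprocess

chain = [
    "oat frameserve wcam raw",         # webcam -> frame stream "raw"
    "oat posidet hsv raw pos",         # color-based detector: "raw" -> "pos"
    "oat record -i raw -p pos -f ./",  # sink: record frames and positions
]
procs = [subprocess.Popen(cmd.split()) for cmd in chain]
for p in procs:
    p.wait()  # run until the components shut down
```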

Read more about Oat on the Open Ephys website, or check out more on Github!


https://open-ephys.org/oat

Touchscreen Cognition and MouseBytes

NOVEMBER 21, 2019

Tim Bussey and Lisa Saksida from Western University and the BrainsCAN group developed touchscreen-based chambers that can be used to measure rodent behavior. While the touchscreens themselves are not an open-source device, we appreciate the open-science push toward creating a user community, running workshops and tutorials, and sharing data. Most notably, their sister project, MouseBytes, is an open-access database for all cognitive data collected from the touchscreen-related tasks:


Touchscreen History:

In an effort to develop a cognitive testing method for rodents that parallels touchscreen testing in humans, Bussey et al. (1994, 1997a,b) developed a touchscreen apparatus for rats, which was subsequently adapted for mice. In short, the touchscreens allow computer-generated graphics to be presented to a rodent, and the rodent makes choices in the task based on which stimuli appear. The group published a “tutorial” paper detailing the behavior and the training methods needed to get rats performing optimally with these devices (Bussey et al., 2008). Additionally, in 2013, the group published three separate Nature Protocols articles detailing how to use the touchscreens in tasks assessing executive function, learning and memory, and working memory and pattern separation in rodents (Horner et al., 2013; Mar et al., 2013; Oomen et al., 2013).

Most recently, the group has developed https://touchscreencognition.org/, a place for user forums, discussion, training information, and more. The group also runs live training sessions for anyone interested in using touchscreens in their tasks, and their Twitter account, @TouchScreenCog, highlights recent trainings. Because the tests are automated and standardized, data can be compared across labs and tasks.


MouseBytes:

Additionally, MouseBytes is an open-access database where scientists can upload their data or analyze data already collected by other groups. Not only does this reduce redundant experiments, but it also promotes transparency and reproducibility in the community. The site performs data comparison and interactive data visualization for any uploaded dataset, and offers guidelines and video tutorials as well.


Nature Protocols Tutorials:

Horner, A. E., Heath, C. J., Hvoslef-Eide, M., Kent, B. A., Kim, C. H., Nilsson, S. R., … & Bussey, T. J. (2013). The touchscreen operant platform for testing learning and memory in rats and mice. Nature Protocols, 8(10), 1961.

Mar, A. C., Horner, A. E., Nilsson, S. R., Alsiö, J., Kent, B. A., Kim, C. H., … & Bussey, T. J. (2013). The touchscreen operant platform for assessing executive function in rats and mice. Nature Protocols, 8(10), 1985.

Oomen, C. A., Hvoslef-Eide, M., Heath, C. J., Mar, A. C., Horner, A. E., Bussey, T. J., & Saksida, L. M. (2013). The touchscreen operant platform for testing working memory and pattern separation in rats and mice. Nature Protocols, 8(10), 2006.

Original Touchscreen Articles:

Bussey, T. J., Muir, J. L., & Robbins, T. W. (1994). A novel automated touchscreen procedure for assessing learning in the rat using computer graphic stimuli. Neuroscience Research Communications, 15(2), 103-110.

Bussey, T. J., Padain, T. L., Skillings, E. A., Winters, B. D., Morton, A. J., & Saksida, L. M. (2008). The touchscreen cognitive testing method for rodents: how to get the best out of your rat. Learning & Memory, 15(7), 516-523.

 

You can buy the touchscreens here.

 

Editor’s Note: We understand that Nature Protocols is not an open-access journal and that the touchscreens must be purchased from a commercial company and are not technically open-source. However, we appreciate the group’s ongoing effort to streamline data across labs, to put on training workshops, and to provide an open-access data repository for this type of data.

The OpenMV project: Machine Vision with Python

November 14, 2019

OpenMV – Better, Stronger, Faster and only $65 USD


Recent updates to the firmware for the OpenMV H7 camera have brought new functionality to this device, which is popular for open-source neuroscience projects (e.g., the Rodent Arena Tracker, or RAT: https://hackaday.io/project/162481-rodent-arena-tracker-rat). The new firmware allows use of the popular TensorFlow library for machine learning on this MicroPython-based device. It’s small (1.5 by 1.75 inches), draws at most 140 mA when processing data, has 1 MB of RAM and 2 MB of flash, and runs 64-bit computations at 480 MHz (3.84 GB/s bandwidth). OpenMV is capable of frame differencing, color tracking, marker tracking, face detection, eye tracking, person detection (with TensorFlow Lite), and more. The project ships with an easy-to-use GUI, the OpenMV IDE, which is intuitive and offers a number of ready-to-go example applications. Arduino users will feel right at home, despite the code being Python-based.
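Scripts for the camera are short MicroPython programs. The sketch below follows the blob-detection pattern from OpenMV’s documented sensor/image API to track a colored object, the kind of building block behind projects like RAT; the LAB color threshold is an illustrative value that would need tuning for a real arena.

```python
# Track a colored object on an OpenMV camera by LAB-threshold blob detection.
# The threshold tuple is illustrative and must be tuned for real lighting.
import sensor
import time

sensor.reset()                       # initialize the camera sensor
sensor.set_pixformat(sensor.RGB565)  # color images
sensor.set_framesize(sensor.QVGA)    # 320x240
sensor.skip_frames(time=2000)        # let auto-exposure settle

clock = time.clock()
# (L_min, L_max, A_min, A_max, B_min, B_max) threshold in LAB color space
RED_THRESHOLD = (30, 100, 15, 127, 15, 127)

while True:
    clock.tick()
    img = sensor.snapshot()
    # find_blobs returns connected regions whose colors fall in the threshold
    for blob in img.find_blobs([RED_THRESHOLD], pixels_threshold=100,
                               area_threshold=100, merge=True):
        img.draw_rectangle(blob.rect())
        img.draw_cross(blob.cx(), blob.cy())  # centroid = object position
    print(clock.fps())
```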

Check out the project here: https://openmv.io/.

An automated behavioral box to assess forelimb function in rats

November 7, 2019

Chelsea C. Wong and colleagues at the University of California – San Francisco have developed and shared a design for an open-source behavioral chamber for the measurement of forelimb function in rats.


Forelimb function (reaching, grasping, retrieving, etc.) is a common behavioral readout for studying the neural correlates of motor learning, neural plasticity, and recovery from injury. One task commonly used to study these behaviors, the Whishaw single-pellet reach-to-grasp task, traditionally requires an experimenter to manually present each pellet and to shape the rats’ behavior over multiple trials by placing a subsequent pellet only when the rat has relocated to the other end of the cage. Wong et al. developed an open-source, low-cost, automated, high-throughput version of this task. The behavioral apparatus, constructed from commercially available acrylic sheets, features a custom-built pellet dispenser, cameras and IR detectors for measuring the positions of the rat and the pellet, and an Arduino board to integrate information about the animal with dispensing of the pellet. Code for automating the task was built in MATLAB and includes a GUI for altering experiment parameters. Collected data can be analyzed using MATLAB, Excel, or most statistical programming languages. The authors provide example data from the device to highlight its potential for combining this reaching task with chronic electrophysiological recording techniques. The full design is available in their publication in the Journal of Neuroscience Methods.
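The automation boils down to a simple trial contingency: dispense the next pellet only after the rat has returned to the rear of the cage. The actual implementation is Arduino firmware plus a MATLAB GUI; the Python sketch below (with simulated beam-break inputs, all names hypothetical) just illustrates that logic.

```python
# Conceptual sketch of the trial contingency described above: a new pellet is
# dispensed only after the rat trips the IR beam at the rear of the cage.
# The real system is Arduino firmware + a MATLAB GUI; names are hypothetical,
# and the beam reads are simulated so the sketch runs standalone.
import random
import time

def beam_broken(beam):
    """Stand-in for polling an IR beam-break input; simulated with randomness."""
    time.sleep(random.uniform(0.05, 0.2))
    return random.random() < 0.3

def dispense_pellet():
    """Stand-in for driving the pellet dispenser."""
    print("pellet dispensed")

def run_session(n_trials=5):
    for trial in range(n_trials):
        # shaping rule: wait until the rat relocates to the rear of the cage...
        while not beam_broken("rear"):
            pass
        dispense_pellet()
        # ...then wait for it to return to the reaching slot at the front
        while not beam_broken("front"):
            pass
        print(f"trial {trial + 1}: rat at slot, reach window open")

if __name__ == "__main__":
    run_session()
```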

Check out the full publication here!


Wong, C. C., Ramanathan, D. S., Gulati, T., Won, S. J., & Ganguly, K. (2015). An automated behavioral box to assess forelimb function in rats. Journal of Neuroscience Methods, 246, 30–37. doi: 10.1016/j.jneumeth.2015.03.008

Automated Home-Cage Rodent Two-bottle Choice Test: open-source success story

October 31, 2019

Elizabeth Godynyuk and colleagues from the Creed Lab at Washington University in St. Louis recently published their design for a two-bottle choice homecage apparatus in eNeuro. It incorporates the original design (published on Hackaday.io in May 2018), modifications from Jude Frie and Jibran Khokhar (Frie & Khokhar, 2019), and additional improvements made over the course of its use. This project is a great example of collaborative open-source tool development.


Studies of liquid ingestive behaviors are used in neuroscience to investigate reward-related behavior, metabolism, and circadian biology. Accurate measurement of these behaviors is needed when studying drug administration, quantifying preference between two substances, and measuring caloric intake. To measure consummatory behavior in mice choosing between two liquids, members of the Creed lab designed a low-cost, Arduino-based device that automatically measures consumption in a homecage two-bottle choice test. Posted to Hackaday in May 2018, the initial version of the device used photointerrupters to measure time at each sipper, 15 mL conical tubes for volumetric measurements of fluid, and a 3D-printed holder for the apparatus. Data from the photobeams are recorded to an SD card using a standard Arduino. In August 2018, the project was updated to Version 2, which is battery-powered and includes a screen to display data. They made the editable TinkerCAD design available on hackaday.io.
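With bout times logged per sipper, a preference score takes only a few lines to compute. The sketch below assumes a simple CSV with one row per bout and hypothetical column names; the device’s actual log format is documented on the project pages.

```python
# Minimal sketch: compute a left-bottle preference score from logged sipper
# bouts. The CSV layout and column names ("sipper", "duration_s") are
# assumptions for illustration; see the project's docs for the real format.
import csv

def preference(csv_path):
    left = right = 0.0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["sipper"] == "left":
                left += float(row["duration_s"])
            else:
                right += float(row["duration_s"])
    total = left + right
    return left / total if total else float("nan")  # 0.5 = no preference

print(f"left-bottle preference: {preference('sipper_log.csv'):.2f}")
```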

In October 2018, Dr. Jibran Khokhar and colleagues at the University of Guelph posted a project log highlighting modifications that make the device larger and suitable for studying liquid intake in rats. This updated design was published in April 2019 in HardwareX. Their device adds the ability to analyze drinking microstructure by recording licking behavior and volume consumed in real time; modifications include larger liquid reservoirs and a hydrostatic depth sensor, allowing each bout of drinking to be matched to a specific change in volume.

Most recently, Elizabeth Godynyuk and colleagues from the Creed lab have shared their own updated version of the device in eNeuro. It remains low-cost and open-source, and results validating the device with preference testing are shared. Furthermore, the authors show that the two-bottle choice apparatus can be integrated with a fiber photometry system. In the eNeuro article, Godynyuk et al. cite Frie and Khokhar’s modifications to highlight how the design can easily be adjusted to fit investigator needs.

These projects show how open-source designs can be modified and how different groups can collaborate to improve them. Rather than shaping their research questions around the commercial tools available, research groups can adapt open-source designs to best address the questions they want to ask.

Creed Lab Version 1: https://hackaday.io/project/158279-automated-mouse-homecage-two-bottle-choice-test

Creed Lab Version 2: https://hackaday.io/project/160388-automated-mouse-homecage-two-bottle-choice-test-v2

Frie and Khokhar 2019 (HardwareX): https://www.sciencedirect.com/science/article/pii/S2468067219300045#b0005

Godynyuk et al. 2019 (eNeuro): https://www.eneuro.org/content/6/5/ENEURO.0292-19.2019.long


Frie, J. A., & Khokhar, J. Y. (2019). An open source automated two-bottle choice test apparatus for rats. HardwareX, 5, e00061. https://doi.org/10.1016/j.ohx.2019.e00061

Godynyuk, E., Bluitt, M. N., Tooley, J. R., Kravitz, A. V., & Creed, M. C. (2019). An Open-Source, Automated Home-Cage Sipper Device for Monitoring Liquid Ingestive Behavior in Rodents. eNeuro, 6(5), ENEURO.0292-19.2019. https://doi.org/10.1523/ENEURO.0292-19.2019

Updates on LocoWhisk and ART

OCTOBER 3, 2019

Dr Robyn Grant from Manchester Metropolitan University in Manchester, UK, has shared her group’s most recent project, LocoWhisk, a hardware and software solution for measuring rodent exploratory, sensory and motor behaviours:


In describing the project, Dr Grant writes, “Previous studies from our lab have shown that analysing whisker movements and locomotion allows us to quantify the behavioural consequences of sensory, motor and cognitive deficits in rodents. Independent whisker and feet trackers existed, but there was no fully-automated, open-source software and hardware solution that could measure both whisker movements and gait.

We developed the LocoWhisk arena and new accompanying software that allow the automatic detection and measurement of both whisker and gait information from high-speed video footage. The arena can easily be made from low-cost materials; it is portable and incorporates both gait analysis (using a pedobarograph) and whisker movement analysis (using a high-speed video camera and infrared light source).

The software, ARTv2, is freely available and open source. ARTv2 is also fully automated and was developed from our previous ART software (Automated Rodent Tracker).

ARTv2 contains new whisker and foot detector algorithms. On high-speed video footage of freely moving small mammals (including rat, mouse and opossum), we have found that ARTv2 is comparable in accuracy to, and in some cases significantly better than, readily available software and manual trackers.

The LocoWhisk system enables the collection of quantitative data from whisker movements and locomotion in freely behaving rodents. The software automatically records both whisker and gait information and provides added statistical tools to analyse the data. We hope the LocoWhisk system and software will serve as a solid foundation from which to support future research in whisker and gait analysis.”

For more details on the ARTv2 software, check out the GitHub page here.

Check out the paper that describes LocoWhisk and ARTv2, which has recently been published in the Journal of Neuroscience Methods.

LocoWhisk was initially shared and developed through the NC3Rs CRACK IT website here.


SpikeGadgets

AUGUST 22, 2019

We’d like to highlight groups and companies that take an open-source approach to their software and/or hardware in behavioral neuroscience. One of these groups is SpikeGadgets, a company co-founded by Mattias Karlsson and Magnus Karlsson.


SpikeGadgets is a group of electrophysiologists and engineers working to develop neuroscience hardware and software tools. Their open-source software, Trodes, is a cross-platform suite for neuroscience data acquisition and experimental control, made up of modules that communicate with a centralized GUI to visualize and save electrophysiological data. Trodes has a camera module and a StateScript module, a state-based scripting language that can be used to program behavioral tasks using lights, levers, beam breaks, lasers, stimulation sources, audio, solenoids, and more. The camera module can acquire video that is synchronized to the neural recordings, and it can track the animal’s position in real time or during playback after the experiment. It works with USB webcams or GigE cameras.

Paired with the Trodes software and StateScript language is the SpikeGadgets hardware, which can be purchased on their website. The hardware is used for data acquisition (the Main Control Unit, for electrophysiology) and behavioral control (the Environmental Control Unit). SpikeGadgets also provides MATLAB and Python toolboxes on their site that can be used to analyze both behavioral and electrophysiological data. Trodes runs on Windows, Linux, or Mac, and there are step-by-step instructions for installing and using Trodes on the group’s Bitbucket page.

SpikeGadgets’ mission is “to develop the most advanced neuroscience tools on the market, while preserving ease of use and science-driven customization.”

 


For more information on SpikeGadgets or to download or purchase their software or hardware, check out their website here.

There is additional documentation on their Bitbucket wiki, including a user manual, installation instructions, and an FAQ.

Check out their entire list of collaborators, contributors, and developers here.

Pathfinder

AUGUST 8, 2019

Matthew Cooke and colleagues from Jason Snyder’s lab at the University of British Columbia recently developed open-source software called Pathfinder to analyze spatial navigation behavior in animals:


Spatial navigation is studied across several different paradigms and for different purposes; by analyzing spatial behavior we can gain insight into how an animal learns a task, how it changes its approach strategy, and how it directs behavior toward goals. Pathfinder is open-source software for analyzing rodent navigation. It automatically classifies patterns of navigation as a rodent performs a task, detecting subtle patterns in spatial behavior that simple measures may miss. For example, many water maze analyses rely on escape latency or path length, but as the authors point out, the time it takes to reach the platform may not differ even when strategy does; latency is therefore not always the best measure of an animal’s strategy, and experimenters relying on it may miss key differences in behavior. Pathfinder instead aims to quantify these more subtle aspects of the task to determine differences in spatial navigation and strategy.

Originally intended for water maze navigation, Pathfinder can also be used to analyze many other spatial behaviors across different tasks, mazes, and species. The software takes x-y coordinates from behavior tracking software (for example, it can open files from Noldus EthoVision, Actimetrics’ Watermaze, Stoelting’s ANY-maze, and the open-source project ezTrack from Denise Cai’s lab) and then determines the best-fit search strategy for each trial. For the Morris water maze task, trials are fit into several categories: Direct Swim, Directed Search, Focal Search, Spatial Indirect, Chaining, Scanning, Thigmotaxis, and Random Search.

Pathfinder runs in Python and has an easy-to-use GUI; many aspects and parameters can be adjusted to analyze different tasks or behaviors.
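To see the kind of measure that goes beyond escape latency, consider path efficiency: the straight-line distance to the platform divided by the distance actually swum. This is not Pathfinder’s code, just a minimal sketch of one such strategy-sensitive metric computed from x-y tracking coordinates.

```python
# Minimal sketch (not Pathfinder's own code) of a strategy-sensitive metric:
# path efficiency = straight-line distance to goal / distance actually swum.
# Two trials with identical latency can differ sharply on this measure.
import math

def path_efficiency(xy, goal):
    """xy: list of (x, y) tracking points for one trial; goal: platform center."""
    traveled = sum(math.dist(xy[i], xy[i + 1]) for i in range(len(xy) - 1))
    ideal = math.dist(xy[0], goal)
    return ideal / traveled if traveled > 0 else float("nan")

# a direct swim vs. a circuitous, chaining-like path to the same platform
direct = [(0, 0), (25, 25), (50, 50)]
indirect = [(0, 0), (50, 0), (50, 50), (0, 50), (25, 50), (50, 50)]
print(path_efficiency(direct, goal=(50, 50)))    # 1.0
print(path_efficiency(indirect, goal=(50, 50)))  # ~0.35
```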

For more details, check out their bioRxiv preprint here.

There’s a nice (humorous!) writeup of the project on the Snyder Lab website.

You can also download the project and view more details on their GitHub:
https://matthewbcooke.github.io/Pathfinder/

https://github.com/MatthewBCooke/Pathfinder/


RAD

August 1, 2019

In their recent eNeuro article, Bridget Matikainen-Ankney and colleagues from the Kravitz Lab have developed and shared their device, the rodent activity detector (RAD), a low-cost system that can track and record activity in rodent home cages.


Physical activity is an important determinant of human health and a common measure in many research studies. Current methods for measuring physical activity in laboratory rodents have limitations, including high expense, specialized caging/equipment, and high computational overhead. To address these limitations, Matikainen-Ankney et al. designed an open-source, cost-effective device for measuring rodent behavior.

In their new manuscript, they describe the design and implementation of RAD, the rodent activity detector. The system allows for high-throughput installation, minimal investigator intervention, and circadian monitoring. The design includes a battery-powered passive infrared (PIR) sensor, a microcontroller, a microSD card logger, and an OLED screen for displaying data. All of the build instructions for RAD manufacture and programming, including the Arduino code, are provided on the project’s website.

The system records the number of PIR active bouts and the total duration the PIR is active each minute. The authors report that RAD is most useful for quantifying changes across minutes rather than on a second-to-second timescale, so the default data-logging frequency is set to one minute. The resulting CSV files can be viewed and visualized using the provided Python scripts. Validation against video monitoring showed that PIR data correlate strongly with movement speed and that the device captures place-to-place locomotion but not slow or in-place movements. To verify the device’s utility, RAD was used to collect physical activity data from 40 animals for 10 weeks; it detected high-fat-diet (HFD)-induced changes in activity and quantified individual animals’ circadian rhythms. Several major advantages of this tool are that the PIR sensor is not triggered by activity in other cages, it can detect and quantify within-mouse activity changes over time, and little investigator intervention is necessary beyond infrequent battery replacement. Although the design was optimized for the lab’s specific caging, the open-source nature of the project makes it easy to modify.
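Because RAD writes one row per minute, downstream analysis is lightweight. The authors provide Python scripts with the project; the sketch below (with assumed column names, not their code) just illustrates collapsing the minute-by-minute log into an hourly activity profile of the kind used for circadian analysis.

```python
# Illustrative sketch (assumed CSV columns, not the authors' provided scripts):
# collapse RAD's once-per-minute log into hourly totals for a circadian profile.
import csv
from collections import defaultdict
from datetime import datetime

def hourly_activity(csv_path):
    totals = defaultdict(float)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            t = datetime.fromisoformat(row["timestamp"])    # assumed format
            totals[t.hour] += float(row["active_seconds"])  # assumed column
    return dict(sorted(totals.items()))

for hour, active in hourly_activity("rad_log.csv").items():
    print(f"{hour:02d}:00  {active:6.0f} s active")
```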

More details on RAD can be found in their eNeuro manuscript here, and all documentation can also be found on the project’s Hackaday.io page.


Matikainen-Ankney, B. A., Garmendia-Cedillos, M., Ali, M., Krynitsky, J., Salem, G., Miyazaki, N. L., … Kravitz, A. V. (2019). Rodent Activity Detector (RAD), an Open Source Device for Measuring Activity in Rodent Home Cages. eNeuro, 6(4). https://doi.org/10.1523/ENEURO.0160-19.2019