Updates on LocoWhisk and ART

OCTOBER 3, 2019

Dr Robyn Grant of Manchester Metropolitan University (Manchester, UK) has shared her group’s most recent project, LocoWhisk, a hardware and software solution for measuring rodent exploratory, sensory and motor behaviours:


In describing the project, Dr Grant writes, “Previous studies from our lab have shown that analysing whisker movements and locomotion allows us to quantify the behavioural consequences of sensory, motor and cognitive deficits in rodents. Independent whisker and foot trackers existed, but there was no fully automated, open-source software and hardware solution that could measure both whisker movements and gait.

We developed the LocoWhisk arena and new accompanying software that allow the automatic detection and measurement of both whisker and gait information from high-speed video footage. The arena can easily be made from low-cost materials; it is portable and incorporates both gait analysis (using a pedobarograph) and whisker tracking (using a high-speed video camera and an infrared light source).

The software, ARTv2, is freely available and open source. ARTv2 is also fully automated and has been developed from our previous ART software (Automated Rodent Tracker).

ARTv2 contains new whisker and foot detector algorithms. On high-speed video footage of freely moving small mammals (including rat, mouse and opossum), we have found that ARTv2 is comparable in accuracy to, and in some cases significantly better than, readily available software and manual trackers.

The LocoWhisk system enables the collection of quantitative data on whisker movements and locomotion in freely behaving rodents. The software automatically records both whisker and gait information and provides statistical tools to analyse the data. We hope the LocoWhisk system and software will serve as a solid foundation to support future research in whisker and gait analysis.”

For more details on the ARTv2 software, check out the GitHub page here.

Check out the paper that describes LocoWhisk and ARTv2, which has recently been published in the Journal of Neuroscience Methods.

LocoWhisk was initially shared and developed through the NC3Rs CRACK IT website here.


SpikeGadgets

AUGUST 22, 2019

We’d like to highlight groups and companies that take an open-source approach to their software and/or hardware for behavioral neuroscience. One of these groups is SpikeGadgets, a company co-founded by Mattias Karlsson and Magnus Karlsson.


SpikeGadgets is a group of electrophysiologists and engineers working to develop neuroscience hardware and software tools. Their open-source software, Trodes, is a cross-platform suite for neuroscience data acquisition and experimental control, made up of modules that communicate with a centralized GUI to visualize and save electrophysiological data. Trodes includes a camera module and a StateScript module; StateScript is a state-based scripting language for programming behavioral tasks using lights, levers, beam breaks, lasers, stimulation sources, audio, solenoids, and so on. The camera module acquires video that can be synchronized to neural recordings, can track the animal’s position in real time or during playback after the experiment, and works with USB webcams or GigE cameras.
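To give a rough feel for what state-based behavioral control means, here is a small sketch written in Python rather than actual StateScript syntax; the channel numbers and helper names are hypothetical, but the event-driven pattern (a beam break triggering a timed solenoid pulse) is the kind of logic such a script expresses:

    import time

    REWARD_MS = 100  # how long to hold the reward solenoid open (illustrative)

    def set_output(channel: int, level: int) -> None:
        # Stand-in for a digital output call to the control hardware.
        print(f"t={time.monotonic():.3f}s  DIO {channel} -> {level}")

    def on_beam_break() -> None:
        # Callback fired when the beam-break input goes high: deliver one reward.
        set_output(2, 1)                # open the solenoid
        time.sleep(REWARD_MS / 1000.0)  # hold it open for the reward duration
        set_output(2, 0)                # close the solenoid

    # A real system would wait on hardware events; here we simulate one.
    on_beam_break()

One appeal of running this kind of logic as a state machine on dedicated control hardware is that event timing does not depend on the host PC.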

Paired with the Trodes software and StateScript language is the SpikeGadgets hardware that can be purchased on their website. The hardware is used for data acquisition (Main Control Unit, used for electrophysiology) and behavioral control (Environmental Control Unit).  SpikeGadgets also provides both Matlab and Python toolboxes on their site that can be used to analyze both behavioral and electrophysiological data. Trodes can be used on Windows, Linux, or Mac, and there are step-by-step instructions for how to install and use Trodes on the group’s bitbucket page.
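The toolboxes’ actual APIs vary, but the core bookkeeping they help with, lining up camera frames with neural samples on a shared clock, can be sketched in a few lines of NumPy (the sampling rate and frame rate below are illustrative):

    import numpy as np

    fs = 30000.0                                 # neural sampling rate in Hz (illustrative)
    neural_t = np.arange(0, 10 * int(fs)) / fs   # 10 s of neural sample times
    frame_t = np.arange(0, 10, 1 / 30.0)         # frame times from a 30 fps camera

    # For each video frame, find the first neural sample at or after that frame.
    idx = np.minimum(np.searchsorted(neural_t, frame_t), len(neural_t) - 1)
    print(idx[:5])  # neural sample indices aligned to the first five frames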

SpikeGadgets’ mission is “to develop the most advanced neuroscience tools on the market, while preserving ease of use and science-driven customization.”

For more information on SpikeGadgets or to download or purchase their software or hardware, check out their website here.

There is additional documentation on their Bitbucket wiki, including a user manual, installation instructions, and an FAQ.

Check out their entire list of collaborators, contributors, and developers here.

Pathfinder

AUGUST 8, 2019

Matthew Cooke and colleagues in Jason Snyder’s lab at the University of British Columbia recently developed Pathfinder, open-source software for detecting spatial navigation behavior in animals:


Spatial navigation is studied across several different paradigms and for different purposes in animals; by analyzing spatial behaviors we can gain insight into how an animal learns a task, how it changes its approach strategy, and how it directs behavior toward a goal. Pathfinder is open-source software for analyzing rodent navigation. It automatically classifies patterns of navigation as a rodent performs a task, picking up subtle patterns in spatial behavior that simpler measures may miss. Specifically, many water maze analyses use escape latency or path length as the outcome measure, but the authors point out that the time it takes to reach the platform may not differ even when the strategy does; latency is therefore not an optimal measure of an animal’s strategy, and experimenters who rely on it may miss key differences in behavior. Pathfinder instead aims to analyze more subtle aspects of the task to determine differences in spatial navigation and strategy.
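As a toy illustration of that argument, the two synthetic paths below could take exactly the same time to reach the platform, yet one is a direct swim and the other weaves its way there; a path-efficiency measure (straight-line distance divided by actual path length; all names and numbers here are made up for illustration) separates them where latency cannot:

    import numpy as np

    def path_length(xy):
        # Total distance traveled along a sequence of x-y points.
        return float(np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1)))

    start, platform = np.array([0.0, 0.0]), np.array([100.0, 0.0])

    direct = np.linspace(start, platform, 50)             # straight swim to the platform
    theta = np.linspace(0, 4 * np.pi, 50)
    weaving = np.column_stack([100 * theta / (4 * np.pi),  # drifts toward the goal
                               20 * np.sin(theta)])        # while weaving widely

    ideal = np.linalg.norm(platform - start)
    for name, xy in [("direct", direct), ("weaving", weaving)]:
        print(name, "efficiency:", round(ideal / path_length(xy), 2))
    # Equal latencies are possible for both, yet efficiency is ~1.0 for the
    # direct swim and well below 1 for the weaving path.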

Originally intended for water maze navigation, Pathfinder can also be used to analyze many other spatial behaviors across different tasks, mazes, and species. The software takes x-y coordinates from behavior tracking software (for example, it can open files from Noldus EthoVision, Actimetrics’ WaterMaze, Stoelting’s ANY-maze, and the open-source project ezTrack from Denise Cai’s lab) and then calculates the best-fit search strategy for each rodent’s trial. For the Morris water maze task, trials are fit into several categories: Direct Swim, Directed Search, Focal Search, Spatial Indirect, Chaining, Scanning, Thigmotaxis, and Random Search.
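Pathfinder’s actual criteria are more careful than this (see the preprint for details), but a greatly simplified sketch conveys the flavor of classifying a trial from geometric features of its path; the thresholds and helper names here are illustrative, not Pathfinder’s:

    import numpy as np

    def classify_trial(xy, platform, pool_radius=100.0):
        # Path efficiency: straight-line distance over actual distance swum.
        steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
        efficiency = np.linalg.norm(platform - xy[0]) / max(steps.sum(), 1e-9)
        # Fraction of samples spent near the pool wall (thigmotaxis tendency).
        near_wall = np.mean(np.linalg.norm(xy, axis=1) > 0.8 * pool_radius)

        if efficiency > 0.8:
            return "Direct Swim"
        if near_wall > 0.7:
            return "Thigmotaxis"
        return "Search (some other strategy)"

Pathfinder fits each trial against its full set of strategy definitions and reports the best match, and its parameters can be adjusted from the GUI.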

Pathfinder runs in Python and has an easy-to-use GUI; many aspects and parameters can be adjusted to analyze different tasks or behaviors.

For more details, check out their bioRxiv preprint here.

There’s a nice (humorous!) writeup of the project on the Snyder Lab website.

You can also download the project and view more details on their GitHub:
https://matthewbcooke.github.io/Pathfinder/

https://github.com/MatthewBCooke/Pathfinder/


Automated Rodent Tracker (ART)

MAY 5, 2017

Robyn Grant, from Manchester Metropolitan University, has shared the following with OpenBehavior regarding the development of the Automated Rodent Tracker (ART) program:


We have developed a program, ART, that can automatically track rodent position and movement. It is able to track head movements, body movements and also aspects of body size. It is able to identify certain behaviours from video footage too, such as rotations, moving forwards, interacting with objects and staying still. Our program is really flexible, so it can have additional modules that can be easily “plugged in”. For example, at the moment, it has a manual tracker module, which allows for your automatic tracking to be validated with manual tracking points (using MWA: Hewitt, Yap & Grant 2016, Journal of Open Research Software). This versatility means that in the future other modules might be added, such as additional behaviour identifiers, or other trackers such as for feet or whiskers.
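As a sketch of what that plug-in design might look like in code (purely illustrative; the class and method names are hypothetical, not ART’s actual API), each module implements a common per-frame interface and the core tracker simply hands frames to every registered module:

    from abc import ABC, abstractmethod

    class TrackerModule(ABC):
        # Common interface: a module receives each frame and returns results.
        @abstractmethod
        def process_frame(self, frame_index, frame):
            ...

    class ManualValidationModule(TrackerModule):
        # Compares automatic tracking against user-clicked reference points.
        def __init__(self, manual_points):
            self.manual_points = manual_points  # {frame_index: (x, y)}
        def process_frame(self, frame_index, frame):
            return {"manual_point": self.manual_points.get(frame_index)}

    def run_pipeline(modules, frames):
        # The core loop: every module sees every frame, so new capabilities
        # (foot trackers, behaviour identifiers) can be added without changing it.
        return [{type(m).__name__: m.process_frame(i, f) for m in modules}
                for i, f in enumerate(frames)]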

Our program, ART, is also highly automated. It requires minimal user input but still performs as well as other trackers that need extensive manual processing. It can automatically find the video frames in which the mouse is present and track only those frames; alternatively, you can specify that it track only when the mouse is locomoting, or rotating, for example. We hope that this tracker will form a solid basis from which to investigate rodent behaviour further.
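How might a tracker decide which frames contain the animal? One generic approach (an illustration of the idea, not ART’s actual implementation) is background subtraction: flag a frame as containing the mouse when enough pixels differ from a running background model, as in this OpenCV sketch with an assumed video file name and threshold:

    import cv2

    cap = cv2.VideoCapture("session.avi")            # hypothetical video file
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500)
    MIN_PIXELS = 2000                                # illustrative size threshold

    present = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)               # per-pixel foreground mask
        present.append(int((mask > 0).sum()) > MIN_PIXELS)
    cap.release()

    print(f"animal detected in {sum(present)} of {len(present)} frames")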

ART may be downloaded here.