
Category: Data Analysis

DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila

July 23, 2020

Semih Günel and colleagues have created a deep learning-based pose estimator for studying how neural circuits control limbed behaviors in tethered Drosophila.


Appendage tracking is an important behavioral measure in motor circuit research. Until now, algorithms for accurate 3D pose estimation in small animals such as Drosophila did not exist. Instead, researchers have had to rely on alternatives such as placing small reflective markers on fly leg segments. While marker-based tracking works well in larger animals, in Drosophila-sized animals it restricts motion, is labor intensive, and cannot recover 3D information, limiting the accuracy of behavioral measures. DeepFly3D is a PyTorch- and PyQt5-based software package designed to solve these problems while providing a user-friendly interface for pose estimation and appendage tracking. DeepFly3D uses supervised deep learning for 2D joint detection and a multicamera setup to iteratively infer 3D poses, yielding automated measurements with sub-millimeter accuracy. Notably, DeepFly3D is not limited to Drosophila and can be adapted to study other animals, such as rodents, primates, and humans. DeepFly3D therefore allows for versatile pose estimation while also permitting an extraordinary level of behavioral detail and accuracy.
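The multicamera step rests on standard triangulation: each camera's 2D detection constrains the 3D joint position, and the views are combined in a least-squares solve. As a rough illustration (not DeepFly3D's actual code; the cameras and the point below are made up), a direct linear transform (DLT) triangulation looks like this:

```python
import numpy as np

def triangulate(points_2d, proj_mats):
    """Triangulate one 3D point from 2D detections in several cameras
    using the direct linear transform (DLT): stack two linear
    constraints per view and take the least-squares solution."""
    rows = []
    for (u, v), P in zip(points_2d, proj_mats):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]          # homogeneous -> 3D coordinates

def project(P, X):
    """Pinhole projection of a homogeneous 3D point into pixel coords."""
    x = P @ X
    return x[:2] / x[2]

# Toy setup: two cameras observing the point (1, 2, 10).
X_true = np.array([1.0, 2.0, 10.0, 1.0])
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # camera at origin
P2 = np.hstack([np.eye(3), np.array([[-2.0], [0.0], [0.0]])])  # shifted camera
est = triangulate([project(P1, X_true), project(P2, X_true)], [P1, P2])
```

DeepFly3D goes further by using the triangulated 3D estimates to iteratively correct the 2D detections, but a least-squares triangulation of this kind is the standard core of such pipelines.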

Read more in the paper!

Or check out the project’s GitHub!


Günel, S., Rhodin, H., Morales, D., Campagnolo, J., Ramdya, P., & Fua, P. (2019). DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila. eLife, 8. https://doi.org/10.7554/eLife.48571

MNE Scan: Software for real-time processing of electrophysiological data

July 9, 2020

In a 2018 Journal of Neuroscience Methods article, Lorenz Esch and colleagues present MNE Scan, a software that provides real-time acquisition and processing of electrophysiological data.


MNE Scan is a state-of-the-art real-time processing software for clinical MEG and EEG data. By allowing real-time analysis of neuronal activity, MNE Scan enables the optimization of input stimuli and permits the use of neurofeedback. MNE Scan is built on the open-source MNE-CPP library. Written in C++, MNE-CPP is a software framework that processes standard electrophysiological data formats and is compatible with Windows, Mac, and Linux. Unlike other open-source real-time electrophysiological processing software, MNE Scan is designed to meet medical regulatory requirements such as IEC 62304. This makes MNE Scan well suited for clinical studies, and it is already in active use with an FDA-approved pediatric MEG system. MNE Scan has also been validated in several different use cases, making it a robust solution for processing MEG and EEG data in a variety of scenarios.
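MNE Scan itself is written in C++, but the essence of real-time processing, filtering data as it arrives in small buffers while carrying filter state across buffer boundaries, can be sketched in a few lines of Python (the class and parameter names below are ours, not MNE's):

```python
import numpy as np
from scipy.signal import butter, lfilter, lfilter_zi

class StreamingFilter:
    """Apply a causal band-pass filter to data arriving in chunks,
    carrying the filter state between calls so chunk boundaries
    introduce no discontinuities."""
    def __init__(self, low_hz, high_hz, fs, order=4):
        self.b, self.a = butter(order, [low_hz, high_hz], btype="band", fs=fs)
        self.zi = lfilter_zi(self.b, self.a) * 0.0   # start from rest

    def process(self, chunk):
        out, self.zi = lfilter(self.b, self.a, chunk, zi=self.zi)
        return out

fs = 250.0                       # pretend EEG sampling rate, Hz
filt = StreamingFilter(1.0, 40.0, fs)
rng = np.random.default_rng(0)
signal = rng.standard_normal(1000)
# Chunked processing matches one-shot filtering of the whole signal.
chunked = np.concatenate([filt.process(c) for c in np.split(signal, 10)])
```

The key detail is the `zi` state vector: without it, each buffer would be filtered from rest and seams would appear at every chunk boundary.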

Read more in the paper here!

Or check it out right from their website!


Rodent Arena Tracker (RAT)

June 18, 2020

Jonathan Krynitsky and colleagues from the Kravitz lab at Washington University have constructed and shared RAT, a closed loop system for machine vision rodent tracking and task control.


The Rodent Arena Tracker, or RAT, is a low-cost wireless position tracker for automatically tracking mice in high-contrast arenas. The device can use subject position information to control other devices in real time, allowing closed-loop control of tasks based on positional behavior data. RAT is built around the OpenMV Cam M7 (openmv.io), an open-source machine vision camera with onboard processing for real-time analysis, which reduces data storage requirements and removes the need for an external computer. The authors optimized the control code for tracking mice and created a custom circuit board that runs the device off a battery and adds a real-time clock for synchronization, a BNC input/output port, and a push button for starting the device. Build instructions for RAT, along with validation data highlighting the device's effectiveness and potential uses, are available in their recent publication. All of the design files, including the PCB design, 3D printer files, and Python code, are available on hackaday.io.
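RAT's firmware runs on the OpenMV camera itself, but the tracking step it performs, threshold the image, take the centroid of the dark pixels, and test the position against a zone to trigger an output, reduces to very little code. Here is a NumPy sketch on a synthetic frame (the threshold and zone values are invented):

```python
import numpy as np

def track_centroid(frame, thresh=50):
    """Locate a dark mouse on a light arena floor: threshold the
    grayscale frame and return the centroid (x, y) of the dark
    pixels, or None if nothing crosses the threshold."""
    mask = frame < thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def in_zone(pos, zone):
    """Closed-loop check: is the (x, y) position inside a
    rectangular zone given as (x0, y0, x1, y1)?"""
    x, y = pos
    x0, y0, x1, y1 = zone
    return x0 <= x <= x1 and y0 <= y <= y1

# Synthetic frame: white arena with a dark 'mouse' blob centered at (30, 40).
frame = np.full((100, 100), 255, dtype=np.uint8)
frame[38:43, 28:33] = 10
pos = track_centroid(frame)
```

In the real device the zone test would drive the BNC output to, say, deliver a stimulus whenever the mouse enters the target region.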

Read the full article here!

Or check out the project on hackaday.io!


Krynitsky, J., Legaria, A. A., Pai, J. J., Garmendia-Cedillos, M., Salem, G., Pohida, T., & Kravitz, A. V. (2020). Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System. eNeuro, 7(3). https://doi.org/10.1523/ENEURO.0485-19.2020

 

Video Repository Initiative on OpenBehavior

June 11, 2020

Last fall when teaching an undergraduate course on computational methods in neuroscience at American University, we wanted to bring in some of the tools for video analysis that have been promoted on OpenBehavior. The idea was to introduce these tools at the end of the course, after the students had learned a bit about Python, Anaconda, Jupyter, Arduinos, etc. We decided on ezTrack from the Cai Lab, as it is written in Python and uses Jupyter notebooks. It was easy to prepare for this topic until we realized that we needed simple videos for tracking. Those from our lab come from operant chambers illuminated with infrared LEDs and require a good bit of preprocessing to be suited for analysis with simple tracking algorithms. In addition, we use Long-Evans rats in our studies, and they are a challenge to track given their coloring. So, we looked around the web for example videos and were surprised by how rarely labs that have developed, published with, and promoted tools for video analysis share example videos. Most videos that we found show the results of tracking and do not provide raw video data. We did find a nice example of open-field behavior in mice (Samson et al., 2015), and used the supplemental videos from this now five-year-old paper for the course.

These experiences made us wonder if having a collection of videos for teaching and training would be useful to the community. A collection of video recordings of animals engaged in standard neuroscience behavioral tasks (e.g., feeding, foraging, fear conditioning, operant learning) would be useful for educational purposes: students could read published papers to understand the experimental design and then analyze data from the published studies using modifications of available tutorial code for packages such as ezTrack or others. For researchers, these same videos would be useful for reproducing analyses from published studies and for quickly learning how to use published code to analyze their own data. Furthermore, with the development of tools that use advanced statistical methods for video analysis (e.g., DeepLabCut, B-SOiD), it seems warranted to have a repository available that could be used to benchmark algorithms and explore their parameter space. One could even envision an analysis competition using standard benchmark videos, similar to the benchmarks in machine learning that have driven the development of algorithms far more powerful than those available only a decade ago (e.g., xgboost).

So we are posting today to ask for community participation in the creation of a video repository. The plan is to post license-free videos to the OpenBehavior Google Drive account. Our OpenBehavior team will convert the files to a standard (mp4) format and post links to the videos on the OpenBehavior website, so they will be accessible to the community. The website will list the creator of the video file, the camera and software used for the recording, the resolution, frame rate and duration of recording, the species and information on the behavioral experiment (and a link to the publication or preprint if the work is from a manuscript).

For studies in rodents, we are especially interested in videos showing overhead views from open-field and operant arena experiments and close-up videos of facial reactions, eyeblinks, oral movements and limb reaching. We are happy to curate videos from other species (fish, birds, monkeys, people) as well.

If you are interested in participating, please complete the form on this page or reach out to us via email at openbehavior@gmail.com or Twitter at @OpenBehavior.

 

SpikeForest

May 21, 2020

Hot off the eLife press, Jeremy Magland and colleagues have shared SpikeForest, a tool for validating automated neural spike sorters.


Spike sorting is a crucial step in neural data analysis. Manual spike sorting is time consuming and sensitive to human error, so much effort has gone into developing automated algorithms for this necessary step. However, even with the rapid development and sharing of these tools, there is little information to guide researchers toward the algorithm that best serves their needs and offers the accuracy required for their data. To address this, Magland and colleagues across 11 research groups have developed and contributed data to SpikeForest. This Python-based software suite combines a large database of ephys recordings featuring ground-truth units (units whose spike times are known a priori), a parallel processing pipeline to benchmark algorithm performance, and a web interface for users to explore results. The tool can be used to assess which algorithm works best for data from different recording and experimental methods (in vivo, ex vivo, tetrode, etc.) and provides accurate evaluation metrics for comparison. Information about the spike sorting algorithms that SpikeForest can compare is available in the recent publication, as well as a preliminary comparison of these algorithms based on community-provided datasets. The SpikeForest interface also allows users to sort their own data with a few modifications to the code, which is discussed in the publication. Be sure to check it out!
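The benchmarking idea is simple at its core: sorted spike times are matched to ground-truth spike times within a small tolerance, and performance is summarized from the hits, misses, and false positives. A toy version of that comparison (the greedy matching rule and tolerance here are our simplification, not SpikeForest's exact implementation):

```python
def match_spikes(ground_truth, sorted_times, tol=0.001):
    """Greedily match sorted spike times (s) to ground-truth times
    within a tolerance; return (hits, misses, false positives,
    accuracy), with accuracy = hits / (hits + misses + false pos)."""
    gt = sorted(ground_truth)
    st = sorted(sorted_times)
    i = j = hits = 0
    while i < len(gt) and j < len(st):
        if abs(gt[i] - st[j]) <= tol:
            hits += 1
            i += 1
            j += 1
        elif st[j] < gt[i]:
            j += 1          # unmatched sorted spike: false positive
        else:
            i += 1          # unmatched true spike: miss
    misses = len(gt) - hits
    false_pos = len(st) - hits
    accuracy = hits / (hits + misses + false_pos)
    return hits, misses, false_pos, accuracy

gt = [0.10, 0.20, 0.30, 0.40]
st = [0.1004, 0.20, 0.35, 0.40]   # one jittered hit, one miss, one FP
result = match_spikes(gt, st)
```

On recordings with known units, scores like these let sorters be ranked objectively rather than by eye.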

Read about SpikeForest here!

Or explore the SpikeForest web interface here!


neurotic

April 30, 2020

Jeffrey P. Gill and colleagues have developed and shared a new toolbox for synchronizing video and neural signals, cleverly named neurotic!


Collecting neural and behavioral data is fundamental to behavioral neuroscience, and the ability to synchronize these data streams is just as important as collecting them in the first place. To make this process a little simpler, Gill et al. developed an open-source option called neurotic, a NEUROscience Tool for Interactive Characterization. The tool is written in Python and includes a simple GUI, which makes it accessible to users with little coding experience. Users can read in a variety of file formats for neural data and video, which they can then process, filter, analyze, annotate, and plot. To show its effectiveness across species and signal types, the authors tested the software on Aplysia feeding behavior and human beam walking. Given its open-source nature and strong integration with other popular open-source packages, this software will continue to develop and improve as the community uses it.
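neurotic builds on the Neo ecosystem to handle this properly, but the basic synchronization step, putting video frames and neural samples on one clock via a shared sync event, can be sketched with NumPy (the constant-frame-rate assumption and the function names are ours):

```python
import numpy as np

def frame_times(n_frames, fps, first_pulse_t):
    """Assign a neural-clock timestamp to each video frame, assuming
    the first frame coincides with a sync pulse recorded on the
    neural acquisition system and the camera rate is constant."""
    return first_pulse_t + np.arange(n_frames) / fps

def sample_at_frames(signal, signal_t, frame_t):
    """Look up the neural signal value at each frame time by linear
    interpolation, so video events can be paired with the
    simultaneously recorded signal."""
    return np.interp(frame_t, signal_t, signal)

fs = 1000.0                                   # neural sampling rate, Hz
signal_t = np.arange(0, 2, 1 / fs)            # 2 s of neural time
signal = np.sin(2 * np.pi * 1.0 * signal_t)   # pretend 1 Hz signal
ft = frame_times(n_frames=60, fps=30.0, first_pulse_t=0.5)
vals = sample_at_frames(signal, signal_t, ft)
```

Real recordings add complications (dropped frames, clock drift), which is exactly why a dedicated tool is welcome.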

Read more about neurotic here!

Check out the documentation here.


Gill, J. P., Garcia, S., Ting, L. H., Wu, M., & Chiel, H. J. (2020). neurotic: Neuroscience Tool for Interactive Characterization. eNeuro. https://doi.org/10.1523/ENEURO.0085-20.2020

PASTA

April 16, 2020

Thanks to Jan Homolak from the Department of Pharmacology, University of Zagreb School of Medicine, Zagreb, Croatia for sharing the following about repurposing a digital kitchen scale for neuroscience research: a complete hardware and software cookbook for PASTA (Platform for Acoustic STArtle).


“As we were starving for a solution on how to obtain relevant and reliable information from a kitchen scale sometimes used in very creative ways in neuroscience research, we decided to cut the waiting and cook something ourselves. Here we introduce a complete hardware and software cookbook for PASTA, a guide on how to demolish your regular kitchen scale and use the parts to turn it into a beautiful multifunctional neurobehavioral platform. This project is still medium raw, as it is a work in progress; however, we hope you will still find it well done.
PASTA comes in various flavors such as:
– complete hardware design for PASTA
– PASTA data acquisition software codes (C++/Arduino)
– PASTA Chef: An automatic experimental protocol execution Python script for data acquisition and storage
– ratPASTA (R-based Awesome Toolbox for PASTA): An R-package for PASTA data analysis and visualization

..and all can be found on bioRxiv here: https://www.biorxiv.org/content/10.1101/2020.04.10.035766v1.supplementary-material

bon appétit!”
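PASTA's acquisition code is C++/Arduino and its analysis package (ratPASTA) is in R, so purely to illustrate what startle quantification involves, here is a Python sketch on a synthetic force trace; the threshold rule (baseline + 5 SD) is our own choice for the example, not PASTA's:

```python
import numpy as np

def detect_startle(force, fs, n_baseline, k=5.0):
    """Flag a startle response in a load-cell force trace: compute a
    baseline mean and SD from the pre-stimulus window, then report
    the latency and amplitude of the first post-baseline sample
    exceeding baseline + k*SD."""
    base = force[:n_baseline]
    thresh = base.mean() + k * base.std()
    above = np.nonzero(force[n_baseline:] > thresh)[0]
    if above.size == 0:
        return None
    idx = n_baseline + above[0]
    return {"latency_s": idx / fs, "amplitude": float(force[idx])}

# Synthetic trace: quiet baseline wobble, then a flinch at 1.2 s.
trace = 0.01 * np.sin(0.1 * np.arange(2000))
trace[1200:1230] += 1.0
res = detect_startle(trace, fs=1000, n_baseline=1000)
```

A real pipeline would also low-pass filter the trace and report area under the curve, but latency and peak amplitude are the usual headline measures.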


Toolboxes for Spike and LFP Analysis

April 9, 2020

There are a number of open-source toolboxes available for neural data analysis, especially for spike and local field potential (LFP) data. With more options comes a harder decision when selecting the toolbox that's right for your data. Fortunately, Valentina Unakafova and Alexander Gail have compared several toolboxes for spike and LFP analysis, connectivity analysis, dimensionality reduction, and generalized linear modeling. They discuss the major features of software available for Python and MATLAB (Octave), including Brainstorm, Chronux, Elephant, FieldTrip, gramm, Spike Viewer, and SPIKY. They include succinct tables for assessing system and program requirements, quality of documentation and support, and data types accepted by each toolbox. Using an open-access dataset, they assess the functionality of the programs and close the comparison by highlighting the advantages of each toolbox to consider when looking for the one that works best for your data. The files they used to compare the toolboxes are available on GitHub to supplement their paper.
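All of the compared packages ship far more than this, but it helps to remember that the basic primitive they all wrap, for instance a peri-stimulus time histogram, is only a few lines of NumPy (the window and bin width here are arbitrary choices of ours):

```python
import numpy as np

def psth(spike_times, event_times, window=(-0.5, 1.0), bin_s=0.05):
    """Peri-stimulus time histogram: bin spike times relative to each
    event and convert counts to a trial-averaged firing rate (Hz).
    Returns (left bin edges, rate per bin)."""
    edges = np.arange(window[0], window[1] + bin_s, bin_s)
    counts = np.zeros(len(edges) - 1)
    for ev in event_times:
        rel = spike_times - ev            # spike times relative to event
        counts += np.histogram(rel, bins=edges)[0]
    rate = counts / (len(event_times) * bin_s)
    return edges[:-1], rate

# Toy data: a neuron firing at 20 Hz only in the 0-0.5 s after each event.
events = np.array([1.0, 3.0, 5.0])
spikes = np.concatenate([ev + np.arange(0.025, 0.5, 0.05) for ev in events])
edges, rate = psth(spikes, events)
```

The toolboxes earn their keep on top of primitives like this: file I/O, statistics, multitaper spectra, and plotting.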


Read their full comparison here.

Check out their GitHub for the project here.


Open Source Whisking Video Database

April 2, 2020

Alireza Azarfar and colleagues at the Donders Institute for Brain, Cognition and Behaviour at Radboud University have published an open-source database of high-speed videos of whisking behavior in freely moving rodents.


As responsible citizens, you may be missing the lab while working from home. Maybe you have plenty to do, or maybe you're looking for some new data to analyze to increase your knowledge of active sensing in rodents! In case you don't have that data at hand and can't collect it yourself for a few more weeks, we have a treat for you! Azarfar et al. have shared a database of 6,642 high-quality videos featuring juvenile and adult male rats and adult male mice exploring stationary objects. The dataset includes a wide variety of experimental conditions, including genetic, pharmacological, and sensory deprivation interventions, to explore how active sensing behavior is modulated by different factors. Information about interventions and experimental conditions is available as a supplementary Excel file.

The videos are available as mp4 files as well as MATLAB matrices that can be converted into a variety of data formats. All videos underwent quality control and feature different amounts of noise and background, which makes them a great resource for mastering video analysis. A toolbox for a variety of whisker analysis methods, including nose and whisker tracking, is available from the group on GitHub. This database is a great resource for studying sensorimotor computation, top-down mechanisms of sensory navigation, cross-species comparisons of active sensing, and the effects of pharmacological interventions on whisking behavior.
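Reading the mp4s takes a video library such as OpenCV, but once frames sit in a NumPy array, a useful first-pass whisking measure is frame-differencing motion energy. A sketch on a synthetic clip (the measure and names are our illustration, not part of the group's toolbox):

```python
import numpy as np

def motion_energy(frames):
    """Per-frame motion energy: mean absolute intensity change
    between consecutive frames, a cheap first-pass measure of
    whisking vigor in a fixed-camera video. Input is an array of
    shape (n_frames, height, width)."""
    diffs = np.abs(np.diff(frames.astype(np.float64), axis=0))
    return diffs.mean(axis=(1, 2))

# Synthetic clip: 10 static frames, then a patch that flickers
# every other frame, standing in for a moving whisker field.
frames = np.zeros((20, 64, 64))
frames[10::2, 10:20, 10:20] = 100.0
energy = motion_energy(frames)
```

Peaks in such a trace give a quick index of whisking bouts before committing to full whisker tracking.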

Read more about active sensing behavior, data collection methods, and rationale here.

The database is available through GigaScience DB. 

You can also try out their MATLAB Whisker Tracker on GitHub!


Rigbox: an open source toolbox for probing neurons and behavior

January 30, 2020

In a recent preprint, Jai Bhagat, Miles J. Wells and colleagues shared a toolbox, developed by Christopher Burgess, for streamlining behavioral neuroscience experiments.


In behavioral neuroscience, it's important to keep track of both behavioral and neural data, and to do so in a way that makes later analysis simpler. One of the best ways to achieve this is a centralized system that runs the behavioral and neural recording software while streaming all of the data. To address this, Burgess and team developed Rigbox, a high-performance, open-source software toolbox that facilitates a modular approach to designing experiments. Rigbox runs in MATLAB (with some Java and C for network communication and processing speed), and its main submodule, Signals, allows intuitive programming of behavioral tasks. While it was originally developed for analyzing the behavior of mice in a steering-wheel-driven task, the authors show its feasibility for human behavioral tasks (psychophysics and a Pong game), highlighting the broad array of ways this toolbox can be used in neuroscience.
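Signals is a reactive-programming framework: task variables are declared as relationships, and derived values update automatically when their sources change. Its flavor can be conveyed with a toy Python analogue (our own illustration, not the Rigbox API):

```python
class Signal:
    """A minimal reactive value: derived signals re-evaluate
    automatically whenever a source signal is updated."""
    def __init__(self, value=None):
        self._value = value
        self._listeners = []

    @property
    def value(self):
        return self._value

    def post(self, value):
        """Push a new value and notify all derived signals."""
        self._value = value
        for fn in self._listeners:
            fn(value)

    def map(self, fn):
        """Create a derived signal that always holds fn(source)."""
        out = Signal(fn(self._value) if self._value is not None else None)
        self._listeners.append(lambda v: out.post(fn(v)))
        return out

# Example: a stimulus position declared once as a function of the
# wheel reading; posting new wheel values moves the stimulus.
wheel = Signal(0)
stim_x = wheel.map(lambda deg: deg * 2)   # arbitrary gain
wheel.post(15)
```

Declaring the task this way keeps the experiment logic in one place instead of scattering update code across callbacks.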
For more, check out the full preprint!
Or jump right in on Github.