
Tag: drosophila

DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila

July 23, 2020

Semih Günel and colleagues have created a deep learning-based pose estimator for studying how neural circuits control limbed behaviors in tethered Drosophila.

Appendage tracking is an important behavioral measure in motor circuit research, but until now, algorithms for accurate 3D pose estimation in small animals such as Drosophila did not exist. Instead, researchers have had to rely on alternatives such as placing small reflective markers on fly leg segments. While marker-based tracking works well for larger animals, in Drosophila-sized animals it is motion-limiting and labor-intensive, and it cannot recover 3D information, limiting the accuracy of behavioral measures. DeepFly3D is a PyTorch- and PyQt5-based software package designed to solve these problems and provide a user-friendly interface for pose estimation and appendage tracking. DeepFly3D uses supervised deep learning for 2D joint detection and a multicamera setup to iteratively infer 3D poses, achieving sub-millimeter accuracy in automated measurements. Remarkably, DeepFly3D is not limited to Drosophila and can be modified to study other animals, such as rodents, primates, and humans. DeepFly3D therefore allows for versatile pose estimation while also permitting an extraordinary level of behavioral detail and accuracy.
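At the heart of DeepFly3D's pipeline is the combination of 2D joint detections from multiple calibrated cameras into a single 3D pose. A minimal sketch of this multiview step, using standard linear (DLT) triangulation with NumPy, is shown below; the function and camera matrices are illustrative assumptions, not DeepFly3D's actual code.

```python
import numpy as np

def triangulate(proj_mats, points_2d):
    """Linear (DLT) triangulation: recover one 3D joint position
    from its 2D detections in several calibrated cameras.

    proj_mats : list of 3x4 camera projection matrices
    points_2d : list of (x, y) pixel detections, one per camera
    """
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X = (X, Y, Z, 1)
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.stack(rows)
    # Homogeneous least-squares solution: right singular vector
    # associated with the smallest singular value
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

With detections from three or more cameras the same system becomes overdetermined, which is what lets a multiview setup average out per-camera 2D detection errors.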

Read more in the paper!

Or check out the project’s GitHub!

Günel, S., Rhodin, H., Morales, D., Campagnolo, J., Ramdya, P., & Fua, P. (2019). DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila. eLife, 8. https://doi.org/10.7554/eLife.48571

Visual stimulator with customizable light spectra

May 7, 2020

Katrin Franke, Andre Maia Chagas and colleagues have developed and shared a spatial visual stimulator with an arbitrary spectrum of light for visual neuroscientists.

Vision research, quite obviously, relies on precise control of the visual stimuli in an experiment. A great number of commercially available devices are used to present visual stimuli to humans and other species; however, these devices are predominantly designed for the human visual spectrum. The visual spectrum of other species, such as Drosophila, zebrafish, and rodents, extends into the UV, and devices that fail to present this range often limit our understanding of these organisms' visual systems. To address this, Franke, Chagas and colleagues developed an open-source, low-cost visual stimulator that can be customized with up to six chromatic channels. Thanks to the components used to build the device, the light spectrum is arbitrary and can be adapted to different animal models based on their visual spectra. The details of the device, including the parts list and information on a custom Python library for generating visual stimuli (QDSpy), can be found in the eLife publication. The device has been tested and shown to work for stimulating the mouse retina and in in vivo zebrafish studies; details on these experiments can also be found in the publication.
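To see why independent chromatic channels matter, consider the problem of mixing a few LED primaries so that their sum approximates a target spectrum. The sketch below poses this as a least-squares fit in NumPy; the Gaussian LED spectra and channel peaks are invented for illustration and are not the device's actual primaries (the real stimulator and QDSpy handle stimulus generation quite differently).

```python
import numpy as np

wavelengths = np.linspace(300, 700, 401)  # nm, UV through red

def led_spectrum(peak_nm, width_nm=25.0):
    """Idealized Gaussian emission spectrum for one LED channel."""
    return np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)

# Hypothetical primaries: UV, green, and red channels
channels = np.stack([led_spectrum(p) for p in (365, 520, 630)], axis=1)

def fit_channel_weights(target):
    """Least-squares mix of LED channels approximating a target spectrum."""
    w, *_ = np.linalg.lstsq(channels, target, rcond=None)
    return np.clip(w, 0.0, None)  # LED drive levels cannot be negative
```

With six channels instead of three, the same fit has more degrees of freedom, which is what lets a stimulator approximate spectra tailored to UV-sensitive species.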

Check out the eLife article here!

Franke, K., Chagas, A. M., Zhao, Z., Zimmermann, M. J., Bartel, P., Qiu, Y., . . . Euler, T. (2019). An arbitrary-spectrum spatial visual stimulator for vision research. eLife, 8. https://doi.org/10.7554/eLife.48779


optoPAD: a closed-loop optogenetics system to study the circuit basis of feeding behaviors

June 27, 2019

Carlos Ribeiro’s lab at Champalimaud recently published their new project called optoPAD in eLife:

Analyses of behavior and of neural activity both need to be precise in time in order for the two to be correlated or compared. Behavior can be analyzed through many methods (as seen in many projects featured on this site!). The Ribeiro lab previously published flyPAD (Itskov et al., 2014), a system for automated analysis of feeding behavior in Drosophila with high temporal precision. To manipulate specific feeding behaviors, however, the group wanted to go one step further and manipulate neural activity during feeding, and needed a method precise enough to compare with behavior.

In their new manuscript, Moreira et al. describe the design and implementation of a high-throughput system for closed-loop optogenetic manipulation of neurons in Drosophila during feeding behavior. Named optoPAD, the system allows targeted perturbation of specific groups of neurons; the authors use it to induce appetitive and aversive effects on feeding by activating or inhibiting gustatory neurons in a closed-loop manner. OptoPAD combines the previous flyPAD system with additional hardware for driving LEDs for optogenetic perturbation, together with Bonsai, an open-source framework for behavioral analysis.

The system first uses flyPAD to measure the fly's interaction with the food presented in an experiment. Bonsai then detects when the fly touches a food electrode and sends a signal to a microcontroller, which turns on an LED for optogenetic perturbation of neurons in the fly. The authors additionally highlight the flexibility and expandability of the optoPAD system. They describe how flyPAD, once published and then implemented in an optogenetics framework by their group, was successfully adapted by another group, a great example of the benefit of open-source sharing of projects.
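The closed-loop logic described above (flyPAD senses electrode contact, Bonsai relays the event, and a microcontroller gates the optogenetics LED) can be caricatured in a few lines of Python. This is an illustrative sketch only: the threshold and signal values are invented, and the real system runs in Bonsai with dedicated hardware.

```python
def led_command(touch_signal, threshold=0.5):
    """Closed-loop rule: drive the optogenetics LED while the fly
    contacts the food electrode.

    touch_signal : capacitance-like reading from the food electrode
    threshold    : contact threshold in arbitrary units (invented here)
    """
    return touch_signal > threshold

# Simulated sensor trace: the fly touches the electrode mid-recording
readings = [0.1, 0.2, 0.9, 0.8, 0.3]
led_states = [led_command(r) for r in readings]
```

Because each sensor sample maps directly to an LED state, the optogenetic perturbation is locked to the behavior with the same temporal precision as the touch measurement.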


Details on the hardware and software can be found on the Ribeiro lab GitHub. More details on flyPAD, the original project, can be found on its GitHub as well.

Information on flyPAD can also be found on the flyPAD website and in the flyPAD paper.

Moreira, J. M., Itskov, P. M., Goldschmidt, D., Steck, K., Walker, S. J., & Ribeiro, C. (2019). optoPAD, a closed-loop optogenetics system to study the circuit basis of feeding behaviors. eLife, 8. https://doi.org/10.7554/eLife.43924


idtracker.ai: tracking individuals in large collectives of unmarked animals

February 20, 2019

Francisco Romero-Ferrero and colleagues have developed idtracker.ai, an algorithm and software for tracking individuals in large collectives of unmarked animals, recently described in Nature Methods.

Tracking individual animals in large groups can give interesting insights into behavior, but has proven to be a challenge for analysis. With advances in artificial intelligence and tracking software, it has become increasingly easy to extract such information from video data. Romero-Ferrero et al. developed an algorithm and tracking software built on two deep networks: one detects when animals touch or cross paths in front of one another, and the other identifies each individual animal. The software has been validated to track individuals with high accuracy in cohorts of up to 100 animals across diverse species, from rodents to zebrafish to ants. It is free, fully documented, and available online, with additional Jupyter notebooks for data analysis.

Check out their website with full documentation, the recent Nature Methods article, the bioRxiv preprint, and a great video of idtracker.ai tracking 100 zebrafish!

FreemoVR: virtual reality for freely moving animals

November 14, 2018

John Stowers and colleagues from the Straw Lab at the University of Freiburg have developed and shared FreemoVR, a virtual reality setup for unrestrained animals.

Virtual reality (VR) systems can help mimic nature in behavioral paradigms, which in turn helps us understand behavior and brain function. Typical VR systems require that animals be movement-restricted, which limits natural responses. The FreemoVR system was developed to address this issue and allows virtual reality to be integrated with freely moving behavior. It can be used with a number of different species, including mice, zebrafish, and Drosophila, and has been validated in tests of height aversion, social interaction, and visuomotor responses in unrestrained animals.


Read more on the Straw Lab site, the Nature Methods paper, or access the software on GitHub.