William Liberti, from the Gardner Lab at Boston University, has shared the following with OpenBehavior regarding ‘FinchScope’. Although originally designed for finches, this 3D-printed single-photon fluorescence imaging microscope has since been adapted for rodents and other avian species.
The FinchScope project aims to provide a modular in-vivo optophysiology rig for awake, freely behaving animals, with a transparent acquisition and analysis pipeline. The goal is to produce a customizable and scalable single-photon fluorescence imaging microscope system that takes advantage of emerging open-source analysis platforms. These tools are built from easily procured off-the-shelf components and 3D-printed parts.
We provide designs for a 3D-printed, lightweight, wireless-capable microscope and a motorized commutator, designed for multi-month monitoring of neural activity (via genetically encoded calcium indicators) in zebra finches while they sing their courtship songs. The system has since been adapted for rodents and for other birds, such as canaries.
Andre Chagas, creator of OpenNeuroscience, has generously shared the following with OpenBehavior regarding an Arduino-based, 3D-printed nose poke device:
“This nose poke device was built as a “proof of principle”. The idea was to show that scientists, too, can leverage the open-source philosophy and the knowledge built by the community developing around open-source hardware. Moreover, the bill of materials was kept simple and affordable: one device can be built for ~25 dollars and should take 2–3 hours to assemble, including the time to print the parts.
The device is organized as follows: the 3D-printed frame (which can also be built from other materials when a printer is not available) contains a hole into which the animals are expected to insert their snouts. At the front of the hole, an infrared LED is aligned with an infrared detector, forming an “infrared curtain” at the hole’s entrance. If this curtain is interrupted, a signal is sent to a microcontroller (an Arduino in this case), which can then trigger other electronic components, such as a water pump, an LED indicator, or, in this case, a piezo buzzer.
At the back of the hole, a white LED is placed to indicate that the system is active and ready for “nose pokes”.
The microcontroller contains the code responsible for controlling the electronic parts. The code can easily be modified, as it is written for Arduino, and many code examples and tutorials (for beginners and experts alike) can be found online.”
Robert Sachdev, from the NeuroCure Cluster of Excellence at Humboldt-Universität zu Berlin, Germany, has generously shared the following regarding automated optical tracking of animal movement:
“We have developed a method for tracking the motion of whiskers, limbs, and whole animals in real time. We show how to use a plug-and-play Pixy camera to monitor the real-time motion of multiple colored objects, and we apply the same tools to post-hoc analysis of high-speed video. Our method has major advantages over currently available methods: we can track the motion of multiple adjacent whiskers in real time and apply the same methods post hoc to “recapture” the same motion at high temporal resolution. Our method is flexible: it can track similarly shaped objects, such as two adjacent whiskers, forepaws, or even two freely moving animals. With this method it becomes possible to use the phase of movement of a particular whisker or limb to perform closed-loop experiments.”
Figure: using Pixy to track two adjacent whiskers. A. Setup. Head-fixed mice are acclimatized to whisker painting and trained to use their whiskers to contact a piezo-film touch sensor. A Pixy camera tracks the whiskers in real time (left), while a high-speed color camera simultaneously acquires data. B. Paradigm for the whisker task. A sound cue initiates the trial. The animal whisks one of the two painted whiskers into contact with a piezo-film sensor, and if contact reaches threshold, the animal obtains a liquid reward. There is a minimum inter-trial interval of 10 seconds. C. Capturing whisker motion in real time. The movement and location of the D1 and D2 whiskers are shown at two consecutive time points (20 ms apart, left and right images). Lines corresponding to the locations of the two whiskers (middle panel) were acquired with Spike2 software. The waveform of the whisker data reflects the spatial location and dimensions of the tracked box around the whisker, both of which can change as the whisker moves.