
Tag: freely behaving

CerebraLux

June 25, 2020

This week we want to shed some light on a project from Robel Dagnew and colleagues from UCLA called CerebraLux, a wireless system for optogenetic stimulation.


Optogenetic methods have been a crucial tool for understanding the role that certain neural cell populations play in modulating or maintaining a variety of behaviors. These methods require a light source to be delivered through a fiber optic probe, and in many experimental setups this is achieved by tethering the light source to the probe with a long fiber optic cable. This cable can impose limitations on experiments where animals are behaving freely in behavior chambers or mazes. One obvious solution is to deliver light via a wireless controller communicating with a head-mounted light source, but existing systems can be cost-prohibitive, or building them in the lab requires access to specialized manufacturing equipment.

To address the need for a low-cost wireless optogenetic probe, Dagnew and colleagues developed CerebraLux, which is built from off-the-shelf and easily accessible custom parts. The device consists of two major components: an optic component, featuring a milled baseplate that holds and connects an optic fiber to the LED; and an electronic component, featuring a custom-printed circuit board (PCB), a lithium battery, an IR receiver, the LED, and magnets that align and connect the two halves of the device. The device is controlled via a custom GUI (built with the TkInter Python 2.7 library) which sends pulses to the device through an Arduino Uno.

More details about the build of these components and the process for communicating with the device via the GUI are available in Dagnew et al. The CerebraLux design and operations manual, which includes the 3D design files for the milled parts, the print design for the PCB, and code for communicating with the device, is available in the appendix of the paper, while the code for the GUI is available from the Walwyn Lab website. Be sure to check out the paper for information about how they validated the device in vivo.
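To give a sense of the kind of logic a host-side control script handles, here is a minimal sketch of building a stimulation pulse train from frequency, pulse width, and duration. This is purely illustrative: the function name and command scheme are our own invention, not CerebraLux's actual protocol, which is documented in the paper's operations manual. In a real setup, the resulting on/off times would be sent to the Arduino Uno over a serial connection.

```python
def pulse_schedule(freq_hz, pulse_ms, duration_s):
    """Return (on, off) times in milliseconds for a square pulse train.

    freq_hz    -- stimulation frequency in Hz
    pulse_ms   -- pulse width in milliseconds
    duration_s -- total train duration in seconds
    """
    period_ms = 1000.0 / freq_hz
    if pulse_ms >= period_ms:
        raise ValueError("pulse width must be shorter than the period")
    n_pulses = int(duration_s * freq_hz)
    return [(i * period_ms, i * period_ms + pulse_ms) for i in range(n_pulses)]

# e.g. a 20 Hz train of 5 ms pulses for 1 s yields 20 (on, off) pairs
schedule = pulse_schedule(20, 5, 1)
```

The actual device receives its commands over IR from the Arduino, so timing-critical pulse generation happens on the hardware side rather than in Python.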
The cost of all the component parts (as of 2017) comes in at just under $200, proving to be a cost-effective solution for labs seeking a wireless optogenetic probe.

Read more about CerebraLux here!


Dagnew, R., Lin, Y., Agatep, J., Cheng, M., Jann, A., Quach, V., . . . Walwyn, W. (2017). CerebraLux: A low-cost, open-source, wireless probe for optogenetic stimulation. Neurophotonics, 4(04), 1. doi:10.1117/1.nph.4.4.045001

Rodent Arena Tracker (RAT)

June 18, 2020

Jonathan Krynitsky and colleagues from the Kravitz lab at Washington University have constructed and shared RAT, a closed loop system for machine vision rodent tracking and task control.


The Rodent Arena Tracker, or RAT, is a low-cost wireless position tracker for automatically tracking mice in high-contrast arenas. The device can use subject position information to control other devices in real time, allowing for closed-loop control of various tasks based on positional behavior data. RAT is built around the OpenMV Cam M7 (openmv.io), an open-source machine vision camera with onboard processing for real-time analysis, which reduces data storage requirements and removes the need for an external computer. The authors optimized the control code for tracking mice and designed a custom circuit board that runs the device off a battery and adds a real-time clock for synchronization, a BNC input/output port, and a push button for starting the device. Build instructions for RAT, along with validation data highlighting its effectiveness and potential uses, are available in their recent publication. Further, all of the design files, including the PCB design, 3D printer files, and Python code, are available on hackaday.io.
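The core idea behind tracking in a high-contrast arena can be sketched as a simple threshold-and-centroid step. The toy version below (using numpy on a synthetic frame) is only a conceptual stand-in: RAT's actual tracking code runs on the OpenMV camera itself, and the real implementation is available on hackaday.io.

```python
import numpy as np

def track_centroid(frame, thresh=50):
    """Locate a dark mouse on a light arena floor.

    frame  -- 2D grayscale image as a numpy array (0-255)
    thresh -- pixels darker than this are treated as the animal
    Returns (row, col) of the blob centroid, or None if nothing is found.
    """
    mask = frame < thresh
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# synthetic frame: white arena floor with a dark "mouse" blob
frame = np.full((120, 160), 255, dtype=np.uint8)
frame[40:50, 60:70] = 10
position = track_centroid(frame)  # centroid near the blob's center
```

In a closed-loop setup, the returned position would then be compared against zone boundaries to trigger the BNC output in real time.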

Read the full article here!

Or check out the project on hackaday.io!


Krynitsky, J., Legaria, A. A., Pai, J. J., Garmendia-Cedillos, M., Salem, G., Pohida, T., & Kravitz, A. V. (2020). Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System. eNeuro, 7(3). doi:10.1523/eneuro.0485-19.2020

 

Open Source Whisking Video Database

April 2, 2020

Alireza Azarfar and colleagues at the Donders Institute for Brain, Cognition and Behaviour at Radboud University have published an open-source database of high-speed videos of whisking behavior in freely moving rodents.


As responsible citizens, it’s possible you are missing the lab and working from home. Maybe you have plenty to do, or maybe you’re looking for some new data to analyze to deepen your knowledge of active sensing in rodents! Well, in case you don’t have that data at hand and can’t collect it yourself for a few more weeks, we have a treat for you: Azarfar et al. have shared a database of 6,642 high-quality videos featuring juvenile and adult male rats and adult male mice exploring stationary objects. The dataset spans a wide variety of experimental conditions, including genetic, pharmacological, and sensory deprivation interventions, to explore how active sensing behavior is modulated by different factors. Information about the interventions and experimental conditions is available as a supplementary Excel file.

The videos are available as mp4 files as well as MATLAB matrices that can be converted into a variety of data formats. All videos underwent quality control, and they feature varying amounts of noise and background, which makes the database a great tool for mastering video analysis. A toolbox with a variety of whisker analysis methods, including nose and whisker tracking, is available from this group on GitHub. This database is a great resource for studying sensorimotor computation, top-down mechanisms of sensory navigation, cross-species comparisons of active sensing, and the effects of pharmacological interventions on whisking behaviors.
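Converting one of the MATLAB video matrices into per-frame arrays for analysis might look like the sketch below. The variable layout (a height × width × frames array) and the float-to-uint8 rescaling are assumptions on our part; inspect the dict returned by `scipy.io.loadmat` for the actual variable names in each file.

```python
import numpy as np

def matrix_to_frames(video, dtype=np.uint8):
    """Split a (height, width, n_frames) video matrix into a list of frames.

    A matrix loaded from one of the database's .mat files (e.g. via
    scipy.io.loadmat) could be passed here once extracted from the
    loaded dict.
    """
    video = np.asarray(video)
    # rescale to 0-255 if the data are floats in [0, 1]
    if np.issubdtype(video.dtype, np.floating) and video.max() <= 1.0:
        video = video * 255
    return [video[:, :, t].astype(dtype) for t in range(video.shape[2])]

# synthetic stand-in for a loaded matrix: 3 frames of 4x5 pixels
fake = np.random.rand(4, 5, 3)
frames = matrix_to_frames(fake)
```

From there, each frame is an ordinary grayscale image array that can be written out with any standard imaging library or fed directly into a tracking pipeline.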

Read more about active sensing behavior, data collection methods, and rationale here.

The database is available through GigaScience DB. 

You can also try out their Matlab Whisker Tracker from Github!