November 14, 2018
John Stowers and colleagues from the Straw Lab at the University of Freiburg have developed and shared FreemoVR, a virtual reality setup for unrestrained animals.
Virtual reality (VR) systems can help mimic nature in behavioral paradigms, which in turn helps us understand behavior and brain function. Typical VR systems require that animals be movement-restricted, which limits natural responses. The FreemoVR system was developed to address these issues and allows virtual reality to be integrated with freely moving behavior. The system can be used with a number of different species, including mice, zebrafish, and Drosophila. FreemoVR has been validated in tests of height aversion, social interaction, and visuomotor responses in unrestrained animals.
Read more on the Straw Lab site or in the Nature Methods paper, or access the software on GitHub.
Stowers, J. R., Hofbauer, M., Bastien, R., Griessner, J., Higgins, P., Farooqui, S., . . . Straw, A. D. (2017). Virtual reality for freely moving animals. Nature Methods, 14(10), 995-1002. doi:10.1038/nmeth.4399
October 17, 2018
In the journal HardwareX, Jinook Oh and colleagues share their design for OpenFeeder, an automatic feeder for animal experiments.
Automatic delivery of precisely measured food amounts is important when studying reward and feeding behavior. Commercially available devices are often designed with specific species and food types in mind, limiting the ways they can be used. This open-source automatic feeder can easily be customized for food types from seeds to pellets to fit the needs of any species. OpenFeeder integrates plexiglass tubes, an Arduino Uno, a motor driver, and a piezo sensor to reliably deliver accurate amounts of food, and can also be built using 3D-printed parts.
Read more from HardwareX.
Or check out the device on the Open Science Framework and GitHub.
October 10, 2018
On Hackaday, Richard Warren of the Sawtell Lab at Columbia University has shared his design for the KineMouse Wheel, a lightweight running wheel for head-fixed locomotion that allows for 3D positioning of mice with a single camera.
Locomotor behavior is a common behavioral readout in neuroscience research, and running wheels are a great tool for assessing motor function in head-fixed mice. The KineMouse Wheel takes this tool a step further. Constructed from lightweight, transparent polycarbonate with an angled mirror mounted inside, this innovative device allows a single camera to capture two views of locomotion simultaneously. When combined with DeepLabCut, deep-learning-based tracking software, the locomotion of head-fixed mice can be captured in three dimensions, allowing for a more complete assessment of motor behavior. The wheel can be further customized to fit the needs of a lab by using different materials for the build. More details about the KineMouse Wheel, including a full list of parts and build instructions, are available at hackaday.io.
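As a rough illustration of how a mirrored second view yields 3D coordinates, here is a minimal Python sketch. It assumes DeepLabCut (or similar) has already produced 2D points for a paw in both the direct view and the mirror view, and that the angled mirror makes the two views roughly orthogonal while sharing the wheel's long axis; the function name and coordinate conventions are hypothetical, not taken from the project.

```python
def combine_views(direct_xy, mirror_xy):
    """Combine 2D tracked points from two views into one 3D point.

    Hypothetical convention: the direct view gives x (along the wheel
    axis) and y (height); the mirror view gives x and z (depth).
    Camera calibration and pixel-to-mm scaling are omitted for brevity.
    """
    x_direct, y = direct_xy
    x_mirror, z = mirror_xy
    x = (x_direct + x_mirror) / 2.0  # average the shared axis estimate
    return (x, y, z)
```

In a real pipeline the two views would first be calibrated against each other; this sketch only shows the basic geometric idea of recovering depth from the mirrored view.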
Read more about KineMouse Wheel on Hackaday,
and check out other awesome open-source tools on the OpenBehavior Hackaday list!
We are looking for your feedback to understand how we can better serve the community! We’re also interested to know if/how you’ve implemented some of the open-source tools from our site in your own research.
We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.
October 3, 2018
Thomas Akam and researchers from the Champalimaud Foundation and Oxford University have developed pyControl, a system that combines open-source hardware and software for control of behavioral experiments.
The ability to seamlessly control various aspects of a complex task is important for behavioral neuroscience research. pyControl, an open-source framework, combines Python scripts and a MicroPython microcontroller to control behavioral experiments. The framework can be run through a command-line interface (CLI) or a user-friendly graphical user interface (GUI) that lets users manage a variety of devices such as nose pokes, LED drivers, stepper motor controllers, and more. Data collected with the system can be imported easily into Python for analysis. In addition to complete documentation on the pyControl website, users are welcome to ask questions and interact with the developers and other users via the pyControl Google group.
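To give a flavor of importing behavioral event data into Python for analysis, here is a minimal sketch. It assumes a simplified log format (a millisecond timestamp and an event name per line); pyControl's actual file format and its own import tools are documented on the pyControl website.

```python
import io

def parse_events(log_file):
    """Parse a simplified behavioral event log into (time_ms, event) tuples.

    Assumes each line holds a millisecond timestamp and an event name
    separated by whitespace -- a stand-in for pyControl's real data
    format, which its own tools handle.
    """
    events = []
    for line in log_file:
        line = line.strip()
        if not line or line.startswith('#'):  # skip blanks and comments
            continue
        t, name = line.split(None, 1)
        events.append((int(t), name))
    return events

log = io.StringIO("# session 1\n1200 poke_left\n1850 reward\n")
print(parse_events(log))  # [(1200, 'poke_left'), (1850, 'reward')]
```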
Read more on the pyControl website.
Purchase the pyControl breakout board at OpenEphys.
Or check out the pyControl Google group!
September 19, 2018
In HardwareX, Brendan Drackley and colleagues share VASIC, an open-source weight-bearing device for high-throughput and unbiased behavioral pain assessment in rodents.
The assessment of pain in animal models is a key component of understanding and developing treatments for chronic pain. Drackley and colleagues developed VASIC (Voluntary Access Static Incapacitance Chamber), a modified version of the weight-bearing test. A brief water deprivation encourages rats or mice to seek water in a test chamber equipped with a weighing platform under the water spout, which can detect weight shifting toward the unaffected side in animals with nerve damage or inflammatory pain. The design incorporates a custom printed circuit board (available from the paper), an infrared sensor, an Arduino microcontroller, 3D-printed parts, and open-source analysis software. A full parts list, links to files, and data from a validation study are available in the paper.
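The core weight-bearing measure can be summarized in a few lines of Python: the fraction of total load placed on one side, where values below 0.5 indicate offloading of that limb. This is a generic sketch of the concept, not the authors' analysis code.

```python
def weight_bearing_ratio(left_g, right_g):
    """Fraction of total load on the left platform (0.5 = symmetric).

    left_g, right_g: simultaneous readings (grams) from the two
    weighing platforms. An animal guarding an injured left limb
    would show a ratio below 0.5.
    """
    total = left_g + right_g
    if total == 0:
        raise ValueError("no load detected on either platform")
    return left_g / total
```

Averaging this ratio over many voluntary drinking bouts is what makes the approach high-throughput and unbiased relative to experimenter-scored tests.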
Read more here!
August 15, 2018
In the Journal of Neurophysiology, Sachin S. Deshmukh and colleagues share their design for a Picamera-based system that allows for tracking of animals in large behavioral arenas.
Studies of spatial navigation and its neural correlates have been limited in the past by the reach of recording cables and by tracking ability in small behavioral arenas. With the advent of long-range wireless neural recording systems, researchers are now able to expand the size of their behavioral arenas to study spatial navigation, but a way to accurately track animals in these larger arenas is necessary. The Picamera system is a low-cost, open-source, scalable multi-camera tracking system that can be used to track behavior in combination with wireless recording systems. The design comprises eight overhead Raspberry Pi cameras (capable of recording at a high frame rate over a large field of view), each recording video independently on its own Raspberry Pi microcomputer, with video processed using the Picamera Python library. When compared with commercial tracking software for the same purpose, the Picamera system reportedly performed better, with improvements in inter-frame interval jitter and temporal accuracy that improved the ability to relate recorded neural activity to video. The Picamera system is an affordable, efficient solution for tracking animals in large spaces.
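Inter-frame interval jitter, one of the metrics mentioned above, is straightforward to compute from frame timestamps. This generic Python sketch (not the authors' code) takes timestamps in milliseconds and returns the intervals plus their standard deviation; lower jitter means more consistent frame timing and easier alignment with neural data.

```python
def frame_jitter(timestamps_ms):
    """Return (inter-frame intervals, their standard deviation).

    timestamps_ms: frame capture times in milliseconds. For an ideal
    30 fps recording every interval would be ~33.3 ms and jitter ~0.
    """
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    mean = sum(intervals) / len(intervals)
    var = sum((i - mean) ** 2 for i in intervals) / len(intervals)
    return intervals, var ** 0.5
```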
Read more here!
Or check out their GitHub!
Saxena, R., Barde, W., & Deshmukh, S. S. (2018). An inexpensive, scalable camera system for tracking rats in large spaces. Journal of Neurophysiology. https://doi.org/10.1152/jn.00215.2018
August 8, 2018
In HardwareX, an open-access journal for designing, building, and customizing open-source scientific hardware, Martin A. Raymond and colleagues share their design for a user-constructed, low-cost lickometer.
Researchers interested in the ingestive behaviors of rodents commonly use licking as a readout for the amount of fluid a subject consumes, as recorded by a lickometer. Commercially available lickometers are powerful tools for measuring this behavior, but they can be expensive and often require further customization. The authors offer their own design for an open-source lickometer that uses readily available or customizable components such as a PC sound card and a 3D-printed drinking-bottle holder. Data from the device are collected with Audacity, an open-source audio program, then converted to .csv format and analyzed with an R script made available by the authors to assess various features of licking microstructure. A full bill of materials, assembly instructions, and links to design files are available in the paper.
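A basic licking-microstructure measure is grouping licks into bursts separated by pauses. This Python sketch illustrates the idea with an assumed 1-second pause criterion; the authors' R script defines its own microstructure measures and thresholds.

```python
def lick_bursts(lick_times_ms, pause_ms=1000):
    """Group lick timestamps (ms) into bursts.

    A new burst starts whenever the gap since the previous lick
    exceeds pause_ms. The 1-s criterion here is illustrative, not
    taken from the paper.
    """
    if not lick_times_ms:
        return []
    bursts = [[lick_times_ms[0]]]
    for t in lick_times_ms[1:]:
        if t - bursts[-1][-1] > pause_ms:
            bursts.append([t])   # long pause: start a new burst
        else:
            bursts[-1].append(t)  # short gap: same burst
    return bursts
```

From this grouping, measures like burst count, licks per burst, and within-burst inter-lick intervals follow directly.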
Check out the full publication here!
Raymond, M. A., Mast, T. G., & Breza, J. M. (2018). An open-source lickometer and microstructure analysis program. HardwareX, 4. doi:10.1016/j.ohx.2018.e00035
July 23, 2018
OpenBehavior has been covering open-source neuroscience projects for a few years, and we are always thrilled to see projects that are well documented and can be easily reproduced by others. To further this goal, we have formed a collaboration with Hackaday.io, who have provided a home for OpenBehavior on their site. This can be found at: https://hackaday.io/OpenBehavior, where we currently have 36 projects listed ranging from electrophysiology to robotics to behavior. We are excited about this collaboration because it provides a straightforward way for people to document their projects with instructions, videos, images, data, etc. Check it out, see what’s there, and if you want your project linked to the OpenBehavior page simply tag it as “OPENBEHAVIOR” or drop us a line at the Hackaday page.
Note: This collaboration between OpenBehavior and Hackaday.io is completely non-commercial, meaning that we don’t pay Hackaday.io for anything, nor do we receive any payments from them. It’s simply a way to further our goal of promoting open-source neuroscience tools and their goal of growing their science and engineering community.
June 25, 2018
Andreas Genewsky and colleagues from the Max Planck Institute of Psychiatry have shared the design, construction and validation of a simplified, low-cost, radar-based motion detector for home cage activity monitoring in mice. This simple, open-source device allows for motion detection without visual contact to the animal and can be used with various cage types. It features a custom printed circuit board and motion detector shield for Arduino, which saves raw activity and timestamped data in CSV files onto an SD card; the authors also provide a Python script for data analysis and generation of actograms. This device offers a cost-effective, DIY alternative to optical imaging of home-cage activity.
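As an illustration of the kind of analysis the authors' Python script performs, this sketch sums timestamped activity counts into fixed time bins, the first step toward an actogram. The CSV layout here ("minutes_since_start,count" per row) is an assumed simplification, not the device's exact output format.

```python
import csv
import io

def bin_activity(csv_text, bin_min=30):
    """Sum activity counts into fixed-width time bins for an actogram.

    csv_text: rows of 'minutes_since_start,count' (assumed layout).
    bin_min: bin width in minutes. Returns a list of per-bin totals.
    """
    bins = {}
    for row in csv.reader(io.StringIO(csv_text)):
        minute, count = float(row[0]), int(row[1])
        b = int(minute // bin_min)          # which bin this event falls in
        bins[b] = bins.get(b, 0) + count
    return [bins.get(i, 0) for i in range(max(bins) + 1)]
```

Plotting these binned totals day by day, stacked vertically, yields the familiar actogram view of circadian activity.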
Read more from the Journal of Biological Engineering publication!
Genewsky, A., Heinz, D. E., Kaplick, P. M., Kilonzo, K., & Wotjak, C. T. (2017). A simplified microwave-based motion detector for home cage activity monitoring in mice. Journal of Biological Engineering,11(1). doi:10.1186/s13036-017-0079-y