Oculomatic Eye-Tracking

Oct 14, 2016

Jan Zimmerman, a postdoctoral fellow in the Glimcher Lab at New York University, has shared the following about Oculomatic:

Video-based noninvasive eye trackers are an extremely useful tool for many areas of research. Many open-source eye trackers are available, but current open-source systems are not designed to track eye movements with the temporal resolution required to investigate the mechanisms of oculomotor behavior. Commercial systems are available, but they employ closed-source hardware and software and are relatively expensive, thereby limiting widespread use.

Here we present Oculomatic, an open-source software and modular hardware solution for eye tracking in humans and non-human primates. Our fully modular hardware implementation relies on machine-vision USB3 camera systems paired with affordable lens options from the surveillance sector. Our cross-platform software implementation (C++) relies on openFrameworks and OpenCV and uses binary image statistics (following Green's theorem) to compute pupil location and diameter. Oculomatic uses data acquisition devices to output eye position as calibrated analog voltages for direct integration into the electrophysiological acquisition chain.
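To illustrate the general idea, the following is a minimal sketch (not the authors' implementation) of estimating pupil center and diameter from a camera frame with OpenCV: the dark pupil is isolated by thresholding, and image moments of the resulting binary region (computable via Green's theorem over the region boundary) yield its area and centroid. The camera index, threshold value, and equivalent-circle diameter formula are illustrative assumptions; the final mapping of eye position to calibrated analog voltages depends on the specific DAQ hardware and is omitted here.

    // Minimal pupil-tracking sketch using OpenCV (illustrative only).
    #include <opencv2/opencv.hpp>
    #include <cmath>
    #include <iostream>

    int main() {
        cv::VideoCapture cap(0);               // assumed camera index
        if (!cap.isOpened()) return 1;

        cv::Mat frame, gray, binary;
        while (cap.read(frame)) {
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

            // Isolate the dark pupil with inverse thresholding
            // (the threshold of 40 is an assumed value, tuned per setup).
            cv::threshold(gray, binary, 40, 255, cv::THRESH_BINARY_INV);

            // Image moments of the binary region give area (m00) and centroid (m10/m00, m01/m00).
            cv::Moments m = cv::moments(binary, /*binaryImage=*/true);
            if (m.m00 > 0) {
                double cx = m.m10 / m.m00;                        // pupil x (pixels)
                double cy = m.m01 / m.m00;                        // pupil y (pixels)
                double diameter = 2.0 * std::sqrt(m.m00 / CV_PI); // equivalent-circle diameter (pixels)
                std::cout << cx << " " << cy << " " << diameter << std::endl;
                // Calibrated analog output (e.g., pixels -> volts via a linear fit)
                // would be written to the DAQ device here.
            }
            if (cv::waitKey(1) == 27) break;   // Esc to quit
        }
        return 0;
    }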

Oculomatic features high temporal resolution (up to 600 Hz), real-time eye tracking with high spatial accuracy (< 0.5°), and low system latency (< 1.8 ms) at relatively low cost (< 1,000 USD). We tested Oculomatic during regular monkey training and task performance in two independent laboratories and compared its performance to existing scleral search coils. While fully non-invasive, Oculomatic performed favorably against the gold standard of search coils, with only a slight decrease in spatial accuracy.

We propose that this system can support a range of research into the properties and neural mechanisms of oculomotor behavior, as well as provide an affordable tool to further scale non-human primate electrophysiology.


This research tool was created by your colleagues. Please acknowledge the Principal Investigator, cite the article in which the tool was described, and include an RRID in the Materials and Methods of your future publications. Project portal RRID:SCR_021465; Software RRID:SCR_021525

Have questions? Send us an email!