Python-based open source eye tracking tool


Opensource.com

A few weeks ago I got an email from a friend who was attending an education technology conference. In his note, he mentioned PyGaze, an open source project he thought I might be interested in.

I have a deep interest in educational psychology, so I was fascinated by what I read about PyGaze, an open source toolbox for eye tracking in Python. The website told me that it runs on Linux, but I wanted to learn more about eye tracking and the role it plays in psychological research. I also wanted to know more about the project, how it is contributing to research, and its implications for open source.

In this interview, the lead developer for the project, Edwin Dalmaijer, who works at the University of Oxford's Department of Experimental Psychology doing research and programming, provides a fascinating description of PyGaze and the significance of eye tracking in research.

Most of the time, where people look is also where their attention is. If I record where you were looking, I can figure out what attracts your attention. PyGaze... bundles code for a large range of different eye trackers from different manufacturers into a single interface.

Tell us about you and your work. Are you an open source enthusiast?

My work as a researcher in experimental psychology requires me to program experiments, analyses, and sometimes entire software libraries or graphical user interfaces. In research, we often deal with very obscure hardware that does awfully specific things, such as tracking test subjects' eye movements or pupil response, their brain waves, their grip force, and whatever weird thing you can think of.

This hardware is often sold by small vendors that do not have time for direct support or specific coding documentation. As a researcher who relies on these products, you often hack your way around SDKs and APIs until you find something that works for you.

Once I manage to talk to such a piece of obscure hardware, I incorporate its functionality in a more user-friendly library. PyGaze is a good example of this: it bundles code for a large range of different eye trackers from different manufacturers into a single interface. We also keep adding functionality for other things, including game controllers and joysticks, webcams, and devices that can monitor physiology.
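The "single interface" idea can be sketched in a few lines. The names below are illustrative only, not PyGaze's actual API: each vendor's SDK would be wrapped in a small adapter class exposing the same methods, so experiment code never has to touch vendor-specific calls.

```python
class DummyTracker:
    """Fallback backend that simulates gaze samples (no hardware needed)."""

    def connect(self):
        return True

    def sample(self):
        # Always report the centre of a 1024x768 display.
        return (512, 384)


# Registry mapping a tracker-type string to its adapter class; real code
# would add one entry per supported vendor SDK.
BACKENDS = {"dummy": DummyTracker}


def create_tracker(trackertype="dummy"):
    """Return a tracker object for the requested backend."""
    try:
        backend = BACKENDS[trackertype]
    except KeyError:
        raise ValueError("unknown tracker type: %s" % trackertype)
    return backend()


tracker = create_tracker("dummy")
tracker.connect()
print(tracker.sample())  # (512, 384)
```

The payoff of this pattern is that swapping hardware means changing one string, not rewriting the experiment.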

Obviously, other people in my field are doing the same. If we all share our custom software, we do not have to reinvent the wheel every time we need to use new hardware. So, open source saves us time and money on software development. An added bonus is that colleagues can collaborate on their software, reducing the chances of persistent bugs. You see this happening with PyGaze, where an increasingly large group of users find and solve issues.

It's not only data acquisition that can be a pain; analyses can get really complicated too. Academics come up with really clever ways of looking at their data and visualizing their analyses in incredibly pretty figures. Here, sharing code is equally vital; it helps us share good code and scrutinize each other's work. This is quite important, as scripting mistakes are easy to make and miss with only one pair of eyes.

What is the purpose of eye tracking? How does that help in your research?

Most of the time, where people look is also where their attention is. If I record where you were looking, I can figure out what attracts your attention. This can be important for basic research, for example when we want to know what features attract attention. This can tell us what kind of visual information we use to make sense of the world around us. It can also be useful in applied research—marketing researchers love eye tracking because it can tell them where people look in their advertisements. Do they see the company logo, or are they too distracted by the model in the skimpy outfit? (Sex doesn't always sell!)

In addition, the dynamics of your eye movements can tell us all sorts of things about what distracts you, and what motivates you. By closely monitoring the velocity and trajectory of your saccades (very quick eye movements), we can learn a lot about the basic properties of attention and the motor system.
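Monitoring saccade velocity boils down to comparing consecutive gaze samples. Here is a minimal velocity-threshold detector, a standard technique in the field rather than PyGaze's own implementation; the function name and threshold value are illustrative.

```python
import math

def detect_saccades(samples, times, threshold=1000.0):
    """Flag saccadic movement between consecutive gaze samples.

    samples: list of (x, y) gaze positions in pixels
    times: matching timestamps in seconds
    threshold: speed in pixels/second above which movement counts as a saccade
    Returns one boolean flag per inter-sample interval.
    """
    flags = []
    for i in range(1, len(samples)):
        dx = samples[i][0] - samples[i - 1][0]
        dy = samples[i][1] - samples[i - 1][1]
        dt = times[i] - times[i - 1]
        speed = math.hypot(dx, dy) / dt
        flags.append(speed > threshold)
    return flags


# A stable fixation followed by a sudden jump across the screen:
print(detect_saccades([(100, 100), (101, 100), (400, 300)],
                      [0.00, 0.01, 0.02]))  # [False, True]
```

Real analyses refine this with degrees of visual angle instead of pixels and with acceleration criteria, but the velocity comparison is the core of it.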

Finally, we can use eye trackers to measure pupil size. Interest in this technique is currently peaking again, and people are finding all sorts of things. For example, your pupils increase in size in anticipation of reward, but also in surprise (e.g., at not getting an expected reward). It also seems that pupils increase or decrease their size in expectation of where you will be moving your eyes.

What is it about open source software and Python in particular that lends itself so well to your work?

Well, for starters, it's free. When I started programming, I was a poor student who couldn't afford a legal copy of Matlab (a major programming language in our field). Python was free, so I turned to that instead.

Now that I'm used to it, I love its high-levelness. Some people prefer lower-level languages that allow you to define and control everything (looking at you, C!), but I love how quick and dirty you can program in Python. Stuff just works, and you don't have to spend ages on redefining the basics.

On top of this, Python is super versatile. It has amazing libraries for parallel processing, multi-threading, socket communication, you name it. These things are crucial in handling the kind of obscure peripheral hardware that we use in experimental psychology and cognitive neuroscience. Also, when you need something incredibly specific (say, to monitor your hamster's physical activity), it's quite likely that someone has already written a library for it. And you can find that library on GitHub for all to use and improve.

Python users tend to be really keen on sharing their work and collaborating on really excellent projects (SciPy, NumPy, and matplotlib are all indispensable for scientists!). This, along with its user-friendly handling, makes Python the ideal general programming language.

From reading your research paper, "PyGaze: An open-source toolbox for eye tracking," I noticed that PyGaze is on GitHub. How important is collaboration to your ongoing efforts to improve the software?

Very important, because I can be an awful programmer at times. I tend to be impatient, overlook things, and be a bit disorganized. It can be a big help to have other people look through my code and filter out the mistakes. I really need to thank Sebastiaan Mathôt here, as he is much more thorough and has a better feel for how to organize and maintain big projects.

Are there any forks of PyGaze? Have those forks helped your work and development of PyGaze?

There are quite a few. A quick look on GitHub tells me it's 25 now! The most crucial is Sebastiaan Mathôt's fork, as this is where all the new OpenSesame integration comes from. OpenSesame is an experiment builder for the social sciences. Psychologists and neuroscientists can use it to create their experiments in a graphical environment by dragging and dropping the individual building blocks onto a timeline. Some people prefer this over programming their experiments, as it requires less technical know-how and can actually be quicker in some cases.

Sebastiaan has written PyGaze plug-ins for OpenSesame so that people can now use that to access the broad range of eye trackers that PyGaze supports.

Where do you see the PyGaze project going in the short term? Do you have long term goals for the project?

Short term, we are working on a lot of under-the-hood stuff, including better variable management and getting rid of some annoying bugs. In the longer run, we would love to incorporate support for more eye trackers and support for even more obscure peripherals! I have started to work with electroencephalography (EEG) again, so I expect some support for that in the future.

In addition, I have been working on an analysis suite to complement the existing experimental library. It would basically be a GUI with an associated library for the analysis of eye tracking data. It should support the same range of trackers that PyGaze supports, and it should allow people to do statistics and visualizations of gaze data, saccades, and pupillometry.

Educator, entrepreneur, open source advocate, lifelong learner, Python teacher. M.A. in Educational Psychology, M.S. Ed. in Educational Leadership, Linux system administrator.

5 Comments

Is there a fork where someone is modifying PyGaze to control a computer mouse for people who do not have control of their arms? Also, what hardware is used to detect the eye gaze? Is there some type of infrared eye-reflection camera?

Rick, I asked Edwin to comment, so there'll be an answer to your questions soon.

In reply to Rick Weinberg (not verified)

Both good questions!

1) To my knowledge, no one has successfully used PyGaze for cursor control. In all honesty, there are likely better ways of doing that beyond PyGaze, because it is such a high-level library. The primary focus is on fullscreen psychological experiments rather than on low-level integration with the operating system. That being said, most of the code that deals with eye-tracker communications is written without too much overhead, close to the original API or SDK. This would make it an ideal library to modify for cursor control, provided your goal is to have it work in Python, with multiple types of trackers.

2) The hardware is specialised equipment, produced by companies such as SR Research (EyeLink), SensoMotoric Instruments, Tobii, and The EyeTribe. These are all (infrared) cameras, and their images feed into algorithms that aim to find the pupil and the reflection of a source of infrared light. For more information on those algorithms, see: http://www.pygaze.org/2015/06/webcam-eye-tracker/
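The cursor-control idea from the first answer could be sketched as follows. Raw gaze samples jitter, so a short moving-average buffer smooths them before the operating-system call (omitted here) that would actually move the pointer. None of this is PyGaze code; the class and parameter names are hypothetical.

```python
from collections import deque

class GazeSmoother:
    """Smooth noisy gaze samples with a moving average over the last N samples."""

    def __init__(self, size=5):
        self.buffer = deque(maxlen=size)

    def update(self, x, y):
        """Add a raw gaze sample; return the smoothed cursor position."""
        self.buffer.append((x, y))
        n = len(self.buffer)
        return (sum(p[0] for p in self.buffer) / n,
                sum(p[1] for p in self.buffer) / n)


smoother = GazeSmoother(size=2)
smoother.update(0, 0)
print(smoother.update(10, 20))  # (5.0, 10.0)
```

A larger buffer gives a steadier cursor at the cost of lag, which is exactly the trade-off an assistive-technology project would need to tune.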

In reply to Rick Weinberg (not verified)

I would love to modify PyGaze to allow for eye gaze controlling the mouse. I just don't think I have the skills yet. Also, I'm interested in hacking the hardware. I've seen the Tobii EyeGaze device and it is $14,000. I want to "disrupt" the assistive technology market by making "life improvement devices" affordable for all who need them. So I guess I'm asking if there is any open source hardware that could help me hack the infrared retina-reflection cameras?

I should have started off with this. Thanks so much for answering my questions. Your answers are so very helpful and extensive. My ultimate goal is to have a special education hackathon to combine things, make things, and modify pre-existing things to help people with learning differences, special needs, and financial disadvantages.

In reply to Edwin (not verified)

Great plans! Fortunately, you're not alone in this, so you wouldn't have to re-invent the wheel.

There are relatively cheap eye trackers that are targeted at consumers. The EyeTribe, for example, is just over $100 (they advertise $99, but that's excluding VAT and delivery): see http://theeyetribe.com/ (The downside is that it's only 60 Hz; fine for most consumer purposes, but not necessarily for research.)

Also, there are open hard/software projects:

- openEyes (open hardware, open software) is a very promising project: http://thirtysixthspan.com/openEyes/

- Pupil (closed(?) hardware, open software) is a cool project for a head-mounted tracker that can record at 120 Hz; see https://pupil-labs.com/

- Ogama (no hardware, open software) is an increasingly popular project. I haven't used it, but it seems ok: http://www.ogama.net/

You can find quite a complete overview on the COGAIN website: http://wiki.cogain.org/index.php/Eye_Trackers

Good luck with your project!

In reply to Rick Weinberg (not verified)

This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.