When the Infrared Hits Your Eye Like a Big Pizza Pie…

Tobii and Pizza Hut have just teamed up to bring you the “Pizza Hut Subconscious Menu.” Now let me be clear and tell you I’m a huge fan of both Pizza Hut and Tobii eye tracking technology. But I think this project is really only good for one thing – generating a little buzz for Pizza Hut, and a little buzz for Tobii.

I just finished working on a huge eye tracking project, featuring the Tobii Mobile Eye-Tracking unit and the X2-60 Eye Tracker. The X2-60, when mounted on a PC monitor, worked almost flawlessly. As expected, it provided us with interesting, unanticipated, rich data to help us figure out what it is we’re trying to figure out. (Sorry, I’m not at liberty to disclose the details of the project.) And it worked with most of the users, regardless of age, race, physical characteristics, or the use of glasses.

Then we conducted tests with the Mobile Eye Tracker. Essentially, the mobile unit takes the X2-60 and puts it into a contraption that lets you mount a cell phone or tablet and get some eye tracking results. And sometimes it works. We had problems with glasses; bifocals are obviously right out, but even monofocal and reading glasses proved problematic. We had problems tracking some people with very light-colored eyes; I’m not sure why. Overweight people had problems if they carried enough of their weight in their faces to give them overly chubby cheeks. People with heavy bags under their eyes didn’t track well. Add to this that the software, unlike on the PC, doesn’t actually run through the phone – you’re just taking a video of the environment and triangulating based on a not-quite-exact configuration and calibration. There are a variety of settings you can use, including different mounts that put the phone at different angles, or closer to or farther from the camera. But once you’ve decided which setting and configuration works best for your study, you have to stick with it so you can compare results across users. That means that if a different setting would work better for a particular user, you’re stuck. Once we got the unit calibrated, the participant had to remain very still, as any slight shift in angle changed the dynamic and tainted the data. Similarly, if they absentmindedly put their hands in their laps, they risked completely blocking the tracker. This added pressure on the moderator and the participants and led to less natural behavior.

This is not to say the project was a failure – it was not. We got great data – even using the mobile unit. I’m still a fan of Tobii. What I’m saying is that the mobile configuration is VERY temperamental and subject to fail at the slightest change in the environment. And that’s where we come to the Pizza Hut project. This will be done on mobile devices, with people of all different sizes, ages, and races, with all manner of physical characteristics (height, weight, eye color), and some will have glasses, some might have eye injuries, some just might not track well. They won’t be sitting still for a calibration session before they try to order. They’ll be looking at the device at different angles, they’ll sometimes be holding it in their hands. They might have their hands in front of the sensor. Some will look at it in the daytime, next to a window, with bright sunshine. Still others will use it at night, in a dimly lit restaurant.

It’s a great idea, but unless Tobii has new technology (entirely possible – and I’d LOVE TO SEE IT!) then I don’t expect this promotion to have any legs. It will generate buzz for both entities and give them a little free promotion (like from me), but don’t expect it to work very well, and don’t expect it to actually read your mind. Still, I’m looking forward to giving it a try. I’ll update you as soon as I do.

(For a humorous critique of the new Pizza Hut / Tobii collaboration, check out Stephen Colbert’s review.)