Sunday, November 8, 2015

Eye-tracking sensor maker makes play for big time

Tobii's Swedish founders believe their eye-tracking sensors could transform how people interact with computers, as long as they can win over some of the most demanding consumers: video gamers.
Since John Elvesjo noticed a sensor tracking his eye movements in a lab experiment, the technology he developed with Henrik Eskilsson and Marten Skogo has helped disabled people use a computer by identifying where they are looking on the screen.
The system uses invisible infra-red light to illuminate the eyes. Camera sensors capture the reflection off the retina and cornea to gauge where the eye is, and where it is looking.
The mass-market potential looks almost limitless. Advertisers could adapt billboard images depending on where you rest your gaze. A car could alert you when you're about to fall asleep. Eskilsson says eye tracking will one day be found in all laptops, smartphones, tablets and automobiles.
First up is the computer gaming hardware market. As a player looks to one part of the screen, the image will pan across the landscape and open up a new field of vision.
Whether it catches on in the fiercely competitive gaming industry could depend on a deal struck this year between Eskilsson's company Tobii and Ubisoft, maker of blockbuster game "Assassin's Creed: Rogue".
In the PC version of the game, as the system tracks the player's gaze, the eyes of warrior Shay Patrick Cormac sweep across seascapes, forts and battlefields as he hunts assassins in North America during the Seven Years' War.
The success of this and other tie-ins is the biggest test yet for Tobii, which is making no revenue from supplying its technology for the deal.
The aim is to get enough players interested to lure other gaming companies for deals that would bring in revenue. So far it has a handful of other tie-ins and Eskilsson said eye-tracking will have to reach at least 30 to 50 games before it can be regarded as mainstream.
The prize is huge: Tobii's sales ambitions suggest an overall market for gaming eye-tracking sensors that could top $5 billion a year in revenue, about six times the firm's market value.
"Eye-tracking makes it possible to create a more human device," said Eskilsson at Tobii's Stockholm headquarters, his laptop slipping into standby mode after noticing that he had looked away.
"Not only by steering with your eyes, but with hands, voice and where you are looking. All put together."
SMARTPHONE HURDLES
The company faces further hurdles before it can break into the far larger smartphone market.
Fund manager Inge Heydorn at Sentat Asset Management in Stockholm compared Tobii's gaming-focused business to a hard-to-value stock option. Its sensors, he said, must become cheaper, smaller and less power-hungry if they are to be used in smartphones, which pack far less battery power than laptops.
"They don't know if they will get power consumption down. We don't know. Nobody knows," said Heydorn, who holds no Tobii stock.
Tobii dominates the market for now - its $75 million in 2014 sales is five times that of its closest rival among about 20 eye-tracking technology firms - by selling sensors as disability aids and for behavioural studies in research.
Keeping that edge may prove a challenge now that big technology firms, some of them Tobii customers, are looking at whether to develop their own technology.
South Korean giant Samsung's latest phone reads the position of the user's face, something Eskilsson sees as a precursor to full-blown eye tracking.
Tobii's deep-pocketed backers include Swedish group Investor, with a 19 percent stake, Intel Capital and early Spotify investor Northzone, both with roughly 8 percent.
Expectations for profit growth are sky-high and Tobii's share price has almost tripled since its April listing. The company is investing about 150 million Swedish crowns ($18 million) annually to expand in PC gaming.
"It's going to take a couple of years for that to become a volume market. It's not 10 years away, but within a couple of years," said Eskilsson.
Hans Otterling at third-biggest shareholder Northzone said Tobii was "totally capable" of carrying on by itself, without being swallowed by a bigger company. He said its value lay in the range of areas where eye-tracking may be applied.

"Imagine a surgeon, his hands free, able to steer things with his eyes. There is really just your imagination setting the limits," he said.

Eye tracking on the ISS

The Eye Tracking Device (ETD) is a head-mounted device designed for the measurement of three-dimensional eye and head movements under experimental and natural conditions. The tracker permits comprehensive measurement of eye movement (three degrees of freedom) and, optionally, head movement (six degrees of freedom). It represents an important tool for the investigation of sensorimotor behaviour, particularly of the vestibular and oculomotor systems, in both health and disease.
It was originally developed by the German Space Agency (DLR) for use on the International Space Station (ISS) and was uploaded to the station as part of the joint European / Russian space programme in early 2004. The device was designed by Prof. Dr. Andrew H. Clarke (Vestibular Lab, Charité Berlin) together with the companies Chronos Vision and Mtronix in Berlin and integrated for space utilisation by the Munich-based company Kayser-Threde.
In the first set of experiments, conducted by Prof. Clarke's team in cooperation with the Moscow Institute for Biomedical Problems, the Eye Tracking Device was used for the measurement of Listing's plane, a coordinate framework used to describe the movement of the eyes in the head. The scientific goal was to determine how Listing's plane is altered under various gravity conditions. In particular, the influence of long-duration microgravity on board the ISS, and of the subsequent return to Earth's gravity, was examined. The findings contribute to our understanding of neural plasticity in the vestibular and oculomotor systems.
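As background, Listing's law can be stated compactly in rotation-vector form (a standard textbook formulation, not taken from the mission's own publications):

```latex
% An eye orientation reached from primary position by rotating through
% angle \rho about a unit axis \mathbf{n} has the rotation vector
\[
  \mathbf{r} \;=\; \tan\!\left(\tfrac{\rho}{2}\right)\mathbf{n}.
\]
% Listing's law states that the torsional component is approximately zero,
\[
  r_{\mathrm{tor}} \;\approx\; 0,
\]
% so all admissible rotation vectors lie in a single plane, known as
% Listing's plane. The experiment asks whether the orientation of this
% plane (its normal vector) shifts as the vestibular system adapts to
% microgravity.
```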


These experiments began in the spring of 2004 and continued until late 2008 with a series of cosmonauts and astronauts, each of whom spent six months on board the ISS.
Examination of the orientation of Listing's plane during the course of a prolonged space mission is of particular interest because, on Earth, Listing's plane appears to depend on input from the vestibular system, i.e. on head position detected relative to gravity. By exposing the astronaut to the weightlessness of space, the experiment can follow the subsequent adaptation of the astronaut's vestibular system during the flight and after the return to Earth. The key question is to what extent the orientation of Listing's plane is altered by the adaptation of the vestibular system to weightlessness, or under gravitational levels less than or greater than that of Earth. A further question is whether the body compensates for the missing vestibular input by substituting other mechanisms during long-term spaceflight.

The digital eye tracking cameras, designed around state-of-the-art CMOS image sensors, are interfaced to a dedicated processor board in the host PC via bi-directional, high-speed digital transmission links (400 Mbit/s). This PCI plug-in board carries the front-end processing architecture, consisting of digital signal processors (DSP) and programmable logic devices (FPGA) for binocular, online image and signal acquisition.
For the eye tracking task, a substantial data reduction is performed by the sensor and the front-end processing. Thus, only preselected data are transferred from the image sensor through to the host PC where the final algorithms and data storage are implemented. This eliminates the bottleneck caused by standard frame-by-frame image acquisition, and thus facilitates considerably higher image sampling rates.
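The exact on-board algorithms are not spelled out here, but the principle of sensor-side data reduction can be sketched in a few lines: rather than shipping full frames, the front end reduces each frame to a handful of numbers, such as the centroid of the dark pupil blob. Below is a minimal Python/NumPy illustration of that idea; the frame size, threshold, and synthetic data are assumptions made purely for the example.

```python
import numpy as np

def pupil_centroid(frame: np.ndarray, threshold: int = 40):
    """Reduce one camera frame to a single (x, y) pupil estimate.

    Under infrared illumination the pupil appears as a dark blob, so we
    threshold for dark pixels and return their centroid. Real front ends
    do this in FPGA/DSP hardware close to the sensor; this sketch only
    shows the arithmetic of the reduction step.
    """
    ys, xs = np.nonzero(frame < threshold)  # coordinates of dark pixels
    if xs.size == 0:
        return None                         # pupil lost, e.g. during a blink
    return float(xs.mean()), float(ys.mean())

# Instead of streaming a full 640x480 frame (~300 kB) per sample, only
# the centroid (two floats) crosses the link, which is what makes high
# sampling rates feasible.
frame = np.full((480, 640), 200, dtype=np.uint8)  # bright background
frame[200:240, 300:350] = 10                      # synthetic dark pupil
print(pupil_centroid(frame))                      # approx. (324.5, 219.5)
```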
This processing architecture is integrated into a ruggedised, IBM-compatible PC, which permits visualisation of the eyes and the corresponding signals. An important design feature is the storage of all image sequences from the cameras as digital files on an exchangeable hard disk. After completion of each ISS mission, the hard disk containing the recordings is returned to Earth. This ensures comprehensive and reliable image-processing analysis in the investigators' lab and minimises the time required for the experiment on the ISS.

Jump-start your eye tracking research

Since its early days, eye tracking has come a long way. With the evolution of computer technology, eye tracking has become less intrusive and more affordable and accessible, and sessions have become more comfortable and easier to set up. Long gone are the scary Frankenstein-style head-mounts. Modern eye trackers are hardly any larger than smartphones and provide a natural experience for respondents. Remote, non-intrusive methods now make eye tracking an easy-to-use tool in human behavior research that objectively measures eye movements in real time.
Beyond doubt, eye tracking enjoys rapidly growing popularity across a vast variety of academic and commercial research areas. By showing how visual attention is distributed over a presented stimulus and how it changes, eye tracking is widely used to assess human behavior in market research, neuroscience, human-computer interaction, and numerous other scientific domains.
What is eye tracking?
Put most simply, eye tracking refers to the measurement of eye activity. More specifically, it denotes the recording of eye position (the point of gaze) and eye movement on a 2D screen or in a 3D environment, based on the optical tracking of corneal reflections, in order to assess visual attention.
How does eye tracking work? Most modern eye trackers utilize near-infrared light along with a high-resolution camera to track gaze positions. The underlying concept is commonly referred to as Pupil Center Corneal Reflection (PCCR): near-infrared light is directed toward the center of the eye (the pupil), causing visible reflections on the cornea (the outermost optical element of the eye), which are tracked by a camera. The gaze direction is then estimated from the vector between the pupil center and the corneal reflection.
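In practice, the pupil-center-minus-corneal-reflection vector is mapped to screen coordinates through a calibration procedure in which the respondent fixates known targets. One common, simple choice is a polynomial regression; the sketch below uses second-order features and invented calibration data purely for illustration, so the numbers and function names are assumptions, not any vendor's actual procedure.

```python
import numpy as np

def design_matrix(v: np.ndarray) -> np.ndarray:
    """Second-order polynomial features of the pupil-CR vector (vx, vy)."""
    vx, vy = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

# Calibration: the respondent fixates known on-screen targets while we
# record the pupil-center-minus-corneal-reflection vector for each.
cal_vectors = np.array([[-2.0, -1.5], [0.0, -1.5], [2.0, -1.5],
                        [-2.0,  1.5], [0.0,  1.5], [2.0,  1.5]])
cal_targets = np.array([[160, 120], [320, 120], [480, 120],
                        [160, 360], [320, 360], [480, 360]])  # pixels

# Least-squares fit of one polynomial per screen axis.
A = design_matrix(cal_vectors)
coeffs, *_ = np.linalg.lstsq(A, cal_targets, rcond=None)

def gaze_point(v: np.ndarray) -> np.ndarray:
    """Map a new pupil-CR vector to estimated screen coordinates."""
    return design_matrix(v[None, :]) @ coeffs

print(gaze_point(np.array([1.0, 0.0])))  # lands right of screen center
```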
There are two types of eye tracker: Remote (also called screen- or desktop-based) and head-mounted (also called mobile). While remote eye trackers are mounted below or placed close to a computer or screen, mobile eye trackers are mounted onto lightweight eyeglass frames (eye tracking glasses) and allow the respondent to walk around freely.
What can eye tracking reveal?
Regardless of which eye tracking system you use, gaze points and fixations are the raw metrics you can derive from eye tracking. Gaze points constitute the basic unit of measure: one gaze point equals one raw sample captured by the eye tracker. Gaze points can be aggregated into fixations, periods in which the eyes remain relatively still, locked onto a specific object. To concentrate your analysis on specific regions of the stimulus, you can define Areas of Interest (AOIs). Generating heat maps allows you to visualize fixation positions, and their changes over time, as an overlay on a specific stimulus across different respondents.
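The aggregation of gaze points into fixations is typically done algorithmically. One classic method is the dispersion-threshold algorithm (I-DT): a window of consecutive samples counts as a fixation as long as its spatial spread stays below a threshold and it lasts a minimum duration. Here is a simplified sketch; the threshold values and function name are chosen only for illustration.

```python
def detect_fixations(gaze, max_dispersion=30.0, min_samples=6):
    """Classic I-DT: group consecutive gaze samples into fixations.

    gaze: list of (x, y) points in pixels at a fixed sampling rate.
    A window is a fixation while (max_x - min_x) + (max_y - min_y)
    stays under max_dispersion and it spans >= min_samples samples.
    Returns (start_index, end_index, centroid) tuples.
    """
    def dispersion(window):
        xs, ys = zip(*window)
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations, i = [], 0
    while i + min_samples <= len(gaze):
        j = i + min_samples
        if dispersion(gaze[i:j]) <= max_dispersion:
            # Grow the window while the samples stay tightly clustered.
            while j < len(gaze) and dispersion(gaze[i:j + 1]) <= max_dispersion:
                j += 1
            xs, ys = zip(*gaze[i:j])
            fixations.append((i, j - 1,
                              (sum(xs) / len(xs), sum(ys) / len(ys))))
            i = j
        else:
            i += 1
    return fixations
```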
There are several ways to analyze gaze positions and fixations. For example, you can analyze the fixation sequences and the performance of different regions in an image or a video with respect to the Time to First Fixation (TTFF), the number of respondents looking toward a specified region (respondent count), or AOI revisits after looking away.
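Given detected fixations and a rectangular AOI, metrics such as TTFF, fixation count, and revisits fall out of a simple scan over the fixation sequence. The sketch below builds on the illustrative detect_fixations output above; the sampling rate and the AOI format are assumptions.

```python
def aoi_metrics(fixations, aoi, sample_rate_hz=60.0):
    """TTFF, fixation count, and revisits for one rectangular AOI.

    fixations: (start_idx, end_idx, (cx, cy)) tuples, e.g. from the
    detect_fixations sketch; aoi: (left, top, right, bottom) in pixels.
    """
    left, top, right, bottom = aoi
    hits = [left <= cx <= right and top <= cy <= bottom
            for _, _, (cx, cy) in fixations]

    ttff = None     # Time to First Fixation, in seconds from onset
    entries = 0     # transitions from outside to inside the AOI
    prev = False
    for hit, (start, _, _) in zip(hits, fixations):
        if hit and ttff is None:
            ttff = start / sample_rate_hz
        if hit and not prev:
            entries += 1
        prev = hit
    revisits = max(entries - 1, 0)  # entries after the first one
    return ttff, sum(hits), revisits
```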
While these metrics are commonly used to track eye movements modulated by visual attention and stimulus features, there are a few others that allow researchers to assess the states of emotional arousal and cognitive workload that drive eye motion. These "advanced" metrics include pupil dilation, distance to the screen, ocular vergence, and blinks.
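Blinks, for example, typically show up as short gaps in the pupil signal, so a rough blink count can be obtained by scanning for runs of missing samples of plausible blink duration. A minimal sketch; the duration bounds are assumptions.

```python
def count_blinks(pupil, sample_rate_hz=60.0, min_s=0.05, max_s=0.5):
    """Count blink-like gaps in a pupil-diameter trace.

    pupil: list of diameters, with None wherever the eye tracker lost
    the pupil. Runs of None lasting between min_s and max_s seconds
    are counted as blinks; longer gaps are treated as tracking loss.
    """
    blinks, run = 0, 0
    for sample in list(pupil) + [0.0]:  # sentinel closes a trailing run
        if sample is None:
            run += 1
        else:
            if min_s * sample_rate_hz <= run <= max_s * sample_rate_hz:
                blinks += 1
            run = 0
    return blinks
```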
To learn more about eye tracking metrics and their characteristics, download our brand-new Pocket Guide to Eye Tracking now.
Why does it make sense to combine eye tracking with other biometric sensors?
Eye movements are tightly linked to visual attention. As a matter of fact, you just can't move your eyes without moving attention (if you manage to cheat science, kudos to you! You'll probably enter the Guinness Book of World Records). However, you certainly can shift attention without moving your eyes. While eye tracking can tell us what people look at and what they see, it can't tell us what people perceive.
Eye tracking gives incredible insights into where we direct our eyes at a certain time and how eye movements are modulated by visual attention and stimulus features (size, brightness, color, and location). However, tracking gaze positions alone doesn't tell us much about the cognitive processes and emotional states that guide eye movements. In these cases, eye tracking needs to be complemented by other biometric sensors such as EEG, GSR, EMG, or facial expression analysis to capture the full picture of human behavior in that very moment and gain meaningful insights into the spatio-temporal dynamics of attention, emotion, and motivation.

Why you should combine EEG with other biometric sensors

Beyond doubt, electroencephalography (EEG) is your means of choice when it comes to measuring brain activity associated with perception, cognitive behavior, and emotional processes. EEG can be considered the biometric sensor with the highest time resolution, revealing substantial insights into the sub-second brain dynamics of engagement, motivation, frustration, cognitive workload, and other metrics associated with stimulus processing, action preparation, and execution (read our blog posts on EEG metrics and the brain processes that can be measured with EEG).
Yet, in combination with other biosensors such as eye tracking, galvanic skin response (GSR), or facial expression analysis, EEG can become even more powerful.
Here’s why.
EEG and eye tracking
Eye tracking provides immediate feedback on why specific brain activity emerged and which elements in the visual field induced it. Taken by itself, EEG can convey only a general idea of why a person shows high workload or heightened emotional arousal at a certain point in time when confronted with visual stimulus content. Without knowing exactly where the person is looking at that very moment, it is impossible to identify which particular visual stimulus triggered the increase in brain activity.
Complementing EEG with eye tracking can also reveal whether a person failed to see a relevant cue that they were supposed to see. What's more, with synchronized eye tracking and EEG data at hand, you can investigate whether, for example, a stimulus in the periphery diverted the person's attention without provoking a counteraction or movement that could have been picked up by the EEG.
Why you should combine EEG and eye tracking:
  1. Synchronizing eye tracking and EEG data allows you to detect the amount of workload or interest generated by a specific stimulus at a specific point in time. With eye tracking, you know exactly where, when, and what a person is looking at and which stimulus is driving the increased brain activity (see the sketch after this list).
  2. Eye tracking delivers information about the exact orientation of the eyeball, helping you to identify artifacts such as blinks and eye movements and to decontaminate your EEG data. Clean data allows clear answers to your research questions; in fact, there is no substitute for clean data (read our blog post on the basics of EEG data processing).
  3. Eye tracking delivers pupillometry-based arousal measures that you can combine with EEG-based intensity measures such as motivation and engagement to obtain comprehensive insights into cognitive processing.
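As a minimal sketch of point 1, assuming the EEG and eye tracker already share a common clock (in practice established via trigger pulses or recording-software timestamps), fixation-locked EEG epochs can be cut out as follows. The function name and window lengths are illustrative, not any particular toolkit's API.

```python
import numpy as np

def fixation_locked_epochs(eeg, eeg_rate_hz, fixation_times_s,
                           pre_s=0.2, post_s=0.6):
    """Cut EEG epochs around fixation onsets (fixation-related analysis).

    eeg: array of shape (n_channels, n_samples), assumed to be on the
    same clock as the eye tracker. fixation_times_s: fixation onset
    times in seconds. Returns (n_fixations, n_channels, epoch_length).
    """
    pre = int(pre_s * eeg_rate_hz)
    post = int(post_s * eeg_rate_hz)
    epochs = []
    for t in fixation_times_s:
        onset = int(t * eeg_rate_hz)
        if onset - pre >= 0 and onset + post <= eeg.shape[1]:
            epochs.append(eeg[:, onset - pre:onset + post])
    return (np.stack(epochs) if epochs
            else np.empty((0, eeg.shape[0], pre + post)))

# Averaging the epochs whose fixations land on a given AOI, versus those
# that land elsewhere, lets you ask whether that stimulus region drives
# a different brain response.
```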
EEG and GSR
EEG offers valuable information on the quality of an emotion, commonly referred to as valence, which can be either positive (“yay!”) or negative (“nay!”): is a person drawn toward or rather pushed away by a certain stimulus? While EEG effectively measures the presence of an emotion, it reveals little about the intensity of that emotion, which is generally referred to as arousal.
Why you should combine EEG and GSR:
One of the most sensitive markers for emotional arousal is galvanic skin response (GSR), which reflects the amount of sweat secretion from sweat glands triggered by emotional stimulation. To assess both the quality of an emotion and its intensity, it is worth combining EEG with skin conductance measures, linking the valence picked up by EEG with the arousal derived from GSR.
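As a toy illustration of how the two streams complement each other: one widely used (though debated) EEG valence proxy is frontal alpha asymmetry, and a simple GSR arousal proxy is the rate of skin-conductance peaks. The channel choice, frequency band, and peak criterion below are all assumptions made for the sketch.

```python
import numpy as np

def frontal_alpha_asymmetry(left_f, right_f, rate_hz, band=(8.0, 13.0)):
    """ln(right) - ln(left) alpha power: a common (debated) valence proxy.

    left_f / right_f: 1-D signals from homologous frontal channels
    (e.g. F3/F4, an assumption, not dictated by the text). Positive
    values are often read as relatively positive valence.
    """
    def alpha_power(x):
        freqs = np.fft.rfftfreq(len(x), d=1.0 / rate_hz)
        spectrum = np.abs(np.fft.rfft(x)) ** 2
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return spectrum[mask].mean()
    return np.log(alpha_power(right_f)) - np.log(alpha_power(left_f))

def scr_peak_rate(gsr, rate_hz, min_rise=0.05):
    """Arousal proxy: skin-conductance peaks per minute.

    Counts local maxima that rise at least min_rise microsiemens above
    the preceding trough (the threshold is an assumption).
    """
    peaks, trough = 0, gsr[0]
    for prev, cur, nxt in zip(gsr, gsr[1:], gsr[2:]):
        trough = min(trough, cur)
        if prev < cur > nxt and cur - trough >= min_rise:
            peaks += 1
            trough = cur
    return peaks / (len(gsr) / rate_hz / 60.0)
```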
EEG and facial expression analysis
Facial expression analysis captures changes in facial features on a moment-to-moment basis, indicating which muscle groups are active while a person is smiling, crying, or angry. While facial expression analysis is certainly a powerful measure for assessing emotional valence based on facial movements, it cannot tell you whether the person truly is in a specific emotional state or mood: facial expression analysis can't differentiate between a genuinely happy, hysterical, or even phoney smile.
Why you should combine EEG and facial expression analysis:
Unlike facial expression analysis, EEG is able to monitor the global emotional state of a person, which cannot be controlled consciously (you can fake your smile, but you can't trick your brain). Combining the two modalities gives you insights into both the moment-by-moment changes in emotional expression and the variations in emotional state across a longer time span.
Whenever you synchronize EEG with other biometric sensors, you can't go wrong: you simply add more detail to the picture, as every sensor contributes a valuable feature that you cannot get from any other. Who knows? You might even light upon a previously unknown brain process driving exactly that specific emotion. Just think about it!
