Eye tracking

Scientists track eye movements in glaucoma patients to check for vision impairment while driving.
This article is about the study of eye movement. For the tendency to visually track potential prey, see eye-stalking.

Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Eye trackers are used in research on the visual system, in psychology, in psycholinguistics, marketing, as an input device for human computer interaction, and in product design. There are a number of methods for measuring eye movement. The most popular variant uses video images from which the eye position is extracted. Other methods use search coils or are based on the electrooculogram.

Yarbus eye tracker from the 1960s

History

In the 1800s, studies of eye movement were made using direct observations.

In 1879 in Paris, Louis Émile Javal observed that reading does not involve a smooth sweeping of the eyes along the text, as previously assumed, but a series of short stops (called fixations) and quick saccades.[1] This observation raised important questions about reading, which were explored during the 1900s: On which words do the eyes stop? For how long? When do they regress back to already seen words?

An example of fixations and saccades over text. This is the typical pattern of eye movement during reading. The eyes never move smoothly over still text.

Edmund Huey[2] built an early eye tracker, using a sort of contact lens with a hole for the pupil. The lens was connected to an aluminum pointer that moved in response to the movement of the eye. Huey studied and quantified regressions (only a small proportion of saccades are regressions), and he showed that some words in a sentence are not fixated.

The first non-intrusive eye trackers were built by Guy Thomas Buswell in Chicago, using beams of light that were reflected from the eye and then recorded on film. Buswell made systematic studies into reading[3] and picture viewing.[4]

In the 1950s, Alfred L. Yarbus[5] did important eye tracking research and his 1967 book is often quoted. He showed the task given to a subject has a very large influence on the subject's eye movement. He also wrote about the relation between fixations and interest:

"All the records ... show conclusively that the character of the eye movement is either completely independent of or only very slightly dependent on the material of the picture and how it was made, provided that it is flat or nearly flat."[6] The cyclical pattern in the examination of pictures "is dependent not only on what is shown on the picture, but also on the problem facing the observer and the information that he hopes to gain from the picture."[7]
This study by Yarbus (1967) is often referred to as evidence on how the task given to a person influences his or her eye movement.
"Records of eye movements show that the observer's attention is usually held only by certain elements of the picture.... Eye movement reflects the human thought processes; so the observer's thought may be followed to some extent from records of eye movement (the thought accompanying the examination of the particular object). It is easy to determine from these records which elements attract the observer's eye (and, consequently, his thought), in what order, and how often."[6]
"The observer's attention is frequently drawn to elements which do not give important information but which, in his opinion, may do so. Often an observer will focus his attention on elements that are unusual in the particular circumstances, unfamiliar, incomprehensible, and so on."[8]
"... when changing its points of fixation, the observer's eye repeatedly returns to the same elements of the picture. Additional time spent on perception is not used to examine the secondary elements, but to reexamine the most important elements."[9]
This study by Hunziker (1970)[10] on eye tracking in problem solving used simple 8 mm film to track eye movement by filming the subject through a glass plate on which the visual problem was displayed.[11][12]

In the 1970s, eye tracking research expanded rapidly, particularly reading research. A good overview of the research in this period is given by Rayner.[13]

In 1980, Just and Carpenter[14] formulated the influential Strong eye-mind Hypothesis, the hypothesis that "there is no appreciable lag between what is fixated and what is processed". If this hypothesis is correct, then when a subject looks at a word or object, he or she also thinks about it (processes it cognitively) for exactly as long as the recorded fixation. The hypothesis is often taken for granted by researchers using eye tracking. However, gaze-contingent techniques offer an interesting option for disentangling overt and covert attention, differentiating what is fixated from what is processed.

During the 1980s, the eye-mind hypothesis was often questioned in light of covert attention,[15][16] the attention to something that one is not looking at, which people often do. If covert attention is common during eye tracking recordings, the resulting scan path and fixation patterns would often show not where our attention has been, but only where the eye has been looking, and so eye tracking would not indicate cognitive processing.

The 1980s also saw the birth of using eye tracking to answer questions related to human-computer interaction. Specifically, researchers investigated how users search for commands in computer menus.[17] Additionally, computers allowed researchers to use eye-tracking results in real time, primarily to help disabled users.[18]

More recently, there has been growth in using eye tracking to study how users interact with different computer interfaces. Specific questions researchers ask relate to how easy different interfaces are for users.[19] The results of such eye tracking research can lead to changes in the design of the interface. Yet another recent area of research focuses on Web development. This can include how users react to drop-down menus or where they focus their attention on a website so the developer knows where to place an advertisement.[20]

According to Hoffman,[21] current consensus is that visual attention is always slightly (100 to 250 ms) ahead of the eye. But as soon as attention moves to a new position, the eyes will want to follow.[22]

We still cannot infer specific cognitive processes directly from a fixation on a particular object in a scene.[23] For instance, a fixation on a face in a picture may indicate recognition, liking, dislike, puzzlement etc. Therefore, eye tracking is often coupled with other methodologies, such as introspective verbal protocols.

Tracker types

Eye trackers measure rotations of the eye in one of several ways, but principally they fall into three categories: (i) measurement of the movement of an object (normally, a special contact lens) attached to the eye, (ii) optical tracking without direct contact to the eye, and (iii) measurement of electric potentials using electrodes placed around the eyes.

Eye-attached tracking

The first type uses an attachment to the eye, such as a special contact lens with an embedded mirror or magnetic field sensor, and the movement of the attachment is measured with the assumption that it does not slip significantly as the eye rotates. Measurements with tight fitting contact lenses have provided extremely sensitive recordings of eye movement, and magnetic search coils are the method of choice for researchers studying the dynamics and underlying physiology of eye movement. It allows the measurement of eye movement in horizontal, vertical and torsion directions.[24]

Optical tracking

An eye-tracking head-mounted display. Each eye has an LED light source (gold-color metal) on the side of the display lens, and a camera under the display lens.

The second broad category uses some non-contact, optical method for measuring eye motion. Light, typically infrared, is reflected from the eye and sensed by a video camera or some other specially designed optical sensor. The information is then analyzed to extract eye rotation from changes in reflections. Video-based eye trackers typically use the corneal reflection (the first Purkinje image) and the center of the pupil as features to track over time. A more sensitive type of eye tracker, the dual-Purkinje eye tracker,[25] uses reflections from the front of the cornea (first Purkinje image) and the back of the lens (fourth Purkinje image) as features to track. A still more sensitive method of tracking is to image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates. Optical methods, particularly those based on video recording, are widely used for gaze tracking and are favored for being non-invasive and inexpensive.
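
As an illustration of the video-based approach, the following sketch locates the two features most optical trackers rely on, the dark pupil and the corneal reflection, in a single infrared eye image. It assumes a hypothetical cropped grayscale image file ("eye.png") and uses illustrative, untuned thresholds; it is a minimal sketch of the idea, not a production tracker.

```python
# Minimal sketch of dark-pupil + corneal-reflection feature extraction
# (assumes OpenCV 4.x and a hypothetical cropped infrared eye image).
import cv2

img = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)

# Dark pupil: threshold the darkest pixels and take the largest blob's centroid.
_, pupil_mask = cv2.threshold(img, 40, 255, cv2.THRESH_BINARY_INV)
contours, _ = cv2.findContours(pupil_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
pupil = max(contours, key=cv2.contourArea)
m = cv2.moments(pupil)
pupil_center = (m["m10"] / m["m00"], m["m01"] / m["m00"])

# Corneal reflection (first Purkinje image): the brightest spot after smoothing.
_, _, _, cr_center = cv2.minMaxLoc(cv2.GaussianBlur(img, (5, 5), 0))

print("pupil center:", pupil_center, "corneal reflection:", cr_center)
```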


Electric potential measurement

The third category uses electric potentials measured with electrodes placed around the eyes. The eyes are the origin of a steady electric potential field, which can also be detected in total darkness and when the eyes are closed. It can be modelled as being generated by a dipole with its positive pole at the cornea and its negative pole at the retina. The electric signal that can be derived using two pairs of contact electrodes placed on the skin around one eye is called the electrooculogram (EOG). If the eyes move from the centre position towards the periphery, the retina approaches one electrode while the cornea approaches the opposing one. This change in the orientation of the dipole, and consequently in the electric potential field, results in a change in the measured EOG signal. Conversely, by analysing these changes, eye movement can be tracked. Due to the discretisation given by the common electrode setup, two separate movement components – a horizontal and a vertical – can be identified. A third EOG component is the radial EOG channel,[26] which is the average of the EOG channels referenced to some posterior scalp electrode. This radial EOG channel is sensitive to the saccadic spike potentials stemming from the extra-ocular muscles at the onset of saccades, and allows reliable detection of even miniature saccades.[27]

Potential drifts and variable relations between the EOG signal amplitudes and the saccade sizes make it challenging to use EOG for measuring slow eye movement and detecting gaze direction. EOG is, however, a very robust technique for measuring saccadic eye movement associated with gaze shifts and for detecting blinks. Contrary to video-based eye trackers, EOG allows recording of eye movements even with eyes closed, and can thus be used in sleep research. It is a very light-weight approach that, in contrast to current video-based eye trackers, requires only very low computational power, works under different lighting conditions and can be implemented as an embedded, self-contained wearable system.[28] It is thus the method of choice for measuring eye movement in mobile daily-life situations and REM phases during sleep. The major disadvantage of EOG is its relatively poor gaze-direction accuracy compared to a video tracker. That is, it is difficult to determine with good accuracy using EOG exactly where a subject is looking, though the times of eye movements can be determined.
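
As a rough illustration of why EOG suits saccade detection, the sketch below applies a simple velocity threshold to a synthetic horizontal EOG trace. The sampling rate, amplitudes and threshold are made-up illustrative values; real recordings require filtering, drift correction and per-subject thresholds.

```python
# Sketch of velocity-threshold saccade detection on a synthetic horizontal EOG channel.
import numpy as np

fs = 250.0                                    # sampling rate (Hz), illustrative
t = np.arange(0.0, 2.0, 1.0 / fs)
eog = np.where(t > 1.0, 200.0, 0.0)           # a step standing in for a saccade (µV)
eog += np.random.normal(0.0, 5.0, eog.shape)  # measurement noise

velocity = np.gradient(eog) * fs              # µV per second
is_fast = np.abs(velocity) > 2000.0           # illustrative velocity threshold
onsets = np.flatnonzero(np.diff(is_fast.astype(int)) == 1)
print("detected saccade onsets at t =", t[onsets])
```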

Technologies and techniques

The most widely used current designs are video-based eye trackers. A camera focuses on one or both eyes and records their movement as the viewer looks at some kind of stimulus. Most modern eye-trackers use the center of the pupil and infrared / near-infrared non-collimated light to create corneal reflections (CR). The vector between the pupil center and the corneal reflections can be used to compute the point of regard on a surface or the gaze direction. A simple calibration procedure of the individual is usually needed before using the eye tracker.[29]
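
The following sketch shows the basic arithmetic described above: forming the pupil-CR vector and mapping it to a screen location. The affine coefficients stand in for the result of a prior calibration and are placeholder values, not ones from any real device.

```python
# Sketch: pupil-CR vector -> point of regard, with placeholder calibration values.
import numpy as np

pupil_center = np.array([312.0, 248.0])        # pixels, from image processing
corneal_reflection = np.array([301.0, 240.0])  # pixels

v = pupil_center - corneal_reflection          # the pupil-CR vector

A = np.array([[55.0, 3.0],                     # placeholder calibrated gain matrix
              [2.0, 60.0]])
b = np.array([960.0, 540.0])                   # placeholder calibrated offset (screen px)

point_of_regard = A @ v + b
print("estimated point of regard on screen:", point_of_regard)
```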

Two general types of infrared / near-infrared (also known as active light) eye tracking techniques are used: bright-pupil and dark-pupil. Their difference is based on the location of the illumination source with respect to the optics. If the illumination is coaxial with the optical path, then the eye acts as a retroreflector as the light reflects off the retina creating a bright pupil effect similar to red eye. If the illumination source is offset from the optical path, then the pupil appears dark because the retroreflection from the retina is directed away from the camera.[30]

Bright-pupil tracking creates greater iris/pupil contrast, allowing more robust eye tracking with all iris pigmentation, and greatly reduces interference caused by eyelashes and other obscuring features.[31] It also allows tracking in lighting conditions ranging from total darkness to very bright light. However, bright-pupil techniques are not effective for tracking outdoors, as extraneous IR sources interfere with monitoring.[citation needed]

Another, less used, method is known as passive light. It uses visible light for illumination, which may be distracting to users.[30] Another challenge with this method is that the contrast of the pupil is lower than in active light methods; therefore, the center of the iris is used for calculating the gaze vector instead.[32] This calculation requires detecting the boundary between the iris and the white sclera (limbus tracking), which presents an additional challenge for vertical eye movements due to obstruction by the eyelids.[33]

Eye-tracking setups vary greatly; some are head-mounted, some require the head to be stable (for example, with a chin rest), and some function remotely and automatically track the head during motion. Most use a sampling rate of at least 30 Hz. Although 50/60 Hz is more common, today many video-based eye trackers run at 240, 350 or even 1000/1250 Hz, which is needed in order to capture fixational eye movements or correctly measure saccade dynamics.

Eye movements are typically divided into fixations and saccades – when the eye gaze pauses in a certain position, and when it moves to another position, respectively. The resulting series of fixations and saccades is called a scanpath. Smooth pursuit describes the eye following a moving object. Fixational eye movements include microsaccades, which are small, involuntary saccades that occur during attempted fixation. Most information from the eye is made available during a fixation or smooth pursuit, but not during a saccade.[citation needed] The central one or two degrees of the visual angle (that area of the visual field which falls on the fovea) provide the bulk of visual information; the input from larger eccentricities (the periphery) has less resolution and little to no colour, although contrast and movement are detected better in peripheral vision. Hence, the locations of fixations or smooth pursuit along a scanpath show what information loci on the stimulus were processed during an eye tracking session. On average, fixations last for around 200 ms during the reading of linguistic text, and 350 ms during the viewing of a scene. Preparing a saccade towards a new goal takes around 200 ms.[citation needed]
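
A common way to split a recorded gaze trace into fixations and saccades is a velocity-threshold (I-VT) classification. The sketch below applies it to a synthetic trace; the 30 degrees-per-second threshold and the data are illustrative only, and real pipelines also merge short fixations and filter noise.

```python
# Sketch of velocity-threshold (I-VT) classification of gaze samples.
import numpy as np

fs = 250.0                                             # sampling rate (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.where(t < 0.5, 2.0, 8.0) + np.random.normal(0.0, 0.05, t.shape)  # gaze x (deg)
y = np.full_like(t, 1.0)                               # gaze y (deg), held constant

velocity = np.hypot(np.gradient(x), np.gradient(y)) * fs  # degrees per second
is_saccade = velocity > 30.0                              # I-VT threshold (illustrative)
print("fixation samples:", int(np.sum(~is_saccade)),
      "saccade samples:", int(np.sum(is_saccade)))
```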

Scanpaths are useful for analyzing cognitive intent, interest, and salience. Other biological factors (some as simple as gender) may affect the scanpath as well. Eye tracking in human–computer interaction (HCI) typically investigates the scanpath for usability purposes, or as a method of input in gaze-contingent displays, also known as gaze-based interfaces.[34]

Data presentation

To allow interpretation of the data recorded by the various types of eye trackers, various software packages exist that animate or visually represent the data, so that the visual behavior of one or more users can be graphically summarized. Graphical presentation is rarely the basis of research results, since it is limited in terms of what can be analysed; research relying on eye tracking, for example, usually requires quantitative measures of the eye movement events and their parameters. The following visualisations are the most commonly used:

Animated representations of a point on the interface. This method is used when the visual behavior is examined individually, indicating where the user focused their gaze at each moment, complemented with a small path that indicates the previous saccadic movements, as seen in the image.

Static representations of the saccade path. This is fairly similar to the method described above, with the difference that it is static. A higher level of expertise than with the animated representations is required to interpret it.

Heat maps. An alternative static representation, mainly used for the aggregated analysis of the visual exploration patterns of a group of users, differing from both methods explained before. In these representations, the ‘hot’ zones or zones with higher density designate where the users focused their gaze (not their attention) with a higher frequency. Heat maps are the best-known visualization technique for eye tracking studies.[35]

Blind zone maps, or focus maps. This method is a simplified version of the heat maps in which the zones less visually attended by the users are displayed clearly, thus allowing an easier understanding of the most relevant information; that is to say, it shows which zones were not seen by the users.
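
As a sketch of how a heat map of the kind described above can be produced, the following code spreads each fixation as a Gaussian weighted by its duration over a grid; the fixation list, grid size and kernel width are made-up illustrative values.

```python
# Sketch: build a fixation heat map from duration-weighted Gaussian kernels.
import numpy as np

width, height, sigma = 200, 120, 15.0        # grid size and kernel width (illustrative)
fixations = [(50, 60, 300), (120, 40, 450),  # (x, y, duration in ms), made-up data
             (125, 45, 200), (170, 90, 150)]

yy, xx = np.mgrid[0:height, 0:width]
heat = np.zeros((height, width))
for x, y, dur in fixations:
    heat += dur * np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2.0 * sigma ** 2))
heat /= heat.max()                           # normalise to [0, 1] for colour mapping
print("hottest cell (row, col):", np.unravel_index(heat.argmax(), heat.shape))
```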

Eye tracking vs. gaze tracking

An example of an eye gaze trace overlay

Eye trackers necessarily measure the rotation of the eye with respect to the measuring system. If the measuring system is head mounted, as with EOG, then eye-in-head angles are measured. If the measuring system is table mounted, as with scleral search coils or table mounted camera (“remote”) systems, then gaze angles are measured.

In many applications, the head position is fixed using a bite bar, a forehead support or something similar, so that eye position and gaze are the same. In other cases, the head is free to move, and head movement is measured with systems such as magnetic or video based head trackers.

For head-mounted trackers, head position and direction are added to eye-in-head direction to determine gaze direction. For table-mounted systems, such as search coils, head direction is subtracted from gaze direction to determine eye-in-head position.
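
The bookkeeping described in this section can be sketched with simple yaw/pitch angles; a real implementation would use full 3D rotations (matrices or quaternions) rather than adding angles, so treat the values and the small-angle treatment below as illustrative.

```python
# Sketch: combining head direction and eye-in-head direction (angles in degrees).
head_yaw, head_pitch = 10.0, -5.0   # head direction from a head tracker
eye_yaw, eye_pitch = 4.0, 2.0       # eye-in-head direction from a head-mounted tracker

# Head-mounted tracker: gaze direction = head direction + eye-in-head direction.
gaze_yaw, gaze_pitch = head_yaw + eye_yaw, head_pitch + eye_pitch

# Table-mounted tracker: eye-in-head = measured gaze direction - head direction.
eih_yaw, eih_pitch = gaze_yaw - head_yaw, gaze_pitch - head_pitch

print("gaze:", (gaze_yaw, gaze_pitch), "eye-in-head:", (eih_yaw, eih_pitch))
```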

Eye tracking in practice

A great deal of research has gone into studies of the mechanisms and dynamics of eye rotation, but the goal of eye tracking is most often to estimate gaze direction. Users may be interested in what features of an image draw the eye, for example. It is important to realize that the eye tracker does not provide absolute gaze direction, but rather can only measure changes in gaze direction. In order to know precisely what a subject is looking at, some calibration procedure is required in which the subject looks at a point or series of points, while the eye tracker records the value that corresponds to each gaze position. (Even those techniques that track features of the retina cannot provide exact gaze direction, because there is no specific anatomical feature that marks the exact point where the visual axis meets the retina, if indeed there is such a single, stable point.) An accurate and reliable calibration is essential for obtaining valid and repeatable eye movement data, and this can be a significant challenge for non-verbal subjects or those who have unstable gaze.
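
A minimal sketch of such a calibration, assuming raw pupil-CR vectors recorded while the subject fixated five known screen targets, is to fit a linear mapping by least squares; the recorded values below are invented for illustration.

```python
# Sketch: fit a linear calibration from raw pupil-CR vectors to screen targets.
import numpy as np

# Raw vectors recorded at five known targets (all numbers invented for illustration).
raw = np.array([[-5.0, -3.0], [5.0, -3.0], [-5.0, 3.0], [5.0, 3.0], [0.0, 0.0]])
targets = np.array([[100.0, 100.0], [1820.0, 100.0],
                    [100.0, 980.0], [1820.0, 980.0], [960.0, 540.0]])

# Augment with a constant column so the fit includes an offset term.
X = np.hstack([raw, np.ones((len(raw), 1))])
coeffs, *_ = np.linalg.lstsq(X, targets, rcond=None)

def to_screen(vec):
    """Map a raw pupil-CR vector to screen coordinates with the fitted model."""
    return np.append(vec, 1.0) @ coeffs

print(to_screen(np.array([2.0, -1.0])))
```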

Each method of eye tracking has advantages and disadvantages, and the choice of an eye tracking system depends on considerations of cost and application. There are offline methods and online procedures like AttentionTracking. There is a trade-off between cost and sensitivity, with the most sensitive systems costing many tens of thousands of dollars and requiring considerable expertise to operate properly. Advances in computer and video technology have led to the development of relatively low cost systems that are useful for many applications and fairly easy to use. Interpretation of the results still requires some level of expertise, however, because a misaligned or poorly calibrated system can produce wildly erroneous data.

Eye tracking while driving a car in a difficult situation

Frames from narrow road eye tracking described in this section[36]

The eye movements of two groups of drivers were filmed with a special head camera by a team at the Swiss Federal Institute of Technology: novice and experienced drivers had their eye movements recorded while approaching a bend of a narrow road. The series of images has been condensed from the original film frames[37] to show two eye fixations per image for better comprehension.

Each of these stills corresponds to approximately 0.5 seconds in real time.

The series of images shows an example of eye fixations #9 to #14 of a typical novice and an experienced driver.

Comparison of the top images shows that the experienced driver checks the curve and even has Fixation No. 9 free to look aside, while the novice driver needs to check the road and estimate his distance to the parked car.

In the middle images the experienced driver is now fully concentrating on the location where an oncoming car could be seen. The novice driver concentrates his view on the parked car.

In the bottom image the novice is busy estimating the distance between the left wall and the parked car, while the experienced driver can use his peripheral vision for that and still concentrates his view on the dangerous point of the curve: if a car appears there, he has to give way, i.e. stop to the right instead of passing the parked car.[38]

Eye tracking of younger and elderly people in walking

Elderly subjects depend more on foveal vision than younger subjects during walking. Their walking speed is decreased by a limited visual field, probably caused by deteriorated peripheral vision.

Younger subjects make use of both their central and peripheral vision while walking. Their peripheral vision allows faster control over the process of walking.[39]

Applications

A wide variety of disciplines use eye tracking techniques, including cognitive science, psychology (notably psycholinguistics and the visual world paradigm), human-computer interaction (HCI), marketing research and medical research (neurological diagnosis). Specific applications include the tracking of eye movement in language reading, music reading, human activity recognition, the perception of advertising, and the playing of sport.[40] Uses include:

  • Cognitive Studies
  • Medical Research
  • Laser refractive surgery
  • Human Factors
  • Computer Usability
  • Translation Process Research
  • Vehicle Simulators
  • In-vehicle Research
  • Training Simulators
  • Fatigue Detection
  • Virtual Reality
  • Adult Research
  • Infant Research
  • Adolescent Research
  • Geriatric Research
  • Primate Research
  • Sports Training
  • fMRI / MEG / EEG
  • Commercial eye tracking (web usability, advertising, marketing, automotive, etc.)
  • Finding good clues
  • Communication systems for disabled
  • Improved image and video communications
  • Product development
  • Employee training
  • Computer Science: Activity Recognition[41][42][43]
  • Image and video compression
  • Computer vision[44]

Commercial applications

In recent years, the increased sophistication and accessibility of eye tracking technologies have generated a great deal of interest in the commercial sector. Applications include web usability, advertising, sponsorship, package design and automotive engineering. In general, commercial eye tracking studies function by presenting a target stimulus to a sample of consumers while an eye tracker is used to record the activity of the eye. Examples of target stimuli may include websites, television programs, sporting events, films, commercials, magazines, newspapers, packages, shelf displays, consumer systems (ATMs, checkout systems, kiosks), and software. The resulting data can be statistically analyzed and graphically rendered to provide evidence of specific visual patterns. By examining fixations, saccades, pupil dilation, blinks and a variety of other behaviors researchers can determine a great deal about the effectiveness of a given medium or product. While some companies complete this type of research internally, there are many private companies that offer eye tracking services and analysis.

One of the most prominent fields of commercial eye tracking research is web usability.[citation needed] While traditional usability techniques are often quite powerful in providing information on clicking and scrolling patterns, eye tracking offers the ability to analyze user interaction between the clicks and how much time a user spends between clicks. This provides valuable insight into which features are the most eye-catching, which features cause confusion and which ones are ignored altogether. Specifically, eye tracking can be used to assess search efficiency, branding, online advertisements, navigation usability, overall design and many other site components. Analyses may target a prototype or competitor site in addition to the main client site.

Eye tracking is commonly used in a variety of different advertising media. Commercials, print ads, online ads and sponsored programs are all conducive to analysis with current eye tracking technology. For instance, in newspapers, eye tracking studies can be used to find out in what way advertisements should be mixed with the news in order to catch the subject's eyes.[45] Analyses focus on visibility of a target product or logo in the context of a magazine, newspaper, website, or televised event. One example is an analysis of eye movements over advertisements in the Yellow Pages, which examined what particular features caused people to notice an ad, whether they viewed ads in a particular order, and how viewing times varied. The study revealed that ad size, graphics, color, and copy all influence attention to advertisements. This allows researchers to assess in great detail how often a sample of consumers fixates on the target logo, product or ad. As such, an advertiser can quantify the success of a given campaign in terms of actual visual attention.[46] Another example of this is a study that found that, on a search engine results page, authorship snippets received more attention than the paid ads or even the first organic result.[47]

Eye tracking also provides package designers with the opportunity to examine the visual behavior of a consumer while interacting with a target package. This may be used to analyze distinctiveness, attractiveness and the tendency of the package to be chosen for purchase. Eye tracking is often utilized while the target product is in the prototype stage. Prototypes are tested against each other and competitors to examine which specific elements are associated with high visibility and appeal.

One of the most promising applications of eye tracking research is in the field of automotive design. Research is currently underway to integrate eye tracking cameras into automobiles. The goal of this endeavor is to provide the vehicle with the capacity to assess in real time the visual behavior of the driver. The National Highway Traffic Safety Administration (NHTSA) estimates that drowsiness is the primary causal factor in 100,000 police-reported accidents per year. Another NHTSA study suggests that 80% of collisions occur within three seconds of a distraction. By equipping automobiles with the ability to monitor drowsiness, inattention, and cognitive engagement, driving safety could be dramatically enhanced. Lexus claims to have equipped its LS 460 with the first driver monitor system in 2006, providing a warning if the driver takes his or her eyes off the road.[48]

Since 2005, eye tracking has been used in communication systems for disabled persons, allowing the user to speak, send e-mail, browse the Internet and perform other such activities using only their eyes.[49] Eye control works even when the user has involuntary movement as a result of cerebral palsy or other disabilities, and for those who wear glasses or have other physical interference that would limit the effectiveness of older eye control systems.[citation needed]

Eye tracking has also seen limited use in autofocus still camera equipment, where users can focus on a subject simply by looking at it through the viewfinder.


Notes

  1. Reported in Huey 1908/1968.
  2. Lua error in package.lua at line 80: module 'strict' not found.
  3. Buswell (1922, 1937)
  4. Buswell (1935)
  5. Yarbus 1967
  6. 6.0 6.1 Yarbus 1967, p. 190
  7. Yarbus 1967, p. 194
  8. Yarbus 1967, p. 191
  9. Yarbus 1967, p. 193
  10. Hunziker, H. W. (1970). Visuelle Informationsaufnahme und Intelligenz: Eine Untersuchung über die Augenfixationen beim Problemlösen. Schweizerische Zeitschrift für Psychologie und ihre Anwendungen, 1970, 29, Nr 1/2 (english abstract: http://www.learning-systems.ch/multimedia/forsch1e.htm )
  11. http://www.learning-systems.ch/multimedia/eye movements problem solving.swf
  12. http://www.learning-systems.ch/multimedia/forsch1e.htm
  13. Rayner (1978)
  14. Just and Carpenter (1980)
  15. Posner (1980)
  16. Wright & Ward (2008)
  17. [1]
  18. [2], [3]
  19. [4], [5], [6]
  20. [7]
  21. Hoffman 1998
  22. Deubel and Schneider 1996
  23. Holsanova 2007
  24. David A. Robinson: A method of measuring eye movement using a scleral search coil in a magnetic field, IEEE Transactions on Bio-Medical Electronics, October 1963, 137–145 (PDF)
  25. Lua error in package.lua at line 80: module 'strict' not found.
  26. Elbert, T., Lutzenberger, W., Rockstroh, B., Birbaumer, N., 1985. Removal of ocular artifacts from the EEG. A biophysical approach to the EOG. Electroencephalogr Clin Neurophysiol 60, 455-463.
  27. Lua error in package.lua at line 80: module 'strict' not found.
  28. Lua error in package.lua at line 80: module 'strict' not found. [8]
  29. Lua error in package.lua at line 80: module 'strict' not found.
  30. 30.0 30.1 Lua error in package.lua at line 80: module 'strict' not found.
  31. The Eye: A Survey of Human Vision; Wikimedia Foundation
  32. Lua error in package.lua at line 80: module 'strict' not found.
  33. Lua error in package.lua at line 80: module 'strict' not found.
  34. Majaranta, P., Aoki, H., Donegan, M., Hansen, D.W., Hansen, J.P., Hyrskykari, A., Räihä, K.J., Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies, IGI Global, 2011
  35. Nielsen, Jakob; Pernice, Kara (2010). Eyetracking Web Usability. New Riders Publishing. p. 11. ISBN 0-321-49836-4. Google Book Search. Retrieved on October 28, 2013.
  36. Hans-Werner Hunziker, (2006) Im Auge des Lesers: foveale und periphere Wahrnehmung - vom Buchstabieren zur Lesefreude [In the eye of the reader: foveal and peripheral perception - from letter recognition to the joy of reading] Transmedia Stäubli Verlag Zürich 2006 ISBN 978-3-7266-0068-6 Based on data from:Cohen, A. S. (1983). Informationsaufnahme beim Befahren von Kurven, Psychologie für die Praxis 2/83, Bulletin der Schweizerischen Stiftung für Angewandte Psychologie
  37. Cohen, A. S. (1983). Informationsaufnahme beim Befahren von Kurven, Psychologie für die Praxis 2/83, Bulletin der Schweizerischen Stiftung für Angewandte Psychologie
  38. Pictures from: Hans-Werner Hunziker, (2006) Im Auge des Lesers: foveale und periphere Wahrnehmung – vom Buchstabieren zur Lesefreude [In the eye of the reader: foveal and peripheral perception – from letter recognition to the joy of reading] Transmedia Stäubli Verlag Zürich 2006 ISBN 978-3-7266-0068-6
  39. Itoh N, Fukuda T. (2002) Comparative study of eye movement in extent of central and peripheral vision and use by young and elderly walkers. Percept Mot Skills. 2002 Jun;94(3 Pt 2):1283–91
  40. See, e.g., newspaper reading studies.
  41. Bulling, A. et al.: Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography, Proc. of the 6th International Conference on Pervasive Computing (Pervasive 2008), pp. 19–37, Sydney, Australia, May 2008.
  42. Bulling, A. et al.: Eye Movement Analysis for Activity Recognition, Proc. of the 11th International Conference on Ubiquitous Computing (UbiComp 2009), pp. 41–50, Orlando, United States, September 2009.
  43. Bulling, A. et al.: Eye Movement Analysis for Activity Recognition Using Electrooculography, IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI).
  44. Karthikeyan, S. et al.: From Where and How to What We See, Proc. of International Conference on Computer Vision (ICCV 2013), pp. 19–37, Sydney, Australia, December 2013.
  45. "Eye Tracking Study: The New York Times vs. The Wall Street Journal"
  46. Lua error in package.lua at line 80: module 'strict' not found.
  47. "Eye Tracking Study: The Importance of Using Google Authorship in Search Results"[10]
  48. Lua error in package.lua at line 80: module 'strict' not found.
  49. Lua error in package.lua at line 80: module 'strict' not found.

References

  • Lua error in package.lua at line 80: module 'strict' not found.
  • Buswell, G.T. (1922). Fundamental reading habits: A study of their development. Chicago, IL: University of Chicago Press.
  • Buswell, G.T. (1935). How People Look at Pictures. Chicago, IL: University of Chicago Press.
  • Buswell, G.T. (1937). How adults read. Chicago, IL: University of Chicago Press.
  • Carpenter, Roger H.S.; Movements of the Eyes (2nd ed.). Pion Ltd, London, 1988. ISBN 0-85086-109-8.
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Duchowski, A. T., "A Breadth-First Survey of Eye Tracking Applications", Behavior Research Methods, Instruments, & Computers (BRMIC), 34(4), November 2002, pp. 455–470.
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Ferguson RD (1998). Servo tracking system utilizing phase-sensitive detection of reflectance variations. US Patent # 5,767,941
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Hoffman, J. E. (1998). Visual attention and eye movements. In H. Pashler (ed.), Attention (pp. 119–154). Hove, UK: Psychology Press.
  • Holsanova, J. (forthcoming) Picture viewing and picture descriptions, Benjamins.
  • Huey, E.B. (1968). The psychology and pedagogy of reading. Cambridge, MA: MIT Press. (Originally published 1908)
  • Jacob, R. J. K. & Karn, K. S. (2003). Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises. In R. Radach, J. Hyona, & H. Deubel (eds.), The mind's eye: cognitive and applied aspects of eye movement research (pp. 573–605). Boston: North-Holland/Elsevier.
  • Just MA, Carpenter PA (1980) A theory of reading: from eye fixation to comprehension. Psychol Rev 87:329–354
  • Liechty,J, Pieters, R, & Wedel, M. (2003). The Representation of Local and Global Exploration Modes in Eye Movements through Bayesian Hidden Markov Models. Psychometrika, 68 (4), 519–542.
  • Mulligan, JB, (1997). Recovery of Motion Parameters from Distortions in Scanned Images. Proceedings of the NASA Image Registration Workshop (IRW97), NASA Goddard Space Flight Center, MD
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Pirri, F., Pizzoli, M., Rudi, A, (2011). A general method for the point of regard estimation in 3D space. Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on, 921-928.doi:10.1109/CVPR.2011.5995634;
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Lua error in package.lua at line 80: module 'strict' not found.
  • Riju Srimal, Jorn Diedrichsen, Edward B. Ryklin, and Clayton E. Curtis. Obligatory adaptation of saccade gains. J Neurophysiol. 2008 Mar;99(3):1554–8
  • Robinson, D. A. A method of measuring eye movement using a scleral search coil in a magnetic field. IEEE Trans. Biomed. Eng. vol. BME-l0, pp. 137–145, 1963
  • Wright, R.D., & Ward, L.M. (2008). Orienting of Attention. New York. Oxford University Press.
  • Lua error in package.lua at line 80: module 'strict' not found.. (Originally published in Russian 1962)

Commercial eye tracking

  • Bojko, A. (2006). Using Eye Tracking to Compare Web Page Designs: A Case Study. Journal of Usability Studies, Vol.1, No. 3. [11]
  • Bojko, A. & Stephenson, A. (2005). It's All in the Eye of the User: How eye tracking can help answer usability questions. User Experience, Vol. 4, No. 1.
  • Chandon, Pierre, J. Wesley Hutchinson, and Scott H. Young (2001), Measuring Value of Point-of-Purchase Marketing with Commercial Eye-Tracking Data. [12]
  • Duchowski, A. T., (2002) A Breadth-First Survey of Eye Tracking Applications, 'Behavior Research Methods, Instruments, & Computers (BRMIC),' 34(4), November 2002, pp. 455–470.
  • National Highway Traffic Safety Administration. (n.d.) Retrieved July 9, 2006, from [13]
  • Pieters, R., Wedel, M. & Zhang, J. (2007). Optimal Feature Advertising Under Competitive Clutter, Management Science, 2007, 51 (11) 1815–1828.
  • Pieters, R., & Wedel, M. (2007). Goal Control of Visual Attention to Advertising: The Yarbus Implication, Journal of Consumer Research, 2007, 34 (August), 224–233.
  • Pieters, R. & Wedel, M. (2004). Attention Capture and Transfer by elements of Advertisements. Journal of Marketing, 68 (2), 2004, 36–50.
  • Thomas RECORDING GmbH, high-speed Eye Tracking Systems for neuro-scientific purposes [14]
  • Weatherhead, James. (2005) Eye on the Future, 'British Computer Society, ITNOW Future of Computing,' 47 (6), pp. 32–33 [15]
  • Wedel, M. & Pieters, R. (2000). Eye fixations on advertisements and memory for brands: a model and findings. Marketing Science, 19 (4), 2000, 297–312.
  • Wittenstein, Jerran. (2006). EyeTracking sees gold in its technology. [Electronic Version]. San Diego Source, The Daily Transcript, April, 3rd, 2006. [16]