A psychotechnological review on eye-tracking systems: towards user experience
Disability and Rehabilitation: Assistive Technology, 2012; 7(4): 261–281
© 2012 Informa UK, Ltd. ISSN 1748-3107 print/ISSN 1748-3115 online
DOI: 10.3109/17483107.2011.635326

REVIEW ARTICLE

A psychotechnological review on eye-tracking systems: towards user experience

Maria Laura Mele1 & Stefano Federici1,2
1 ECoNA, Interuniversity Centre for Research on Cognitive Processing in Natural and Artificial Systems, Sapienza University of Rome, Rome, Italy and 2 Department of Human and Education Sciences, University of Perugia, Perugia, Italy

Purpose: The aim of the present work is to present a critical review of the international literature on eye-tracking technologies, focusing on those features that characterize them as 'psychotechnologies'. Method: A critical literature review was conducted through the main psychology, engineering, and computer science databases, following specific inclusion and exclusion criteria. A total of 46 matches from 1998 to 2010 were selected for content analysis, and the results were divided into four broad thematic areas. Results: We found that, although there is growing attention to end-users, most of the studies reviewed in this work are far from adopting holistic human–computer interaction models that include both the individual differences and the needs of users. The user is often considered only as a measurement object of the functioning of the technological system and not as a real alter ego of the intrasystemic interaction. Conclusion: In order to fully benefit from the communicative functions of gaze, research on eye-tracking must emphasize user experience. Eye-tracking systems will become an effective assistive technology for the integration, adaptation and neutralization of environmental barriers only when a holistic model is applied to both the design process and the assessment of the functional components of the interaction.

Implications for Rehabilitation
• Eye-tracking methodologies facilitate the communicative functions of gaze, favoring successful outcomes in many areas of rehabilitation.
• Eye-tracking systems would be an effective assistive technology for the integration, adaptation and neutralization of environmental barriers.
• Successful rehabilitation outcomes are improved when a holistic and interactive model is applied to both the design processes and the assessment of the functional components of the interaction.
Keywords: Assistive technology, eye gaze tracking, human computer interaction, psychotechnology

Correspondence: Maria Laura Mele, ECoNA, Interuniversity Centre for Research on Cognitive Processing in Natural and Artificial Systems, Sapienza University of Rome, Rome, Italy. Tel.: +39 347 7753176. E-mail: marialaura.mele@uniroma1.it (Accepted October 2011)

Introduction

Although the use of eye-tracking devices dates back a long way, the growing interest of the scientific literature in this subject is quite recent. At present, a web search for 'eye-tracking' yields about 40,000 results on Google and over 4,000 results on Google Scholar. However, these numbers decrease drastically when one explores the user experience dimension ('[user experience] is about how it [system/technology] works on the outside, where a person comes into contact with it and has to work with it' [1]) of eye-tracking systems, because this research topic is often the domain of technical disciplines such as engineering and ergonomics. In fact, although the first studies on gaze tracking focused mainly on medical applications, nowadays the scientific literature on these systems seems to have forgotten its origins. That is not surprising at all: it is characteristic of scientific interest and production to emphasize the dominance of engineering in assistive technology, giving less attention to the professional role of psychology [2–4].

Before describing the methodological criteria used in our review, this introduction covers the following topics: i) a brief historical review; ii) an explanation of the definition of 'psychotechnology' and its distinction from ergonomics and engineering, to clarify the perspective from which we analyze the eye-tracking literature described here; and iii) the main purpose of this review.

A brief historical review

Eye-tracking is a set of methods and techniques that allows the detection of both the eye movements and the fixations performed during a visual interaction in a given context. During the second half of the nineteenth century, following the studies on eye movement by Louis Émile Javal [5], the development of eye-tracking techniques became a goal of many studies, especially for identifying saccadic eye movements while reading a text.

After the first analytical techniques were proposed, which required 'invasive' tools (Figure 1) for the detection of eye movement through direct contact with the cornea [6], the twentieth century saw a growing interest in more accurate and less intrusive techniques, e.g. the corneal reflection technique for motion pictures developed by Dodge and Cline [7]. For many years, eye-tracking methodology was used in medical research; however, starting from the 1970s, there has been remarkable development of eye-tracking systems in connection with the evolution of marketing and design.

New methodologies for research application, especially in human-computer interaction (HCI) [8], have been made possible by the growing development of eye gaze tracking techniques. Most of the studies conducted in the 1970s focused on improving the technical accuracy of these systems and on reducing the impact of the instrumentation on users while they executed their tasks. In pursuit of this goal, the technique of detecting multiple reflections from the eye has been particularly important; it improves detection accuracy by allowing eye rotations to be dissociated from head movements [9].

Eye-tracking methods were originally used to analyze the relationship between eye movements and visual stimuli (e.g. the movement of a target); however, with technological development, a growing interest in the relationship between visual patterns and cognitive activities emerged [10]. Since the 1980s, with the development and dissemination of microcomputers, the application of eye-tracking techniques in HCI has become established, especially in usability studies [11–17].

In the last two decades, different approaches have been proposed to study user-interface interaction and to simulate operations in both artificial environments and virtual reality. Together with growing technological developments, the application of eye-tracking systems to the study of HCI is a promising approach both for analyzing the mechanisms involved in the interaction and for improving augmentative and alternative communication systems. Therefore, thanks to its adaptability to different contexts of use, the eye-tracker can be considered a promising technology able to overcome or reduce the disadvantaged conditions arising during access to information and computer services, namely an assistive technology (AT) providing support for access to information technologies for people with disabilities [18,19].

Figure 1. A rudimentary apparatus for tracking eye movements through direct contact of a lens with the cornea: the system detects the movement of the lens connected to the apparatus while the user is looking at the target (i.e. a textual field) [6].
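The corneal-reflection approach mentioned above still underlies most modern video-based trackers: gaze is usually estimated from the vector between the pupil centre and the corneal glint, which is largely invariant to small head translations, and that vector is mapped to screen coordinates through a calibration procedure. The sketch below is a minimal illustration of such a second-order polynomial calibration; it is a generic example rather than the method of any specific system reviewed here, and the data and function names are hypothetical.

```python
import numpy as np

def fit_gaze_mapping(pupil_glint_vectors, screen_points):
    """Fit a 2nd-order polynomial mapping from pupil-glint vectors (x, y)
    to screen coordinates, as in classic pupil-corneal-reflection calibration.
    Both arguments are (N, 2) arrays collected while the user fixates
    N known calibration targets."""
    v = np.asarray(pupil_glint_vectors, dtype=float)
    s = np.asarray(screen_points, dtype=float)
    x, y = v[:, 0], v[:, 1]
    # Design matrix: 1, x, y, xy, x^2, y^2 (six terms per gaze coordinate).
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)  # (6, 2) coefficient matrix
    return coeffs

def estimate_gaze(coeffs, pupil_glint_vector):
    """Map a single pupil-glint vector to an estimated on-screen gaze point."""
    x, y = pupil_glint_vector
    a = np.array([1.0, x, y, x * y, x**2, y**2])
    return a @ coeffs

# Hypothetical 9-point calibration followed by one gaze estimate.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = np.array([[sx, sy] for sy in (100, 400, 700) for sx in (160, 800, 1440)], float)
    vectors = (targets - [800, 400]) / 2000 + rng.normal(0, 0.002, targets.shape)
    mapping = fit_gaze_mapping(vectors, targets)
    print(estimate_gaze(mapping, vectors[4]))  # should be near the centre target
```

The quality of this mapping, together with head-movement compensation, is what the accuracy figures reported by the reviewed systems ultimately reflect.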
What is psychotechnology?

Starting from Federici's definition, a psychotechnology can be considered any 'technology that emulates, extends, amplifies or modifies sensory-motor, psychological or cognitive functions of the mind' [2,20,21]. This definition highlights an intrasystemic perspective focusing on the relation between user and technology: the system in which the interaction takes place must be considered as more than the mere sum of subjective and objective components; it becomes a gestaltic phenomenon emerging from the human experience of the technology. In this way, the 'modification' component plays a key role in overcoming the cause–effect perspective: 'any technology is hence able to permit the human being's adaptation to the environment-system and, at the same time, force users to embrace cognitive and cultural modification and adaptation' [21,22].
Starting from this theoretical background, the psychotechnological approach differs significantly from the ergonomic one, since it analyses the interaction system by observing equally the personal, technological, and environmental components of the interaction under the lens of an intrasystemic and integrated perspective [22–25]. The psychotechnological approach constitutes for us a general inclusion criterion and, at the same time, an interpretative perspective on the international literature on eye-tracking. We therefore exclude from our bibliographic research all the studies that focus only on engineering or medical topics, in favor of all the studies from 1998 to 2010 that use or propose eye-tracking systems following HCI criteria.

Purpose of the review

The aim of the following critical review of the international literature is to investigate the state of the art of eye-tracking systems by focusing on the psychotechnological aspects involved in the application of eye-tracking methodologies to assessment and rehabilitation processes. More specifically, our goal is to analyze the current state of eye-tracking systems in order to better understand the implications arising from the intrasystemic relation between users, eye-tracking technologies, and the environment of use, focusing on their potential in different application contexts such as evaluation, therapy, and education.

For the reasons already stated at the beginning of this section (namely, that the scientific production mainly follows a technical approach and is based on the research methods of engineering journals), the use of keywords that match the classification criteria of this review would not have returned many products. In fact, although different studies follow experimental paradigms that can be considered psychotechnological, they are not classified as 'psychotechnology'. For these reasons, we used inclusion and exclusion criteria that are broader and not perfectly aligned with our classification. We made this methodological choice so as not to force the scientific products into arbitrarily constructed categories and, at the same time, to distinguish whether they can be read under the user experience (UX) [1,26,27], HCI [8], and psychology perspectives. This is why we titled this work not simply a 'critical review of eye-tracking systems' but a review of eye-tracking under the specific perspective of the psychotechnological one.

The selected products describe a number of studies on eye-tracking that we classify under the following four thematic areas: 'HCI: eye-tracking methods and techniques'; 'Cognitive processes involved in visual information processing'; 'Eye-tracking of subjects in developmental age'; and 'Assistive technology use of eye-tracking systems as input device'. Moreover, in the 'Discussion' section, methodological issues and theoretical and practical implications are critically analyzed.
Method

The bibliographic search covered the English-language literature on all studies published within the last 20 years. The EBSCO Academic Search Complete database, the Cambridge Scientific Abstracts (CSA) database, and the Institute of Electrical and Electronics Engineers (IEEE) Xplore database were examined. The Applied Social Sciences Index and Abstracts (ASSIA), Medline, PsycINFO, PsycARTICLES and IEEE/IET Electronic Library (IEL) databases were consulted using the 'Eye-tracking' OR 'Eye-tracker' OR 'Eye gaze tracking' combination of keywords. Figure 2 shows in more detail the selection process carried out for the bibliographic research, starting from the main purpose of the review described in the introduction.

A total of 1,232 references were found. After removing duplicates and works written in languages other than English, relevance was evaluated on the basis of the following inclusion and exclusion criteria (a minimal code sketch of this screening step is given after the exclusion criteria).

Inclusion criteria
The inclusion criteria were related to the following areas of investigation:
• the study of eye-tracking systems as assistive technologies designed to facilitate communication processes;
• the study of the sensory-motor, psychological and cognitive implications of eye gaze tracking systems from a psychotechnological approach;
• the study of the involvement of eye-tracking systems in evaluation, rehabilitation, and learning contexts;
• the importance of the studies in terms of impact and number of citations.

Exclusion criteria
Studies related to the following fields of application were excluded:
• commercial applications of eye-tracking technologies in visual marketing;
• eye-tracking analysis of the physiological aspects involved in visual-tracking tasks, with the exception of studies conducted on subjects in childhood and adolescence;
• the usability evaluation of web interfaces and software by means of eye gaze tracking methods used only as tools for the detection of eye behavior and not as input devices;
• the analysis of eye-tracking in individuals with pervasive developmental disorder or attention deficit hyperactivity disorder (ADHD).
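By way of illustration only, the screening step just described (keyword query across databases, removal of duplicates and non-English records, restriction to 1998–2010) can be expressed as a simple filter over exported records. This is a hypothetical sketch, not the actual procedure or tooling used for this review; field names such as 'title', 'year' and 'language' are assumptions about the export format.

```python
from dataclasses import dataclass

KEYWORDS = ("eye-tracking", "eye-tracker", "eye gaze tracking")

@dataclass(frozen=True)
class Record:
    title: str
    abstract: str
    year: int
    language: str
    doi: str  # used here as the deduplication key

def matches_query(rec: Record) -> bool:
    """True if any of the OR-combined keywords occurs in the title or abstract."""
    text = f"{rec.title} {rec.abstract}".lower()
    return any(kw in text for kw in KEYWORDS)

def screen(records: list[Record]) -> list[Record]:
    """Remove duplicates and non-English works, keep 1998-2010 keyword matches."""
    seen, kept = set(), []
    for rec in records:
        if rec.doi in seen:
            continue                      # duplicate across databases
        seen.add(rec.doi)
        if rec.language.lower() != "english":
            continue                      # exclusion: non-English works
        if not (1998 <= rec.year <= 2010):
            continue                      # outside the review window
        if matches_query(rec):
            kept.append(rec)
    return kept

# Example with two hypothetical records, one of which is screened out.
recs = [
    Record("Eye-tracking in usability testing", "...", 2006, "English", "10.1/x"),
    Record("Ein Eye-Tracker für die Diagnostik", "...", 2004, "German", "10.1/y"),
]
print(len(screen(recs)))  # -> 1
```

The remaining records were then screened manually against the thematic inclusion and exclusion criteria listed above.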
Figure 2. Selection process of the products, carried out by focusing on the features that characterize eye-tracking systems as 'psychotechnologies'.
Results

The results were divided by subject area, for a total of 46 matches published from 1998 to 2010 (Figure 3). About 70% of the results were published in conference proceedings and the remaining 30% in journal articles. Table I shows a structured analysis grid including the following information for each work: i) method, ii) system used/developed, iii) type of evaluation carried out and iv) main results. The overall results have been classified into four thematic areas (Table I):
• 'HCI: eye-tracking methods and techniques', 18 matches: [28–45];
• 'Cognitive processes involved in visual information processing', nine matches: [46–54];
• 'Eye-tracking of subjects in developmental age', six matches: [55–60];
• 'Assistive technology use of eye-tracking systems as input device', 13 matches: [61–73].
Although several reviews and monographs on eye-tracking methods and techniques in different application fields (e.g. HCI, usability) have been proposed during the last decade [74–77], none of these products follow our inclusion and exclusion criteria.

Table I. Selection results of reference products in the databases from 1998 to 2010, listed by year of publication and classified according to the following thematic areas: 1) HCI: eye-tracking methods and techniques (18 products); 2) Cognitive processes involved in visual information processing (9 products); 3) Eye-tracking of subjects in developmental age (6 products); 4) Assistive technology use of eye-tracking systems as input device (13 products). A total of 46 experimental studies were retrieved from the main databases indexed by EBSCO, CSA and IEEE Xplore.

Thematic area 1
HCI: eye-tracking methods and techniques
The eye-tracking technique has been widely used in HCI, especially in usability evaluation studies. However, thanks to the versatility of these systems, HCI research on eye-tracking has increasingly turned its attention to input/output techniques that allow users to interact with the environment of use. In this section we discuss 18 studies conducted from 1998 to 2010 [28–45], describing one of the most important issues behind eye movement detection techniques, i.e. measurement accuracy, by analyzing two main research areas in this field: studies proposing techniques for the detection of eye position and studies focusing on the measurement of gaze trajectory.

As suggested by Duchowski, eye-tracking research has developed along three historical periods, each characterized by progressive interest towards different fields: the studies

Figure 3. Selection process of the products indexed by the CSA, EBSCO and IEEE Xplore databases for the 'Eye-Tracking' OR 'Eye-Tracker' OR 'Eye Gaze Tracking' keywords. The literature research was conducted from 1998 to 2010 and followed two search phases: 1) overview of the results obtained without applying any constraint; 2) inclusion and exclusion criteria applied to narrow the fields of search.
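Most of the accuracy figures reported for the studies in Table I are expressed either as detection rates or as angular gaze error in degrees of visual angle. As a reading aid, the short sketch below converts an on-screen gaze error into visual angle given a viewing distance and pixel size; it is a generic geometric illustration under assumed values, not a procedure taken from any of the reviewed papers.

```python
import math

def gaze_error_degrees(error_px: float, px_size_mm: float, distance_mm: float) -> float:
    """Convert an on-screen gaze error (pixels) into degrees of visual angle."""
    error_mm = error_px * px_size_mm
    return math.degrees(math.atan2(error_mm, distance_mm))

# Assumed setup: 0.28 mm pixel pitch, user seated 600 mm from the screen.
# A 40-pixel error then corresponds to roughly 1 degree of visual angle,
# the order of magnitude reported by several of the systems in Table I.
print(round(gaze_error_degrees(40, 0.28, 600), 2))
```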
Disabil Rehabil Assist Technol Downloaded from informahealthcare.com by Prof. Stefano Federici on 05/21/12 For personal use only. Table I. Summary table of reviewed articles. Author/year/title Method System Evaluation Results Thematic area 1—Human–computer interaction: eye-tracking methods and techniques Kawato & Tetsutani, 2002. A circle-frequency filter (CFF) is The system is tested with The robustness of the proposed A detection rate of 98.5% was achieved. Real-time detection of applied to the skin-color region a commercial NTSC video algorithm is tested with 400 images The algorithm is pretty robust against between-the-eyes with a previously extracted in order to camera and a video capture taken from a face image database. scale changes as long as face image is circle-frequency filter [29]. extract candidate points for between- board. No other special H/W frontal. © 2012 Informa UK, Ltd. the-eyes together with face rotation is implemented. angles. Bhaskar et al., 2003. Blink Eye blink is detected by using frame The system is tested with a 1.6 13 stored video sequences of 700 A success rate of 97.0% in blink detection and eye tracking differencing followed by optical GHz Pentium IV computer. frames (768 × 576 PAL frames at detection and an automatic eyes for eye localization [30]. flow. This method is subsequently 25 fps), showing different people localizations of average rate of 22 combined with the Kanade-Totnasi- have been analysed. Each sequence frames per second was reached. Lucas (KLT) method to track the contained an average of five blinks. localized eyes. Bagci et al., 2004. Eye A face geometry and skin color The system is tested with Five different subjects of The tracker successfully located the tracking using Markov recognition model is proposed. a Canon GL2 camera with different ethnicities have been features in 99.2% of the frames. Closed models [33]. Temporal evolution is captured by the a frame size of 720 × 480 videorecorded in order to obtain eyes are detected with an error rate of Markov chain method. detecting skin color and four a total of 13,000 frames of indoors 1.2%. The classifier module determined feature points associated with videos with slightly varying the location of the iris in 98.5% of the nostrils and eyebrows. illumination conditions. frames, excluding closed eyes. For full- frontal faces the classification rate is close to 100%. Methods of using eye Haseyama & Kaneko, Eye tracking of video sequences is Two systems are proposed: 4 color video sequences (320 × 240 The method correctly tracked the eyes detection technique 2005. A robust human-eye made by 1) extracting the region the first extracts a region pixels, 24 bit levels, 30 fps) taken in different facial orientations and when tracking method in video including the both eyes and excluding including both eyes; the by a digital video camera have been the angle of the face in-plane rotation sequences [35]. other features, and 2) locating second locates each eye analyzed with different settings changes, even in low contrast images. each eye position from the region position. applied to the parameters for the previously extracted by using the execution. circle-frequency filter (CFF). Droege et al., 2008. 
A The main algorithms for pupil A low cost eye and gaze 11 algorithm suitable for a pupil All algorithms suffered from the poor comparison of pupil centre centre detection are selected for a tracking system developed method have been compared with resolution of the input device, some estimation algorithms [39]. comparative investigation. by Droege et al., 2007 and the algorithm previously developed of them were apparently unable to described in Droege, D., by the authors on an input device deal with low resolution input at all. Geier, T. and Paulus, D. with a poor resolution. The differences were often caused by (2007) Improved low cost the presence of the IR-light reflection gaze tracker. H. Istance and within the pupil. R. Bates (editors.), COGAIN 2007, pp 37–40. http://www. cogain.org/cogain2007/ Figueira et al., 2008. In order to evaluate eye tracking The ImageMagick++ library 8 tests have been proposed to The centre of the drawn synthetic pupil Evaluation tests for eye systems, the authors use synthetic to generate the synthetic eye evaluate the performance of the coincided with the coordinates used tracking systems [40]. images of the eye to simulate the most images is used on a computer algorithms in the most critical by the library to generate the synthetic critical situations. running Ubuntu Linux. situations. eye images. The library generator was reliable to quantify the performance of algorithms. Methods of using gaze Beach et al., 1998. Eye- Image processing algorithms is Low cost eye head-mounted Experimental information of The device is able to collecting detection technique tracker system for use with applied to the eye video to find the displays integrated with the evaluation with users is not data useful in human factors head-mounted displays [28]. location of the pupil in real-time. voice recognition able to provided. experimentation and has been used to use commercially available do a workload analysis and to measure hardware and software. different aspects of a subject’s visual A psychotechnological review on eye-tracking systems search patterns. 265 (Continued)
Disabil Rehabil Assist Technol Downloaded from informahealthcare.com by Prof. Stefano Federici on 05/21/12 For personal use only. Table 1. (Continued). 266 Author/year/title Method System Evaluation Results Amarnag et al., 2003. Real- To efficiently track the pupil in real- The system is been The evaluation has been carried The system was able to achieve an time eye tracking for human time without using infra-red devices, implemented on an Intel out by using the Clemson accuracy of 88.3% on the CMU computer interfaces [31]. a four stages algorithm is proposed: Pentium III 997 MHz with a University Audio Visual database and an accuracy of 86.4% and preprocessing, Bayesian classification, frame rate of 26 fps. Experiments (CUAVE) database 76.5% on the CUAVE database for the clustering and postprocessing. consisting of 36 subjects and stationary and moving speaker case Natural motion in the field of view of the CMU audio-visual dataset respectively. the camera has been made possible. consisting of stationary and moving subjects while speaking. Beymer & Flickner, 2003. A model alignment approach able The system consists of a wide The gaze point accuracy has A gaze point accuracy of 0.6? on a set of Eye gaze tracking using an to track the 3-D anatomy of the eye angle stereo (Videre Design’s been evaluated. one subject has 22 stereo pairs was achieved. active stereo head [32]. (corneal ball, pupil and angular offset MEGA-D stereo) for face been asked to look at 22 known M. Laura Mele & S. Federici of the fovea from the optical axis) is detection and a narrow FOV locations on a monitor: the error proposed. stereo for eye tracking. between ground truth and reported gaze has been measured. Kim et al., 2004. A hierarchical generalized regression The system is tested with a Users have been asked to keep The accuracy of gaze classification Nonintrusive eye gaze neural networks (H-GRNN) scheme near infra-red CCD camera their fixing gaze on 24 regions by H-GRNN was 92%. The system tracking under natural head to map eye and mirror parameters (TV ZOOM LENS M6Z, on a screen with a resolution of was able to keep the pupil images on movements [34]. is proposed. Two mirrors based Hitachi, Japan) and two 1152 × 864 pixels. the optical axis of the camera under on geometric and linear algebra conventional back-surface different head positions. calculations are used to compensate mirrors. head movements. Zhiwei et al., 2005. Eye gaze Two novel techniques are introduced The system is composed by The accuracy of both the gaze 1) First system: the average horizontal tracking under natural head to improve the existing gaze tracking two cameras mounted under tracking system proposed have angular gaze accuracy is 1.47 and the movements [36].Zhiwei & techniques: 1) a 3-D gaze tracking the computer monitor and been tested with six users. average vertical angular gaze accuracy Qiang, 2007. Novel eye gaze technique which estimates the an IR illuminator mounted is 1.87. 2) Second system: the average tracking techniques under 3-D direction of the gaze; 2) a 2-D around the centre of the angular gaze accuracy in the horizontal natural head movement mapping-based gaze estimation camera lens to produce the direction and vertical direction is 1.17 [37]. technique able to allow free head corneal glint in the eye image. and 1.38 respectively. movements and minimize the calibration procedure. Ying et al., 2007. 
A Eye gaze directions are detected by The system is composed 12 subjects (5 men, and 7 women), The error rate of gaze points was lower noncontact eye gaze measuring the relative positions of the by one Personal Computer clustered into 4 groups of 3 person than the error rate of fixation points. tracking system for human Purkinje image and the corneal reflect (PC), one eye gaze tracker have been asked to look at the The error rate decreased about 6% both computer interaction [38]. method. The relative position of composed of two near infra- points on the screen from left to in horizon and vertical. Purkinje image and pupil is extracted red light sources and one CCD right and from top to bottom. as parameter of gaze in 2-D. camera, one image grabber Gaze points and the fixation points and one screen. were recorded for each frame for 3 seconds. Manh Duong et al., 2008. Eye movements are recorded real- The system is composed by a The accuracy and the performance The average velocity was 1.12°/s, for Easy-setup eye movement time by means of two processes: programmable camera and a of the system have been tested with horizontal eye movement and 2.48°/s recording system for 1) a double circle fitting algorithm display. No computer system eight men and two women. The for the vertical eye movement. These human-computer to detect the centre of pupil from is used. The system works at following experiments have been velocities have been considered to interaction [41]. the captured image; 2) the variance 300 Hz sampling rate. carried out: fixations test, smooth indicate the accuracy of the system. projection function (VPF) to detect pursuit tests, range tests and the eye corners. performance tests. (Continued) Disability and Rehabilitation: Assistive Technology
Disabil Rehabil Assist Technol Downloaded from informahealthcare.com by Prof. Stefano Federici on 05/21/12 For personal use only. Table 1. (Continued). Author/year/title Method System Evaluation Results Topal et al., 2008. A Very low dimensional feature vectors Eye touch system (ETS) The performance of the prototype ILSA-2 performed well for six regions wearable head-mounted are proposed to process eye data. consists of the following ILSA-2 has been tested with users (with 100% recognition performance) sensor-based apparatus for components: an infra-red light and compared to a previously and satisfactorily for the others. eye tracking applications sensitive apparatus (ILSA), proposed prototype, ILSA-1. Comparing the user evaluation [42]. a data acquisition device performance with ILSA-1, the (DAQ), a computer software performance with ILSA-2 improved © 2012 Informa UK, Ltd. and an external power supply. the performance of region detector ILSA is equipped with few application by about 20%. IrDA sensors and IrDA LEDs mounted on a frame to detect the movements of the iris. Chi et al., 2009. Key Some major problems and their Different systems using the The following features which Most of the products meet only some techniques of eye gaze possible solutions of eye gaze tracking vector from Purkinje image are meant to guarantee a good of these requirements, moreover, tracking based on pupil (EGT) techniques direction are centre location to estimate the measurement have been evaluated: i) the main difficulties in using these corneal reflection [43]. proposed. gaze direction are analyzed. accuracy; ii) reliability, iii) robustness; techniques were encountered especially iv) nonintrusiveness; v) free head when attempting to reduce real-time movements; vi) no prior calibration; interferences. and vii) real-time response. Price, 2009. Infra-red–based The infra-red technique is used to The eyepieces are made of two The system has been tested on The results showed that:1) the eye-tracker system for measure saccades by means of the printed circuit boards (PCBs) healthy young adults using a chin estimation of the saccades was good saccades [44]. vertical arrangement of the infra-red mounted on either side of the stand to support the head. The provided; 2) the calibration phase was detector array. eye. Data acquisition is carried subjects have been asked to follow successful 3) there was no drift in the out using an NI PXI module. a LED light on a screen. zero-position readings from the photo- A LabView routine controls detectors. the lighting in the target board and the recording of the conditioned signals. Zhang et al., 2010. Design 3-D gaze estimate providing the The system is composed by The accuracy of camera calibration Low relative error results show that the and calibration for gaze coordinates of pupil and gaze two CCD cameras and two has been tested by calculating proposed method is accurate. tracking system [45]. orientation for tracking using a planar loop infra-red light sources. the difference between the mirror and planar calibration pattern Besides, the system is still lengths of some checkers through is proposed. composed of image grabber, measurement and the real length. personal computer and processor. Thematic area 2—Cognitive processes involved in visual information processing Movement Goldberg & Kotval, 1999. 
In order to to provide a framework An DBA systems Model 626 The validity for the assessment Compared with a randomly organized characteristics of the Computer interface for the eye movement data analysis infra-red corneal reflection of interface quality has been set of component buttons, well eyes evaluation using eye techniques, an eye movement- system is used. evaluated by analyzing the eye organized functional grouping resulted movements [46]. based analysis is carried out for movement locations and the in shorter scanpaths, covering smaller the evaluation of several computer scanpaths of 20 subjects (7 female, areas. The poorer interface resulted in interfaces. 5 male, ages ranged from 20 to 27) more fixations than the better interface. while using both good and poor The poor interface produced less software interfaces. efficient search behavior, the layout of component representations did not influence their interpretability. Henderson et al., 1999. In order to investigate the influence Eye movements are monitored Eye movements have been Results supported a model of eye The effects of semantic of semantic factors on eye movement using a Generation 5.5 recorded while 18 participants movement control during scene consistency on eye patterns during the free viewing Stanford Research Institute viewed line-drawing pictures of 24 viewing in which the eyes are initially movements during complex of complex natural scenes, the dual Purkinje image eye- natural scenes in preparation for a driven by visual factors and global scene viewing [47]. manipulation of the semantic tracker with a resolution of 1 memory test (experiment 1) or to scenes semantics, with cognitive and A psychotechnological review on eye-tracking systems consistency of a particular region of a minute arc and a linear output find a target object (experiment 2). semantic aspects of local scene regions complex natural scene is performed by over the range of the visual playing an increasingly important role. changing a specific target in that region. display used. 267 (Continued)
Disabil Rehabil Assist Technol Downloaded from informahealthcare.com by Prof. Stefano Federici on 05/21/12 For personal use only. Table 1. (Continued). 268 Author/year/title Method System Evaluation Results Cowen et al., 2002. An eye Eye movements and performance data Eye movements are recorded Four web pages have been Analyses of performance data provided movement analysis of web (response scores and task completion using a SMI’s Head-mounted presented twice to each of the 17 reliable evidence for a variety of page page usability [48]. times) are analyzed for the evaluation Eyetracking Device II (HED- participants. The subjects have and task effects, including a page by of web page usability. II) with scene camera. The been also asked to perform two task interaction. Four eye movement eye-tracker uses two small tasks while browsing any web measures were found to be sensitive to cameras (the eye camera and page. Performance data and similar patterns of difference between the scene camera) mounted on eye movement data have been pages and tasks that were evident in the a bicycle helmet for comfort, recorded. perfomance data. weighing only 450 g in total. Cooke, 2006. Is eye tracking Subjective measures of user are A remote eye-tracker 26 subjects, 21 women and 5 men From comparison of the aggregated the next step in usability analyzed to compare user experience ERICA and the GazeTracke of age from 19 to 36. Users have eye movement and survey data, the M. Laura Mele & S. Federici testing [49]?. with objective measures (eye fixation software are used to collect been asked to browse three web measures of eye fixation duration duration and eye fixation frequency). eye movement data. Eye sites and to fill out a survey. and eye fixation frequencies were movements are sampled at 30 compatible with the measure of MHz. A15” flat panel monitor perceived task difficulty. The mean task set at a resolution of 1024 × completion times also appeared to be 768 pixels was used to display compatible. the web pages. Components involved Ratwani et al., 2008. The processes required to extract Eye movement data are Seventeen subjects (10 women and Results supported the task analytic in the integration of Thinking graphically: specific information and to integrate collected using an LC 7 men) have been asked to analyze theories for specific information information connecting vision and information are examined by Technologies Eyegaze Analysis four sets of choropleth and answer extraction and the processes of visual cognition during graph collecting verbal protocol and eye System (LC Technologies, specific information extraction and and cognitive integration for integrative comprehension [51]. movement data. McLean, VA) heads-free eye- integration questions. questions. Further, the integrative tracker operating at 60 Hz and processes scaled up as graph complexity running from a single desktop increased, highlighting the importance personal computer with of these processes for integration in Windows 2000. more complex graphs. Neuro-psychological Xiao et al., 2007. Using In order to demonstrate that audio- Gaze position is measured Auditory stimuli of constant, All sound categories induced more mechanisms involved eye-tracking to study audio- visual perceptual integration affects at 500 Hz by an SR Research increasing, or decreasing pitch smooth-pursuit eye movement in visual tasks visual perceptual integration low-level oculomotor mechanisms, EyeLink-II system during the were presented. 
20 subjects have than silence, with the greatest effect [50]. a visual-tracking task with a visual-tracking task on a 21 been asked to follow a small blue occurring with stimuli of increasing continuous, task-irrelevant sound inch CRT monitor. disk moving horizontally on a gray pitch. A possible explanation given by from a stationary source have been background, for a duration of 6 the authors is that integration of the combined. seconds. visual scene with continuous sound creates the perception of continuous visual motion. Guérard et al., 2009. The The “path length effect” [Parmentier, The eye movements are 42 with a normal or corrected- The results shows that eye tracking may processing of spatial FBR., Elford, G, & Maybery, MT. recorded with the Tobii to-normal vision have been asked provide a finer grained evaluation of information in short-term (2005). Transitional information eye-tracker (resolution and to fixate alternatively nine blue the cognitive processes active during memory [52]. in spatial serial memory: Path sampling rate are 0.25° and 50 calibration dots and perform a the encoding and rehearsal of spatial characteristics affect recall Hz) (Tobii Technology; www. memory task. information, since eye movements were performance. Journal of Experimental Tobii.se). The eye movements affected by path length to a greater Psychology: Learning, Memory are captured by an integrated extent than performance. & Cognition, 31, 412–427], is camera. systematically investigated using eye tracking and interference prodecures to explore the mechanisms responsible for the processing of spatial information. (Continued) Disability and Rehabilitation: Assistive Technology
Disabil Rehabil Assist Technol Downloaded from informahealthcare.com by Prof. Stefano Federici on 05/21/12 For personal use only. Table 1. (Continued). Author/year/title Method System Evaluation Results Humphrey et al., 2009. Scanpath comparisons, by using a Eye position is recorded 45 subjects (24 females and 21 A relationship between saliency and eye Domain knowledge string editing algorithm, are carried using an SMI iVIEW X males, age range 18–30, mean movements, shown by the similarity of moderates the influence out to analyze if saliency influences Hi-Speed eye-tracker with a age 22) domain specialists have actual scanpaths to those predicted by of visual saliency in scene where and how participants look. gaze position accuracy of 0.2 been asked to look at a first set of the saliency model (Itti & Koch, 2000) recognition [53]. degrees. photographs of real-world scenes was found. However, domain-specific © 2012 Informa UK, Ltd. and a second set of stimuli during knowledge can act as an overriding which they have been asked to factor, weakening this relationship identify the picture as old or never between saliency and eye movements. seen before. This effect was stable over time. Harrison et al., 2010. The integration of auditory and visual An head-mounted display 26 participants (8 males and 18 Participants’ accuracy at counting target Multisensory integration information are analyzed to assess Microvision Nomad™ ND2000 females, with ages from 18 to 35 audio-visual events was worse when with a head-mounted how background visual motion and with a single optical see- years) have been asked to count participants were walking. Compared display[54]. the relative movement of sound affect through monocle (800 × 600 audio-visual events that required with when they were sitting at a desk, a head-mounted display (HMD). pixels) connected to a Sony™ integration of sounds while participants’ accuracy at counting target U50 tablet computer is used. walking around a room, sitting in audio-visual events showed a trend the room, or sitting inside a mobile to be worse when they experienced room. a combination of background visual motion and the relative movement of sound. Thematic area 3—Eye-tracking of subjects in developmental age Analysis of saccadic Salman et al., 2006. Saccades The hypothesis that saccadic An infra-red El Mar eye- 20 target steps have been presented Saccadic latency decreased with eye movements in children [58]. parameters show variation with age tracker (El-Mar, Downsview, at each amplitude and direction to increasing age, saccadic gain and peak during childhood is analyzed by Ontario, Canada) is used. 39 children (21 males; 18 females) velocity did not vary with age. Saccadic investigating saccadic accuracy, peak aged 8-19 years. gains and peak velocities in children velocities, and latencies in response are similar to reported adult values. to horizontal and vertical target steps This implies maturity of the neural in developing children and young circuits responsible for making saccades adolescents. accurate and fast. Saccade latency decreases as the brain matures. Analysis of scene Gredeback & von Hofsten, A longitudinal design to analyze the Gaze is measured using an 20 infants (10 male, 10 female) Infants from 6 months of age can perception and visual 2004. 
Infants’ evolving infants’ ability to track temporarily ASL 504 infra-red remote from 6 to 12 months, tracked a represent the spatio-temporal dynamics tracking of moving representation of moving occluded objects that moved on eye-tracking system with a small yellow happy face with a red of occluded objects. Infants at all ages targets objects between 6 and 12 circular trajectories is described. sampling frequency of 60 Hz. nose, a red mouth, and blue eyes tested were able to predict when and months of age [55]. that moved counterclockwise in where the object would reappear after a circular trajectory. During each occlusion. The average rate of predictive cycle of the circular trajectory, the gaze crossings increased with occlusion visibility of the target has been duration. obstructed. Johnson et al., 2004. Where The relation between scanning An Applied Science Sixteen 3-month-olds (7 boys, 9 Perceivers, relative to non-perceivers, infants look determines how patterns and incipient object Laboratories model 504 girls), with a mean age of 92.9 days scanned more reliably in the vicinity they see. Johnson, Slemmer, perception by examining individual eye-tracker controlled by have been involved. A target- of the visible rod parts and scanned Amso, 2004 [56]. differences in performance of infants a standard PC computer is patterned beeping ball at the top more frequently across the range of is analyzed. used. A Macintosh computer left and bottom right corners of rod motion. These results suggest that presented displays on a 76-cm an imaginary rectangle has been emerging object concepts are tied computer monitor, recorded shown to subjects. closely to available visual information looking time judgments, and in the environment, and the process of calculated the habituation information pickup. criterion for each infant. A psychotechnological review on eye-tracking systems (Continued) 269
Disabil Rehabil Assist Technol Downloaded from informahealthcare.com by Prof. Stefano Federici on 05/21/12 For personal use only. Table 1. (Continued). 270 Author/year/title Method System Evaluation Results Hunnius & Geuze, 2004. Infants’ scanning of dynamic stimuli A corneal reflection eye- The characteristics of scanning Results indicated that the way infants Developmental changes in and its development during the first tracking system (ASL Model patterns between the ages of 6 and scanned these stimuli stabilized only visual scanning of dynamic few months of infancy is analyzed by 504) is used. Eye position data 26 weeks have been investigated after 18 weeks, which is slightly later faces and abstract stimuli in combining the recent improvements are sampled at 50 Hz. through repeated assessments of than the ages reported in the literature infants [57]. in infant eye-tracking techniques with 10 infants (5 girls, 5 boys) while on infants’ scanning of static stimuli. an intense longitudinal design. the infants looked at two dynamic From the 14-week session on, infants stimuli: 1) the naturally moving adapted their scanning behavior to the face of their mother and 2) an stimulus characteristics. abstract stimulus. Theuring et al., 2007. Object In order to investigate the object Gazeis measured using a Tobii Sixteen 12-month-old healthy, full- During the first test trial infants processing during a joint visual processing, a gaze following 1750 near infra-red eye- term infants (mean age 11 months displayed a brief novelty preference for M. Laura Mele & S. Federici gaze following task [59]. task has been carried out. tracker with an infant add-on and 28 days) processed an object the unattended object. These findings (Tobii Technology; www. during a following task. suggest that enhanced object processing Tobii.se). is a temporarily restricted phenomenon that has little effect on 12-month-old infants’ long-term-interaction with the environment. Jonsson et al., 2009. The present study focuses on The movements of interest are 14 infants and 6 adults have been Prospective head tracking is Prospective head tracking in two-dimensional head tracking as recorded by a six-camera, 240 asked to track an object, presented functional and can be extended to a infants [60]. expressed in prospective control Hz, ProReflex measurement on a large vertical screen that two-dimensional motion as early as during a head-unrestrained visual system (Qualisys Inc., moved along a circular trajectory. from 6 months of age. Young infants; tracking task. Sweden). Infra-red light are (i) displayed more extensive head emitted by a bank of IREDs movements, (ii) were less accurate, (iii) located on the cameras. had a less developed timing between head movements and object motion in the vertical dimension. Thematic area 4—Assistive technology use of eye-tracking systems as input device Chung-Hsien et al., An electro encephalography (EOG) The system is composed of The capability of the EOG based The training paths of the junior 2009. Eyeglasses based signal together with ultrasonic three parts: 1) EOG signal human wheelchair interface has volunteer shows that the collision electrooculography human- arrays is used to generate the driving collection and wheelchair been evaluated in a small and avoidance has been frequently activated. wheelchair interface [61]. command of wheelchairs. 
command generation, 2) crowded test environment with On the other side, the well trained perception-based active both a junior volunteer and a volunteer shows a smoother path and collision avoidance and 3) well-trained user (around 10 times collision free driving performance. wheelchair mechatronics training courses). system. Eun Yi et al., 2005. A real-time face and an eye detection Eye mouse system is The tracking performance and the The features were tracked throughout Eye mouse: mouse method using color and texture developed in a PC platform time have been taken to detect the the 100 frames. The average time to implementation using eye information are used. Specifically, with OS Windows XP. The face and the eye tested with a user. process 100 frames were about 22(ms). tracking [62]. the software part consists of four camera is Kocom-90, which These results show that the system modules: face detection using skin- captures 30 color images of can provide a cheap and user-friendly color model; eye detection using the size 320 240 per second. communication interface. MLP-based texture classifier; eye tracking using a mean shift algorithm, and mouse control. Hansen et al., 2002. A real-time tracking scheme using Infra-red light (IR) and The system has been tested off-line The results show that by using the Eye-typing using Markov a mean shift color tracker and an cameras to detect and track on 48 sequences of approximately active appearance model it is possible to and active appearance active appearance model of the eye is the eye are used. 1 minute of duration, taken with a directly deduce information regarding models [63]. proposed. low cost web camera. eye corners and pupil position in an easy and intuitive way. Disability and Rehabilitation: Assistive Technology (Continued)
Disabil Rehabil Assist Technol Downloaded from informahealthcare.com by Prof. Stefano Federici on 05/21/12 For personal use only. Table 1. (Continued). Author/year/title Method System Evaluation Results Hiley et al., 2006. A low cost The proposed system is based on a To produce the necessary To test the accuracy and The system demonstrates the feasibility human computer interface video-oculography technique using features for the video- throughput of the system, 24 of a low-cost and sufficiently accurate based on eye tracking [64]. infra-red corneal reflections. oculography, five near trials have been performed by two method of eye tracking. The throughput infra-red light emitting different users. To setup the tests, of eight gaze points per second was diodes (LEDs) are used. The 12 designated gaze points have achieved. The accuracy of the fixations © 2012 Informa UK, Ltd. images are captured by a CCD been identified at regular spacing based on the calculated eye-gazes were image sensor from a Logitech on the screen. within 1 cm of the on-screen gaze Quickcam Pro 4000. location. Corno et al., 2002. A cost- Three algorithms are applied: the first, A very affordable eye gaze The system has been implemented The perceived usability was good, even effective solution for eye identifies the face position; the second tool, composed by a PC and a and tested by a small number of if the extension of the test to a larger gaze assistive technology extracts the information about the cheap web cam is proposed. different users. The user was able number of users will give a better [65]. right eye position; the third extracts to choose among six areas on feedback. the position of centre of the pupil. the screen (two rows, three columns), which is currently the maximum resolution allowed by the system. Kocejko et al., 2008. Eye Two algorithms have been developed: The system consists of two The eye detection test procedure The proposed eye tracking method is mouse for disabled [66]. longest line detection (LLD) for the cameras and four infra-red has been proposed. The 15″ screen fast, reliable and relatively inexpensive. pupil position detection and screen (IR) electro-luminescent has been divided for 16 squares. Applying dual-camera system allows detection algorithm (SDA) for head- diodes (LED) used to mark The user had to concentrate the for compensating potential users head to-display position detection. the screen corners and gaze on the centre of each square movements. In all examined cases personal computer. Both several times. The system has been system appeared to work properly. cameras are attached to the tested on subjects from 20 to 75, Accuracy of the developed algorithm head of the user by means of with different iris and skin colour. strongly depends on proper longest line the glasses frame. detection. Miyoshi & Murata, 2001 The EMR-8 eye mark recorder is the The developed eye input 10 healthy male volunteers aged The pointing time with the eye-gazing (07-10 October)Input device eye tracking system using the cornea system consists of the eye 21 to 24 years have been asked system was longer than with the mouse. using eye-tracker in human- reflection method, which detects the tracking system (EMF-8) and to explore the effect of target The dwell time cancelled the inherent computer interaction [67]. eye movements from the image of a personal computer (CPU; size, the distance between the speed advantage of the eye. 
Button size Miyoshi & Murata, 2001 the eye captured by an eye camera Pentium 111 800 MHz, OS; centred button (fixating point) and pointing distance influenced the (18–21 September), usability attached to the head cap. MS-WindowsNT). and the targets and the direction pointing time with both the systems. of input device using of pointing movement. The total eye-tracker on button size, completion time and the distance between targets and duration of the three parts of direction of movement [68]. the pointing task have been measured in each experimental configuration. Perez et al., 2002. Design of A nonobstructive interface is Video sequences from the The percentage of correct detection The iris detection algorithm reached a virtual keyboard based on proposed to detect and track iris eye are captured in 320 × for the iris and reference point for over 98% and the reference a 100%. iris tracking [69]. position based on digital image 240 resolution and 24 bit four different video sequences of The iris and reference detection seem processing techniques. The position color depth (RGB). A small 803, 710, 913, 849 frames each one sufficiently accurate to enable an of the iris is detected in four steps: sticker is mounted below the has been evaluated. individual to operate keys with the eye reference detection, iris centre eye to serve as a reference in gaze with small error. detection, iris position computation measuring iris position. relative to the reference and determination of the eye position within the virtual keyboard. A psychotechnological review on eye-tracking systems (Continued) 271
Disabil Rehabil Assist Technol Downloaded from informahealthcare.com by Prof. Stefano Federici on 05/21/12 For personal use only. 272 Table 1. (Continued). Author/year/title Method System Evaluation Results Porta & Ravelli, 2009. The basic web surfing operations are WeyeB is implemented in C#, 10 users were involved (5 males Almost all testers succeeded in carrying WeyeB, an eye-controlled made easy to be performed without within the .NET Microsoft and 5 females aged between 23 and out their tasks, quickly adapting web browser for hands-free using the hands. The page scrolling framework. The Tobii 1750 42) have been left free to use the to the new interaction modality. navigation [70]. is made possible by developing a (Tobii Technology; www.Tobii. system for 10 minutes by browsing ‘Button press’, ‘scrolling activation/ M. Laura Mele & S. Federici upper or lower scroll button interface se), which integrates all the home page of www.libero.it. The deactivation’ and ‘change of scroll with a semi-transparent effect. The components (camera, infra- purpose of the experiments was to direction’ were performed successfully link selection is been made possible red lighting, etc.) into a 17’’ verify whether a totally novice user by all users, while ‘scroll speed by developing an eye gesture web monitor, is used. would be able to freely browse the control’ was problematic for only two browsing interface. web through WeyeB, exploiting the (inexperienced) testers. eyes only. Roberts et al., 2009. Microcontroller inputs are provided The Vocal, Motorized, and The system has not been tested The wheelchair is geared to operate Vocal, motorized, and by eye tracking software and Environmentally Controlled with users. at a maximum speed of 3 mph, with environmentally controlled equipment integrated into the current Chair (VMECC) is composed a capability of 5 mph. The motorized chair [71]. computer setup that is currently by: batteries, motor, the gear wheels can be removed at the user’s installed for communications on the system and its connection discretion allowing the chair to revert client’s chair. to the main axle, and the back to standard configuration. microcontroller circuit for the control of the motors forward and reverse directions. Špakov et al., 2009. In order to save space a scrollable The system is composed by a 8 subjects have been instructed By optimizing the keyboard Scrollable keyboards for keyboards with a common keyboard “Scrollable keyboards” where to eye type 30 easy to memorize layout according to letter to- letter casual eye typing [72]. layout QWERTY is developed by one or more rows are hidden phrases on the 2-row keyboard probabilities the authors were able to means of a method able to hide one or to save space and is interfaced as fast and accurately as possible, reduce the scroll button usage, which more rows. After a first evaluation the with the head-mounted and press a key on the ordinary further increased the typing speed keyword layout has been optimized EyeLink eye tracking system. keyboard when they finished from 7.26 wpm (QWERTY) to 8.86 according to the letter-to-letter each phrase. Two experiments wpm on the 1-row keyboard, and from probabilities. have been carried out 1) on the 11.17 wpm to 12.18 wpm on the 2-row keyboard with the QWERTY keyboard, respectively. layout and 2) with the optimized layout. Yuan-Pin et al., 2005. 
An effective illumination recognition A webcam mouse system is The system has been tested under The accuracy of face detection based on Webcam mouse using face technique combining K-Nearest implemented on a laptop PC various environmental conditions the KNN classifier is higher than 92% and eye tracking in various Neighbor (KNN) classifier and with a Pentium 4–2.4-GHz with complex background, in various illumination environments. illumination environments adaptive skin model is presented to CPU with a Logitech Webcam. such as office, external sunlight In real-time implementation, the [73]. realize the real-time tracking system. The captured frame format environment, darkness system successfully tracks user face and is 320x240, 15 frames per environment, outdoor, and coffee eyes features at 15 fps under standard second. shop. notebook platforms. Disability and Rehabilitation: Assistive Technology
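Several of the input-device studies in Table I (e.g. the eye mouse and eye-typing systems) rely on the same basic interaction principle: a selection is triggered when the gaze remains within a target region for a minimum dwell time, which substitutes for a mouse click and, as Miyoshi and Murata note, trades away part of the inherent speed advantage of the eye. The following is a minimal, hypothetical sketch of such dwell-based selection; the threshold values and the streaming interface are assumptions, not parameters taken from any reviewed system.

```python
import math

class DwellSelector:
    """Trigger a 'click' when gaze stays within a small radius for dwell_ms."""

    def __init__(self, dwell_ms: float = 800.0, radius_px: float = 30.0):
        self.dwell_ms = dwell_ms
        self.radius_px = radius_px
        self._anchor = None      # (x, y) where the current dwell started
        self._start_ms = None    # timestamp of the first sample in the dwell

    def update(self, x: float, y: float, t_ms: float):
        """Feed one gaze sample; return the (x, y) of a selection, or None."""
        if self._anchor is None or math.dist(self._anchor, (x, y)) > self.radius_px:
            # Gaze moved away: restart the dwell at the new position.
            self._anchor, self._start_ms = (x, y), t_ms
            return None
        if t_ms - self._start_ms >= self.dwell_ms:
            selected, self._anchor, self._start_ms = self._anchor, None, None
            return selected   # one selection per completed dwell
        return None

# Simulated 60 Hz gaze stream fixating one point: a selection fires after ~800 ms.
selector = DwellSelector()
for i in range(70):
    hit = selector.update(512.0, 384.0, i * 16.7)
    if hit:
        print(f"selection at {hit} after {i * 16.7:.0f} ms")
```

Dwell duration and target radius are exactly the kind of parameters that, from a psychotechnological perspective, should be tuned to the individual user rather than fixed by the system.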