ETRA 2019: ACM Symposium on Eye Tracking Research & Applications
DENVER, COLORADO, USA
JUNE 25-28, 2019
Association for Computing Machinery
Table of Contents

ETRA Event Maps
ETRA Schedule Overview
Message from Conference Chairs
ETRA 2019 Credits
Keynotes
Tutorials
Doctoral Symposium
ETRA Wednesday Sessions
COGAIN Wednesday Sessions
ET4S Wednesday Sessions
ETRA 2019 Posters, Pre-function Space
ETRA 2019 Demos & Videos
ETRA Thursday Sessions
ETWEB Thursday Sessions
ETVIS Thursday Sessions
Privacy Panel
ETRA Friday Sessions
Sponsor Descriptions
Event Notes
ETRA Event Maps
(Map legend: Registration, Exhibitors, Posters, Meeting Rooms, Meals, Office)

ETRA Schedule for Tuesday & Wednesday

06/25/2019 (Tuesday)

8:00 - 10:00
  Tutorials Track 1 (Oxford / Pikes Peak / Humboldt): T1: Deep Learning in the Eye Tracking World
  Tutorials Track 2 (Red Cloud): T3: Gaze Analytics Pipeline
  Doctoral Symposium (Torrey's)

10:00 - 10:30   Coffee break (Ballroom Foyer)

10:30 - 12:30
  Tutorials Track 1: T1: Deep Learning in the Eye Tracking World (continued)
  Tutorials Track 2: T3: Gaze Analytics Pipeline (continued)
  Doctoral Symposium (Torrey's)

12:30 - 13:30   Lunch break (Oxford / Pikes Peak / Humboldt)

13:30 - 15:30
  Tutorials Track 1: T2: Discussion and standardisation of the metrics for eye movement detection
  Tutorials Track 2: T4: Eye Tracking in Autism and Other Developmental Conditions
  Doctoral Symposium (Torrey's)

15:30 - 16:00   Coffee break (Ballroom Foyer)

16:00 - 18:00
  Tutorials Track 1: T2 (continued)
  Tutorials Track 2: T4 (continued)
  Doctoral Symposium (Torrey's)

06/26/2019 (Wednesday)

8:30 - 9:30   Opening session; Keynote Address by Oleg Komogortsev, "Eye Tracking Sensors Past, Present, Future and Their Applications" (Oxford / Humboldt / Pikes Peak)

9:30 - 10:00   Coffee break (Ballroom Foyer)

10:00 - 12:00
  Workshops Track (Oxford): Workshop by Facebook Reality Labs
  Talks Track 1 (Humboldt): ETRA Session 1: Deep Learning, Paths, Transitions & Pursuits
  Talks Track 2 (Pikes Peak): COGAIN Session 1

12:00 - 13:00   Lunch break (The Lockwood Kitchen)

13:00 - 15:00
  Workshops Track (Oxford): Workshop by Tobii Pro
  Talks Track 1 (Humboldt): ETRA Session 2: Calibration, Cognition, Smartphones, & Sequences
  Talks Track 2 (Pikes Peak): COGAIN Session 2

15:00 - 15:30   Coffee break (Ballroom Foyer)

15:30 - 17:30
  Talks Track 1 (Humboldt): POSTERS Fast Forward Session (16:15 - 17:30)
  Talks Track 2 (Pikes Peak): ET4S Session 1 (15:30 - 18:00)

17:30 - 21:00   Reception, Poster Session, Video & Demo Session (Ellingwood A&B and Red Cloud)
ETRA Schedule for Thursday & Friday

06/27/2019 (Thursday)

8:30 - 9:30   Keynote Address by Enkelejda Kasneci, "From Gazing to Perceiving" (Oxford / Pikes Peak / Humboldt)

9:30 - 10:00   Coffee break (Ballroom Foyer)

10:00 - 12:00
  Workshops Track (Red Cloud): Workshop by SR Research Ltd.
  Talks Track 1 (Oxford / Pikes Peak / Humboldt): ETRA Session 3: Visualisations & Programming
  Talks Track 2 (Torrey's): ETWEB Session 1

12:00 - 13:00   Lunch break (Oxford / Pikes Peak / Humboldt)

13:00 - 15:00
  Talks Track 1: ETRA Session 4: Head-Mounted Eye Tracking
  Talks Track 2 (Torrey's): ETVIS Session 1: Visualization Tools and Techniques

15:00 - 15:30   Coffee break (Ballroom Foyer)

15:30 - 17:30
  Talks Track 1: ETRA Session 5: Gaze Detection and Prediction
  Talks Track 2 (Torrey's): ETVIS Session 2: Visual Scanpath Comparison

18:30 - 21:30   Banquet (Oxford / Pikes Peak / Humboldt)

06/28/2019 (Friday)

8:30 - 9:30   Privacy in Eye Tracking - Panel Discussion (Oxford / Pikes Peak / Humboldt)

9:30 - 10:00   Coffee break (Ballroom Foyer)

10:00 - 12:00
  Talks Track 1 (Oxford / Pikes Peak / Humboldt): ETRA Session 6: Privacy, Authentication, Fitts of Skill
  Talks Track 2 (Torrey's): CHALLENGE Session 1

12:00 - 13:00   Lunch break (Oxford / Pikes Peak / Humboldt)

13:00 - 15:00   Town hall meeting, ETRA 2020 Introduction (Oxford / Pikes Peak / Humboldt)
Message from Conference Chairs
We are very pleased to welcome you to the 11th ACM Symposium on Eye Tracking Research & Applications (ETRA) 2019 in Denver, Colorado, USA! We are excited to present an excellent program for you to experience. We strongly believe that the program and proceedings represent the most vibrant advances in eye tracking methodology and its applications, meeting rigorous scientific criteria.

For more than twenty years, the ACM ETRA conference has been the premier worldwide meeting place for the eye tracking community. ETRA is growing and changing together with the eye tracking research field. For the first time this year, ETRA is being held annually, after being biennial since its inception.

ETRA 2019 puts forth the effort to enhance interdisciplinary collaboration between researchers in different disciplines, fields, and areas. To make this happen, we are pleased to have ETRA 2019 collocated with four excellent thematic eye tracking meetings and tracks: Eye Tracking for Spatial Research (ET4S), Communication by Gaze Interaction (COGAIN), Eye Tracking and Visualization (ETVIS), and Eye Tracking for the Web (ETWEB). ET4S joined ETRA for the first time. It aims at bringing together researchers from different areas who have a common interest in using eye tracking for research questions related to spatial information and spatial decision making. ETWEB is a new initiative which covers topics related to the Web (interface semantics extraction, interaction adaptation, etc.) and eye tracking (attention visualization, crowdsourcing, etc.). COGAIN focuses on all aspects of gaze interaction, with special emphasis on eye-controlled assistive technology. It presents advances in these areas, leading to new capabilities in gaze interaction, gaze-enhanced applications, and gaze-contingent devices. ETVIS covers topics related to visualization research (including information visualization, scientific visualization, and visual analytics) and eye tracking. In 2019, ETRA will host a Challenge Track for the first time. In this opening edition, the Challenge Track consists of a mining challenge: an open call was put out to apply analytical tools to a common human eye-movement data set, challenging participants to creatively engage their newest and most exciting mining tools and approaches to make the most of this dataset. For ETRA 2019, we received 109 submissions to the main ETRA conference track. We accepted 28 full papers and 18 short papers after a two-phase reviewing process.

This year we continue our tutorial program and our doctoral students symposium, the latter with 13 accepted submissions. This year we accommodate four exciting half-day tutorials: Deep Learning in the Eye Tracking World by Paweł Kasprowski; Discussion and standardisation of the metrics for eye movement detection by Mikhail Startsev and Raimondas Zemblys; Gaze Analytics Pipeline by Nina Gehrer and Andrew Duchowski; and Eye Tracking in the Study of Developmental Conditions: A Computer Scientists Primer by Frederick Shic.

ETRA 2019 also aims at bringing together science and business. We are most grateful to all of our sponsors. The ETRA conference is also supported by the efforts of the Special Interest Group on Computer-Human Interaction (SIGCHI) of the Association for Computing Machinery (ACM) and the ACM. We would like to draw your attention to the sponsors' workshops accompanying the main conference. This year we accommodate three sponsor workshops, by Tobii Pro, Facebook Reality Labs, and SR Research Ltd. These workshops are free for all ETRA attendees and are an excellent platform for knowledge exchange between business practitioners and academics.

We are very excited about our keynote speakers addressing current trends in eye tracking. Oleg Komogortsev, a PECASE award winner, will discuss the past and present status of eye tracking sensors, along with his vision for future development. He will also discuss applications that necessitate the presence of such sensors in VR/AR devices, along with applications that have the power to benefit society on a large scale. Enkelejda Kasneci will dive into more fundamental issues of human gazing, perception, and eye tracking. She will go beyond the line-of-sight simplification by exploring requirements needed to shift our paradigm from foveal to retina-aware eye tracking, and discussing novel ways to employ this new paradigm to further our understanding of human perception.

Putting on a conference takes many people working together towards a common goal. We thank all the authors and volunteer reviewers that have contributed to this year's submissions. We would especially like to highlight the work of the area chairs who provided synthesizing meta-reviews, led discussions, and provided their expert guidance to reviewers and authors.

It has been our pleasure to serve in our capacity of Conference Chairs for a second year. We look forward to pushing the frontier of the field together with all of you at this year's ETRA. We wish you a great time in Denver!

Bonita Sharif & Krzysztof Krejtz
ETRA 2019 Conference Chairs
ETRA 2019 Organization

Conference Chairs
Bonita Sharif (University of Nebraska-Lincoln, USA)
Krzysztof Krejtz (SWPS University of Social Sciences and Humanities, Poland)

Paper Chairs
Veronica Sundstedt (Blekinge Tekniska Högskola, Sweden)
Paige Rodeghero (Clemson University, USA)

Demo & Video Chairs
Tanja Blascheck (Universität Stuttgart, Germany)
Fabian Deitelhoff (University of Applied Sciences and Arts Dortmund, Germany)

Doctoral Symposium Chairs
Hana Vrakova (University of Colorado Boulder, USA)
Reynold Bailey (Rochester Institute of Technology, USA)
Ann McNamara (Texas A&M University, USA)

Tutorial Chairs
Preethi Vaidyanathan (LC Technologies Inc., USA)
Diako Mardanbegi (Lancaster University, UK)

Poster Chairs
Arantxa Villanueva (Public University of Navarra, Spain)
Eakta Jain (University of Florida, USA)

Eye Tracking Challenge Chairs
Susana Martinez-Conde (State University of New York, USA)
Jorge Otero-Millan (Johns Hopkins University, USA)

Sponsor Chairs
Oleg Komogortsev (Texas State University, USA)
Kenan Bektas (Zurich University of Applied Sciences, Switzerland)

Social Media Chairs
Anna Niedzielska (SWPS University of Social Sciences and Humanities, Poland)
Nina Gehrer (Eberhard Karls University of Tübingen, Germany)

Web Chairs
Adrian Pilkington (University of Nebraska-Lincoln, USA)
Michael Decker (Bowling Green State University, USA)

Proceedings Chair
Stephen N. Spencer (University of Washington, USA)

Local Arrangements Chairs
Hana Vrakova (University of Colorado Boulder, USA)
Martha Crosby (University of Hawaii, USA)

Student Volunteer Chairs
Katarzyna Wisiecka (SWPS University of Social Sciences and Humanities, Poland)
Ayush Kumar (Stony Brook University, USA)

Accessibility Chairs
Justyna Żurawska (SWPS University of Social Sciences and Humanities, Poland)

Conference Companion Booklet Chairs
Matthew Crosby (USA)
Adrian Pilkington (University of Nebraska-Lincoln, USA)
Agnieszka Ozimek (SWPS University of Social Sciences and Humanities, Poland)

Design and Artwork Chairs
Matthew Crosby (USA)
Adrian Pilkington (University of Nebraska-Lincoln, USA)
Meera Patel (University of Nebraska-Lincoln, USA)

Co-sponsored by ACM SIGGRAPH and ACM SIGCHI

ETRA Steering Committee
Andrew Duchowski (chair), Pernilla Qvarfordt, Päivi Majaranta

Special thanks to the 128 reviewers and 51 area chairs! Check out the ETRA 2019 website at http://etra.acm.org/2019/areachairs.html and http://etra.acm.org/2019/reviewers.html for a list of names.
ETRA 2019 Co-located Events Organization

COGAIN (Communication by Gaze Interaction)
Organizers
John Paulin Hansen, Technical University of Denmark, Denmark
Päivi Majaranta, Tampere University, Finland
Program Co-Chairs
Diako Mardanbegi, Lancaster University, United Kingdom
Ken Pfeuffer, Bundeswehr University Munich, Germany

ET4S (Eye Tracking for Spatial Research)
Organizers
Peter Kiefer, ETH Zurich, Switzerland
Fabian Göbel, ETH Zurich, Switzerland
David Rudi, ETH Zurich, Switzerland
Ioannis Giannopoulos, TU Vienna, Austria
Andrew T. Duchowski, Clemson University, USA
Martin Raubal, ETH Zurich, Switzerland

ETVIS (Eye Tracking and Visualization)
Organizers
Michael Burch, Eindhoven University of Technology, Netherlands
Pawel Kasprowski, Silesian University of Technology, Poland
Leslie Blaha, Air Force Research Laboratory, USA
Social Media Chair
Ayush Kumar, Stony Brook University, USA

ETWEB (Eye Tracking for the Web)
Organizers
Chandan Kumar, Institute WeST, University of Koblenz, Germany
Raphael Menges, Institute WeST, University of Koblenz, Germany
Sukru Eraslan, METU Northern Cyprus Campus
Program Committee
Alexandra Papoutsaki, Pomona College, USA
Jacek Gwizdka, University of Texas, USA
Scott MacKenzie, York University, Canada
Simon Harper, University of Manchester, UK
Caroline Jay, University of Manchester, UK
Victoria Yaneva, University of Wolverhampton, UK

Keynotes

Eye Tracking Sensors Past, Present, Future and Their Applications
Wednesday, June 26, 8:30 - 9:30
(Oxford / Humboldt / Pikes Peak)

OLEG KOMOGORTSEV, ASSOCIATE PROFESSOR AT TEXAS STATE UNIVERSITY, USA

Abstract. The availability of eye tracking sensors is set to explode, with billions of units available in future Virtual Reality (VR) and Augmented Reality (AR) platforms. In my talk I will discuss the past and present status of eye tracking sensors, along with my vision for future development. I will also discuss applications that necessitate the presence of such sensors in VR/AR devices, along with applications that have the power to benefit society on a large scale when VR/AR solutions are widely adopted.

Bio. Dr. Komogortsev is currently a tenured Associate Professor at Texas State University and a Visiting Scientist at Facebook Reality Labs. He received his B.S. in Applied Mathematics from Volgograd State University, Russia, and his M.S. and Ph.D. degrees in Computer Science from Kent State University, Ohio. He has previously worked for such institutions as Johns Hopkins University, Notre Dame University, and Michigan State University. Dr. Komogortsev conducts research in eye tracking with a focus on cyber security (biometrics), health assessment, human-computer interaction, usability, and bioengineering. This work has thus far yielded more than 100 peer-reviewed publications and several patents. Dr. Komogortsev's research has been covered by national media including NBC News, Discovery, Yahoo, LiveScience, and others. Dr. Komogortsev is a recipient of four Google awards, including two Virtual Reality Research Awards (2016, 2017), a Google Faculty Research Award (2014), and a Google Global Faculty Research Award (2018). He has also won the National Science Foundation CAREER award and the Presidential Early Career Award for Scientists and Engineers (PECASE) from President Barack Obama on the topic of cybersecurity, with an emphasis on eye movement-driven biometrics and health assessment. In addition, his research is supported by the National Science Foundation and the National Institutes of Health.
From Gazing to Perceiving
Thursday, June 27, 8:30 - 9:30
(Oxford / Pikes Peak / Humboldt)

ENKELEJDA KASNECI, ASSOCIATE PROFESSOR OF COMPUTER SCIENCE, PERCEPTION ENGINEERING LAB AT UNIVERSITY OF TÜBINGEN, GERMANY

Abstract. Eye tracking technology is based on the assumption that our perception follows the fovea, a tiny region in our retina responsible for sharp central vision. In fact, what we usually refer to as the line of sight is nothing but the imaginary line connecting the fovea to the gazed location. However, our visual perception is far more complex than that: gazing is not perceiving. As a tangible example, consider our retinal peripheral view. Whereas we cannot distinguish details in this region, movements are perceptible nonetheless. In this talk, I will go beyond the line-of-sight simplification by a) exploring requirements needed to shift our paradigm from foveal to retina-aware eye tracking, and b) discussing novel ways to employ this new paradigm to further our understanding of human perception.

Bio. Enkelejda Kasneci is an Associate Professor of Computer Science at the University of Tübingen, Germany, where she leads the Perception Engineering Group. As a BOSCH scholar, she received her M.Sc. degree in Computer Science from the University of Stuttgart in 2007. In 2013, she received her PhD in Computer Science from the University of Tübingen, Germany. For her PhD research, she was awarded the research prize of the Federation Südwestmetall in 2014. From 2013 to 2015, she was a Margarete-von-Wrangell Fellow. Dr. Kasneci's overarching and long-term vision aims at computing systems that sense and infer the user's cognitive state, actions, and intentions based on eye movements. These systems set out to provide information for assistive technologies applicable to many activities of everyday life. Towards this vision, her research combines eye tracking technology with machine learning in various multidisciplinary projects that are supported by national scientific societies as well as various industrial sources. In addition, she serves as an academic editor for PLOS ONE, as well as a reviewer and PC member for several journals and major conferences.

Wednesday, June 26, 2019
13:00 - 15:00 (Oxford)

Tobii Pro Workshop
Tobii Pro solutions for VR experiments

Presenters: Jonas Högström (Tobii Pro) and Tim Holmes (Royal Holloway, University of London)

Abstract: While experiments in Virtual Reality (VR) have become much more common over the last years, they are still not as common, nor as well supported, as standard screen-based experiments. The choice of research tool goes hand in hand with the research question of interest, but today the same question can be approached from different angles using screen-based experiments, glasses-based experiments, 360° VR media, and full 3D VR environments. The choice of what media to use in a VR experiment is determined by the researcher's desired level of control of the stimulus, how representative it is supposed to be of a non-artificial world, what metrics will be used, and the time and resources available for the project.

This workshop will present Tobii Pro's solutions for conducting VR experiments, and will go through how areas of interest, trials, moving AOIs, fixation classification, and other concepts are handled in the experiment workflow. It will provide an understanding of what parts are taken care of by the software, and what is expected of the researchers themselves. Workshop attendees will get a chance to try the VR hardware and software solutions themselves.
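To give a flavor of what AOI handling means in VR, here is a minimal sketch, an illustration of ours and not Tobii Pro's API, testing whether the gaze ray from the headset hits a spherical AOI whose center is updated per frame for a moving object:

```python
# A minimal sketch (assumed, not Tobii Pro's software) of a 3-D AOI test
# for VR gaze data: does the gaze ray from the headset hit a (possibly
# moving) spherical AOI around an object?
import numpy as np

def gaze_hits_sphere(origin, direction, center, radius):
    """origin/direction: gaze ray in world coordinates (direction must be
    unit length); center/radius: the AOI sphere for this frame."""
    oc = center - origin
    t = np.dot(oc, direction)            # closest approach along the ray
    if t < 0:
        return False                     # AOI is behind the viewer
    closest = origin + t * direction
    return np.linalg.norm(center - closest) <= radius

eye = np.array([0.0, 1.6, 0.0])          # headset position
gaze = np.array([0.0, 0.0, 1.0])         # looking straight ahead
# Per-frame AOI centers for a moving object:
for frame_center in ([0.0, 1.6, 2.0], [0.5, 1.6, 2.0]):
    print(gaze_hits_sphere(eye, gaze, np.array(frame_center), 0.2))
```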
Wednesday, June 26, 2019
10:00 - 12:00 (Oxford)

Facebook Reality Labs Workshop
Establishing a Ground-Truth for Eye Tracking

Robert Cavin, Research Lead, Eye Tracking, Facebook Reality Labs
Immo Schuetz, Postdoctoral Research Scientist, Facebook Reality Labs
Robin Sharma, Optical Scientist, Facebook Reality Labs
Kavitha Ratnam, Postdoctoral Research Scientist, Facebook Reality Labs
Michele Rucci, Professor, Center for Visual Science, University of Rochester
Austin Roorda, Professor, School of Optometry, University of California Berkeley

Calibration and performance evaluation of current eye trackers typically rely on comparing known target positions to measured gaze directions while a participant is fixating on those targets. A mapping function or geometric eye model is then optimized based on this correspondence, essentially treating the calibration targets as the "ground truth" for each gaze direction. While this has worked reasonably well to achieve current calibration accuracies of around 0.5 degrees, trying to optimize beyond this point reveals that calibration targets are more a self-report measure than real ground truth. Participant compliance, fixational eye movements such as drifts and micro-saccades, as well as the accuracy of positioning the fovea or preferred viewing location itself, all contribute to uncertainty in the "ground-truth" target location and thus form a lower bound for tracking accuracy.

Many applications of eye tracking for virtual and augmented reality will require higher tracking fidelity than what is currently available. In this workshop, we will explore the hypothesis that measuring ground-truth gaze in conjunction with a second, to-be-evaluated eye tracking system can help boost model and tracking accuracy in the long term. We define ground-truth as the mapping of real-world content onto the retinal locus of fixation. Speakers will present different approaches from academia and industry, followed by a panel discussion on the viability and possibilities of ground-truth eye tracking approaches. To continue the conversation after the workshop, we invite participants to a Facebook-sponsored social after the main conference events.
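As a concrete picture of the calibration step described above, here is a minimal sketch, our illustration rather than workshop material, fitting a second-order polynomial mapping from measured eye features to known target positions by least squares, treating the targets as ground truth exactly as the abstract discusses:

```python
# A minimal sketch (assumed, not from the workshop) of the classical
# calibration step: fit a polynomial mapping from measured eye features
# to known on-screen target positions by least squares.
import numpy as np

def design_matrix(eye_xy):
    x, y = eye_xy[:, 0], eye_xy[:, 1]
    # Second-order polynomial terms: 1, x, y, xy, x^2, y^2
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_polynomial_mapping(eye_xy, target_xy):
    """eye_xy, target_xy: (n_targets, 2) arrays of eye features and
    calibration target positions."""
    coeffs, *_ = np.linalg.lstsq(design_matrix(eye_xy), target_xy, rcond=None)
    return coeffs                      # (6, 2): one column per screen axis

def apply_mapping(coeffs, eye_xy):
    return design_matrix(eye_xy) @ coeffs   # estimated gaze on screen

# Nine-point calibration grid treated as "ground truth": any fixational
# drift or compliance error at these targets is baked into the fit.
targets = np.array([[x, y] for y in (0.1, 0.5, 0.9) for x in (0.1, 0.5, 0.9)])
eye = targets + np.random.normal(scale=0.01, size=targets.shape)  # fake data
coeffs = fit_polynomial_mapping(eye, targets)
print(apply_mapping(coeffs, eye) - targets)   # residual calibration error
```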
Thursday, June 27, 2019
10:00 - 12:00 (Red Cloud)

SR Research Workshop
Recording and analyzing gaze during website interactions with EyeLink eye trackers

Presenter: Dr. Sam Hutton, SR Research Ltd

Abstract: Eye tracking can be a powerful tool in usability research and graphical interface design, providing important information concerning where users direct their attention in the websites and applications they are interacting with. In website usability, for example, eye tracking can reveal important information about which areas of a web page are read, which areas are skipped, or even which areas increase cognitive workload. In traditional eye tracking, the researcher has tight control over what is shown, where it is shown, and when it is shown. Analysis of the gaze data typically involves mapping gaze onto various areas of interest, and reporting measures such as fixation count and dwell time. Eye tracking for usability research, however, introduces a number of complications that traditional stimulus presentation and analysis software do not always deal with adequately. For example, the participants themselves determine what is shown, and when and where it is shown. As such, an accurate recording of the screen is critical. Web pages often contain dynamic (moving) content, and can themselves be scrolled, adding further complications to traditional analysis approaches, in which interest areas are typically static. This workshop will introduce new recording and analysis software from SR Research that allows researchers to record and quantify participants' gaze whilst they interact with websites. Key features include screen and audio recording, keypress and mouse logging, the ability to provide a live preview of the gaze data during recording, automatic scroll compensation at the analysis stage, automatic data segmentation and navigation based on URLs, data aggregation from multiple participants, mouse event data visualization and extraction, and new report variables specific to web page tracking.
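To illustrate the scroll-compensation idea in the abstract, here is a minimal sketch, ours and not SR Research's software, that shifts screen-space gaze into page coordinates before testing static interest areas:

```python
# A minimal sketch (assumed, not SR Research's software) of scroll
# compensation: convert screen-coordinate gaze to page coordinates
# before testing static areas of interest defined in page space.
def gaze_to_page(gaze_x, gaze_y, scroll_x, scroll_y):
    """Shift screen-space gaze by the page's current scroll offset."""
    return gaze_x + scroll_x, gaze_y + scroll_y

def aoi_hit(page_x, page_y, aoi):
    """aoi: (left, top, width, height) in page coordinates."""
    left, top, w, h = aoi
    return left <= page_x < left + w and top <= page_y < top + h

banner = (0, 0, 1024, 120)        # an AOI defined once, in page space
# The same screen position hits different content as the page scrolls:
print(aoi_hit(*gaze_to_page(500, 80, 0, 0), banner))    # True
print(aoi_hit(*gaze_to_page(500, 80, 0, 600), banner))  # False
```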
Tutorials

Deep Learning in the Eye Tracking World
PAWEL KASPROWSKI, KASPROWSKI@POLSL.PL
Tuesday, June 25, 2019, 8:00-12:30 (Oxford / Pikes Peak / Humboldt)

Abstract. Recently, deep learning has become a hype word in computer science. Many problems which until now could be solved only using sophisticated algorithms can now be solved with specially developed neural networks.

Deep learning is also becoming more and more popular in the eye tracking world. It may be used in any place where some kind of classification, clustering, or regression is needed. The tutorial aims to show the potential applications (like calibration, event detection, gaze data analysis, and so on) and, what is more important, to show how to apply deep learning frameworks in such research.

There is a common belief that to use neural networks a strong mathematical background is necessary, as there is much theory which must be understood before starting to work. There is also a belief that, because most deep learning frameworks are just libraries in programming languages, it is necessary to be a programmer and have knowledge of the programming language that is used.

While both abilities are beneficial, because they may help in achieving better results, this tutorial aims to prove that deep networks may be used even by people who know only a little about the theory. I will show you ready-to-use networks with exemplary eye movement datasets and try to explain the most critical issues which you will have to solve when preparing your own experiments. After the tutorial, you will probably not become an expert in deep learning, but you will know how to use it in practice with your eye movement data.

Audience. The tutorial is addressed to every person interested in deep learning; no special skills are required apart from some knowledge about eye tracking and eye movement analysis. However, minimal programming skills are welcome and may help in better understanding the problem.

Scope. This tutorial will include: (1) a gentle introduction to machine learning, classification, and regression problems, (2) an introduction to neural networks, (3) an explanation of Convolutional Neural Networks and their applications to eye movement data, (4) Recurrent Neural Networks and their possible usages. The tutorial will NOT include a detailed mathematical explanation of neural network architectures and algorithms. All subjects will be explained with simple try-on examples using real eye movement datasets.
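As a concrete taste of items (3) and (4), here is a minimal sketch, not part of the tutorial materials, of the kind of network such a framework lets you build: a small 1-D convolutional classifier for fixed-length windows of gaze samples. The random tensors below stand in for a real eye movement dataset.

```python
# A minimal sketch (illustrative only, not the tutorial's code): a 1-D CNN
# that labels windows of gaze samples as fixation / saccade / pursuit.
import torch
import torch.nn as nn

class GazeEventCNN(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=5, padding=2),  # 2 channels: x, y
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),     # pool over time to one vector
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                # x: (batch, 2, window_len)
        return self.classifier(self.features(x).squeeze(-1))

model = GazeEventCNN()
windows = torch.randn(8, 2, 50)          # 8 windows of 50 (x, y) samples
logits = model(windows)                  # (8, 3) class scores
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 3, (8,)))
loss.backward()                          # gradients for one training step
```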
Bio. Dr. Pawel Kasprowski is an Assistant Professor at the Institute of Informatics, Silesian University of Technology, Poland. He received his Ph.D. in Computer Science in 2004 under the supervision of Prof. Jozef Ober, one of the precursors of eye tracking. He has experience in both eye tracking and data mining. His primary research interest is using data mining methods to analyze the eye movement signal. Dr. Kasprowski teaches data mining at the University as well as during commercial courses. At the same time, he is the author of numerous publications concerning eye movement analysis.

Additional information for prospective participants: http://www.kasprowski.pl/tutorial/

Discussion and standardisation of the metrics for eye movement detection
MIKHAIL STARTSEV, MIKHAIL.STARTSEV@TUM.DE
RAIMONDAS ZEMBLYS, R.ZEMBLYS@TF.SU.LT
Tuesday, June 25, 2019, 13:30-18:00 (Oxford / Pikes Peak / Humboldt)

Abstract. By now, a vast number of algorithms and approaches for detecting various eye movements (fixations, saccades, PSO, pursuit, OKN, etc.) have been proposed and evaluated by researchers in the field. The reported results are not always directly comparable and easily interpretable, even by experts. Part of this problem lies in the diversity of the metrics that are used to test the algorithms.

The multitude of metrics reported in the literature is potentially confusing, both to the researchers who want to join the field and to the established groups. Firstly, there is a number of sample-level measures: Cohen's kappa values, sensitivity, specificity, F1 scores, and accuracy or disagreement rates. Secondly, a growing number of event-level measures exist: average statistics of the "true" and detected events
(duration, amplitude, etc.), quantitative and qualitative scores proposed by Komogortsev et al. [2010], different ways of computing F1 scores [Hooge et al. 2018, Zemblys et al. 2018, Startsev et al. 2018], variations of Cohen's kappa [Zemblys et al. 2018, Startsev et al. 2019], temporal offset measures of Hooge et al. [2018], average intersection-over-union ratios [Startsev et al. 2018], and the Levenshtein distance between event sequences [Zemblys et al. 2018]. Almost all of the metrics listed above can be computed for all eye movement classes taken together or for each considered class in isolation.
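To make the sample-level versus event-level distinction concrete, the following minimal sketch, ours rather than the tutorial's material, computes Cohen's kappa over raw sample labels and an F1 score over events matched by intersection-over-union (labels here: 0 = fixation, 1 = saccade):

```python
# A minimal sketch (an assumption, not tutorial material) contrasting a
# sample-level metric (Cohen's kappa) with an event-level one (F1 over
# events matched by intersection-over-union).
import numpy as np
from itertools import groupby

def cohens_kappa(a, b):
    """Sample-level agreement between two equal-length label streams."""
    a, b = np.asarray(a), np.asarray(b)
    p_obs = np.mean(a == b)
    p_exp = sum(np.mean(a == c) * np.mean(b == c) for c in np.union1d(a, b))
    return 1.0 if p_exp == 1 else (p_obs - p_exp) / (1 - p_exp)

def events(labels):
    """Collapse a label stream into (label, start, end) events."""
    out, i = [], 0
    for lab, run in groupby(labels):
        n = len(list(run))
        out.append((lab, i, i + n))
        i += n
    return out

def event_f1(truth, pred, label, min_iou=0.5):
    """F1 over events of one class, matched by intersection-over-union."""
    t = [e for e in events(truth) if e[0] == label]
    p = [e for e in events(pred) if e[0] == label]
    hits = 0
    for _, ts, te in t:
        for _, ps, pe in p:
            inter = max(0, min(te, pe) - max(ts, ps))
            union = max(te, pe) - min(ts, ps)
            if union and inter / union >= min_iou:
                hits += 1
                break
    precision = hits / len(p) if p else 0.0
    recall = hits / len(t) if t else 0.0
    return 2 * precision * recall / (precision + recall) if hits else 0.0

truth = [0] * 20 + [1] * 5 + [0] * 25
pred = [0] * 18 + [1] * 9 + [0] * 23
print(cohens_kappa(truth, pred), event_f1(truth, pred, label=1))
```

Note how the two views can disagree: the streams above differ on several samples (lowering kappa), yet the single saccade event is still matched, so the event-level F1 is perfect.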
Some aspects of these evaluation measures (especially on the level of events) contribute to their interpretability, bias, and suitability for various purposes and testing scenarios (e.g. whether expert manual annotations are available for comparison, or whether the stimuli were synthetically generated or recorded in naturalistic conditions). With the advent of machine learning-based models, the choice of a metric, a loss function, or a set of those should be motivated not just by differentiating between a handful of algorithms, but also by the metric's ability to guide the training process over thousands of epochs.

Right now, there is no clear-cut way of choosing a suitable metric for the problem of eye movement detection. Additionally, the set-up of an eye tracking experiment has a bearing on the applicable evaluation strategies. In this tutorial, we intend to provide an in-detail discussion of existing metrics, which will supply both theoretical and practical insights. We will illustrate our recommendations and conclusions through examples and experimental evidence. This tutorial aims to facilitate discussion and stimulate researchers to employ uniform and well-grounded evaluation strategies.

Scope. The tutorial aims to provide its audience with a practice-oriented overview of the evaluation metrics that can be used in the field of eye movement detection, covering a wide variety of set-ups, such as eye movements with synthetic and naturalistic stimuli, in the presence or absence of manual annotations, as well as different purposes of the evaluation (selecting the best algorithm for automatic detection; finding systematic biases in the annotations by different experts; training a machine learning model) and evaluated entities (i.e. individual samples or whole events). The presentations will give recommendations for evaluation strategy choices for different scenarios, as well as support the discussion of various metrics with examples.

Audience. Researchers involved in eye movement detection (or even those who use existing detectors to make sense of their data) could benefit from the tutorial regardless of their background, either by discovering something new about the metrics they have or have not used before, or by contributing to the discussion and sharing their experiences.

Bio. Mikhail Startsev is a PhD student at the Technical University of Munich (TUM), Germany, and a member of an International Junior Research Group, "Visual Efficient Sensing for the Perception-Action Loop" (VESPA), under the supervision of Michael Dorr. He received his Diplom degree in Computational Mathematics and Informatics from the Lomonosov Moscow State University (LMSU), Russia, in 2015, where he was a member of the Graphics and Media Lab. Mikhail's research is centred around the human visual system, with a particular emphasis on eye movements and saliency modelling, with several publications in human and computer vision-related conferences and journals.

Dr. Raimondas Zemblys is currently a researcher at Siauliai University (Lithuania) and a research engineer at Smart Eye AB (Sweden). His main research interests are eye-tracking methodology, eye-movement data quality, event detection, and applications of deep learning for eye-movement analysis. He received his PhD in Informatics Engineering from Kaunas University of Technology in 2013, and worked as a postdoc researcher at Lund University in 2013-2015 and at Michigan State University in 2017-2018.

Gaze Analytics Pipeline
ANDREW DUCHOWSKI, ADUCHOW@CLEMSON.EDU
NINA GEHRER, NINA.GEHRER@UNI-TUEBINGEN.DE
Tuesday, June 25, 2019, 8:00-12:30 (Red Cloud)

Abstract. This tutorial gives a short introduction to experimental design in general and with regard to eye tracking studies in particular. Additionally, the design of three different eye tracking studies (using stationary as well as mobile eye trackers) will be presented, and the strengths and limitations of their designs will be discussed. Further, the tutorial presents details of a Python-based gaze analytics pipeline developed and used by Prof. Duchowski and Ms. Gehrer. The gaze analytics pipeline consists of Python scripts for extraction of raw eye movement data, analysis and event detection via velocity-based filtering, collation of events for statistical evaluation, and analysis and visualization of results using R.
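As an illustration of the velocity-based filtering step, here is a minimal I-VT-style sketch; it is an assumption of ours, not the pipeline's actual script, which additionally handles noise, blinks, and event collation:

```python
# A minimal sketch (assumed, not the tutorial's scripts) of velocity-based
# event detection: an I-VT style filter labeling samples as fixation or
# saccade by thresholding angular velocity.
import numpy as np

def ivt_detect(x_deg, y_deg, timestamps_s, velocity_threshold=30.0):
    """Label each gaze sample: 'fixation' below the velocity threshold
    (deg/s), 'saccade' at or above it."""
    dt = np.diff(timestamps_s)
    velocity = np.hypot(np.diff(x_deg), np.diff(y_deg)) / dt
    velocity = np.append(velocity, velocity[-1])    # pad to input length
    return np.where(velocity < velocity_threshold, "fixation", "saccade")

# 60 Hz toy trace: a fixation, a fast 5-degree jump, another fixation.
t = np.arange(30) / 60.0
x = np.concatenate([np.full(15, 1.0), np.full(15, 6.0)])
y = np.zeros(30)
print(ivt_detect(x, y, t))
```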
Attendees of the tutorial will have the opportunity to run the scripts on an analysis of gaze data collected during categorization of different emotional expressions while viewing faces. The tutorial covers basic eye movement analytics, e.g. fixation count and dwell time within AOIs, as well as advanced analysis using gaze transition entropy. Newer analytical tools and techniques such as microsaccade detection and the Index of Pupillary Activity will be covered, time permitting.
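For illustration, here is a simplified, first-order sketch of the gaze transition entropy idea; this stripped-down version is ours (implementations following the literature typically weight conditional entropies by the stationary AOI distribution):

```python
# A minimal sketch (assumed, not the pipeline's script) of gaze transition
# entropy: the Shannon entropy of transitions between AOIs in a fixation
# sequence, one of the advanced metrics mentioned above.
from collections import Counter
from math import log2

def transition_entropy(aoi_sequence):
    """Entropy (bits) of the first-order AOI transition distribution."""
    transitions = Counter(zip(aoi_sequence, aoi_sequence[1:]))
    total = sum(transitions.values())
    return -sum((n / total) * log2(n / total) for n in transitions.values())

scanpath = ["eyes", "mouth", "eyes", "nose", "eyes", "mouth"]
print(transition_entropy(scanpath))   # higher = less predictable scanning
```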
                                                                                syndrome, present with complex etiologies and can
Scope and Audience. The tutorial welcomes attendees at all levels               LQFXUVLJQL¿FDQWFKDOOHQJHVWKURXJKRXWWKHLUOLIH
of experience and expertise, from those just beginning to study eye             Especially in very young children, heterogeneity
movements and interested in the basics of experimental design to those          across and within diagnostic categories makes uniform
                                                                                                                                               FREDERICK SHIC
well practiced in the profession who might wish to consider adopting use        application of standard assessment methods, that
of Python and R scripts, possibly wishing to contribute to, expand on, and      often rely on assumptions of communicative or other
improve the pipeline.                                                           GHYHORSPHQWDODELOLWLHVGLI¿FXOW(\HWUDFNLQJKDVHPHUJHGDVDSRZHUIXO
                                                                                tool to study both the mechanistic underpinnings of atypical development
Bio. Dr. Duchowski is a professor of Computer Science at Clemson                as well as facets of cognitive and attentional development that may be
University. He received his baccalaureate (1990) from Simon Fraser              of clinical and prognostic value. In this tutorial we discuss the challenges
University, Burnaby, Canada, and doctorate (1997) from Texas A&M                and approaches associated with studying developmental conditions
University, College Station, TX, both in Computer Science. His research         using eye tracking. Using autism spectrum disorder (ASD) as a model,
and teaching interests include visual attention and perception, eye tracking,   we discuss the interplay between clinical facets of conditions and studies
computer vision, and computer graphics. He is a noted research leader           and techniques used to probe neurodevelopment.
LQWKH¿HOGRIH\HWUDFNLQJKDYLQJSURGXFHGDFRUSXVRISDSHUVDQGD
monograph related to eye tracking research, and has delivered courses           Scope and Audience. This tutorial is geared towards engineers and
and seminars on the subject at international conferences. He maintains          computer scientists who may be interested in the variety of ways eye
Clemson’s eye tracking laboratory, and teaches a regular course on eye          tracking can be used in the study of developmental mechanism or for the
tracking methodology attracting students from a variety of disciplines          development of clinically-relevant methods, but does not assume deep
across campus.                                                                  knowledge of eye tracking hardware, algorithms, or engineering-focused
                                                                                literature. Similarly, the tutorial will be broadly accessible, assuming
Nina Gehrer is a clinical psychologist who is currently working on her PhD      limited or no knowledge of developmental conditions, clinical research,
thesis at the University of Tübingen, Germany, since she received her           and/or autism.
master’s degree in 2015. Her main research interest lies in studying face
and emotion processing using eye tracking and a preferably wide range of        Bio. Frederick Shic, Ph.D. is an Associate Professor of Pediatrics at
analytic methods. As a clinical psychologist, she is particularly interested    the University of Washington and an Investigator at Seattle Children’s
in possible alterations related to psychological disorders that could           Research Institute’s Center for Child Health, Behavior and Development.
XQGHUOLHDVVRFLDWHGGH¿FLWVLQVRFLDOLQIRUPDWLRQSURFHVVLQJ6KHEHJDQ        Dr. Shic has been an autism researcher for 15 years and, as a computer
working with Prof. Duchowski in 2016. Since then, they have enhanced            scientist by training, brings an interdisciplinary perspective to early
and implemented his gaze analytics pipeline in the analysis of several          developmental, therapeutic, and phenotyping research. Dr. Shic leads
eye tracking studies involving face and emotion processing. Recently,           the Seattle Children’s Innovative Technologies Laboratory (SCITL), a
they have started to extend their research to gaze patterns during social       ODEIRFXVHGRQDGYDQFLQJDQGUH¿QLQJWHFKQRORJ\EDVHGWRROVLQFOXGLQJ
interactions.                                                                   eye tracking, functional near infrared spectroscopy, robots, mobile apps,
                                                                                and video games. His goals are to understand lifespan trajectories
                                                                                leading to heterogeneous outcomes in ASD, and to develop methods
                                                                                for positively intercepting these trajectories. To enable this, he focuses

20                                                Tuesday 6.25.19                                                                                          21
on big data perspectives of phenotypic variation, biomarker discovery
enabled via technology, and rapid, adaptable, evolving frameworks for
outcomes research applicable to diverse populations. His current and
prior work, funded by NIMH, Simons Foundation, and Autism Speaks,
includes developmental, psychological, and applied autism research as
well as methods engineering aimed at creating and refining analytical and
predictive techniques. Previously, he was an engineering undergraduate
at Caltech, a Sony PlayStation video game programmer, a magnetic
resonance spectroscopy brain researcher, and a graduate student at Yale
Computer Science’s Social Robotics Lab. It was during this graduate
work when, needing child gaze patterns to program an attention system
for a baby-emulating robot, he was first introduced to autism research
at the Yale Child Study Center. He continued this work as an NIMH T32
postdoc in Childhood Neuropsychiatric Disorders and then as an Assistant
Professor at the Yale Child Study Center.

Doctoral Symposium
Schedule, Tuesday, June 25, 2019, Humboldt

08:00-10:00
Welcome Note - Doctoral Symposium Co-Chairs (30 minutes)
3-minute introductions - in 1 slide, introduce yourself, your educational
background, collaborations, and why your work is important

10:00-10:30
Coffee break (Ballroom Foyer)

10:30-12:30
Large Group Discussions (All Together)
Discuss technical aspects of your work
(5 minutes, up to 5 slides, 3 minutes for feedback and Q&A).
Each student is assigned two abstracts to review in detail.
One student will serve as moderator, introducing the speaker and topic
and kicking off the discussion.
Another student will serve as scribe.

12:30-13:30
Lunch break (The Lockwood Kitchen)
Discuss the following with peers at your table, with
one person taking notes (prepare 3-4 concrete questions):
What obstacles/challenges are you facing?
Do you feel your work is progressing smoothly?
What could you use guidance on?
What specific questions would you ask experts in the field?

13:30-15:30
Summary of lunch discussions (13:30-13:45)
Small Group Discussions with Faculty (13:45-15:30)
Three small groups of DS students meet with
three groups of established researchers.
Groups rotate every 30 minutes.

15:30-16:00
Coffee break (Ballroom Foyer)

16:00-17:00
Posters Fast-Forward practice run
Maximizing your conference experience
Closing Remarks

18:00
Social Event
Doctoral Symposium
Abstracts

When you don’t see what you expect: incongruence in music
and source code reading
Natalia Chitalkina (University of Turku)

Eye-tracking based Fatigue and Cognitive Assessment
Tanya Bafna (Technical University of Denmark) and John Paulin Hansen
(Technical University of Denmark)

Pupil Diameter as a Measure of Emotion and Sickness in VR
Brendan John (University of Florida)

Accessible Control of Telepresence Robots based on Eye-Tracking
Guangtao Zhang (Technical University of Denmark)

The vision and interpretation of paintings: bottom-up visual processes,
top-down culturally informed attention, and aesthetic experience
Pablo Fontoura (EHESS), Jean-Marie Schaeffer (EHESS), and Michel
Menu (C2RMF)

Attentional orienting in real and virtual 360-degree environments:
application to aeronautics
Rébaï Soret (ISAE-SUPAERO), Christophe Hurter (ENAC - Ecole Nationale
de l’Aviation Civile), and Vsevolod Peysakhovich (ISAE)

Motion Tracking of Iris Features for Eye tracking
Aayush Chaudhary (Rochester Institute of Technology)

Automatic quick-phase detection in bedside recordings
from patients with acute dizziness and nystagmus
Sai Akanksha Punuganti (Johns Hopkins University, USA), Jing Tian
(Johns Hopkins University, USA), and Jorge Otero-Millan (Johns Hopkins
University, USA)

Towards a Data-driven Framework for Realistic Self-Organized Virtual
Humans: Coordinated Head and Eye movements
Zhizhuo Yang (Rochester Institute of Technology)

Microsaccadic and Pupillary Response to Tactile Task Difficulty
Justyna Żurawska (SWPS University of Social Sciences and Humanities)

Looks Can Mean Achieving: Understanding Eye Gaze Patterns of
Proficiency in Code Comprehension
Jonathan Saddler (University of Nebraska Lincoln)

High-Resolution Eye Tracking Using Scanning Laser Ophthalmoscopy
Norick Bowers (University of California, Berkeley)

Eye movements during reading and reading assessment in Swedish
schoolchildren – a new window to reading difficulties
Andrea Strandberg (Karolinska Institute)
ETRA 2019 Long and Short Papers
Presented as Talks

Wednesday, June 26, 2019
10:00-12:00
(Humboldt)

Session 1

Session Chair: Tanja Blascheck (University of Stuttgart)

Deep learning investigation for chess player attention prediction using
eye-tracking and game data
Justin Le Louedec, Thomas Guntz, James Crowley and Dominique
Vaufreydaz
Long

Semantic Gaze Labeling for Human-Robot Shared Manipulation
Reuben Aronson and Henny Admoni
Long

EyeFlow: Pursuit Interactions Using an Unmodified Camera
Almoctar Hassoumi, Vsevolod Peysakhovich and Christophe Hurter
Long

Exploring Simple Neural Network Architectures for Eye Movement
Classification
Jonas Goltz, Michael Grossberg, and Ronak Etemadpour
Short

Analyzing Gaze Transition Behavior Using Bayesian Mixed Effects Markov
Models
Islam Akef Ebeid, Nilavra Bhattacharya, Jacek Gwizdka and Abhra Sarkar
Short

COGAIN 2019 Long and Short Papers
Presented as Talks

Wednesday, June 26, 2019
10:00-12:00 & 13:00-15:00
(Pike’s Peak)

Session 1

Session Chair: Arantxa Villanueva (Public University of Navarre, Spain)

10:00-10:15
Welcome and Brief Introduction

10:15-10:55
Invited talk
Eye Tracking - From the Past to the Future
Heiko Drewes (University of Munich, Germany)

11:00-11:20
A Comparative Study of Eye Tracking and Hand Controller for Aiming
Tasks in Virtual Reality
Francisco Lopez Luro (Blekinge Institute of Technology) and Veronica
Sundstedt (Blekinge Institute of Technology)
Long paper

11:20-11:40
Pointing by Gaze, Head, and Foot in a Head-Mounted Display
John Paulin Hansen (Technical University of Denmark),
Katsumi Minakata (Technical University of Denmark), I. Scott MacKenzie
(York University), Per Bækgaard (Technical University of Denmark), and
Vijay Rajanna (Texas A&M University)
Long paper

11:40-12:00
Hand- and Gaze-Control of Telepresence Robots
Guangtao Zhang (Technical University of Denmark), John Paulin
Hansen (Technical University of Denmark), and Katsumi Minakata
(Technical University of Denmark)
Long paper

12:00-13:00 Lunch break (Lockwood)
Session 2
Calibration, Cognition, Smartphones, & Sequences

Session Chair: Izabela Krejtz

13:00-15:00
(Humboldt)

Gaze Behaviour on Interacted Objects during Hand Interaction
in Virtual Reality for Eye Tracking Re-calibration
Ludwig Sidenmark and Anders Lundström
Long

Time- and Space-efficient Eye Tracker Calibration
Heiko Drewes, Ken Pfeuffer, and Florian Alt
Long

Task-embedded online eye-tracker calibration for improving
robustness to head motion
Jimin Pi and Bertram E. Shi
Long

Reducing Calibration Drift in Mobile Eye Trackers by
Exploiting Mobile Phone Usage
Philipp Müller, Daniel Buschek, Michael Xuelin Huang,
and Andreas Bulling
Long

Aiming for Quiet Eye in Biathlon
Dan Witzner Hansen, Amelie Heinrich, and Rouwen Cañal-Bruland
Long

Session 2

Session Chair: Scott MacKenzie (York University, Canada)

13:00-13:20
SacCalib: Reducing Calibration Distortion for Stationary
Eye Trackers Using Saccadic Eye Movements
Michael Xuelin Huang (Max Planck Institute for Informatics) and Andreas
Bulling (University of Stuttgart)
Long paper

13:20-13:40
SaccadeMachine: Software for Analyzing Saccade Tests
(Anti-Saccade and Pro-saccade)
Diako Mardanbegi (Lancaster University, Lancaster, UK), Thomas
Wilcockson (Lancaster University, Lancaster, UK), Pete Sawyer (Aston
University, Birmingham, UK), Hans Gellersen (Lancaster University,
Lancaster, UK), and Trevor Crawford (Lancaster University, Lancaster, UK)
Long paper

13:40-14:00
GazeButton: Enhancing Buttons with Eye Gaze Interactions
Sheikh Radiah Rahim Rivu (Bundeswehr University Munich), Yasmeen
Abdrabou (German University in Cairo), Thomas Mayer (Ludwig Maximilian
University of Munich), Ken Pfeuffer (Bundeswehr University Munich), and
Florian Alt (Bundeswehr University Munich)
Long paper

14:00-14:20
Impact of Variable Position of Text Prediction in Gaze-based Text Entry
Korok Sengupta (University of Koblenz-Landau), Raphael Menges
(University of Koblenz-Landau), Chandan Kumar (University of Koblenz-
Landau), and Steffen Staab (Institut WeST, University of Koblenz-Landau
and WAIS, University of Southampton)
Long paper

14:20-14:35
Inducing Gaze Gestures by Static Illustrations
Päivi Majaranta (Tampere University), Jari Laitinen (Tampere University),
Jari Kangas (Tampere University), and Poika Isokoski (Tampere University)
Short paper

14:35-15:00
Closing Session and Best COGAIN Paper Award
POSTERS Fast Forward Session / ET4S

•    W!NCE: Eyewear Solution for Upper Face Action Units Monitoring
•    A Gaze-Based Experimenter Platform for Designing and Evaluating Adaptive Interventions in
     Information Visualizations
•    PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and
     Eye Movement Features
•    iLid: Eyewear Solution for Low-power Fatigue and Drowsiness Monitoring
•    Get a Grip: Slippage-Robust and Glint-Free Gaze Estimation for Real-Time Pervasive Head-
     Mounted Eye Tracking
•    Estimation of Situation Awareness Score and Performance Using Eye and Head Gaze for
     Human-Robot Collaboration
•    When you don’t see what you expect: incongruence in music and source code reading
•    Eye-tracking based Fatigue and Cognitive Assessment
•    Pupil Diameter as a Measure of Emotion and Sickness in VR
•    Accessible Control of Telepresence Robots based on Eye-Tracking
•    The vision and interpretation of paintings: bottom-up visual processes, top-down culturally
     informed attention, and aesthetic experience
•    Attentional orienting in real and virtual 360-degree environments: application to aeronautics
•    Motion Tracking of Iris Features for Eye tracking
•    Automatic quick-phase detection in bedside recordings from patients with acute dizziness and
     nystagmus
•    Towards a Data-driven Framework for Realistic Self-Organized Virtual Humans: Coordinated
     Head and Eye movements
•    Microsaccadic and Pupillary Response to Tactile Task Difficulty
•    Looks Can Mean Achieving: Understanding Eye Gaze Patterns of Proficiency in Code
     Comprehension
•    High-Resolution Eye Tracking Using Scanning Laser Ophthalmoscopy
•    Eye movements during reading and reading assessment in Swedish school children – a new
     window on reading difficulties
•    GazeVR: A Toolkit for Developing Gaze Interactive Applications in VR/AR
•    Improving Real-Time CNN-Based Pupil Detection Through Domain-Specific Data Augmentation
•    Reading Detection in Real-time
•    Exploring Simple Neural Network Architectures for Eye Movement Classification
•    EyeVEIL: Degrading Iris Authentication in Eye-Tracking Headsets
•    Remote Corneal Imaging by Integrating a 3D Face Model and an Eyeball Model
•    Detecting cognitive bias in a relevance assessment task using an eye tracker
•    Random ferns for area of interest free scanpath classification
•    TobiiGlassesPySuite: An open-source suite for using the Tobii Pro Glasses 2 in eye-tracking
     studies
•    SeTA: Semiautomatic Tool for Annotation of Eye Tracking Images
•    A Fitts’ Law Study of Pupil Dilations in a Head-Mounted Display
•    Factors Influencing Dwell Time During Source Code Reading: A Large-Scale Replication
     Experiment
•    Calibration-free Text Entry using Smooth Pursuit Eye Movements
•    Analyzing Gaze Transition Behavior Using Bayesian Mixed Effects Markov Models
•    Quantifying and Understanding the Differences in Visual Activities with Contrast Subsequences
•    A Deep Learning Approach for Robust Head Pose Independent Eye movements recognition
     from Videos
•    A Gaze Model Improves Autonomous Driving
•    Inferring target locations from gaze data: A smartphone study
•    Boosting Speed- and Accuracy of Gradient based Dark Pupil Tracking using Vectorization and
     Differential Evolution

The poster fast forward session will feature lightning talks from all ETRA short papers, doctoral
symposium abstracts, videos, and demos.

ET4S 2019 Long and Short Papers Presented as Talks

Wednesday, June 26, 2019, 15:30-18:00
(Pike’s Peak)

Session 1
Eye Tracking for Spatial Research

Session Chair: Peter Kiefer

15:30-16:30
Eye Tracking in Mixed Reality and its Promises for Spatial Research
Sophie Stellmach
Invited Talk

16:30-16:50
GeoGCD: Improved Visual Search via Gaze-Contingent Display
Kenan Bektaş, Arzu Çöltekin, Jens Krüger, Andrew T. Duchowski,
and Sara Irina Fabrikant
Long

16:50-17:10
Eye gaze and head gaze in collaborative games
Oleg Špakov, Howell Istance, Kari-Jouko Räihä, Tiia Viitanen,
and Harri Siirtola
Long

17:10-17:30
Attentional orienting in virtual reality using endogenous
and exogenous cues in auditory and visual modalities
Rébaï Soret, Pom Charras, Christophe Hurter, and Vsevolod
Peysakhovich
Long

17:30-17:50
POITrack: Improving Map-Based Planning with Implicit POI Tracking
Fabian Göbel and Peter Kiefer
Long

17:50-18:00
Gaze awareness improves collaboration efficiency
in a collaborative assembly task
Haofei Wang and Bertram E. Shi
Short
ETRA 2019 POSTERS, PRE-FUNCTION SPACE

S5     Improving Real-Time CNN-Based Pupil Detection Through Domain-Specific Data Augmentation
       Shahram Eivazi (Eberhard Karls Universität Tübingen), Thiago Santini (Eberhard Karls
       Universität Tübingen), Alireza Keshavarzi (Eberhard Karls Universität Tübingen), Thomas
       Kübler (Eberhard Karls Universität Tübingen), and Andrea Mazzei (Cortical Arts GmbH)
S13    Reading Detection in Real-time
       Conor Kelton (Stony Brook University), Zijun Wei (Stony Brook University), Seoyoung Ahn
       (Stony Brook University), Aruna Balasubramanian (Stony Brook University), Samir R. Das
       (Stony Brook University), Dimitris Samaras (Stony Brook University), and Gregory Zelinsky
       (Stony Brook University)
S26    Exploring Simple Neural Network Architectures for Eye Movement Classification
       Jonas Goltz (Department of Computer Science and Mathematics, Munich University of Applied
       Sciences), Michael Grossberg (Department of Computer Science, City College of New York/
       CUNY), and Ronak Etemadpour (Department of Computer Science, City College of New York/
       CUNY)
S32    EyeVEIL: Degrading Iris Authentication in Eye-Tracking Headsets
       Brendan John (University of Florida), Sanjeev Koppal (University of Florida), and Eakta Jain
       (University of Florida)
S34    Remote Corneal Imaging by Integrating a 3D Face Model and an Eyeball Model
       Takamasa Utsu (Tokai University) and Kentaro Takemura (Tokai University)
S58    Detecting cognitive bias in a relevance assessment task using an eye tracker
       Christopher G. Harris (University of Northern Colorado)
S62    Random ferns for area of interest free scanpath classification
       Wolfgang Fuhl (Eberhard Karls Universität Tübingen), Nora Castner (Eberhard Karls
       Universität Tübingen), Thomas Kübler (Eberhard Karls Universität Tübingen), Alexander Lotz
       (Daimler AG), Wolfgang Rosenstiel (Eberhard Karls Universität Tübingen), and Enkelejda
       Kasneci (University of Tübingen)
S65    TobiiGlassesPySuite: An open-source suite for using the Tobii Pro Glasses 2 in eye-tracking
       studies
       Davide De Tommaso (Istituto Italiano di Tecnologia) and Agnieszka Wykowska (Istituto Italiano
       di Tecnologia)
S70    SeTA: Semiautomatic Tool for Annotation of Eye Tracking Images
       Andoni Larumbe (Public University of Navarra), Sonia Porta (Public University of Navarra),
       Rafael Cabeza (Public University of Navarra), and Arantxa Villanueva (Public University of
       Navarra)
S74    A Fitts’ Law Study of Pupil Dilations in a Head-Mounted Display
       Per Bækgaard (Technical University of Denmark), John Paulin Hansen (Technical University
       of Denmark), Katsumi Minakata (Technical University of Denmark), and I. Scott MacKenzie
       (York University)
S83    Factors Influencing Dwell Time During Source Code Reading: A Large-Scale Replication
       Experiment
       Cole Peterson (University of Nebraska - Lincoln), Nahla Abid (Kent State University), Corey
       Bryant (Kent State University), Jonathan Maletic (Kent State University), and Bonita Sharif
       (University of Nebraska - Lincoln)
S91    Calibration-free Text Entry using Smooth Pursuit Eye Movements
       Yasmeen Abdrabou (German University in Cairo (GUC)), Mariam Mostafa (German University
       in Cairo (GUC)), Mohamed Khamis (University of Glasgow), and Amr Elmougy (German
       University in Cairo (GUC))
S103   Analyzing Gaze Transition Behavior Using Bayesian Mixed Effects Markov Models
       Islam Akef Ebeid (The University of Texas at Austin), Nilavra Bhattacharya (The University of
       Texas at Austin), Jacek Gwizdka (The University of Texas at Austin), and Abhra Sarkar (The
       University of Texas at Austin)
S111   Quantifying and Understanding the Differences in Visual Activities with Contrast Subsequences
       Yu Li (University of Missouri - Columbia), Carla Allen (University of Missouri - Columbia), and
       Chi-Ren Shyu (University of Missouri - Columbia)
S115   A Deep Learning Approach for Robust Head Pose Independent Eye movements recognition
       from Videos
       Remy Siegfried (Idiap Research Institute), Yu Yu (Idiap Research Institute), and Jean-Marc
       Odobez (Idiap Research Institute)
S119   A Gaze Model Improves Autonomous Driving
       Congcong Liu (The Hong Kong University of Science and Technology), Yuying Chen (The
       Hong Kong University of Science and Technology), Lei Tai (The Hong Kong University of
       Science and Technology), Haoyang Ye (The Hong Kong University of Science and
       Technology), Ming Liu (The Hong Kong University of Science and Technology), and Bertram
       Shi (The Hong Kong University of Science and Technology)
S125   Inferring target locations from gaze data: A smartphone study
       Stefanie Mueller (ZPID - Leibniz Institute of Psychology Information)
S130   Boosting Speed- and Accuracy of Gradient based Dark Pupil Tracking using Vectorization and
       Differential Evolution
       André Frank Krause (Mediablix IIT GmbH) and Kai Essig (Rhine-Waal University of Applied
       Sciences)