IMPORTANT DATES
Abstracts due : Jan 14th, 2019
Papers due : Jan 18th, 2019
Feedback : Feb 18th, 2019
Rebuttals : Feb 25th, 2019
Decisions : Mar 4th, 2019
Camera-ready due : Mar 29th, 2019
CALL FOR PAPERS
The Symposium on Communication by Gaze Interaction organized by the
COGAIN Association (http://cogain.org) will be co-located with ETRA
2019, the ACM Symposium on Eye Tracking Research & Applications.
ETRA 2019 will take place in Denver, Colorado, June 25-28.
Following the successful format of the COGAIN Symposium 2018,
COGAIN 2019 will again be organized as a “special session” at ETRA.
By combining our efforts with ETRA, we hope to encourage a broader
exchange of knowledge and experiences amongst the communities of
researchers, developers, manufacturers, and users of eye trackers.
We invite authors to prepare and submit long or short papers
following ETRA's ACM format (http://etra.acm.org/2019/authors.html).
Long papers are up to 8 pages (+ 2 additional pages for references).
Short papers are up to 4 pages (+ 2 additional pages for references).
During the submission process to ETRA 2019, you will be asked if
you would like your paper to be presented at the COGAIN Symposium.
All papers accepted for the COGAIN Symposium will be
published as part of the ETRA 2019 ACM Proceedings.
The COGAIN Symposium focuses on all aspects of gaze interaction, with
special emphasis on eye-controlled assistive technology.
The symposium will present advances in these areas, leading to new
capabilities in gaze interaction, gaze enhanced applications,
gaze-contingent devices, etc. Topics of interest include all aspects of
gaze interaction and communication by gaze, including, but not limited to:
* Eye-controlled assistive technology
* Gaze-contingent devices
* Gaze-enhanced games
* Gaze-controlled robots and vehicles
* Gaze interaction with mobile devices
* Gaze-controlled smart-home devices
* Gaze interfaces for wearable computing
* Gaze interaction in 3D (VR/AR/MR & real world)
* Gaze interaction paradigms
* Usability and UX evaluation of gaze-based interfaces
* User context estimation from eye movements
* Gaze-supported multimodal interaction (gaze with multitouch, mouse, gesture, etc.)
The Program Committee will select the COGAIN 2019 best paper.
– John Paulin Hansen [Technical University of Denmark, Denmark]
– Päivi Majaranta [Tampere University, Finland]
– Diako Mardanbegi [Lancaster University, United Kingdom]
– Ken Pfeuffer [Bundeswehr University Munich, Germany]
Please visit http://cogain2019.cogain.org for more information
or contact us by email to email@example.com