gazeNet: End-to-end eye-movement event detection with deep neural networks
Language: English
Country: United States
Media: print
Document type: Journal Article
PubMed: 30334148
DOI: 10.3758/s13428-018-1133-5
PII: 10.3758/s13428-018-1133-5
- Keywords
- Deep learning, Event detection, Eye movements, Fixation, PSO, Saccade
- MeSH
- Algorithms MeSH
- Behavioral Research methods MeSH
- Humans MeSH
- Neural Networks, Computer * MeSH
- Task Performance and Analysis MeSH
- Eye Movements * MeSH
- Saccades MeSH
- Check Tag
- Humans MeSH
- Publication type
- Journal Article MeSH
Existing event detection algorithms for eye-movement data almost exclusively rely on thresholding one or more hand-crafted signal features, each computed from the stream of raw gaze data. Moreover, the choice of thresholds is largely left to the end user. Here we present and develop gazeNet, a new framework for creating event detectors that do not require hand-crafted signal features or signal thresholding. It employs an end-to-end deep learning approach, which takes raw eye-tracking data as input and classifies it into fixations, saccades, and post-saccadic oscillations (PSOs). Our method thereby challenges an established tacit assumption that hand-crafted features are necessary in the design of event detection algorithms. The downside of the deep learning approach is that a large amount of training data is required. We therefore first develop a method to augment hand-coded data, so that we can substantially enlarge the data set used for training while minimizing the time spent on manual coding. Using this extended hand-coded data, we train a neural network that produces eye-movement event classifications from raw eye-movement data without requiring any predefined feature extraction or post-processing steps. The resulting classification performance is at the level of expert human coders. Moreover, an evaluation of gazeNet on two other datasets showed that gazeNet generalized to data from different eye trackers and consistently outperformed several other event detection algorithms that we tested.
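The end-to-end framing the abstract describes — raw gaze samples in, one event label per sample out, with no hand-crafted features — can be sketched as follows. This is a minimal illustrative stand-in, not the actual gazeNet architecture: the tiny untrained 1-D convolutional net, its layer sizes, and the `classify_gaze` helper are all assumptions made for the example.

```python
import numpy as np

# Sketch of the end-to-end framing: a sequence of raw (x, y) gaze samples
# goes in, one event label per sample comes out
# (0 = fixation, 1 = saccade, 2 = post-saccadic oscillation).
# The random-weight conv net below is a placeholder for the trained model.

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Valid 1-D convolution over time: x is (T, C_in), w is (K, C_in, C_out)."""
    K, _, C_out = w.shape
    T = x.shape[0] - K + 1
    out = np.empty((T, C_out))
    for t in range(T):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return out

def classify_gaze(xy):
    """Map raw (x, y) gaze samples directly to per-sample event labels."""
    K = 5
    pad = K // 2  # edge-pad so the output keeps one label per input sample
    x = np.pad(xy, ((pad, pad), (0, 0)), mode="edge")
    h = np.maximum(conv1d(x, w1, b1), 0.0)           # conv + ReLU
    h = np.pad(h, ((pad, pad), (0, 0)), mode="edge")
    logits = conv1d(h, w2, b2)                       # conv down to 3 classes
    return logits.argmax(axis=1)

# Random, untrained weights -- placeholders for learned parameters.
w1, b1 = rng.normal(size=(5, 2, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(5, 8, 3)), np.zeros(3)

gaze = rng.normal(size=(200, 2))   # 200 raw gaze samples, (x, y) per sample
labels = classify_gaze(gaze)
print(labels.shape)                # one label per sample: (200,)
```

The point of the sketch is the interface, not the weights: the detector consumes the raw signal and emits a dense per-sample labeling, so no velocity features, thresholds, or post-processing merge rules appear anywhere in the pipeline.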
Department of Computer Science, University of the Free State, Bloemfontein, South Africa
Department of Psychology, Regensburg University, Regensburg, Germany
Faculty of Arts, Masaryk University, Brno, Czech Republic
Humanities Laboratory and Department of Psychology, Lund University, Lund, Sweden