This quote only works for those who experienced both sides of the change. And the problems brought up in the article are mostly about the younger generations.
> Telemetry: Fig has basic telemetry in order to help us make product decisions. We currently give you the option to opt out of all non-essential telemetry by running `fig settings app.disableTelemetry true`. This removes everything except for one daily ping. We use this ping to help us understand how many people are using Fig.
Does anyone know of a good example of eye-tracking being used for HCI where it surpasses keyboard/mouse?
When searching for it, most of the uses seem to be in foveated rendering, research, marketing, or accessibility.
Something like this? Fast, but there's still a way to go before it's faster than keyboard/mouse.
[Hands-Free Coding: How I develop software using dictation and eye-tracking]
https://news.ycombinator.com/item?id=24846887
"I'd say I probably work at about 50% of my normal speed*. Now, this doesn't mean that I produce 50% of the results; it just means I need to prioritize a little more ruthlessly."
I once gave Marvin Minsky a headache by showing a video tape of bright blinking PostScript graphics! I apologized, and we had a nice conversation about how eye tracking interfaces tend to do the same thing.
>Zero Bandwidth Video: SCAN 1981. Zero Bandwidth Video: SCAN is an excerpt from demos of an Architecture Machine Group project circa 1980: an interactive portrait of Raleigh Perkins with various input devices to manipulate the image. The viewer pushes a joystick to make her look in various directions, and Nicholas Negroponte demonstrates linking the portrait to an eye-tracking device. SCAN was exhibited at the SIGGRAPH Art Show 1989.
>The Incomplete Tyranny of Dynamic Stimuli: Gaze Similarity Predicts Response Similarity in Screen‐Captured Instructional Videos
>Although eye tracking has been used extensively to assess cognitions for static stimuli, recent research suggests that the link between gaze and cognition may be more tenuous for dynamic stimuli such as videos. Part of the difficulty in convincingly linking gaze with cognition is that in dynamic stimuli, gaze position is strongly influenced by exogenous cues such as object motion. However, tests of the gaze‐cognition link in dynamic stimuli have been done on only a limited range of stimuli often characterized by highly organized motion. Also, analyses of cognitive contrasts between participants have mostly been limited to categorical contrasts among small numbers of participants that may have limited the power to observe more subtle influences. We, therefore, tested for cognitive influences on gaze for screen‐captured instructional videos, the contents of which participants were tested on. Between‐participant scanpath similarity predicted between‐participant similarity in responses on test questions, but with imperfect consistency across videos. We also observed that basic gaze parameters and measures of attention to centers of interest only inconsistently predicted learning, and that correlations between gaze and centers of interest defined by other‐participant gaze and cursor movement did not predict learning. It, therefore, appears that the search for eye movement indices of cognition during dynamic naturalistic stimuli may be fruitful, but we also agree that the tyranny of dynamic stimuli is real, and that links between eye movements and cognition are highly dependent on task and stimulus properties.
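The abstract doesn't specify how "between-participant scanpath similarity" is computed, so here's a minimal sketch of one naive way to score it: mean pointwise Euclidean distance between time-aligned gaze samples, normalized by the screen diagonal. The function name, the alignment assumption, and the 1920x1080 screen size are all illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch (NOT the paper's metric): score similarity of two
# time-aligned gaze scanpaths by mean pointwise Euclidean distance.
from math import dist

def scanpath_similarity(path_a, path_b, screen_diag=2202.9):
    """path_a, path_b: equal-length lists of (x, y) gaze samples taken at
    the same timestamps. Returns a similarity score in [0, 1]:
    1 = identical paths, 0 = maximally far apart. The default normalizer
    is the diagonal of an assumed 1920x1080 screen."""
    assert len(path_a) == len(path_b), "paths must be time-aligned"
    mean_dist = sum(dist(a, b) for a, b in zip(path_a, path_b)) / len(path_a)
    return 1.0 - min(mean_dist / screen_diag, 1.0)

# Two viewers watching the same video, mostly looking at the same places:
viewer1 = [(100, 200), (150, 220), (400, 300)]
viewer2 = [(110, 205), (160, 215), (390, 310)]
print(scanpath_similarity(viewer1, viewer2))  # close to 1.0
```

Real scanpath comparisons typically handle paths of different lengths and weight fixation durations (e.g., string-edit or MultiMatch-style methods), which this sketch deliberately ignores.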