Is eye-tracking tech not good enough these days to position a cursor wherever one stares for 2+ seconds? I'm recalling studies of driving and artwork where the gaze lingers.
XR devices with eye tracking have practically no latency; instead, you'd need to tune down the dwell-time (hover) triggers on points of interest. There are also a lot of experimental keyboard ideas in Spatial Computing that could be integrated, like starburst wheels, which could give you layers of type assist, etc.
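For what it's worth, here's a minimal sketch of what a dwell-time trigger could look like, assuming gaze samples arrive as (x, y, timestamp) in screen pixels; the class name, the 2-second/40-pixel defaults, and the `gaze_stream`/`place_cursor` callbacks in the usage comment are all made up for illustration, not any particular XR SDK's API:

```python
import math
import time


class DwellSelector:
    """Fires a selection when the gaze stays within a small radius for a dwell period."""

    def __init__(self, dwell_seconds=2.0, radius_px=40.0):
        self.dwell_seconds = dwell_seconds  # how long the gaze must linger (tune this down)
        self.radius_px = radius_px          # tolerance for natural eye jitter / micro-saccades
        self.anchor = None                  # (x, y) where the current dwell started
        self.anchor_time = None             # timestamp of the first sample at the anchor

    def update(self, x, y, timestamp=None):
        """Feed one gaze sample; returns the (x, y) target if a dwell completed, else None."""
        timestamp = timestamp if timestamp is not None else time.monotonic()

        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius_px:
            # Gaze moved away: restart the dwell at the new fixation point.
            self.anchor = (x, y)
            self.anchor_time = timestamp
            return None

        if timestamp - self.anchor_time >= self.dwell_seconds:
            target = self.anchor
            self.anchor = None              # reset so the same fixation doesn't re-fire
            return target
        return None


# Hypothetical usage: gaze_stream() and place_cursor() stand in for whatever
# the eye-tracking runtime and UI layer actually provide.
#
# selector = DwellSelector(dwell_seconds=2.0, radius_px=40.0)
# for x, y, t in gaze_stream():
#     hit = selector.update(x, y, t)
#     if hit is not None:
#         place_cursor(*hit)
```

The tuning trade-off is all in those two numbers: a shorter dwell feels snappier but triggers accidentally ("Midas touch"), and a tighter radius fights the eye's natural jitter.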