Imagine a future where you play a video game in the first person, using virtual reality, controlling your avatar with your own movements. The most natural way for someone to walk, run, aim, and jump is to actually do so. Angling control sticks, pressing buttons, and waving gestures in the air do not map one-to-one the way your brain maps to your muscles and limbs. Flow with the wiring of your brain rather than against it.
This system would require a direct connection to your brain, intercepting motor signals and consuming them so they never reach your muscles. Resting comfortably in REM-like paralysis, you control your game character naturally and intuitively, moving it and not yourself, all from the comfort of your couch.
Technology like this already exists in its infancy, such as prosthetics that provide movable arms and hands to those who have lost them:
"Move Arm" ---> Engage Robotics ---> "See Arm Move"
"Move Arm" ---> Animate Game ---> "Feel Arm Move"
The current feedback loop in robotics is limited; we can't yet make Luke Skywalker's robotic hand flinch when pricked with a needle. For things to work well in a video game, you'll need to feel your actions, just as a runner feels their foot-strike flick off the ground.
When will this technology be available? What bottlenecks need to be overcome for it to exist? How much will it cost? What other applications could it have? Will it be released before Episode 3? ;)
I don't know, this seems sort of silly. Working through abstractions such as metaphor and imagery (both visual and textual) is essentially working with the wiring of the brain already. Consider: plenty of people have explored such wondrous places as Middle Earth, yet none have literally walked around in it. Would it be neat? Sure. Especially in an instant-gratification sort of way.
And that's not to mention the cognitive trouble this could lead to. It isn't hard to imagine a situation like Mal's from Inception.