
Rivvr Brings Wireless VR to the Oculus Rift and HTC Vive - yurisagalov
https://techcrunch.com/2016/12/15/rivvr-brings-wireless-vr-to-the-oculus-rift-and-htc-vive/
======
yurisagalov
So I had the chance to try this at the YC Alumni Demo Day at the Computer
History Museum, and it was SUPER cool. Completely wireless, and it doesn't add
any noticeable weight on top of the HTC Vive. The ability to walk
around the room with a Vive without having that cable tether made the
experience truly immersive.

~~~
mustacheemperor
Is the 11ms latency really not bothersome? I'd be concerned about the input
latency and the potential for frame rate drops from connection issues,
especially over longer sessions.

~~~
zackmorris
If the engineers did their homework, then the headset should send a vsync
signal back to the console so the software could use dead reckoning to predict
where the headset will be. That way it would be relatively immune to longer
transmission latencies. Maybe there's a way to do this with the Lighthouse
tracking, for instance by sending the latency measurement between the actual
and delayed position back through the controllers or something.
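As a rough illustration of the dead-reckoning idea (the function name and numbers below are purely illustrative, not Rivvr's or Valve's actual API):

```python
# Hedged sketch of dead-reckoning pose prediction: extrapolate the last
# known headset position forward by the measured transmission latency,
# assuming constant velocity over that interval.

def predict_position(position, velocity, latency_s):
    """Extrapolate each coordinate forward by latency_s seconds."""
    return tuple(p + v * latency_s for p, v in zip(position, velocity))

# Headset at (0, 1.6, 0) m moving 1 m/s along x, with 11 ms of latency:
predicted = predict_position((0.0, 1.6, 0.0), (1.0, 0.0, 0.0), 0.011)
print(predicted)  # (0.011, 1.6, 0.0)
```

A real system would also extrapolate orientation (and probably use the IMU's angular velocity), but the position case shows the principle.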

~~~
cma
The Oculus DK2 had a built-in latency sensor that measured a coded pixel in
the corner of the screen and then fed that back into the prediction/forward-
projection steps. I'm not sure about the Vive and Rift CV1; I didn't see
anything like it in the iFixit teardowns.

------
otoburb
>> _Unlike other devices, which are aiming to compress the entire raw HDMI
signal over the air, Rivvr uses its proprietary tech to compress the much
smaller video feed from the PC, sending just about 40-80 mbps of video signal
over the air._

This paragraph seems contradictory to me: other devices aim to compress the
video signal over the air (i.e. wirelessly), but Rivvr somehow compresses
"much smaller" video feeds from the PC "over the air" (i.e. wirelessly).

This isn't my industry, so what key technical definition or fact am I missing
from this picture? The two statements seem to be equivalent.

~~~
sixa
TechCrunch already fixed this error: "Unlike other devices which are aiming to
send the entire raw HDMI signal over the air, Rivvr uses its proprietary tech
to compress the much smaller video feed from the PC, sending just about 40-80
mbps of video signal over the air." Other companies are trying to transmit the
raw HDMI stream wirelessly on the 60 GHz band. They are trying to reinvent
Wireless HDMI, which is well known and has a lot of problems. We are using
standard WiFi networks on 2.4 GHz and 5 GHz.

~~~
Haldir
I wonder how that is supposed to work. Less than 11 ms of compression time per
frame, let's say 9 ms (as we need to account for the return path too) for a
2160 x 1200 stream plus audio. And you can't see any compression artifacts,
and we're talking close up here. Not possible with H.264/H.265.
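For reference, the back-of-the-envelope budget behind those numbers (my assumptions, not Rivvr's published figures): the Vive refreshes at 90 Hz, which leaves roughly 11 ms per frame, and reserving ~2 ms for the tracking return path leaves ~9 ms for encoding:

```python
# Rough per-frame latency budget at the Vive's 90 Hz refresh rate.
# The 2 ms return-path reserve is an assumption for illustration.
frame_budget_ms = 1000 / 90                          # ~11.1 ms per frame
return_path_ms = 2.0                                 # assumed tracking-return time
encode_budget_ms = frame_budget_ms - return_path_ms  # ~9.1 ms left to encode
print(round(frame_budget_ms, 1), round(encode_budget_ms, 1))  # 11.1 9.1
```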

~~~
random_comment
[http://www.multicorewareinc.com/news/multicoreware-
demonstra...](http://www.multicorewareinc.com/news/multicoreware-demonstrates-
high-quality-4k-10-bit-real-time-hevc-video-encoding-x265/)

This is from 2015. We're almost into 2017.

\- A 4k frame is 3.2 x as much data as the 2160x1200 frame.

\- In this case, compression was achieved at 60fps using H265.

\- The same spec of equipment should in principle be capable of achieving
60fps * 3.2 = 192fps. That's about 5ms of latency due to compression.

\- Add in the simplicity of many VR scenes - think RecRoom, Windlands...
should be fairly easy to compress.

\- Audio can be compressed in parallel

"Not possible"?

I disagree; it was more than possible two years ago.
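For what it's worth, the pixel-scaling arithmetic above checks out (rough numbers, ignoring codec overheads):

```python
# Sanity check of the pixel-count scaling argument above.
uhd_pixels = 3840 * 2160        # 4K UHD frame
vive_pixels = 2160 * 1200       # combined Vive panel resolution
ratio = uhd_pixels / vive_pixels             # 3.2x as many pixels in 4K
scaled_fps = 60 * ratio                      # ~192 fps at Vive resolution
latency_ms = 1000 / scaled_fps               # ~5.2 ms per frame
print(ratio, round(scaled_fps), round(latency_ms, 1))  # 3.2 192 5.2
```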

~~~
Haldir
Let's look at some real-life x265 benchmarks, because you skipped the part
where they did it on a beefy dual-socket server. Let's take
[http://x265.ru/en/x265-hd-benchmark](http://x265.ru/en/x265-hd-benchmark);
that benchmark is 1080p, so let's assume you're correct about the simplicity
and it might be closer to reality.

The fastest listed system there does a blazing 33fps (and that's a Broadwell;
the 2015 dual-socket server is probably at least the Xeon equivalent of it).

And we haven't even looked at the image quality/compression artifacts.

I disagree with your assessment.

Rivvr is apparently a spinoff of Sixa; the corresponding TechCrunch article
[https://techcrunch.com/2016/12/09/sixa-secures-3-5m-as-it-
la...](https://techcrunch.com/2016/12/09/sixa-secures-3-5m-as-it-launches-its-
cloud-computer-for-developers/) repeats the 11ms latency. Maybe it's the same
technology? The comments on the TechCrunch article don't sound convincing
either.

------
kimburgess
Really looking forward to watching the development of this. Ultra-low-latency,
high-quality signal compression and network-based distribution (which I assume
this is, based on other comments) is hard. It has a lot of use cases outside
of this scenario.

I know there's some really interesting work being done in the broadcast space
(e.g. BBC R&D with VC-2 [1]) and in the various low-latency implementations of
H.265, which seem to be driven by UAV use. Are you guys/girls expanding on
something existing for the codecs used, or going down a completely different
route?

[1]
[http://www.bbc.co.uk/rd/projects/vc-2](http://www.bbc.co.uk/rd/projects/vc-2)

------
microcolonel
Maybe they do a final motion adjustment on the receiver, just at the video
level. Seems slightly interesting. I doubt they've made any breakthroughs in
compression. Maybe eye tracking could reduce the bandwidth even further
(though the latency in this case means it can't go as far as in a wired
installation).

------
rubicon33
I'm really curious about how VR will solve the movement problem.

See, a virtual environment is only part of the equation. Oculus peeled the
onion back further with Touch (and the Vive's equivalent controllers).

How will they solve movement? Using a pad to move forward or backward is not
nearly as immersive as walking...

~~~
feelix
I've always wondered why they don't stick a camera on the front of it and then
overlay the virtual world onto the physical world.

~~~
chrischen
The Vive has this feature, although it's not used in games.

~~~
dogma1138
It is used for overlaying some info, like the room borders and barriers.

It is exposed to the devs, but the quality of the camera doesn't really allow
for an AR experience, so it's not really used.

