
Could the sensors indeed be flooding the device with terabytes of data, while the device only samples that data at a more reasonable rate?



Well, from that perspective, an analog temperature sensor is flooding your ADC with infinite GB/s.
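To put a (made-up) number on that: however rich the analog signal is, the digitized stream is bounded by the converter. A minimal sketch, assuming a 1 MSa/s, 12-bit ADC:

  # Hypothetical numbers: the digitized rate is set by the ADC,
  # not by the "infinite" analog signal feeding it.
  sample_rate_hz = 1_000_000      # assumed 1 MSa/s ADC
  bits_per_sample = 12            # assumed 12-bit resolution
  bytes_per_second = sample_rate_hz * bits_per_sample / 8
  print(f"{bytes_per_second / 1e6:.1f} MB/s")   # -> 1.5 MB/s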

For further comparison: the fastest CPUs you can get nowadays have an aggregate memory bandwidth of ~90 GB/s using four memory channels.
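A rough cross-check of that figure, assuming quad-channel DDR4-2933 (the speed grade is an assumption; each 64-bit channel moves 8 bytes per transfer):

  # Back-of-the-envelope check of the ~90 GB/s figure.
  transfers_per_s = 2_933_000_000   # DDR4-2933: 2933 MT/s (assumed speed grade)
  bytes_per_transfer = 8            # one 64-bit channel
  channels = 4
  bandwidth = transfers_per_s * bytes_per_transfer * channels
  print(f"{bandwidth / 1e9:.0f} GB/s")   # -> ~94 GB/s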


If you have enough pins, a custom ASIC can do just about whatever you want. The data flowing into the HPU is likely huge, but it is processed down into something the CPU can deal with.


Yeah, well, 4x DDR4 DIMMs have 4x288 = 1152 pins. If you want to be two orders of magnitude faster than that, you're talking on the order of 100 000 pins, which is just absurd.
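A sketch of that scaling argument, under the simplifying assumption that bandwidth per pin stays constant:

  # If ~1152 pins buy you ~90 GB/s, scale pins linearly with bandwidth
  # (simplifying assumption: per-pin rate stays fixed).
  pins_now = 4 * 288                       # four DDR4 DIMM slots
  bandwidth_now_gb_s = 90
  target_gb_s = 100 * bandwidth_now_gb_s   # "two orders of magnitude faster"
  pins_needed = pins_now * target_gb_s / bandwidth_now_gb_s
  print(int(pins_needed))                  # -> 115200, on the order of 100 000 pins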


To give an example, a raw 4K stream at 12 bit is about 500 MB/s, so unless it has 2000 4K cameras, that seems unlikely.


I think you forgot to multiply by a frame rate; otherwise you don't have a "per second" unit. At 60 fps, it's 29.66 GiB/s.
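For what it's worth, one way to run the raw numbers, assuming 3840x2160 at 12 bits per pixel (counting 12 bits per channel instead roughly triples these figures):

  # Raw data rate for an assumed 3840x2160 stream at 12 bits per pixel.
  width, height, bits_per_pixel = 3840, 2160, 12
  frame_bytes = width * height * bits_per_pixel / 8        # ~12.4 MB per frame
  for fps in (30, 60):
      print(fps, f"{frame_bytes * fps / 1e6:.0f} MB/s")    # -> ~373 MB/s, ~746 MB/s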


Original Kinect data rates, measured empirically by a third party:

Colour: 10.37 Mb/s
Depth: 29.1 Mb/s
Skeleton: 0.49 Mb/s

So roughly 40 Mb/s (4 * 10^7 bits/s).

Does this device produce > 1 Tb/s (1 * 10^12 bits/s), i.e. 25,000 times as much data? I'd be surprised.
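Spelling out that ratio (stream rates taken from the figures above):

  # Claimed rate vs. the measured Kinect streams listed above.
  kinect_bits_s = (10.37 + 29.1 + 0.49) * 1e6     # ~40 Mb/s
  claimed_bits_s = 1e12                           # "> 1 Tb/s"
  print(f"{claimed_bits_s / kinect_bits_s:,.0f}x")   # -> ~25,025x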


Maybe we're counting photons now.



