Those 1000 clock cycles get you a perfect digital answer that is the same every time. This analog version gets you a faster answer with less accuracy and repeatability. This may be useful in some situations, but I would not call it general purpose.
I'm just speculating, but I don't see why you can't quantize the output electrically. I guess I'm suggesting a mixed-mode system, where an electronic CPU sends an array of values to an optical system that performs a calculation and returns a result electronically.
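A minimal sketch of that round trip, purely illustrative: the digital host quantizes values onto the analog side through a DAC, the "optical" stage is stood in for by a noisy dot product, and an ADC reads the result back. The bit widths, the noise level, and the function names are all assumptions, not a description of any real hardware.

```python
import numpy as np

def dac(x, bits=12):
    """Quantize host values onto the analog side (ideal DAC, assumed resolution)."""
    levels = 2 ** bits
    return np.round(np.clip(x, 0.0, 1.0) * (levels - 1)) / (levels - 1)

def optical_mac(a, b, noise=1e-3):
    """Stand-in for the optical computation: a dot product plus assumed analog noise."""
    return np.dot(a, b) + np.random.normal(0.0, noise)

def adc(x, bits=12, full_scale=1.0):
    """Quantize the analog result back to digital (ideal ADC, assumed resolution)."""
    levels = 2 ** bits
    return np.round(np.clip(x / full_scale, 0.0, 1.0) * (levels - 1)) / (levels - 1) * full_scale

a = np.random.rand(8) / 8          # keep the dot product inside the assumed full scale
b = np.random.rand(8)
exact = np.dot(a, b)
approx = adc(optical_mac(dac(a), dac(b)))
print(f"exact {exact:.6f}  round trip {approx:.6f}  error {abs(exact - approx):.2e}")
```

Even with ideal converters, the returned result only carries as many useful digits as the ADC and the analog noise floor allow, which is the point made in the reply below.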
Sure, you can quantize the output, but that is not going to help your accuracy. Today's GHz CPUs are 64-bit; today's GHz ADCs are 8 to 16 bits. That gets you ~3 to ~5 decimal significant figures for the analog path versus ~15.9 decimal significant figures for 64-bit floating point. And that's assuming your analog side is perfect, which is unlikely to say the least.
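A quick back-of-the-envelope check of those numbers, as a sketch: each bit of resolution is worth log10(2) ≈ 0.301 decimal digits, and an IEEE 754 double carries a 53-bit significand.

```python
import math

# Decimal digits of resolution for the ADC bit widths mentioned above.
for adc_bits in (8, 16):
    print(f"{adc_bits:2d}-bit ADC ≈ {adc_bits * math.log10(2):.1f} decimal digits")

# IEEE 754 double precision: 52 stored significand bits + 1 implicit bit.
print(f"float64    ≈ {53 * math.log10(2):.1f} decimal digits")
```

This prints roughly 2.4, 4.8, and 16.0 digits, which lines up with the ~3 to ~5 versus ~15.9 figures quoted.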