Yeah, all of this was just iPhone -- I did the capture with Matterport's app. Some of the final results didn't have great lighting, etc., and a laser scanner would give much better quality. I mainly wanted to see how this kind of capture compares to the other cameras I usually work with -- since it's so much faster, it's better suited to creating educational materials than to scientific recording.
The iPhone LiDAR results are messier, and the 360 images took a bit of time to edit, both as equirectangulars and as cubemaps -- brightening certain areas, fixing bad stitching, removing people, etc.
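(If anyone wants to do the equirectangular-to-cubemap step in the browser rather than in an image editor, three.js has a built-in path via WebGLCubeRenderTarget.fromEquirectangularTexture -- a rough sketch, where the file name and face size are just placeholders:)

```ts
import * as THREE from 'three';

// Sketch: convert an equirectangular pano into a cubemap at runtime.
// 'pano_equirect.jpg' and the 1024px face size are placeholders.
const renderer = new THREE.WebGLRenderer();
const scene = new THREE.Scene();

new THREE.TextureLoader().load('pano_equirect.jpg', (equirect) => {
  const cubeRT = new THREE.WebGLCubeRenderTarget(1024); // per-face resolution
  cubeRT.fromEquirectangularTexture(renderer, equirect);
  scene.background = cubeRT.texture; // a CubeTexture, also usable as an envMap
  equirect.dispose(); // source texture no longer needed after conversion
});
```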
Since the last project I'd been working on a new system in three.js to make the tours more game-like, so I brought the 360 images and 3D data into it, cut out the highlight overlays from the 360 images, added the netherworld environment art, and coded up shaders for various effects in the storytelling.
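(For anyone curious what the shader side looks like, here's a minimal sketch of a 360 sphere with a custom ShaderMaterial and a simple fade uniform -- the texture path and uniform names are placeholders, not the actual tour code:)

```ts
import * as THREE from 'three';

// Minimal 360 viewer with a custom shader -- just a sketch.
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 100);

// Equirectangular pano mapped onto the inside of a sphere.
const pano = new THREE.TextureLoader().load('pano.jpg'); // placeholder path
const geometry = new THREE.SphereGeometry(50, 64, 32);
geometry.scale(-1, 1, 1); // flip so the texture renders on the inside

const material = new THREE.ShaderMaterial({
  uniforms: {
    map: { value: pano },
    uFade: { value: 1.0 }, // animate this to fade scenes in and out
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform sampler2D map;
    uniform float uFade;
    varying vec2 vUv;
    void main() {
      vec4 c = texture2D(map, vUv);
      gl_FragColor = vec4(c.rgb * uFade, c.a);
    }
  `,
});

scene.add(new THREE.Mesh(geometry, material));

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```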