You know, in the end, maybe what I'm asking from this sensor is unrealistic.
I've been using sensors like these for at least five years - I've had the Asus Xtion, Kinect v2, RealSense R200, etc.
None have been really good at applying textures over the mesh generated from the point cloud, though with varying degrees of success.
Angela Dai, Matthias Niessner and the other folks at Stanford have developed at least three pieces of code that aim to apply textures and do the mapping in real time. Almost always the mesh itself is nice enough, detailed and all, but the loop closure is defective, so the best you end up with is a decent RGB point cloud with textures that are all blurry.
Structure suffers from the same issue - even when using Room Capture and taking your time, there are still triangles all over the place, and for some reason the textures taken by the awesome camera of an iPad Pro don't look nearly as nice when scanning - probably because it can't refocus continuously.
So these scanners are nice to have for scanning small objects or bodies, or for inverse tracking in VR - but they don't do a good job scanning interiors if you need nice, crisp textures at the end.
I'll probably revert to using a Faro or Leica with HDR cameras, or maybe even a Matterport.
Thanks for chipping in though - obviously a great deal of work and knowledge is going into this little piece of equipment!
If you could find some time to publish a 3D model of the bracket for the iPhone 7, and sort out the Room Capture glitches when remeshing and applying textures - it'd be awesome.
Edit: that's awesome - exactly what I would need, though I'm not sure about the texturing quality: http://graphics.stanford.edu/projects/bundlefusion/