Getting calibrated depth frame along with ARKit


#1

Hi everyone,

I’m working on a project that requires using ARKit together with the depth map from the Structure Sensor. For convenience, we would like to stick with the Structure SDK’s built-in calibration.

This is our current solution, which is far from perfect:

  • We first launch an STCaptureSession with kSTCaptureSessionOptionSensorAndIOSCameraSyncEnabledKey set to @YES
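
    A rough sketch of that first step, for completeness. The helper name is made up, any option keys beyond the sync key are omitted, and the delegate wiring is assumed to follow the usual STCaptureSession pattern:

    // Hypothetical helper for step 1; `self` is our capture session delegate.
    - (void)startSyncedSession
    {
        // Deliver synchronized depth + iOS color frames to the delegate.
        _captureSession = [STCaptureSession newCaptureSession];
        _captureSession.delegate = self;
        [_captureSession startMonitoringWithOptions:@{
            kSTCaptureSessionOptionSensorAndIOSCameraSyncEnabledKey : @YES,
        }];
        _captureSession.streamingEnabled = YES;
    }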

  • We get the first synced color frame

    case STCaptureSessionSampleTypeSynchronizedFrames:
    {
        _calibrationColorFrame = [sample objectForKey:kSTCaptureSessionSampleEntryIOSColorFrame];

        if (_calibrationColorFrame != nil) {
            // We got our calibration frame; tear down the synced session.
            [self stop];
        }
        break;
    }
    
  • We stop the STCaptureSession, start a new one streaming depth only, and start ARKit. We then register each incoming depth frame against this reference color frame:

    case STCaptureSessionSampleTypeSensorDepthFrame:
    {
        @synchronized(self) {
            _lastFrame = [sample objectForKey:kSTCaptureSessionSampleEntryDepthFrame];
            _coloredDepthBuffer = nil;

            if (_calibrationColorFrame != nil) {
                // Re-project the depth frame into the reference color camera.
                _lastFrame = [_lastFrame registeredToColorFrame:_calibrationColorFrame];
            }
        }
        break;
    }
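
    The restart part of that last step looks roughly like this. The helper name is hypothetical, and any extra capture session option keys are assumptions; the ARKit side is just a plain world-tracking session:

    // Hypothetical helper for step 3: depth-only restart plus ARKit.
    - (void)switchToDepthOnlyAndStartARKit
    {
        // Re-monitor without color sync so only sensor depth frames
        // arrive, freeing the color camera for ARKit.
        _captureSession.streamingEnabled = NO;
        [_captureSession startMonitoringWithOptions:@{
            kSTCaptureSessionOptionSensorAndIOSCameraSyncEnabledKey : @NO,
        }];
        _captureSession.streamingEnabled = YES;

        _arSession = [ARSession new];
        [_arSession runWithConfiguration:[ARWorldTrackingConfiguration new]];
    }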
    

It seems to work correctly. Has anyone found a better solution? We also tried recording an OCC file to obtain this reference color frame, but we ran into deserialization errors and couldn’t get it working.