I’m looking for help getting higher-resolution RGB images when using the following configuration:

```swift
kSTStreamConfigKey: NSNumber(value: STStreamConfig.registeredDepth640x480.rawValue as Int),
kSTFrameSyncConfigKey: NSNumber(value: STFrameSyncConfig.depthAndRgb.rawValue as Int),
```
Later I set the capture session preset:

```swift
captureSession!.sessionPreset = AVCaptureSessionPreset640x480
```
Any other session preset gives me a runtime error before `sensorDidOutputSynchronizedDepthFrame()` gets called. I believe higher-resolution color capture is possible, but I haven’t found any good documentation on how to do it. Should I be looking more at iOS’s AVCapture APIs to solve this, or at Structure’s SDK?
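For reference, here is a minimal sketch of what I’m attempting. This is my own guesswork, not verified against the SDK headers: I’m assuming `startStreaming(options:)` is the Swift name for the SDK’s start-streaming call, and `AVCaptureSessionPreset1280x720` is just an example of a higher preset I’d like to use:

```swift
import AVFoundation
// Structure SDK import assumed; the exact module name may differ.

// Sketch of my setup, under the assumptions above.
func setUpStreaming(sensorController: STSensorController,
                    captureSession: AVCaptureSession) {
    // Depth registered to color at 640x480, with depth/RGB frame sync,
    // as in the configuration above.
    let options: [AnyHashable: Any] = [
        kSTStreamConfigKey: NSNumber(value: STStreamConfig.registeredDepth640x480.rawValue),
        kSTFrameSyncConfigKey: NSNumber(value: STFrameSyncConfig.depthAndRgb.rawValue),
    ]
    try? sensorController.startStreaming(options: options)

    // Ask AVFoundation for a higher-resolution color stream.
    // This is where the runtime error occurs for me whenever the
    // preset is anything other than 640x480.
    if captureSession.canSetSessionPreset(AVCaptureSessionPreset1280x720) {
        captureSession.sessionPreset = AVCaptureSessionPreset1280x720
    }
}
```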
On a side note: I am new to the Structure Sensor and haven’t found any detailed tutorials. Has Occipital released a walkthrough for building sample applications? For me personally, those really help reduce the large barrier to entry.