I’m trying to use the Structure Sensor with the iOS camera function captureStillImageBracketAsynchronouslyFromConnection. I would like the associated depth frame to be aligned with one of the images in the bracket (any is fine). I am currently keeping a buffer of past depth images and picking the closest one in time, but this can only get me so close (ideally I’d like as little time as possible between the two frames).
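For anyone curious, the nearest-in-time lookup described above can be sketched roughly like this (the `DepthFrame` and `DepthFrameBuffer` names are mine, not from any SDK; in a real app the timestamps would come from the sample buffers’ presentation times):

```swift
import Foundation

// Hypothetical minimal depth-frame record: a capture timestamp (seconds)
// plus whatever depth payload the sensor delivered.
struct DepthFrame {
    let timestamp: TimeInterval
    let depthData: [Float]
}

// Fixed-capacity buffer of recent depth frames, oldest evicted first.
final class DepthFrameBuffer {
    private var frames: [DepthFrame] = []
    private let capacity: Int

    init(capacity: Int = 30) { self.capacity = capacity }

    func append(_ frame: DepthFrame) {
        frames.append(frame)
        if frames.count > capacity { frames.removeFirst() }
    }

    // Return the buffered frame whose timestamp is closest to `t`.
    func nearestFrame(to t: TimeInterval) -> DepthFrame? {
        return frames.min { abs($0.timestamp - t) < abs($1.timestamp - t) }
    }
}
```

With a 30 FPS depth stream and a ~30-frame buffer you hold about one second of history, so the nearest frame is at worst half a frame interval (~17 ms) away from any still-capture timestamp, assuming both clocks share the same time base.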
I’m also working on capturing nice still images while running the sensor. One thing I’ve thought about trying is upping the Structure Sensor’s frame rate to 60 FPS to get a closer temporal match. Good luck.
@nickmagus I’d like to better understand your use case for HDR image data.
I’m looking to produce good RGBD data for areas with strong lighting differences such as an area with a lamp or an indoor area partially illuminated by sunlight.
That makes sense. So you’re buffering depth images, taking a bracketed capture, and time-aligning the depth and image frames. Can you share a little bit about how you then use the aligned depth and image frames, and also expand on the comment: “this can only get me so close”?
This is very cool by the way!
@abryden regarding upping the frame rate, have you seen Jeff’s comments here regarding high resolution color?
Expanding on “this can only get me so close”: I get depth frames that are usually within 100 ms of one of the bracketed images. The exposure-bracketed images are aligned to whichever image is closest to a depth frame and merged using exposure fusion to create a single HDR image with its associated depth map. The only trouble is that 100 ms is enough time for the camera (which is handheld) to move significantly. My current idea is to synchronize to the RGB stream using the standard library function, save both the RGB and depth images, and later align them to the HDR image in the same way the bracketed images are aligned to each other. I was hoping there was a better way to align straight to the bracketed image, but it appears not.
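For anyone following along, the exposure-fusion merge mentioned above (Mertens-style fusion) boils down to a per-pixel weighted average, where each exposure is weighted by how close its pixel value is to mid-gray. A minimal grayscale sketch (function names and the default sigma are my own choices, not from any particular library):

```swift
import Foundation

// Well-exposedness weight from Mertens-style exposure fusion:
// values near mid-gray (0.5) get high weight; clipped shadows
// and blown highlights get weight near zero.
func wellExposedness(_ v: Double, sigma: Double = 0.2) -> Double {
    return exp(-pow(v - 0.5, 2) / (2 * sigma * sigma))
}

// Fuse one pixel across N exposures by a normalized weighted average.
// `exposures` holds the same pixel's value (0...1) in each bracketed shot.
func fusePixel(_ exposures: [Double]) -> Double {
    let weights = exposures.map { wellExposedness($0) }
    let total = weights.reduce(0, +)
    // If every exposure is badly clipped, fall back to a plain average.
    guard total > 0 else { return exposures.reduce(0, +) / Double(exposures.count) }
    return zip(exposures, weights).map(*).reduce(0, +) / total
}
```

A production implementation (such as OpenCV’s MergeMertens) also weights by contrast and saturation and blends in a multiresolution pyramid, but the well-exposedness term above is the part that matters for HDR-from-brackets.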
@nickmagus are you able to do a bracket capture at the same time as video capture if you don’t lock the exposure? If so, I would be very interested for our app. I had written off HDR capture for our application, even though it would be very nice, because I thought this wasn’t possible.
If you want to talk offline send me a PM.
@jim_selikoff I saw it, but the iPad 2 30 FPS part didn’t register in my brain. I’ll try upping the depth resolution and see what it does for us! Thanks.
Can you please share a little bit more about how you captured the RGB stream together with the raw depth data? I’m very stuck on capturing it.
Thanks in advance.
It’s been a few years since I last worked with the sensor, but the basic principle is to calibrate the RGB camera on the phone/tablet itself to the sensor. This obviously works best with low parallax. If there is negligible movement between the two images, you can do this with a single transform calculated for the current setup. Feel free to PM me with more specifics about what you’re trying to achieve and maybe I can help, though again, it has been a while.
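To illustrate the “single transform” idea: when parallax between the two cameras is negligible, the depth-to-RGB mapping reduces to one precomputed 3×3 homography applied per pixel. A minimal sketch (the matrix here is a placeholder; a real one would come from your calibration):

```swift
import Foundation

// Apply a precomputed 3x3 homography H (row-major) to depth-image
// pixel (x, y), returning the corresponding color-image coordinate.
func applyHomography(_ H: [Double], x: Double, y: Double) -> (x: Double, y: Double) {
    precondition(H.count == 9, "expected a row-major 3x3 matrix")
    // Homogeneous transform: [u, v, w] = H * [x, y, 1]
    let u = H[0] * x + H[1] * y + H[2]
    let v = H[3] * x + H[4] * y + H[5]
    let w = H[6] * x + H[7] * y + H[8]
    // Perspective divide back to 2D.
    return (u / w, v / w)
}
```

If the cameras are offset enough that parallax matters, this breaks down and you need the full route instead: back-project each depth pixel to 3D with the depth camera’s intrinsics, transform by the calibrated extrinsics, and reproject with the color camera’s intrinsics.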