Creating UIImage from STInfraredFrame


#1

Hello,

I’m trying to create a UIImage from a given STInfraredFrame, but all I get is a black image. This is my code:

// Single-channel grayscale, 16 bits per pixel, big-endian, no alpha.
let colorSpace = CGColorSpaceCreateDeviceGray()
let bitmapInfo = CGBitmapInfo.byteOrder16Big.union(CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue))

// Wrap the raw UInt16 samples from the frame in a CGDataProvider.
let width = Int(irFrame.width)
let height = Int(irFrame.height)
let data = Data(bytes: irFrame.data, count: width * height * MemoryLayout<UInt16>.size)
let provider = CGDataProvider(data: data as CFData)!

// Build a 16-bits-per-component grayscale CGImage from the buffer.
let cgImage = CGImage(
    width: width,
    height: height,
    bitsPerComponent: 16,
    bitsPerPixel: 16,
    bytesPerRow: width * MemoryLayout<UInt16>.size,
    space: colorSpace,
    bitmapInfo: bitmapInfo,
    provider: provider,
    decode: nil,
    shouldInterpolate: false,
    intent: .defaultIntent)!

let image = UIImage(cgImage: cgImage)
return image

Thanks!


#2

Answering my own question -

Core Graphics on iOS does not support 16-bit, single-component grayscale input. See the table of supported pixel formats in Apple's Quartz 2D Programming Guide:
https://developer.apple.com/library/archive/documentation/GraphicsImaging/Conceptual/drawingwithquartz2d/dq_context/dq_context.html#//apple_ref/doc/uid/TP30001066-CH203-BCIBHHBB

So we have two options: convert the data down to 8-bit grayscale (which is supported on iOS), or do the conversion to some other supported pixel format ourselves. A sketch of the 8-bit route is below.
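
Here's a minimal sketch, assuming irFrame.data is the SDK's UnsafeMutablePointer<UInt16> as in my code above. The helper name is my own, and the >> 8 scaling assumes the samples span the full 16-bit range; if the sensor uses fewer bits, the shift needs to shrink accordingly, which is exactly why I'm asking about the uint16 below:

import UIKit  // STInfraredFrame comes from the Structure SDK headers

// Hypothetical helper: converts a 16-bit IR frame to an 8-bit
// grayscale UIImage, a format Core Graphics on iOS does support.
func grayscaleImage(from irFrame: STInfraredFrame) -> UIImage? {
    let width = Int(irFrame.width)
    let height = Int(irFrame.height)
    let pixelCount = width * height

    // Scale each 16-bit sample down to 8 bits by dropping the low byte.
    // Adjust this shift if the sensor delivers fewer than 16 significant bits.
    var pixels = [UInt8](repeating: 0, count: pixelCount)
    for i in 0..<pixelCount {
        pixels[i] = UInt8(irFrame.data[i] >> 8)
    }

    let colorSpace = CGColorSpaceCreateDeviceGray()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue)
    guard let provider = CGDataProvider(data: Data(pixels) as CFData),
          let cgImage = CGImage(
              width: width,
              height: height,
              bitsPerComponent: 8,  // 8-bit grayscale is in Apple's supported table
              bitsPerPixel: 8,
              bytesPerRow: width,   // one byte per pixel, no padding
              space: colorSpace,
              bitmapInfo: bitmapInfo,
              provider: provider,
              decode: nil,
              shouldInterpolate: false,
              intent: .defaultIntent)
    else { return nil }

    return UIImage(cgImage: cgImage)
}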

Can someone from Occipital comment on why the infrared data is uint16 and not 8-bit? Do you use the extra 8 bits?

Thanks!


#3

Glad to see you were able to get somewhere with your issue.

The IR stream coming from the Structure Sensor is actually 10 bits per pixel. Unfortunately, this means we can't use uint8 for the IR stream.

We use uint16 to capture the data and truncate the unused upper bits when actually working with it.
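
If you're scaling the samples down to 8 bits (as in the sketch above), that means the shift should be 2 rather than 8: with >> 8, 10-bit values all land in 0...3 and render nearly black. A minimal sketch (the helper name and the defensive clamp are illustrative, not SDK code):

// Maps a 10-bit IR sample stored in a uint16 onto the 8-bit range:
// clamp defensively to 1023, then shift right by 2 (1023 >> 2 == 255).
func eightBitValue(from sample: UInt16) -> UInt8 {
    UInt8(min(sample, 1023) >> 2)
}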

Hope this information helps!