I have some questions regarding RGB colors, because I simply can't get the conversion to work properly.
I am using the luma and chroma textures as they are created by the SDK, together with the conversion provided in the YCbCr shader from customShaders.h (it's actually the same shader, just exported), to generate RGB colors from luma and chroma. But all I get is a greenish picture, similar to "How to convert a UIImage to a sample buffer".
Why does this happen? The only difference between the SDK's approach and mine is that I am using a custom "external" GLSL shader. Could it be that the chroma texture is 2 bytes per pixel, so that reading values from it via UV coordinates only yields half the information I actually need?
(Just an idea that popped up while reading the post linked above.)
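For reference, here is a minimal sketch of a BT.601 video-range YCbCr-to-RGB conversion in C (the function name is mine, and the coefficients are the commonly used ones; the SDK's shader may use a different matrix or full-range values). It also shows why a chroma channel that reads back wrong tints the image green:

```c
#include <math.h>

/* Hypothetical helper: converts one video-range (BT.601) YCbCr sample,
 * with all inputs normalized to [0,1], to RGB. A real shader would also
 * clamp the results to [0,1]. */
static void ycbcr_to_rgb(float y, float cb, float cr,
                         float *r, float *g, float *b)
{
    float yp  = 1.164f * (y - 16.0f / 255.0f); /* expand video-range luma  */
    float cbp = cb - 0.5f;                     /* center chroma around 0   */
    float crp = cr - 0.5f;

    *r = yp + 1.596f * crp;
    *g = yp - 0.392f * cbp - 0.813f * crp;
    *b = yp + 2.017f * cbp;
}
```

Note what happens if the chroma texture is uploaded as a one-channel format instead of a two-channel one: the shader effectively sees only Cb, while Cr reads as a constant near 0. Plugging cr = 0 into the matrix above pushes G up and R down, which matches a greenish picture.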
And second, I can't seem to work with the depth and color textures simultaneously on the GPU.
I am using different texture units (0 and 1 for luma and chroma, 2 for depth), but whatever I do, depth always seems to be locked to a certain texture unit, so it ends up on the same unit as either luma or chroma, making it impossible to work with both efficiently.
It's just strange, since working with the color data or the depth data alone works fine (apart from the conversion error described above).
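In case it helps to compare: a "locked" sampler is often a sampler uniform that was never set, in which case it defaults to unit 0 and silently samples whatever is bound there. A sketch of the binding sequence (untested; it assumes a current GL context and a linked program `prog` with sampler uniforms named `uLuma`, `uChroma`, and `uDepth` — those names are placeholders for whatever your shader declares):

```c
glUseProgram(prog); /* glUniform* affects the currently bound program */

glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, lumaTex);
glUniform1i(glGetUniformLocation(prog, "uLuma"), 0);

glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, chromaTex);
glUniform1i(glGetUniformLocation(prog, "uChroma"), 1);

glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, depthTex);
glUniform1i(glGetUniformLocation(prog, "uDepth"), 2);
```

Note that `glUniform1i` takes the unit index (0, 1, 2), not the enum `GL_TEXTURE2` — passing the enum is another common way to end up sampling from the wrong unit.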
I hope someone can help or has a solution for this, even if it's just something simple that I overlooked somewhere.