I'm using the sensor with a Mac OS X computer and OpenNI2 v2.2. When I point the sensor at a static scene (e.g. a wall), without moving the sensor, and analyze the depth of a point in the image over time, I get varying values. It is not just noise; the depth seems to converge toward some value. If I print the depth of such a point to the terminal, I can see that it varies by more than 5 cm over 1 minute. It starts at a given value and then slowly drifts until it settles at a new one. If it started, for instance, at 1900 mm, it would end at approximately 1960 mm.
Is there any type of filtering/estimation algorithm in the device that can produce this behavior? If that is the case, can I disable it? I'm programming an application where it is very important to always get the same depth value for a given static point (with an error below 1 or 2 cm).
This behavior is more visible for image points that are far away from the image center. I think you should be able to reproduce the effect by modifying the main.cpp file in the SimpleRead sample. Just change the line "int middleIndex = (frame.getHeight()+1)*frame.getWidth()/2;" into "int middleIndex = 25664;" for instance.
Thanks in advance.