I am new to both OpenGL and the Structure SDK and am looking to learn more about how the sensor data interacts with OpenGL.
I have looked over the Scanner sample app and have been trying to render the mesh during scanning using the MeshRenderer class in place of the built-in STScene::renderMeshFromViewpoint function. I have been reading through the MeshRenderer.mm file to understand how it renders the mesh.
I am running into a problem where the mesh is not rendered to the screen when calling MeshRenderer::render from ViewController+OpenGL::renderSceneForDepthFrame. I have made the following changes to the Scanner app:
// The below was added to the end of ViewController+OpenGL::setupGLViewport
// The below is called in ViewController+OpenGL::renderSceneForDepthFrame in the "ScannerStateScanning" case
// instead of the call to STScene::renderMeshFromViewpoint
STMesh *mesh = [_slamState.scene lockAndGetSceneMesh]; // lockAndGetSceneMesh returns an STMesh pointer
The result is that the mesh is not rendered. I have narrowed the problem down to the glProjectionMatrix received from either the colorFrame or the depthFrame. If I make the following change:
// The below change was made in the protected function LightedGrayShader::vertexShaderSource in the CustomShader.h file
gl_Position = u_modelview * a_position; // u_perspective_projection removed from the original calculation
The model shows up, upside down and smaller than normal, but it shows up. As expected, the change to the shader affects the rendering of the mesh in Review Mode (MeshViewController) in the same way. The difference is that when the projection matrix is added back in, the mesh in Review Mode looks fine, whereas the mesh in Scanning Mode is not shown on the screen at all.
Any help would be much appreciated, as I am not yet proficient enough with either OpenGL or the Structure SDK to work out what is going wrong.
Thanks a bunch!