AR app using UBT sample


I am sorry in advance for my poor English skills.

Using Unity, I want to make an iPad app that can place virtual objects in the real world, like a HoloLens app.
sample image of my app

The app is used in a small room (10 m × 4 m), and it allows the user to walk around the room. So I think StructureUnityUBT is a good example for my app, because it can track the user’s position in the real world.

However, the StructureUnityUBT sample can’t show the camera image (it doesn’t access the color camera). As a result, unlike HoloLens apps, my app can only show a VR world to the user and can’t show the real-world view on the iPad. So my questions are:

  1. How can I access the iPad camera while using UBT and show the camera image on the iPad?
  2. Or is there any other solution to track the user’s position and show the iPad’s (real) camera image for an AR app? (I tried the StructureUnityAR sample, but its tracking capability is inferior to the StructureUnityUBT sample.)


Using some parts of the StructureUnityAR SDK, I was able to build the app described above. The steps are roughly as follows.

  • add StructureAR.h and its implementation file to the StructureUnityUBT project (and some other dependencies)
  • in StructureWrapper+Camera, add the following code to bind the color buffer:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // add these two lines inside the existing delegate method:
        [[StructureAR sharedStructureAR] uploadGLColorTexture:sampleBuffer];
        [_sensorController frameSyncNewColorBuffer:sampleBuffer];
    }
  • then, on the Unity side, add Manager.cs and CameraViewScript.cs from the StructureUnityAR sample project
  • to debug it, the setupWirelessDebugging method was very useful !!!
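For anyone following along: the thread doesn’t show what CameraViewScript.cs actually contains, but the general idea — rendering the color-camera feed behind the tracked 3D scene — can be sketched like this in Unity C#. All class and member names here are my own assumptions, not the sample’s code:

```csharp
using UnityEngine;

// Hypothetical sketch: show the color-camera feed behind the 3D scene by
// texturing a quad that is parented to the AR camera and pushed near the
// far end of its view frustum, so tracked objects render in front of it.
public class CameraBackgroundSketch : MonoBehaviour
{
    // In the real sample this texture is filled from native code
    // (via uploadGLColorTexture:); here it is just a public slot.
    public Texture cameraTexture;
    public Camera arCamera;

    void Start()
    {
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.transform.SetParent(arCamera.transform, false);

        // Place the quad just inside the far clip plane and scale it
        // to cover the whole view.
        float dist = arCamera.farClipPlane * 0.95f;
        quad.transform.localPosition = new Vector3(0f, 0f, dist);
        float height = 2f * dist *
            Mathf.Tan(arCamera.fieldOfView * 0.5f * Mathf.Deg2Rad);
        quad.transform.localScale =
            new Vector3(height * arCamera.aspect, height, 1f);

        var mat = new Material(Shader.Find("Unlit/Texture"));
        mat.mainTexture = cameraTexture;
        quad.GetComponent<Renderer>().material = mat;
    }
}
```

Attach it to any object, assign the camera and the texture the native plugin writes into (for a GL texture uploaded from native code, something like Texture2D.CreateExternalTexture would be the usual bridge), and the quad fills the view behind the scene.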


Hi Blue0513,

I was trying to do the same thing!

In Unity, can you save objects at specific locations and persist them to the drive, like on HoloLens?


Hi Riccardo,

Yes, I could place the objects at specific positions, with the same position-tracking accuracy as when using the StructureUnityUBT SDK.


With the StructureUnityUBT SDK, I tried the MotionLoggingEnabled scene, but it’s quite basic.

Could you tell me how to proceed to create a mixed-reality scene?

Do I have to save the GameObject on the plane?

for example like this:


Yes, sure!! (But what is the best way to explain to you how to do it?)

But what do you mean by “MR”?
If you mean an app that can detect the ground we stand on, that’s not what my app does: my app just stores the user’s position (x, y, z) and the objects’ positions (x, y, z), and reflects them in the view on the tablet.
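A minimal sketch of that “store positions and reflect them” idea, as hypothetical Unity C# (the class and field names are my own, not code from this app): each object saves its position in the tracked world frame to a JSON file and restores it on the next launch.

```csharp
using System.IO;
using UnityEngine;

// Hypothetical sketch: persist an object's position (in the UBT-tracked
// world frame) so it reappears at the same spot on the next launch.
[System.Serializable]
public class SavedPose
{
    public Vector3 position;
}

public class PersistentObject : MonoBehaviour
{
    string PathForObject()
    {
        return Path.Combine(Application.persistentDataPath, name + ".json");
    }

    void Start()
    {
        // Restore the previously saved position, if any.
        if (File.Exists(PathForObject()))
        {
            var pose = JsonUtility.FromJson<SavedPose>(
                File.ReadAllText(PathForObject()));
            transform.position = pose.position;
        }
    }

    public void Save()
    {
        var pose = new SavedPose { position = transform.position };
        File.WriteAllText(PathForObject(), JsonUtility.ToJson(pose));
    }
}
```

Note that restored positions are only meaningful if the tracking session starts from the same physical origin and orientation each time; otherwise the relative-position and drift caveats mentioned in this thread apply.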


I have 2 rooms and I would like to place my 3D objects (like on HoloLens), for example on a wall or on the floor.

When the app starts, the objects should be in the same positions, but in mixed reality (augmented reality), without a virtual plane (not the VR example scene).

I’m using Unity for the application.


Hi Riccardo.

You said you want to place objects in 2 rooms. My conclusion is that this would be difficult with this app, because the app only memorizes the user’s position relative to its starting point, so the positional error accumulates. On top of that, it does not handle occlusion, so it can’t hide an object behind a room’s walls.


Hey, I know it’s an old post, but I’m struggling with the same issue.
I added StructureAR.h and the dependencies to the project, added the code to StructureWrapper+Camera, and added Manager.cs and CameraViewScript.cs in Unity.
I am able to get the camera view to show, and it moves with the sensor. The problem now is that nothing else moves with the sensor anymore: the objects in the scene are static, so I’ve lost the UBT tracking.
(I’m running this on SDK 0.7.1.)

Update on this: I managed to get the camera view to show and the other objects to move as well. The issue now is just that the tracking is terrible: the objects still move a bit with the camera, and the UBT tracking coordinates also jump suddenly (e.g. from 1.5 to 0.3) if you move the sensor just a little.



I’m trying to achieve the same thing. Using the method described above didn’t work at all for me; it just made the tracking glitch out, or the app crashed. I suspect this is because the app was receiving positioning data from both the AR and UBT sides of things. I’ve now got it so that the UBT tracking works with what looks like the relevant AR code, but I can’t get it to render the camera. When I add logging, it says it’s uploading the camera frames to the texture buffer, but I just see the Unity horizon. Does anyone have any pointers, or a sample project I could look at?