I am sorry in advance for my poor English skills.
Using Unity, I want to make an iPad app that can place virtual objects in the real world, like a HoloLens app.
[sample image of my app]
The app is used in a small room (10 m × 4 m), and it allows the user to walk around the room. So I think StructureUnityUBT is a good starting point for my app, because it can track the user's position in the real world.
However, the StructureUnityUBT sample can't show the camera image (it doesn't access the outward-facing camera). As a result, unlike HoloLens apps, my app can only show a VR world to the user and can't show the real-world view on the iPad. So my questions are:
- How can I access the iPad camera while using UBT, and show the camera image on the iPad?
- Or is there any other solution to track the user's position and show the iPad's (real) camera image for an AR app? (I tried the StructureUnityAR sample, but its tracking capability is inferior to the StructureUnityUBT sample.)
Using some parts of the StructureUnityAR sample, I was able to build the app described above. The steps are roughly as follows.
- Add StructureAR.mm and StructureAR.h to the StructureUnityUBT project (along with some other dependencies).
- In StructureWrapper+Camera.mm, bind the color buffer by adding one line inside the capture callback:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // ... existing code ...
        // Add this line to upload each camera frame as a GL color texture:
        [[StructureAR sharedStructureAR] uploadGLColorTexture:sampleBuffer];
    }
- Then, on the Unity side, add CameraViewScript.cs from the StructureUnityAR sample project.
- To debug it, the setupWirelessDebugging method was very useful!
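For anyone who just wants to check that the camera feed reaches Unity at all, here is a minimal sketch using only Unity's built-in WebCamTexture API. This is an assumption/simplification for testing purposes — it is not what CameraViewScript.cs does, and it does not integrate with the Structure SDK's GL texture pipeline; the class name SimpleCameraFeed is hypothetical:

    using UnityEngine;

    // Minimal sketch: display the device camera feed on this object's material.
    // Attach to a Quad (with an Unlit/Texture material) placed in front of the camera.
    public class SimpleCameraFeed : MonoBehaviour
    {
        WebCamTexture camTexture;

        void Start()
        {
            // With no device name given, Unity picks a default camera;
            // on iOS this is typically the rear-facing one.
            camTexture = new WebCamTexture();
            GetComponent<Renderer>().material.mainTexture = camTexture;
            camTexture.Play();
        }

        void OnDestroy()
        {
            // Stop the capture so the camera is released when the object goes away.
            if (camTexture != null)
                camTexture.Stop();
        }
    }

If the quad shows the live camera image with this script but not with the StructureAR-based setup, the problem is in the GL texture upload path rather than in camera access.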