Guide to setting up a Swift project


For those of you out there who are looking to use Swift to play around with the Structure Sensor, here’s a guide to getting a project set up so you can build and run on your iOS device.

  1. Download the SDK (If you haven’t requested a download link yet, request one at

  2. Create a new Xcode iOS application project. Set the language to Swift.

  3. From the downloaded SDK, add Structure.framework to your project. Just drag it into the project hierarchy, or use ‘Add Files to Project…’

  4. Go to Build Settings
    a) search for ‘Other linker flags’, and add -lc++
    b) search for ‘Header Search Paths’, and add $(PROJECT_DIR)/Structure.framework/Headers

  5. Go to Build Phases

    1. Expand ‘Link Binary With Libraries’. Structure.framework should already be listed.

    2. Use the plus to add the following frameworks:

  6. Find the Info.plist under ‘Supporting Files’

    1. Add a row, enter ‘Supported external accessory protocols’

    2. Expand it, hit the plus button twice, then set the item values to:

  7. Either manually create and set the bridging header, or just add an Objective-C file to the project to create one automatically.

  8. In the bridging header, add:

    #define HAS_LIBCXX
    #import "Structure.h"
    #import "StructureSLAM.h"

    Note: You can optionally use the build settings ‘Preprocessor Macros’ to define HAS_LIBCXX instead.

  9. You can now create a Swift class that conforms to STSensorControllerDelegate, and implement the required methods:

     func sensorDidConnect() {}
     func sensorDidDisconnect() {}
     func sensorDidStopStreaming(reason: STSensorControllerDidStopStreamingReason) {}
     func sensorDidLeaveLowPowerMode() {}
     func sensorBatteryNeedsCharging() {}
  10. Set the delegate via STSensorController.sharedController().delegate = <your instance here>
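Putting steps 9 and 10 together, the skeleton might look like this (a sketch in the Swift 1.x syntax of the time; the class name and doing the delegate assignment in viewDidLoad are my own choices):

```swift
import UIKit

class ViewController: UIViewController, STSensorControllerDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Step 10: register this instance as the sensor delegate.
        STSensorController.sharedController().delegate = self
    }

    // Step 9: the required STSensorControllerDelegate methods.
    func sensorDidConnect() {}
    func sensorDidDisconnect() {}
    func sensorDidStopStreaming(reason: STSensorControllerDidStopStreamingReason) {}
    func sensorDidLeaveLowPowerMode() {}
    func sensorBatteryNeedsCharging() {}
}
```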

At this point you should be able to build and deploy to an iOS device (provided you are set up with an Apple developer account).

Of course, all this does is get the project building. To actually start making use of the sensor, you have to handle the sensor data by implementing one (or more) of the delegate functions:

    func sensorDidOutputDepthFrame(depthFrame: STDepthFrame!) {}

    func sensorDidOutputInfraredFrame(irFrame: STInfraredFrame!) {}

    func sensorDidOutputSynchronizedDepthFrame(depthFrame: STDepthFrame!, andColorBuffer sampleBuffer: CMSampleBuffer!) {}

    func sensorDidOutputSynchronizedInfraredFrame(irFrame: STInfraredFrame!, andColorBuffer sampleBuffer: CMSampleBuffer!) {}
  1. Before most of the useful functions will work, the library/sensor needs to be initialized.

    let result = STSensorController.sharedController().initializeSensorConnection()
    if result == .AlreadyInitialized || result == .Success {
        // streaming can now be started.
    }

**Note**: If you are getting result == .OpenFailed, then it's likely you haven't added the 'Supported external accessory protocols' entries correctly in Step 6.
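For a fuller connection check, you can switch on the result. A sketch (the extra case names here are from the SDK headers of that era as I remember them, and `startStreaming()` is a hypothetical helper that wraps the streaming call below):

```swift
let result = STSensorController.sharedController().initializeSensorConnection()
switch result {
case .Success, .AlreadyInitialized:
    startStreaming() // hypothetical helper calling startStreamingWithOptions
case .OpenFailed:
    // Usually means the 'Supported external accessory protocols' entries are wrong.
    NSLog("Sensor open failed - check Info.plist")
case .SensorNotFound:
    NSLog("No Structure Sensor detected")
default:
    NSLog("Sensor initialization failed")
}
```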
  1. To actually start getting data from the device, you need to start streaming:

    let options : [NSObject : AnyObject] = [
        kSTStreamConfigKey: NSNumber(integer: STStreamConfig.Depth640x480.rawValue),
        kSTFrameSyncConfigKey: NSNumber(integer: STFrameSyncConfig.Off.rawValue),
        kSTHoleFilterConfigKey: true
    ]
    var error : NSError? = nil
    STSensorController.sharedController().startStreamingWithOptions(options, error: &error)

With the above options, the sensorDidOutputDepthFrame function should now be called with each depth frame.
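A minimal implementation just to confirm frames are arriving could be something like this (a sketch; `println` was the Swift 1.x print call, and width/height are the frame properties used later in this thread):

```swift
func sensorDidOutputDepthFrame(depthFrame: STDepthFrame!) {
    // Log the dimensions of each incoming depth frame.
    println("Got depth frame: \(depthFrame.width)x\(depthFrame.height)")
}
```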

Check out the reference material in the SDK about which options to use for which delegate function.

Hopefully this helps others out there who want to get started with Swift. I was able to get the basic depth frame rendering similar to what the viewer app does.


What an awesome contribution, thanks for your efforts!



Very cool, we just shared this via Twitter.


Great tutorial!

You should create a GitHub repo with the resulting code. That would be great.


As requested, the results of following the above steps are now on GitHub. The branch basic_setup corresponds to the steps above, and master has the 640x480 depth frame being rendered.

Note: Step 3 will have to be redone as the repo doesn’t include the actual framework.


I got the source files/project from GitHub and added the Structure framework from SDK 0.5, and it seems to be broken. I am a total Xcode newbie, but I imagine some of the naming changed between the SDKs? Any help would be greatly appreciated!!

    var floatDepth = STFloatDepthFrame()

use of an unresolved identifier 'STFloatDepthFrame'

    toRGBA = STDepthToRgba(streamInfo: STSensorController.sharedController().getStreamInfo(.Depth640x480), options: toRGBAOptions, error: nil)

STSensorController does not have a member named 'getStreamInfo'

    func sensorDidOutputDepthFrame(depthFrame: STDepthFrame!) {
        if let renderer = toRGBA {
            statusLabel.text = "Showing Depth \(depthFrame.width)x\(depthFrame.height)"
            var pixels = renderer.convertDepthFrameToRgba(floatDepth)
            depthView.image = imageFromPixels(pixels, width: Int(renderer.width), height: Int(renderer.height))
        }
    }

ViewController does not have a member named 'floatDepth'


I would suggest opening an Issue on GitHub.


I’ve updated the code to use the 0.5 SDK. Things are nicely simplified. Let me know if you have any issues.

The instructions above still seem to be valid though - it’s only the conversion of DepthFrame to DepthFrameFloat (which is no longer necessary) and the setup of the depth to color converter that changed.


Thank you for the help and support!!!
Build completes successfully now. I didn't realize that you cannot connect to the sensor with the simulator. Bummer. Is there any way to test builds for free without fully authoring them? Even if it's just to a local Apple device without full deployment? Another thread mentioned building OpenNI drivers to allow the sensor to connect in the simulator… Has anyone had success with this? Wish list for SDK 0.6 :slight_smile: ?


Thank you for this tutorial. I have got the sensor hooked up and working, and all looks good.

I could however do with a bit of a steer on how to get the SLAM engine to work. I have looked at the RoomCapture source and can see the principles, but I am missing something in translating the concepts across correctly. Does anyone have a Swift fragment for invoking STTracker correctly? Thanks.


Has anybody got any ideas why this does not work?

I have included the items which I believe are relevant

    class BaseRoomCapture: GLKViewController, STSensorControllerDelegate {

        let gLContext : EAGLContext = EAGLContext(API: EAGLRenderingAPI.OpenGLES2)

        override func viewDidLoad() {
            STSensorController.sharedController().delegate = self
            (self.view as! GLKView).context = gLContext
            NSNotificationCenter.defaultCenter().addObserver(self, selector: "appDidBecomeActive", name: UIApplicationDidBecomeActiveNotification, object: nil)
        }

        func configureSlam(myContext: EAGLContext) -> Bool {
            if tryInitializeSensor() {
                let myScene : STScene = STScene(context: myContext, freeGLTextureUnit: GLenum(GL_TEXTURE2))
                // working out how to set up tracker in swift
                let trackerOptions : [NSObject: AnyObject] = [
                    kSTTrackerTypeKey: NSNumber(integer: STTrackerType.DepthAndColorBased.rawValue),
                    kSTTrackerTrackAgainstModelKey: NSNumber(bool: false),
                    kSTTrackerQualityKey: NSNumber(integer: STTrackerQuality.Accurate.rawValue)
                ]
                var trackerInitError: NSError? = nil
                STTracker(scene: myScene, options: trackerOptions, error: &trackerInitError)
                if (trackerInitError != nil) {
                    NSLog("Error during STTracker init: %@", trackerInitError!.description)
                }
                let volume: NSArray = NSArray(array: [128, 128, 128])
                let mapperOptions: NSDictionary = NSDictionary(objectsAndKeys: volume)
                let mapper: STMapper = STMapper(scene: myScene, options: mapperOptions as [NSObject : AnyObject])
                // configure mapper
                mapper.liveTriangleMeshEnabled = false
                mapper.liveWireframeMeshEnabled = true
                mapper.liveWireframeMeshSubsamplingFactor = 2
                let volumeInMeters: GLKVector3 = GLKVector3Make(3.0, 3.0, 3.0)
                mapper.volumeSizeInMeters = volumeInMeters
                let strategy = STCameraPoseInitializerStrategy.GravityAlignedAtVolumeCenter
                let cameraPoseOptions: [NSObject: AnyObject] = [
                    kSTCameraPoseInitializerStrategyKey: strategy.rawValue]
                var cameraPoseInitError: NSError? = nil
                let cameraPose: STCameraPoseInitializer = STCameraPoseInitializer(volumeSizeInMeters: volumeInMeters, options: cameraPoseOptions, error: &cameraPoseInitError)
                return true
            }
            return false
        }
    }


Currently the code is failing on initialise when I have the scanner plugged in, and I can't get much insight into what I have mis-configured. Anything obvious?


I’m not able to debug directly right now, but a suggestion: are you using STWirelessLog? I would definitely try to use this to debug errors that happen when you plug the sensor in. Enable wireless log, and add lots of logging to see where it’s failing.
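For anyone looking for the incantation, I believe it's a one-liner along these lines (the exact signature may vary by SDK version; the IP address and port are placeholders for a machine on your network running a listener):

```swift
var error: NSError? = nil
// Send all NSLog output to a remote console over the network.
STWirelessLog.broadcastLogsToWirelessConsoleAtAddress("192.168.1.10", usingPort: 4999, error: &error)
if error != nil {
    NSLog("Couldn't start wireless log: \(error!.localizedDescription)")
}
```

On the receiving machine, something like `nc -lk 4999` should print the incoming log messages.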


Hi Jeff,

Thank you for the input.

It looks like something is not being initialised correctly as I am getting an EXC_BAD_ACCESS exception.

In case it was an OpenGL Swift issue I have also tried setting STScene as follows with the same outcome.
let myScene : STScene = STScene(context: (self.view as! GLKView).context, freeGLTextureUnit: GLenum(GL_TEXTURE2))

Is it the way I am calling STTracker?



Has anyone got the SLAM to work with Swift?

I am pretty much stuck. I think it is a problem with the Swift binding providing the context from GLKView, but all I can get out of the debugger is stack dumps, which trace back to STScene, and it looks like my down-casting of the context to EAGLContext is resulting in a null object.

Any suggestions would be very welcome.


Hi Dave,

I’d like to help you get past this. If you forked StructureViewerSwift, could you post the URL and I will take a look at the errors?




Hi Jeff,

Thank you for the response. It is not forked, as I have been building from scratch - just my technique for really getting to grips with an SDK. I have published it here

It is very rough, as I have been trying various options in case the way I am working with GLKit is causing the issue.




Sorry for not getting around to doing this sooner, but I’ve just updated the StructureViewerSwift github project to work with Swift 2, Xcode 7 and Structure SDK 5.4.

I’ve also added a .gitignore entry for the SDK to avoid accidental commits.


Don’t know if you found this out when it was announced, but Apple recently changed their policies to allow you to build and run on your own device without an Apple developer account.


Thank you for your update.
I tried to import the project into Xcode, and it is impossible for me to compile it. I get the following errors:
<unknown>:0: error: failed to import bridging header '/Users/user/xcode_project/StructureViewerSwift-master/StructureViewerSwift/StructureViewerSwift-Bridging-Header.h' and Structure.h is not found.

I did import the framework into the project, but I still get the errors…

Thank you in advance !


This error is likely the result of either ‘Header Search Paths’ not being set up correctly, or the framework having been imported to a different path than expected.

If you go back to build settings, search for ‘Header Search Paths’, and double click, you should see something like:

$(inherited) $(PROJECT_DIR)/Structure.framework/Headers

If that’s correct, then likely you’ve imported the Structure framework to a different folder than I did.
In Finder, I have Structure.framework located at the same level as the .xcodeproj file. Chances are yours is one level deeper than that. You can either re-import the framework at the top level, or try setting the header search path to:
I don’t recall though if that will require changes elsewhere, so I’d go with re-importing.

Good luck.