Recording depth & color and transferring to a (Linux) PC



I have an iPad Air 2, and I need to record the depth and color streams for a certain period and afterwards transfer them to my computer. What is the easiest way to do this? I don't have macOS and AFAIK therefore cannot write iPad apps.

Thank you,


Unless you have access to a Mac computer and Xcode with a developer license, you will not be able to use our Structure SDK.

You could use the Structure Sensor with our USB Hacker Cable on a Windows machine to record depth using either of the following open-source libraries:

The Structure Sensor does not, however, include a color camera, as we have designed it to use the built-in color camera on iOS devices.
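Whichever library you end up using, recording depth usually comes down to grabbing 16-bit frames from the capture API and writing them to disk. Here is a minimal sketch of a hypothetical raw recording format in Python — the file layout (a small width/height header followed by packed frames) is an illustration I made up, not a format either library defines; with a real sensor you would feed in frames from the library's capture loop instead of synthetic data:

```python
import struct

def write_frames(path, frames, width, height):
    """Write 16-bit depth frames to a simple raw file:
    an 8-byte little-endian header (width, height),
    then each frame as width*height uint16 values."""
    with open(path, "wb") as f:
        f.write(struct.pack("<II", width, height))
        for frame in frames:  # frame: flat list of uint16 depth values
            f.write(struct.pack("<%dH" % (width * height), *frame))

def read_frames(path):
    """Read back (width, height, frames) from the file above."""
    with open(path, "rb") as f:
        width, height = struct.unpack("<II", f.read(8))
        n = width * height
        frames = []
        while True:
            chunk = f.read(2 * n)
            if not chunk:
                break
            frames.append(list(struct.unpack("<%dH" % n, chunk)))
        return width, height, frames
```

A real recorder would also store a per-frame timestamp, but the round-trip above shows the basic idea.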


Hello Anthony, thanks for your answer! I have the hacker cable as well, but I need color too. So do I need a Mac to transfer the RGB-D stream? I read somewhere that there is an Uplink option? Does it upload depth and color? And is there somewhere I can find more information about this?


Ahh, I just assumed that you’d want to write your own specific program.

You can use our Skanect Software that runs on a PC with an iPad using the Uplink functionality, but you won’t be able to receive the raw RGB-D stream, as Skanect was designed for scanning objects and rooms.

For more information about Uplink and Skanect, have a look at the following pages:

There is an old GitHub Repo of Uplink code that you might want to look at, as well, but it might not be much use to you, as it is a very old and deprecated project: About uplink to server


The GitHub repository might be exactly what I am looking for. It does not matter if the code is quite old. But is the upload protocol still the same as in that implementation?


This is essentially the same code as what Skanect uses for its Uplink functionality, though, as noted, this is pretty old at this point and most likely will not work right off the bat with our Structure Application.


So I managed to compile the code (on Linux) and Uplink is showing in the structure app. Yet once selected I get a cryptic error message:
“Uplink failed to connect.
Service: CaptureReceiverExample at on port 6666.
Error: **** <- some garbage including various utf chars”

How can I continue from here? Is this because the code is old? Did the protocol change?
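For reference, before suspecting a protocol change I ruled out plain network problems with a basic reachability check — this is generic Python socket code, nothing Uplink-specific, and the host address in the comment is just a placeholder:

```python
import socket

def tcp_reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds.
    This only checks reachability; it says nothing about whether
    the Uplink handshake itself will succeed afterwards."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe the CaptureReceiverExample port from the error message.
# print(tcp_reachable("192.168.1.50", 6666))  # host is hypothetical
```

If this returns True but the app still fails with the garbled error, the failure is happening inside the handshake, which would point at a protocol mismatch rather than networking.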



Yeah, as stated previously, this code is pretty old and might not work.

Unfortunately, we don’t have any plans to resurrect this code base for public use, either.


I assume there is no way for me to find out the protocol used by the new version of the app. So basically I have to buy an iMac and write a streaming app myself?


We do not currently provide a transmit app, as noted in the most recent README file in the repo:

NOTE 2017.10.11 – A transmit app is not currently provided, but is planned. We will remove this message and provide instructions with how to access it when it is available.

We currently do not have a release date for such an application either.

Mainly, I pointed you in the direction of our Uplink repo to show you a code base from which you could build your own transmit/streaming application (this doesn’t necessarily have to be on iOS, as OpenNI 2 can be used with Android/Windows devices).
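If you do roll your own streaming app, the core of it is just framing depth images over TCP. Here is a minimal sketch of a hypothetical wire format (magic bytes plus length-prefixed frames) — to be clear, this is not the Uplink protocol, just an illustration of the kind of framing a custom transmit app would need:

```python
import struct

MAGIC = b"DPTH"  # hypothetical magic bytes, not part of any real protocol

def encode_frame(width, height, depth_bytes):
    """Prefix raw 16-bit depth data with magic, dimensions, and length."""
    header = MAGIC + struct.pack("<III", width, height, len(depth_bytes))
    return header + depth_bytes

def decode_frame(buf):
    """Parse one frame from a receive buffer.
    Returns (width, height, depth_bytes, remaining) on success,
    or None if more data is needed."""
    if len(buf) < 16:
        return None
    if buf[:4] != MAGIC:
        raise ValueError("bad magic")
    width, height, n = struct.unpack("<III", buf[4:16])
    if len(buf) < 16 + n:
        return None
    return width, height, buf[16:16 + n], buf[16 + n:]
```

The decoder is written to tolerate partial reads, since TCP delivers a byte stream rather than discrete messages; a real receiver would call it in a loop as data arrives on the socket.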


OK, thank you for all the information! :slight_smile:


Thanks @anthony.monaco!

This seems to be what I’m looking for. The goal is to capture depth data that can be used in video post-production.

Basically this:

Why mess with DepthKit and a Kinect if I’ve already invested in a Structure Sensor?

Any good tutorials or documentation on how to go about this?


I own a PC, a Mac, and an Android phone, as well as an iPhone 7 Plus.