Structure sensor on Android


I received my preordered Structure Sensor. I downloaded OpenNI2 and created the installer. However, I am unable to figure out how to use the installer.

My intention is to create an Android app that receives data from the Structure Sensor. Please provide a tutorial or instructions for creating a sample Android app with the Structure Sensor.

Padmanabha VS


Hello Padmanabha,

We added some Android build instructions on

Can you confirm this is the procedure you followed?

If so, you should be able to start from there, using the NDK getting started instructions, here:

Thanks for your feedback!

Hi nt_,

Can you provide any more detailed instructions?

The build instructions don’t go very far, and they aren’t specific to OpenNI2, so I’m unsure how to connect to OpenNI2’s features, let alone the sensor, once I follow them.



Hi all!

I’m extremely interested in using the Structure Sensor with Android as well. I have to agree with padmanabhavs and Dylan that the instructions for Android are rather sparse. I’ve put together a more detailed set of instructions for folks to use (see below). Please note that these instructions currently only go up to the end of the GitHub steps and not all the way to a working application. Once I’ve progressed further, I’ll post additional steps and example code.

These instructions assume you are on a Linux operating system (preferably Ubuntu) as working with the Android Native Development Kit (NDK) is much easier on Linux than on Windows. I am working off of an Ubuntu 12.04 LTS (64 bit) install, but these instructions should work for most Ubuntu installations. (The Release Notes for Open NI 2 state that you must use “Ubuntu 12.04 (32/64/arm) and above”).  I am also assuming that you know how to use the terminal (command prompt for Windows users).

(Note: What is the Native Development Kit? The NDK is designed to let you develop Android apps with C and C++ code in addition to Java. This allows apps to run much faster (nearly everything is precompiled) and allows developers to reuse code created in C/C++ for other programs (like the Open NI 2 libraries). It should be pointed out that the apps do still require some Java coding as the entry/exit points of the app must be done in Java. Also, there is no way to directly access the Android RGB cameras from the C/C++ side of the program. Instead, each frame must be collected in Java then passed to the C/C++ methods. Structure data on the other hand is only accessible on the C/C++ side of the program as Open NI 2 is strictly native code. I would definitely recommend exploring the demos that come with the NDK before doing any development with the Structure Sensor if you’ve never used the NDK before.)

  1. (Optional yet recommended first step) Create an Android development folder that you can put all of your development programs and downloads into. Example: /home/NAMEOFUSER/AndroidDev

  2. Install the latest Android ADT Bundle from: . Make sure to get the version that fits your system (x86 vs x86_64). This will provide all of the required SDK items and an Eclipse work environment. Installation consists of downloading a zip file, extracting the contents, and then putting the extracted folder in your dev folder.

Once the ADT Bundle is ready, launch Eclipse (your dev environment) (extracted-adt-bundle-folder/eclipse/eclipse).  You should get a popup that asks you to set your workspace folder (generally /home/NAMEOFUSER/workspace).  This will contain your Android projects once you start developing (code, dependencies, etc.).

(MANDATORY STEP!!!) Once this is done, you need to open the Android SDK Manager (Eclipse task bar at the top of the program -> Window -> Android SDK Manager).  Here you need to download the SDK Platforms for any Android OSes you plan to develop for.  The SDK Platform is what allows you to actually create apps for Android.  (I’d also recommend grabbing the Samples for SDK, as they provide sample code showing how to use the various internal sensors, OpenGL, the camera, various GUI items, and plenty more.)  While the most recent version of the Android SDK should already be installed, you will need to get the version of the SDK that was supported at the time the Android NDK r8d came out (the only version of the Native Development Kit we can use with Structure).  Based on the NDK release notes, I believe this would be API version 14.

Note: You will need to build all of your apps at the API 14 (Android 4.0) level.  Any phones running Android 3.2 or earlier cannot run Structure apps!

  3. Download the zip file containing the Occipital branch of OpenNI 2: .  Extract it and put it in your dev folder.

  4. Install all of the Linux build prerequisites as directed by the OpenNI 2 readme: . I’m not sure whether all of these are necessary (GraphViz?), but you might as well install them just in case.  Plus, you’re now partially prepped for Linux development.

  5. Download the Linux version of the Android NDK r8d: .  Extract it and put it in your dev folder.

  6. Define NDK_ROOT to point to the path of your NDK r8d folder. The terminal command is: export NDK_ROOT="/home/NAMEOFUSER/REMAININGPATHTODEVFOLDER/android-ndk-r8d" .  The leading / before home is important: it makes the path absolute. If you omit it, the variable becomes a path relative to whatever folder happens to be current when it is used (a bad idea…).
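The NDK_ROOT step above can be sketched as follows; the dev-folder path is illustrative and should be adjusted to your own layout:

```shell
# Illustrative path -- adjust to where you actually extracted the NDK.
export NDK_ROOT="$HOME/AndroidDev/android-ndk-r8d"

# Sanity check: the value must be an absolute path (start with /),
# otherwise builds would resolve it relative to the current directory.
case "$NDK_ROOT" in
  /*) echo "NDK_ROOT is absolute: $NDK_ROOT" ;;
  *)  echo "NDK_ROOT is relative -- fix it before building" ;;
esac
```

Note that export only lasts for the current terminal session; to make it permanent, add the line to your ~/.bashrc.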

  7. Create the Android OpenNI 2 packages. Go to your “DEVFOLDER/OpenNI2-master/Packaging” folder.  Then run “python android” in the terminal.  This will create “the installer” in the Packaging/Final folder.  It appears to consist of more release notes, a changes document, and a .tar containing some .ini, .so, and executable files. Another folder (Packaging/AndroidBuild) is also created, containing folders typically used in Android NDK development (jni and obj) filled with precompiled files. Once I figure out how to actually use these for development purposes, I’ll post additional information.

7.1) Potential errors that may occur during step 7 if you’re using a 64-bit version of Ubuntu. Keep in mind that the NDK r8d was only provided in an x86 variety.

Problem: /prebuilt/linux-x86/bin/make: not found
Solution: sudo apt-get install libgcc1:i386

Problem: error while loading shared libraries:
Solution: sudo apt-get install lib32z1

Problem: error while loading shared libraries:
Solution: sudo apt-get install lib32stdc++6
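If you hit several of these at once, the three fixes above can be rolled into a single setup command (a sketch for a 64-bit Ubuntu host; package names are exactly those listed above):

```shell
# Install all three 32-bit compatibility packages needed by the x86 NDK r8d
# toolchain on 64-bit Ubuntu (covers the make, libz, and libstdc++ errors above).
sudo apt-get install libgcc1:i386 lib32z1 lib32stdc++6
```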

In case someone else knows what to do with this information before I figure it out, here is some information from the compiled release notes:


  • On Android, only native support (and samples) is currently provided. Please note that since bionic (the Android linker) does not support the rpath option, the samples cannot start as is. To solve this, do one of the following:
      - Copy the OpenNI libraries to /system/lib (requires root)
      - or run export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH before starting the native executable
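The second workaround amounts to prepending the current directory to the linker search path; a minimal sketch (the sample executable name is hypothetical):

```shell
# Prepend "." so the dynamic linker also searches the directory you launch from;
# the OpenNI .so files can then sit next to the sample binaries.
export LD_LIBRARY_PATH=.:$LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"

# Then start a sample from that same directory, e.g.:
# ./SimpleRead    (hypothetical sample executable name)
```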

Paul, this is absolutely fantastic, thank you! Looking forward to updates as you progress!

Finally getting back to working on this problem after some other critical priority stuff came up. Still no success, but I do have a few more useful bits of info if anyone else is trying to work on this as well.

  1. I can almost guarantee you’re going to have to root your phone to work with OpenNI… I found a blog post with fairly detailed instructions on how to get OpenNI working, and so far it looks like it’s impossible to use adb to push to /system without root access, since it’s a heavily protected part of the file system. I’m currently working on learning how to root my phone and trying to find an image for 4.0 or later.

  2. If you’re using a Nexus 4 (like I am), it is “officially” impossible to use anything connected by OTG-USB cable. The phone won’t provide power to any devices, and there is no functionality within the device for OTG. However, if you’re using a self-powered device (like the Structure) and download a user-created patch, you might be able to get it up and running. Note: This also requires rooting your phone… Additionally, Nexus 5 and 7 appear to work with OTG with no problems whatsoever, so no need for patches.

Hi @paul_sassaman Thanks so much for sharing your progress on developing an Android app using the Structure Sensor.

I understand that you work on Linux; by any chance would you be able to post some tips for Windows? Online resources have been rather limited.

Hi, all,
I am working on using the Structure Sensor on Android, and I am wondering if anyone could post some instructions for the Windows operating system, like paul_sassaman did for Linux. Thanks.


Can anyone help with getting the OpenNI2 drivers to work on Android?

I built OpenNI2 for Android on Windows by installing the Android NDK (version r8d), creating a working folder, creating a JNI sub-folder, copying the OpenNI2 source into it, and running ndk-build from the command line within the JNI folder. I think these are essentially the same steps as those performed by the script, except for the command-line switches that set the current folder (-C) and limit the number of “jobs” (-j8).
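For anyone unfamiliar with those switches: ndk-build is a thin wrapper around GNU make, so -C and -j behave exactly as in any make invocation. A tiny self-contained illustration (the throwaway directory and Makefile are made up for the demo):

```shell
# Create a throwaway directory containing a one-rule Makefile.
mkdir -p /tmp/ndk-switch-demo
printf 'all:\n\t@echo build-done\n' > /tmp/ndk-switch-demo/Makefile

# -C switches into the directory before reading the Makefile;
# -j8 permits up to eight parallel jobs.
make -C /tmp/ndk-switch-demo -j8
```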

I have modified the OpenNI2 source to force the driver path to be the sdcard rather than the default “./OpenNI/Drivers” in the hope of removing the need to root the Android device.

On my Nexus 7 (2012) UsbManager.getDeviceList() reports a device at: /dev/bus/usb/002/002 but my test app reports: Could not open “id27/0600@2/2”: Failed to open the USB device!

On another device I have tried, my test app crashes with Fatal signal 11 (SIGSEGV), code 1, when the OpenNI::initialize code tries to create a device from the library.

Any help or ideas would be much appreciated.

Hi Nick,

I had the same approach and went a little farther on my Nexus 7 (2012).
Android has a permission system for USB devices, and I first had to request permission and open the device inside the main Activity:

    usbManager = (UsbManager) getSystemService(Context.USB_SERVICE);
    HashMap<String, UsbDevice> listD = usbManager.getDeviceList();
    for (String key : listD.keySet()) {
        UsbDevice host = listD.get(key);
        // Pops up the system permission dialog; the result is delivered
        // asynchronously through the PendingIntent mPermissionIntent.
        usbManager.requestPermission(host, mPermissionIntent);
    }

and then initialize only once you receive confirmation:

private final BroadcastReceiver mUsbReceiver = new BroadcastReceiver() {
        public void onReceive(Context context, Intent intent) {
            String action = intent.getAction();
            if (ACTION_USB_PERMISSION.equals(action)) {
                synchronized (this) {
                    UsbDevice usbdevice = (UsbDevice) intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);

                    if (intent.getBooleanExtra(UsbManager.EXTRA_PERMISSION_GRANTED, false)) {
                        if (usbdevice != null) {
                            // Set up device communication here, e.g. open the
                            // device and hand its connection to the native side.
                            Log.i(tag, "Permission Granted for USB device " + usbdevice.getDeviceName() + " - " + usbdevice.getInterfaceCount() + " interfaces");
                            final UsbDeviceConnection conn = usbManager.openDevice(usbdevice);
                        }
                    }
                }
            }
        }
    };
On another, older device (a Samsung S2) I had to load all the libraries in the correct order to make it work; that might explain your crash, but in my case the error was obvious in logcat.


I was then able to open the device, select a video mode, and start reading depth frames, but all I got were corrupt frames from the sensor.

You might want to replace the code inside XnLogAndroidWriter.cpp or XnLog.cpp with calls to __android_log_print (which writes to logcat) for easier debugging.

BTW, I have put something together for easier integration into Eclipse or NetBeans; I could share it if you are interested.

I requested help from Occipital and had no response except being pointed to this forum (which obviously has little to offer for Android). I came to the conclusion that no one has ever made the Structure Sensor work on Android, and that Occipital’s website is misleading on this point. I would gladly accept any counter-evidence of a working setup.
I also realized that, even if I could make the depth and IR sensors work on Android, the RGB camera alignment code, the 3D reconstruction code, and the Skanect uplink code are not publicly available, so it would take A LOT of effort to make the Structure Sensor offer the same functionality as on the iPad (another misleading point if you’re not an expert on the topic). That is effort I am not willing to put in voluntarily without any support from Occipital.


Did you ever end up getting a reply from us? I assume you sent mail to . I want to make sure I understand why that would have fallen through the cracks. Thanks!



Thank you for your suggestions. I now can get the list of VideoModes from the sensor on my Nexus 7 (2012).

Unfortunately my other device - which is the one I need the sensor to work with - still crashes with a segmentation fault inside OpenNI::initialize. There is an issue with realpath on my device, so I have hard-coded the drivers path used by xnOSLoadLibrary in XnLinuxSharedLibs.cpp.

I now get “Failed to set USB interface!”.

Yes, the email was sent to (on October 29th); I can send it again if you wish.

I used another approach for the realpath issue; it may work for you. I store all the files in the “assets” folder and extract them to the cache dir before OpenNI init. The cache dir is almost always the same (/data/data//cache/), so I can use it as a hardcoded folder in OniContext.cpp.
Make sure you extract all the ini files (OpenNI.ini and PS1080.ini) and try setting the parameters listed in Occipital’s guide (I guess UsbInterface should be 0, even though it made no difference for me).
Also make sure you get the right location inside loadLibraries() (I hardcoded the 3 libs with ‘/data/data//lib/’).
If you want a clean solution instead of hardcoded paths, you will have to pass a folder string through the existing JNI functions (e.g. initialize) or fetch a Java field within JNI_OnLoad().

On which device do you get the “Failed to set USB interface!” error, and which is the other device you are targeting?


The device that gives “Failed to set USB interface” is the Epson BT-200 smartglasses running Android 4.0.3. The issue occurs inside a libusb_claim_interface(handle, 0) call.

A colleague rooted the device and examined /proc/kmsg where he found “usb 1-1: rejected 1 configuration due to insufficient available bus power”.

Connecting the sensor to the device via a powered USB hub solves the problem.

Glad you’re making progress, Nick. Have you actually been able to read IR or depth frames?


I am getting what appear to be good depth and IR frames. I am only using the two lowest resolutions. The default IR resolution is higher than the default depth resolution, so I have increased the depth resolution to make them match.

What exact steps did you follow to also get the OpenNI2.jni file?

In the Final folder I found all the files I needed, but this particular one is missing.

I am trying to build a system configuration for a research project in which a dedicated server receives depth and/or IR frames transmitted over a network, which are then post-processed, recorded or used for advanced system interaction and gesture tracking.

So far, I have struggled with the build scripts in the GitHub repo, using Visual Studio on Windows and Python on Linux, and I finally obtained the binaries (.so and .ini files) for an ARM-based, OTG-ready Android device. But none of the scripts created the openni2.jni file, which is required by the OpenNI Java libraries.

Besides my kind request for assistance in obtaining the missing file, I have read this thread and seen that you have already gone a few steps further, i.e. extracting depth frames from the Structure Sensor; if that really works, it is exactly what I need. Do you have the missing file and/or examples I can learn from?

Thank you in advance!


I am using OpenNI from native C/C++ on Android, so I didn’t need an openni2.jni file. It looks like there is a Visual Studio solution and makefile for OpenNI2.jni in the wrappers folder of the OpenNI-master repository.

I hope this helps.