How to: ROS + Odroid?


I am trying to get the Structure Sensor up and running on a quadcopter. I am using an Odroid U3 as my onboard processor, running Linaro 13.05 (a variant of Ubuntu for ARM). My plan is to use the sensor for obstacle avoidance and 3D position estimation. I have ROS Hydro installed, and it ships with OpenNI (1.x), which doesn't support the Structure Sensor. Before I start compiling the OpenNI2 driver for ARM and trying to solve all the dependency issues (and breaking a thousand other things), has anyone attempted the same? What procedure would you recommend for getting the sensor working?


Well, I ended up loosely following the steps on this page : but using Occipital's OpenNI2 instead. It compiled nicely, and the SimpleRead example works, but I keep getting

Unsupported color video mode - Resolution: 640x480@30Hz Format: RGB888

and I can't seem to be able to display the RGB images. Any ideas?


Just an update:

I wrongly assumed that the Structure Sensor was an RGBD sensor; I didn't know that it doesn't have an RGB camera, just an infrared camera. So at this stage, I can access the infrared images and the depth images. I am in the process of rewriting the ROS camera driver and launch files so as to generate a point cloud from the depth images. I am using this driver as a reference, and these ROS launch files . But I have a feeling that it won't be straightforward, because there is some scaling happening there that is related to the Kinect and might not be directly applicable to the Structure Sensor.
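For anyone curious what the launch files actually compute: generating a point cloud from a depth image boils down to pinhole back-projection. A minimal Python sketch of that step; note the intrinsics here (fx = fy = 570.3, a common Kinect-style default) and the millimetre depth units are assumptions for illustration, not values read from the Structure Sensor:

```python
def depth_to_points(depth, fx, fy, cx, cy, depth_scale=0.001):
    """Reproject a depth image into 3D points with the pinhole model.

    depth       -- rows of raw depth readings (e.g. uint16 millimetres)
    fx, fy      -- focal lengths in pixels
    cx, cy      -- principal point in pixels
    depth_scale -- raw units -> metres (0.001 assumes millimetres)
    """
    points = []
    for v, row in enumerate(depth):
        for u, d in enumerate(row):
            if d == 0:                # 0 means "no reading": skip it
                continue
            z = d * depth_scale
            x = (u - cx) * z / fx     # back-project along the pixel's ray
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# A single principal-point pixel at 1000 mm lands 1 m straight ahead.
pts = depth_to_points([[1000]], fx=570.3, fy=570.3, cx=0.0, cy=0.0)
# -> [(0.0, 0.0, 1.0)]
```

This is essentially what the depth_image_proc nodelets do in C++, which is also where the Kinect-specific scaling I mentioned comes in.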

I will update you later


I am going to start a similar project in September, but I am going to use Nvidia TK1 as my controller.
Please let us know more exciting news!


OK, problem solved. I found a bug in the openni2_camera driver in ROS and managed to fix it. Basically, the driver was checking the wrong data stream when trying to determine the focal length of the sensor, so it was assigning it a value of zero and generating garbage point cloud data. I forked the driver, modified it, and opened a pull request, so hopefully those of you who want to use the sensor with ROS in the future will be able to just use the official one :
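For context on why the bug produced garbage: the focal length is derived from the stream's reported field of view, so querying a stream that doesn't exist yields zero. A rough Python illustration of that relationship (the real driver does this in C++; the 58.6° horizontal FOV here is an assumed Kinect-like value, not one read from the Structure Sensor):

```python
import math

def focal_length_px(width_px, hfov_rad):
    """Pinhole focal length in pixels: f = width / (2 * tan(hfov / 2))."""
    return width_px / (2.0 * math.tan(hfov_rad / 2.0))

# Asking the depth stream (which exists) gives a sane focal length...
f_good = focal_length_px(640, math.radians(58.6))   # roughly 570 px

# ...whereas the buggy code effectively asked the absent colour stream,
# got f = 0, and every x = (u - cx) * z / f reprojection then blew up.
f_bad = 0.0
```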

One more thing: you will have to modify the depth.xml.launch file inside the rgbd_launch package so that the depth_image_proc/point_cloud_xyz nodelet uses image_raw instead of image_rect_raw (which was not being generated due to the lack of a calibration file). You can always calibrate the camera/sensor and use image_rect_raw, but I was too lazy to do it.
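For reference, the edit amounts to pointing the nodelet's image_rect input at the raw topic; roughly like this (node name and args reconstructed from memory, so check them against your rgbd_launch version):

```xml
<node pkg="nodelet" type="nodelet" name="points_xyz"
      args="load depth_image_proc/point_cloud_xyz $(arg manager) --no-bond">
  <!-- was: <remap from="image_rect" to="image_rect_raw"/> -->
  <remap from="image_rect" to="image_raw"/>
</node>
```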

All good now, and working fine!

Note: if you want to use these changes now, just clone my fork into a ROS workspace, compile it, and run "rospack profile"; ROS will then load this node instead of the one packaged with the original distribution.


So enjoy the new capabilities with ROS :smile:


What flight controller are you using, and how did you make the connection with the ODROID?

Also, please mention how you installed the OpenNI files (including the globaldefaults.ini file) in the /etc directory.


Does this work on a Raspberry Pi 3?