Adding a point cloud node to the MixedReality Sample App


#1

Hello!

I’m trying to visualise point clouds in AR using the MixedReality sample app.

The SCNNode is created with the code below. It works perfectly on macOS with an empty SCNScene object in the default configuration.
However, when I add the node as a child of ‘_mixedReality.worldNodeWhenRelocalized’, it is not rendered.
Any suggestions?

Thank you!
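
For context, this is roughly how I attach it (a minimal sketch; ‘_mixedReality’ is the sample’s BEMixedRealityMode instance, bridged into Swift):

let cloudNode = PointCloud().getNode()
_mixedReality.worldNodeWhenRelocalized.addChildNode(cloudNode)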

Here is the class to create a point cloud node:

import SceneKit

// Interleaved vertex layout: three position floats followed by three color
// floats, matching the offsets and stride used in buildNode(points:) below.
struct PointCloudVertex {
    var x: Float, y: Float, z: Float
    var r: Float, g: Float, b: Float
}

@objc class PointCloud: NSObject {

    var n: Int = 0
    var pointCloud: [SCNVector3] = []

    override init() {
        super.init()

        // The resource is expected in the bundle as "bun_zipper_points.ply.txt"
        let file = "bun_zipper_points.ply"
        self.n = 0
        var x, y, z: Double

        // Open file
        if let path = Bundle.main.path(forResource: file, ofType: "txt") {
            do {
                let data = try String(contentsOfFile: path, encoding: .ascii)
                var myStrings = data.components(separatedBy: "\r\n")

                // Read the PLY header: pick up the vertex count, stop at end_header
                while !myStrings.isEmpty {
                    let line = myStrings.removeFirst()
                    if line.hasPrefix("element vertex ") {
                        n = Int(line.components(separatedBy: " ")[2])!
                        continue
                    }
                    if line.hasPrefix("end_header") {
                        break
                    }
                }

                pointCloud = [SCNVector3](repeating: SCNVector3(x: 0, y: 0, z: 0), count: n)

                // Read the vertex data: one "x y z" line per point
                for i in 0..<n {
                    let components = myStrings[i].components(separatedBy: " ")
                    x = Double(components[0])! //* 100
                    y = Double(components[1])! //* 100
                    z = Double(components[2])! //* 100

                    pointCloud[i].x = Float(x)
                    pointCloud[i].y = Float(y)
                    pointCloud[i].z = Float(z)
                }

                NSLog("Point cloud data loaded: %d points", n)
            } catch {
                print(error)
            }
        }
    }

    // Wraps the loaded points (all colored white) in a node.
    public func getNode() -> SCNNode {
        var vertices: [PointCloudVertex] = []
        for p in self.pointCloud {
            vertices.append(PointCloudVertex(
                x: Float(p.x), y: Float(p.y), z: Float(p.z),
                r: 1.0, g: 1.0, b: 1.0
            ))
        }

        let node = buildNode(points: vertices)
        NSLog("%@", node)
        return node
    }

    private func buildNode(points: [PointCloudVertex]) -> SCNNode {
        // One interleaved buffer holds both positions and colors;
        // .stride (rather than .size) is the safe length for array layouts.
        let vertexData = NSData(
            bytes: points,
            length: MemoryLayout<PointCloudVertex>.stride * points.count
        )
        let positionSource = SCNGeometrySource(
            data: vertexData as Data,
            semantic: .vertex,
            vectorCount: points.count,
            usesFloatComponents: true,
            componentsPerVector: 3,
            bytesPerComponent: MemoryLayout<Float>.size,
            dataOffset: 0,
            dataStride: MemoryLayout<PointCloudVertex>.stride
        )
        // The color components start right after the three position floats
        let colorSource = SCNGeometrySource(
            data: vertexData as Data,
            semantic: .color,
            vectorCount: points.count,
            usesFloatComponents: true,
            componentsPerVector: 3,
            bytesPerComponent: MemoryLayout<Float>.size,
            dataOffset: MemoryLayout<Float>.size * 3,
            dataStride: MemoryLayout<PointCloudVertex>.stride
        )
        // With data: nil, SceneKit uses implicit indices 0..<primitiveCount
        let elements = SCNGeometryElement(
            data: nil,
            primitiveType: .point,
            primitiveCount: points.count,
            bytesPerIndex: MemoryLayout<Int>.size
        )
        let pointCloud = SCNGeometry(sources: [positionSource, colorSource], elements: [elements])

        return SCNNode(geometry: pointCloud)
    }
}
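
For reference, the macOS setup where the node renders fine is just this (a sketch; ‘sceneView’ is a placeholder name for the SCNView in the test app):

let scene = SCNScene()                                // empty scene, default configuration
scene.rootNode.addChildNode(PointCloud().getNode())   // point cloud node from the class above
sceneView.scene = scene
sceneView.allowsCameraControl = true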


#2

You probably need to set the rendering order so the node draws at the right time. The keyword to search for is TRANSPARENCY_RENDERING_ORDER; searching for it in the sample source gives you a sense of what is drawn and in what order.

For a point cloud, I’m guessing this is good:

[_target.node setRenderingOrder:TRANSPARENCY_RENDERING_ORDER+50];
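
If you are setting it from Swift, that would be roughly the following (a sketch; it assumes the TRANSPARENCY_RENDERING_ORDER define is visible to Swift through the bridging header):

node.renderingOrder = Int(TRANSPARENCY_RENDERING_ORDER) + 50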

#3

Hi!

Thanks for the suggestion!

I’ve already checked that suggestion.
This problem was also addressed in the thread “The surroundings of the particles are covered with black”.
I tried [node setRenderingOrder: 100000] and values above 100000. It didn’t help.


#4

Hello all,

I created a simple iOS app which renders point cloud data with SceneKit, as a proof of concept.
Here is the link: https://github.com/eugeneu/PoindCloudRenderer

Now I’m trying to integrate that point cloud SCNNode into the MixedReality sample from the Bridge Engine SDK.
For some reason I still haven’t figured out, the node is not visible in the AR scene.


#5

What is the scale of your data?


#6

Hi!

The scale is okay; we did manage to render objects represented as meshes.
Also, for the point cloud we found a workaround: instead of rendering atomic points, we create a sphere node (from SceneKit) per point; see the sketch below.
However, performance is very low with sphere nodes (even though we do node flattening), so the question of rendering atomic voxels is still open.
If anyone can help solve the issue, we would really appreciate it.
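
For reference, the workaround looks roughly like this (a minimal sketch; the radius value is illustrative):

import SceneKit
import UIKit

// Sphere-per-point workaround: one shared SCNSphere geometry,
// one child node per point, then flatten everything into one node.
func sphereCloudNode(points: [SCNVector3]) -> SCNNode {
    let parent = SCNNode()
    let sphere = SCNSphere(radius: 0.002)   // illustrative radius
    sphere.firstMaterial?.diffuse.contents = UIColor.white
    for p in points {
        let child = SCNNode(geometry: sphere)
        child.position = p
        parent.addChildNode(child)
    }
    // flattenedClone() merges the children into a single geometry (the
    // "node flattening" mentioned above), but it is still far slower
    // than true point primitives.
    return parent.flattenedClone()
}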

The goal is to do the rendering as in the example at https://github.com/eugeneu/PoindCloudRenderer, but in the Bridge AR environment.

If needed, we can privately share the Bridge code with those who are authorized to use the Bridge SDK.

P.S.: Sorry for the late reply; I was busy with another project.

Regards,
Evgeniy