Project North Star
Friday 25/09/2020
Project North Star is an open-source augmented reality headset. I got to play around with a pre-built kit.
There are kit versions of the headset that you can build yourself. They can be found here: https://www.smart-prototyping.com/AR-VR-MR-XR
It uses the Leap Motion for hand tracking and an Intel RealSense t261 or t265 for 6DOF.
If you do end up buying the headset / building one yourself, I would highly recommend grabbing the t261 / t265 for 6DOF. This will allow you to rotate and walk around objects.
For this post, I will be assuming you're using the Unity Game Engine.
Since I did not build the headset myself, I will be skipping the calibration step as well as how to build the headset.
Set Up
You will need:
Unity 2018.4 (LTS). I am using 2018.4.27f1.
The LeapMotion v4.0.0 drivers found here: https://github.com/leapmotion/UnityModules/blob/feat-multi-device/Multidevice%20Service/LeapDeveloperKit_4.0.0%2B52238_win.zip
The LeapAR.unitypackage from the Project North Star Github Repo (which can be found under the Software folder): https://github.com/leapmotion/ProjectNorthStar
The RealSense Driver from here: https://www.intelrealsense.com/sdk-2/
The RealSense Unity package from here: https://github.com/IntelRealSense/librealsense/tree/master/wrappers/unity
Hand Tracking with the Leap Motion
First, install the Leap Motion 4.0.0 drivers.
Second, import LeapAR.unitypackage into your project.
Make a folder called StreamingAssets under the Assets folder and move your calibration file into it. I renamed mine to AR11.json.
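If you want to sanity-check that Unity can actually see that file at runtime, a tiny script like this does the trick (the script and field names here are mine, purely for illustration; the file name is whatever you called yours):

```csharp
using System.IO;
using UnityEngine;

// Minimal sketch: logs whether the calibration JSON is present in StreamingAssets.
// Assumes the file is named AR11.json, as in the step above.
public class CalibrationFileCheck : MonoBehaviour
{
    [SerializeField] private string calibrationFileName = "AR11.json";

    private void Start()
    {
        string path = Path.Combine(Application.streamingAssetsPath, calibrationFileName);
        if (File.Exists(path))
            Debug.Log($"Calibration file found at {path}");
        else
            Debug.LogWarning($"Calibration file missing: {path}");
    }
}
```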
Navigate to the LeapMotion/North Star/Scenes folder and open up the NorthStar scene. The AR Camera Rig should have AR11.json set as its Input Calibration File.

In the Inspector, you can set the X offset and Y offset, then hit the button that says Move Game View to Headset.
Have a go at playing the scene! You should see the Leap Motion tracking your hands but you won't be able to look around your hands just yet.
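If you want to poke at the tracking data from code, here's a minimal sketch using the LeapProvider API from the Leap Unity Modules (the class and field names are mine, just for illustration). Drag the LeapServiceProvider from the scene into the provider slot and it will log palm positions:

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Minimal sketch: logs palm positions from the Leap Motion each frame.
// Assumes the LeapProvider API from the Leap Unity Modules; drag the
// LeapServiceProvider from the NorthStar scene into the provider field.
public class PalmLogger : MonoBehaviour
{
    [SerializeField] private LeapProvider provider;

    private void Update()
    {
        Frame frame = provider.CurrentFrame;
        foreach (Hand hand in frame.Hands)
        {
            Vector3 palm = hand.PalmPosition.ToVector3();
            Debug.Log($"{(hand.IsLeft ? "Left" : "Right")} palm at {palm}");
        }
    }
}
```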
Six DOF with the Intel RealSense t261 or Intel RealSense t265
Install the Intel RealSense drivers. This should also install the Intel RealSense Viewer. Go ahead and open that up.
On the left sidebar, turn on the Tracking Module; the switch should go from red to blue. Move your Intel t261 or t265 around and you should see a model moving around with a green trail, as shown in the picture below.

Close the Intel RealSense Viewer.
Now we have to get the t261 / t265 working with Unity, so import the RealSense Unity package into the same project as the one with the Leap Motion SDK and the North Star plugins.
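Once the package is imported, you can optionally sanity-check that Unity can see the camera with a quick script like this. It just lists whatever devices the Intel.RealSense bindings report (the class name is mine); the StartHere scene in the next step does the same thing visually:

```csharp
using Intel.RealSense;
using UnityEngine;

// Minimal sketch: lists the RealSense devices the librealsense bindings can see,
// so you can confirm the t261 / t265 is detected before wiring up the scene.
public class RealSenseDeviceCheck : MonoBehaviour
{
    private void Start()
    {
        using (var ctx = new Context())
        using (var devices = ctx.QueryDevices())
        {
            Debug.Log($"Found {devices.Count} RealSense device(s)");
            foreach (Device dev in devices)
                Debug.Log($"  {dev.Info[CameraInfo.Name]} (S/N {dev.Info[CameraInfo.SerialNumber]})");
        }
    }
}
```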
Navigate to RealSenseSDK 2.0/Scenes, open up the StartHere scene and hit Play.
The Game View should show a blue screen, and a white box labelled SLAM (Pose) should appear. Go ahead and hit Start. You should be able to move the t261 / t265 around and see it move in the Game View with an orange trail, as shown below.

Go ahead and open up the NorthStar scene again.
Navigate to RealSenseSDK2.0/Prefabs and drag the RsDevice prefab into your scene. Click on the RsDevice GameObject and set the following in the Inspector (there's a code sketch right after this list showing what these map to):
Profiles set to a size of 1
Stream set to Pose
Format set to Six DOF
Framerate set to 0
Stream Index set to 0
Width set to 0
Height set to 0
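For reference, those values map onto the librealsense C# API roughly like the sketch below (assuming the Intel.RealSense bindings; the class name is mine). You don't need this script in your scene, since RsDevice does the equivalent for you, but it shows what a single Pose / Six DOF profile boils down to:

```csharp
using Intel.RealSense;
using UnityEngine;

// Illustrative only — RsDevice already does this for you. A single profile with
// Stream = Pose, Format = Six DOF, and index/width/height/framerate all set to 0.
public class PoseConfigSketch : MonoBehaviour
{
    private Pipeline pipeline;

    private void Start()
    {
        var cfg = new Config();
        // stream, index, width, height, format, framerate — matching the Inspector fields above
        cfg.EnableStream(Stream.Pose, 0, 0, 0, Format.SixDOF, 0);

        pipeline = new Pipeline();
        pipeline.Start(cfg); // don't run this alongside RsDevice — they'll fight over the camera
    }

    private void OnDestroy()
    {
        pipeline?.Dispose();
    }
}
```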

Expand AR Camera Rig and click on Head. Add a script called Rs Pose Stream Transformer and drag the RsDevice GameObject into its Source field to reference it.

Boom! You have 6DOF!
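If you're curious what that transformer script is actually doing, here's a simplified approximation (not the bundled code, and the class name is mine): it flips each pose sample from the camera's right-handed, Z-backwards coordinate system into Unity's left-handed, Z-forward one and writes it to the transform. How the samples arrive from RsDevice is left out here; the bundled script handles that wiring for you.

```csharp
using Intel.RealSense;
using UnityEngine;

// Simplified approximation of what a pose-stream transformer has to do:
// convert a librealsense pose (right-handed, Z backwards) into Unity's
// left-handed, Z-forward convention and apply it to this transform.
public class PoseToTransformSketch : MonoBehaviour
{
    // Call this with each new pose sample (frame delivery omitted in this sketch).
    public void ApplyPose(Pose pose)
    {
        transform.localPosition = new Vector3(
            pose.translation.x,
            pose.translation.y,
            -pose.translation.z);   // flip Z for Unity's left-handed space

        transform.localRotation = new Quaternion(
            -pose.rotation.x,
            -pose.rotation.y,
            pose.rotation.z,
            pose.rotation.w);       // mirrored rotation for the flipped axis
    }
}
```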
Don't believe me? Create a cube, scale it down and move it in front of your headset, then play the scene. You should now be able to move around the cube.
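If you'd rather spawn the test cube from code, something like this works (the size and position are just a starting point):

```csharp
using UnityEngine;

// Quick test object: drops a small cube half a metre in front of the rig's origin.
// Attach to any GameObject in the NorthStar scene, or just create the cube by hand.
public class SpawnTestCube : MonoBehaviour
{
    private void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.localScale = Vector3.one * 0.1f;       // 10 cm cube
        cube.transform.position = new Vector3(0f, 0f, 0.5f);  // 0.5 m in front of the origin
    }
}
```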
Thoughts
I absolutely love it!
It blows my mind that something this cool exists and is so accessible to everyone. At the time of writing, the headset itself comes to USD $468 (AUD $662) if you print out your own parts and grab the Intel RealSense t261 as well, which is mind-boggling when you compare it to the HoloLens at a mere AUD $5,599.00.
There are some problems with drift, and you need to set your IPD in Unity, otherwise things might be blurry for you.
Shoutout to HyperLethalVector, who created Project Esky, which aims to allow out-of-the-box development for Project North Star: https://github.com/HyperLethalVector/ProjectEsky-UnityIntegration, but that is going to be another post.