When I first started OXI, I knew I wasn’t just building “some XR utilities for Unity.”
I wanted something bigger: a framework with a consistent language, one that felt just as approachable to someone opening Unity for the first time as it did to a veteran who had been through multiple generations of XR SDKs.
That meant solving a big problem: XR development in Unity can be messy.
APIs are fragmented, naming conventions shift between systems, and vendor-specific SDKs lock developers into their own ecosystems. Worst of all, the simple act of getting data from the headset or controllers can feel like navigating a maze.
My answer with OXI is to strip that all away and give a single, predictable way to access everything, no matter your skill level or hardware.
At the heart of OXI is a path-based API that mirrors OpenXR’s logical role structure but feels natural in C# and Unity.
If you want the right hand’s pose, you don’t need to hunt down an object in the hierarchy or remember a different call for each input profile.
You just write:
User.Hand.Right.Input.Pose.Changed += MyMethod;
It reads exactly how you’d say it: the user’s right hand pose changed, run this method.
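Inside a component, that subscription might look like the following. This is a minimal sketch: only the path itself comes from OXI as shown above; the handler name, the unsubscribe in OnDisable, and the assumption that the event hands you a UnityEngine.Pose are mine.

```csharp
using UnityEngine;
// The OXI namespace/using is omitted here since it isn't specified above.

public class RightHandFollower : MonoBehaviour
{
    void OnEnable()
    {
        // Subscribe to pose updates for the user's right hand.
        User.Hand.Right.Input.Pose.Changed += OnRightHandPoseChanged;
    }

    void OnDisable()
    {
        // Unsubscribe so a disabled object stops receiving updates.
        User.Hand.Right.Input.Pose.Changed -= OnRightHandPoseChanged;
    }

    // Assumed handler signature: a pose carrying position and rotation.
    void OnRightHandPoseChanged(Pose pose)
    {
        transform.SetPositionAndRotation(pose.position, pose.rotation);
    }
}
```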
Behind the scenes, OXI’s Providers and Receivers make this happen: Providers supply the data, whether from real hardware, a simulation, or a network feed, and Receivers consume it on the gameplay side. Because the two are decoupled, swapping a provider requires zero changes to gameplay code.
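To make the decoupling concrete, here is an illustrative sketch of the pattern rather than OXI’s actual types; IPoseProvider, HardwarePoseProvider, SimulatedPoseProvider, and PoseReceiver are hypothetical names. The point is that the consuming side only ever sees an event, so the source behind it can change freely.

```csharp
using UnityEngine;

// Hypothetical provider contract: the receiver side only depends on this event.
public interface IPoseProvider
{
    event System.Action<Pose> PoseChanged;
}

// A provider backed by real tracking hardware (hardware details omitted).
public class HardwarePoseProvider : IPoseProvider
{
    public event System.Action<Pose> PoseChanged;
    public void Push(Pose p) => PoseChanged?.Invoke(p);
}

// A provider that fabricates poses for testing; consumers can't tell the difference.
public class SimulatedPoseProvider : IPoseProvider
{
    public event System.Action<Pose> PoseChanged;
    public void Tick(float t) =>
        PoseChanged?.Invoke(new Pose(new Vector3(Mathf.Sin(t), 1.5f, 0f), Quaternion.identity));
}

// The receiving side: it binds to whichever provider it is given.
public class PoseReceiver
{
    public PoseReceiver(IPoseProvider provider)
    {
        provider.PoseChanged += pose => Debug.Log($"Pose updated: {pose.position}");
    }
}
```

Wiring a PoseReceiver to a SimulatedPoseProvider instead of a HardwarePoseProvider is a one-line change at composition time, which is exactly what makes testing without a headset, or feeding in networked data, painless.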
Everything is role-first. Head, Hand.Left, and Hand.Right aren’t just names, they’re anchor points for any device context. This keeps a project consistent whether I’m targeting PCVR, standalone, or some future XR device that doesn’t exist yet.
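Extrapolating the documented right-hand path to the other roles, which is an assumption on my part since only Hand.Right is spelled out above, the pattern would read like this:

```csharp
// Assumed extrapolation: only the right-hand path is shown above; the others
// are written here to illustrate the role-first shape, not confirmed API.
User.Head.Input.Pose.Changed       += OnHeadPoseChanged;       // the HMD, whichever it is
User.Hand.Left.Input.Pose.Changed  += OnLeftHandPoseChanged;   // left controller or tracked hand
User.Hand.Right.Input.Pose.Changed += OnRightHandPoseChanged;  // right controller or tracked hand
```

Gameplay code addresses the role; which physical device happens to fill that role is the provider’s concern.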
This isn’t just a naming convention, it’s a philosophy.
For newcomers, it means the API reads the way they would describe the interaction, with no vendor SDK or scene hierarchy to untangle first.
For veterans, it means the same role paths carry across projects, teams, and hardware generations, with providers they can swap without touching gameplay code.
Unity’s XR development environment right now is a tangle of overlapping APIs, vendor-specific workflows, and “manager” objects everywhere.
You can make it work, sure, but maintaining that across multiple projects or teams is painful.
OXI simplifies it: one predictable, path-based layer, no matter which hardware or runtime sits underneath.
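For a sense of what that tangle looks like today, here is the same right-hand pose pulled through Unity’s low-level XR input API, just one of several routes a project might be using alongside the XR Interaction Toolkit or a vendor SDK. The OXI equivalent is the single subscription shown earlier.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Polling the right-hand device every frame via UnityEngine.XR.InputDevices.
public class RawRightHandPose : MonoBehaviour
{
    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        if (device.isValid &&
            device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position) &&
            device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
        {
            transform.SetPositionAndRotation(position, rotation);
        }
    }
}
```

Multiply that kind of boilerplate across every input you care about, and across every SDK a project touches, and the appeal of one consistent path becomes obvious.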
I didn’t set out to make an API that enforces my personal taste.
I set out to make a shared language for XR developers, something they can rely on whether they’re building their very first VR scene or deploying a massive enterprise training system.