Getting Started With Oculus Quest Development

This time, courtesy of our great supporters at Patreon, we’re talking about getting up and going with development for the Oculus Quest. It’s worth mentioning that Quest development is reasonably flexible with multiple viable paths to follow, so I’ll specify here that I’ll be demonstrating with Unity and Oculus’ own VR kit available for free in the Unity Asset Store.

If you haven’t tried any of the latest VR headsets, you can tell a lot about modern VR from the way enthusiasts are discussing its potential. In our recent interview with David Fox, designer of Thimbleweed Park (and much more), he shared not only his vision for more deeply immersive gameplay experiences in the coming years, but also his belief in VR immersion as a means of helping users internalize hypothetical experiences, like simulated outcomes of climate change over time. Startups all over the world—I’ve freelanced with one locally already—are getting a jump on implementing virtual training in every professional field you can think of, and a few you’d probably never guess. VR may not be in every living room just yet, but it’s probably here to stay until it is.

Most people remember their introduction to VR, and geeks born before 2000 have probably tried a headset that was nowhere near ready for prime time. David Fox has written about several such disappointments along the way, and in his new book, The History of the Future, our friend Blake J. Harris chronicles in humorous detail Oculus founder Palmer Luckey’s exhaustive march through a museum’s worth of headsets before he finally cobbled together his own in his trailer laboratory.

Personally, I read a whole library book about attempts at VR (I can’t recall the name anymore) before I saw any headset in person. The set I finally tried was part of a Star Wars themed Six Flags attraction, and not the kind they were prominently advertising. I’ve tried to figure out exactly what this attraction was, because I was surprised to find, like, no evidence of this ever having existed. Based on some videos I’ve scoured since then, I think a company may have just invested in very expensive headsets and charged us to play some version of Jedi Knight multiplayer. Considering there was Star Wars IP at play, I’m surprised I couldn’t dig up more information about this, but swinging around that crude lightsaber in what looked like a generic GoldenEye map was enough to give me that “holy shit” moment. Technology was on the way that could drop me inside a video game. I had plenty of time to reflect on it while not touching another headset for over ten years.

To be honest, I let the Oculus Rift come and go. I was really glad to hear VR was starting to re-emerge and I was sending the community good thoughts. But knowing what I did about the history of “home VR” attempts made me risk averse. The Rift was only a few hundred dollars, but then, the Virtual Boy only retailed at $180; the price was never the problem as much as the performance. The Rift was no Virtual Boy, and its popularity sent Oculus further down its path.

If any product really pulled me back in, it was actually Google Cardboard. Folding up a cheap cardboard box to carefully space the right lenses between my eyes and a display that relies on my phone somehow resulted in a magical experience that brought me right back to that first immersive thrill. Oddly enough, it was a demo made by Volvo to show off one of its new cars that pulled me in and got me thinking about the future. There was no real interactivity at all, but it put me in the driver’s seat during a trip between beautiful destinations and the boundless possibilities washed over me. We have the technology for beautiful interactive scenes and the technology for intricate head and hand tracking at relatively low cost. At last, they’re coming together.

There’s a story behind my first development experience with VR, and I’ve only revealed some of the details to Blake J. Harris. In fact, I’m going to take it straight out of one of my emails to him:

A couple of years ago I left full-time software development to take care of my young son and try to “go indie.” Shortly after putting out my first App Store game (a letters and numbers game for young kids), I turned to Upwork to see if I could drum up game development work with pay that was a bit steadier. It was there that I struck up a friendship with another St. Louis-based guy who was running his own studio focused on VR. It wasn’t long before he asked “Do you know how to do VR? We need someone with your game dev expertise, and we have a client ready to go.”

“For sure!” I told him. I was lying. I knew they were using Unity and I knew I could learn the rest; I figured if I could do a good job, the client wouldn’t know I had no clue what I was doing.

The client turned out to be Lenovo, outsourcing a demo they would show at Mobile World Congress in Barcelona. Obviously, that turned into a horrifyingly short period of time, but I picked it up quickly, as promised, and everyone was very upbeat and fun to work with after they saw the first deliverables. I ended up developing a cool set of in-world interactions between objects, including use of an in-world tablet that played videos, which turned out to be a real mind-eff for someone who had never even tested a VR headset before.

The HMD the studio provided was an Acer Windows Mixed Reality headset that was brand new at the time, and support for it outside of the Windows 10 home environment was spotty (though I spent a great deal of time marveling at the home environment too—I still think it’s fantastic). The fit was comfortable and the headset was nice and light (this would later make the Quest feel noticeably heavy). That said, getting used to the technology by working on a scene on top of a wind turbine was uncomfortable at times. Unity occasionally kicked the device from 6DoF to 3DoF—if you’re not familiar, 3DoF means the headset only tracks your head’s rotation, so you can freely look around the environment but your position stays fixed. 6DoF means the headset also tracks your physical position in the room, so that you can walk around a virtual scene to examine it as well. Overall, I like the WMR devices and I’m sorry more hasn’t been done with them.

The local startup and I celebrated the success of the Lenovo project, then things cooled a bit. A couple of times we explored the possibility of working together long term, but never pinned anything down due to timing on both sides. I kind of regret this, honestly; it was the most fun work I’ve ever done.

VR didn’t come up again in my household until my last birthday. We set up a little family dinner at Dave & Buster’s, thinking it was the best we could do for a Tuesday night. I was pleasantly surprised that my in-laws joined us for dinner, and in fact, they were eager to walk around and play games with my four-year-old. We had a great time. As we set up in the corner with the Skee-Ball machines, we noticed the new Men in Black game, complete with VR headsets, warnings about people with heart problems, and a bucket that seemed intended either for catching vomit or cleaning it.

“Is it really dangerous for people with heart problems?” my father-in-law asked.

“Eh, maybe,” I told him. I told him about the time I used the fledgling Acer headset to try DiRT Rally’s VR build and got through most of one event before clawing the headset off and staggering across the spinning room, as close to heaving as I’ve ever come without actually doing it, until the sensation subsided. I’d be careful about putting it on my grandma. But I noticed the whole family kept looking over at it. I’m sorry I didn’t take more time to convince anyone to try it. Personally, I’d had a couple of drinks, and figured it might be the wrong time to try it myself.

On the way home, my wife brought up a conversation we often have here in our eighth year of marriage.

“I don’t know what you’d like to have for a present. You need to pick a place you’d like to travel, and I’ll set up a trip for you. Get away for a weekend! Anything you want!” I thanked her sincerely and explained, first, that there’s precious little in this life I want and don’t have. I’m a very content person by nature and I have much more than I need as it is. Second, for a pseudo-journalist and general businessperson, I kind of hate traveling. I do it routinely to have fun experiences with her and family or friends, but I use the excuse that I’m saving personal trips for when I have no other real choice. Someone will want me onsite somewhere and I’ll go. I’ll try to grab some pictures and a t-shirt while I’m there.

“What if we did something a little cheaper that the whole family might like?” I asked. For the rest of the drive home I recounted the little questions I’d answered for her parents about the VR headset, talked about the relatively affordable wireless headset Oculus just put out, and talked about how much fun I’d had bringing the family into an environment where they could easily get and enjoy “the whole gaming thing” for an evening. What if we could show people this magical VR stuff at home? Whether it was indeed the interest in a family activity or just an uncomplicated gift with a low price tag, she enthusiastically agreed.

Since our Quest arrived, we’ve had a total blast with it. Beat Saber is an early favorite around the house, but I’ve also downloaded Star Chart for my wife, Gun Club VR for whenever my father-in-law is ready to jump in (and, yeah, I’m enjoying the target practice and the fancy toys), and SUPERHOT VR is just incredible. After getting my bearings and sampling some of the hottest games, I was more than inspired to dig in and start experimenting. Here are my findings.

Getting set up

Oculus provides documentation for three categories of development: “Unity,” for Unity developers, “Unreal,” for Unreal Engine devs, and “Native,” for pretty much just John Carmack. As mentioned previously, I’ll be going the Unity route, but I suggest Unity and Unreal users alike check out the respective compatibility lists provided by Oculus to see not only which versions are compatible, but which they recommend. VR development has enough little pain points; take all the help you can get.

The docs also provide links to the latest official Oculus Integration package, but I recommend simply starting a new Unity 3D project, opening the Asset Store, and searching for “Oculus.” This will allow you to grab the latest and import it directly into your project.

This is as good a time as any to get your Quest headset switched into “Developer Mode.” It’s not difficult, but it will pull you out of the groove later, so just knock it out now. Open the Oculus companion mobile app, go to Settings, then More Settings, and toggle on Developer Mode. It won’t be this simple, unfortunately, because now you’ll be taken to the Oculus Dashboard site, where you’ll have to get informally registered by creating an organization and becoming a developer assigned to that organization. If you’re one person goofing off in your kitchen, it’s going to feel a little silly.
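Once Developer Mode is on and the headset is plugged in over USB (and you’ve accepted the debugging prompt inside the headset), you can confirm your machine actually sees the Quest using adb, which ships with the Android SDK:

```shell
# List attached Android devices; the Quest should appear here.
adb devices
# A serial number followed by "device" means you're ready to Build and Run.
# "unauthorized" means you still need to accept the prompt inside the headset.
```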

You have a few settings to change to enable VR in your project, but it’s pretty painless.

  • In Player Settings, Virtual Reality Supported needs to be checked
  • Oculus needs to be added under SDKs
  • You’ll need to change the build platform to Android
  • Minimum API Level needs to be set to 19
  • In Other Settings, remove “Vulkan” from Graphics APIs
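If you’d rather script that checklist than click through it, something like this hypothetical one-shot editor script (dropped in an Editor folder) mirrors the same settings; it assumes the pre-XR-Plugin-Management APIs of the Unity versions current at the time of writing:

```csharp
using UnityEditor;
using UnityEngine.Rendering;

// Sketch of a one-shot project configurator matching the checklist above.
public static class QuestProjectSetup
{
    [MenuItem("Tools/Configure For Quest")]
    public static void Configure()
    {
        // Switch the build platform to Android (the Quest is an Android device).
        EditorUserBuildSettings.SwitchActiveBuildTarget(
            BuildTargetGroup.Android, BuildTarget.Android);

        // Check Virtual Reality Supported and add Oculus under SDKs.
        PlayerSettings.virtualRealitySupported = true;
        PlayerSettings.SetVirtualRealitySDKs(
            BuildTargetGroup.Android, new[] { "Oculus" });

        // Minimum API Level 19 (Android 4.4 "KitKat").
        PlayerSettings.Android.minSdkVersion =
            AndroidSdkVersions.AndroidApiLevel19;

        // Remove Vulkan from Graphics APIs, leaving OpenGL ES 3.
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
        PlayerSettings.SetGraphicsAPIs(
            BuildTarget.Android, new[] { GraphicsDeviceType.OpenGLES3 });
    }
}
```

Either way works; the editor checkboxes and this script end up flipping the same player settings.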

Pull an OVRPlayerController prefab from Assets > Oculus > VR > Prefabs into your scene and delete the default camera. Add Quest to the OVRManager’s Target Devices array. Theoretically, you’re ready to go.  

Why I just said “theoretically”

Right now, you should be able to plug the Quest into your machine, do a Build and Run, and look around at the majesty of the Unity default skybox as if it’s the real sky above you. It’s fantastic for about nine seconds.

Now we need to make our world interactive. Here’s a quick aside about world scale:

Check that you know everything you think you know about world scale. What’s the conversion between your engine units and real-world measurement? In Unity, 1 unit = 1 meter. The units in your 3D modeling application are probably arbitrary, so you should be able to think of them like Unity’s 1-meter units, but now is the time to test an import and see if that is what’s actually happening. Make a cube that’s your height. Make a sphere the size of a baseball. See if these things feel right; you don’t want to pour lots of time into your first interactive scene and find out everything was scaled incorrectly creating some kind of miniature or giant person simulator. Playing with scale expectations in VR can lead to really cool experiences, but that has to be built on a foundation of exact understanding.
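For that sanity check, a throwaway script like this (the class name and the height value are my own placeholders) drops the reference objects into the scene so you can eyeball whether 1 Unity unit really reads as 1 meter in the headset:

```csharp
using UnityEngine;

// Hypothetical scale sanity check: spawns reference objects so you can
// judge whether your world scale feels right from inside the headset.
public class ScaleSanityCheck : MonoBehaviour
{
    public float myHeightMeters = 1.8f; // swap in your own height

    void Start()
    {
        // A cube stretched to your height, standing on the ground plane.
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.localScale = new Vector3(0.5f, myHeightMeters, 0.5f);
        cube.transform.position = new Vector3(1f, myHeightMeters / 2f, 2f);

        // A sphere the diameter of a baseball (about 7.4 cm) at chest height.
        var ball = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        ball.transform.localScale = Vector3.one * 0.07366f;
        ball.transform.position = new Vector3(0f, 1.5f, 1f);
    }
}
```

If the cube towers over you or the baseball looks like a marble, sort out your import scale before building anything on top of it.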

Back to our quest to do stuff with the Quest. Let’s make that aforementioned baseball (sphere scale of about 0.07366) and place it about 1.5 units up from the ground (give yourself a little plane to stand on so you’re not just staring down into the abyss). Don’t worry about physics yet. We have a bigger problem. You may have noticed you don’t have hands.

This is a little beef I have with the Oculus Integration package. It does a little too much, and the organizational structure leaves you to do a lot of hunting and experimenting. If there’s a version of an avatar or any player/controller rig with grabbing hands ready to go, it’s not clear to me where it is, and I’ve been playing with this for a while now. As best I can tell, the fastest way to get this complete package is to add an instance of LocalAvatarWithGrab (get rid of your OVR rig if it’s still in the scene), find the CustomHands directory under the SampleFramework > Core folder, drag the CustomHands (Left and Right) to be direct children of LocalAvatarWithGrab, and customize them from there (they seem to load with two different materials assigned). Now you can delete the previous hand children. I know this is an odd process; if I’ve missed a complete rig, feel free to reach out. If you build and run this, you’ll actually have very nice hands set up that change according to your grip on the controller. Excellent!

Good news: grabbing that ball? Much simpler. Add an OVRGrabbable script as a component to the ball. Add a Rigidbody component as well. No gravity is okay for now, and it’s a kind of fun way to play around. Note that we haven’t worked out grip points, offsets, and other somewhat advanced concepts here; we just want to play with the ball we’ve been staring at until now.
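The whole ball setup can be sketched in one small script. OVRGrabbable comes from the Oculus Integration package; everything else here is standard Unity (the class name is my own):

```csharp
using UnityEngine;

// Sketch of the grabbable-ball setup described above.
// Assumes the Oculus Integration package is in the project (OVRGrabbable).
public class BallSpawner : MonoBehaviour
{
    void Start()
    {
        var ball = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        ball.transform.localScale = Vector3.one * 0.07366f;       // baseball-sized
        ball.transform.position = new Vector3(0f, 1.5f, 0.5f);    // roughly chest height

        // A Rigidbody is required for grabbing; gravity off keeps the ball
        // floating in place so it's easy to find and fun to bat around.
        var rb = ball.AddComponent<Rigidbody>();
        rb.useGravity = false;

        // OVRGrabbable makes the ball respond to the grip trigger.
        ball.AddComponent<OVRGrabbable>();
    }
}
```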

Where to go from here

While these tests may seem simplistic, adding these concepts to your pre-existing knowledge of in-world events in your 3D engine of choice will actually make you capable of some fantastic VR experiences. If you have experience in non-VR development, you already know how to fire events when colliders meet in space, parent objects to one another (though I’d like to expand on handling hands and proper grips), work with UIs in three dimensions, and so on.

Here are some further concepts I’d like to expand on in future posts:

  • Improving grip points, offset, etc. for held objects
  • Non-grabbable interaction like button pushing, knob turning, and more
  • Moving around scenes without throwing up
  • Design philosophy for VR

If anyone has criticism or corrections, it’s all probably more than fair. Feel free to reach out. Otherwise, get cracking!
