If you’ve ever played a game that supports hand tracking on any Meta Quest headset, you know that the experience is generally not very good. Even games that are well designed and built from the ground up for hand tracking, like Silhouette, are more frustrating than fun.
But Rogue Ascent is something else entirely. I’m not sure what kind of voodoo magic developer Nooner Bear Studio is using, but it’s unlike anything else that’s come before it. It’s easily the best hand tracking game on Quest 2, even though games like Little Cities already do it pretty well.
And that’s why I’m no longer concerned that the Apple Vision Pro will rely solely on hand tracking when games inevitably debut on the expensive headset. While Apple didn’t show off any actual VR gaming at WWDC, I hope the headset will help advance hand-tracked VR gaming when it launches early next year.
Hand tracking finally comes in handy
The first days of hand tracking (really, the first three years) were very rough. During that time, the best hand tracking games for Quest produced some diamonds in the rough, but even the best of them are incredibly finicky more often than I’d like.
For the most part, I didn’t blame the games. Hand tracking is hard to get right, especially when it has to rely on the four basic cameras on the outside of the Quest 2 or Quest Pro. But Rogue Ascent does it differently. It uses the latest version of Meta’s hand tracking, which drastically improves accuracy and speed thanks to AI-powered processing, plus some magic of its own.
Don’t believe me? See for yourself in my video below, and pay attention to a few key things:
1.) The speed at which I can move and it still accurately registers my hand movements.
2.) The accuracy of the aim. It’s dead-on, just like with a controller.
3.) The ability to move as gracefully and smoothly as with a controller.
As you may have guessed by now, Rogue Ascent is a roguelike shooter that forces players to climb as high as possible up a tall building by blasting their way through the obstacles on each floor. Each area ends with a quick conversation with an NPC, and then an elevator ride to the next level.
Like other roguelikes, you will eventually die and be transported back to a hub area. Your character keeps some permanent upgrades earned during your runs, while other upgrades and weapons last only for a single climb up the tower.
It’s a familiar formula that feels fresh thanks to the new input mechanic and the fact that it really works.
Apple still has work to do
While we’re aware of some of the technology inside the Apple Vision Pro (AVP from here on), we’re still not sure how far Apple’s current technology can take VR gaming. Apple primarily showed off its eye-tracking technology at WWDC, not its hand tracking, and it’s quickly becoming clear why.
Apple isn’t quite ready to give you a fully hand-tracked experience just yet. Where is the proof? It’s right in the company’s own presentations on how its tracking technology works. The developer of Smash Drums on Quest was quick to point this out on Twitter:
The [latency/input lag/you name it] is WILD and definitely not ready for prime time. Oculus Quest hand tracking v1.0 was already better than this when it released 2 years ago, and it’s come a long way since… Step up Apple! This isn’t even MVP level here 😬 https://t.co/WGC8FN5huR (June 7, 2023)
If you watch the video, you’ll see how far the AVP’s actual tracking lags behind the arm movements of the person in the video. Thumb tracking, in particular, trails far behind even extremely slow rotations of the person’s arms.
Meanwhile, in my Rogue Ascent video above, I’m waving my arms around without even thinking about speed, which highlights at least one weakness in Apple’s current hardware.
All of Apple’s demos rely primarily on your eyes looking at a virtual object – which, by all accounts, is the best eye tracking any headset has to date – and a simple “pinch” to confirm your selection.
While this feels “magical” by usual Apple standards (a superlative used by many people who wore the headset at or before WWDC), Apple seems to rely heavily on its eye-tracking technology rather than hand tracking.
And while some demos show reasonably smart hand tracking recognition, they mostly rely on the user holding their arm still, to let a butterfly land on a finger, for example.
Apple leans heavily on its excellent eye-tracking technology in its UI, but we expect hand tracking to improve by the time the headset debuts.
But I have little doubt that Apple will continue to improve this experience over time. The headset has at least half a year until it’s actually available for purchase, which means plenty of time for behind-the-scenes tech to mature.
Apple does world anchors better than any XR company on the market right now, even Meta, because it’s been building AR tech into the iPhone for a decade or more. This tells me that any limitations we’re seeing exist because hand tracking doesn’t have the iPhone’s long history to build on, not because Apple is somehow incompetent.
Rec Room VR footage in Apple Vision Pro! Great things! Thoughts? https://t.co/FWwc0gxk1M pic.twitter.com/xshO05usrO (June 7, 2023)
Also, developers have confirmed that both Unreal Engine and Unity support the AVP, and games like Rec Room and Fruit Ninja have already confirmed that AVP versions of those popular titles are on the way. You can see Rec Room running on an Apple Vision Pro in the tweet above, and while it looks smooth, some have pointed out that the fidelity looks worse than similar hand-tracked games on Quest.
Regardless of initial quality, having more developers on board who can share experiences across devices is great news for the future of XR, even if Apple only calls its headset a “spatial computing device.”