Quest's "Body Tracking" API is just an estimate, without legs

Quest’s “Body Tracking” API is not what it sounds like.

The Body Tracking API was released on Thursday as part of Movement SDK, which also includes Eye Tracking API and Face Tracking API for Quest Pro.

The official Oculus developer Twitter account announced the release with an illustration from the documentation showing a user’s full body pose being tracked. This was widely shared – leading many to believe that Quest just got body tracking – but the API’s name and artwork are misleading.

Meta’s hand tracking API provides the actual position of your hands and fingers, tracked by outward-facing cameras. Its Eye Tracking API and Face Tracking API provide the actual direction of your gaze and your facial muscle movements, tracked by Quest Pro’s inward-facing cameras. But the “Body Tracking” API only provides a “simulated upper body skeleton” based on your head and hand positions, a Meta spokesperson confirmed to UploadVR. This is not real tracking, and it does not include your legs.

A better name for the API would be Body Pose Estimation. The spokesperson described the technology as a combination of inverse kinematics (IK) and machine learning (ML). IK refers to a class of equations for estimating the unknown positions of parts of a skeleton (or robot) based on the known positions. These equations power the full-body VR avatars in apps today. Developers don’t need to implement (or even understand) the math behind IK, because game engines like Unity and Unreal have IK built in, and packages like the popular Final IK offer full implementations for less than $100.
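To make the idea concrete, here is a minimal sketch of analytic two-bone IK in 2D – the kind of per-limb calculation engines perform to place an elbow given a shoulder and a known wrist (hand controller) position. This simplified example is ours for illustration; it is not Meta’s, Unity’s, or Final IK’s code, and real solvers work in 3D with extra constraints.

```python
import math

def two_bone_ik(target_x, target_y, l1=0.30, l2=0.28):
    """Analytic two-bone IK in 2D (shoulder -> elbow -> wrist).

    Given a target for the end effector (the wrist), returns the
    shoulder and elbow angles (radians) that place the wrist there.
    Bone lengths default to rough human arm segments in metres.
    """
    d = math.hypot(target_x, target_y)
    # Clamp to the reachable range so the triangle always closes.
    d = max(min(d, l1 + l2), abs(l1 - l2))

    # Law of cosines gives the interior angles of the triangle
    # formed by the two bones and the shoulder->target line.
    cos_elbow = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))

    cos_shoulder = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    shoulder = math.atan2(target_y, target_x) - math.acos(
        max(-1.0, min(1.0, cos_shoulder)))
    return shoulder, elbow

def forward(shoulder, elbow, l1=0.30, l2=0.28):
    """Forward kinematics: joint angles -> elbow and wrist positions."""
    ex = l1 * math.cos(shoulder)
    ey = l1 * math.sin(shoulder)
    wx = ex + l2 * math.cos(shoulder + elbow)
    wy = ey + l2 * math.sin(shoulder + elbow)
    return (ex, ey), (wx, wy)
```

Note that for most reachable targets there are two valid elbow solutions (bent up or down) – a hint of the ambiguity problem discussed below.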

Unless you’re using body tracking hardware such as HTC’s Vive Trackers, IK for VR tends to be inaccurate – any given set of head and hand positions can correspond to many possible body poses. Meta’s pitch is that its machine learning model can produce a more accurate body pose for free. The demo video seems to support this claim, although without the lower half of the body – and with support limited to Quest headsets – many developers may not take Meta up on the offer.

However, clues given at Meta’s Connect 2022 event and company research suggest that legs will be added in the future.

During a developer session, Vibhor Saxena, Product Manager for Body Tracking, said:

“Further body tracking enhancements in the coming years will be available through the same API, so you can be sure that you will continue to get the best body tracking technology from Meta without having to switch to a different interface.

“We are excited to bring these features to you and are working hard to improve body tracking for years to come.”

During the keynote, Mark Zuckerberg announced that Meta Avatars are getting legs – with an equally misleading demonstration. Legs will come to Horizon later this year, then to the Meta Avatars SDK for other apps next year. Saxena confirmed that the Body Tracking API leverages the same underlying technology that powers Meta Avatars – which suggests the API will get legs too.

You might be wondering: if the Body Tracking API is just an estimate based on head and hand positions, how could it incorporate legs? Last month, Meta presented research on exactly that, taking advantage of recent advances in machine learning. The system shown, however, is not entirely accurate and has 160ms latency – over 11 frames at 72Hz. That timing is too slow, and the output too imperfect, for you to look down and see your own legs where you expect them to be. Comments from Meta’s CTO suggest the company could use tech like this to show legs on other people’s avatars instead:
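The frame-count figure is simple arithmetic – latency in seconds times the display’s refresh rate:

```python
latency_s = 0.160    # reported latency of the research system
refresh_hz = 72      # Quest's default display refresh rate

frames_of_latency = latency_s * refresh_hz
print(frames_of_latency)  # 11.52 -> "over 11 frames" behind
```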

“Having legs on your own avatar that don’t match your real legs is very confusing to people. But of course you can put legs on other people that you see, and you don’t mind at all.

“So we’re working on legs that look natural to an onlooker – because they don’t know how your real legs are actually positioned – but you’ll probably still see nothing when looking down at your own legs. This is our current strategy.”

As we noted at the time, the shipping solution might not match the quality of this research. Machine learning papers often describe systems running on powerful PC GPUs at relatively low frame rates, and the paper does not mention the runtime performance of the described system.

