Add WebXR Tracked Hands API #2316
Conversation
@Maksims I can't get hands to show on Oculus Quest in your webxr hands project. I have enabled #webxr-hands and tried all the options for #webxr-hands-tracking, but no luck. I see two cubes for hands but no digits. Am I doing something wrong? The Oculus Chrome browser is version 84.0.4147.111.
You probably need to enable auto hand detection on the Oculus Quest (Settings > Device > Hand & Controllers > Auto-Enable Hands or Controllers). Also enable #webxr-hands, but disable #webxr-hands-tracking (as it is emulation). If you don't want to enable auto-detection for hands, then you need to bring up the Oculus panel, enable hands in Quick Settings, and then use your hands to resume the previously launched XR session. Launching an XR session using hands in the browser is currently (07.08.2020) not allowed on Oculus Quest, and will bring up a popup asking you to pick up a controller and do it with that instead.
Cool, thanks. I've done as you suggested: I enabled hand control in the Quest settings, and hand control now works in your project. However, the hands are still rendered as single cubes. Can you think of anything else I'm doing wrong?
Also, I have 4 possible options for #webxr-hands-tracking; which is the correct one? I assume...
That is correct: the simple example renders hands as boxes.
Disabled. That is an old option that was used to emulate controllers via the Quest's optical hand tracking, but it is deprecated now that we have the full tracking API. So set this particular setting to Disabled.
Unfortunately, launching https://launch.playcanvas.com/961791?use_local_engine=https://code.playcanvas.com/playcanvas-latest.js in the Oculus browser doesn't work for me at all; it just displays 'webxr not available'.
...however, the engine test does work. As an aside, I can imagine that headbutting a beehive is probably more fun than testing anything in the Oculus browser.
You can remote debug the Quest just like any other Android device, using a USB cable, adb and chrome://inspect. Weird that it does not work for you; the link you provided works well on my device. Maybe you've updated your browser/device and a restart is required?
Yes, I should take the time to set up remote debugging. I found the problem: I had accidentally typed http instead of https. It's working now, blue fingertips and all. Thanks for the help @Maksims!
...and congrats on a great new addition to the engine!
This PR adds support for tracked hands, which was recently enabled in the Oculus Browser and is available on Oculus Quest under the
chrome://flags/#webxr-hands-tracking
flag. Bear in mind that the Hand Tracking API is still experimental: browser vendors do not yet consider it production-ready, and it is subject to change.

To use this new API, the developer does not need to enable anything or create any instances. Simply check whether an input source is a hand, then iterate through its data and use it in any way the application requires.
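A minimal sketch of that check, assuming a pc.Application in app and the inputSource.hand property this PR describes (an illustration, not code taken from the PR):

```javascript
// Sketch: react to tracked hands as they appear, assuming `app` is a
// pc.Application with an active XR session.
app.xr.input.on('add', function (inputSource) {
    if (!inputSource.hand) return; // not a hand (e.g. a regular controller)

    app.on('update', function () {
        inputSource.hand.joints.forEach(function (joint) {
            var position = joint.getPosition(); // world-space joint position
            // ... use the joint data in any way the application requires
        });
    });
});
```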
In order to stay truly cross-XR-platform, we need to ensure that basic XR input source data, such as the select event and a ray, are available even for hands. Developing apps against this data should then be simple and straightforward, without having to think about different input source types.

Ray Input
The underlying spec does not implement a ray for the hand input type (it currently reports the input source as gaze, which is of course not right), so we emulate a ray using some of the joints. This is an approximation, and it touches a big topic in hand tracking: how to define the direction in which a hand is pointing. Oculus Quest applies strong smoothing to its ray; we don't, which leads to slightly more active ray changes.
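Because of this emulation, the existing XrInputSource ray accessors should work unchanged for hands. A minimal sketch, assuming a pc.Application in app:

```javascript
// Sketch: consume the (emulated) ray of every input source, hands included.
var ray = new pc.Ray();
app.on('update', function () {
    app.xr.input.inputSources.forEach(function (inputSource) {
        ray.origin.copy(inputSource.getOrigin());
        ray.direction.copy(inputSource.getDirection());
        // e.g. test the ray against scene geometry:
        // if (someBoundingBox.intersectsRay(ray)) { ... }
    });
});
```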
Select events

The simplest interaction is the select event: for gaze it is a tap on the screen, for an AR session a touch within the screen, and for handheld devices it is the trigger button. In the Oculus Quest menu, touching the thumb tip with the index tip triggers a select event, while HoloLens 2, I believe, handles this somewhat differently. Our implementation follows Oculus Quest: touching the tips of the thumb and the index finger together triggers select events.
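So listening for pinches needs no hand-specific code; the regular select event is enough. A sketch, again assuming a pc.Application in app:

```javascript
// Sketch: on Quest, pinching thumb and index tips fires an ordinary
// select event, just like a trigger press on a controller.
app.xr.input.on('select', function (inputSource) {
    if (inputSource.hand) {
        console.log('select (pinch) from a tracked hand');
    }
});
```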
New APIs:

Video of testing application: https://twitter.com/mrmaxm/status/1287393803066904576
Usage Example
Creating hand model where each joint is a box:
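A minimal sketch of what this might look like, assuming a pc.Application in app and the hand/joint API this PR adds:

```javascript
// Sketch: create a box entity for every joint of a newly added hand.
var joints = []; // { joint, entity } pairs for the update loop below
app.xr.input.on('add', function (inputSource) {
    if (!inputSource.hand) return;

    inputSource.hand.joints.forEach(function (joint) {
        var entity = new pc.Entity();
        entity.addComponent('model', { type: 'box' });
        app.root.addChild(entity);
        joints.push({ joint: joint, entity: entity });
    });
});
```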
Updating joints on each update:
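Continuing the sketch above, each frame copies the joint transforms onto the boxes (joint.radius is assumed to be the joint radius reported by tracking):

```javascript
// Sketch: mirror joint transforms onto the box entities every frame.
app.on('update', function () {
    joints.forEach(function (item) {
        item.entity.setPosition(item.joint.getPosition());
        item.entity.setRotation(item.joint.getRotation());
        var size = item.joint.radius * 2; // scale box to the tracked joint size
        item.entity.setLocalScale(size, size, size);
    });
});
```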
Finding out if thumb and index tips are touching:
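A rough pinch test can compare the distance between the two tips against their combined radii. A sketch, assuming inputSource is a hand input source (e.g. from the 'add' handler above) and that fingers are ordered thumb-first:

```javascript
// Sketch: consider thumb and index tips "touching" when closer than
// their combined radii, with some tolerance.
var thumbTip = inputSource.hand.fingers[0].tip;
var indexTip = inputSource.hand.fingers[1].tip;
if (thumbTip && indexTip) {
    var distance = thumbTip.getPosition().distance(indexTip.getPosition());
    if (distance <= (thumbTip.radius + indexTip.radius) * 1.5) {
        console.log('thumb and index tips are touching');
    }
}
```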
Project for tests: https://playcanvas.com/project/705931/overview/webxr-hands
I confirm I have signed the Contributor License Agreement.