
Add WebXR Tracked Hands API #2316

Merged 7 commits into playcanvas:master on Aug 6, 2020

Conversation

@Maksims Maksims commented Jul 26, 2020

This PR adds support for tracked hands, which have recently been enabled in the Oculus Browser and are available on the Oculus Quest behind the chrome://flags/#webxr-hands-tracking flag. Bear in mind that the Hand Tracking API is still experimental, so it is not yet production-ready from the browser vendors' point of view, and it is subject to change.

Nothing is required from the developer to enable this new API or to create instances. Simply check whether an input source is a hand, then iterate through its data and use it in whatever way the application requires.

To stay truly cross-platform in XR, we need to ensure that basic input source data, such as the select event and a ray, is available even for hands. Developing apps that use this data should then be simple and straightforward, without special-casing different input source types.

Ray Input

As the underlying spec does not provide a ray for the hand input type (it currently reports the input source as gaze, which is not right, of course), we emulate a ray using some of the joints. This is an approximation, and how to define the direction in which a hand is pointing is, of course, a big topic in hand tracking. The Oculus Quest applies strong smoothing to its ray; we don't, which leads to a slightly more jittery ray.
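As a rough illustration of the idea (not the engine's exact implementation, and the specific joints used are illustrative), a pointing ray can be approximated from the world positions of two joints. Plain objects stand in for pc.Vec3 so the math is self-contained:

```javascript
// Hedged sketch: approximate a pointing ray from two joint positions,
// e.g. from the wrist towards the base of the index finger.
function approximateRay(origin, target) {
    var dx = target.x - origin.x;
    var dy = target.y - origin.y;
    var dz = target.z - origin.z;
    var len = Math.sqrt(dx * dx + dy * dy + dz * dz) || 1; // avoid divide-by-zero
    return {
        origin: origin,
        direction: { x: dx / len, y: dy / len, z: dz / len } // unit vector
    };
}

var ray = approximateRay({ x: 0, y: 1, z: 0 }, { x: 0, y: 1, z: -0.1 });
// ray.direction is (0, 0, -1): pointing straight forward
```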

Select events

The simplest interactions are select events: for gaze it is a tap on the screen, for an AR session a touch within the screen, and for handheld devices it is the trigger button. In the Oculus Quest menu, touching the thumb tip to the index tip triggers a select event, while HoloLens 2, I believe, differs somewhat. Our implementation follows the Oculus Quest: touching the tips of the thumb and index finger together triggers select events.
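The behavior described above can be sketched in plain JavaScript. This is a minimal stand-in, not the engine's event system; the 15 mm threshold mirrors the usage example further down:

```javascript
// Hedged sketch: fire a "select" callback on the rising edge of a pinch,
// i.e. when the thumb and index tips first come within a threshold.
var SELECT_DISTANCE = 0.015; // metres (15 mm)

function distance(a, b) {
    var dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

function updateSelect(state, thumbTip, indexTip, onSelect) {
    var touching = distance(thumbTip, indexTip) < SELECT_DISTANCE;
    if (touching && !state.selecting) onSelect(); // rising edge only
    state.selecting = touching;
}

var state = { selecting: false };
var fired = 0;
updateSelect(state, { x: 0, y: 0, z: 0 }, { x: 0.01, y: 0, z: 0 }, function () { fired++; });
// fired === 1: tips are 10 mm apart, within the 15 mm threshold
```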

New APIs:

// pc.XrInputSource
inputSource.hand // points to a pc.XrHand if the input source is a tracked hand; otherwise null

// pc.XrHand
hand.fingers // list of fingers
hand.joints // list of joints
hand.tips // list of finger tip joints
hand.wrist // wrist joint
hand.tracking // true if hand is tracked
hand.getJointById(id) // returns XrJoint or null by XRHand.* id from spec: https://immersive-web.github.io/webxr-hand-input/#xrhand-interface
hand.on('tracking', function() { }); // fired when tracking becomes available
hand.on('trackinglost', function() { }); // fired when tracking of a hand is lost

// pc.XrFinger
finger.index // index of a finger, in order of: thumb, index, middle, ring, little
finger.hand // hand finger is related to
finger.joints // list of joints of a finger, in order of root of finger to the tip
finger.tip // joint which is a tip of a finger, or null if not available

// pc.XrJoint
joint.index // index of a joint, starts from root of a finger, all the way to the tip
joint.hand // hand which joint is related to
joint.finger // finger which joint is related to, or null (in case of a wrist)
joint.wrist // true if joint is a wrist
joint.tip // true if joint is a tip of a finger
joint.radius // radius of a joint, which is a distance from joint to skin
joint.getPosition() // returns world position of a joint
joint.getRotation() // returns world rotation of a joint
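A minimal sketch of traversing the structures listed above. A plain stub stands in for the real pc.XrHand object so the traversal logic is self-contained:

```javascript
// Hedged sketch: collect the available finger-tip joints from a hand
// shaped like the pc.XrHand API listed above.
function trackedTips(hand) {
    var tips = [];
    if (!hand || !hand.tracking) return tips; // nothing useful while untracked
    for (var i = 0; i < hand.fingers.length; i++) {
        if (hand.fingers[i].tip) tips.push(hand.fingers[i].tip);
    }
    return tips;
}

var stubHand = {
    tracking: true,
    fingers: [
        { tip: { radius: 0.008 } },
        { tip: null } // a tip may be null if not available
    ]
};
// trackedTips(stubHand) returns one joint
```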

Video of testing application: https://twitter.com/mrmaxm/status/1287393803066904576

Usage Example

Creating hand model where each joint is a box:

var joints = [ ];
var hand = inputSource.hand;
for (var i = 0; i < hand.joints.length; i++) {
    var entity = new pc.Entity();
    entity.joint = hand.joints[i]; // remember which joint drives this entity
    entity.addComponent('model', { type: 'box' });
    parent.addChild(entity); // parent: some entity in the scene hierarchy
    joints.push(entity);
}

Updating joints on each update:

for (var i = 0; i < joints.length; i++) {
    var entity = joints[i];
    var joint = entity.joint;
    var size = joint.radius * 2; // diameter of the joint
    entity.setLocalScale(size, size, size);
    entity.setPosition(joint.getPosition());
    entity.setRotation(joint.getRotation());
}

Finding out if thumb and index tips are touching:

var positionThumb = hand.fingers[0].tip.getPosition();
var positionIndex = hand.fingers[1].tip.getPosition();

if (positionThumb.distance(positionIndex) < 0.015) { // 15 mm
    // thumb and index tips are touching
}
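A single cutoff like this can flicker when the measured distance hovers right around the threshold. A hedged refinement is to use two thresholds (hysteresis); the 15 mm / 25 mm values here are illustrative, not from the PR:

```javascript
// Hedged refinement: enter "touching" below one threshold, but only
// leave it above a larger one, so the state does not flicker.
var PINCH_START = 0.015; // enter "touching" below 15 mm
var PINCH_END = 0.025;   // leave "touching" above 25 mm

function updatePinch(touching, dist) {
    if (!touching) return dist < PINCH_START;
    return dist <= PINCH_END;
}

var touching = false;
touching = updatePinch(touching, 0.010); // 10 mm -> true
touching = updatePinch(touching, 0.020); // 20 mm -> still true (within 25 mm)
touching = updatePinch(touching, 0.030); // 30 mm -> false
```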

Project for tests: https://playcanvas.com/project/705931/overview/webxr-hands

I confirm I have signed the Contributor License Agreement.

@willeastcott willeastcott self-requested a review August 6, 2020 11:35
@willeastcott willeastcott merged commit a923052 into playcanvas:master Aug 6, 2020
@Maksims Maksims deleted the webxr-hands branch August 6, 2020 12:10
@slimbuck slimbuck mentioned this pull request Aug 6, 2020
slimbuck commented Aug 7, 2020

@Maksims I can't get hands to show on Oculus Quest in your webxr hands project.

I have enabled #webxr-hands and tried all the options for #webxr-hands-tracking, but no luck. I see two cubes for hands but no digits. Am I doing something wrong?

Oculus chrome browser is version 84.0.4147.111.

Maksims commented Aug 7, 2020

You probably need to enable automatic hand detection on the Oculus Quest (Settings > Device > Hand & Controllers > Auto-Enable Hands or Controllers). Also enable #webxr-hands, but disable #webxr-hands-tracking (as it is emulation).
Then launch the app and put your controllers on a hard surface (so they become stationary; moving controllers prevent the Quest's hand-tracking auto-detection). Then move your hands in front of you, and it should pick them up just like in the system menus.

If you don't want to enable auto-detection for hands, you need to open the Oculus panel, enable hands in Quick Settings, and then resume the previously launched XR session using your hands.

Launching an XR session using hands in the browser is currently (07.08.2020) not allowed on the Oculus Quest, and will bring up a popup basically asking you to pick up a controller and do it with that.

slimbuck commented Aug 7, 2020

Cool, thanks. I've done as you suggested. I enabled hand control in the Quest settings, and hand control now works in your project; however, the hands are still rendered as single cubes. Can you think of anything else I'm doing wrong?

slimbuck commented Aug 7, 2020

Also, I have 4 possible options for #webxr-hands-tracking. Which is the correct one? I assume Hands and pointers?

Maksims commented Aug 7, 2020

> Cool, thanks. I've done as you suggested. I enabled hand control in the Quest settings, and hand control now works in your project; however, the hands are still rendered as single cubes. Can you think of anything else I'm doing wrong?

That is correct: the simple example renders the hands as boxes.
I'm currently working with an artist on a skinned model, which will be added as an example later on.

> Also, I have 4 possible options for #webxr-hands-tracking. Which is the correct one? I assume Hands and pointers?

Disable it. This is an old option that was used to emulate controllers using the Quest's optical hand tracking, but it is deprecated now that we have the full tracking API. So set this particular setting to Disabled.

slimbuck commented Aug 7, 2020

Unfortunately, launching https://launch.playcanvas.com/961791?use_local_engine=https://code.playcanvas.com/playcanvas-latest.js in the Oculus Browser doesn't work for me at all; it just displays 'WebXR not available'.

slimbuck commented Aug 7, 2020

...however the engine test does work.

As an aside, I can imagine that head butting a beehive is probably more fun than testing anything in the oculus browser.

Maksims commented Aug 7, 2020

You can remote-debug the Quest just like any other Android device, using a USB cable, adb and chrome://inspect.
You can also debug over Wi-Fi with chrome://inspect, but it requires plugging in over USB first and then switching to Wi-Fi debugging.
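The steps above can be sketched roughly as follows (standard adb usage; `<headset-ip>` is a placeholder for your device's address, and the exact flow may vary with your setup):

```shell
# Wired debugging: enable developer mode on the Quest, then
adb devices                 # confirm the headset is listed
# open chrome://inspect on the desktop and inspect the Oculus Browser tab

# Optional Wi-Fi debugging: start over USB, then switch
adb tcpip 5555              # restart adb on the device in TCP mode
adb connect <headset-ip>:5555
```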

Weird that it does not work for you; the link you've provided works well on my device. Maybe you've updated your browser/device? Perhaps a restart is required?

slimbuck commented Aug 7, 2020

Yes, I should take the time to set up remote debugging.

I found the problem: I had accidentally typed http instead of https. It's working now, blue fingertips and all.

Thanks for the help @Maksims!

slimbuck commented Aug 7, 2020

...and congrats on a great new addition to the engine!

Labels
area: xr XR related issue
3 participants