diff --git a/docs/mocap/face-tracking.md b/docs/mocap/face-tracking.md
index 29954a5..3e84806 100644
--- a/docs/mocap/face-tracking.md
+++ b/docs/mocap/face-tracking.md
@@ -2,21 +2,21 @@
sidebar_position: 20
---

-# Customizing Face Tracking
+# 面部追踪

-## Blueprint Customization
+## 蓝图自定义

-During the onboarding process, you can click **Customize Face Tracking...** to customize the tracking blueprint.
+在开播准备环节,您可以点击 **面部追踪进阶设置…** 来对蓝图进行定制。

![](/doc-img/en-mocap-3.png)

Customizing face tracking.

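+在查看下方的选项之前,可以先通过一个极简的示意理解「BlendShape 映射」大致在做什么:它把追踪端输出的表情参数名对应到模型自带的 BlendShape 名称上。以下 Python 片段仅为概念演示(映射表与函数均为假设,并非 Warudo 的实际实现):
+
+```python
+# 示意:把 ARKit 风格的表情参数名映射到 VRM 风格的 BlendShape 名称
+ARKIT_TO_VRM = {
+    "eyeBlinkLeft": "Blink_L",   # 左眼眨眼
+    "eyeBlinkRight": "Blink_R",  # 右眼眨眼
+    "jawOpen": "A",              # 张嘴近似对应到 "A" 口型
+    "mouthSmileLeft": "Joy",     # 微笑近似对应到 "Joy" 表情
+}
+
+def remap(tracking_values: dict) -> dict:
+    """把追踪数据中的参数名换成模型能识别的 BlendShape 名称,数值保持不变。"""
+    return {ARKIT_TO_VRM[k]: v for k, v in tracking_values.items() if k in ARKIT_TO_VRM}
+
+print(remap({"jawOpen": 0.6, "eyeBlinkLeft": 1.0, "browInnerUp": 0.2}))
+```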
-The following options are available:
+目前我们提供以下几种选择:

-* **BlendShape Mapping:** Select the blendshape mapping that matches your model. For example, if your model has "Perfect Sync"/ARKit blendshapes, select **ARKit**; if your model is converted from a MMD model, select **MikuMikuDance**; otherwise, select **VRM**. By default, Warudo will try to automatically detect the blendshape mapping, but you can override it here.
-* **Enable Head and Body Movements:** If enabled, Warudo will move your character's head and body according to the tracking data. If disabled, only the face (blendshapes and eye bones) will be animated. This is automatically set to **No** by the onboarding assistant if you use a full-body pose tracking system that already tracks the head and body.
-* **Idle Head Animation (Auto Blinking / Auto Eye Movements / Auto Head Movements):** If enabled, Warudo will add subtle head motion, eye movement, and blinking to your character.
-* **Look At:** If enabled, your character will look at a target (default to the camera) when you look forward. This helps your character / you maintain eye contact with the audience regardless of the camera position, while still allowing your character / you to look around freely.
+* **BlendShape 映射:** 选择适合您模型的 BlendShape 映射。比如说,如果您的模型带有 "Perfect Sync"/ARKit BlendShape,请选择 **ARKit**;如果您的模型是由 MMD 模型转换而来,请选择 **MikuMikuDance**;其他情况请选择 **VRM**。Warudo 默认会尝试自动识别 BlendShape 映射,但您也可以在这里手动更改。
+* **启用头部和身体运动:** 此选项开启时,Warudo 会基于动捕数据同时运动模型的头部和身体;关闭时,则只有面部(BlendShape 和眼骨)会被驱动。如果您使用的全身动捕系统已经追踪了头部和身体,此选项会被开播助手自动设置为**关闭**。
+* **头部待机动画 (自动眨眼/自动眼部动作/自动摇头):** 在开启状态下,Warudo 会给角色增加微小的头部动作、眼部动作和眨眼。
+* **视线:** 此选项开启时,当您目视前方,您的角色视线会望向指定目标(默认指向摄像机)。这样可以帮助您在任何机位下保持与观众的眼神交流,同时仍允许您自由环顾四周。
* **Lip Sync**: If enabled, Warudo will animate your character's mouth based on your speech. You can choose to enable lip sync only when tracking is lost, or always enable it.

:::info
diff --git a/docs/mocap/leap-motion.md b/docs/mocap/leap-motion.md
index f2d99b8..d220154 100644
--- a/docs/mocap/leap-motion.md
+++ b/docs/mocap/leap-motion.md
@@ -2,44 +2,44 @@
sidebar_position: 70
---

-# Leap Motion Controller
+# Leap Motion 控制器

-Hand tracking via [Leap Motion Controller](https://leap2.ultraleap.com/leap-motion-controller-2/).
+使用 [Leap Motion 控制器](https://leap2.ultraleap.com/leap-motion-controller-2/)进行手部捕捉。

-## Setup
+## 初始设置

-We recommend using a Leap Motion Controller 2 for more accurate and stable tracking. The original Leap Motion Controller is also supported.
+为了更准确和稳定的手部捕捉,我们推荐您使用更新一代的 Leap Motion 控制器 2,但 Warudo 也支持使用初代 Leap Motion 控制器进行捕捉。

-To connect the Leap Motion Controller to Warudo, you need to download and install the latest Gemini software from [Leap Motion's website](https://leap2.ultraleap.com/gemini-downloads/).
+您需要下载并安装最新的 [Gemini 软件](https://leap2.ultraleap.com/gemini-downloads/),以将您的 Leap Motion 控制器连接到 Warudo。

-Warudo supports all 3 tracking modes offered by the Leap Motion Controller: **Desktop**, **Screen Top**, and **Chest Mounted**. We recommend using the **Chest Mounted** mode along with a [neck/chest mount](https://www.etsy.com/market/leap_motion_mounting) for the best experience.
+Warudo 支持 Leap Motion 控制器提供的所有三种捕捉模式:**Desktop(置于桌面上)**、**Screen Top(置于屏幕顶部)**,以及 **Chest Mounted(胸前佩戴)**。为了获得最佳体验,我们推荐使用 **Chest Mounted** 模式,并搭配一个[颈挂/胸挂支架](https://www.etsy.com/market/leap_motion_mounting)使用。

-## Calibration {#Calibration}
+## 校正 {#Calibration}

-There is generally no need to calibrate the Leap Motion Controller. However, you can adjust **Controller Position Offset**, **Controller Rotation Offset**, and **Controller Scale** in the **Leap Motion Controller** asset to adjust the tracking. A virtual Leap Motion Controller will be displayed in the scene to help you visualize the tracking.
+总体来讲,Leap Motion 控制器并不需要校正即可使用。但您可以在 **Leap Motion 控制器** 组件中调整**控制器位置偏移**、**控制器旋转偏移**以及**控制器缩放比率**来调整捕捉效果。场景中将会显示一个虚拟的 Leap Motion 控制器,以帮助您直观地校正动捕效果。

![](/doc-img/en-leapmotion-1.png)

-Adjusting the virtual Leap Motion Controller.
+图为调整 Leap Motion 控制器的示意图。

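+下面用一个极简的 Python 片段示意这几个校正参数大致如何作用在追踪到的手部坐标上(仅为概念演示,函数与数值均为假设,并非 Warudo 的实际实现;旋转偏移这里只示意绕竖直轴的情况):
+
+```python
+import math
+
+def apply_controller_transform(point, scale, yaw_degrees, position_offset):
+    """依次应用控制器缩放、旋转偏移(示意:仅绕 Y 轴)和位置偏移。"""
+    x, y, z = (c * scale for c in point)                  # 缩放:增大可让手离身体更远
+    a = math.radians(yaw_degrees)                         # 旋转偏移:调整传感器朝向
+    x, z = x * math.cos(a) + z * math.sin(a), -x * math.sin(a) + z * math.cos(a)
+    ox, oy, oz = position_offset                          # 位置偏移:+X 左、+Y 上、+Z 前
+    return (x + ox, y + oy, z + oz)
+
+# 假设追踪到的手部位置为 (0.1, 0.2, 0.3)(单位:米)
+print(apply_controller_transform((0.1, 0.2, 0.3), scale=1.2, yaw_degrees=10.0, position_offset=(0.0, 0.1, 0.05)))
+```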
-## Options
+## 选项

-* **Controller Position Offset**: The offset of the controller position. Positive X is a left offset, positive Y is an upward offset, and positive Z is a forward offset.
-* **Controller Rotation Offset**: The offset of the controller rotation.
-* **Controller Scale**: The scale of the controller. Increase this value if you want hands to be further away from the body. You can also enable **Per-Axis Scale** to scale the controller on each axis separately.
-* **Fix Finger Orientations Weight**: Due to different model specifications, the finger orientations of the Leap Motion Controller may not match the finger orientations of the model. This option allows you to adjust the finger orientations of the Leap Motion Controller to match the model. 0 means no adjustment, 1 means full adjustment. Adjust this value until the fingers look natural.
-* **Shoulder Rotation Weight**: How much the shoulders should be rotated. 0 means no rotation, 1 means full rotation. Adjust this value until the shoulders look natural.
+* **控制器位置偏移**: 对传感器的三维空间坐标位置进行调整:正数 X 值会将传感器位置向左侧调整,反之则向右侧调整;正数 Y 值会将传感器位置向上方调整,反之则向下方调整;正数 Z 值会将传感器位置向前方调整,反之则向后方调整。
+* **控制器旋转偏移**: 对虚拟 Leap Motion 控制器进行旋转,以调整传感器的角度。
+* **控制器缩放比率**: 控制器的缩放比例。如果您希望模型的手部离身体更远,可以增大该值。您也可以启用 **对每个轴单独缩放** 来分别调整每个轴上的缩放。
+* **修正手指朝向权重**: 由于每个人物模型的规格不同,Leap Motion 控制器中手指的朝向可能与模型中手指的朝向不一致,这个参数可以让您调整手指的朝向,以使模型显示的动作与传感器捕捉到的动作一致。0 代表不做任何调整,1 代表完全修正。请您持续调整,直至手指看上去自然。
+* **修正肩部朝向权重**: 这个参数决定肩部的旋转程度,0 代表不旋转,1 代表完全旋转。请您持续调整,直至肩部看上去自然。

-## Frequently Asked Questions {#FAQ}
+## 常见问题 {#FAQ}

-Please refer to [Overview](overview#FAQ) and [Customizing Pose Tracking](body-tracking#FAQ) for common questions.
+常见问题请参考[动作捕捉方案一览](overview#FAQ)以及[姿态追踪](body-tracking#FAQ)。

-### The Leap Motion Tracker asset says "Tracker not started."
+### **Leap Motion 追踪器** 组件显示 "Tracker not started."

-Please make sure you have installed the latest [Gemini software](https://leap2.ultraleap.com/gemini-downloads/), and it is running in the background.
+请确保您已下载并安装最新版本的 [Gemini 软件](https://leap2.ultraleap.com/gemini-downloads/),并且程序已在后台运行。

-### My model's wrist/fingers look unnatural.
+### 我的模型的手腕/手指看着很奇怪。

-Try adjusting the **Fix Finger Orientations Weight** option in the **Leap Motion Tracker** asset. You may also need to adjust the **Wrist Rotation Offset** and **Global Finger Rotation Offset** options (check the left boxes to enable them).
+请尝试在 **Leap Motion 追踪器** 组件中调整**修正手指朝向权重**选项。您可能也需要调整**手腕旋转偏移**与**全局手指旋转偏移**选项(勾选选项左侧的方框以启用它们)。

+}} />
\ No newline at end of file
diff --git a/docs/mocap/mocopi.md b/docs/mocap/mocopi.md
index e2cc29f..7d91583 100644
--- a/docs/mocap/mocopi.md
+++ b/docs/mocap/mocopi.md
@@ -2,43 +2,43 @@
sidebar_position: 80
---

-# Sony Mocopi
+# 索尼 Mocopi

-Body tracking via [Sony Mocopi](https://electronics.sony.com/more/mocopi/all-mocopi/p/qmss1-uscx).
+使用[索尼 Mocopi](https://electronics.sony.com/more/mocopi/all-mocopi/p/qmss1-uscx)进行身体动作捕捉。

-## Setup
+## 初始设置

-Follow the [official tutorial video](https://www.sony.com/electronics/support/articles/00298063) to set up Sony Mocopi.
+请先参照[索尼官方教程视频](https://www.sony.com/electronics/support/articles/00298063)来对 Mocopi 进行初始设置。

-Then, enter the settings page on the Mocopi app. In **External device connection settings**, select **mocopi (UDP)** for **Transfer format**, and enter your computer's IP address for **IP address**.
+在初始设置完成之后,进入 Mocopi 手机应用的设置页面,选中 **External device connection settings(外部设备连接设置)**,将 **Transfer format** 设置为 **mocopi (UDP)**,然后在 **IP address** 中输入您的计算机 IP 地址。

:::tip
-If you do not know your computer's IP, you can check on the configuration page of the **Mocopi Receiver** asset.
+如果您不知道电脑的 IP 地址,可以在 **Mocopi 接收器** 组件的设置页面中找到。

![](/doc-img/en-ifacialmocap-1.png)

-If multiple IPs are listed, you would need to try each one. Usually, the IP address assigned to your computer by your WiFi router starts with `192.168`. For example, in the above picture, you can first try `192.168.1.151`.
+如果出现了多个 IP,您将需要一个一个尝试。通常来说,您的路由器分配的 IP 会以 `192.168` 打头,比如在例图中的情况下,您可以先尝试 `192.168.1.151`。
:::

-Go back to the main interface, tap **Motion** on the menu bar, and tap the **SAVE** icon to switch to the **SEND** mode. Tap the green send button at the bottom to start sending tracking data.
+回到主界面,点击菜单栏中的 **Motion**,然后点击 **SAVE** 图标以切换到 **SEND** 模式。点击底部绿色的发送按钮来开始传送动捕数据。

![](/doc-img/en-mocopi-1.png)

-## Calibration
+## 校正

-Calibration of Sony Mocopi is done in the Mocopi app.
+索尼 Mocopi 的校正需要在 Mocopi 应用程序中完成。

-## Options
+## 选项

-* **Motion Buffering**: Whether to enable Mocopi's built-in motion buffering. This can usually be disabled because Warudo already smooths the motion data.
+* **动作缓冲**: 选择是否开启 Mocopi 自带的动作缓冲功能(该功能可以让动作较为平滑)。由于 Warudo 自身会对捕捉数据进行平滑处理,这个设置通常不需要开启。

-## Frequently Asked Questions
+## 常见问题

-Please refer to [Overview](overview#FAQ) and [Customizing Pose Tracking](body-tracking#FAQ) for common questions.
+常见问题请参考[动作捕捉方案一览](overview#FAQ)和[姿态追踪](body-tracking#FAQ)页面。

-### My tracking drifts over time.
+### 我的动捕结果会随时间漂移

-This is a common issue with inertial motion capture systems, which drift over time due to accumulated errors. To reduce the drift, make sure there are no magnetic or electrical interference near the tracking sensors.
+惯性动捕系统的误差会随时间累积,因此漂移是此类系统的常见问题。请确保动捕传感器附近没有磁场或电器干扰,以减少漂移。
+
diff --git a/docs/mocap/overview.md b/docs/mocap/overview.md
index a847ebd..6757afc 100644
--- a/docs/mocap/overview.md
+++ b/docs/mocap/overview.md
@@ -2,86 +2,85 @@
sidebar_position: 10
---

-# Overview
+# 动作捕捉方案一览

![](/doc-img/mocap-cover.jpg)

-Whether you are streaming at home or at a professional mocap studio, Warudo has you covered. Currently, Warudo supports the following motion capture systems:
+不论您是在家中直播还是身处专业动捕工作室,Warudo 都能满足您的需求。目前 Warudo 支持以下动捕系统:

-* Webcam
-  * Built-in tracking via [MediaPipe](./mediapipe) or [OpenSeeFace](./openseeface).
+* 摄像头
+  * 使用本软件自带的 [MediaPipe](./mediapipe) 或者 [OpenSeeFace](./openseeface) 功能即可进行动作捕捉。
* iPhone
-  * Requires either [iFacialMocap / FaceMotion3D](./ifacialmocap) or [RhyLive](./rhylive) app to be installed on the iPhone.
+  * 在 iPhone 上安装 [iFacialMocap / FaceMotion3D](./ifacialmocap) 或 [RhyLive](./rhylive) 软件即可使用。
* [SteamVR] (./steamvr)
-* [Leap Motion Controller](./leap-motion)
-* [Sony Mocopi](./mocopi)
+* [Leap Motion 控制器](./leap-motion)
+* [索尼 Mocopi](./mocopi)
* [Rokoko](./rokoko)
* [Xsens MVN](./xsens-mvn)
-* [Virdyn Studio (VDMocapStudio)](./virdyn)
-* [Noitom Axis](./noitom)
+* [Virdyn(虚拟动力)Studio (VDMocapStudio)](./virdyn)
+* [诺亦腾 Axis](./noitom)
* [StretchSense Glove](./stretchsense)
-* External applications that support the [VMC protocol](./vmc), e.g., [VSeeFace](https://www.vseeface.icu/), [VirtualMotionCapture](https://vmc.info/)
-  * VR trackers are supported using the paid (supporter) version of [VirtualMotionCapture](https://www.patreon.com/sh_akira).
+* 支持 [VMC protocol](./vmc) 的第三方软件,例如 [VSeeFace](https://www.vseeface.icu/)、[VirtualMotionCapture](https://vmc.info/)
+  * 付费(赞助性质)版本的 [VirtualMotionCapture](https://www.patreon.com/sh_akira) 可支持 VR 追踪器。

-[Warudo Pro](../pro.md) also supports the following motion capture systems:
+[Warudo Pro](../pro.md) 可以额外支持以下动捕系统:

-* Any optical tracking system compatible with [Autodesk MotionBuilder](./motionbuilder), e.g., [Vicon](https://www.vicon.com/), [OptiTrack](https://optitrack.com/)
+* 任何兼容 [Autodesk MotionBuilder](./motionbuilder) 软件的光学动捕系统,例如 [Vicon](https://www.vicon.com/)、[OptiTrack](https://optitrack.com/)
* [OptiTrack Motive](./optitrack)
-* [Chingmu Avatar](./chingmu)
+* [青瞳 Avatar](./chingmu)

-## What motion capture systems should I use?
+## 我应该使用哪种动捕系统?

-If you are completely new to 3D VTubing, we recommend starting with [MediaPipe](./mediapipe) for both face tracking and hand tracking. MediaPipe is a motion capture system built into Warudo, and you only need a webcam to use it. If you happen to have an iPhone, we recommend using the [iFacialMocap](./ifacialmocap) app for face tracking instead, as it offers much higher accuracy than MediaPipe, and offloading face tracking to the iPhone also reduces the load on your computer.
+如果您是初次接触 3D 直播,我们建议您先使用 Warudo 内置的 [MediaPipe](./mediapipe) 进行面部和手部动捕,只需要一个电脑摄像头即可使用。如果您有 iPhone,我们推荐您改用 [iFacialMocap](./ifacialmocap) 进行精度远高于 MediaPipe 的面部动捕,把面部动捕交给 iPhone 也可以降低您电脑的负载。

-If you are looking to improve your tracking quality, here are a few suggestions:
+如果您想要提高您的动捕质量,我们有以下建议:

-* Use an iPhone for face tracking (via the [iFacialMocap](./ifacialmocap) app). This is also the best face tracking solution you can ever get. If you don't have an iPhone, you may consider looking into a used iPhone. As long as it is compatible with Face ID, it should work with iFacialMocap.
+* 使用 iPhone 上的 [iFacialMocap](./ifacialmocap) 应用进行面部动捕,这也是目前最好的面部动捕方案。如果您没有 iPhone,可以考虑购买一台二手 iPhone,只要它支持 Face ID 即可使用此应用。

-:::tip
-For best tracking quality, we recommend using an iPhone 12 or newer (iPhone mini also works). Older iPhones may have lower tracking quality.
+:::tip 提示
+为了更好的捕捉效果,我们推荐您使用 iPhone 12 或更新的 iPhone 型号(iPhone mini 也可以),较早期的 iPhone 型号可能会影响捕捉效果。
:::

-* If you find MediaPipe hand tracking not accurate enough, you may consider using a [Leap Motion Controller](./leap-motion) for hand tracking. You can use a [neck mount](https://www.etsy.com/market/leap_motion_mounting) to wear it on your neck, so that it can track your hands.
-* If you are looking for a full-body tracking solution, you may consider using [Sony Mocopi](./mocopi) or [VR trackers](./vmc.md). If you have the budget, you may also consider using a mocap suit, such as [Virdyn VDSuit](./virdyn), [Noitom Perception Neuron](./noitom), [Rokoko Smartsuit Pro](./rokoko), or [Xsens MVN Link](./xsens-mvn). You can add finger tracking to these mocap suits by using [StretchSense gloves](./stretchsense).
-* If you are looking for a more professional solution, you may consider using an optical tracking system, such as [Vicon](https://www.vicon.com/), [OptiTrack](https://optitrack.com/), or [Chingmu](https://www.chingmu.com/).
+* 如果您觉得 MediaPipe 的手部捕捉不够精确,可以考虑使用 [Leap Motion 控制器](./leap-motion) 进行手部动捕。您可以使用一个[颈挂](https://www.etsy.com/market/leap_motion_mounting)将其戴在颈前,以便追踪您的手部动作。
+* 如果您需要全身动捕方案,您可以考虑使用[索尼 Mocopi](./mocopi)或者 [VR 追踪器](./vmc.md)。如果您有足够预算,您也可以考虑购买动捕服,例如 [Virdyn(虚拟动力)VDSuit](./virdyn)、[诺亦腾 Perception Neuron](./noitom)、[Rokoko Smartsuit Pro](./rokoko) 或者 [Xsens MVN Link](./xsens-mvn)。您还可以使用 [StretchSense 手套](./stretchsense)为这些动捕服增加手指动捕功能。
+* 如果您需要更专业的解决方案,您也可以使用光学动捕系统,常见的方案有 [Vicon](https://www.vicon.com/)、[OptiTrack](https://optitrack.com/) 和[青瞳](https://www.chingmu.com/)。

-## Setup {#setup}
+## 初始设置 {#setup}

-There are two ways to set up motion capture in Warudo. The first and the preferred way is to use the **Onboarding Assistant** asset that comes with every scene. Simply click **Basic Setup → Get Started** to start the onboarding process and follow the instructions.
+在 Warudo 中有两种方法进行动捕初始设置。第一种也是推荐的方法,是使用每个场景都自带的**开播助手**组件。只需点击 **Basic Setup → Get Started** 开始开播流程,然后按照指示操作即可。

![](/doc-img/en-getting-started-2.png)

-Setting up motion capture in the onboarding assistant asset.
+在开播助手组件中设置动作捕捉。

-After the onboarding process is complete, you can select the relevant motion capture assets to further customize your tracking. For example, if you are using iFacialMocap, you may notice Warudo adds some movement to your character's body when your head moves. If this is not desirable, you can set **Body Movement Intensity** to 0 in the **iFacialMocap Receiver** asset. +在开播流程完毕后,您可以使用相关的动捕组件来客制化您的动捕效果。比如说,如果您正在使用 iFacialMocap,您可能会注意到 Warudo 会在您移动头部时增加一些躯干动作。如果您不需要,可以在 **iFacialMocap Receiver** 组件中将 **Body Movement Intensity** 设置为0. ![](/doc-img/en-mocap-1.png) -

Adjusting the body movement intensity of iFacialMocap.

+

在 iFacialMocap 中调整 Body Movement Intensity

-The second way to set up motion capture is to use **Character → Setup Motion Capture**. This allows you to set up face tracking and/or pose tracking directly. However, some checking steps in the onboarding process are also skipped.
+另一种设置动捕的方式是使用 **Character → Setup Motion Capture**,这样能让您直接设置面部和姿态动捕。不过,开播流程中的一些检查环节也会因此被跳过。

![](/doc-img/en-mocap-2.png)

-Setting up motion capture in the character asset.
+在角色组件中设置动捕。

-This method is useful if you need to set up multiple face tracking or pose tracking systems, since the onboarding assistant always remove the existing motion capture setup when you start a new onboarding process.
+这个方法在需要设置多个面部/姿态动捕系统时非常方便,因为开播助手在每次开始新的开播流程时都会移除已有的动捕设置。

-Whichever method you choose, corresponding tracking blueprints will be generated and added to your scene. For example, if you have chosen iFacialMocap for face tracking and MediaPipe for pose tracking, you will be able to see two blueprints generated: **Face Tracking - iFacialMocap** and **Pose Tracking - MediaPipe**. You can modify these blueprints to customize your tracking.
+不论您使用哪种方法,Warudo 都会生成对应的动捕蓝图并添加到您的场景中。比如说,如果您使用 iFacialMocap 进行面部动捕、MediaPipe 进行姿态动捕,您会看到生成的两张蓝图:**Face Tracking - iFacialMocap** 以及 **Pose Tracking - MediaPipe**。您可以编辑这些蓝图来定制您的动捕效果。

-:::caution
-You should almost never create a motion capture asset manually, i.e., by clicking on **Add Asset** and selecting a motion capture asset, such as **iFacialMocap Receiver**. This is because the motion capture asset alone only receives or provides the tracking data, but does not connect the tracking data to your character, which needs to be done by blueprints. The onboarding assistant and the **Setup Motion Capture** function automatically create the necessary blueprints for you.
+:::caution 注意
+在几乎所有情况下,您都不应该手动创建动捕组件(即点击 **Add Asset** 然后选择一个动捕组件,例如 **iFacialMocap Receiver**)。这是因为动捕组件本身只负责接收或提供动捕数据,而不会把这些数据关联到您的角色上,这一步需要由蓝图来完成。开播助手和 **Setup Motion Capture** 功能都会自动为您创建所需的蓝图。
:::

-## Frequently Asked Questions {#FAQ}
+## 常见问题 {#FAQ}

-### My character is not moving.
+### 我的角色不动。

-If you are using a motion capture system that requires an external application, such as iFacialMocap, make sure the application is running and the tracking data is being sent to Warudo. Also, make sure your computer's firewall is not blocking Warudo from receiving the tracking data; you may need to add Warudo to the whitelist, or temporarily disable the firewall.
+如果您正在使用需要外部应用程序的动捕系统(如 iFacialMocap),请确保该应用正在运行,并且正在向 Warudo 发送动捕数据。同时,请确保您电脑的防火墙没有阻止 Warudo 接收动捕数据;您可能需要将 Warudo 加入白名单,或暂时关闭防火墙。

-You may also want to set your network as a private network, as Windows may block incoming connections from public networks. See [this guide](https://support.microsoft.com/en-us/windows/make-a-wi-fi-network-public-or-private-in-windows-0460117d-8d3e-a7ac-f003-7a0da607448d) for more information.
+您可能还需要将您的网络设置为专用网络,因为 Windows 可能会阻止来自公用网络的传入连接。更多信息请参阅[此指南](https://support.microsoft.com/en-us/windows/make-a-wi-fi-network-public-or-private-in-windows-0460117d-8d3e-a7ac-f003-7a0da607448d)。

-Some motion capture receivers have a **Port** option. Make sure the port number matches the port number in the external application.
+有些动捕接收器有 **Port(端口)** 选项。请确保端口数字与第三方软件中的端口号码一致。

-### My tracking is too smooth / too jittery.
+### 我的动捕效果太平滑/太抖动。

-Please go to the tracking blueprint (e.g., **Face Tracking - iFacialMocap**) and locate the **Smooth Rotation List** / **Smooth Position List** / **Smooth Transform** / **Smooth BlendShape List** nodes. Increase the **Smooth Time** value to make the tracking smoother, or decrease the value to make the tracking more responsive.
+请在动捕蓝图(例如 **Face Tracking - iFacialMocap**)中找到 **Smooth Rotation List** / **Smooth Position List** / **Smooth Transform** / **Smooth BlendShape List** 节点。增大 **Smooth Time** 的值可以让动捕更平滑,减小该值则可以让动捕响应更迅速。
+
diff --git a/docs/mocap/rokoko.md b/docs/mocap/rokoko.md
index a666347..01a5581 100644
--- a/docs/mocap/rokoko.md
+++ b/docs/mocap/rokoko.md
@@ -4,37 +4,38 @@ sidebar_position: 90

# Rokoko

-Face & body tracking via [Rokoko](https://www.rokoko.com/).
+使用 [Rokoko](https://www.rokoko.com/) 进行面部/身体动捕。

-## Setup
+## 初始设置

-Open Rokoko Studio and enable **Livestream → Activate streaming to Unity**:
+打开 Rokoko Studio 并开启 **Livestream → Activate streaming to Unity**:

![](/doc-img/zh-rokoko-1.webp)

-Then, update **Profile Name** in the **Rokoko Receiver** asset to the Rokoko actor whose motion capture data you would like to receive:
+然后,在 **Rokoko 接收器** 组件中将 **Profile Name** 更新为您想要接收动捕数据的 Rokoko 演员(actor)名称:

![](/doc-img/zh-rokoko-2.webp)

-:::caution
-Note: You need a Rokoko Plus or Pro subscription to stream to Unity.
+:::caution 注意
+您需要拥有 Rokoko Plus 或者 Pro 订阅才能将数据串流到 Unity。
:::

-:::info
-For more details, see the [official documentation](https://support.rokoko.com/hc/en-us/articles/4410471183633-Getting-Started-Streaming-to-Unity).
+:::info 更多信息
+更多详情,请参阅 [Rokoko 官方文档](https://support.rokoko.com/hc/en-us/articles/4410471183633-Getting-Started-Streaming-to-Unity)。
:::

-## Calibration
+## 校正

-Calibration of Rokoko hardware is done in Rokoko Studio.
+Rokoko 动捕硬件的校正需要在 Rokoko Studio 中进行。

-## Frequently Asked Questions
+## 常见问题 {#FAQ}

-Please refer to [Overview](overview#FAQ) and [Customizing Pose Tracking](body-tracking#FAQ) for common questions.
+常见问题请参考[动作捕捉方案一览](overview#FAQ)以及[姿态追踪](body-tracking#FAQ)。

-### My tracking drifts over time.
+### 我的动捕结果会随时间漂移

-This is a common issue with inertial motion capture systems, which drift over time due to accumulated errors. To reduce the drift, make sure there are no magnetic or electrical interference near the tracking sensors.
+
+惯性动捕系统的误差会随时间累积,因此漂移是此类系统的常见问题。请确保动捕传感器附近没有磁场或电器干扰,以减少漂移。
-
diff --git a/docs/mocap/stretchsense.md b/docs/mocap/stretchsense.md
index 265486f..3d9c5fe 100644
--- a/docs/mocap/stretchsense.md
+++ b/docs/mocap/stretchsense.md
@@ -2,23 +2,23 @@
sidebar_position: 130
---

-# StretchSense Glove
+# StretchSense 手套

-Finger tracking via [StretchSense Glove](https://stretchsense.com/).
+使用 [StretchSense 手套](https://stretchsense.com/)进行手指动作捕捉。

-## Setup
+## 初始设置

-Warudo supports StretchSense Glove via the [VMC protocol](./vmc). To use StretchSense Glove with Warudo, select **StretchSense Glove** as **Secondary Pose Tracking** during the onboarding process.
+Warudo 基于 [VMC protocol](./vmc) 来支持 StretchSense 手套。要在 Warudo 中使用 StretchSense 手套,请在开播流程中将 **Secondary Pose Tracking** 选项设置为 **StretchSense Glove**。

![](/doc-img/en-stretchsense-1.png)

-## Calibration
+## 校正

-Calibration of StretchSense Glove is done in StretchSense Hand Engine.
+StretchSense 手套的校正需要在 [StretchSense Hand Engine](https://stretchsense.com/solution/hand-engine/) 中进行。

-## Frequently Asked Questions
+## 常见问题

-Please refer to [Overview](overview#FAQ) and [Customizing Pose Tracking](body-tracking#FAQ) for common questions.
+常见问题请参考[动作捕捉方案一览](overview#FAQ)和[姿态追踪](body-tracking#FAQ)页面。
+
diff --git a/docs/mocap/virdyn.md b/docs/mocap/virdyn.md
index ea5180f..48f69f1 100644
--- a/docs/mocap/virdyn.md
+++ b/docs/mocap/virdyn.md
@@ -4,25 +4,25 @@ sidebar_position: 110

# Virdyn Studio (VDMocapStudio)

-Body tracking via [VDMocapStudio](https://www.virdynm.com/virdyn-vdmocap-studio-motion-capture-software-system-for-vdsuit-full-product/). Requires a [VDSuit](https://www.virdynm.com/virdyn-vdsuit-full-for-full-body-function-inertia-motion-capture-suit-product/) suit.
+使用虚拟动力出品的 [Virdyn Studio(VDMocapStudio)](https://www.virdynm.com/virdyn-vdmocap-studio-motion-capture-software-system-for-vdsuit-full-product/) 软件进行身体动作捕捉。此动捕方案需要搭配官方售卖的 [Virdyn(虚拟动力)VDSuit](https://www.virdynm.com/virdyn-vdsuit-full-for-full-body-function-inertia-motion-capture-suit-product/) 动捕服使用。

-## Setup
+## 初始设置

-Open VDMocapStudio and enable data streaming by clicking **Broadcast → OpenShare**. Make sure the IP selected is accessible from Warudo.
+请先打开 [VDMocapStudio](https://www.virdynm.com/virdyn-vdmocap-studio-motion-capture-software-system-for-vdsuit-full-product/) 软件,点击 **Broadcast → OpenShare** 以开启数据串流,并请确保您选择的 IP 地址能被 Warudo 访问。

![](/doc-img/en-virdyn-1.png)

-## Calibration
+## 校正

-Calibration of Virdyn hardware is done in VDMocapStudio. You can also use **Virdyn Studio Receiver → Calibrate Root Transform** to reset the character's root position and rotation in Warudo.
+Virdyn 硬件的校正需要在 VDMocapStudio 中完成。您也可以使用 Warudo 中的 **Virdyn Studio Receiver → Calibrate Root Transform** 功能,来重置角色的根位置和旋转。

-## Frequently Asked Questions
+## 常见问题

-Please refer to [Overview](overview#FAQ) and [Customizing Pose Tracking](body-tracking#FAQ) for common questions.
+常见问题请参考[动作捕捉方案一览](overview#FAQ)和[姿态追踪](body-tracking#FAQ)页面。

-### My tracking drifts over time.
+### 我的动捕结果会随时间漂移

-This is a common issue with inertial motion capture systems, which drift over time due to accumulated errors. To reduce the drift, make sure there are no magnetic or electrical interference near the tracking sensors.
+惯性动捕系统的误差会随时间累积,因此漂移是此类系统的常见问题。请确保动捕传感器附近没有磁场或电器干扰,以减少漂移。
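+下面这个极简的 Python 示例(仅为概念演示,数值均为假设,与任何具体动捕硬件无关)模拟了惯性传感器角速度积分时,微小的零偏误差如何随时间累积成明显的朝向漂移:
+
+```python
+import random
+
+def integrated_heading_error(duration_s, rate_hz=100, bias_deg_s=0.05, noise_deg_s=0.5):
+    """对带有固定零偏和随机噪声的角速度读数做积分,返回累积的朝向误差(度)。"""
+    dt, error = 1.0 / rate_hz, 0.0
+    for _ in range(int(duration_s * rate_hz)):
+        measured = bias_deg_s + random.gauss(0.0, noise_deg_s)  # 真实角速度为 0,但读数并不为 0
+        error += measured * dt                                  # 每一帧的小误差都被累积下来
+    return error
+
+for minutes in (1, 10, 30):
+    print(f"{minutes} 分钟后累积的朝向误差约为 {integrated_heading_error(minutes * 60):.2f}°")
+```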