Welcome to YanBot, a self-developed home service robot designed to make your life easier and more comfortable.
YanBot consists of several decoupled parts, each with its own function.
1. YanBot-Sense
Sense Part. Publishes sensor data (camera, lidar, etc.) and handles the robot's perception of its surroundings, including 3D semantic map generation (a minimal consumer sketch follows the component list). It includes the following repositories as module components:
- camera/
- realsense_ros: Refined ROS wrapper for Intel RealSense cameras.
- data/
- robothor_pkg: RoboTHOR dataset. Wrapped in ROS.
- TUM: Visit the TUM dataset website and download the rosbags you need.
- slam/
- rtabmap: Real-time Appearance-Based Mapping (RTAB-Map)
- orb_slam3_ros: A ROS implementation of ORB-SLAM3 V1.0 that focuses on the ROS part.
- perception/
- grounded_sam2: Adapted Grounded_SAM_2, providing ROS integration inside Docker.
- paddlex: A low-code development tool for AI models built on the PaddlePaddle framework.
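Below is a minimal sketch of how a downstream node could consume the topics published by the Sense part. The topic names assume the default realsense_ros color image topic (`/camera/color/image_raw`) and a typical lidar topic (`/scan`); the node name and topics are illustrative assumptions, not the module's actual interface.
```python
#!/usr/bin/env python
# Minimal sketch of a node consuming Sense-part topics.
# Topic names are assumptions based on common realsense_ros / lidar defaults;
# adjust them to the topics actually published by YanBot-Sense.
import rospy
from sensor_msgs.msg import Image, LaserScan

def image_cb(msg):
    rospy.loginfo("RGB frame: %dx%d, encoding=%s", msg.width, msg.height, msg.encoding)

def scan_cb(msg):
    rospy.loginfo("Lidar scan with %d ranges", len(msg.ranges))

if __name__ == "__main__":
    rospy.init_node("sense_consumer_demo")
    rospy.Subscriber("/camera/color/image_raw", Image, image_cb, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, scan_cb, queue_size=1)
    rospy.spin()
```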
2. YanBot-Arm
Arm Control Part. Subscribes to arm control commands and publishes robot arm joint poses. Handles coordinate-system conversion, motion planning, and grasping/placing of items (see the interface sketch after this list):
- arx/
- arx_pkg: ARX Robot Arm ROS package.
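A minimal sketch of the Arm part's publish/subscribe contract described above. The topic names `/arm/cmd_pose` and `/arm/joint_states`, the joint names, and the message types are hypothetical placeholders, not the actual arx_pkg API.
```python
#!/usr/bin/env python
# Sketch of the Arm-part interface: subscribe to arm commands, publish joint states.
# All topic names and joint names below are hypothetical placeholders.
import rospy
from geometry_msgs.msg import PoseStamped
from sensor_msgs.msg import JointState

class ArmInterfaceDemo:
    def __init__(self):
        self.joint_pub = rospy.Publisher("/arm/joint_states", JointState, queue_size=10)
        rospy.Subscriber("/arm/cmd_pose", PoseStamped, self.cmd_cb, queue_size=1)

    def cmd_cb(self, msg):
        # A real implementation would convert the target pose into the arm's
        # base frame, run motion planning, and execute the grasp/place motion.
        rospy.loginfo("Received target pose in frame %s", msg.header.frame_id)
        js = JointState()
        js.header.stamp = rospy.Time.now()
        js.name = ["joint1", "joint2", "joint3", "joint4", "joint5", "joint6"]
        js.position = [0.0] * 6  # placeholder joint angles
        self.joint_pub.publish(js)

if __name__ == "__main__":
    rospy.init_node("arm_interface_demo")
    ArmInterfaceDemo()
    rospy.spin()
```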
3. YanBot-Wheel
AGV Control Part. Publishes odometry data and subscribes to motion commands (see the sketch after this list).
- TODO
- TODO
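A minimal sketch of the Wheel part's contract, assuming the conventional ROS topics `/cmd_vel` (velocity commands) and `/odom` (odometry); the actual AGV driver is still TODO, so everything here is a placeholder.
```python
#!/usr/bin/env python
# Sketch of the Wheel-part contract: subscribe to velocity commands, publish odometry.
# /cmd_vel and /odom follow standard ROS conventions; the real driver is TODO.
import rospy
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry

class WheelDemo:
    def __init__(self):
        self.odom_pub = rospy.Publisher("/odom", Odometry, queue_size=10)
        rospy.Subscriber("/cmd_vel", Twist, self.cmd_cb, queue_size=1)

    def cmd_cb(self, msg):
        # A real driver would send msg.linear.x / msg.angular.z to the motor
        # controllers and integrate wheel encoders into an odometry estimate.
        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = "odom"
        odom.child_frame_id = "base_link"
        odom.twist.twist = msg  # echo the commanded twist as a placeholder
        self.odom_pub.publish(odom)

if __name__ == "__main__":
    rospy.init_node("wheel_demo")
    WheelDemo()
    rospy.spin()
```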
4. YanBot-Interact
Human-Robot Interaction Part. Publishes user information and handles voice interaction and the on-screen UI (see the wake-up sketch after this list).
- wakeup/
- snowboy: Refined general-purpose offline hotword detection tool with an added "yanyan" wake-word model. Wrapped in ROS.
- stt/
- TODO
- tts/
- TODO
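A minimal sketch of a node reacting to wake-up events from the Interaction part. The `/wakeup` topic and `std_msgs/String` message type are assumptions; the snowboy wrapper's real interface may differ, and the STT/TTS stages are still TODO.
```python
#!/usr/bin/env python
# Sketch of a downstream node reacting to wake-up events.
# The /wakeup topic and String message are assumptions, not the wrapper's real API.
import rospy
from std_msgs.msg import String

def wakeup_cb(msg):
    # In the full pipeline this would trigger STT, parse the user's request,
    # and respond via TTS (both still TODO above).
    rospy.loginfo("Wake word detected: %s", msg.data)

if __name__ == "__main__":
    rospy.init_node("interaction_demo")
    rospy.Subscriber("/wakeup", String, wakeup_cb, queue_size=1)
    rospy.spin()
```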
5. YanBot-Nav
Navigation Part. Path planning based on 3D indoor maps and 2D dynamic semantic maps (see the goal-sending sketch after this list).
- TODO
- TODO
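A minimal sketch of sending a navigation goal, under the assumption that the Nav part will eventually expose a standard move_base action server (the module itself is still TODO, so this is illustrative only).
```python
#!/usr/bin/env python
# Sketch of sending a navigation goal, assuming a standard move_base action server.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == "__main__":
    rospy.init_node("nav_goal_demo")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0   # example target: 1 m ahead in the map frame
    goal.target_pose.pose.orientation.w = 1.0

    client.send_goal(goal)
    client.wait_for_result()
    rospy.loginfo("Navigation result state: %d", client.get_state())
```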
The following diagram provides an overview of YanBot's overall system architecture, illustrating how the modules and their components interact to provide a seamless and efficient service.
Feel free to explore each module and component to gain a deeper understanding of how YanBot works. Your contributions and suggestions are always welcome!