
Wrong output from /depth_to_rgb and /rgb_to_depth topics #110

Open
carcormir opened this issue Jan 20, 2020 · 2 comments
@carcormir

Hi,

My problem is simple: after running the driver.launch file, I launch rviz to visualize the camera output. When I display the output, the color image does not match the depth image (/depth_to_rgb or /rgb_to_depth topics).

I thought this could be a calibration issue between the IR and color cameras, but I have tried two other Azure Kinect devices and the same issue occurs.

This is a screenshot of my rviz session:

[screenshot: azure_kinect_calibration]

These are the arguments of my launch file:

<arg name="depth_enabled" default="true" /> <!-- Enable or disable the depth camera -->
<arg name="depth_mode" default="NFOV_UNBINNED" /> <!-- Set the depth camera mode, which affects FOV, depth range, and camera resolution. See Azure Kinect documentation for full details. Valid options: NFOV_UNBINNED, NFOV_2X2BINNED, WFOV_UNBINNED, WFOV_2X2BINNED, and PASSIVE_IR -->
<arg name="color_enabled" default="true" /> <!-- Enable or disable the color camera -->
<arg name="color_format" default="bgra" /> <!-- The format of RGB camera. Valid options: bgra, jpeg -->
<arg name="color_resolution" default="1080P" /> <!-- Resolution at which to run the color camera. Valid options: 720P, 1080P, 1440P, 1536P, 2160P, 3072P -->
<arg name="fps" default="15" /> <!-- FPS to run both cameras at. Valid options are 5, 15, and 30 -->
<arg name="point_cloud" default="true" /> <!-- Generate a point cloud from depth data. Requires depth_enabled -->
<arg name="rgb_point_cloud" default="true" /> <!-- Colorize the point cloud using the RGB camera. Requires color_enabled and depth_enabled -->
<arg name="point_cloud_in_depth_frame" default="true" /> <!-- Whether the RGB pointcloud is rendered in the depth frame (true) or RGB frame (false). Will either match the resolution of the depth camera (true) or the RGB camera (false). -->
<arg name="required" default="false" /> <!-- Argument which specifies if the entire launch file should terminate if the node dies -->
<arg name="sensor_sn" default="" /> <!-- Sensor serial number. If none provided, the first sensor will be selected -->
<arg name="recording_file" default="" /> <!-- Absolute path to a mkv recording file which will be used with the playback api instead of opening a device -->
<arg name="recording_loop_enabled" default="false" /> <!-- If set to true the recording file will rewind to the beginning once the end of file is reached -->
<arg name="body_tracking_enabled" default="false" /> <!-- If set to true the joint positions will be published as marker arrays -->
<arg name="body_tracking_smoothing_factor" default="0.0" /> <!-- Set between 0 for no smoothing and 1 for full smoothing -->
<arg name="rescale_ir_to_mono8" default="true" /> <!-- Whether to rescale the IR image to an 8-bit monochrome image for visualization and further processing. A scaling factor (ir_mono8_scaling_factor) is applied. -->
<arg name="ir_mono8_scaling_factor" default="0.65" /> <!-- Scaling factor to apply when converting IR to mono8 (see rescale_ir_to_mono8). If using illumination, use the value 0.5-1. If using passive IR, use 10. -->
<arg name="imu_rate_target" default="0"/> <!-- Desired output rate of IMU messages. Set to 0 (default) for full rate (1.6 kHz). -->

Any idea on how to approach this? Should I do my own extrinsic calibration?
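
A quick way to double-check the registration outside rviz is to blend the registered depth over the color image. Below is a minimal sketch of that check, assuming the driver's default topic names (/rgb/image_raw and /depth_to_rgb/image_raw) and that cv_bridge and OpenCV are available; adjust the topic names if your launch file remaps them:

```python
#!/usr/bin/env python
# Sketch: grab one color frame and one registered depth frame and blend them,
# to check visually whether /depth_to_rgb lines up with the color image.
# Topic names are assumed defaults of the Azure Kinect ROS driver.
import rospy
import cv2
import numpy as np
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

rospy.init_node("check_depth_to_rgb_alignment")
bridge = CvBridge()

color_msg = rospy.wait_for_message("/rgb/image_raw", Image)
depth_msg = rospy.wait_for_message("/depth_to_rgb/image_raw", Image)

color = bridge.imgmsg_to_cv2(color_msg, desired_encoding="bgr8")
depth = bridge.imgmsg_to_cv2(depth_msg, desired_encoding="passthrough")

# Normalize depth (16UC1 in mm or 32FC1 in m) to 8-bit for visualization.
depth = np.nan_to_num(depth.astype(np.float32))
depth_vis = cv2.normalize(depth, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
depth_vis = cv2.applyColorMap(depth_vis, cv2.COLORMAP_JET)

overlay = cv2.addWeighted(color, 0.5, depth_vis, 0.5, 0.0)
cv2.imwrite("overlay.png", overlay)
print("Wrote overlay.png - object edges should coincide if registration is correct")
```

If the object edges in the blended image coincide, the factory registration is being applied and the extrinsics are probably fine; if they are clearly offset, an extrinsic problem is more likely.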

Thanks in advance for the help

@carcormir carcormir added the bug and triage needed labels on Jan 20, 2020
@ooeygui ooeygui self-assigned this Jan 20, 2020
@ooeygui ooeygui added the investigate, ROS1, and triage approved labels and removed the triage needed label on Jun 28, 2020
@AswinkarthikeyenAK

Hi @ooeygui ,
Is there a way to synchronize the color and the depth images in ROS? Also, how do you find the device's depth settings? For instance, I would like to know whether the depth values are in [m] or [mm].
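
On the synchronization part: a common approach in ROS 1 (not specific to this driver) is message_filters.ApproximateTimeSynchronizer, which pairs color and depth messages by header timestamp. A minimal sketch, assuming the driver's default topic names:

```python
#!/usr/bin/env python
# Sketch: pair color and registered depth images by timestamp with message_filters.
# Topic names assume the Azure Kinect ROS driver defaults; adjust to your setup.
import rospy
import message_filters
from sensor_msgs.msg import Image

def callback(color_msg, depth_msg):
    # Both messages arrived within `slop` seconds of each other.
    rospy.loginfo("color stamp %s / depth stamp %s",
                  color_msg.header.stamp.to_sec(), depth_msg.header.stamp.to_sec())

rospy.init_node("rgbd_sync_example")
color_sub = message_filters.Subscriber("/rgb/image_raw", Image)
# /depth_to_rgb/image_raw is the depth image reprojected into the color camera's
# frame, so it should have the same resolution as the color image.
depth_sub = message_filters.Subscriber("/depth_to_rgb/image_raw", Image)

sync = message_filters.ApproximateTimeSynchronizer([color_sub, depth_sub],
                                                   queue_size=10, slop=0.05)
sync.registerCallback(callback)
rospy.spin()
```

Subscribing to /depth_to_rgb/image_raw (or /rgb_to_depth/image_raw) rather than the raw depth topic also sidesteps the resolution mismatch between the two cameras.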

Thanks

@DanielTakeshi

I'm also curious about how to synchronize the color and depth images, since the images have different sizes.
Also, depth should generally be in meters (but this can be verified by measuring with a ruler).
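
One way to check the units without a ruler is to look at the image encoding: by the common ROS convention, a 16UC1 depth image is in millimeters and a 32FC1 depth image is in meters. A minimal sketch (topic name assumed; adjust to your setup):

```python
#!/usr/bin/env python
# Sketch: print the depth image encoding and the value at the center pixel,
# to see whether depth is published as millimeters (16UC1) or meters (32FC1).
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

rospy.init_node("check_depth_units")
msg = rospy.wait_for_message("/depth/image_raw", Image)

depth = CvBridge().imgmsg_to_cv2(msg, desired_encoding="passthrough")
center = depth[depth.shape[0] // 2, depth.shape[1] // 2]
print("encoding: %s, center pixel value: %s" % (msg.encoding, center))
# e.g. 16UC1 with a value around 1500 means roughly 1.5 m in front of the camera.
```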
