Camera parameter request #6

Open
a3124182425 opened this issue Jan 23, 2025 · 2 comments

Comments

@a3124182425

Hello, last time I asked you about the camera parameters for the LuSNAR dataset, but after repeated experiments they unfortunately turned out to be wrong. I carefully reread your paper and could not find the relevant information. However, the paper mentions that ORB-SLAM was used to validate the dataset, so I ran experiments with the parameters you provided, and the predicted trajectories show different scale errors along the x, y, and z directions. Could you provide the config.yaml file you used when running ORB-SLAM as a reference? Thank you very much for your help.

@Zsy103
Collaborator

Zsy103 commented Jan 23, 2025

Here is the configuration file used with ORB-SLAM3:

%YAML:1.0
#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------

# Camera calibration and distortion parameters (OpenCV) 
Camera.type: "PinHole"
Camera.fx: 610.177840000
Camera.fy: 610.177840000
Camera.cx: 512.0000000000000
Camera.cy: 512.0000000000000

Camera.k1: 0.0
Camera.k2: 0.0
Camera.p1: 0.0
Camera.p2: 0.0

Camera.width: 1024
Camera.height: 1024

# Camera frames per second 
Camera.fps: 10.0

# stereo baseline times fx
Camera.bf: 189.15513
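# (Derived from the values above, not an independent parameter: with bf = baseline * fx,
#  the stereo baseline is roughly 189.15513 / 610.17784 ≈ 0.31 m.)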

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1

# Close/Far threshold. Baseline times.
ThDepth: 60.0

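# Transformation from the (left) camera frame to the body/IMU frame
# (the usual ORB-SLAM3 meaning of Tbc)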
Tbc: !!opencv-matrix
   rows: 4
   cols: 4
   dt: f
#   data: [0.0, 1.0, 0.0, 0.125,
#          -0.3420201433, 0.0, 0.9396926208, 0,
#          0.9396926208, 0.0, 0.3420201433, 0.0,
#          0.0, 0.0, 0.0, 1.0]
   data: [5.75395780e-17,  1.00000000e+00,  2.09426937e-17,  1.55000000e-01,
          -3.42020143e-01,  3.27925094e-33,  9.39692621e-01, -2.51559075e-01,
          9.39692621e-01, -6.12323400e-17,  3.42020143e-01, -5.73337594e-01,
          0.0, 0.0, 0.0, 1.0]

# IMU noise
IMU.NoiseGyro: 8.7266462e-4 # 1.6968e-04 
IMU.NoiseAcc: 0.02353596 # 2.0000e-3
IMU.GyroWalk: 9.9735023e-8 
IMU.AccWalk: 1.2481827e-6 # 3.0000e-3
IMU.Frequency: 100
#--------------------------------------------------------------------------------------------
# Stereo Rectification. Only if you need to pre-rectify the images.
# Camera.fx, .fy, etc must be the same as in LEFT.P
#--------------------------------------------------------------------------------------------
LEFT.height: 1024
LEFT.width: 1024
LEFT.D: !!opencv-matrix
   rows: 1
   cols: 5
   dt: d
   data: [0.0, 0.0, 0.0, 0.0, 0.0]
LEFT.K: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [610.177840000, 0.0, 512.000, 0.0, 610.177840000, 512.000 , 0.0, 0.0, 1.0]
LEFT.R:  !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
LEFT.P:  !!opencv-matrix
   rows: 3
   cols: 4
   dt: d
   data: [610.177840000, 0.0, 512.000, 0.0,  0.0, 610.177840000, 512.000, 0.0 , 0.0, 0.0, 1.0, 0.0]

RIGHT.height: 1024
RIGHT.width: 1024
RIGHT.D: !!opencv-matrix
   rows: 1
   cols: 5
   dt: d
   data: [0.0, 0.0, 0.0, 0.0, 0.0]
RIGHT.K: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [610.177840000, 0.0, 512.000, 0.0, 610.177840000, 512.000 , 0.0, 0.0, 1.0]
RIGHT.R:  !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
RIGHT.P:  !!opencv-matrix
   rows: 3
   cols: 4
   dt: d
   data: [610.177840000, 0.0, 512.000, -610.177840000,  0.0, 610.177840000, 512.000, 0.0 , 0.0, 0.0, 1.0, 0.0]

#--------------------------------------------------------------------------------------------
# ORB Parameters
#--------------------------------------------------------------------------------------------

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 600

# ORB Extractor: Scale factor between levels in the scale pyramid 	
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid	
ORBextractor.nLevels: 8

# ORB Extractor: Fast threshold
# Image is divided in a grid. At each cell FAST are extracted imposing a minimum response.
# Firstly we impose iniThFAST. If no corners are detected we impose a lower value minThFAST
# You can lower these values if your images have low contrast			
ORBextractor.iniThFAST: 100
ORBextractor.minThFAST: 7

#--------------------------------------------------------------------------------------------
# Viewer Parameters
#--------------------------------------------------------------------------------------------
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3
Viewer.ViewpointX: 0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -1.8
Viewer.ViewpointF: 500
Viewer.imageViewScale: 2

@Zsy103
Collaborator

Zsy103 commented Jan 23, 2025

The poses estimated by SLAM are expressed in the left-camera coordinate frame of the frame at which initialization succeeds, whereas the ground-truth poses provided with the dataset are the poses of the rover body frame in the world frame. Comparing the SLAM result directly against the world frame is therefore not valid: all SLAM-estimated poses need to be transformed into the world frame using the extrinsics and the ground-truth pose of the initialization frame (see the sketch after the evo command below). Either way, with stereo SLAM any scale difference should come only from accumulated drift, so over short time spans there should be no scale difference. You can use the evo tool to show the result of the following command:

  evo_traj tum --ref=gt.tum slam.tum -pva
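
For reference, below is a minimal sketch of the conversion described above. It assumes that Tbc (from the config) maps points from the left-camera frame to the body frame, that slam_poses holds the SLAM output as 4x4 matrices T_c0_ck (pose of the camera at frame k in the frame of the camera at initialization), and that T_w_b0 is the ground-truth body-in-world pose at that same initialization frame; slam_to_world, slam_poses, T_w_b0 and slam_world.tum are placeholder names, not part of the dataset tooling.

  import numpy as np

  # Camera-to-body extrinsic, taken from the Tbc entry in the config above.
  Tbc = np.array([
      [ 5.75395780e-17,  1.00000000e+00,  2.09426937e-17,  1.55000000e-01],
      [-3.42020143e-01,  3.27925094e-33,  9.39692621e-01, -2.51559075e-01],
      [ 9.39692621e-01, -6.12323400e-17,  3.42020143e-01, -5.73337594e-01],
      [ 0.0,             0.0,             0.0,             1.0],
  ])

  def slam_to_world(slam_poses, T_w_b0):
      """Re-express SLAM poses T_c0_ck as body-in-world poses T_w_bk.

      T_w_bk = T_w_b0 @ T_b0_bk, where T_b0_bk = Tbc @ T_c0_ck @ inv(Tbc).
      """
      Tcb = np.linalg.inv(Tbc)
      return [T_w_b0 @ Tbc @ T_c0_ck @ Tcb for T_c0_ck in slam_poses]

After writing the converted trajectory back to TUM format, you can also check whether the difference is really a scale issue with

  evo_ape tum gt.tum slam_world.tum -a -s

where -a aligns the trajectories and -s additionally corrects for scale.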
