Bad Image Rectification after Stereo Calibration and Image Rectification (From 2D to 3D)
Hi,
I'm not new to MATLAB (> 20 years of experience), but I am new to 3D vision and the stereo calibration (app).
Here is what I have: a setup with two cameras (FLIR) about 60 to 70 cm above a plate on which a sample is photographed.
Setup:
Pictures:
I have taken pictures of the checkerboard pattern following the guideline. Here are some samples together with the checkerboard pattern.
Problem:
Using the "Stereo Camera Calibrator", I can feed it with good quality pictures from the checkerboard pattern.
The program can nicely identify the control points:
However, when I show the rectified view, it shows the following:
... which at first sight looked OK to me (the projection of both images is consistent with respect to the horizontal lines),
but one picture is shifted completely to the left and the other completely to the right.
This results in (using stereoAnaglyph):
What am I doing wrong?
I thought this setup was quite controlled, with fixed camera positions (and the ability to measure distances and angles). Is there a way to feed the "Stereo Camera Calibration" algorithm more inputs (these known distances?) and run the optimization using this preset?
Looking forward to your suggestions,
Thanks,
Jan
2 Comments
KALYAN ACHARJYA
on 11 Apr 2022
Very good, quite detailed question, but it is more conceptual/technical than MATLAB-specific. I hope respected members can add advice.
Answers (4)
hongliang
on 22 Feb 2024
Hello, you don't have to worry at all; the calibration you got is fairly fine. The reason for the separation is that your system's disparity is too large. I ran into the same problem recently and solved it.
This is my calibration result:
My results (using stereoAnaglyph):
You can measure the disparity roughly with the Image Viewer app in the Image Processing Toolbox in MATLAB:
You can see the disparity range is about 500 to 550. In this case you cannot use disparityBM to generate the disparity map, because its maximum DisparityRange lies within [0, 256]. You should use disparitySGM instead, as there is no upper limit on its DisparityRange; the only requirement is that the difference between the bounds must be below 128.
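A minimal sketch of that call, assuming J1 and J2 are the rectified left/right images returned by rectifyStereoImages (the variable names and bounds are illustrative, not from the original post):

```matlab
% Sketch (untested): disparity map for a large-disparity rig.
% Pick bounds that bracket the measured disparity (~500-550 here);
% disparitySGM requires the difference between the bounds to be
% divisible by 8 and no more than 128.
range = [440 568];                      % 568 - 440 = 128
disparityMap = disparitySGM(rgb2gray(J1), rgb2gray(J2), ...
    'DisparityRange', range);
imshow(disparityMap, range);            % visualize, scaled to the range
```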
This is what the disparitySGM function generates:
This is the corresponding 3d reconstruction for the image:
You can see the result is fine.
Instead, if you use the disparityBM function despite the large disparity, you will get this result:
which is unusable for 3D reconstruction.
So there is no need to worry about the large disparity; just choose the right function to continue processing.
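For the 3-D reconstruction step mentioned above, a sketch could look like this (untested; note that newer releases of reconstructScene take the reprojection matrix returned by rectifyStereoImages rather than the stereoParams object):

```matlab
% Sketch (untested): reconstruct 3-D points from the disparity map.
% Assumes disparityMap from disparitySGM and the calibrated stereoParams.
xyzPoints = reconstructScene(disparityMap, stereoParams);  % world units
ptCloud   = pointCloud(xyzPoints);      % package for viewing
pcshow(ptCloud);                        % inspect the reconstruction
```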
However, I have run the same calibration and SGM matching using OpenCV functions in Visual Studio, and the result looks fairly normal: there are no such long black bars and the image is not stretched so much. This is strange.
1 Comment
Suryaansh Rathinam
on 22 Mar 2024
Hi @hongliang, I am working on a stereo calibration and depth-mapping problem and I am facing the following error: "Unable to estimate camera parameters. The 3-D orientations of the calibration pattern might be too similar across images. Remove any similar images and recalibrate the cameras again."
Could we please connect? You can reach me at suryaansh2002@gmail.com if you are able to help.
Thank you
Benjamin Thompson
on 11 Apr 2022
If you are using this function, note its description in the documentation:
[J1,J2] = rectifyStereoImages(I1,I2,stereoParams) returns undistorted and rectified versions of I1 and I2 input images using the stereo parameters stored in the stereoParams object.
Stereo image rectification projects images onto a common image plane in such a way that the corresponding points have the same row coordinates. This image projection makes the image appear as though the two cameras are parallel. Use the disparityBM or disparitySGM functions to compute a disparity map from the rectified images for 3-D scene reconstruction.
So if the cameras are far apart and not pointing in the same direction, you should expect a greater amount of adjustment in the images. If you then use one of the disparity functions, the results may make more sense.
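The rectify-then-disparity flow from the documentation quoted above might be sketched like this (I1, I2, and stereoParams are placeholders for your own image pair and exported calibration; widen 'DisparityRange' if your rig's disparity is large):

```matlab
% Sketch (untested): rectify, inspect, then compute disparity.
% stereoParams comes from the Stereo Camera Calibrator app
% ("Export Camera Parameters").
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);
imshow(stereoAnaglyph(J1, J2));         % rows should now line up
dmap = disparitySGM(rgb2gray(J1), rgb2gray(J2));
imshow(dmap, []);                       % sanity-check the disparity map
```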
Also, calibrating with a smaller checkerboard that is moved to different regions of the shared camera viewing space gives better calibration results. The app requires multiple images to even start calibration, and those images should show the board in different spots, captured at the same time by both cameras.
Giridharan Kumaravelu
on 15 Apr 2022
Edited: Giridharan Kumaravelu
on 15 Apr 2022
I agree with Benjamin here on the number of calibration images used in the calibrator app. Two image pairs with similar orientations are not enough for calibrating this stereo system.
Try capturing at least 10 image pairs in which the orientations of the checkerboards differ, as described here: Prepare Camera and Capture Images.
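Programmatically, the same multi-pair calibration might look like the sketch below (leftFiles and rightFiles are hypothetical cell arrays of at least 10 matched image file names; nothing here is from the original thread):

```matlab
% Sketch (untested): programmatic stereo calibration, equivalent to
% the app workflow, with the board in varied positions/orientations.
[imagePoints, boardSize] = detectCheckerboardPoints(leftFiles, rightFiles);
squareSize  = 25;                       % square size in mm (measure yours)
worldPoints = generateCheckerboardPoints(boardSize, squareSize);
I = imread(leftFiles{1});
stereoParams = estimateCameraParameters(imagePoints, worldPoints, ...
    'ImageSize', [size(I, 1) size(I, 2)]);
showReprojectionErrors(stereoParams);   % check per-pair errors
```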
0 Comments
Jan Bienstman
on 20 Apr 2022
1 Comment
Giridharan Kumaravelu
on 20 Apr 2022
Hello Jan,
Here are few suggestions based on your description:
- For the series of calibration images that you used in figure 2, you could try removing image pair 1, as both of its images produce high reprojection errors (> 1 pixel). You could right-click that image thumbnail and select "Remove and Recalibrate" to see if that improves the result.
- You are correct, capturing all images of the calibration pattern in the same plane could be one of the reasons for this poor result.
- For wide-baseline stereo systems like your setup, it is sometimes best to calibrate the two cameras individually to produce cameraParameters objects, then use these as fixed intrinsics and estimate only the baseline. That is:
- Use the Camera Calibrator app to calibrate the two cameras in two sessions, and export the parameters to the workspace as cameraParamsLeft and cameraParamsRight.
- In the Stereo Camera Calibrator app, after loading your images you can choose "Use Fixed Intrinsics" on the toolbar and load the intrinsics exported in the previous step from the workspace. Calibrate the stereo camera with these loaded intrinsics.
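One programmatic counterpart of this fixed-intrinsics workflow is estimateStereoBaseline. A sketch, assuming leftFiles, rightFiles, and squareSize are defined as in a normal calibration script (all names here are illustrative; check the function's accepted input types in your release):

```matlab
% Sketch (untested): estimate only the baseline, keeping the
% individually calibrated intrinsics fixed.
[imagePoints, boardSize] = detectCheckerboardPoints(leftFiles, rightFiles);
worldPoints  = generateCheckerboardPoints(boardSize, squareSize);
stereoParams = estimateStereoBaseline(imagePoints, worldPoints, ...
    cameraParamsLeft, cameraParamsRight);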
To answer your question about the alternative workflow:
- You can create the stereoParameters object with known distances and angles from the MATLAB command line using the second syntax shown here: https://www.mathworks.com/help/vision/ref/stereoparameters.html?s_tid=doc_ta#d123e126627
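A sketch of that syntax with placeholder geometry (the rotation and translation below are illustrative values, not your rig's actual measurements):

```matlab
% Sketch (untested): build stereoParameters directly from known geometry
% instead of estimating the extrinsics. cameraParams1/2 come from
% individual single-camera calibrations.
rotationOfCamera2    = eye(3);          % relative rotation (measured/known)
translationOfCamera2 = [65 0 0];        % baseline in world units, e.g. mm
stereoParams = stereoParameters(cameraParams1, cameraParams2, ...
    rotationOfCamera2, translationOfCamera2);
```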
- The following page explains the pinhole model and the distortion model used inside: https://www.mathworks.com/help/vision/ug/camera-calibration.html
Hope this helps,
Giridharan