Aligning a 3D stereo coordinate system with the local vertical and local horizontal

Using stereo vision, I am able to reconstruct the object under inspection.
After calibration, I know the coordinate system the stereo setup uses, as described here (it is defined with respect to the optical centre of Camera 1).
I have noticed that this coordinate system need not match the real-world one. For example, the local vertical of a place (which can be found using a spirit level) need not align with the vertical (y) axis of my stereo setup's coordinate system.
Suppose I move my object purely along the local vertical, with no change in its position along the local horizontal. According to the stereo coordinate system used by the cameras, the object moves along the y-axis (as expected), but its x-coordinate also changes, which is not right, since in the real world the object has not moved along the local horizontal at all.
How can I tackle this? I have aligned both the stereo setup and the object with the local vertical and local horizontal using a spirit level, yet when I move the object along the local vertical only, the x-coordinate reading still changes by a few mm. Any inputs would be appreciated.
  1 Comment
Meghana Dinesh on 29 Jun 2015
Edited: Meghana Dinesh on 1 Jul 2015
I think it's similar to this question, but mine is in 3D. Which MATLAB functions can I use?
The coordinate system used by my Camera 1 is different from the real world's. For example, the y-axis of stereo Camera 1 is not the same as (not parallel to) the local vertical. This is a transformation between two 3D coordinate systems, right? How can I go about it? (See the sketch below.)
BTW, I have read this. Slide #19 onwards addresses my issue.
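For concreteness, a rigid transform between two 3D frames is just a rotation plus a translation. A minimal MATLAB sketch, where R (3-by-3 rotation) and t (1-by-3 translation) are placeholders for whatever alignment is eventually estimated:

% pCam: M-by-3 points in the stereo (camera) frame.
% The Computer Vision Toolbox uses the row-vector convention
% pCam = pWorld * R + t, so the inverse mapping is:
pWorld = (pCam - t) * R';   % camera frame -> world frame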


Accepted Answer

Dima Lisin on 29 Jun 2015
Edited: Dima Lisin on 29 Jun 2015
Hi Meghana,
Please keep in mind that in camera-based coordinates the X-Y plane is the image plane, which is inside the camera and may well not be precisely aligned with the camera's outer casing.
If you need a world coordinate system not tied to the camera, then you can define one by placing a checkerboard in your scene. You can then use the extrinsics function to compute the transformation from the checkerboard's coordinates into the camera's coordinates.
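A minimal sketch of that workflow, assuming stereoParams is your calibrated stereoParameters object (the image file name and square size below are placeholders for your own setup):

% Detect the checkerboard that will define the world coordinate system.
I = imread('checkerboard.jpg');   % hypothetical image from Camera 1
[imagePoints, boardSize] = detectCheckerboardPoints(I);

% Corner locations in the board's own frame (the Z = 0 plane).
squareSize = 25;   % mm; use the same units as your calibration
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Rotation and translation from board coordinates into Camera 1's coordinates.
[rotationMatrix, translationVector] = extrinsics(imagePoints, worldPoints, stereoParams.CameraParameters1);

If the board is levelled with a spirit level, its axes then serve as your local horizontal and local vertical.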
  6 Comments
Meghana Dinesh on 9 Jul 2015
Edited: Meghana Dinesh on 9 Jul 2015
Oh! I did not know this. Thank you.
I want to clarify my understanding. The camera coordinate system is the one used when I compute the point cloud; it depends on the physical orientation of the sensor and takes the optical centre of Camera 1's image sensor as the origin.
[rotationMatrix, translationVector] = extrinsics(imagePoints, worldPoints, cameraParams)
In this, imagePoints are the detected checkerboard corner points, where the checkerboard is physically aligned (local vertical and local horizontal) in the orientation I want. Correct?
I have a doubt here: I would have expected worldPoints to be the coordinates obtained after triangulation, so that the algorithm knows which checkerboard points in the camera coordinate system should be mapped to the new world coordinate system. Instead, how does it get this information from generateCheckerboardPoints?
If I have this clarity in understanding, I will be able to use these functions more effectively.
Dima Lisin on 13 Jul 2015
Hi Meghana,
There is no triangulation here. The checkerboard simply defines a coordinate system. By calling generateCheckerboardPoints you effectively specify points in the Z=0 plane. The extrinsics function then gives you the rotation and translation between this new coordinate system and your camera's coordinate system.
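As a sketch of that last step, assuming ptsCamera is an M-by-3 matrix of triangulated points in Camera 1's frame (the variable names are illustrative):

% extrinsics follows the row-vector convention
%   cameraPoints = worldPoints * rotationMatrix + translationVector,
% so inverting that relation re-expresses camera-frame points in the
% checkerboard (world) frame:
ptsBoard = (ptsCamera - translationVector) * rotationMatrix';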


