So, I have a stereo camera whose left and right cameras are already calibrated. Since the precision of stereo vision depends heavily on the calibration, it would be useful if the system could detect when it has drifted slightly out of calibration, e.g., due to a temperature change or a mechanical shock that slightly alters the baseline/rotation between the two cameras.
My thought is that for every new image pair taken by the stereo camera, the software tries to find matching points between the two images and recalculates the fundamental matrix to see if there is a big shift. However, finding matching points is error-prone, especially when no constraints are applied.
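For context, the per-frame check I'm picturing looks roughly like this (just a rough sketch using Python + OpenCV for illustration; K1, K2, R, T stand for the stored calibration and the helper names are my own placeholders):

```python
import cv2
import numpy as np

def fundamental_from_calibration(K1, K2, R, T):
    # Reference F implied by the stored calibration: F = K2^-T [T]x R K1^-1
    t = np.asarray(T, dtype=float).ravel()
    t_cross = np.array([[0.0, -t[2], t[1]],
                        [t[2], 0.0, -t[0]],
                        [-t[1], t[0], 0.0]])
    return np.linalg.inv(K2).T @ t_cross @ R @ np.linalg.inv(K1)

def refit_fundamental(img_left, img_right):
    # Unconstrained matching + RANSAC re-estimation of F (the error-prone part)
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_left, None)
    kp2, des2 = orb.detectAndCompute(img_right, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    F, inliers = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    return F, pts1, pts2

def mean_epipolar_residual(F, pts1, pts2):
    # Mean distance of right-image points to the epipolar lines of left-image points
    ones = np.ones((len(pts1), 1))
    x1 = np.hstack([pts1, ones])
    x2 = np.hstack([pts2, ones])
    lines = x1 @ F.T  # epipolar lines in the right image
    return float(np.mean(np.abs(np.sum(lines * x2, axis=1))
                         / np.hypot(lines[:, 0], lines[:, 1])))
```

The idea would be that if the residual against the stored-calibration F keeps growing while a freshly refit F still explains the matches well, the rig has probably drifted.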
My question is: since I know the calibration should only have shifted slightly, is there a way to leverage the original calibration to apply a relaxed epipolar constraint when finding matching points between the two images, maybe along with a disparity constraint? E.g., I can use the original calibration to calculate the distance of the feature points, and I roughly know the disparity will still fall within a certain range even if the calibration has shifted. With such assumptions, I believe I can effectively avoid mismatched points between the left and right images and therefore make the new fundamental matrix estimate reliable.
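Something like the sketch below is what I mean by constrained matching (again only a sketch; it assumes the pair has already been rectified with the original calibration, e.g. via cv2.stereoRectify, cv2.initUndistortRectifyMap and cv2.remap, and the band width and disparity limits are numbers I made up):

```python
import cv2
import numpy as np

EPIPOLAR_BAND_PX = 3         # relaxed epipolar constraint: rows may differ by a few pixels
MIN_DISP, MAX_DISP = 0, 128  # disparity range expected from the original calibration

def gated_matches(rect_left, rect_right):
    orb = cv2.ORB_create(3000)
    kp1, des1 = orb.detectAndCompute(rect_left, None)
    kp2, des2 = orb.detectAndCompute(rect_right, None)
    # Several candidates per feature, sorted by descriptor distance
    knn = cv2.BFMatcher(cv2.NORM_HAMMING).knnMatch(des1, des2, k=8)

    pts1, pts2 = [], []
    for candidates in knn:
        for m in candidates:
            x1, y1 = kp1[m.queryIdx].pt
            x2, y2 = kp2[m.trainIdx].pt
            disparity = x1 - x2
            # Keep the best candidate that satisfies both the relaxed epipolar
            # band and the expected disparity range
            if abs(y1 - y2) <= EPIPOLAR_BAND_PX and MIN_DISP <= disparity <= MAX_DISP:
                pts1.append((x1, y1))
                pts2.append((x2, y2))
                break
    return np.float32(pts1), np.float32(pts2)
```

The surviving points would then go into cv2.findFundamentalMat with RANSAC; since the coordinates are rectified, the ideal F corresponds to purely horizontal epipolar lines, so a consistent deviation from that would be my drift signal.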
So I wonder: is there a convenient way to relax the epipolar constraint by a few pixels, and also to specify a numDisparities range, for feature point matching?