Hello,
I think I'm having some problems with camera calibration. I'm using the sample calibration program provided with OpenCV on 20 images taken with an iPhone. From it I get the camera intrinsic matrix K and the distortion coefficients. I then load these into another program, which lets the user select matching features in two different undistorted images. From those correspondences I compute the Fundamental Matrix F, and using K I get the Essential matrix E = K.t() * F * K.
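In case it helps, here is roughly what that step looks like (a minimal sketch; the function name and the point vectors are mine, and I'm assuming K is a 3x3 CV_64F matrix loaded from the calibration output):

    #include <opencv2/calib3d.hpp>
    #include <opencv2/core.hpp>
    #include <vector>

    cv::Mat computeEssential(const std::vector<cv::Point2f>& pts1,
                             const std::vector<cv::Point2f>& pts2,
                             const cv::Mat& K)  // intrinsics from calibration
    {
        // F from the user-selected correspondences in the undistorted images
        cv::Mat F = cv::findFundamentalMat(pts1, pts2, cv::FM_8POINT);
        // E = K^T * F * K
        return K.t() * F * K;
    }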
Afterwards, I test both F and E against the epipolar constraint, i.e. x'Fx = 0 or x'Ex = 0, where x and x' are the corresponding points the user selected. For every matching point, the test for the fundamental matrix yields values very close to 0, while the one for the essential matrix returns values as large as 2694990. This is obviously wrong.
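My residual check is essentially this (again a sketch with my own names; note that the constraint for E is normally stated in normalized coordinates x_hat = K^-1 * x, whereas F takes pixel coordinates directly):

    #include <opencv2/core.hpp>

    // Residual of the epipolar constraint x'^T * M * x for one correspondence.
    // M is F when x, xp are pixel coordinates; for E the points should first
    // be mapped through K^-1.
    double epipolarResidual(const cv::Point2f& x, const cv::Point2f& xp,
                            const cv::Mat& M)
    {
        cv::Mat p  = (cv::Mat_<double>(3, 1) << x.x,  x.y,  1.0);
        cv::Mat pp = (cv::Mat_<double>(3, 1) << xp.x, xp.y, 1.0);
        return cv::Mat(pp.t() * M * p).at<double>(0, 0);
    }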
From this I conclude that I must be doing something wrong. I believe the computation of E is correct, so the problem must lie in the calibration. What do I need to do to get a good calibration?
Thanks