OpenCV 3: findEssentialMat and recoverPose
We are currently working on a project using random 3D camera positioning.
We compiled OpenCV 3.0.0 and did our best to use the functions findEssentialMat and recoverPose.
In our problem, we have two OpenGL cameras, cam1 and cam2, which observe the same 3D object. cam1 and cam2 share the same intrinsic parameters (resolution, focal length, and principal point). On each capture from those cameras, we are able to identify a set of matched points (8 points per set).
The extrinsic parameters of cam1 are known.
The objective of our work is to determine the extrinsic parameters of cam2.
So we use:
float focal = 4.1f;
cv::Point2d pp(0,0);
double prob = 0.999;
double threshold = 3.0;
int method = cv::RANSAC;
cv::Mat mask;
cv::Mat essentialMat = cv::findEssentialMat(points1, points2, focal, pp, method, prob, threshold, mask);
Then we apply:
cv::Mat T;
cv::Mat R;
cv::recoverPose(essentialMat, points1, points2, R, T, focal, pp, mask);
in order to get the relative rotation matrix R and the relative translation vector T.
From that, we tried to apply R and T to cam1's extrinsic parameters, without success. Could you help us determine how to obtain cam2's translation and orientation from cam1, R, and T?
Thanks a lot in advance.
@jaystab Did you resolve your issue (I've got the same)?
One question is whether the 8 correspondences you have are actually noise-free. In theory only 5 are required (and they should not be coplanar or close to it), but in practice you need many more because the measurements are imprecise.
How do you track the matched keypoints in the second frame/camera? Do you use cv::calcOpticalFlowPyrLK?