
stereo camera triangulatePoints returns NaN

asked 2016-12-12 05:22:52 -0600

stillNovice

I have a stereo camera setup (a Zed stereo camera). I have calibrated it using the Matlab stereo calibration app, which gives what looks like a correct calibration. What I am trying to do is find a chessboard in the camera view and return the 3D coordinates of its corners. However, what I get is a matrix of 0s and 1s that returns NaN when I try to convert it. My code is:

//setup
    int width = zed->getImageSize().width;
    int height = zed->getImageSize().height;

    cv::Mat frameLeft(height, width, CV_8UC4);  
    cv::Mat frameRight(height, width, CV_8UC4);

//fill Mats with Zed frames
sl::zed::Mat rightSide = zed->retrieveImage(static_cast<sl::zed::SIDE>(RIGHT));
slMat2cvMat(rightSide).copyTo(frameRight);

sl::zed::Mat leftSide = zed->retrieveImage(static_cast<sl::zed::SIDE>(LEFT));
slMat2cvMat(leftSide).copyTo(frameLeft);

//get chessboard in frames
cv::Mat cornersLeft, cornersRight;
cv::Size boardSize(8, 6);

bool foundL = cv::findChessboardCorners(frameLeft, boardSize, cornersLeft, CV_CALIB_CB_ADAPTIVE_THRESH);
bool foundR = cv::findChessboardCorners(frameRight, boardSize, cornersRight, CV_CALIB_CB_ADAPTIVE_THRESH);

//setup Intrinsic matrices (from Matlab calibration)
double k1L = -0.168446;
double k2L = 0.0225543;
double p1L = 0;
double p2L = 0;

double k1R = -0.1685;
double k2R = 0.0223743;
double p1R = 0;
double p2R = 0;

cv::Mat distCoeff1 = (cv::Mat1d(1, 4) << k1L, k2L, p1L, p2L);
cv::Mat distCoeff2 = (cv::Mat1d(1, 4) << k1R, k2R, p1R, p2R);

cv::Mat camMat1 = (cv::Mat1d(3, 3) <<
    672.208668146945, 0, 0,
    0, 677.681105920833, 0,
    645.529924473109, 342.811492652937, 1);

cv::Mat camMat2 = (cv::Mat1d(3, 3) <<
    674.107271093112, 0, 0,
    0, 678.998021719537, 0,
    643.222825318966, 343.244345701717, 1);

//undistort points

cv::Mat UndistCoords1, UndistCoords2;
cv::undistortPoints(cornersLeft, UndistCoords1, camMat1, distCoeff1);
cv::undistortPoints(cornersRight, UndistCoords2, camMat2, distCoeff2);

// Compute projection matrix for camera 1/2

cv::Mat rvec = (cv::Mat1d(3, 3) << 0.999991782928634, -0.000307654498095364, -0.00404220532902106,
                0.000310337436529291,   0.999999731980639,  0.000663120897690960,
                0.00404200023350482, -0.000664369896418950, 0.999991610388184);

cv::Mat tvec = (cv::Mat1d(1, 3) << -119.474115546361,   0.0294666455704870, 0.770660705965257);


cv::Mat projMat1 = computeProjMat(camMat1, rvec, tvec);
cv::Mat projMat2 = computeProjMat(camMat2, rvec, tvec);

//triangulate points
cv::Mat triangCoords4D;
cv::triangulatePoints(projMat1, projMat2, UndistCoords1, UndistCoords2, triangCoords4D);

std::cout << triangCoords4D << std::endl;

//convert to 3d
cv::Vec4d triangCoords1 = triangCoords4D.col(0);
cv::Vec4d triangCoords2 = triangCoords4D.col(1);
cv::Vec3d Coords13D, Coords23D;
for (unsigned int i = 0; i < 3; i++) {
    Coords13D[i] = triangCoords1[i] / triangCoords1[3];
    Coords23D[i] = triangCoords2[i] / triangCoords2[3];
}

My computeProjMat function is:

cv::Mat computeProjMat(cv::Mat camMat, cv::Mat rotMat, cv::Mat transVec)
{
    //pass the camera rotation & translation matrices here instead.
    //combine to get 3 x 4 transform matrix
    cv::Mat RTMat(3, 4, CV_64F);

    RTMat.at<double>(0, 0) = rotMat.at<double>(0, 0);
    RTMat.at<double>(0, 1) = rotMat.at<double>(0, 1);
    RTMat.at<double>(0, 2) = rotMat.at<double>(0, 2);
    RTMat.at<double>(1, 0) = rotMat.at<double>(1, 0);
    RTMat.at<double>(1, 1) = rotMat.at<double>(1, 1);
    RTMat.at<double>(1, 2) = rotMat.at<double>(1, 2);
    RTMat.at<double>(2, 0) = rotMat.at<double>(2, 0);
    RTMat.at<double>(2, 1) = rotMat.at<double>(2, 1);
    RTMat.at<double>(2, 2) = rotMat.at<double>(2, 2);
    RTMat.at<double>(0, 3) = transVec.at<double>(0, 0);
    RTMat.at<double>(1, 3) = transVec.at<double>(0, 1);
    RTMat.at<double>(2, 3) = transVec.at<double>(0, 2);

    return camMat * RTMat;
}

1 answer


answered 2016-12-12 06:08:05 -0600

Ice_T02

updated 2016-12-12 06:52:39 -0600

Hello,

I think you made a mistake here:

cv::undistortPoints(cornersLeft, UndistCoords1, camMat1, distCoeff1);

Use the following:

cv::undistortPoints(cornersLeft, UndistCoords1, camMat1, distCoeff1, cv::noArray(), camMat1);

Without the last two arguments, undistortPoints returns normalized coordinates rather than pixel coordinates; passing the camera matrix as P projects the undistorted points back into pixel coordinates, which is what your projection matrices expect. For additional information see the cv::undistortPoints documentation.

Instead of writing every value into the projection matrix you can also use something like:

rotMat.copyTo(RTMat(cv::Rect(cv::Point(0, 0), rotMat.size())));

... and append the translation values. But this is just a side note.

Hope this solves your issue.


Comments

Hi, thanks for your reply, I have made the changes, but I am still getting this as my point coordinates:

    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0;
 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1;
 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 ...
stillNovice (2016-12-12 07:55:13 -0600)

Did you change it in both lines (for cornersLeft and cornersRight)? Also this seems to be incorrect:

cv::Mat projMat1 = computeProjMat(camMat1, rvec, tvec);
cv::Mat projMat2 = computeProjMat(camMat2, rvec, tvec);

You use the same rotation and translation vectors twice, for two independent cameras.

Ice_T02 (2016-12-12 08:05:05 -0600)

Ah.. that could be it. Should this be zero values for one camera, and the rvec/tvec for the other?

stillNovice (2016-12-12 08:07:41 -0600)

No, that would be incorrect as well. I think what you are looking for is cv::stereoRectify().

Ice_T02 (2016-12-12 08:13:29 -0600)

Thanks! So that gives me my projection matrices, correct? So I can get rid of computeProjMat?

stillNovice (2016-12-12 08:48:58 -0600)

Ok, I have added:

cv::Mat R1, R2, P1, P2, Q;
cv::stereoRectify(camMat1, distCoeff1, camMat2, distCoeff2, fSize, rvec, tvec, R1, R2, P1, P2, Q);

and it gives me this error:

OpenCV Error: Assertion failed (src.size == dst.size && src.channels() == dst.channels()) in cvConvertScale, file C:\builds\master_PackSlave-win64-vc12-shared\opencv\modules\core\src\convert.cpp, line 5475

Can you see an obvious reason for this? Thanks again for your time.

stillNovice (2016-12-12 08:55:29 -0600)

Ah, solved. Changing all the Mats to CV_64F looks to have fixed it. Thank you again!

stillNovice (2016-12-12 09:12:35 -0600)

Yes, this function will take care of computing the projection matrix. Glad I could help.

Ice_T02 (2016-12-12 09:14:28 -0600)

Stats


Seen: 1,250 times

Last updated: Dec 12 '16