
Matrix error in FaceRecognizer predict

asked 2013-02-20 17:42:46 -0600

This is a question I've also asked on Stackoverflow.com

I'm trying to make a FisherFaceRecognizer's predict() method work, but I keep getting an error

Bad argument (Wrong shapes for given matrices. Was size(src) = (1,108000), size(W) = (36000,1).) in subspaceProject, file /tmp/opencv-DCb7/OpenCV-2.4.3/modules/contrib/src/lda.cpp, line 187

I've verified that both source and training images are the same data type, full color. In fact, I even tried copying one to the other just to make sure they were the same.

My code is adapted from the tutorial at http://docs.opencv.org/modules/contrib/doc/facerec/facerec_tutorial.html#fisherfaces

However, my test image is larger than the training images, so I needed to work on a region of interest (ROI) of the right size.

Here's how I read the images and converted sizes. I cloned the ROI matrix because an earlier error message told me the target matrix must be contiguous:

IplImage* img;
img = cvLoadImage( imgName.c_str() );

int height = trainingImages[0].rows;
// Take a subset of the same size as the training images
Mat testSample1(img, Rect( xLoc, yLoc, trainingImages[0].cols,
   trainingImages[0].rows));

Mat testSample;
testSample1.copyTo( testSample);
int testLabel = 1;

// The following lines create a Fisherfaces model for
// face recognition and train it with the images and
// labels read from the given CSV file.

Ptr<FaceRecognizer> model = createFisherFaceRecognizer();
model->train(trainingImages, labels);
cout << " check of data type testSample is " << testSample.type()
  << " images is " <<
   trainingImages[0].type() << endl;
cout << " trainingImages = " << trainingImages[0].elemSize()  <<  " vs
   " << testSample.elemSize() << endl;

int predictedLabel = model->predict(testSample);

// I get an exception message at the predict statement.

I know both matrices have type 16, yet somehow it still doesn't believe the matrices are the same size and data type. I've verified that both Mats are the same dimensions (180x200) and the same elemSize() (3). Is this an internal bug?
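For reference, here is a minimal diagnostic sketch (standard cv::Mat accessors only, nothing specific to FaceRecognizer) that prints the shapes and channel counts right before predict(). A channel mismatch would be one way to get the two sizes in the error, since 180 x 200 x 3 = 108000 while 180 x 200 x 1 = 36000:

// Diagnostic sketch: compare shapes and channel counts directly.
// A channel mismatch would account for 108000 (3-channel) vs 36000 (1-channel).
cout << "testSample:  " << testSample.rows << "x" << testSample.cols
     << " channels=" << testSample.channels() << endl;
cout << "training[0]: " << trainingImages[0].rows << "x" << trainingImages[0].cols
     << " channels=" << trainingImages[0].channels() << endl;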


1 answer


answered 2013-02-21 04:19:20 -0600 by berak

My image loading / preprocessing code looks like this:

Mat load( string imgpath )
{
    cv::Mat gray = cv::imread(imgpath , 0);  // GRAYSCALE!!
    if ( gray.empty() ) 
        return Mat();

    Mat gr_eq;
    cv::equalizeHist( gray, gr_eq);  // maybe not strictly necessary, but does wonders for the quality!

    return gr_eq.reshape(1,1);       // flatten it to one row
}
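
Used the same way when building the training set, for example (a sketch only; imagePaths and imageLabels are assumed to hold whatever was read from the CSV file mentioned in the question):

// Sketch: build the training set with load() above, then train the model.
// imagePaths / imageLabels are assumed to come from the CSV file.
vector<Mat> trainingImages;
vector<int> labels;
for (size_t i = 0; i < imagePaths.size(); i++)
{
    Mat sample = load(imagePaths[i]);
    if (!sample.empty())
    {
        trainingImages.push_back(sample);
        labels.push_back(imageLabels[i]);
    }
}
Ptr<FaceRecognizer> model = createFisherFaceRecognizer();
model->train(trainingImages, labels);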

So, to sum things up (for both training and testing!):

  1. It needs grayscale images, not RGB.
  2. The Fisher and Eigen methods expect the image 'flattened' into a single row (as load() does above); the LBP one doesn't. A test-time adaptation is sketched below.
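
For the cropped test image from the question, the same preprocessing could look roughly like this (a sketch only; trainW/trainH stand for the original, un-flattened training image size, and xLoc/yLoc are the ROI offsets from the question):

// Sketch: load the larger test image as grayscale, crop the ROI,
// apply the same equalizeHist + flatten steps, then predict.
Mat testImg = imread(imgName, 0);                   // grayscale, like load() above
Mat roi(testImg, Rect(xLoc, yLoc, trainW, trainH)); // trainW/trainH: assumed original training size

Mat testSample;
equalizeHist(roi, testSample);                      // same preprocessing as the training images
testSample = testSample.reshape(1, 1);              // flatten to one row

int predictedLabel = model->predict(testSample);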
