Why are elemSize 6 and depth 2 for the rgb Mat after CV_BayerBG2BGR?
When using OpenCV's RAW to RGB conversion (CV_BayerBG2BGR), the image displays correctly, but the Mat's basic type information (elemSize and depth) is not what I expect.
Even when I write the converted Mat to a file and then load it back (rgb_path in the code below), there is a discrepancy between the loaded RGB image and the converted RGB image, although both display fine.
This causes an issue downstream where I convert from Mat to uint8_t*, because the buffer is larger for the converted RGB image.
Is this an issue with the conversion itself, or with my understanding of the conversion / OpenCV basic data types? I am using OpenCV 3.4.1.
#include <fstream>
#include <iostream>
#include <vector>
#include <opencv2/opencv.hpp>

using namespace std;
using namespace cv;

// Paths omitted in the question: rgb_path is the previously saved RGB image,
// raw_path is the raw 16-bit Bayer dump.
const string rgb_path = "...";
const string raw_path = "...";

int main() {
    Mat img = imread(rgb_path);

    // Read the raw Bayer dump into memory.
    ifstream ifd(raw_path, ios::binary | ios::ate);
    int size = ifd.tellg();
    ifd.seekg(0, ios::beg);
    vector<char> buffer;
    buffer.resize(size);
    ifd.read(buffer.data(), size);

    // Wrap the buffer as a 16-bit single-channel Mat and demosaic it.
    Mat rgb_image;
    Mat raw_image(600, 800, CV_16UC1, buffer.data());
    cvtColor(raw_image, rgb_image, CV_BayerBG2BGR);

    cout << "elemSize() orig: " << img.elemSize() << endl;
    cout << "elemSize() conv: " << rgb_image.elemSize() << endl;
    cout << "channels() conv: " << rgb_image.channels() << endl;
    cout << "channels() orig: " << img.channels() << endl;
    cout << "depth() conv: " << rgb_image.depth() << endl;
    cout << "depth() orig: " << img.depth() << endl;
    return 0;
}
Output:
elemSize() orig: 3
elemSize() conv: 6
channels() conv: 3
channels() orig: 3
depth() conv: 2
depth() orig: 0
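
For context on the downstream buffer issue: assuming the image saved at rgb_path has the same 800x600 resolution as the raw frame, the byte count handed to a uint8_t* consumer differs by exactly the elemSize ratio. A hypothetical check along these lines, reusing img and rgb_image from the code above, shows the numbers:

size_t orig_bytes = img.total() * img.elemSize();             // 600 * 800 * 3 = 1440000 bytes
size_t conv_bytes = rgb_image.total() * rgb_image.elemSize(); // 600 * 800 * 6 = 2880000 bytes
cout << "orig bytes: " << orig_bytes << ", conv bytes: " << conv_bytes << endl;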
What is the value of size? Note that depth 2 == CV_16U.
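
To make that mapping concrete, here is a small self-contained sketch (not from the original post, written against the same OpenCV 3.4 C++ API) that prints the depth constants and per-element sizes:

#include <iostream>
#include <opencv2/opencv.hpp>

int main() {
    // depth() reports the per-channel type constant, not a byte count:
    // CV_8U == 0, CV_16U == 2.
    std::cout << "CV_8U  = " << CV_8U  << std::endl;  // 0
    std::cout << "CV_16U = " << CV_16U << std::endl;  // 2

    // For a 16-bit, 3-channel Mat: elemSize() = channels() * elemSize1() = 3 * 2 = 6.
    cv::Mat m(600, 800, CV_16UC3);
    std::cout << "elemSize1 = " << m.elemSize1() << std::endl; // 2 bytes per channel
    std::cout << "elemSize  = " << m.elemSize()  << std::endl; // 6 bytes per pixel
    return 0;
}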
Can you check your saved file with an external image viewer? It's probably still 16-bit there. Also check imread(raw_path, IMREAD_ANYDEPTH), otherwise imread will convert to 8-bit, 3-channel under the hood.
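
A minimal sketch of both checks, assuming the converted image was written to rgb_path as a 16-bit file and reusing rgb_image from the question; IMREAD_ANYDEPTH | IMREAD_ANYCOLOR asks imread to keep the file's bit depth and channel count, and convertTo scales the 16-bit result down to 8-bit if the downstream consumer expects one byte per channel:

// Re-read the saved file without imread's default 8-bit, 3-channel conversion.
Mat reread = imread(rgb_path, IMREAD_ANYDEPTH | IMREAD_ANYCOLOR);
cout << "reread depth: " << reread.depth() << endl;   // 2 (CV_16U) if the file really is 16-bit

// If the uint8_t* consumer expects 8-bit BGR (elemSize 3), convert explicitly.
Mat rgb_8u;
rgb_image.convertTo(rgb_8u, CV_8U, 255.0 / 65535.0);  // scale the 16-bit range down to 8-bit
// rgb_8u.data now points at rows * cols * 3 bytes.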