I am trying to create an HDR image from 7 images taken with different exposure values on iOS. I used the OpenCV CocoaPod and followed this GitHub repository to implement the HDR processing. The exposure-fusion image works and looks decent; however, the HDR version produces a bad output that looks somewhat like a negative of what the image should be.
Here is the image:
Based on some other questions I read, I have tried changing my code to produce both 8-bit and 32-bit versions of this image.
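For reference, this is the sort of conversion I have been experimenting with (a minimal sketch; to8Bit is just an illustrative helper, and the scale factor of 255 assumes the float values are already in [0, 1], which may not hold for the HDR result):

#include <opencv2/opencv.hpp>

using namespace cv;

// Convert a 32-bit float image to 8-bit, assuming its values lie in [0, 1].
Mat to8Bit(const Mat& img32f)
{
    Mat img8bit;
    img32f.convertTo(img8bit, CV_8U, 255.0); // scale to 0..255 and saturate
    return img8bit;
}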
I have also included my HDR.cpp file, which does the merge to HDR:
#include <opencv2/opencv.hpp>
#include <vector>

using namespace cv;
using namespace std;

cv::Mat mergeToHDR(vector<Mat>& images, vector<float>& times)
{
    // Estimate the camera response curve from the exposure stack
    Mat response;
    Ptr<CalibrateDebevec> calibrate = createCalibrateDebevec();
    calibrate->process(images, response, times);

    // Alternative: Robertson calibration
    // Ptr<CalibrateRobertson> calibrate = createCalibrateRobertson();
    // calibrate->process(images, response, times);

    // Merge the stack into an HDR image (32-bit float radiance map)
    Mat hdr;
    Ptr<MergeDebevec> merge_debevec = createMergeDebevec();
    merge_debevec->process(images, hdr, times, response);

    // Exposure fusion path (this one gives a decent output):
    // Mat fusion;
    // Ptr<MergeMertens> merge_mertens = createMergeMertens();
    // merge_mertens->process(images, fusion);
    // Mat fusion8bit;
    // fusion = fusion * 255;
    // fusion.convertTo(fusion8bit, CV_8U);
    // return fusion8bit;

    // HDR path (this one gives the bad, negative-looking output):
    Mat hdr8bit;
    hdr = hdr * 255;
    hdr.convertTo(hdr8bit, CV_8U);
    return hdr8bit;
}
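And this is roughly how I call it (the exposure times below are made-up placeholders, not my real values, just to show the shape of the input):

vector<Mat> images;  // the 7 bracketed exposures, loaded as 8-bit BGR Mats
vector<float> times = { 1/1000.0f, 1/500.0f, 1/250.0f, 1/125.0f,
                        1/60.0f, 1/30.0f, 1/15.0f };  // placeholders, in seconds
Mat result = mergeToHDR(images, times);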
Let me know if you need any more of my code.