I have the same algorithm running on two computers: one with Ubuntu and the other with Windows.
On Ubuntu, the algorithm receives its input images through a framework called ADTF, and it runs at 70 fps. Everything is fine there.
On Windows, the algorithm reads its input images from a video stream without any framework (built with Visual Studio 2012 as the IDE), and it runs at only 2 fps.
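For reference, this is roughly how the Windows version grabs frames; the file name is just a placeholder, and each frame is handed to the same algorithm as on the Ubuntu side:

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // "video.avi" is only a placeholder; my actual stream source may differ
    cv::VideoCapture cap("video.avi");
    if (!cap.isOpened())
    {
        std::cerr << "Cannot open the video stream" << std::endl;
        return -1;
    }

    cv::Mat frame;
    while (cap.read(frame))
    {
        // each frame goes into the same algorithm used on the Ubuntu side
        // processFrame(frame, ...);  // my function taking cv::Mat and floats
    }
    return 0;
}
```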
When I measured the computation time, I found one function in my algorithm that takes 0.4 seconds on Windows but only 0.011 seconds on Ubuntu, and this function is called many times during processing. Its arguments are only cv::Mat objects and a few floats, so I doubt the float type is responsible for such a big time difference.
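This is a minimal sketch of how I time a single call, using cv::getTickCount() the same way on both machines; cv::GaussianBlur is only a stand-in for my real function, and the frame size is made up:

```cpp
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    // stand-in input frame (my real frames come from ADTF / the video stream)
    cv::Mat frame(720, 1280, CV_8UC3, cv::Scalar(0, 0, 0));
    cv::Mat result;

    // time one call with OpenCV's tick counter, identical on both OSes;
    // cv::GaussianBlur is just a placeholder for my function taking cv::Mat and floats
    double t0 = static_cast<double>(cv::getTickCount());
    cv::GaussianBlur(frame, result, cv::Size(9, 9), 2.0);
    double seconds = (static_cast<double>(cv::getTickCount()) - t0) / cv::getTickFrequency();

    std::cout << "call took " << seconds << " s" << std::endl;
    return 0;
}
```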
I checked every single line of the C++ code in my files; it is exactly the same in both cases, yet the computation is more than 30 times slower on Windows. I have no idea what's happening. Can anyone help me?
Note: Both computers have exactly the same configuration.