MOG background subtraction CPU vs GPU implementation differences.
Looking through the source files, I noticed the following.
In the CPU version, the model parameters of the BackgroundSubtractorMOG class
Size frameSize;
int frameType;
Mat bgmodel;
int nframes;
int history;
int nmixtures;
double varThreshold;
double backgroundRatio;
double noiseSigma;
are declared as PROTECTED members. This means that if you want to build a modified version of the MOG background subtraction method, you can simply extend the class.
On the other hand, in the GPU version MOG_GPU, the parameters of the model
int nmixtures_;
Size frameSize_;
int frameType_;
int nframes_;
GpuMat weight_;
GpuMat variance_;
GpuMat mean_;
GpuMat bgmodelUsedModes_; //keep track of number of modes per pixel
are declared as PRIVATE members, which makes it impossible to build a modified background subtraction method by extending the class. Is this just an oversight, or is there some weird CUDA limitation that would prevent the methods of a derived class from working on those variables anyway?