
OpenCV 4.1 + Python 3 + CUDA

asked 2019-08-11 06:04:34 -0600

maxp

I am going to rewrite my feature matching + homography code to use CUDA. I know that CUDA support is included in OpenCV 4.1 and I have already used some of its CUDA functions, but I am a beginner, and the documentation is either missing or very poor, so I cannot fully figure out how to write code equivalent to my CPU version. Please help me with this. Or would I be better off writing the code in C++? I want to use Python because it is easier to write a microservice with it.
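For context, this is roughly the CPU pipeline I am starting from (a minimal sketch assuming ORB features and brute-force matching; the image paths are just placeholders):

```python
import cv2
import numpy as np

img1 = cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and compute descriptors on the CPU
orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force matching with Hamming distance (ORB descriptors are binary)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(bf.match(des1, des2), key=lambda m: m.distance)

# Estimate the homography from the matched point pairs
src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
```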


Comments

Thanks for the reply.

No, in recent versions of OpenCV, Python wrappers for CUDA have appeared; it is just that this is not documented anywhere. The files with generated Python wrappers (see build/modules/python or build/modules/python_bindings_generator) include CUDA wrappers.

The problem is that, because of my level and the lack of proper documentation, I cannot figure it out myself, so I am asking for help.

I think this topic should be popular, since there is very little information on the Internet about using the GPU from Python. There are only old questions about old versions of OpenCV, which did not have these CUDA wrappers.
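Judging by the generated wrappers, I would expect the GPU version to look roughly like the sketch below, but I have not been able to confirm the exact class and method names (they may differ between builds), which is exactly where I am stuck:

```python
import cv2
import numpy as np

img1 = cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)

# Upload both images to GPU memory
g1, g2 = cv2.cuda_GpuMat(), cv2.cuda_GpuMat()
g1.upload(img1)
g2.upload(img2)

# CUDA ORB: keypoints and descriptors stay on the GPU
orb = cv2.cuda_ORB.create(nfeatures=2000)
kp1_gpu, des1 = orb.detectAndComputeAsync(g1, None)
kp2_gpu, des2 = orb.detectAndComputeAsync(g2, None)

# CUDA brute-force matcher with Hamming distance
matcher = cv2.cuda.DescriptorMatcher_createBFMatcher(cv2.NORM_HAMMING)
matches = matcher.match(des1, des2)

# Convert GPU keypoints back to cv2.KeyPoint lists; the homography is estimated on the CPU
kp1 = orb.convert(kp1_gpu)
kp2 = orb.convert(kp2_gpu)
src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
```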

maxp ( 2019-08-11 07:03:29 -0600 )

1 answer


answered 2019-08-11 11:22:14 -0600

holger

Look, you are touching a "burned" area in OpenCV here, and imho you should just invest your time in better things. Let me tell you why I think so:

The situation now: 1) OpenCV is owned by Intel. This company does not like Nvidia at all. CUDA is a closed-source Nvidia technology.

2) GPU support in OpenCV is done via OpenCL. OpenCL is an open standard and should work across multiple GPU vendors; it is a cross-platform GPU computing framework (see the UMat sketch after this list).

3) At least for the DNN module I could see no difference between CPU and GPU. The code is most likely not optimized to use the GPU at all, or the OpenCL code is just not efficient enough. After spending literally days/weeks on the topic "make OpenCV fast on GPU for DNN/CNN", I just switched to a native solution using a CUDA GPU (which was fast).

4) About Python: Python is a nice language with funny concepts of seeing the world compared to other programming languages, but performance is not its strength. Actually, the reason Python is so popular in the machine-learning domain is that you can easily call C/C++ code from it (thanks to its dynamic typing, or something like that). The number crunching / heavy lifting is always done in C/C++. So you can think about making CUDA calls from Python, but most likely you will need to write a C/C++ wrapper anyway.
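To illustrate point 2): the OpenCL path in OpenCV is the "transparent API". You wrap your arrays in cv2.UMat and the usual functions run through OpenCL kernels when a device is available. A minimal sketch (nothing CUDA-specific here, image path is a placeholder):

```python
import cv2

print(cv2.ocl.haveOpenCL())   # is an OpenCL device visible to OpenCV?
cv2.ocl.setUseOpenCL(True)    # enable the transparent API

img = cv2.imread("scene.jpg")
u = cv2.UMat(img)             # data may now live on the GPU

# Ordinary OpenCV calls dispatch to OpenCL kernels when possible
gray = cv2.cvtColor(u, cv2.COLOR_BGR2GRAY)
blur = cv2.GaussianBlur(gray, (7, 7), 1.5)

result = blur.get()           # download back to a numpy array
```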

At least on point 4) you are leaving the OpenCV world and should look elsewhere for help. I myself just run my model on CUDA if possible; otherwise I use OpenCV, which is fastest on the CPU.

My personal advice: if you want to do face detection, just train a CNN-based model and run it on the GPU. I would rather invest my time there than port some code. But your idea is not bad at all; I wanted to do the same!


Comments

Thanks for the answer.

So you are saying that OpenCV implements GPU support through OpenCL, but OpenCL still works poorly?

But does CUDA work well? Can I take advantage of CUDA by writing code in C++? After all, OpenCV has functions for working with CUDA.

UPD: I came across this discussion (https://answers.opencv.org/question/1...), which says that image processing on the GPU may not be justified at all, because transferring the data to GPU memory can take even more time than processing it on the GPU. So, yes, another question arises: is it worth using a GPU at all?

maxp ( 2019-08-12 03:44:05 -0600 )

Yes, for a single image with a simple operation, using a GPU may even be slower than the CPU: you need to transfer the data to the GPU and back.

For other use cases (massive vector computation in a neural network, for example) it makes perfect sense.

So first think about what you want to do and whether it is worth running on a GPU; that is basic common sense.
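If you want to check this for your own case, a rough timing sketch like the one below is enough (assuming the CUDA Python wrappers are built; cv2.cuda.resize is just an arbitrary example of a simple per-image operation, and the image path is a placeholder):

```python
import time
import cv2

img = cv2.imread("scene.jpg")

# CPU: resize directly on the host
t0 = time.perf_counter()
small_cpu = cv2.resize(img, (640, 480))
t1 = time.perf_counter()

# GPU: upload, resize on the device, download again
t2 = time.perf_counter()
g = cv2.cuda_GpuMat()
g.upload(img)
small_gpu = cv2.cuda.resize(g, (640, 480)).download()
t3 = time.perf_counter()

print("cpu: %.4f s, gpu incl. transfer: %.4f s" % (t1 - t0, t3 - t2))
```

Keep in mind that the very first CUDA call also pays for context initialization, so repeat the measurement a few times before drawing conclusions.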

holger ( 2019-08-12 08:11:42 -0600 )
