Frame to frame, your stereo cameras are producing very different calculated disparities, and therefore very different calculated depths; some of these errors have the appearance of a moving pattern of depth changes across the scene.
Your depth calculation problems are a direct result of several things:
Pulsed lighting and short exposure combine to create a problem where each camera picks up a different overall scene illumination intensity. Also, frame to frame, images from an individual camera vary wildly in intensity.
To correct the pulsed lighting and short exposure problem, change the lighting to constant, e.g. incandescent, so that the lighting intensity level is constant relative to a camera frame period. Alternatively, increase the exposure time to greater than the period of the lighting intensity pulse - typically, an exposure of 30ms or so is long enough to smooth these out - with a corresponding decrease in aperture, ISO, or gain setting, so that the images aren't blown out.
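As a sanity check, here is a quick numeric sketch of why the longer exposure smooths the pulsing out. It assumes a 120 Hz flicker (rectified 60 Hz mains - an assumed rate, swap in your lighting's actual one): integrating the light over a 30 ms window barely changes with start phase, while a 1 ms window swings wildly.

```python
import math

# Assumed flicker rate: 120 Hz, i.e. rectified 60 Hz mains lighting.
PULSE_HZ = 120.0

def integrated_intensity(exposure_s, phase_s, steps=20000):
    """Mean of light(t) = 0.5*(1 + sin(2*pi*f*t)) over [phase_s, phase_s + exposure_s]."""
    dt = exposure_s / steps
    total = sum(0.5 * (1 + math.sin(2 * math.pi * PULSE_HZ * (phase_s + i * dt))) * dt
                for i in range(steps))
    return total / exposure_s

# Compare a 1 ms exposure with a 30 ms exposure, each started at several
# random-ish phases of the flicker cycle (an uncoordinated shutter does this).
spread = {}
for exp_s in (0.001, 0.030):
    samples = [integrated_intensity(exp_s, ph) for ph in (0.0, 0.002, 0.004, 0.006)]
    spread[exp_s] = max(samples) - min(samples)
    print(f"{exp_s * 1000:.0f} ms exposure: frame-to-frame intensity spread {spread[exp_s]:.3f}")
```

The 1 ms exposure's integrated intensity swings over most of the full range depending on phase; the 30 ms exposure averages over several flicker cycles and stays nearly constant.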
Uncoordinated shutters cause multiple problems. If the shutters are offset in time but have the same frame period, then the two cameras (with the pulsed lighting and short exposure time) capture much different scene intensities. In the disparity algorithm, in low contrast areas especially (more on this in a moment), this intensity offset is calculated as a difference in depth from the camera, producing a lot of the massive frame to frame fluctuation in calculated depth in your video. As an aside, another problem with uncoordinated shutters is that a moving object captured at different times by the left and right cameras will have a different depth calculation, depending on the direction and speed of its movement left or right.
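That moving-object error is easy to estimate: the spurious disparity is simply the object's horizontal pixel speed times the shutter time offset. A trivial sketch, with both numbers assumed for illustration:

```python
# Assumed values for illustration only - plug in your own measurements.
pixel_speed = 200.0   # px/s, object's horizontal motion across the image
time_offset = 0.015   # s, shutter offset between left and right cameras

# The object moves this many pixels between the two captures, and that motion
# is added to (or subtracted from, depending on direction) the true disparity.
spurious_disparity_px = pixel_speed * time_offset
print(spurious_disparity_px)  # 3.0 px of disparity error
```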
The second major uncoordinated shutter problem arises if the two cameras are running on internal frame timers; one may be taking exposures at a rate a few percent different from the other. This will add a periodic pattern to the depth fluctuations already intrinsic in the short exposure and uncoordinated shutter problem. I think that is what you see with the pattern of disparity errors repeating about every second - it's a beat frequency effect between the two out-of-sync frame timers and the lighting intensity changes.
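The repeat period is just the beat between the two frame rates. A back-of-the-envelope check with assumed rates:

```python
# Assumed frame rates: camera B's internal timer runs ~3.3% fast.
f_cam_a = 30.0           # frames/s, camera A
f_cam_b = 30.0 * 1.033   # frames/s, camera B

# The cameras' shutter offsets slip past each other at the beat frequency,
# so the disparity-error pattern repeats with the beat period.
beat_hz = abs(f_cam_a - f_cam_b)
beat_period_s = 1.0 / beat_hz
print(round(beat_period_s, 2))  # ~1.01 s between repeats
```

A few percent of rate mismatch at ~30 fps lands the repeat right around one second, consistent with what the video shows.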
To coordinate frame start times, the exposures of the two cameras should be slaved to a common, externally generated trigger signal. With these changes, any pulsed lighting intensity change will become much less significant frame-to-frame.
Large low contrast areas in the scene means big areas of the scene (larger than the disparity comparison block size) have no high-contrast spots or edges. Without unique, crisply located high-contrast spots or edges that make it easy for the algorithm to identify a common feature in both cameras, the disparity algorithm has to resort to a local intensity comparison between the left and right frames to compute disparity. With short exposures, pulsed lighting, and uncoordinated shutters, this is a recipe for large areas of disparity calculation mistakes which are also surprisingly smooth. These are evident in your video.
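Here is a tiny synthetic demo of that failure mode: a brute-force SAD block match on a smooth intensity ramp (simplified 1-D "images", all values assumed for illustration). On a low contrast gradient, a brightness offset between the cameras is indistinguishable from extra image shift, so the matcher confidently reports the wrong disparity:

```python
import numpy as np

# Synthetic low-contrast scene: a smooth intensity ramp, slope 0.5 grey levels/px.
x = np.arange(64, dtype=float)
true_disp = 4     # the real disparity, in pixels
offset = 1.0      # brightness mismatch (e.g. pulsed light + offset shutters)

left = 0.5 * x
right_matched = 0.5 * (x + true_disp)   # both cameras caught the same light level
right_offset = right_matched + offset   # right camera caught a brighter pulse

def sad_disparity(left, right, p=20, block=8, max_d=15):
    """Brute-force block match: disparity minimizing sum-of-absolute-differences."""
    costs = [np.abs(left[p:p + block] - right[p - d:p - d + block]).sum()
             for d in range(max_d + 1)]
    return int(np.argmin(costs))

d_good = sad_disparity(left, right_matched)
d_bad = sad_disparity(left, right_offset)
print(d_good)  # 4 - correct
print(d_bad)   # 6 - the brightness offset is read as two extra pixels of disparity
```

On the ramp, an intensity offset of b with slope s looks exactly like an extra shift of b/s pixels, so the mistake is smooth and systematic - which is why the bad regions in your video look like coherent, drifting surfaces rather than noise.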
The semi-global block match (SGBM) stereo disparity algorithm may be able to partially correct for the global intensity shifts between the two images, in a way that the standard block match is too naive to handle - at significant (10x to 20x) processing cost, and with more algorithm/scene tuning parameters. However, a low contrast scene with gradual depth change is a hard problem to solve even for SGBM.
So, let's be practical: to correct for a low contrast scene, you need to make it a high contrast scene - paint high contrast patterns on the scene. There are a couple ways to do this:
The pattern needs to have crisp edges, and have features roughly the same size and spacing as the disparity algorithm's block size. The type of projector doesn't matter, so long as it is high resolution - a slide projector or laser pattern projector will work - look up "structured light projector". The projected pattern can be in a frequency human vision is not sensitive to, e.g. infrared light, but the cameras need to be sensitive to this light, and any filter in the camera or the camera optical path must be transparent to this frequency. You will need to adjust the projector intensity to contrast correctly with the ambient scene illumination. A projector can run at continuous intensity, or be pulsed in synchronization with the camera exposure or illumination pulse - this can be valuable, e.g. to alternate images with and without the projected pattern - the pattern images are for depth detection and the non-pattern images are for presentation to human users.
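If you roll your own projected pattern, a random binary dot grid with cells sized to the matcher's block gives crisp edges and unique local neighborhoods for the matcher to lock onto. A minimal numpy sketch (the block size and grid dimensions here are assumed placeholders - match them to your own matcher and projector resolution):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the pattern is reproducible
block = 9                       # assumed: roughly the disparity block size, in projected pixels
cells_x, cells_y = 80, 60       # assumed grid of random cells

# Random black/white cells: each local block-sized neighborhood is (with high
# probability) unique, so the matcher gets an unambiguous feature everywhere.
pattern = rng.integers(0, 2, size=(cells_y, cells_x), dtype=np.uint8) * 255

# Upscale each cell to block x block pixels, keeping the edges crisp
# (nearest-neighbor style, no interpolation to blur them).
image = np.kron(pattern, np.ones((block, block), dtype=np.uint8))
print(image.shape)  # (540, 720)
```

Save `image` out and feed it to whatever projector you use; it only needs to stay fixed relative to the scene during each exposure.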
Note that if you use a calibration target printed with a periodic pattern, any pattern projector needs to be turned off during calibration, as it will interfere with finding the target's edges or crossing points.
Hope this helps, let us know how it goes.