
Using cv::solvePnP on Lighthouse Data

asked 2017-02-23 16:17:19 -0600

Hello,
I've got a project going where I try to get the pose of a 3D tracker that uses the Lighthouse base stations from Valve.

The base stations sweep laser planes across the tracking volume, and my tracker records the timing at which a laser plane hits each of its IR sensors. These timings can then be converted into degrees, since the laser planes rotate at exactly 3600 RPM.
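
For reference, this is roughly how I convert a hit timestamp into a sweep angle (the function and variable names are just placeholders from my own code):

    // Rough sketch of my timing-to-angle conversion (names are placeholders).
    // 3600 RPM = 60 rotations per second, i.e. 360 degrees every 1/60 s.
    double sweepAngleDeg(double tSyncSeconds, double tHitSeconds)
    {
        const double rotationsPerSecond = 3600.0 / 60.0;     // 60 Hz
        const double elapsed = tHitSeconds - tSyncSeconds;   // time since the sync pulse
        return elapsed * rotationsPerSecond * 360.0;         // degrees swept since sync
    }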

Since I know exactly where my sensors are placed on the tracker I should be able to get the pose using the cv::solvePnP function.

But I can't figure out what kind of camera matrix and distortion coefficients I should use. Since a base station has neither a lens nor a 2D image sensor, I can't think of a way to calculate the focal length needed for the camera matrix.
First I tried the imageWidth/2 * cot(fov/2) formula, assuming an "image width" of 120 (since this is the domain of my readings), which leads to a focal length of 34.641 px. But the results were completely off. I then tried to back-calculate a focal length for one specific scenario (tracker 1 m in front of the base station), which gave me a focal length of 56.62 px. If I place my tracker about 1 meter in front of a base station the results are plausible, but if I move away from that sweet spot the results are again completely off.

And since I have no lens, there should be no distortion, or am I wrong about that?

If anyone could give me a hint I would be very grateful.


Comments

If you have the direction in degrees, you're already "past" the camera matrix, as it were. I'll say more later, sorry.

Tetragramm (2017-02-23 18:09:37 -0600)

@RupertVanDaCow, I am working on almost exactly the same thing as you. I'm getting crazy Z values that are way off in distance, but the general X/Y positions look good to me. I haven't even looked at the rotations yet, but they look crazy too. How are you converting the laser sweep time to an "image point" for feeding into solvePnP? I'm basically saying the "camera" has a 0-180 degree field of view, and the tick times from sync pulse to laser become some value like 90.0868 (out of 180) for roughly the "middle" of the image. Can you help me with getting from sweep times to solvePnP input?

kyranf (2017-03-30 15:16:46 -0600)

1 answer


answered 2017-02-23 19:50:34 -0600

Tetragramm

Ok, so take your vector (your two angles define a 3D line of sight, which is a vector). Store it as a cv::Mat(3, 1, CV_64F).

Arbitrarily define your camera matrix as the identity matrix (if you had an actual camera, you'd use its matrix here). It's a cv::Mat(3, 3, CV_64F).

Multiply your vector by the camera matrix. LOS = camMat*LOS;

Now divide your LOS by LOS.at&lt;double&gt;(2).

LOS.at&lt;double&gt;(0) is now your x value, and LOS.at&lt;double&gt;(1) is your y.

You can put these into solvePnP with the identity camera matrix and I think you'll get good results.
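
Putting the steps together, a rough (untested) sketch might look like this. The sensor positions and the two sweep angles per sensor are placeholders for your own data, and the (tan(x), tan(y), 1) line-of-sight construction is the one worked out in the comments below:

    #include <opencv2/core.hpp>
    #include <opencv2/calib3d.hpp>
    #include <cmath>
    #include <vector>

    // Untested sketch. sensorPositions are the known 3D sensor locations on the
    // tracker (in the tracker's own frame); sweepX/sweepY are the measured
    // angles per sensor, in radians.
    bool estimatePose(const std::vector<cv::Point3f>& sensorPositions,
                      const std::vector<double>& sweepX,
                      const std::vector<double>& sweepY,
                      cv::Mat& rvec, cv::Mat& tvec)
    {
        cv::Mat camMat = cv::Mat::eye(3, 3, CV_64F);    // identity camera matrix
        cv::Mat dist   = cv::Mat::zeros(4, 1, CV_64F);  // no lens -> no distortion

        std::vector<cv::Point2f> imagePoints;
        for (size_t i = 0; i < sensorPositions.size(); ++i)
        {
            // Line of sight for this sensor, intersected with the z = 1 plane.
            cv::Mat LOS = (cv::Mat_<double>(3, 1) <<
                           std::tan(sweepX[i]), std::tan(sweepY[i]), 1.0);

            LOS = camMat * LOS;             // redundant for the identity matrix, kept for the general case
            LOS = LOS / LOS.at<double>(2);  // normalize so z == 1
            imagePoints.emplace_back((float)LOS.at<double>(0),
                                     (float)LOS.at<double>(1));
        }

        return cv::solvePnP(sensorPositions, imagePoints, camMat, dist, rvec, tvec);
    }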


Comments

Ok, first of all thanks for your help.

Multiply your vector by the camera matrix. LOS = camMat*LOS;

But isn't this redundant if I define the camera matrix as the identity matrix?

And about that angle conversion:
If I take my x/y angles, translate them into a LOS vector, and then divide by the z-component, wouldn't that be equal to simply taking the tangent of x and y as the new x and y?

RupertVanDaCow (2017-02-23 21:19:23 -0600)

It is redundant in this case, but I'm describing the general case if someone else comes along.

The x/y coordinates are not simply the tangents of the x and y angles. They're also not the tangents of the normalized unit LOS, though that's closer. Your identity camera matrix makes the focal length 1 pixel. So you take the point where the vector intersects the plane at 1 pixel distance (that's the 1 in the z component), and you get the location on the screen in pixel units.

Tetragramm (2017-02-23 21:45:13 -0600)

Ok, but can you tell me where my error is in this line of thought:
I want the intersection of my LOS ray with the plane z=1. I know that the LOS is the intersection of the yz-plane rotated around the y-axis by x-degrees and the xz-plane rotated around the x-axis by y-degrees.
The rotation around the y-axis should therefore not affect the y-component of my intersection point. The x-coordinate should be the distance from the yz-plane, which (correct me if I am wrong here) is tan(x-degrees) * z. Since the same reasoning applies to the rotation around the x-axis, it should boil down to:
The intersection point for x-degrees, y-degrees and z=1 is (tan(x-degrees), tan(y-degrees), 1).
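
Just to illustrate, here is a quick numerical check of that claim (arbitrary test angles; the rotation senses are chosen so that positive angles move the ray toward +x and +y):

    #include <opencv2/core.hpp>
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const double thetaX = 0.3, thetaY = 0.2;   // arbitrary test angles (radians)

        // Normal of the yz-plane rotated about the y-axis by thetaX:
        cv::Vec3d n1(std::cos(thetaX), 0.0, -std::sin(thetaX));
        // Normal of the xz-plane rotated about the x-axis by thetaY
        // (sense chosen so that a positive angle moves the ray toward +y):
        cv::Vec3d n2(0.0, std::cos(thetaY), -std::sin(thetaY));

        // The LOS is the intersection of the two planes, i.e. the cross product
        // of their normals. Scale it so z == 1 and compare with the tangents.
        cv::Vec3d los = n1.cross(n2);
        std::printf("x/z = %f   tan(thetaX) = %f\n", los[0] / los[2], std::tan(thetaX));
        std::printf("y/z = %f   tan(thetaY) = %f\n", los[1] / los[2], std::tan(thetaY));
        return 0;
    }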

RupertVanDaCow (2017-02-24 04:29:14 -0600)

Nope. You're forgetting that the lengths of the vectors are affected by the rotations. Basically, the length of your hypotenuse changes with both x and y, as does the "adjacent" you're using for tan.

Look HERE for an explanation of the model. Or google "Pinhole Camera model"

Tetragramm (2017-02-24 17:48:39 -0600)

Sorry if I sound a bit repetitive, but I really can't understand what's wrong with tan.
If you think of this image as the view along the y-axis, with the z-axis pointing to the right and the x-axis pointing up, and I am interested in the x-offset for x-degrees (which is Theta in this image) at z=1, then the image tells me this length is tan(Theta). I don't rotate my point, but rather slide it on the projection plane, keeping z=1.
And if I changed my view to be oriented along the x-axis, the same scheme would apply for getting the y-offset at z=1.

RupertVanDaCow (2017-02-24 19:33:55 -0600)

Oops, you're correct. I made a mistake in the scratch program I was testing with and had the wrong angles, so of course the tan didn't match.

Now remember that this only works for the identity camera matrix. You're not really using a camera model here.

Tetragramm (2017-02-24 20:34:52 -0600)

Tetragramm (or Rupert), could you please describe how I come up with this "line of sight" vector? If I get sweep data from the Lighthouse that represents 0-180 degrees (times, with a known rotation speed), how do I turn that into something usable for solvePnP, as discussed in this question/answer?

kyranf (2017-03-30 15:37:49 -0600)

Basic geometry.

x = cos(elevation) * cos(azimuth)
y = cos(elevation) * sin(azimuth)
z = sin(elevation)
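
In code, that is just a direct transcription (note that this convention puts z along the elevation axis, so adapt the axes to whatever the rest of your pipeline uses):

    #include <opencv2/core.hpp>
    #include <cmath>

    // Direct transcription of the formulas above (angles in radians).
    // Note: z points along the elevation axis here; adapt to your convention.
    cv::Vec3d lineOfSight(double azimuth, double elevation)
    {
        return cv::Vec3d(std::cos(elevation) * std::cos(azimuth),
                         std::cos(elevation) * std::sin(azimuth),
                         std::sin(elevation));
    }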

Tetragramm (2017-03-30 18:11:26 -0600)

@Tetragramm, okay, so with my X and Y angles (elevation and azimuth) I apply those equations, then follow your answer, and I should get reasonable values? My current method is actually working now, but it's probably overly complicated. Also, solvePnP seems to give very noisy results from data that looks quite stable. Is it normally that sensitive? What are good, known solutions for filtering a pose like the one that comes out of solvePnP?

kyranf (2017-04-03 05:34:22 -0600)

I have no idea what you're doing. Why don't you start a new question with context, code snippets, and more details about what exactly isn't working?

Tetragramm (2017-04-03 18:20:10 -0600)
