
Converting ArUco axis-angle to Unity3D Quaternion

asked 2020-07-09 15:47:16 -0600 by Alice_aj

updated 2020-07-10 04:56:23 -0600 by supra56

I'm interested in comparing the quaternions of an object in the real world (with an ArUco marker on top of it) and its simulated version in Unity3D.

To do this, I generated different scenes in Unity with the object in different locations, and stored its position and orientation relative to the camera in a CSV file, where a quaternion looks something like this (one example):

[-0.492555320262909 -0.00628990028053522 0.00224017538130283 0.870255589485168]

In ArUco, estimatePoseSingleMarkers returns a compact axis-angle representation (a rotation vector), which I converted to a quaternion using the following function:

    import math
    import numpy as np

    def find_quat(rvecs):
        a = np.array(rvecs[0][0])
        theta = math.sqrt(a[0]**2 + a[1]**2 + a[2]**2)  # rotation angle (radians)
        b = a / theta                                   # unit rotation axis
        qx = b[0] * math.sin(theta / 2)
        qy = -b[1] * math.sin(theta / 2)  # y negated: left-handed vs right-handed
        qz = b[2] * math.sin(theta / 2)
        qw = math.cos(theta / 2)
        return qx, qy, qz, qw

where rvecs is the rotation vector returned by estimatePoseSingleMarkers.
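One way to sanity-check the axis-angle to quaternion conversion without OpenCV is to rebuild the rotation matrix from both representations and compare them. A minimal sketch with NumPy only (the helper names and the test rotation vector are my own):

```python
import numpy as np

def rvec_to_quat(rvec):
    """Axis-angle rotation vector -> quaternion (x, y, z, w)."""
    rvec = np.asarray(rvec, dtype=float).ravel()
    theta = np.linalg.norm(rvec)
    if theta < 1e-8:                          # near-zero rotation: identity
        return np.array([0.0, 0.0, 0.0, 1.0])
    axis = rvec / theta
    return np.append(axis * np.sin(theta / 2.0), np.cos(theta / 2.0))

def quat_to_matrix(q):
    """Quaternion (x, y, z, w) -> rotation matrix, Hamilton convention."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def rodrigues(rvec):
    """Axis-angle -> rotation matrix via the Rodrigues formula."""
    rvec = np.asarray(rvec, dtype=float).ravel()
    theta = np.linalg.norm(rvec)
    if theta < 1e-8:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])           # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

rvec = [0.1, -0.4, 0.25]                       # arbitrary test rotation vector
assert np.allclose(quat_to_matrix(rvec_to_quat(rvec)), rodrigues(rvec))
```

Note this checks only the right-handed conversion itself; any handedness flip for Unity has to be validated separately.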

However, after doing this the results are still very different; for example, for the same scene:

[0.9464098048208864 -0.02661258975275046 -0.009733748408866453 0.321722715311581] << aruco result

[-0.492555320262909 -0.00628990028053522 0.00224017538130283 0.870255589485168] << Unity's result

Am I missing something?


Comments

First, validate your angle-axis <=> quaternion conversion on some test data. In Python, there are already libraries available for that.

If the conversion is correct, you have to understand the meaning of the returned [translation + rotation]. For instance, what are the frames of reference for the object coordinate system, the ArUco coordinate system, the camera coordinate system, and the Unity coordinate system?

Eduardo ( 2020-07-10 04:20:12 -0600 )

You might also try the reverse: getting axis/angle from your Unity quat. If that's not the same, there might be other coordinate transformations involved in Unity.

berak ( 2020-07-10 05:45:09 -0600 )

Hi!

Thanks for your response,

Yes, I have tried that, but it didn't work. In both Unity and ArUco I'm finding the quaternion relative to the camera. However, I noticed that things get closer to each other if I change the rotation of the object in Unity to (-90, 0, 0).

Also, for no apparent reason, ArUco sometimes finds two different quaternions for a marker at the same rotation relative to the camera. Does that make sense?

Alice_aj ( 2020-07-10 07:20:49 -0600 )

If I rotate the object in Unity by -90, I get this, which is closer but with the signs flipped: [-0.941845417022705 -0.0123657090589404 0.00498956767842174 0.335781961679459]

Alice_aj ( 2020-07-10 07:29:24 -0600 )

3 answers


answered 2020-07-12 19:44:59 -0600 by Alice_aj

updated 2020-07-13 18:39:56 -0600

For anyone coming here looking for an answer:

My problem was that the marker was on top of the cube (so rotated by -90°), which made converting the orientation impossible as-is.

Change your pivot point in Unity and rotate it by -90°. Then convert with:

(x,y,z,w) = (-x,y,-z,w)

Reach out to me if you feel it isn't clear.
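A minimal sketch of that sign flip (the helper name is my own; the quaternion order is (x, y, z, w) as elsewhere in this thread):

```python
def aruco_quat_to_unity(q):
    """Apply the conversion above: (x, y, z, w) -> (-x, y, -z, w)."""
    x, y, z, w = q
    return (-x, y, -z, w)

# ArUco result from the question, after the -90 degree pivot change in Unity
print(aruco_quat_to_unity((0.9464, -0.0266, -0.0097, 0.3217)))
# -> (-0.9464, -0.0266, 0.0097, 0.3217), close to Unity's
#    (-0.9418, -0.0124, 0.0050, 0.3358) up to tracking noise
```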


answered 2020-07-10 12:47:45 -0600 by Eduardo

Dealing with transformations is tricky.

You have to be sure that you are comparing the same things.

If you are sure that the quaternion <=> angle-axis conversion is correct, you have to be able to draw the coordinate systems of all the elements:

  • ArUco coordinate system: can you tell me how the z-axis is pointing (I already know the answer)?
  • camera coordinate system, as classically used in computer vision
  • the above two let you interpret the translation + rotation returned by OpenCV
  • Unity object coordinate system: are you sure that the ArUco coordinate system (the texture image) and the coordinate system of the Unity object match?
  • Unity coordinate system: for instance, compared with OpenGL, the camera coordinate system is probably left-handed?

An easy way to interpret a coordinate system is to look at the translation part. For instance, with the translation returned by OpenCV, you can easily interpret and validate t_z.
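For instance, a sketch of this sanity check; the example tvec is the one the OP reports later in this thread (in metres), and the frame follows OpenCV's camera convention (x right, y down, z forward):

```python
import numpy as np

# tvec as returned by estimatePoseSingleMarkers for one frame (metres, camera frame)
tvec = np.array([0.02888627, 0.06490939, 0.54608041])
tx, ty, tz = tvec

assert tz > 0                             # the tag must be in front of the camera
side = "right" if tx > 0 else "left"
vert = "below" if ty > 0 else "above"     # OpenCV's y axis points down
print(f"tag ~{tz:.2f} m ahead, to the {side} of and {vert} the optical axis")
# -> tag ~0.55 m ahead, to the right of and below the optical axis
```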


Once you have validated these points:

  • be sure to test with the ArUco marker appearing as big as possible in the image
  • because there can be an ambiguity in planar pose estimation under certain conditions, see the ArUco doc
  • but this should occur rarely, and is less likely when the tag is big in the image

Good luck.


Comments

For the z-axis, if you mean the camera frame, it's pointing out. But for the marker, it should be up, no?

Unity is left-handed while OpenCV is right-handed. So the y value of OpenCV should be inverted in the translation vector to match Unity's.
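That handedness change, sketched as a helper (the function name is my own):

```python
def cv_to_unity_translation(t):
    """OpenCV camera frame (x right, y down, z forward, right-handed)
    -> Unity camera frame (x right, y up, z forward, left-handed):
    only the y component flips sign."""
    x, y, z = t
    return (x, -y, z)

print(cv_to_unity_translation((0.0289, 0.0649, 0.5461)))
# -> (0.0289, -0.0649, 0.5461)
```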

I'm dealing with everything relative to the camera frame here, so why should I care about z when the z in both matches?

Thanks for your reply!!!

Alice_aj ( 2020-07-10 13:25:28 -0600 )

As for the conversion part, I double-checked with this website https://www.andre-gaschler.com/rotati... and it seems correct.

Alice_aj ( 2020-07-10 13:31:50 -0600 )

Sorry if this sounds naive, but I've started to question my ArUco marker placement. I'm placing it on top of the cube in the real world, with the small red square (the one they said should be at the top-left corner) there. Example. Please ignore the shifted one; that's me playing with it. Is this correct?

Alice_aj ( 2020-07-11 10:07:27 -0600 )

OpenCV returns the pose of the ArUco marker in the camera frame. You can use drawFrameAxes() to draw the marker's coordinate system. You can also look at tz to get roughly the distance between the camera and the tag, and at tx to see whether the tag is to the left or the right of the camera.

For Unity, you have to understand which transformation is returned. If possible, display the different coordinate systems. Also, you only posted the orientation, but translation is easier to interpret. So what about the translation part, does it match?

Eduardo ( 2020-07-12 08:07:30 -0600 )

Yes, they matched. That's why I posted only the orientation. I validated this using drawAxis(), as drawFrameAxes() is deprecated. You can see it here: https://drive.google.com/file/d/12d5w... Please ignore the shifted coordinate; I was playing with the numbers to check. The translation I'm getting for this image is [0.02888627 0.06490939 0.54608041] in meters.

Alice_aj ( 2020-07-12 11:25:49 -0600 )
