
QMatrix4x4: Why methods like `translate`, `rotate` etc. perform left-to-right multiplication?



  • Hello!

    All the docs I've seen about affine transformations say that if one wants to combine two affine transformations, first A then B, one needs to compute the matrix M = BA, not AB. I.e. the order of matrix multiplication is reversed.

    However, the QMatrix4x4 methods like translate, rotate and scale, which apply an additional transformation to the existing matrix, perform the multiplication left to right. For example, the source of QMatrix4x4::rotate(const QQuaternion&) ends with *this *= m; instead of *this = m * (*this);.

    So my question is: does Qt's matrix class implement transformation composition incorrectly, or am I getting it completely wrong?

    Thanks



  • @Igor-Baidiuk said in QMatrix4x4: Why methods like `translate`, `rotate` etc. perform left-to-right multiplication?:

    However, the QMatrix4x4 methods like translate, rotate and scale, which apply an additional transformation to the existing matrix, perform the multiplication left to right. For example, the source of QMatrix4x4::rotate(const QQuaternion&) ends with *this *= m; instead of *this = m * (*this);.

    What's wrong with that?

    Assuming A = *this and B = m:

    *this *= m is equivalent to *this = *this * m, i.e. newA = A * B



  • @Pl45m4 As I noted in my post, the right order of multiplication should be m * (*this). Transformation matrices are combined by multiplying in reverse order.



  • @Igor-Baidiuk This issue arises because some authors use the (row) vector * matrix notation and others use the matrix * (column) vector form. The matrix in the first case is the inverse of the matrix in the second case.



  • @ofmrew Correction. Change inverse to transpose. Sorry.


  • Moderators

    @Igor-Baidiuk said in QMatrix4x4: Why methods like `translate`, `rotate` etc. perform left-to-right multiplication?:

    All the docs I've seen about affine transformations say that if one wants to combine two affine transformations, first A then B, one needs to compute the matrix M = BA, not AB. I.e. the order of matrix multiplication is reversed.

    This makes no sense. First, I assume you mean affine transformations represented by matrices; there are no "affine matrices" as such. Second, and much more importantly, there's no such thing as an "inverted" order. Matrix multiplication is non-commutative, so since AB != BA in the general case, no "right" or "inverted" order can be defined.

    Matrices are operators (or rather, representations of linear operators), so generally speaking they operate to the right.
    The Hermitian adjoint of a matrix can be thought of as operating on the left, however. So it is all about the way you structure the matrix, what you're doing, and how you think about it ...

    In the cited source of QMatrix4x4::rotate, did you notice that the matrix is constructed as a transposed one before it is actually multiplied into *this from the right?


  • Moderators

    @ofmrew said in QMatrix4x4: Why methods like `translate`, `rotate` etc. perform left-to-right multiplication?:

    @ofmrew Correction. Change inverse to transpose. Sorry.

    Inverse is the same as the transpose in this case. Rotation matrices are orthogonal.



  • @kshegunov You are missing the point. I am not saying that the order of multiplication does not matter: it does. What I am saying is that pre- or post-multiplication is dictated by whether your vector is a row or a column vector. What you are looking at is referred to as composition. For example, if you want to rotate a position vector you must first translate its center to the origin, then perform the rotation, and finally translate it back to its original position. If you have column vectors, it would be (TbRTo)v, but if the vectors are row vectors it would be v(ToRTb). Note the order change.


  • Moderators

    @ofmrew said in QMatrix4x4: Why methods like `translate`, `rotate` etc. perform left-to-right multiplication?:

    You are missing the point. I am not saying that the order of multiplication does not matter: it does. What I am saying is that pre- or post-multiplication is dictated by whether your vector is a row or a column vector. What you are looking at is referred to as composition.

    I'm pretty sure I'm not missing the point, I have an idea what I'm talking about being a physicist and all. The point is that the order of the matrices follows the order of operations, which is up to the user.

    For example, if you want to rotate a position vector you must first translate its center to the origin, then perform the rotation and finally translate it back to its original position.

    You may (not must) want to do that if you want to rotate in a local coordinate system tied to the mentioned vector. You can, however, rotate as much as you want in any coordinate system you want; you just may not get exactly what you expect, but that's another matter.

    If you have column vectors, it would be (TbRTo)v, but if the vectors are row vectors it would be v(ToRTb). Note the order change.

    The order change is not enough. What you're citing is the adjoint distributive rule, that is: (AB)* = B*A*, where * is the Hermitian conjugate, or in the field of real numbers simply a transposition. This extends to vectors and scalars, for obvious reasons. Your reordering of the matrices is not enough to get the desired result: your Tb from the first case is not the same as the Tb from the second case; more specifically, they are transposes of one another.



  • If you have a transformation matrix A and a vector x you get the same result if you do A*x or x*(A^T) (A^T meaning A transposed). (If you want to be really precise in distinguishing row- and column-vectors you need to write x^T as well.)

    Now, if you have two transformation matrices A and B you want to apply successively, then it holds:
    B*A*x = x*(A^T)*(B^T)

    The reason Qt multiplies the matrices in reverse order is that 1) its matrices are the transposed matrices and 2) it multiplies the vector from the right as well. Mathematically speaking, the two are equivalent (assuming you construct the right matrix for your transformation): either chain your multiplications to the left or to the right. Just be consistent and do not mix the two.

