As far as I remember, for a matrix to be orthonormal its rows (or columns) have to be orthogonal to each other and normalized (unit length). For the matrix to be orthogonal you have to be able to multiply the matrix by its transpose and end up with the identity matrix (1s all down the diagonal and 0s elsewhere). I don't think you can do that both ways with a matrix that isn't square, but then again it's been about 5 years?
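To make the square vs. non-square distinction concrete, here's a quick sketch (the matrix Q is just a made-up example): a tall 3x2 matrix can have orthonormal columns, so Q^T Q gives the 2x2 identity, but Q Q^T can't be the 3x3 identity, so it isn't an orthogonal matrix in the full sense.

```python
import numpy as np

# Hypothetical 3x2 matrix whose two columns are orthonormal.
Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Q^T Q is the 2x2 identity: the columns are orthonormal.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# But Q Q^T is not the 3x3 identity, so Q is not an orthogonal matrix.
print(np.allclose(Q @ Q.T, np.eye(3)))  # False
```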
No, one of the rows would have to be linearly dependent on the other two rows. You are dealing with 2-dimensional vectors, so the largest number of linearly independent vectors (the row rank) you can get is 2.
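You can check that rank argument numerically (the matrix A below is just an illustrative example): three rows of 2-dimensional vectors can never have rank greater than 2, so at least one row is a combination of the others.

```python
import numpy as np

# Three rows, each a 2-dimensional vector: the row rank is at most 2,
# so the rows cannot all be linearly independent.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

print(np.linalg.matrix_rank(A))  # 2
```

Here the third row equals 2*[3, 4] - [1, 2], which is why the rank comes out as 2 rather than 3.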