matrix question...(math)

Semidevil

Diamond Member
Apr 26, 2002
3,017
0
76
When given a basis and asked to convert it to a matrix, how do you tell whether it is a standard or a non-standard basis?

e.g. (x + y, 2x - y) is a non-standard basis
e.g. (x - 2y, 2x + y) is a standard basis

how?
 

sandmanwake

Golden Member
Feb 29, 2000
1,494
0
0
Well, I'm going to just assume you mean basis here; otherwise, I don't think I have any context to go on for an explanation.

Let's begin by defining a basis for a subspace W of R^n as a linearly independent set of vectors which spans W. In other words, if {u1, u2, ..., ur} is a basis for W, we can say W = span{u1, u2, ..., ur}, in which case every vector in the subspace W can be obtained as a linear combination of the basis vectors (each one multiplied by some scalar and added up).
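As a concrete sketch of that definition (my own illustration using NumPy; the helper name is_basis is just something I made up), you can check "linearly independent and spans R^n" by stacking the candidate vectors as the columns of a matrix and checking its rank:

import numpy as np

def is_basis(vectors, dim):
    # A set of `dim` vectors is a basis of R^dim exactly when the matrix whose
    # columns are those vectors has full rank: full rank means the vectors are
    # linearly independent, and dim independent vectors automatically span R^dim.
    if len(vectors) != dim:
        return False                      # wrong count, can't be a basis of R^dim
    M = np.column_stack(vectors)          # dim x dim matrix, one vector per column
    return np.linalg.matrix_rank(M) == dim

# The first example from the original post, {x + y, 2x - y}, as coefficient vectors:
b1 = np.array([1.0, 1.0])   # x + y   -> (1, 1)
b2 = np.array([2.0, -1.0])  # 2x - y  -> (2, -1)
print(is_basis([b1, b2], dim=2))  # True: independent, so a basis (just not the standard one)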

The standard basis for R^n is the set of vectors (u1, u2, u3, ..., un) in which each vector has a 1 in one coordinate and 0's in all the rest. For example, if W is all of R^3, then u1 = (1,0,0), u2 = (0,1,0), and u3 = (0,0,1). As you can see, any element of W can be written as a combination of u1 through u3: x = [x1, x2, x3]^T = x1*u1 + x2*u2 + x3*u3.
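To see that last equation numerically (again just my own sketch; the vector values are arbitrary): stacking the standard basis vectors as columns gives the identity matrix, so the coordinates of a vector relative to the standard basis are just its own entries.

import numpy as np

u1, u2, u3 = np.eye(3)            # rows of the 3x3 identity are the standard basis vectors
E = np.column_stack([u1, u2, u3]) # stacked as columns: still the identity matrix

x = np.array([4.0, -2.0, 7.0])    # an arbitrary vector in R^3
coords = np.linalg.solve(E, x)    # coordinates of x relative to the standard basis
print(coords)                     # [ 4. -2.  7.] -- the same entries as x itself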

A non-standard basis, if my memory serves me correctly, would be any basis that is not the standard one. Its vectors are still linearly independent and still span W (they don't have to be orthogonal, just independent), so any vector found within the subspace W can still be formed from them.
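To tie this back to the original question (a rough sketch of my own, not any official routine): once you write the basis vectors as the columns of a matrix, the basis is the standard one only if that matrix is the identity; any other full-rank matrix gives a non-standard basis, and you find coordinates in it by solving a linear system. By this check both examples in the original post come out non-standard, so your book may mean something else by "standard" there.

import numpy as np

def classify_basis(vectors):
    # Returns 'standard', 'non-standard', or 'not a basis' for vectors in R^n.
    B = np.column_stack(vectors)
    n = B.shape[0]
    if B.shape[1] != n or np.linalg.matrix_rank(B) < n:
        return "not a basis"
    # Standard basis <=> the columns are exactly u1..un in order, i.e. B is the identity.
    return "standard" if np.allclose(B, np.eye(n)) else "non-standard"

# The second example from the original post, {x - 2y, 2x + y}:
print(classify_basis([np.array([1.0, -2.0]), np.array([2.0, 1.0])]))  # non-standard

# Coordinates of a vector v in that (non-standard) basis: solve B @ c = v
B = np.column_stack([np.array([1.0, -2.0]), np.array([2.0, 1.0])])
v = np.array([3.0, 4.0])
print(np.linalg.solve(B, v))      # [-1.  2.], i.e. v = -1*(x - 2y) + 2*(2x + y)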

Someone feel free to jump in here and correct me if I'm wrong since it's been a while since I've studied the finer details of linear algebra.