Lecture 3:
- In a vector space V, a set of vectors v_1, …, v_n is called a basis when it satisfies the following two conditions:
- Definition: “the vectors can generate any vector in V through linear combinations” and “the vectors are linearly independent”.
- A set of vectors satisfying the first condition alone is called a generating system of V; a basis is a generating system that is also linearly independent.
- What is the difference between being linearly independent and being a basis? (blu3mo)
- When we say a set of linearly independent vectors, the number of vectors is not specified, right? (blu3mo)
- It seems that we can define it as “a set of vectors that is linearly independent and has the same number as the dimension of the space”.
- Oh, but is the concept of dimension not defined yet?
- Maybe the dimension is defined by the definition of a basis? (blu3mo)
- When expressing a vector using a certain basis, the representation is uniquely determined.
- Well, that’s obvious (blu3mo)
- For the orthonormal basis (1,0), (0,1), there is no other representation of (5,3) except 5(1,0) + 3(0,1).
- For the oblique basis (1,0), (1,1), there is no other representation of (5,3) except 2(1,0) + 3(1,1).
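- The uniqueness of the representation can be checked concretely; below is a minimal Python sketch (the function name `coordinates` is my own, not from the lecture) that solves for the coefficients in a 2D basis by Cramer's rule, reproducing both examples above.

```python
# Sketch (my own naming): find the unique coordinates of a vector v in a
# 2D basis {b1, b2} by Cramer's rule. Uniqueness of the representation
# is exactly why this system has a single solution.

def coordinates(v, b1, b2):
    """Solve a*b1 + c*b2 = v for the scalars (a, c)."""
    det = b1[0] * b2[1] - b2[0] * b1[1]
    if det == 0:
        raise ValueError("b1, b2 are linearly dependent: not a basis")
    a = (v[0] * b2[1] - b2[0] * v[1]) / det
    c = (b1[0] * v[1] - v[0] * b1[1]) / det
    return a, c

# Orthonormal basis: (5,3) = 5*(1,0) + 3*(0,1)
print(coordinates((5, 3), (1, 0), (0, 1)))  # (5.0, 3.0)
# Oblique basis: (5,3) = 2*(1,0) + 3*(1,1)
print(coordinates((5, 3), (1, 0), (1, 1)))  # (2.0, 3.0)
```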
- It is useful to have various bases when considering Diagonalization.
- In physics, I don’t think I’ve ever considered anything other than an orthonormal basis (blu3mo).
- I have considered an oblique basis where the axes share the same unit, but I wonder if there are any advantages to an oblique basis for quantities like velocity and time.
- I think I used oblique bases in relativity theory (takker).
- Sounds interesting (blu3mo)(blu3mo)(blu3mo).
- It is also necessary to discuss curved spaces (takker).
- Relativity theory is famous, but it is also used when considering the deformation of surfaces in continuum mechanics.
Extension of Basis
- I’ve never thought about the definition of dimension (blu3mo).
- As a premise, every basis of the same vector space contains the same number of vectors.
- That number is therefore defined as the dimension.
- It’s casually mentioned as a premise, but it actually needs to be proven (takker).
- However, it’s not a difficult proof for finite-dimensional vector spaces.
- It was proven in the class (blu3mo).
- That’s good to hear (takker).
- It seems to be a problem for infinite-dimensional vector spaces.
- How can we prove it?
- Can we simply show that a bijection can be constructed between two bases for the same vector space?
- Theorem
- They all seem obvious intuitively.
- Theorem: If there are m vectors (m > n) in an n-dimensional vector space V, they are linearly dependent.
- Theorem: If we consider a subspace W of a vector space V, then dim W ≤ dim V.
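- The first theorem can be illustrated computationally; here is my own sketch (the `rank` helper is not from the lecture) that row-reduces a list of vectors with exact rational arithmetic: more than n vectors in an n-dimensional space always have rank at most n, hence are dependent.

```python
# Rough illustration (names mine): Gaussian elimination gives the rank of
# a list of vectors. Rank < number of vectors means linear dependence.
from fractions import Fraction

def rank(vectors):
    rows = [[Fraction(x) for x in v] for v in vectors]
    r = 0
    for col in range(len(rows[0])):
        # find a pivot row at or below position r
        pivot = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        # eliminate the column everywhere else
        for i in range(len(rows)):
            if i != r and rows[i][col] != 0:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# Three vectors in a 2-dimensional space: rank is at most 2,
# so they must be linearly dependent.
print(rank([(1, 0), (1, 1), (5, 3)]))  # 2
```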
Lecture 2:
- Subspace = Subvector Space
- When we have a vector space V, a subset that itself becomes a vector space.
- For example,
- When considering a subset containing only the number 3, it does not satisfy the axioms (no zero element, not closed under addition), so it does not become a subspace (blu3mo).
- Including -3 as well gives an additive inverse, but the set is still not closed under addition or scalar multiplication, so 0 and every multiple of 3 would also be needed (blu3mo).
- It is not a subspace if it does not contain 0 (zero element).
- Definition/Requirements of a subspace
- 1) 0 ∈ W
- It contains 0.
- Is this necessary? (takker)
- If we exclude the empty set from being a subspace, can it be replaced with “not an empty set”?
- 2) u, v ∈ W ⟹ u + v ∈ W
- When there are two elements, their sum is also included in the set.
- (Without this, the set is not closed under addition)
- 3) u ∈ W and a any scalar ⟹ au ∈ W
- When there is an element, any scalar multiple of it is also included in the set.
- (Without this, the set is not closed under scalar multiplication)
- As long as we consider these three, we can determine if it is a subspace (blu3mo)(blu3mo).
- Since it is a subset of a vector space, we can skip checking things like the associative law (blu3mo).
- The condition for the inverse element is also covered by taking the scalar -1 in 3) (blu3mo).
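- As a toy illustration (the checker below and its name are my own, and it only samples a few scalars rather than proving closure for all of them), the three conditions can be tested on small subsets of the reals:

```python
# Toy check (names mine) of the three subspace conditions on a finite
# subset of R. A real proof must cover ALL scalars; here we only sample
# a few, so this can reject sets but never fully certify one.

def looks_like_subspace(s, scalars=(-1, 0, 2)):
    has_zero = 0 in s                                         # 1) contains 0
    add_closed = all(u + v in s for u in s for v in s)        # 2) closed under +
    mul_closed = all(a * u in s for u in s for a in scalars)  # 3) closed under scalar mult.
    return has_zero and add_closed and mul_closed

print(looks_like_subspace({3}))      # False: no 0, and 3 + 3 = 6 is missing
print(looks_like_subspace({3, -3}))  # False: still no 0, not closed
print(looks_like_subspace({0}))      # True: the zero subspace
```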
- For example,
- The subspaces of ℝ^2 are only the following three: {0}, a line passing through the origin, and ℝ^2 itself.
- Each of them is 0-dimensional, 1-dimensional, and 2-dimensional (blu3mo)(blu3mo)
- Most cases fall into either 2) or 3) (blu3mo)
- It might be interesting to also examine subspaces of ℝ^3 (takker)
- Linear Independence
- Linear Combination: when v_1, …, v_n are elements of the vector space V, an expression of the form a_1v_1 + a_2v_2 + … + a_nv_n
- a is a scalar, right? (blu3mo)
- Yes, it is a scalar (takker)
- Linear Independence: “a_1v_1 + … + a_nv_n = 0 implies a_1 = a_2 = … = a_n = 0”
- When there are two 2D vectors pointing in different directions, we can say that “if the linear combination is zero, then the scalar coefficients of each vector are also zero”
- Pay attention to the logic (blu3mo)
- Well, in this case, the reverse is also true, but it’s important to pay attention to the direction (takker)
- Linear Dependence: not linearly independent
- Hypothesis: For n-dimensional vectors, it seems that n different vectors pointing in separate directions are linearly independent (blu3mo)
- Seems like it (blu3mo)
- As a theorem, there is the following:
Theorem: “v_1, v_2 are linearly independent” is equivalent to “v_1, v_2 are not on the same line”
- (Here, v_1, v_2 ≠ 0)
- This can be proven by manipulating the equations
- It seems that the same can be said in ℝ^n (taking n vectors)
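- For two vectors in ℝ^2, this equivalence reduces to a determinant test; here is a small sketch (the function name is my own), where a zero determinant means the vectors lie on the same line through the origin:

```python
# Sketch (name mine): for v1, v2 in R^2, linear independence, i.e.
# "a1*v1 + a2*v2 = 0 forces a1 = a2 = 0", holds exactly when the
# determinant of the 2x2 matrix with columns v1, v2 is nonzero.

def independent_2d(v1, v2):
    return v1[0] * v2[1] - v2[0] * v1[1] != 0

print(independent_2d((1, 0), (1, 1)))  # True: not on the same line
print(independent_2d((1, 2), (2, 4)))  # False: (2,4) = 2*(1,2)
```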
- Regarding the number n:
- In the vector space ℝ^n, n+1 or more vectors cannot be linearly independent
- This is because with n linearly independent vectors, any vector can be expressed as a linear combination of them