
Akash S. answered 11/17/24
Experienced College/HS Tutor for Math and Science
The subspace spanned by two linearly independent vectors is a plane, so we need to find the distance from a point to a plane. The overall idea is to project the point onto the plane (here, by projecting onto each of the two vectors separately, which works because they turn out to be orthogonal) and then find the distance between the original point and its projection. I will skip most of the tedious arithmetic and just highlight the overall steps.
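For reference, the two standard facts every step below relies on are the formula for projecting a vector onto a single vector and the definition of the distance from a point to a subspace W (writing |.| for the length of a vector, as in the last step):

\text{proj}_{V}(y) = \frac{y \cdot V}{V \cdot V}\, V, \qquad \text{dist}(y, W) = |y - \text{proj}_{W}(y)|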
Project y onto V1:
We'll call this vector a: a = (3/11)V1
Project y onto V2:
We'll call this vector b: b = (1/33)V2
Project y onto the plane spanned by V1 and V2:
Since V1 and V2 are orthogonal (their dot product is 0), this is just a + b! We'll call this vector c: c = a + b = (1/3, -1, 2/3, -1/3)
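In symbols, the orthogonality is exactly what lets us add the two individual projections to get the projection onto the plane W spanned by V1 and V2:

\text{proj}_{W}(y) = \frac{y \cdot V_1}{V_1 \cdot V_1}\, V_1 + \frac{y \cdot V_2}{V_2 \cdot V_2}\, V_2 = a + b

(If V1 and V2 were not orthogonal, you would first have to orthogonalize them, e.g. with Gram-Schmidt, before adding projections like this.)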
Find the distance between y and c:
y - c = (-16/3, 0, 7/3, -2/3)
|y - c| = \sqrt{103/3} ≈ 5.86, which is the distance from y to the subspace spanned by V1 and V2.
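If you want to check a computation like this numerically, here is a rough NumPy sketch of the same recipe. The vectors in it are made-up placeholders (the original V1, V2, and y are not restated in this answer), so substitute the actual values; like the shortcut above, it assumes the spanning vectors are orthogonal.

import numpy as np

def distance_to_span(point, vectors):
    """Distance from `point` to the subspace spanned by `vectors`.

    Assumes the spanning vectors are mutually orthogonal (as V1 and V2
    are in this problem), so the projection onto the subspace is just
    the sum of the projections onto each spanning vector (a + b above).
    """
    proj = sum((point @ v) / (v @ v) * v for v in vectors)
    return np.linalg.norm(point - proj)

# Placeholder vectors only -- replace with the problem's actual V1, V2, and y.
V1 = np.array([1.0, 1.0, 0.0, 0.0])
V2 = np.array([1.0, -1.0, 0.0, 0.0])   # orthogonal to V1
y = np.array([2.0, 3.0, 4.0, 5.0])

print(distance_to_span(y, [V1, V2]))   # for these placeholders: sqrt(41) ≈ 6.40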