Determinant of Gram matrix without knowing it's a Gram matrix or using anything "fancy"
Assume the set is linearly dependent; then at least one vector is a linear combination of the others, and without loss of generality we may take it to be the last one, $v_k$. Since $A_{ik}=\langle v_i,v_k\rangle$, rewriting $v_k$ as a linear combination of the other elements of the set shows that $A_{ik}$ is a linear combination of $\{A_{i1},A_{i2},\dots,A_{i(k-1)}\}$ with the same coefficients. This holds for every $i$, so the last column of the Gram matrix is a linear combination of the other columns; the matrix is not of full rank and its determinant is $0$. For the second half, assume the determinant is zero, so the columns are linearly dependent: there are scalars $c_1,\dots,c_k$, not all zero, with $\sum_j c_j A_{ij}=0$ for every $i$. By linearity of the inner product this says $\langle v_i,\sum_j c_j v_j\rangle=0$ for every $i$; taking the same linear combination of these equations gives $\langle\sum_j c_j v_j,\sum_j c_j v_j\rangle=0$, so $\sum_j c_j v_j=0$ and the set is linearly dependent, contradicting independence.
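For the first half, the column relation can be spelled out step by step; a sketch in the question's notation, where $A_{ij}=\langle v_i,v_j\rangle$ and $v_k=\sum_{j=1}^{k-1}c_j v_j$:

```latex
A_{ik} = \langle v_i, v_k \rangle
       = \Bigl\langle v_i, \sum_{j=1}^{k-1} c_j v_j \Bigr\rangle
       = \sum_{j=1}^{k-1} c_j \langle v_i, v_j \rangle
       = \sum_{j=1}^{k-1} c_j A_{ij}.
```

So for every $i$, the entry in column $k$ is the same linear combination of the entries in the earlier columns.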
Alec Teal
Updated on July 29, 2022

NOTE this is independent learning or self learning.
Hey, I hate to "dumb the question down", but I am trying to find an alternative proof that doesn't use any knowledge of what a Gram matrix is; imagine this just happens to be one.
Let $G(v_1,v_2,\dots,v_k)$ be the Gram matrix, which (as I can't do matrices in LaTeX) is the matrix where $A_{i,j}=\langle v_i,v_j\rangle$, where $\langle\cdot,\cdot\rangle$ is an inner product.
This makes the matrix symmetric.
I wish to prove:
If $\{v_1,v_2,\dots,v_k\}$ are linearly dependent then $\det(G(v_1,v_2,\dots,v_k))=0$
and
if $\{v_1,v_2,\dots,v_k\}$ are linearly independent then $\det(G(v_1,\dots,v_k))\ne 0$ (or $>0$).
I believe that this can be done from definitions, because if one considers the case $k=2$: if the two vectors lie on a line through the origin then this determinant is zero.
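As a quick sanity check (not part of the original question), the claim can be tested numerically, taking the standard dot product as the inner product; the vectors below are made-up examples, with the third deliberately a linear combination of the first two:

```python
import numpy as np

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2.0 * v1 - 3.0 * v2          # dependent on v1, v2

def gram(vectors):
    """Gram matrix A with A[i, j] = <v_i, v_j> (dot product here)."""
    return np.array([[np.dot(u, w) for w in vectors] for u in vectors])

print(np.linalg.det(gram([v1, v2, v3])))   # ~0: dependent set
print(np.linalg.det(gram([v1, v2])))       # positive: independent set
```

Of course this only exercises one particular inner product; the proof itself should use only the axioms of $\langle\cdot,\cdot\rangle$.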
I know the "Schwarz" inequality, $\lvert\sum^N_{i=1}x_iy_i\rvert\le\left(\sum^N_{i=1}x_i^2\right)^{\frac{1}{2}}\left(\sum^N_{i=1}y_i^2\right)^{\frac{1}{2}}$; I have also proved that if equality holds (rather than strict inequality) then either all the $x_i$ are zero, or $y_i=\lambda x_i$ for all $i$.
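For $k=2$ this inequality is exactly the Gram-determinant statement: in inner-product form,

```latex
\det G(x, y)
  = \langle x, x\rangle \langle y, y\rangle - \langle x, y\rangle^{2}
  \ge 0,
```

with equality precisely in the dependence case described above, so $\det G(x,y)=0$ if and only if $x$ and $y$ are linearly dependent.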
$\lambda\boldsymbol{v}$ traces out a line through the origin as one varies $\lambda$.
Please, no help that uses definitions of the Gram matrix; if this doesn't rule out the $A^TA$ method, please rule it out, because I'd like to use the definition of $\langle\cdot,\cdot\rangle$ rather than a dot product.
Additional speculation
I also believe (not proved yet) that an orthogonal transformation preserves distance. I wish to show this is an "if and only if" relationship (a distance-preserving transformation must also preserve angles and thus is an isometry, which would also be orthogonal). I sense the two might be related, since the norm squared is the diagonal of the matrix above, and the triangle inequality suggests these two might be linked.
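The speculated fact can also be checked numerically; a sketch using a 2D rotation (one example of an orthogonal transformation, since $Q^TQ=I$ gives $\lVert Qx-Qy\rVert^2=(x-y)^TQ^TQ(x-y)=\lVert x-y\rVert^2$), with made-up points:

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation, hence orthogonal

x = np.array([3.0, -1.0])
y = np.array([0.5,  2.0])

# Distance before and after applying Q should agree.
print(np.linalg.norm(x - y))
print(np.linalg.norm(Q @ x - Q @ y))
```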

anon over 8 years: It strikes me as very arrogant to say "I have no idea how this helps" and request an answerer move their hint to a comment within mere minutes of seeing their response, surely not enough time to try to take full advantage of the hint and see if it truly helps or not, even if it is just a sentence or two. Note $B^TB=A$ in Jagy's answer is true regardless of whether or not $B$ is square or how many $v$'s there are relative to the dimension of the vector space.

Alec Teal over 8 years: @anon and yet the answer is gone.
