
Peter G. answered 09/29/16
Partial answer
a) It suffices to show that T_n(R) and T_n^s(R) are each rings. [It is implicit, when the problem states that the initial ring has an identity, that the definition of "ring" used here is the one without a multiplicative identity; otherwise T_n^s(R) would not be a subring, because it lacks the multiplicative identity of M_n(R).] By the subring test theorem, it is enough to show that each is closed under multiplication and subtraction; this saves the steps of checking separately that they contain the additive identity and the additive inverse of each of their elements.
For two matrices (a_ij) and (b_ij) in T_n^s(R), let us show that their product is in T_n^s(R):
Let (a_ij)*(b_ij) = (c_ij). By the definition of matrix multiplication, c_ij = ∑ a_ik*b_kj, where the sum runs over k. Suppose i >= j. For k <= i the factor a_ik is 0, and for k >= j the factor b_kj is 0; so a term of the sum vanishes whenever k <= i OR k >= j. Since i >= j, those two ranges of k overlap in the middle and together cover every k from 1 to n, so every term of the sum is 0 and c_ij = 0, i.e., the product is again in T_n^s(R).
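(Not part of the problem, but if you want to see this index bookkeeping concretely, here is a small sympy sketch; n = 4 and the entry names are my own choices. The symbols commute, unlike the entries of a general R, but the argument above only uses which factors are 0, so commutativity plays no role in the check.)

```python
import sympy as sp

n = 4  # small size chosen just for illustration
# Strictly upper triangular: the (i, j) entry is 0 whenever i >= j
# (indices are 0-based here, but the condition is the same).
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"a{i+1}{j+1}") if i < j else 0)
B = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"b{i+1}{j+1}") if i < j else 0)

C = A * B
# Every entry of the product on or below the diagonal should vanish,
# because each term a_ik*b_kj in its sum has at least one zero factor.
assert all(C[i, j] == 0 for i in range(n) for j in range(n) if i >= j)
print("all entries of A*B on or below the diagonal are 0")
```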
The proof for subtraction is even easier: if we put (a_ij) - (b_ij) = (c_ij), then by definition c_ij = a_ij - b_ij, which is certainly 0 if i >= j, since then a_ij = 0 and b_ij = 0.
Show the same for T_n(R): the argument is very similar, with the condition i >= j replaced by i > j throughout (a quick numerical sanity check of both closures, for both families, is sketched just below).
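Here is that numerical run of the subring test, taking R = Z as a stand-in for a ring with identity; again this is just an illustration of my own, not part of the required proof.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

def random_member(strict):
    """Random integer matrix in T_n^s(Z) (strict=True) or T_n(Z) (strict=False)."""
    M = rng.integers(-9, 10, size=(n, n))
    return np.triu(M, k=1 if strict else 0)

def in_family(M, strict):
    """Check M is zero on and below (strict) or strictly below the diagonal."""
    return np.all(np.tril(M, k=0 if strict else -1) == 0)

for strict in (True, False):
    A, B = random_member(strict), random_member(strict)
    assert in_family(A - B, strict)   # closed under subtraction
    assert in_family(A @ B, strict)   # closed under multiplication
print("subring test holds on these samples")
```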
b) Here you must show that T_n^s(R) is closed under left and right multiplication by any element of T_n(R). Note how we proved that T_n^s(R) was closed under multiplication; the proof is the same, except that one inequality in the OR is now strict (the factor coming from T_n(R) contributes zeros only under a strict inequality). We are no longer guaranteed an overlap (the overlap was more than we needed), but we are still guaranteed that the two halves (the two ranges of values of k whose terms are all 0) taken together cover everything from 1 to n, so every term of the sum is again 0.
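And a matching numerical sketch of this ideal condition, again over R = Z with names of my own:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = np.triu(rng.integers(-9, 10, size=(n, n)))        # A in T_n(Z): zero below the diagonal
B = np.triu(rng.integers(-9, 10, size=(n, n)), k=1)   # B in T_n^s(Z): zero on and below it

for P in (A @ B, B @ A):
    # both products should again be zero on and below the diagonal
    assert np.all(np.tril(P) == 0)
print("A*B and B*A are both strictly upper triangular")
```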