# Full Rank Factorization of Matrices

First Paragraph: There are various useful ways to write a matrix as the product of two or three other matrices that have special properties. For example, today's linear algebra texts relate Gaussian elimination to the LU factorization and the Gram-Schmidt process to the QR factorization. In this paper, we consider a factorization based on the rank of a matrix. Our purpose is to provide an integrated theoretical development of, and setting for understanding, a number of topics in linear algebra, such as the Moore-Penrose generalized inverse and the Singular Value Decomposition. We make no claim to a practical tool for numerical computation, since the rank of a very large matrix may be difficult to determine. However, we will describe two applications: one to the explicit computation of orthogonal projections, and the other to finding explicit matrices that diagonalize a given matrix.

Identifier:
http://www.jstor.org/stable/2690882
Creator(s):
R. Piziak and P.L. Odell
Cataloger:
Daniel Drucker
Publisher:
Mathematics Magazine 72, No. 3 (1999), 193-201
Rights:
R. Piziak and P. L. Odell
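
The article itself gives no code, but the factorization described in the abstract is easy to sketch numerically. The snippet below (an illustration under stated assumptions, not the authors' method) builds a full rank factorization $A = FG$, with $F$ of full column rank and $G$ of full row rank, from the SVD rather than the row reduction the paper uses, and then recovers the Moore-Penrose inverse via the standard formula $A^+ = G^T(GG^T)^{-1}(F^TF)^{-1}F^T$:

```python
import numpy as np

def full_rank_factorization(A, tol=1e-10):
    """Factor A (m x n, rank r) as A = F @ G with F (m x r) of full
    column rank and G (r x n) of full row rank. Here the SVD supplies
    the factors; the paper instead uses row reduction."""
    A = np.asarray(A, dtype=float)
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol * s[0])) if s.size else 0
    F = U[:, :r] * s[:r]   # m x r, full column rank
    G = Vt[:r, :]          # r x n, full row rank
    return F, G

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])   # rank 2
F, G = full_rank_factorization(A)
assert np.allclose(F @ G, A)

# Moore-Penrose inverse from the full rank factorization:
#   A+ = G^T (G G^T)^{-1} (F^T F)^{-1} F^T
A_plus = G.T @ np.linalg.inv(G @ G.T) @ np.linalg.inv(F.T @ F) @ F.T
assert np.allclose(A_plus, np.linalg.pinv(A))
```

The inverses exist because $GG^T$ and $F^TF$ are $r \times r$ of rank $r$; this is what makes the full rank factorization a convenient route to $A^+$.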

Interesting article; it gives a detailed treatment of full rank factorizations and their applications, and there is a wealth of information here. Some readers may be unfamiliar or uncomfortable with the axioms for the Moore-Penrose inverse, which are used many times in the proofs of the authors' results. Some facts about rank are stated without proof, but they can be found in standard texts, such as Strang's *Linear Algebra and Its Applications*. There is a mistake on p. 194: $R_1$ consists of the first $r$ columns of $R^{-1}$, not of $R$.
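
For readers unfamiliar with the Moore-Penrose axioms the review mentions, they are easy to check numerically. This short sketch (not from the article; the example matrix is arbitrary) verifies the four Penrose conditions for NumPy's pseudoinverse:

```python
import numpy as np

# The four Penrose axioms: X = A+ is the unique matrix satisfying
#   (1) A X A = A        (2) X A X = X
#   (3) (A X)^T = A X    (4) (X A)^T = X A
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])
X = np.linalg.pinv(A)

assert np.allclose(A @ X @ A, A)        # (1)
assert np.allclose(X @ A @ X, X)        # (2)
assert np.allclose((A @ X).T, A @ X)    # (3)
assert np.allclose((X @ A).T, X @ A)    # (4)
```

Conditions (3) and (4) are what make $AA^+$ and $A^+A$ orthogonal projections, which is the connection the authors exploit in their application to projections.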