Orthogonal filtering is a method to extract essential information from digital data using orthogonal transformations. It belongs to the category of Wave Digital Filters (WDFs) as originally defined and studied by the late Alfred Fettweis, one of the principal founders of modern digital filter theory. In the original WDF theory, filtering is done exclusively with adders and an algebraically minimal number of multipliers. When, instead, the arithmetic is based (also exclusively) on purely orthogonal transformations (Jacobi/Givens rotations), a much larger class of lossless digital filters is obtained. In this paper, it is shown how central classical problems with many engineering applications, namely quadratically optimal tracking (Bellman), linear least squares estimation (Kalman) and spectral factorization (Wiener), naturally give rise to orthogonal filters; these, among many other types of filters, can be obtained and designed using nothing more than orthogonal transformations. Simple proofs based on these insights are provided, together with a streamlined realization theory for the resulting data processors and filters. The paper uses nothing more than elementary matrix theory and should be accessible to students with no other background, although it does at some point make the connection with the Beurling-Lax theory of inner-outer factorization and the Wiener theory of spectral factorization, placing these theories in a purely matrix-algebra context.
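To make the central building block concrete, the following is a minimal sketch of a Givens (plane) rotation, the orthogonal transformation the abstract refers to: it rotates a pair of values so that one of them becomes zero, using only an orthogonal 2x2 operation. The function names and interface here are illustrative, not taken from the paper.

```python
import math

def givens(a, b):
    """Return (c, s) with c*c + s*s == 1 such that rotating (a, b)
    by [[c, s], [-s, c]] yields (r, 0) with r = hypot(a, b)."""
    r = math.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0  # identity rotation for the zero vector
    return a / r, b / r

def apply_givens(c, s, a, b):
    """Apply the rotation [[c, s], [-s, c]] to the column (a, b)."""
    return c * a + s * b, -s * a + c * b

# Zero out the second component of the pair (3, 4):
c, s = givens(3.0, 4.0)
r, zero = apply_givens(c, s, 3.0, 4.0)
# r is 5.0 and zero is 0.0; the transformation preserves the
# Euclidean norm, which is what makes the resulting filters lossless.
```

Chaining such rotations gives QR-type factorizations, the elementary mechanism behind the orthogonal filter realizations discussed in the paper.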