In stochastic control with partial observations, the control law at any time may depend only on the past outputs and past inputs of the stochastic control system; neither the current state nor the past states are available to the control law. Control theory for stochastic systems with partial observations remains poorly developed and poorly understood. The approach taken to this control problem is first to determine a stochastic realization of the stochastic control system based on the information structure, and second to apply a modified dynamic programming procedure. The approach is illustrated for a Gaussian stochastic control system and for a stochastic control system with finite output and state spaces. The tracking problem is also treated.
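The information structure described above can be made concrete with a minimal sketch of a partially observed Gaussian stochastic control system in the standard discrete-time formulation; the symbols $A$, $B$, $C$, $K$, $x_t$, $u_t$, $y_t$, $v_t$ and the noise distributions are illustrative assumptions, not notation taken from this paper.

```latex
% Partially observed Gaussian stochastic control system (standard
% discrete-time form; notation is illustrative, not the paper's own).
\begin{align*}
  x_{t+1} &= A x_t + B u_t + K v_t, & x_0 &\sim N(m_0, Q_0),\\
  y_t     &= C x_t + v_t,           & v_t &\sim N(0, V),\\
  u_t     &= g_t(y_0, \dots, y_{t-1},\, u_0, \dots, u_{t-1}).
\end{align*}
```

The last line encodes the partial-observation constraint: the control law $g_t$ is a function of past outputs and past inputs only, never of the current or past states $x_s$.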