Abstract: Recursive least squares (RLS) is an efficient approach to neural network training. However, during the iterations of the classical RLS algorithm, the gain vector gradually decays to zero and loses its ability to correct the estimate, which leads to the so-called “data saturation” phenomenon. This paper proposes an improved recursive least squares (IRLS) algorithm and applies it, together with a feedforward neural network, to nonlinear time-varying system identification. Theoretical analysis and two simulation examples demonstrate the effectiveness of the proposed IRLS. Simulation results show that the proposed IRLS overcomes the “data saturation” problem and achieves higher accuracy and robustness.
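The saturation effect described above can be reproduced with a minimal sketch of the classical RLS recursion (not the paper's IRLS; the function name and data are illustrative assumptions). The norm of the gain vector k shrinks toward zero as samples accumulate, so late observations barely modify the parameter estimate:

```python
import numpy as np

def rls_gain_norms(X, y, n_params):
    """Classical RLS on (X, y); returns the final estimate and the
    gain-vector norms, which decay toward zero (data saturation)."""
    theta = np.zeros(n_params)
    P = 1e4 * np.eye(n_params)           # inverse-covariance estimate, large initial value
    gains = []
    for x, yt in zip(X, y):
        k = P @ x / (1.0 + x @ P @ x)    # gain vector
        theta = theta + k * (yt - x @ theta)
        P = P - np.outer(k, x @ P)       # covariance update
        gains.append(np.linalg.norm(k))
    return theta, gains

# Synthetic linear regression data (illustrative, not from the paper)
rng = np.random.default_rng(0)
true_theta = np.array([2.0, -1.0])
X = rng.normal(size=(500, 2))
y = X @ true_theta + 0.01 * rng.normal(size=500)

theta, gains = rls_gain_norms(X, y, 2)
# gains[-1] is far smaller than gains[0]: once the gain has decayed,
# the algorithm can no longer track parameters that change over time,
# which is the problem the proposed IRLS is designed to avoid.
```

For a stationary system this decay is harmless, but for a time-varying one the vanishing gain freezes the estimate at an outdated value.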