Model reduction of discrete-time systems in limited intervals

Model order reduction (MOR) is the process of obtaining a lower-order surrogate model that accurately approximates the original high-order system. Since no actuator or plant operates over the entire time and frequency ranges, the reduced-order model should be accurate in the actual range of operation. In this paper, model reduction techniques for discrete-time systems are presented that reduce the approximation error within specified time and frequency intervals. The techniques are tested on benchmark numerical examples, and their efficacy is demonstrated.
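As a point of reference for the kind of reduction discussed above, the following is a minimal sketch of standard balanced truncation for a discrete-time system, with the approximation error then measured over a limited frequency band. This is not the paper's proposed method; the example system, the reduced order, and the evaluation band are all illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov, cholesky, svd

# Hypothetical stable discrete-time SISO system (not from the paper)
n = 6
A = np.diag([0.9, 0.8, 0.7, 0.5, 0.3, 0.1])
A[0, 1] = 0.1  # light coupling between the first two states
B = np.ones((n, 1))
C = np.ones((1, n))

# Discrete-time Gramians: A P A^T - P + B B^T = 0 and A^T Q A - Q + C^T C = 0
P = solve_discrete_lyapunov(A, B @ B.T)
Q = solve_discrete_lyapunov(A.T, C.T @ C)

# Square-root balancing: SVD of the cross product of Gramian Cholesky factors
Lp = cholesky(P, lower=True)
Lq = cholesky(Q, lower=True)
U, s, Vt = svd(Lq.T @ Lp)  # s holds the Hankel singular values

r = 3  # illustrative reduced order
S_half = np.diag(s[:r] ** -0.5)
T = Lp @ Vt[:r].T @ S_half        # right projection (n x r)
Ti = S_half @ U[:, :r].T @ Lq.T   # left projection (r x n), Ti @ T = I_r

Ar, Br, Cr = Ti @ A @ T, Ti @ B, C @ T

def freq_resp(A, B, C, w):
    """Transfer function C (e^{jw} I - A)^{-1} B evaluated on the unit circle."""
    n = A.shape[0]
    return np.array([(C @ np.linalg.solve(np.exp(1j * wk) * np.eye(n) - A, B)).item()
                     for wk in w])

# Error over a limited band [0, 0.5] rad/sample (the motivation for
# interval-limited methods: accuracy is only needed where the plant operates)
w = np.linspace(0.0, 0.5, 200)
err = np.abs(freq_resp(A, B, C, w) - freq_resp(Ar, Br, Cr, w)).max()
```

Over the chosen band, `err` is bounded by the usual balanced-truncation bound of twice the sum of the discarded Hankel singular values; interval-limited methods aim to do better than this within the band, at the cost of accuracy outside it.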