Document Type

Article

Publication Title

Mathematical Problems in Engineering

Abstract

A basic feedback control problem is that of obtaining some desired stability property for a system that contains uncertainties due to unknown inputs. Despite such imperfect knowledge of the selected mathematical model, we often seek to devise controllers that will steer the system in a certain required fashion. Various classes of controllers whose design is based on the method of Lyapunov are known for both discrete [4], [10], [15] and continuous [3–9], [11] models, described by difference and differential equations, respectively. Recently, a theory of what are known as dynamic systems on time scales has been developed; it incorporates both continuous and discrete time by treating time as an arbitrary closed set of reals, and thus allows both kinds of systems to be handled simultaneously [1], [2], [12], [13]. This theory permits one to gain insight into, and a better understanding of, the subtle differences between discrete and continuous systems. We shall, in this paper, utilize the framework of the theory of dynamic systems on time scales to investigate the stability properties of conditionally invariant sets, which are then applied to discuss controlled systems with uncertain elements. For the notion of a conditionally invariant set and its stability properties, see [14]. Our results offer a new approach to the problem in question.
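As a brief illustration of the unifying role of the time-scale calculus mentioned in the abstract (a standard fact of the theory, not a result taken from this paper), the delta derivative specializes to the ordinary derivative on the reals and to the forward difference on the integers:

% Illustrative sketch only: standard time-scale calculus, not a result of this paper.
% A time scale \mathbb{T} is a nonempty closed subset of \mathbb{R}; its forward jump
% operator and graininess are \sigma(t) = \inf\{s \in \mathbb{T} : s > t\} and
% \mu(t) = \sigma(t) - t.
\[
x^{\Delta}(t) =
  \begin{cases}
    x'(t),         & \mathbb{T} = \mathbb{R}\ (\mu \equiv 0),\\
    x(t+1) - x(t), & \mathbb{T} = \mathbb{Z}\ (\mu \equiv 1),
  \end{cases}
\]
% so a single dynamic equation x^{\Delta} = f(t, x) covers both the differential
% equation x' = f(t, x) and the difference equation x(t+1) - x(t) = f(t, x).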

First Page

1

Last Page

10

DOI

10.1155/S1024123X95000020

Publication Date

1995
