© 2000 by CRC Press LLC

12.2 Using the State of the System to Determine Stability

The stability of a system is defined with respect to a given equilibrium point in state space. If the initial state x0 is selected at an equilibrium state x of the system, then the state will remain at x for all future time. When the initial state is selected close to an equilibrium state, the system might remain close to the equilibrium state or it might move away. In this section we introduce conditions that guarantee that whenever the system starts near an equilibrium state, it remains near it, perhaps even converging to the equilibrium state as time increases. For simplicity, only time-invariant systems are considered in this section. Time-variant systems are discussed in Section 12.5.
Continuous, time-invariant systems have the form

    ẋ(t) = f(x(t))                                  (12.1)

and discrete, time-invariant systems are modeled by the difference equation

    x(t + 1) = f(x(t))                              (12.2)

Here we assume that f: X → Rⁿ, where X ⊆ Rⁿ is the state space. We also assume that the function f is continuous; furthermore, for an arbitrary initial state x0 ∈ X, there is a unique solution of the corresponding initial value problem x(t0) = x0, and the entire trajectory x(t) is in X. Assume furthermore that t0 denotes the initial time period of the system.

It is also known that a vector x ∈ X is an equilibrium state of the continuous system, Eq. (12.1), if and only if f(x) = 0, and it is an equilibrium state of the discrete system, Eq. (12.2), if and only if x = f(x). In this chapter the equilibrium of a system will always mean the equilibrium state, unless specified otherwise.

In analyzing the dependence of the state trajectory x(t) on the selection of the initial state x0 near the equilibrium, the following stability types are considered.

Definition 12.1
1. An equilibrium state x is stable if there is an ε0 > 0 with the following property: For all ε1, 0 < ε1 < ε0, there is an ε > 0 such that if ||x – x0|| < ε, then ||x – x(t)|| < ε1, for all t > t0.
2. An equilibrium state x is asymptotically stable if it is stable and there is an ε > 0 such that whenever ||x – x0|| < ε, then x(t) → x as t → ∞.
3. An equilibrium state x is globally asymptotically stable if it is stable and with an arbitrary initial state x0 ∈ X, x(t) → x as t → ∞.

The first definition says an equilibrium state x is stable if the entire trajectory x(t) stays closer to the equilibrium state than any small ε1, provided the initial state x0 is selected close enough to the equilibrium state. For asymptotic stability, in addition, x(t) has to converge to the equilibrium state as t → ∞. If an equilibrium state is globally asymptotically stable, then x(t) converges to the equilibrium state regardless of how the initial state x0 is selected.
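As a small numerical sketch of the equilibrium conditions above (the particular functions f chosen here are illustrative assumptions, not taken from the text): for a discrete map f(x) = x², the equilibria satisfy x = f(x), while for a continuous system with f(x) = x(1 – x), the equilibria satisfy f(x) = 0.

```python
def f_disc(x):
    # discrete dynamics x(t+1) = f(x(t)); equilibria satisfy x = f(x)
    return x * x

def f_cont(x):
    # continuous dynamics x'(t) = f(x(t)); equilibria satisfy f(x) = 0
    return x * (1.0 - x)

def is_disc_equilibrium(x, tol=1e-12):
    # checks the fixed-point condition x = f(x) up to a tolerance
    return abs(f_disc(x) - x) < tol

def is_cont_equilibrium(x, tol=1e-12):
    # checks the zero condition f(x) = 0 up to a tolerance
    return abs(f_cont(x)) < tol

candidates = (0.0, 0.5, 1.0)
print([x for x in candidates if is_disc_equilibrium(x)])  # [0.0, 1.0]
print([x for x in candidates if is_cont_equilibrium(x)])  # [0.0, 1.0]
```

Both systems happen to share the equilibria 0 and 1; the point x = 0.5 is rejected by both tests because neither condition holds there.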
These stability concepts are called internal, because they represent properties of the state of the system. They are illustrated in Fig. 12.1. In the electrical engineering literature, our stability definition is sometimes called marginal stability, and our asymptotic stability is called stability.

FIGURE 12.1 Stability concepts. (Source: F. Szidarovszky and A.T. Bahill, Linear Systems Theory, Boca Raton, Fla.: CRC Press, 1992, p. 168. With permission.)
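The definitions can be made concrete by iterating a discrete system, Eq. (12.2), near an equilibrium (the two maps below are illustrative assumptions). For x(t + 1) = 0.5·x(t), the equilibrium x = 0 is asymptotically stable: trajectories started nearby converge to it. For x(t + 1) = 2·x(t), x = 0 is still an equilibrium, but it is unstable: trajectories started arbitrarily close move away.

```python
def trajectory(f, x0, steps):
    # iterate the discrete system x(t+1) = f(x(t)) from initial state x0
    xs = [x0]
    for _ in range(steps):
        xs.append(f(xs[-1]))
    return xs

stable = trajectory(lambda x: 0.5 * x, 0.1, 20)    # contracts toward x = 0
unstable = trajectory(lambda x: 2.0 * x, 0.1, 20)  # expands away from x = 0

print(abs(stable[-1]))    # ~1e-7: x(t) -> 0, asymptotically stable
print(abs(unstable[-1]))  # ~1e5: leaves every neighborhood of 0
```

Here the contraction factor 0.5 keeps every trajectory within its starting distance of the equilibrium (stability) and drives it to 0 (asymptotic stability); the factor 2 violates the stability condition for any ε1, no matter how close x0 is chosen to 0.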