State variable
A state variable is one of the set of variables that are used to describe the mathematical "state" of a dynamical system. Intuitively, the state of a system describes enough about the system to determine its future behaviour in the absence of any external forces affecting the system. Models that consist of coupled first-order differential equations are said to be in state-variable form.^{[1]}
Examples
- In mechanical systems, the position coordinates and velocities of mechanical parts are typical state variables; knowing these, it is possible to determine the future state of the objects in the system.
- In thermodynamics, a state variable is an independent variable of a state function like internal energy, enthalpy, and entropy. Examples include temperature, pressure, and volume. Heat and work are not state functions, but process functions.
- In electronic circuits, the voltages of the nodes and the currents through components in the circuit are usually the state variables.
- In ecosystem models, population sizes (or concentrations) of plants, animals and resources (nutrients, organic material) are typical state variables.
Control systems engineering
In control engineering and other areas of science and engineering, state variables are used to represent the states of a general system. The set of possible combinations of state variable values is called the state space of the system. The equations relating the current state of a system to its most recent input and past states are called the state equations, and the equations expressing the values of the output variables in terms of the state variables and inputs are called the output equations. As shown below, the state equations and output equations for a linear time-invariant system can be expressed using coefficient matrices

A ∈ R^(N×N), B ∈ R^(N×L), C ∈ R^(M×N), D ∈ R^(M×L),

where N, L and M are the dimensions of the vectors describing the state, input and output, respectively.
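As an illustration of putting a model into state-variable form, consider a mass-spring-damper governed by m·x'' + c·x' + k·x = u(t). Choosing position and velocity as the state variables turns this single second-order equation into two coupled first-order equations, with the coefficient matrices below. The numerical parameter values are made up for the sketch, not taken from the article.

```python
import numpy as np

# Hypothetical mass-spring-damper: m*x'' + c*x' + k*x = u(t).
# State variables: x1 = position, x2 = velocity, so that
#   x1' = x2
#   x2' = -(k/m)*x1 - (c/m)*x2 + (1/m)*u
m, c, k = 1.0, 0.5, 2.0

A = np.array([[0.0, 1.0],
              [-k / m, -c / m]])   # N x N  (N = 2 state variables)
B = np.array([[0.0],
              [1.0 / m]])          # N x L  (L = 1 input)
C = np.array([[1.0, 0.0]])         # M x N  (M = 1 output: position)
D = np.array([[0.0]])              # M x L  (no direct feedthrough)

N, L, M = 2, 1, 1
assert A.shape == (N, N) and B.shape == (N, L)
assert C.shape == (M, N) and D.shape == (M, L)
```

The shape checks at the end mirror the dimension statement above: A relates state to state, B input to state, C state to output, and D input to output.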
Discrete-time systems
The state vector (vector of state variables) representing the current state of a discrete-time system (i.e. digital system) is x[n], where n is the discrete point in time at which the system is being evaluated. The discrete-time state equations are

x[n+1] = A x[n] + B u[n],

which describes the next state of the system, x[n+1], with respect to the current state x[n] and inputs u[n] of the system. The output equations are

y[n] = C x[n] + D u[n],

which describes the output y[n] with respect to the current state x[n] and inputs u[n] to the system.
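The two discrete-time equations can be iterated directly: at each step the output equation reads off y[n] from the current state, and the state equation advances the state to x[n+1]. The matrices and the constant unit input below are illustrative choices, not values from the article.

```python
import numpy as np

# Iterating the discrete-time state equations
#   x[n+1] = A x[n] + B u[n],   y[n] = C x[n] + D u[n].
# Example matrices (made up for illustration):
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

x = np.zeros((2, 1))            # initial state x[0]
outputs = []
for n in range(5):
    u = np.array([[1.0]])       # constant unit input u[n]
    y = C @ x + D @ u           # output equation: y[n]
    outputs.append(float(y[0, 0]))
    x = A @ x + B @ u           # state equation: x[n+1]
```

Note that y[n] depends only on the state x[n] and input u[n] at step n; the update to x must come after the output is computed, otherwise the simulation would be off by one step.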
Continuous-time systems
The state vector representing the current state of a continuous-time system (i.e. analog system) is x(t), and the continuous-time state equations giving the evolution of the state vector are

ẋ(t) = A x(t) + B u(t),

which describes the continuous rate of change ẋ(t) of the state of the system with respect to the current state x(t) and inputs u(t) of the system. The output equations are

y(t) = C x(t) + D u(t),

which describes the output y(t) with respect to the current state x(t) and inputs u(t) to the system.
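Because the continuous-time state equation gives a rate of change rather than the next state, simulating it requires numerical integration. A minimal sketch using the forward-Euler method is shown below; the matrices, step size, and unit input are assumptions for the example, and a real simulation would typically use a higher-order integrator.

```python
import numpy as np

# Forward-Euler integration of the continuous-time state equations
#   dx/dt = A x(t) + B u(t),   y(t) = C x(t) + D u(t).
# Example matrices (a lightly damped oscillator, made up for illustration):
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

dt = 0.01                       # integration step size
x = np.zeros((2, 1))            # initial state x(0)
u = np.array([[1.0]])           # constant unit input u(t)
for step in range(1000):        # simulate 10 seconds
    xdot = A @ x + B @ u        # state equation: rate of change
    x = x + dt * xdot           # Euler update x(t + dt)
y = C @ x + D @ u               # output equation: y(t) at t = 10
```

With a constant input, the state settles toward the equilibrium where ẋ(t) = 0, i.e. A x = −B u; for these matrices the steady-state output is 0.5.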
See also
- State space (controls)
- Control theory
- Equation of state
- State (computer science)
- Dynamical systems
- State (functional analysis)
- State diagram
- State variable filter
References
- ^ William J. Palm III (2010). System Dynamics (2nd ed.). p. 225.