Definitions for "State Space"
In this book, another name for the phase space of a dynamical system. Roughly speaking, if the dynamics of a dynamical system can be described by n values, then the state space is the n-dimensional volume that the system moves through. Systems that are continuous in time will form a smooth trajectory through this volume, while discrete systems may jump to different locations on subsequent time steps. In either case, if a system ever returns to a previously visited location in the state space, then the system is in either a fixed point or a limit cycle. For chaotic systems, or for programs that never halt, the system will always be at a previously unvisited portion of the state space.
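The "previously visited location" test in the discrete case lends itself to a short illustration. The sketch below (Python, not from the glossary; the function name classify_trajectory and the doubling-map example are hypothetical choices) iterates a map, records each state visited, and reports a fixed point or a limit cycle as soon as a state repeats; if no repeat is found within the step budget, the orbit may be chaotic or the program non-halting.

```python
def classify_trajectory(step, initial_state, max_steps=10_000):
    """Iterate `step` from `initial_state` and report what the orbit does.

    A repeat of period 1 is a fixed point; a longer repeat is a limit cycle.
    """
    seen = {}                      # state -> time step at which it was first visited
    state = initial_state
    for t in range(max_steps):
        if state in seen:
            period = t - seen[state]
            return "fixed point" if period == 1 else f"limit cycle (period {period})"
        seen[state] = t
        state = step(state)
    return "no repeat found (possibly chaotic or non-halting)"

# Example: the doubling map on the integers mod 11 (an arbitrary illustrative map).
if __name__ == "__main__":
    doubling = lambda x: (2 * x) % 11
    print(classify_trajectory(doubling, 3))      # limit cycle (period 10)
    print(classify_trajectory(lambda x: 0, 5))   # settles onto the fixed point 0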
The set of possible states in a system.
An abstract space used to represent the behavior of a system. Its dimensions are the variables of the system. Thus a point in the phase space defines a potential state of the system.
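One way to make this concrete (an illustrative formalization, not part of the original definition): if the system is described by variables x_1, ..., x_n with ranges X_1, ..., X_n, the state space is their Cartesian product, and the system's state at time t is a single point in it.

```latex
% Illustrative notation, assuming variables x_1, ..., x_n with ranges X_1, ..., X_n.
S = X_1 \times X_2 \times \cdots \times X_n,
\qquad
s(t) = \bigl(x_1(t), x_2(t), \ldots, x_n(t)\bigr) \in S
```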
Keywords: descriptions, set
A set of descriptions, or STATES.