Adaptive Control

Adaptive control is the capability of a system to modify its own operation so as to achieve the best possible mode of operation. In general, an adaptive system must be able to perform three functions: provide continuous information about the present state of the system, that is, identify the process; compare present performance with the desired or optimum performance and decide to change the system so that the defined optimum is achieved; and initiate a proper modification that drives the control system toward that optimum. These three principles of identification, decision, and modification are inherent in any adaptive system.[1]
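
To make the identification-decision-modification loop concrete, the short Python sketch below (not taken from the cited source) adapts the gain estimate of an assumed first-order plant y[k+1] = a·y[k] + b·u[k] in which the coefficient a is known and only the input gain b is unknown; every name and numerical value (a_true, b_true, b_hat, gamma, setpoint) is illustrative.

```python
# A minimal sketch of the identification-decision-modification loop, assuming a
# first-order plant y[k+1] = a*y[k] + b*u[k] whose input gain b is unknown to the
# controller. All names and numbers are illustrative, not taken from the source.

a_true, b_true = 0.9, 0.5   # "real" plant parameters (b_true is hidden from the controller)
b_hat = 0.1                 # initial estimate of the unknown gain
gamma = 0.5                 # adaptation rate (0 < gamma < 2 for the normalized update)
setpoint = 1.0              # desired output

y, u = 0.0, 0.0
for k in range(200):
    # Identification: predict the next output with the current estimate, observe the
    # real plant, and correct b_hat with a normalized gradient step on the error.
    y_pred = a_true * y + b_hat * u
    y_next = a_true * y + b_true * u
    b_hat += gamma * (y_next - y_pred) * u / (1.0 + u * u)

    # Decision: measure how far present performance is from the desired performance.
    tracking_error = setpoint - y_next

    # Modification: choose the next input so that the *estimated* model reaches the
    # setpoint in one step (a deadbeat-style rule, used purely for illustration).
    y = y_next
    u = (setpoint - a_true * y) / max(b_hat, 1e-3)

print(f"output {y:.3f}, tracking error {tracking_error:+.4f}, "
      f"estimated gain {b_hat:.3f} (true gain {b_true})")
```

In a practical design the deadbeat-style rule would be replaced by a proper adaptive law (for example a model-reference or self-tuning scheme), but the same three steps recur in every sampling period.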


See Also




References

  1. ↑ "What Does Adaptive Control Mean?" Britannica.