how to find the minimum resolution for controller output signal ?
Andy Bartlett on 18 Jul 2022
Edited: Andy Bartlett on 18 Jul 2022
Minimum resolution could mean multiple things.
My guess is you want to do a rough what-if analysis to determine how coarsely you can quantize just the output of the controller and still achieve your system-level behavioral requirements.
You can quickly explore this experimentally by dropping a Quantizer block on the signal line from the controller output to the plant input. (Alternatively, you could drop in a pair of Data Type Conversion blocks to quantize to fixed point and then convert back to double.)
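To see what the Quantizer block is doing to the signal, here is a minimal MATLAB sketch of the same round-to-nearest quantization, with a hypothetical interval and sample value:

```matlab
% Round-to-nearest quantization, as performed by the Quantizer block.
q = 0.01;            % hypothetical Quantization interval
u = 0.123456;        % hypothetical controller output sample
y = q * round(u/q);  % quantized value passed to the plant, here 0.12
```

The Data Type Conversion round trip behaves similarly: converting to a fixed-point type with slope q and back to double snaps the signal to multiples of q (plus saturation or wrapping at the type's range limits).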
Try different values for the Quantization interval parameter to determine where the simulation behavior meets the system-level behavioral requirements and where it fails.
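You can automate that sweep from a script. This is a hedged sketch: the model name 'ctrl_model', the block path 'ctrl_model/Quantizer', and the logged output name are assumptions you would replace with your own.

```matlab
% Hypothetical sweep of the Quantizer block's Quantization interval.
% Assumes a model 'ctrl_model' with a Quantizer block named 'Quantizer'
% and signal logging enabled on the plant output.
intervals = [1e-4 1e-3 1e-2 1e-1];
for k = 1:numel(intervals)
    set_param('ctrl_model/Quantizer', ...
        'QuantizationInterval', num2str(intervals(k)));
    out = sim('ctrl_model');
    % Compare the logged response in 'out' against your behavioral
    % requirements (e.g. tracking error bound, settling time) here.
end
```

The coarsest interval that still passes your requirements is a reasonable first estimate of the resolution the controller output needs.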
A key aspect of determining whether behavioral requirements are met is simulating with a set of inputs (references, disturbances, etc.) that is representative of the range the system will see when deployed.
Your diagram shows a step response. That's rarely an adequate set of input scenarios for studying quantization. Books on control system design often focus on a system's unit step response. That's very useful if you make the BIG assumption that the system is really close to linear over its range of operation. Quantizing the controller output introduces a nonlinearity. Whether that quantization has a huge impact or a tiny one depends heavily on the input scenarios that must be handled and on how behavioral success has been defined.
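As a sketch of what "representative inputs" might look like beyond a unit step, you could build a small family of reference scenarios and run the quantized model against each one. The amplitudes and frequencies here are placeholders, not recommendations:

```matlab
% Hypothetical reference scenarios for stress-testing quantization.
t = (0:0.001:5)';
refs = [ones(size(t)), ...          % unit step
        0.005*ones(size(t)), ...    % small step near the quantization interval
        t/5, ...                    % slow ramp (exposes limit-cycle behavior)
        sin(2*pi*0.5*t)];           % sinusoid sweeping through many levels
```

Small-amplitude and slowly varying inputs are often where output quantization hurts most, because the quantization interval is a large fraction of the signal, so include them even if the unit step response looks fine.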
For example, suppose you are designing the system to control the laser in a lithography system. You need to control the laser to meet accuracy requirements across the full width and height of the wafer region being etched. Quantizing the system to be sufficiently accurate for a unit step response could lead to a system that is accurate enough over a small portion of the wafer but causes manufacturing failures in many other regions.