I am writing a fuzzy PID controller for a PWM motor driver that controls motor speed. The feedback is a square wave from a Hall-effect encoder fixed to the motor shaft.
My code counts the clock rising edges between two consecutive rising edges of the encoder's square wave to measure the time for one rotation. Another function converts any given RPM into the time required for one rotation, and that time is the setpoint for the controller.
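For concreteness, the setpoint conversion is along these lines (the timer frequency F_CLK_HZ and the function name are placeholders for this question, not my actual code):

```c
#include <stdint.h>

#define F_CLK_HZ 1000000UL  /* hypothetical 1 MHz timer clock */

/* One rotation at R RPM takes 60/R seconds, i.e. 60 * F_CLK_HZ / R
   timer ticks; this is the value the measured edge count is compared to. */
uint32_t rpm_to_ticks(uint32_t rpm)
{
    return (60UL * F_CLK_HZ) / rpm;
}
```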
The error is the difference between the setpoint and the current value (what the time for one rotation should be minus what it currently is).
This goes into a backward-difference PID control algorithm, and I get a number as output (basically Kp*error + Ki*(sum of previous errors) + Kd*(current error - previous error)).
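The update I have in mind is roughly this sketch (the struct and names are made up for this question, not my actual code):

```c
typedef struct {
    float kp, ki, kd;   /* controller gains */
    float integral;     /* running sum of past errors */
    float prev_error;   /* error from the previous sample */
} pid_state_t;

/* Backward-difference PID step: setpoint and measurement are both
   times-per-rotation in timer ticks. */
float pid_update(pid_state_t *s, float setpoint, float measured)
{
    float error = setpoint - measured;
    s->integral += error;
    float derivative = error - s->prev_error;  /* backward difference */
    s->prev_error = error;
    return s->kp * error + s->ki * s->integral + s->kd * derivative;
}
```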
This output then has to be mapped to a specific PWM duty-cycle percentage so the PWM driver process can increase or decrease the power delivered to the motor.
I am having conceptual trouble with this mapping. How should I convert the controller output into a duty-cycle percentage? I was thinking along the lines of computing the controller output for the maximum possible error and for zero error, and then mapping those two values to 100% and 1% duty cycle respectively.
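As a sketch of that idea (MAX_PID_OUT is a hypothetical constant standing for whatever the controller produces at the maximum possible error; I am not sure the approach itself is sound, which is the question):

```c
#define MAX_PID_OUT 10000.0f  /* assumed output at the maximum possible error */

/* Linear map: output 0 -> 1% duty, output MAX_PID_OUT -> 100% duty,
   clamped so the PWM driver never sees a value outside 1..100. */
float pid_out_to_duty(float u)
{
    float duty = 1.0f + (u / MAX_PID_OUT) * 99.0f;
    if (duty < 1.0f)   duty = 1.0f;
    if (duty > 100.0f) duty = 100.0f;
    return duty;
}
```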
I am looking for the concept and not the code. Thanks.