
Is there any delay function that can be used with the PIC18F4550 in C programming, similar to delay() and delayMicroseconds() on Arduino?

The delay functions I can find are Delay10KTCYx(), Delay10TCYx(), etc., which make it very difficult to generate the delay we want, and they are not even specified in milliseconds.
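For context, the closest I can get right now is wrapping the cycle-based routines myself. This is only a rough sketch, assuming the C18 delays.h routines and a 48 MHz oscillator (so a 12 MHz instruction clock, i.e. 12 000 TCY per millisecond); delay_ms() is just my own name for the wrapper:

```c
#include <delays.h>   /* C18 cycle-based delay routines (Delay1KTCYx etc.) */

/* Rough wrapper, assuming Fosc = 48 MHz so Fosc/4 = 12 MHz:
 * 12 000 instruction cycles (TCY) per millisecond. */
void delay_ms(unsigned int ms)
{
    while (ms--) {
        Delay1KTCYx(12);   /* 12 x 1 000 TCY = 12 000 TCY ~= 1 ms */
    }
}
```

Even this is tied to one specific clock setting, which is part of what I would like to avoid.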

I would appreciate your assistance. Thank you.

Toby
  • http://thinkinbinary.blogspot.nl/2013/04/generating-time-delays.html – Lanting Aug 10 '17 at 11:21
  • 2
    the best way is to use the timer. All other methods works fine on a very simple uC but fail on more decent ones. hardcoded delays should be avoided anyway as a very very bad programming habit. – 0___________ Aug 10 '17 at 12:14

1 Answer


When doing microcontroller programming, you should always use the on-chip hardware timers if possible. There are typically several of those and perhaps a real-time clock as well. Rather than looking for some busy-delay function, you should look for a driver or HAL around those hardware timers present in your MCU.

In addition, note that if you need better than 1 ms resolution, "delay" functions tend to be inaccurate.

Busy-delay() functions/loops are mostly a quick & dirty amateur solution. They are bad because:

  • They consume 100% CPU and thereby 100% power.
  • They have a tight coupling against the compiler and its settings. Different optimization levels might break such delays.
  • They have a tight coupling to the system clock, whereas on-chip timer drivers usually specify which clock to use as a parameter and adjust pre-scaling accordingly.
  • They are typically not very accurate.
  • Overall they do not necessarily have deterministic behavior.
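As a concrete illustration of the timer-driver approach, here is a minimal sketch of a Timer0-based millisecond tick for the PIC18F4550. It assumes the XC8 compiler (v2.x interrupt syntax) and a 48 MHz oscillator (12 MHz instruction clock); the names ms_ticks, millis() and delay_ms() are only illustrative, not part of any vendor library:

```c
#include <xc.h>
#include <stdint.h>

static volatile uint32_t ms_ticks;     /* incremented once per millisecond */

void timer0_init(void)
{
    T0CON = 0b00001000;        /* 16-bit mode, internal clock, prescaler bypassed */
    TMR0H = 0xD1;              /* preload 65536 - 12000 = 0xD120, i.e. 12 000     */
    TMR0L = 0x20;              /* instruction cycles = 1 ms at Fosc/4 = 12 MHz    */
    INTCONbits.TMR0IF = 0;
    INTCONbits.TMR0IE = 1;
    INTCONbits.GIE = 1;
    T0CONbits.TMR0ON = 1;      /* start the timer */
}

void __interrupt() isr(void)   /* XC8 v2.x syntax */
{
    if (INTCONbits.TMR0IE && INTCONbits.TMR0IF) {
        TMR0H = 0xD1;          /* reload for the next millisecond */
        TMR0L = 0x20;
        INTCONbits.TMR0IF = 0;
        ms_ticks++;
    }
}

static uint32_t millis(void)
{
    uint32_t t;
    INTCONbits.GIE = 0;        /* 32-bit read must be atomic on an 8-bit core */
    t = ms_ticks;
    INTCONbits.GIE = 1;
    return t;
}

void delay_ms(uint32_t ms)
{
    uint32_t start = millis();
    while ((uint32_t)(millis() - start) < ms) {
        /* free to poll other things or go to sleep here */
    }
}
```

Because the hardware counter keeps running while other interrupt handlers execute, the worst-case error of delay_ms() is roughly one tick plus interrupt latency, and the same ms_ticks counter doubles as an Arduino-style millis() for non-blocking timeouts. The reload inside the ISR adds a few cycles of drift per tick, which is usually acceptable for millisecond timing.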
Lundin
  • I think this is bad advice for this small micro (PIC18F4550; 32K flash, 2K SRAM, not something which would normally be used with a higher-level OS). At boot-up, when the program is setting up hardware, delays may be used as the particular hardware demands; during program execution, for delays of a couple of microseconds, busy delays are also the best practice, since they don't need any "external" resource (such as a timer). The base frequency of the timers is usually also the system clock, so no win there either. Of course the delay needs to be an assembler routine (possibly generated). – Jubatian Sep 07 '17 at 10:31
  • @Jubatian No, this advice applies perfectly to a bare-metal MCU. For any MCU program of any kind, you are going to need some manner of generic timer. This is best implemented through an RTC or one of the timer peripherals. Once you have this you can easily implement a function like `delay(ms)`. For shorter delays with µs resolution, you'll want to use a dedicated on-chip timer. And yes, the source will be the system clock, but if your timer driver knows the system clock and the desired frequency, it can pick pre-scaler settings accordingly (see the prescaler sketch after these comments) - which is a big win. – Lundin Sep 07 '17 at 10:57
  • The same holds for properly generated delay code (both a prescaler setting and generated code are fixed at compile time!), without using a precious timer (which are also scarce on such small micros!). The delay code produces a guaranteed minimum delay (an interrupt could lengthen it, but if the overall process was sensitive to that, then it doesn't matter whether it fires in the delay or somewhere else). What you suggest is good for process control, where it is indeed greatly recommended on anything, but what the OP asks about is more like special HW interfacing in the microsecond range. – Jubatian Sep 07 '17 at 13:13
  • And you should still correct the part of the answer where you mention why busy-delay functions are bad. If you delay using a timer, that's still a busy delay, just polling a timer until it hits a target, so those points are wrong there without clarification. – Jubatian Sep 07 '17 at 13:15
  • @Jubatian Another major difference is that a hardware timer will keep running, while a "burn-away" amateur loop stops making progress whenever interrupts execute. Meaning that the hardware timer version will at worst have the inaccuracy of 1 ISR call, while the amateur thing will have the inaccuracy of _n_ ISR calls. The typical case where the amateur thing breaks is LCD initialization code over SPI, where you need delays but the SPI/DMA generates interrupts. – Lundin Sep 07 '17 at 13:35
  • SPI only generates interrupts if you ask it to do so. Now if you have a system doing several things, then you need either an interrupt or periodic checking of the SPI or DMA result (it depends), and then of course a busy delay (be it polling a timer or a precalculated instruction sequence) won't do. If you only need to init the LCD (possible if it is done at boot-up), then a busy delay is fine. Your LCD code might be prone to break anyway: if you have interrupts so lengthy that they matter, it will just be rarer (less chance for the interrupt to fire on the right spot to tip it over). – Jubatian Sep 07 '17 at 17:12
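To illustrate the "pick pre-scaler settings accordingly" point from the comments above, here is a small, hypothetical helper in plain C. The prescaler option list matches the PIC18 Timer0 choices (1:1 when bypassed, up to 1:256); the function and struct names are assumptions for the sketch, not an existing API:

```c
#include <stdint.h>

/* Hypothetical helper: given the timer input clock and the desired tick
 * frequency, pick the smallest prescaler that lets the reload value fit
 * in a 16-bit timer. */
typedef struct {
    uint16_t prescaler;   /* chosen prescaler ratio          */
    uint16_t reload;      /* value to preload into the timer */
} timer_cfg_t;

int timer_pick_prescaler(uint32_t timer_clock_hz, uint32_t tick_hz, timer_cfg_t *cfg)
{
    static const uint16_t options[] = {1, 2, 4, 8, 16, 32, 64, 128, 256};
    for (unsigned i = 0; i < sizeof options / sizeof options[0]; i++) {
        uint32_t counts = timer_clock_hz / (tick_hz * options[i]);
        if (counts != 0 && counts <= 65536UL) {
            cfg->prescaler = options[i];
            cfg->reload = (uint16_t)(65536UL - counts);
            return 0;                    /* success */
        }
    }
    return -1;                           /* no prescaler fits */
}
```

With a 12 MHz timer clock and a 1 kHz tick, this picks a 1:1 prescaler and a reload of 53536 (0xD120), the same numbers as the Timer0 sketch in the answer; change the clock and the driver adapts without touching the calling code.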