
Coding short delays

I would like to be able to write code that implements short delays (a few microseconds), but where the generated code automatically adapts to the speed of the processor, i.e. as the processor speed changes, the delay remains constant. A sketch of the kind of thing I mean is below.
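For illustration, one workaround is to define the clock frequency yourself and derive a busy-wait loop count from it. This is only a rough sketch: CPU_FREQ_HZ and CYCLES_PER_LOOP are placeholders you would have to set for your own part, and the cycles consumed per loop iteration depend on the compiler and optimisation level, so the figure needs calibrating against the listing file or a scope.

#include <stdint.h>

/* Placeholder: set to your actual oscillator frequency, since no
   predefined constant is provided by the tools. */
#define CPU_FREQ_HZ     12000000UL

/* Placeholder: cycles consumed per loop iteration; verify against
   the generated listing for your compiler and optimisation level. */
#define CYCLES_PER_LOOP 4UL

/* Busy-wait for roughly 'us' microseconds.
   Scales with CPU_FREQ_HZ, so changing the clock constant keeps
   the delay approximately constant. */
static void delay_us(uint16_t us)
{
    /* volatile stops the compiler optimising the empty loop away */
    volatile uint32_t n = ((CPU_FREQ_HZ / 1000000UL) * us) / CYCLES_PER_LOOP;

    while (n--)
        ;
}

The obvious drawback is that CPU_FREQ_HZ has to be kept in step with the project settings by hand, which is exactly what a predefined constant would avoid.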

I don't suppose there is a predefined macro constant available (that Keil hasn't told us about) that makes the cycle time available to the source code?

I guess this is quite a common problem.

uVision already knows the target clock frequency (it is entered in the target options), so would it be difficult for a predefined constant to be provided?