
How could I make an adjustable delay function at the "microsecond" level


Author
Jason Yang
Posted
17-Jan-2005 08:51 GMT
Toolset
C51
How could I make an adjustable delay function at the "microsecond" level
Hello everyone:

I use an AT89C52 with an 11.059 MHz oscillator.
One instruction takes about 1 microsecond (10^-6 s).

I wrote a delay function in *.asm.

It works well, but I want it to be adjustable.
I want the delay to depend on the parameter I pass. My goal is to modify the delay time for some protocol time-slot testing.

I found it somewhat difficult to achieve,
because even the basic instruction, for example _nop_, takes 1 microsecond.
I can't add more decision logic; it would take more time.

How could I implement such a delay?
Please give me a hand, thanks a lot!
Author
Andrew Neil
Posted
17-Jan-2005 10:02 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
Sounds like you need to use a faster processor, and/or one that uses fewer clock cycles per instruction.

Maybe you could use a timer, but you'll still have difficulties if the delays are on the same order as the execution time of a single instruction!

Maybe some external hardware to do the critical timing?

Take a look at this thread:
http://www.keil.com/forum/docs/thread2938.asp
Author
Drew Davis
Posted
17-Jan-2005 17:38 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
You might try a series of _nops_, and jump far enough down into them so that the remainder represents the time that you need.

There is still some overhead in this method, of course. You have to calculate the jump and get into and out of the function. You won't be able to create a 1 us delay, but you might be able to create fairly precise delays from, say, 10 us up to as long a table as you can stand.

Longer intervals mean you can use a loop, at the cost of coarser granularity and more overhead.

Small intervals and precise measurements are best done by timer hardware rather than software. "Small" is relative to your processor clock and architecture. If software has to act at very precise times (send this response exactly 3.4 microseconds after receiving this message), then you need a faster processor, as Andrew suggests. There is an overall system design issue to be considered here, so it's hard to suggest solutions without really knowing the requirements.
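Drew's "jump partway into a run of NOPs" trick can be modelled off-target. In this plain-C sketch (`nop_run` and the counter are illustrative names, not from the thread), each simulated NOP just bumps a counter; on the 8051 each entry would be a real one-byte NOP:

```c
#include <stdio.h>

#define NOP_TABLE_LEN 200   /* length of the NOP run */

static unsigned executed;

/* Simulate landing 'skip' entries into a run of NOP_TABLE_LEN NOPs:
   everything from the entry point to the end of the run executes. */
static void nop_run(unsigned skip) {
    for (unsigned i = skip; i < NOP_TABLE_LEN; i++)
        executed++;         /* stands in for one NOP */
}

int main(void) {
    /* For a 10-NOP delay, enter 200 - 10 = 190 entries down the run. */
    nop_run(NOP_TABLE_LEN - 10);
    printf("NOPs executed: %u\n", executed);   /* prints 10 */
    return 0;
}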
Author
erik malund
Posted
17-Jan-2005 17:52 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
"I want to delay according to the parameter I pass. My goal is to modify the delay time for some protocol time-slot testing."

For delays this short, the only option is inline NOPs.

Two possibilities:

1) Since you refer to "testing", could you use conditional assembly?

2) Select the routine instead of selecting the delay:

do_it_with_1
mov...
...
....
...
nop
...
...
...
...
ret

do_it_with_2
mov...
...
....
...
nop
nop
...
...
...
...
ret

Erik
Author
erik malund
Posted
17-Jan-2005 17:53 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
Oh, even easier:
do_it_with_1
MACRO_1
nop
MACRO_2

do_it_with_2
MACRO_1
nop
nop
MACRO_2

Erik
Author
Tim-Oliver Paaschen
Posted
19-Jan-2005 06:27 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
Take a look at the following thread:
http://www.keil.com/forum/docs/thread1940.asp
It has some interesting macros to convert microseconds to instruction cycles and to place the appropriate number of NOPs into your code.
Author
Jon Ward
Posted
20-Jan-2005 16:50 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
Just for grins, I created the following assembler file using MPL macros:

NAME	NOPS

?PR?NOPS?NOPS	SEGMENT CODE
	RSEG	?PR?NOPS?NOPS

%DEFINE (GenNops) (
%SET (cnt, 200)

%WHILE (%cnt GT 0) (
  %SET(b_ones,(%cnt / 1) mod 10)
  %SET(b_tens,(%cnt / 10) mod 10)
  %SET(b_huns,(%cnt / 100) mod 10)
PUBLIC	NOPS_%substr(%b_huns,2,1)%substr(%b_tens,2,1)%substr(%b_ones,2,1)
NOPS_%substr(%b_huns,2,1)%substr(%b_tens,2,1)%substr(%b_ones,2,1):	nop
  %SET (cnt, %cnt - 1)
  )
)

%GenNops

	RET
	END

When you assemble this file, you'll get something like the following:

                      19     PUBLIC  NOPS_200
0000 00               20     NOPS_200:       nop
                      21
                      22     PUBLIC  NOPS_199
0001 00               23     NOPS_199:       nop
                      24
                      25     PUBLIC  NOPS_198
0002 00               26     NOPS_198:       nop
                      27
                      28     PUBLIC  NOPS_197
0003 00               29     NOPS_197:       nop
.
.
.
                     613     PUBLIC  NOPS_002
00C6 00              614     NOPS_002:       nop
                     615
                     616     PUBLIC  NOPS_001
00C7 00              617     NOPS_001:       nop
                     618
                     619
                     620
00C8 22              621             RET


The assembler file is simply a table of NOP instructions with a public label for each. You can control how many are generated with the %SET (cnt, 200).

The only overhead for calling a "function" in this table is the time it takes to execute the CALL and RET instructions.

The following shows how to delay for 10 NOPs (plus the CALL and RET).

extern void NOPS_010 (void);

void main (void)
{
while (1)
  {
  NOPS_010 ();
  }
}

One improvement would be to change the labels so that there is no leading-zero prefix for numbers less than 100.

Jon
Author
Jason Yang
Posted
21-Jan-2005 03:12 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
Thank you all, I learned a lot from you.

But Jon, maybe you solved only half of my question.
You all gave me great solutions for how to delay at the microsecond level,
but I still can't set the "delay count" at run time.
Your method calls a particular parameterless subroutine in the program.
For example, if I want to delay 10 us, I call NOPS_010(), and for 20 us, NOPS_020().
This is great when implementing a fixed protocol.
But I want to adjust the signal time slot (protocol) at run time,
so I want to pass in a "delay count" that decides how long to delay.
I tried the method in this article:
http://www.keil.com/forum/docs/thread2938.asp (the one Graham Cole wrote).
I declared a variable to store the "delay count".
Then I can change the "delay count" by passing a different parameter,
or I can call the major function that includes the delay function.

This method doesn't seem to work.
When I call it with a constant (e.g. NOPS(10)), it almost meets my requirement.
But when I call it with a variable (e.g. NOPS(delay_count) where delay_count = 10),
it has serious overhead: it delays about (delay_count + 20~30) us.

Are there any more solutions?
Anyway, thank you all. I did learn a lot.
Author
Andrew Neil
Posted
21-Jan-2005 07:54 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
"When I call with a constant (e.g. NOPS(10)), it almost meets my requirement. But when I call with a variable (e.g. NOPS(delay_count) where delay_count = 10), it has serious overhead."

See my post dated 11/17/03 15:26:56 in the thread I cited earlier:
http://www.keil.com/forum/docs/thread2938.asp
It specifically describes a way to cope with this overhead!
Author
Jon Ward
Posted
21-Jan-2005 20:05 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
That MACRO was originally written by me in the following thread:

http://www.keil.com/forum/docs/thread1940.asp

There, I made the following important note:

NOTE: Do not use variables with this macro. Use constants only. Using a variable causes actual code to be generated for each ternary operator. Using constants causes the ternary operator to be evaluated by the preprocessor (which is what we want).

Jon
Author
Jon Ward
Posted
24-Jan-2005 05:31 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
OK, so I used standard macros to create the same thing...

NAME	NOPS

?PR?NOPS?NOPS	SEGMENT CODE
	RSEG	?PR?NOPS?NOPS

NUMNOPS	EQU	200

MakeNOP MACRO X
	PUBLIC	NOPS_&X
	NOPS_&X:	NOP
	ENDM

NOP_LIST:
REPT	NUMNOPS
	MakeNOP %(NUMNOPS - ($ - NOP_LIST))
	ENDM

	RET
	END

Jon
Author
cui wei
Posted
22-Jan-2005 08:51 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
sample:

void delay_us(unsigned char counter)
{
ACC = 255 - counter;
#pragma asm

mov dptr,#DELAYBEGIN;
jmp @a+dptr

DELAYBEGIN:
REPT (255 - 9)
nop;
ENDM

#pragma endasm
}

note: counter must be from 9 to 255.
Author
erik malund
Posted
24-Jan-2005 14:39 GMT
Toolset
C51
RE: How could I make an adjustable delay function at the "microsecond" level
#pragma asm
Why on earth would you do that? Put the subroutine in a .a51 module.

Also, the routine described is wrong:
ACC = 255 - counter;
It should be:
ACC = 255 - counter - overhead
jb acc.7,crash ; routine called with too short a delay
Erik
