
Machine code for accessing a bit variable is different in ?C_INITSEG and in a normal function?

I have a question about the machine code for a bit variable in ?C_INITSEG versus in a normal function.

Assume the bit variable PASS is declared and initialized:

bit PASS=0;

After starting a debug session, I can see the machine code for PASS=1 and PASS=0 as follows:

(For PASS=1)
D234     SETB PASS(0x26.4)
(For PASS=0)
C234     CLR  PASS(0x26.4)
* where C2/D2 is the high byte and 34 is the low byte

Thus I guess: 0x34 refers to the PASS variable, and C2/D2 stand for CLR/SETB.
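
That guess can be sanity-checked: on the 8051, direct bit addresses 0x00-0x7F map onto the bit-addressable RAM bytes 0x20-0x2F, eight bits per byte, so 0x34 should decode to byte 0x26, bit 4, which is exactly the PASS(0x26.4) shown by the debugger. A small host-side C sketch of that arithmetic (the helper name is just for illustration):

#include <stdio.h>

/* Decode an 8051 direct bit address (0x00..0x7F) into the byte address
   in the bit-addressable RAM area (0x20..0x2F) and the bit position. */
static void decode_bit_address(unsigned bit_addr)
{
    unsigned byte_addr = 0x20u + (bit_addr >> 3);   /* 8 bits per byte      */
    unsigned bit_pos   = bit_addr & 0x07u;          /* bit within that byte */
    printf("bit address 0x%02X -> 0x%02X.%u\n", bit_addr, byte_addr, bit_pos);
}

int main(void)
{
    decode_bit_address(0x34);   /* prints: bit address 0x34 -> 0x26.4 */
    return 0;
}

The first bytes fit the same picture: 0xD2 and 0xC2 are the 8051 opcodes for SETB bit and CLR bit, each followed by a one-byte direct bit address.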

But if I did not assign an initial value to PASS, as below:

bit PASS;  // no initialization compared with above

Then I found that C134 is absent from the ?C_INITSEG segment (because PASS is not initialized...).

It seems that C134 stands for "PASS=0" in ?C_INITSEG, while C234 stands for "PASS=0" in a normal function...
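
For comparison, here is a minimal C51-style sketch I could use to reproduce both cases in one project (this relies on the Keil C51 bit extension; the second variable name is made up for illustration):

bit PASS = 0;   /* initialized: this is the case where C134 appears in ?C_INITSEG */
bit DONE;       /* not initialized: no corresponding entry in ?C_INITSEG          */

void main(void)
{
    PASS = 1;   /* inside a function: SETB PASS (D2 + bit address) */
    PASS = 0;   /* inside a function: CLR  PASS (C2 + bit address) */
    DONE = 0;   /* inside a function: CLR  DONE (C2 + its bit address) */

    while (1) {
        /* stay here so the generated code is easy to inspect in the debugger */
    }
}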

Why does "PASS=0" have two different machine-code representations?