I can't get my Julian day function working. With gcc and MSVC++ it works perfectly; in the Keil compiler it doesn't. The date 05/21/2003 should produce 2452781 Julian days, but instead it produces 27950. And sometimes the month parameter is hosed: it's supposed to be 5, but a lot of the time it's something like 31713 or some other crazy value.

Program Size: data=127.3 xdata=8174 code=36284
// Julian day number 0 corresponds to -4713-11-24 Gregorian.
// The Julian day (jd) is computed from Gregorian day, month
// and year (d, m, y) as follows:

ULONG julian(int m, int d, int y)
{
    return ( 1461 * ( y + 4800 + ( m - 14 ) / 12 ) ) / 4
         + ( 367 * ( m - 2 - 12 * ( ( m - 14 ) / 12 ) ) ) / 12
         - ( 3 * ( ( y + 4900 + ( m - 14 ) / 12 ) / 100 ) ) / 4
         + d - 32075;
}
Dammit, never mind. I forgot that 'int' on this processor is only 16 bits. I changed 'int' to 'long' and it works.
"I forgot that 'int' on this processor is..."

As I repeated on this very forum only last week: "I recommend that you should *always* encapsulate *all* your compiler dependencies in #defines!!"

Here: http://www.keil.com/forum/docs/thread31.asp#msg11352
Okay. I've got this. I'm sure it can be improved upon. This is in a header file that is shared between MSVC++ and the Keil.
// BOOL is 4 bytes in msvc++, 1 byte on keil
// UINT is 4 bytes in msvc++, 2 bytes on keil
//
// Define some useful types
#ifndef _WINDOWS_
typedef unsigned char  BYTE;
typedef unsigned int   UINT;
typedef unsigned long  ULONG;
typedef unsigned char  BOOL;
typedef unsigned int   WORD;
typedef unsigned long  DWORD;
typedef unsigned int   USHORT;
#endif
"What do you do about signed stuff?"

How about S8, S16, S32?

"What about BOOL?"

That would be 'bit'.

Stefan
That's the kind of thing "I'm sure it can be improved upon"

Yes: avoid ambiguous names like INT, SHORT and LONG - make them explicitly obvious; eg I use

U8  - Unsigned 8 bits
S8  - Signed 8 bits
U16 - Unsigned 16 bits
S16 - Signed 16 bits

Always have a #error in the final #else.

You also need to encapsulate keyword extensions like xdata, sfr, interrupt, and using.
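A minimal sketch of that kind of header, assuming <limits.h> checks are used to pick the underlying native type (one possible selection scheme, not the only one), with the recommended #error in every final #else:

```c
#include <limits.h>

/* Explicitly-sized type names, selected from the compiler's native
 * types.  The <limits.h> maxima tell us how wide each native type
 * actually is on this target, so the same header works on a 16-bit
 * int compiler (Keil C51) and a 32-bit int compiler (MSVC, gcc). */
#if UCHAR_MAX == 0xFF
typedef unsigned char U8;
typedef signed char   S8;
#else
#error "No 8-bit type available on this target"
#endif

#if UINT_MAX == 0xFFFF
typedef unsigned int   U16;
typedef int            S16;
#elif USHRT_MAX == 0xFFFF
typedef unsigned short U16;
typedef short          S16;
#else
#error "No 16-bit type available on this target"
#endif

#if UINT_MAX == 0xFFFFFFFF
typedef unsigned int  U32;
typedef int           S32;
#elif ULONG_MAX == 0xFFFFFFFF
typedef unsigned long U32;
typedef long          S32;
#else
#error "No 32-bit type available on this target"
#endif
```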
What Andrew said... U8, U16, U32, S8, etc.

"WORD" is evil because every system has a different natural word size. Lots of MS Windows conditioning means a lot of people think "WORD" must be 16 bits and thus "DWORD" must be 32. But I much prefer U16 and U32. "UINT" tells you nothing "unsigned int" didn't; it just saves typing, and it won't save you from the sort of gotcha exhibited in this thread, so there's not much point to that typedef. Use "UINT32" if you find the U32 form too terse.

The C99 spec now provides standard names for these types - uint8_t, int8_t, and so on, from inttypes.h. It seems like it would be a good idea to use the standard names, but I'm having a hard time getting rid of my old habits. Hindsight is clear; such types should have been there from the beginning.

"BOOL" is trickier. Keil's bit data type is somewhat restricted. You can't, for example, have an array of bits, or such a BOOL in a struct along with some other data. So it's only good for "standalone" scalar bools; you can't use it anywhere you can use a normal type. My standard Bool typedef is for a U8, and I use "Bit" when I need to explicitly save the space or time.

Our standard header also includes a definition for bitfield endianness for the platform. So, when you declare bitfields, you can #if the declaration and write it both ways. Kind of extreme. Usually I just avoid bitfields in the first place - another old habit.

Portable declaration of interrupt handlers is difficult at best. You'll probably need a macro to rewrite the entire function declaration:

// Keil
#define INTERRUPT(name, vec)  void name (void) interrupt vec
#define USING(bank)           using bank

// somebody else
#define INTERRUPT(name, vec)  interrupt void name (void)
#define USING(bank)

so a line like:

INTERRUPT(MyProcName, 3) USING(3)

has some hope of having portable syntax. Or, you just isolate that sort of code and accept the price of rewriting it for ports.
Maybe you'll need to add a function call to set an interrupt vector (particularly with an RTOS, and tasks!), and there's nothing a simple macro can do for you.
"'WORD' is evil..."

I agree!

"...because every system has a different natural word size."

Absolutely. Thus "WORD" is, by definition, entirely target-dependent!

"...a lot of people think 'WORD' must be 16 bits"

and a lot of other people think that, since int is 16 bits, "WORD" must be 32 bits...!!! All the more reason to avoid "WORD"!!!

"Keil's bit data type is somewhat restricted."

Actually, although it looks like a data type, bit is really more like a memory-space qualifier - like idata, xdata, etc. That's why, for example, you can't have an odd bit in a structure. :-(

"Portable declaration of interrupt handlers is difficult at best"

Absolutely - since the properties of interrupts are highly target-dependent. My goal in encapsulating these types of extension is so that the code can just be made to compile with another compiler - not to have it work on a different target!

As I've mentioned before, it is often well worth the effort of making your 8051 embedded code compilable on something like MSVC or Borland, since these "big" tools generally give far better diagnostics than "small" compilers like Keil. I also find that the browsing facilities of MSVC are far superior to uVision's. etc, etc,...
"although it looks like a data type, bit is really more like a memory-space qualifier"

Hm. Maybe that's the way I should think of it instead. Of course, you then have to wonder what the type of "bit b;" is supposed to be. ("int", as in "static s;" or "f(void) { return 0; }"?)

I suppose if I'm going to be consistent, I should insist on calling it a U1. (Not to be confused with an S1; yet another reason to avoid standard bitfields.)

U1 bit myBool;

Of course, then I'm going to want

U1 xdata myBool;

for consistency, and

struct { U1 EA; U1 ES1; U1 ET2; ... } IE;

which gets us back to bitfields. And I'm not enough of a preprocessor wizard to try and make those two bits of syntax come out to something appropriate :)

U1 array[8];

would seem to require self-modifying code on the 8051, which is a bit difficult without von Neumann memory. Pesky customers with their non-stop demands!

Actually, I'd rather have just

typedef unsigned long long U64;

Yet another bit of the C99 standard to be adopted.
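Since the thread keeps circling back to C99, here is a sketch of what the whole sizing header boils down to once <stdint.h> is available: the standard exact-width names do the per-target work, and the thread's short names (if you want to keep them) become one-line aliases. The U8-style aliases are just this thread's convention layered on top, not anything standard.

```c
#include <stdint.h>

/* C99 already defines exact-width integer types; mapping the
 * thread's short names onto them keeps old code compiling while
 * the standard header handles every target's native widths. */
typedef uint8_t  U8;
typedef int8_t   S8;
typedef uint16_t U16;
typedef int16_t  S16;
typedef uint32_t U32;
typedef int32_t  S32;
typedef uint64_t U64;   /* the "unsigned long long" the poster wanted */
typedef int64_t  S64;
```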