It appears that using the large memory model results in an invalid pointer. For example:
#include <stdio.h>
#include <stdarg.h>

void bar(const char *h, ...);

void foo(void)
{
    bar("Strings:", "String1", "String2", "String3", NULL);
}

void bar(const char *h, ...)
{
    va_list ap;
    const char *ptr;
    va_start(ap, h);
    printf("%s\n", h);
    while ((ptr = va_arg(ap, const char *)) != NULL)
    {
        printf("  %s\n", ptr);
    }
    va_end(ap);
}
Now, looking at the disassembly for foo:
SUBA #0x00012,SP
MOVX.A #0x0db22,0x00000(SP)
MOVX.A #0x0db2c,0x00004(SP)
MOVX.A #0x0db34,0x00008(SP)
MOVX.A #0x0db3c,0x0000c(SP)
CLR.W 0x0010(SP) <----- Here's the problem
CALLA #.text:bar
ADDA #0x00012,SP
RETA
Note that a 16-bit value is pushed for the NULL, rather than a 32-bit (well, actually 20-bit) value. But inside bar(), 32-bit values are extracted, and since the upper 16 bits are whatever happens to be on the stack....
So digging around in stdio.h, I see that NULL is defined as plain 0. Now, I understand that this is how it is usually done, but that definition presumes a pointer is the same size as an int. That isn't the case in the large memory model.
So my solution, thus far, has been to put --define='NULL=(void *)0' on the command line.
Have others had this problem? Is this how they work around the issue?