William M. answered 12/14/19
PhD in Neuroscience, taught Linux and Operating Systems for 7+ years
I cannot reproduce this error on macOS.
It is true that variable names beginning with a digit are illegal in C (and C++, Java, ...), but no C compiler should be misreading the lowercase letter l as the digit 1.
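For example, a declaration whose name begins with a digit is rejected at compile time. The name 1var below is just an illustration, not anything from your code:
//------------------------------------------------------------------
// Illustration of the identifier rule only: the first declaration
// will not compile because the name starts with a digit; the second
// is fine because digits are allowed after the first character.
//------------------------------------------------------------------
int 1var = 5;   /* illegal identifier */
int var1 = 5;   /* legal identifier */
//------------------------------------------------------------------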
What happens when you use other variable names that start with a lowercase l, such as:
int lower = 10; // OR
int length = 5;
Do you see the same C compiler errors you reported? (A complete test program using those names is sketched just below.)
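As a concrete check, here is a minimal sketch you can compile, using those two names (the printf line is just mine for illustration):
//------------------------------------------------------------------
// Minimal check: identifiers that start with a lowercase l.
// If these compile cleanly but 'linux' does not, the letter l
// itself is not what your compiler is objecting to.
//------------------------------------------------------------------
#include <stdio.h>

int main(void) {
    int lower = 10;
    int length = 5;
    printf("lower has value: %d, length has value: %d\n", lower, length);
    return 0;
}
//------------------------------------------------------------------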
//-----------------------------------------------------------------------------------
// Below is the code you showed, first compiled and run, and then preprocessed with gcc -E test.c.
// Compiling and running your code produced the following output (as expected):
//------------------------------------------------------------------
#include <stdio.h>

int main(void) {
    int linux = 5;
    printf("linux variable has value: %d\n", linux);
    return 0;
}
//------------------------------------------------------------------
// which produced the following output:
//------------------------------------------------------------------
linux variable has value: 5
Program ended with exit code: 0
//------------------------------------------------------------------
// and preprocessing it with gcc -E test.c produced the following tail end of the output (again, as expected):
//------------------------------------------------------------------
extern int __vsnprintf_chk (char * restrict, size_t, int, size_t,
const char * restrict, va_list);
# 408 "/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/usr/include/stdio.h" 2 3 4
# 90 "test.c" 2
int main(void) {
int linux = 5;
printf("linux variable has value: %d\n", linux);
return 0;
}
//------------------------------------------------------------------
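One more thing worth checking on your own machine: run gcc -E test.c there as well and look for the line int linux = 5; in the output. If it still appears exactly as you wrote it (as it does above on macOS), the preprocessor is not the problem; if it has been rewritten into something else, that would explain why the compiler proper complains about an identifier it never actually saw in your source.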