[TECHSUCKS] WTF? CHAR SIGNEDNESS NOT DEFINED
2013 March 17
I never understood why people use the type char
to pass integers in C. I always thought it was a bad idea without ever being able to say WHY it was. Now I do:
#include <stdio.h>

void foo(char x){
    printf("%i\n", x);
}

int main(){
    foo(-1);
    return 0;
}
This code is simple enough, right? It demonstrates the way that from my experience MANY people pass single-byte integers around. And now here's why this is FSCKING STUPID:
[09:40 AM][blindcoder@x86-host:~]$ gcc -o test test.c
[09:40 AM][blindcoder@x86-host:~]$ ./test
-1
Works as expected? Sure, on x86. Now, let's try the same on ARM, like the Raspberry Pi:
[09:44 AM][pi@raspberry:~]$ gcc -o test test.c
[09:44 AM][pi@raspberry:~]$ ./test
255
UNSIGNED! The signedness of plain char is implementation-defined in C: gcc on x86 defaults to signed char, gcc on ARM defaults to unsigned char. So the -1 gets converted to 255 before printf ever sees it. Simple, basic, first class portability trap!
Seriously, people: if you want to pass integers, USE integers.
EOF
Category: blog