
Benjamin Schieder

[TECHSUCKS] WTF? CHAR SIGNEDNESS NOT DEFINED

2013 March 17 | 2 comments

WTF?
I never understood why people use the type char to pass integers in C. I always thought it was a bad idea without ever being able to say WHY it was. Now I do:

#include <stdio.h>

void foo(char x){
    printf("%i\n", x);
}

int main(){
    foo(-1);
    return 0;
}


This code is simple enough, right? It demonstrates how, in my experience, MANY people pass single-byte integers around. And now here's why this is FSCKING STUPID:
[09:40 AM][blindcoder@x86-host:~]$ gcc -o test test.c
[09:40 AM][blindcoder@x86-host:~]$ ./test
-1

Works as expected? Sure, on x86. Now, let's try the same on ARM, like the Raspberry Pi:
[09:44 AM][pi@raspberry:~]$ gcc -o test test.c
[09:44 AM][pi@raspberry:~]$ ./test
255

UNSIGNED! On ARM, plain char is unsigned by default, so -1 silently turns into 255. Simple, basic, first class idiotic implementation-defined signedness!
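
If you want to see which signedness your compiler picked for plain char, a minimal sketch is to look at CHAR_MIN from <limits.h> (it is 0 when char is unsigned, negative when signed); GCC also has -fsigned-char and -funsigned-char to force one or the other:

#include <limits.h>
#include <stdio.h>

int main(){
    /* CHAR_MIN is 0 if plain char is unsigned, negative if signed */
    printf("CHAR_MIN = %d, CHAR_MAX = %d\n", CHAR_MIN, CHAR_MAX);
    printf("plain char is %ssigned here\n", CHAR_MIN < 0 ? "" : "un");
    return 0;
}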

Seriously, people: if you want to pass integers, USE integers.
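
For completeness, here's roughly what a portable version looks like, sketched with the fixed-width types from <stdint.h> (int8_t is technically optional in the standard, but present on any platform you're likely to care about):

#include <stdint.h>
#include <stdio.h>

/* int8_t is always signed, so -1 stays -1 on every platform */
void foo(int8_t x){
    printf("%i\n", x);
}

int main(){
    foo(-1);
    return 0;
}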


EOF

Category: blog

Tags: TechSucks WTF STUPID


2 Comments

From: Christof Buergi
2013-03-19 14:01:56 +0100

(snicker)
Actually, according to C99, char is an integer. However, it is the only integer that may be either signed or unsigned by default. Also, char isn't 8bit on all platforms.
That said, int, short, long or long long shouldn't be used either, if you want your code to be portable.

From: mirabilos
2013-03-26 14:32:44 +0100

No, a char is not an int, just an integer whose sizeof() is 1 but it may have arbitrary signedness indeed.
If passing around integers, just always use uint8_t, int8_t, etc.
