Only minimum sizes are defined - CPUs that can handle larger sizes with no overhead may use compilers that choose widths larger than the minimum; e.g. with some compilers char, short int and int could all be 16 bits, or even 32 bits.
Plus different targets have different "endianness", storing the bytes of a word low-to-high or high-to-low in memory.
You can use sizeof() to find how big a type is within a program, and a simple test to determine endianness, if that matters.
As Pomie says, using the fixed-width types from <stdint.h> puts guaranteed range restrictions on numeric types - but there is no absolute guarantee that the memory or register actually used isn't larger than that on some systems.
Thanks to both of you for confirming. So the minimum size of each data type is defined by the C standard, and the actual size depends on the compiler and target.
The question below confuses me more:
Do short/long modify the size of the data type? What do short and long mean in the C standard?
From the off, an int was a whole number regardless of size. When the target was 8-bit we had char and int both 8 bits wide: a character only needed 7 bits (0~127), while the int covered 0~255. But an 8-bit int was next to useless, so int became 16 bits, using two successive memory locations.
char became 8 bits, signed or unsigned.
int was then a short or a long (short was the default): 16 bits or 32 bits.
So if you do not specify and just use int, 16 bits is assumed; if you use long, then 32 bits are used.
Why!!! I hear people cry... Well, on an 8-bit micro the cost of the maths roughly doubles with each step up in size: unsigned char + unsigned char is quick,
long + long is a lot slower...
If the target is a 32-bit MPU, then using a long can be quicker than using an 8-bit integer... so, as pommie said, include stdint.h and use the size that is needed in that instance.
As others have said, it depends on the compiler and the processor. I struggled for a while once, until I realised the processor I had moved to used 32-bit integers while the one I had moved from used 16-bit integers. I only got caught the once, though.