unsigned int a = 1;
int b = -2;
if(b < a)
NSLog(@"Less than!");
else
NSLog(@"Greater than!");
Output:
Greater than!
Why is b cast to an unsigned int, and not a to a 64-bit signed int, or even a 40-bit signed int?
That's really interesting. I guess this sort of case would carry too much of a performance penalty, considering there appears to be no CPU compare instruction for this kind of mixed signed/unsigned comparison in the x86 instruction set. Upon further inspection, the way to get this case to work is to explicitly convert the uint to a long int during the compare. I wonder why C/C++ compilers don't do this implicitly? Is it too much of a performance penalty, or would implementing it at this point in the life of the language break too many things?
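A minimal sketch of that explicit-conversion workaround, in plain C with printf rather than NSLog (the variable names and the choice of long long are just for illustration): casting the unsigned operand to a signed type wider than unsigned int makes both operands convert to that wider signed type, so -2 stays negative.

#include <stdio.h>

int main(void) {
    unsigned int a = 1;
    int b = -2;

    /* long long is wider than unsigned int on common platforms, so both
       operands convert to signed long long and -2 stays -2 */
    if (b < (long long)a)
        printf("Less than!\n");
    else
        printf("Greater than!\n");
    return 0;
}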
Huh, that is really interesting. But I could see the reasoning from a nomenclature sense: an unsigned int is by nature restricted to non-negative numbers, so in this case it would probably be better to convert it to a long integer in order to deal with signed values.
But that still brings up a problem, doesn't it? Since signed integers have half the range an unsigned int does, the moment a uint exceeds that half-range value, how do you go about comparing it? Maybe bitwise operators could be used, but... yeah, this is a tough one.
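For the usual 32-bit int, one way around that half-range worry is a 64-bit signed type: int64_t can represent every value of both int and unsigned int, so converting both sides to it keeps the comparison exact even above the half-range point. A rough sketch under that assumption:

#include <stdio.h>
#include <stdint.h>
#include <limits.h>

int main(void) {
    unsigned int big = UINT_MAX;   /* well above the signed half-range */
    int b = -2;

    /* int64_t holds every int and every unsigned int value exactly,
       so the comparison uses the real mathematical values */
    if ((int64_t)b < (int64_t)big)
        printf("%d < %u\n", b, big);
    else
        printf("%d >= %u\n", b, big);
    return 0;
}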
Honestly, I think the right thing here would be for the compiler to give a type error, since they're different types. If you want to compare them, cast them so they're the same type. I know people feel that would be annoying, but personally I think it's less annoying than having to remember the arbitrary behavior the compiler implements.
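For what it's worth, mainstream C compilers can already be asked to flag this, though only as a warning rather than an error: gcc and clang both have -Wsign-compare (also pulled in by -Wextra). A small sketch of the kind of code it flags, with hypothetical names:

/* cmp.c -- compile with: cc -Wsign-compare -c cmp.c */
unsigned int limit = 10;

int below_limit(int n) {
    return n < limit;   /* signed/unsigned comparison: gcc and clang warn here */
}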
I don't know about Obj-C, but the C rules are that types get promoted when two different types are compared. I don't have a reference right now, but one of the rules (the "usual arithmetic conversions") says that when a signed int meets an unsigned int, the signed int gets converted to unsigned.
Following the output provided by bbguimaraes, it does make sense when you remember how two's complement works (more importantly, how it maps a signed number to an unsigned one).
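A small sketch of that mapping, assuming the usual 32-bit int: converting -2 to unsigned adds 2^32, so the comparison the compiler actually performs in the original snippet is 4294967294 < 1.

#include <stdio.h>

int main(void) {
    unsigned int a = 1;
    int b = -2;

    /* the usual arithmetic conversions turn b into an unsigned int:
       -2 wraps to 4294967294 (UINT_MAX - 1) with a 32-bit unsigned int */
    printf("(unsigned int)b = %u\n", (unsigned int)b);
    printf("b < a  is really  %u < %u  ->  %d\n", (unsigned int)b, a, b < a);
    return 0;
}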