I understand that in many cases it's important to define how large your numeric variable is (and I always think about unsigned vs. signed), but do compilers ever optimize or make tailored size decisions when you don't specify char, short, long, or long long (so int != long long int by default?). This would probably be hard to implement in a compiler, but is it? I think some kind of metaprogramming mechanism that provides virtual integer sizes (not templates) is essential. I'm also annoyed by the very existence of char. Please call it octet or something, because we might handle textual strings with 2-byte characters (or use 1-byte numeric values).
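For what it's worth, the standard already gets partway toward this: since C++11, <cstdint> gives you exact-width types like std::uint8_t (effectively the octet I'm asking for) and the int_fast/int_least families, where you state a constraint and the compiler picks a concrete width. A quick sketch of what I mean:

[code]
#include <cstdint>
#include <cstdio>

int main()
{
    std::uint8_t       octet = 0xFF;   // exactly 8 bits: the "octet" char should have been
    std::int_least16_t small = 1000;   // at least 16 bits, smallest width the target offers
    std::int_fast32_t  big   = 100000; // at least 32 bits, fastest width the target offers

    std::printf("%zu %zu %zu\n", sizeof octet, sizeof small, sizeof big);
    return 0;
}
[/code]

It's still me choosing the constraint up front rather than the compiler inferring it from usage, but at least the actual width decision is delegated.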
I think C++11 has some feature like this (a virtual integer size), right? Isn't that what "auto" is?
But auto doesn't do any size inference; it just copies the type of the initializer. What would be useful is a compiler that could look back at how a virtual integer is used across the final program and then deduce the optimal size. I doubt any sane extension of C++ could be powerful enough to generalize that far, though.
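To make that concrete, here's a small sketch showing that auto freezes the type at the point of initialization and never looks at later usage:

[code]
#include <type_traits>

int main()
{
    auto x = 5;    // deduced as int, purely from the literal's type
    auto y = 5LL;  // long long, again only because the initializer says so

    static_assert(std::is_same<decltype(x), int>::value,
                  "auto looked at the initializer, not at how x is used later");

    // The deduction is never revisited: if x later holds values that
    // overflow int, that's my bug, not a hint for the compiler to widen x.
    x += static_cast<int>(y);
    return x;
}
[/code]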
A lot of people seem to undervalue size efficiency. And it isn't just about efficiency: there are plenty of cases where I can write a piece of functionality that is great in its individual design, but depends heavily on exactly this kind of tuning.
Dejan Jelovic demonstrates another of my frustrations:
http://www.jelovic.com/articles/using_namespaces.htm
http://www.jelovic.com/articles/java_na ... g_time.htm
... I have a lot more to rant about, but I'll leave you in peace. I have to go to bed.