From: Simon Buchan (simon_at_[hidden])
Date: 2005-09-29 17:23:11
Martin Bonner wrote:
> ----Original Message----
> From: Simon Buchan [mailto:simon_at_[hidden]]
> Sent: 29 September 2005 10:21
> To: boost_at_[hidden]
> Subject: Re: [boost] [integer] Create (u)int_natural_t
>
>
>>Daryle Walker wrote:
>>
>>>The "int" type was supposed to match the processor's natural built-in
>>>integer size. That was easy to maintain in the 16- and 32-bit
>>>eras, but got screwed up when we started 64-bit computing. The C
>>>and C++ communities decided to expand their integer types by keeping
>>>the current types at their 32-bit-era sizes and extended the type
>>>system with a "long long" instead of moving "int" and "long" up and
>>>adding a "short short". Now we don't have a convenient way to name
>>>the best integer type in a portable fashion. I suggest we add a
>>>"int_natural_t" typedef to <boost/cstdint.hpp> to name the best
>>>integer type (and a corresponding "uint_natural_t"). We would have
>>>to research what that type is for each compiler and/or platform
>>>combination and use #conditionals.
>
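To make the proposal concrete, a minimal sketch of such a header might
look like the following. The platform tests and type choices are guesses
for illustration only (_WIN64 and __LP64__ are real predefined macros,
but this is not actual Boost code, and real code would need many more
cases):

    // Hypothetical sketch of the proposed addition to <boost/cstdint.hpp>.
    // The platform conditionals here are illustrative, not exhaustive.
    #ifndef BOOST_INT_NATURAL_HPP
    #define BOOST_INT_NATURAL_HPP

    namespace boost {

    #if defined(_WIN64)
        // 64-bit Windows is LLP64: long is still 32 bits.
        typedef long long          int_natural_t;
        typedef unsigned long long uint_natural_t;
    #elif defined(__LP64__)
        // Most 64-bit Unix-like platforms are LP64: long is 64 bits.
        typedef long          int_natural_t;
        typedef unsigned long uint_natural_t;
    #else
        // 16- and 32-bit platforms: plain int is the natural choice.
        typedef int          int_natural_t;
        typedef unsigned int uint_natural_t;
    #endif

    } // namespace boost

    #endif // BOOST_INT_NATURAL_HPP
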
> I don't think it's worth it. The performance difference between a 16-bit
> integer and 32-bit integer on a PDP-11 was pretty significant. Is there any
> noticeable difference between a 32-bit and 64-bit integer on any of the
> 64-bit processors?
>
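(For what it's worth, that question is easy to measure directly. A crude
sketch, using the modern <chrono> clock for timing; the loop size is
arbitrary and the numbers will vary by compiler and CPU:)

    // Crude micro-benchmark: sum a large array as 32-bit vs 64-bit
    // integers. Treat the output as a hint, not a verdict.
    #include <chrono>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    template <typename Int>
    Int sum(const std::vector<Int>& v) {
        Int total = 0;
        for (Int x : v)
            total += x;
        return total;
    }

    template <typename Int>
    void time_sum(const char* label) {
        std::vector<Int> v(10000000, Int(1));
        auto start = std::chrono::steady_clock::now();
        volatile Int result = sum(v);   // volatile: keep the work alive
        auto stop = std::chrono::steady_clock::now();
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(stop - start);
        std::printf("%s: %lld ms (sum = %lld)\n", label,
                    (long long)ms.count(), (long long)result);
    }

    int main() {
        time_sum<std::int32_t>("32-bit int");
        time_sum<std::int64_t>("64-bit int");
    }
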
>>Isn't int the 'natural' integer type, by definition?
>
> Depends what you mean by "natural", but for a normal English meaning,
> probably not.
>
I thought we were talking about the type that's easiest for a computer
to push around?
>
>>I never understood why long didn't become the 64-bit type. It seems
>>pointless to have int and long the same size on 32-bit platforms.
>
>
> Oh god no! The thread that would not die! This has been argued to death on
> comp.std.c.
>
Heh. Sorry :)
(But really, what's different?)
> The reason long didn't become the 64-bit type is that most commercial C
> compiler vendors wanted to continue to support code their customers had
> written that assumed long was exactly four eight-bit bytes.
>
> You can argue as long as you like that the problem is in that code (and I
> would agree with you). It won't change the fact that telling your customers
> they are wrong is not usually the route to commercial success.
>
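The sort of customer code Martin means looks something like this (a
made-up but representative fragment):

    // Made-up example of code that silently assumed sizeof(long) == 4.
    #include <cstdio>

    struct Record {
        long id;         // author assumed exactly 4 bytes...
        char name[28];   // ...so that the whole record is 32 bytes
    };

    void save(const Record& r, std::FILE* f) {
        // Fine on ILP32. On an LP64 platform sizeof(Record) is 40,
        // so this silently changes the on-disk file format.
        std::fwrite(&r, sizeof r, 1, f);
    }
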
Oh, OK. But what about non-commercial compilers? And why did int get
longer in the move to 32-bit, then? (You don't have to answer that.)
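(Which types grow is exactly what the "data model" names describe:
ILP32 means int, long, and pointers are all 32 bits; LP64 means long and
pointers are 64 bits; LLP64 means only long long and pointers are. A few
lines show which model a given compiler uses:)

    // Prints the sizes that identify the platform's data model.
    #include <cstdio>

    int main() {
        std::printf("short:     %zu\n", sizeof(short));
        std::printf("int:       %zu\n", sizeof(int));
        std::printf("long:      %zu\n", sizeof(long));
        std::printf("long long: %zu\n", sizeof(long long));
        std::printf("void*:     %zu\n", sizeof(void*));
        // e.g. 2/4/4/8/4 = ILP32, 2/4/8/8/8 = LP64, 2/4/4/8/8 = LLP64
    }
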
>
>>Who still writes 16-bit code on desktop, anyway? :D
>
> Nobody ... but who still writes for the desktop? Think about the number of
> car radios (as just one example) compared to the number of PCs.
Yes, but there aren't many 64-bit "car radios", and thus not much need to
have four different int sizes. And I don't believe in the demise of the
desktop, just the demise of the desk, but that's a different argument.