I'm using STLport (shipped with C++Builder6).
I agree that lexical_cast is great because of its universality, but my goal is to have correct conversions:
from string to unsigned char, signed char, int, signed int, long and signed long,
and from all these types back to string.
In my view, lexical_cast should either be changed, or should clearly reject/expose the cases where it doesn't work.
I'll try a solution where I convert the computed Destination value back to the Source type.
If the two values match, then the conversion is OK.
Maybe it's a penalty for other cases, but it ensures that "all" conversions will work.
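The round-trip idea above can be sketched as follows. This is only a minimal illustration: a plain std::stringstream stands in for lexical_cast, and checked_cast is a hypothetical helper name (it also assumes the string form of the value round-trips exactly, e.g. no leading "+" or whitespace):

```cpp
#include <cassert>
#include <sstream>
#include <stdexcept>
#include <string>

// Hypothetical helper illustrating the round-trip check: convert, then
// convert the result back to the Source type and compare with the original.
template <typename Target, typename Source>
Target checked_cast(const Source& src)
{
    std::stringstream ss;
    Target result;
    if (!(ss << src) || !(ss >> result))
        throw std::runtime_error("conversion failed");

    // Round trip: convert the result back to Source and compare.
    std::stringstream back;
    Source check;
    if (!(back << result) || !(back >> check) || check != src)
        throw std::runtime_error("conversion out of range");

    return result;
}
```

With this, checked_cast<short>(std::string("60000")) throws instead of silently producing -5536, at the cost of a second conversion for every call.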
Best regards.
Stephane Bronsart.
"Terje Slettebø" <tslettebo@chello.no> wrote in message news:269b01c22159$5b5db8f0$60fb5dd5@pc...
From: Bjorn.Karlsson@readsoft.com

> From: Stéphane Bronsart [mailto:stephane.bronsart@bea.be]
> But if I do a call like this:
> signed short int i = lexical_cast<signed short int>("60000")
> no exception is thrown!
> i is assigned a value of -5536

The value of i is actually undefined, but in this case, the value wraps (and does so on most implementations). lexical_cast does not perform range-checking on numeric types.
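The wrap described above can be shown without lexical_cast at all; narrowing the out-of-range value is where it happens (the exact result is implementation-defined before C++20, but two's-complement wrapping is what common platforms do):

```cpp
#include <cassert>

// Demonstrates the wrap: 60000 does not fit in a 16-bit signed short,
// and on typical (two's-complement) implementations it wraps around to
// 60000 - 65536 == -5536 -- the same value the quoted call produced.
short wrap_demo()
{
    short i = static_cast<short>(60000);
    return i;  // -5536 on typical platforms
}
```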

> Is it possible to catch such errors?  How?
> If I have to scan the string before calling lexical_cast, it's not so
> useful...

Everything's possible... If you're not concerned about the efficiency penalty, consider lexical_casting the returned value back to a string, and comparing that to the original value. If there is some sort of guarantee on the input, so that you can use a numeric type that can hold all possible input values, lexical_cast to that type, followed by a numeric_cast to the actual destination type. numeric_cast checks for positive and negative overflow, and does what you're looking for (but only for numeric types). Wrap this up in a cast function of your own, and that should do the trick.
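As an illustration of the suggested pattern (assuming every input fits in a long, and with a hand-rolled limits check standing in for boost::numeric_cast; string_to is a hypothetical wrapper name):

```cpp
#include <cassert>
#include <limits>
#include <sstream>
#include <stdexcept>
#include <string>

// Sketch of the suggested pattern for signed integral targets: convert
// into a type wide enough for any expected input (long here), then
// range-check into the actual destination type.
template <typename Target>
Target string_to(const std::string& s)
{
    std::stringstream ss(s);
    long wide;  // assumed wide enough to hold all possible inputs
    if (!(ss >> wide))
        throw std::runtime_error("bad lexical cast");

    // What numeric_cast would do: detect positive and negative overflow.
    if (wide < std::numeric_limits<Target>::min() ||
        wide > std::numeric_limits<Target>::max())
        throw std::range_error("bad numeric cast");

    return static_cast<Target>(wide);
}
```

So string_to<short>("60000") throws a range error instead of wrapping, while string_to<short>("123") returns 123 as expected.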

I think this is a good solution. Having a mandatory range check could make it less generic (how do you range-check UDTs?), and may introduce overhead that isn't always needed. It's not easy to offer this as an option, either, because the interface shouldn't change: it should retain the interface of a cast.

For this reason, stream configuration isn't possible, either.

> Is the use of stringstream (see implementation of
> lexical_cast) not too
> heavy?
> What does Boost say about that?

Nah, personally I don't think so. While it can be argued that there is potential for optimizations by using lower level constructs for known conversions, that defeats the generality of the solution.

I also replied to the same question on the Boost-Users list a while ago. The problem with using low-level conversion functions for known conversions is that they don't follow the stream conventions. For example, they don't respect the locale.

In addition, you get a patchwork of special cases of questionable value. Granted, a possible new version of lexical_cast, in the works, now, uses various special cases to solve the problems with characters or strings containing whitespace, or empty strings. It also supports wide characters.

However, these cases don't use C functions or anything like that, and their main aim is to provide consistent functionality. The efficiency of skipping the stringstream, for special cases, is more of an added bonus. It happens for known conversions where the source and target are character or string types, or the same type.

The concept of streaming makes lexical_cast usable with UDTs, often without any extra coding. IMO, that's a real plus. Finally, the elegance of the solution is just too beautiful to change. :-)

There are a few changes in the lexical_cast version you've just received, but it still uses stringstream. :) In the cases where it doesn't, it uses nothing at all, except things like constructors, or subscript.