
From: dizzy (dizzy_at_[hidden])
Date: 2008-08-19 04:27:39


On Monday 18 August 2008 19:15:50 Andrea Denzler wrote:
> Sure, we know that C/C++ fortunately allows all of that. But if such
> tasks, like cross-platform serialization, are so common, why not introduce
> them as a standard, as was done with the STL?

Because there are just too many decisions to make about how to serialize (the
external portable representation). Those decisions depend on the user's use
case, so the user should make them, not the standard library (there are
inherent tradeoffs involved, and it's up to the user to weigh them).

What I would like from C++, though, is something that helps with serialization
no matter how you do it. The first thing that comes to mind is compile-time
introspection of data members. That alone would make serialization frameworks
much easier to write and less error prone to use. Another thing that might be
nice is a skeleton serialization framework that is extremely extensible (at
compile time); it could ship with a default implementation for the lazy, but it
should give those with special needs complete control over the external
representation of the values.
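
As an illustration, here is a minimal sketch (assuming a hypothetical
visit_members hook, not any existing framework) of the per-member boilerplate
that every type must hand-write today and that compile-time introspection
could generate automatically:

#include <iostream>

struct Point {
    int x;
    int y;

    // Hypothetical hook a serialization framework could call; with
    // compile-time introspection the compiler could generate this.
    template <class Visitor>
    void visit_members(const Visitor& v) {
        v("x", x);
        v("y", y);
    }
};

// One possible "archive": just print name/value pairs.
struct PrintVisitor {
    template <class T>
    void operator()(const char* name, const T& value) const {
        std::cout << name << " = " << value << '\n';
    }
};

int main() {
    Point p = {1, 2};
    p.visit_members(PrintVisitor());
}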

But our discussions are in vain. What we say here is pointless; all that
matters is which papers are submitted to the committee (and whether any on
this subject get approved and make it into the standard that follows C++0x).

> What I mean is that there should be a cross-platform solution for defining
> all characters. Unicode DOES THIS; wchar_t does not. So wchar_t is useless
> for real cross-platform i18n applications. On Windows it even breaks the
> definition of wchar_t, which requires that one wide character represent
> every possible character for that system; that's not true on Windows,
> because for rare combinations you need two wchar_t.

Yes, wchar_t is useless for portable character encoding, just as "int" is
useless for portably communicating integer values. They are, however, useful
for their intended purpose. If what you say about Windows is true, then it's
not conforming (the letter of the standard says "Type wchar_t is a distinct
type whose values can represent distinct codes for all members of the largest
extended character set specified among the supported locales (22.1.1).").
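
A small sketch of the Windows point: with a 16-bit wchar_t, a code point
outside the Basic Multilingual Plane (here U+1D11E, MUSICAL SYMBOL G CLEF)
cannot fit in one unit and has to be split into a UTF-16 surrogate pair:

#include <cstdio>

int main() {
    unsigned long cp = 0x1D11E;                  // outside the BMP
    unsigned long v  = cp - 0x10000;
    unsigned int high = 0xD800 + (unsigned int)(v >> 10);   // high surrogate
    unsigned int low  = 0xDC00 + (unsigned int)(v & 0x3FF); // low surrogate
    std::printf("U+%lX -> 0x%X 0x%X\n", cp, high, low);     // 0xD834 0xDD1E
}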

> Yes, fixed-width integral types are enough.

You do know, however, that they are not mandated? If a conforming
implementation does not have a native type without padding for, say, int32_t,
then that implementation will simply not offer "int32_t" (so your program will
not compile, which may or may not be what you wanted; I'm just saying these
types don't cover everything people imagine about them).
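
For example (a sketch, using only the <cstdint>/<stdint.h> facilities): the
exact-width typedefs are optional and advertised through their limit macros,
while the least/fast variants must always be provided:

#include <cstdint>   // or <stdint.h> / boost/cstdint.hpp on older compilers
#include <cstdio>

int main() {
#if defined(INT32_MAX)
    // Present only if the implementation has a padding-free 32-bit type.
    std::printf("int32_t exists, sizeof = %u\n",
                (unsigned)sizeof(std::int32_t));
#else
    std::printf("no exact-width 32-bit type on this implementation\n");
#endif
    // These are required on every conforming implementation.
    std::printf("int_least32_t: %u bytes, int_fast32_t: %u bytes\n",
                (unsigned)sizeof(std::int_least32_t),
                (unsigned)sizeof(std::int_fast32_t));
}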

> A Unicode code point is just a number. Sure, a standard function for
> decoding/encoding between the current locale and Unicode characters is
> welcome. Even better if I can write Unicode text directly to an i/o stream.

You mean write UTF-8, UTF-16, or another Unicode encoding?
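
In practice, "writing Unicode text to a stream" usually means picking one of
those encodings and producing the bytes yourself. A sketch of encoding a
single code point (just a number) as UTF-8 before handing it to an ordinary
byte stream:

#include <iostream>
#include <string>

// Encode one Unicode code point as UTF-8 (sketch only: no validation of
// surrogate ranges or of values above U+10FFFF).
std::string to_utf8(unsigned long cp) {
    std::string out;
    if (cp < 0x80) {
        out += (char)cp;
    } else if (cp < 0x800) {
        out += (char)(0xC0 | (cp >> 6));
        out += (char)(0x80 | (cp & 0x3F));
    } else if (cp < 0x10000) {
        out += (char)(0xE0 | (cp >> 12));
        out += (char)(0x80 | ((cp >> 6) & 0x3F));
        out += (char)(0x80 | (cp & 0x3F));
    } else {
        out += (char)(0xF0 | (cp >> 18));
        out += (char)(0x80 | ((cp >> 12) & 0x3F));
        out += (char)(0x80 | ((cp >> 6) & 0x3F));
        out += (char)(0x80 | (cp & 0x3F));
    }
    return out;
}

int main() {
    std::cout << to_utf8(0x20AC) << '\n';   // U+20AC EURO SIGN -> 3 bytes
}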

> I want my code to be portable. I get an overflow because on common
> platforms the unsigned type uses all of its bits when mixing signed and
> unsigned. Are you telling me to compile only on Unisys MCP to avoid the
> overflow?

I didn't say anything of the sort; I'm just saying that integral types may
have padding.
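
One way to see this from within a program (again, just a sketch): compare the
number of value bits an unsigned type reports with the number of bits of
storage it occupies; any difference is padding.

#include <climits>
#include <limits>
#include <iostream>

int main() {
    int value_bits   = std::numeric_limits<unsigned int>::digits;
    int storage_bits = (int)sizeof(unsigned int) * CHAR_BIT;
    std::cout << "value bits: "    << value_bits
              << ", storage bits: " << storage_bits
              << ", padding bits: " << (storage_bits - value_bits) << '\n';
}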

> The simple fact that -1 < 2U returns false without giving me any warning is
> an issue. C++ is IMHO the best language in the world, I love it, and I have
> loved C since the first day, a long time ago. But there are issues... there
> is plenty of room to improve it. Do you really think that a language over 30
> years old has no issues today???
>
> The best path is to extend and improve it... so that existing code benefits
> from new features.

Yes, there are problems, but not so big that the whole integer feature of C++
is flawed.
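
For anyone following along, the -1 < 2U case from the quote is easy to
reproduce: the usual arithmetic conversions turn -1 into a huge unsigned
value, so the comparison comes out false.

#include <iostream>

int main() {
    std::cout << std::boolalpha
              << (-1 < 2U) << '\n'          // false: -1 converts to UINT_MAX
              << (-1 < (int)2U) << '\n';    // true once both sides are signed
}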

-- 
Dizzy
			"Linux is obsolete" -- AST
