I think the best solution would be, as you suggested, to add the '$' character to the valid basic source character set so that identifiers containing a '$' are allowed. I therefore tested how different major compilers, across several versions, handle this. The following compilers and versions accept a '$' character within identifiers (including macro names):

gcc 2.95, 3.1, 3.2, 3.3.3, 3.4.3 and 4.0.2
icc (intel c compiler) 8.0, 8.1

This seems to corroborate that there is real precedent for allowing '$' characters within identifiers in major compiler preprocessors.
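
For reference, the probe I fed each of these preprocessors is essentially the snippet from my original mail (quoted below), run through e.g. 'gcc -E test.C' or 'cpp test.C':

    /* test.C -- does the preprocessor accept '$' in a macro name? */
    #define $jack int test;
    $jack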

I have not been able to reproduce the error you got with GCC or with any other compiler. Which compiler and version did you use when you got your error message?

Thanks
Andreas

 
On 11/18/05, Hartmut Kaiser <hartmut.kaiser@gmail.com> wrote:

Andreas Sæbjørnsen wrote:

> When preprocessing the code
> #define $jack int test;
> $jack
> using any preprocessor built on the wave library, in this
> case the samples/lexed_tokens/lexed_tokens preprocessor, gives
> the following error:
>
> PP_DEFINE        (#369) at test.C (  1/ 1): >#define<
> SPACE            (#393) at test.C (  1/ 8): > <
> lexed_tokens:
> /home/saebjornsen1/projects/boost/boost/wave/token_ids.hpp:497
> : boost::wave::util::flex_string<char,
> std::char_traits<char>, std::allocator<char>,
> boost::wave::util::CowString<boost::wave::util::AllocatorStrin
> gStorage<char, std::allocator<char> >, char*> >
> boost::wave::get_token_name(boost::wave::token_id): Assertion
> `id < T_LAST_TOKEN-T_FIRST_TOKEN' failed.
> Aborted
>
> using the preprocessor cpp (from the gcc team) I get (running cpp test.C):
> # 1 "test.C"
> # 1 "<built-in>"
> # 1 "<command line>"
> # 1 "test.C"
>
> int test;
>
> I believe the problem is that the lexer does not recognize
> the '$' as a valid character within the name of a macro
> definition.

According to the Standard the '$' character is _not_ part of the basic
source character set (see 2.2.1 [lex.charset]). For this reason it won't get
recognized as part of an identifier.
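
For illustration (my own example, not taken from the Standard's text):

    int foo_bar;   // ok: '_' is in the basic source character set
    int foo$bar;   // not standard C++: '$' is not in [lex.charset];
                   // accepted only as a vendor extension (gcc, icc, ...)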

I fixed the lexed_tokens sample (actually the Wave library) so that you won't
get an assertion anymore but meaningful output instead. Now the output is:

PP_DEFINE        (#369) at test.cpp (  1/ 1): >#define<
SPACE            (#393) at test.cpp (  1/ 8): > <
<UnknownToken>   (#36 ) at test.cpp (  1/ 9): >$<
IDENTIFIER       (#381) at test.cpp (  1/10): >jack<
SPACE            (#393) at test.cpp (  1/14): > <
INT              (#335) at test.cpp (  1/15): >int<
SPACE            (#393) at test.cpp (  1/18): > <
IDENTIFIER       (#381) at test.cpp (  1/19): >test<
SEMICOLON        (#297) at test.cpp (  1/23): >;<
NEWLINE          (#395) at test.cpp (  1/24): >\n<
NEWLINE          (#395) at test.cpp (  2/ 1): >\n<
<UnknownToken>   (#36 ) at test.cpp (  3/ 1): >$<
IDENTIFIER       (#381) at test.cpp (  3/ 2): >jack<
NEWLINE          (#395) at test.cpp (  3/ 6): >\n<

BTW: when using the wave driver for the code given above you get:

    test.cpp(1): error: ill formed preprocessor directive: #define

which is expected, since the '$' token following #define is not recognized as
an identifier (i.e. as a macro name).

What certainly could be done in addition is to add the '$' character to the
valid basic source character set to allow identifiers containing a '$', but
this weakens the Standards conformance of Wave. Any suggestions?
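
If we did add it, one way to avoid weakening conformance would be to make the
extension switchable and off by default. A rough sketch of the idea
(hypothetical helper, not the actual Wave lexer code):

    #include <cctype>
    #include <cstddef>
    #include <string>

    // Hypothetical sketch, not Wave's lexer: checks whether 'name' forms a
    // valid identifier, optionally accepting '$' as an extension character.
    bool is_identifier(std::string const& name, bool allow_dollar)
    {
        if (name.empty())
            return false;
        for (std::size_t i = 0; i < name.size(); ++i) {
            unsigned char c = name[i];
            bool ok = c == '_' || std::isalpha(c) ||
                      (i > 0 && std::isdigit(c)) ||
                      (allow_dollar && c == '$');
            if (!ok)
                return false;
        }
        return true;
    }

With allow_dollar left as false the strict, conforming behaviour stays the
default, and a tool like the lexed_tokens sample could enable the extension
explicitly.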

HTH
Regards Hartmut


