
From: Hartmut Kaiser (hartmut.kaiser_at_[hidden])
Date: 2007-11-21 13:59:16


Ben,

> >> Yes, but as we said (unless you sent this before your other
> >> reply) a 4.25MB (0x110000 * 4) lookup is definitely unacceptable.
> >> I suppose for the case of Windows (a 256K lookup - 65536 * 4)
> >> this is OK, so we could consider an auto compression scheme if
> >> you exceed 65536 entries for the vector in question. Do you think
> >> even 256K is too much?
> >
> >2^16 (64k) is probably the max you should use as the cut-off.
> >Remember the cache is not only there for the data, instructions are
> >held in there as well...
> >Can you make this max size configurable?
>
> Sure, I'll need to understand how to do the compression first though.

Sure.
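
FWIW, one common way to do this kind of compression is a two-stage
table: split each code point into a block index and an offset into the
block, and share identical blocks between code points. Just to
illustrate the idea (a rough sketch only, nothing lexertl-specific, and
all the names are made up):

#include <cstddef>
#include <map>
#include <vector>

#include <boost/cstdint.hpp>

// Two-stage table compression (illustration only, not lexertl's
// actual scheme). The 0x110000 possible code points are split into
// fixed-size blocks; identical blocks are stored once and shared, so
// the flat 0x110000-entry lookup shrinks to a small per-block index
// plus the unique blocks.
class compressed_lookup
{
public:
    // 0x110000 / 4096 == 272, so the full range divides evenly.
    static const std::size_t block_shift = 12;
    static const std::size_t block_size = 1 << block_shift;

    explicit compressed_lookup(const std::vector<boost::uint32_t> &flat_)
    {
        typedef std::map<std::vector<boost::uint32_t>, boost::uint32_t>
            block_map;
        block_map seen_;

        for (std::size_t i_ = 0; i_ < flat_.size(); i_ += block_size)
        {
            std::vector<boost::uint32_t> block_(flat_.begin() + i_,
                flat_.begin() + i_ + block_size);
            block_map::iterator iter_ = seen_.find(block_);

            if (iter_ == seen_.end())
            {
                // First occurrence of this block: append it and
                // remember its id.
                const boost::uint32_t id_ = static_cast<boost::uint32_t>
                    (_blocks.size() / block_size);

                _blocks.insert(_blocks.end(), block_.begin(),
                    block_.end());
                iter_ = seen_.insert(block_map::value_type(block_,
                    id_)).first;
            }

            _index.push_back(iter_->second);
        }
    }

    boost::uint32_t operator[](std::size_t cp_) const
    {
        // Two lookups instead of one, but both tables stay cache-sized.
        return _blocks[_index[cp_ >> block_shift] * block_size +
            (cp_ & (block_size - 1))];
    }

private:
    std::vector<boost::uint32_t> _index;  // one entry per block
    std::vector<boost::uint32_t> _blocks; // unique blocks, back to back
};

With the cut-off discussed above you would keep the flat vector as long
as it stays at or below 65536 entries and switch to something like this
only beyond that, the cut-off itself being the configurable maximum
mentioned above.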

> >One last note: parser/tokeniser/str_token.hpp still includes
> >size_t.h/consts.h (it should include size_t.hpp/consts.hpp), same
> >for examples/cpp_code.hpp and examples/csharp_code.hpp.
>
> My bad: just delete the directory tree (lexer) and
> re-unzip... those files aren't used anymore.

Oh, that's too bad, because I'm using the cpp_code.hpp and serialise.hpp
headers in the lexertl-based lexer for Wave. Any chance of reviving these?

Regards, Hartmut

