From: Markus Schöpflin (markus.schoepflin_at_[hidden])
Date: 2008-08-11 07:26:41


Daryle Walker wrote:
> On Aug 4, 2008, at 3:27 AM, Markus Schöpflin wrote:
>
>> we still should come to a decision regarding this issue. Can't we just
>> make the change and check if there are any new test failures caused
>> by it?
>>
>> Markus Schöpflin wrote:
>>> Daryle Walker wrote:
>>>> What happens on systems, like mine, that already have sufficient
>>>> recursive depth? Will specifying a maximum lower that the default
>>>> actually lower the setting? If so, then this addition could be
>>>> dangerous.
>>> If your toolset supports setting the recursion depth (gcc, qcc, acc,
>>> and hp_cxx at the moment), it will be set to the value specified.
>>> So yes, it might lower the default setting. But why should this be
>>> dangerous? The recursion depth needed to compile a program is
>>> independent of the toolset, isn't it? So if for a given compiler a
>>> value lower than the default value is used, there should be no harm.
> Is the depth actually independent of the toolset?

I was going to say that the recursion depth is a function of the code and
not of the compiler, but I did some tests and I got surprising results:

GCC 4.2.3 on a 32-bit Linux system needs a recursion depth of 76 to compile
the test successfully. (The -ftemplate-depth-NN flag was introduced with
gcc 2.8.0 and defaulted to 17, the minimum suggested by the C++ standard,
but the default has been increased since then. To which value, I cannot
say; the current default is undocumented. In Boost the flag has been
hard-coded to 128 for quite a few years now.)

HP CXX 7.1 on a 64-bit Tru64 system needs a recursion depth of 66 to
compile the test successfully. (The default value here is 64.)

I still believe that the recursion depth is a function of the code, and not
the compiler, but there seems to be a difference in how the recursion depth
is calculated.
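
Just to illustrate what I mean by "a function of the code" (this is only a
hypothetical sketch, not the test that actually fails): each level of the
countdown below forces one more nested instantiation, so the depth the
compiler must support is fixed by the code alone; the only toolset-specific
part is how you raise the limit, e.g. -ftemplate-depth-NN for GCC.

  // countdown.cpp - hypothetical example, not the failing Boost test
  template<int N>
  struct countdown
  {
      static const int value = countdown<N - 1>::value + 1;
  };

  template<>
  struct countdown<0>
  {
      static const int value = 0;
  };

  int main()
  {
      // Instantiating countdown<75> needs roughly 76 nested
      // instantiations; with GCC this compiles only if the
      // template depth limit is at least that large, e.g.
      //   g++ -ftemplate-depth-100 countdown.cpp
      return countdown<75>::value;
  }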

> Also, the number of full test cases is dependent on how large uintmax_t
> is; what happens when computers get bigger and/or use a different value
> outside of the 8/16/32/64-bit mindset?

Well, you would have to increase the depth, of course.
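
To illustrate why the width of uintmax_t matters (again only a hypothetical
sketch, not the code in question): a metafunction that peels one bit off
its argument per instantiation recurses once per bit, so a 64-bit uintmax_t
already drives the required depth to about 64, well past the 17 the
standard suggests, and a wider type would push it further still.

  // bit_count.cpp - hypothetical sketch; unsigned long long stands in
  // for uintmax_t here
  template<unsigned long long V>
  struct bit_count
  {
      // one nested instantiation per bit peeled off V
      static const int value = 1 + bit_count<(V >> 1)>::value;
  };

  template<>
  struct bit_count<0>
  {
      static const int value = 0;
  };

  int main()
  {
      // With a 64-bit type the worst case recurses 64 times, so the
      // suggested minimum depth of 17 cannot be enough, and a 128-bit
      // integer type would double the requirement again.
      return bit_count<~0ull>::value; // 64 on common platforms
  }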

> Is the problem affecting every test computer with these compilers, or
> just yours?

It affects every C++ compiler that has a limit on the maximum recursion
depth. (Keep in mind that the C++ standard only asks implementations to
support a template recursion depth of at least 17.)

> If we use a default value for a parameter, it can increase as the
> creator updates the product; if we fix the value, then the burden of
> vigilance falls to us.

Well, Boost has lived with hard-coding an arbitrary value for the template
recursion depth for g++ for years. A standard-conforming compiler sticking
to the suggested minimum instantiation depth of 17 wouldn't be able to
compile the test at all. Therefore I think it's better, if possible, to
make it explicit in the Jamfile that this parameter needs tuning.

I'm not set on any particular value for the parameter; if the hard-coded
128 has worked for GCC for years, why not use <c++-template-depth>128?

Markus

