From: Daryle Walker (darylew_at_[hidden])
Date: 2008-08-08 22:24:12
On Aug 4, 2008, at 3:27 AM, Markus Schöpflin wrote:
> we still should come to a decision regarding this issue. Can't we
> just make the change and check if there are any new test failures
> caused by it?
> Markus Schöpflin wrote:
>> Daryle Walker wrote:
>>> What happens on systems, like mine, that already have sufficient
>>> recursive depth? Will specifying a maximum lower than the
>>> default actually lower the setting? If so, then this addition
>>> could be dangerous.
>> If your toolset supports setting the recursion depth (gcc, qcc,
>> acc, and hp_cxx at the moment), it will be set to the value
>> specified. So yes, it might lower the default setting.
>> But why should this be dangerous? The recursion depth needed to
>> compile a program is independent of the toolset, isn't it? So if
>> for a given compiler a value lower than the default value is used,
>> there should be no harm.
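For concreteness, here is a minimal sketch (my own illustration, not Boost code) of the kind of recursion those settings govern; with gcc the limit is controlled by the -ftemplate-depth option, and the flag value in the sketch is only an example:

    // Sketch only (not Boost code): a template that needs N levels of
    // recursive instantiation.  If the toolset's limit is set below N,
    // compilation fails; with gcc the limit is set via e.g.
    //   g++ -ftemplate-depth-128 example.cpp
    template <unsigned N>
    struct countdown
    {
        static const unsigned value = countdown<N - 1>::value + 1;
    };

    template <>                 // base case stops the recursion
    struct countdown<0>
    {
        static const unsigned value = 0;
    };

    // Instantiating countdown<200> needs a limit of roughly 200 levels,
    // so a build setting of, say, 128 would break this file even though
    // the compiler's own default might have been high enough.
    static const unsigned deep = countdown<200>::value;

    int main() { return deep == 200 ? 0 : 1; }
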
Is the depth actually independent of the toolset? Also, the number
of full test cases depends on how large uintmax_t is; what happens
when computers get bigger and/or use widths outside of the
8/16/32/64-bit mindset? Is the problem affecting
every test computer with these compilers, or just yours? If we use a
default value for a parameter, it can increase as the creator updates
the product; if we fix the value, then the burden of vigilance falls
on us to keep it current.
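To make the uintmax_t point concrete, a rough sketch (mine, assuming the tests instantiate roughly one case per bit) of how the required instantiation depth tracks the width of uintmax_t rather than any one toolset:

    // Sketch only: recursion whose depth equals the bit width of uintmax_t.
    // With a 64-bit uintmax_t this needs about 64 levels; a future 128-bit
    // type would need twice as many, so a hard-coded limit could become
    // too small even if no toolset changes.
    #include <limits>
    #include <boost/cstdint.hpp>   // boost::uintmax_t

    template <int Bits>
    struct per_bit_case
    {
        // one instantiation per bit position
        static const int count = per_bit_case<Bits - 1>::count + 1;
    };

    template <>
    struct per_bit_case<0>
    {
        static const int count = 0;
    };

    static const int cases =
        per_bit_case<std::numeric_limits<boost::uintmax_t>::digits>::count;

    int main()
    {
        return cases == std::numeric_limits<boost::uintmax_t>::digits ? 0 : 1;
    }
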
--
Daryle Walker
Mac, Internet, and Video Game Junkie
darylew AT hotmail DOT com