From: Beman Dawes (bdawes_at_[hidden])
Date: 2004-09-10 12:24:29


At 11:49 AM 9/10/2004, Ben Hutchings wrote:
>Beman Dawes <bdawes_at_[hidden]> wrote:
><snip>
>> Normally we would have a regression test that identifies the systems
>> which fail. But we don't test large file support directly because to
>> do so would be a burden on those who run the regression tests; such
>> tests would chew up gigabytes of disk space and might also be very
>> slow.
><snip>
>
>Why so? Unix file systems support sparse files, as does NTFS.

I don't want to introduce a requirement that the regression tests be run
on file systems which either support sparse files or have lots of space
available. If someone wants to run the tests on a FAT file system, I'd
like that to be practical.
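
Just for illustration (this is a sketch, not code from the Boost tree,
and it assumes std::streamoff is 64 bits wide): a direct large-file test
would do something like the following, seeking just past the 4 GiB
boundary and writing a single byte. On a file system with sparse-file
support the result costs almost no disk space, but on FAT the full 4 GiB
would actually have to be allocated.

    #include <cstdio>
    #include <fstream>
    #include <iostream>

    int main()
    {
        // Offset just past 4 GiB; assumes a 64-bit std::streamoff.
        const std::streamoff past_4gib = (std::streamoff(1) << 32) + 1;

        {
            std::ofstream file("large_file_test.tmp", std::ios::binary);
            file.seekp(past_4gib);  // seek past the 4 GiB boundary
            file.put('\0');         // one byte extends the file past 4 GiB
            file.flush();
            if (!file)
            {
                std::cerr << "seek/write past 4 GiB failed\n";
                return 1;
            }
        }

        std::remove("large_file_test.tmp"); // remove the (possibly sparse) file
        std::cout << "file extended past 4 GiB\n";
        return 0;
    }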

If there were an overwhelming benefit to some test needing a particular
environment, that might be another matter. But it doesn't seem to me that
an actual large-file test offers much advantage over a surrogate, as long
as the surrogate accurately reflects reality. We will see whether that is
the case.
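
One plausible shape for such a surrogate (just a sketch of the idea, not
the actual test in the filesystem library) is to check that the stream
offset type in use is wide enough to represent positions beyond 4 GiB,
which exercises the large-file configuration without writing any data:

    #include <fstream>
    #include <iostream>

    int main()
    {
        // If the offset type is only 32 bits, positions beyond 4 GiB cannot
        // even be represented, so an actual large-file test could not pass.
        const bool wide_enough = sizeof(std::streamoff) >= 8;

        std::cout << "sizeof(std::streamoff) = " << sizeof(std::streamoff)
                  << (wide_enough ? " -- large offsets representable\n"
                                  : " -- large offsets NOT representable\n");
        return wide_enough ? 0 : 1;
    }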

Thanks,

--Beman

