Subject: Re: [boost] [smart_ptr] Interest in the missing smart pointer (that can target the stack)
From: Noah (duneroadrunner_at_[hidden])
Date: 2016-01-31 15:59:35
On 1/30/2016 11:16 AM, Steven Watanabe wrote:
> It's not just about optimization. Initializing
> a variable with a bogus value is no more correct
> than leaving it uninitialized, and also prevents
> tools like valgrind from detecting any real problems.
I wonder though, does the same argument apply to say, std::vector? I
mean, is the default initialization of std::vector to the empty state no
more correct than leaving it uninitialized? Should we require
programmers to explicitly set the vector state, even if they want to
start off with an empty vector? Or is the empty state somehow
intrinsically valid, but the zero value for integers is not? If we did a
random sample of C++ code on github, what percentage of integers would
be initialized to zero? What percent of std::vectors would be
initialized to a state other than empty?
I genuinely wonder; Google doesn't seem to know. I don't have a strong
opinion either way, but it's not obvious to me that the zero value for
integers is any more bogus than the empty state for vectors.
For native integers the language had to make a choice, and for
performance reasons, not bug-finding reasons, C chose no default
initialization.
But when using substitute classes, I don't know if it's an either-or
situation. This is just off the top of my head, but let's say the
default substitute integer class requires its value to be set
explicitly before use. And let's say it enforces this by throwing an
exception, in debug mode, if it's used before explicit initialization.
And let's say, for performance reasons, it didn't do any
"under-the-hood" default initializations. Let's call this class
CBaseInt. But then let's say, some of us would prefer an
"under-the-hood" default initialization (to guarantee that the resulting
release code was deterministic), but would still want to require
explicit initialization before use by the programmer. We could then just
publicly derive a class from CBaseInt called CDeterministicBaseInt, and
do the default initialization in CDeterministicBaseInt's constructors.
And let's say some lazy people don't want to have to explicitly
initialize before use. They could just derive a class from
CDeterministicBaseInt called CIntForLazyPeople. CIntForLazyPeople could
disable the "use before initialization" exceptions by calling a
function provided by CBaseInt.
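To make the idea concrete, here is a minimal sketch of that hierarchy.
Everything in it is illustrative, not a worked-out proposal: the member
names, the protected set_default_value()/disable_init_check() hooks, and
the NDEBUG-guarded check are just one way the scheme could be wired up.

```cpp
#include <stdexcept>

// CBaseInt: no "under-the-hood" default initialization. In debug
// builds (NDEBUG not defined), reading the value before it has been
// explicitly set throws.
class CBaseInt {
public:
    CBaseInt() = default;                      // value left indeterminate
    CBaseInt(int v) : m_value(v), m_initialized(true) {}

    CBaseInt& operator=(int v) {
        m_value = v;
        m_initialized = true;
        return *this;
    }

    operator int() const {
#ifndef NDEBUG
        if (!m_initialized)
            throw std::logic_error("use before initialization");
#endif
        return m_value;  // in release builds, reads whatever is there
    }

protected:
    // For derived classes: supply a deterministic default value
    // without marking the object as explicitly initialized.
    void set_default_value(int v) { m_value = v; }

    // For derived classes that want to opt out of the debug check.
    void disable_init_check() { m_initialized = true; }

private:
    int m_value;                 // deliberately not initialized here
    bool m_initialized = false;
};

// CDeterministicBaseInt: zero-initializes under the hood so release
// builds are deterministic, but the "use before explicit
// initialization" exception is still armed in debug builds.
class CDeterministicBaseInt : public CBaseInt {
public:
    CDeterministicBaseInt() { set_default_value(0); }
    using CBaseInt::operator=;
};

// CIntForLazyPeople: deterministic zero default, and the debug-mode
// check is switched off, so it can be read immediately.
class CIntForLazyPeople : public CDeterministicBaseInt {
public:
    CIntForLazyPeople() { disable_init_check(); }
    using CDeterministicBaseInt::operator=;
};
```

So `CBaseInt a; int x = a;` would throw in a debug build, while
`CIntForLazyPeople c;` reads as zero right away.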
Then you could just use whichever integer class you prefer. Would this
satisfy everyone? Is this ideal?
So I do accept the notion that requiring explicit initialization before
use does help catch and prevent bugs. How many, I don't know. But I
don't agree with the idea that all of C++'s language interface should be
determined by valgrind's ability to find "use-before-initialization"
bugs in debug mode. I am not suggesting that C++'s legacy
high-performance language interface be abolished. I'm suggesting that
those of us trying to write "secure" and/or high level applications need
a different interface that is not encumbered by C's legacy priorities.
Specifically, we need the option of primitive types that have the power
and flexibility of full-fledged classes.
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk