
Subject: [boost] tweaking the review process (was: signals2 review results)
From: Stjepan Rajko (stjepan.rajko_at_[hidden])
Date: 2008-11-21 11:38:33


On Thu, Nov 20, 2008 at 11:40 PM, vicente.botet
<vicente.botet_at_[hidden]> wrote:
>
> I would like to make some suggestion to improve the review management:
>

Thank you for starting this discussion.

> * I had noticed only at the end of the review that the review takes place on two mailing lists (devel and user). Maybe this is usual for Boost reviews, but I was not aware of it. It would be more transparent if all the reviews were posted to the same mailing list; maybe a specific one should be created.

Yes, this is a bit of an issue. I mentioned that both lists are used
in my post that opened the review (in my notes to first-time
reviewers) to try to give people a heads-up (but... I often get
long-winded and I'm sure it's easy to miss parts of my posts :-)).

Having a dedicated, or at least recommended mailing list (either dev
or user) might be a good thing. The only problem I see regarding a
completely separate mailing list is that it requires active effort to
sign-up for and follow. I think that for many people (even those that
are genuinely interested in the library), the fact that the review is
happening at time X-Y very easily slips off the radar. Having the
e-mails constantly appear in the common lists also serves to remind
everyone of the ongoing review, and I suspect it leads to many
valuable impromptu comments and even full reviews from people that
perhaps weren't originally planning on participating in the review,
but end up following the review because it happens on the list (I
recall being in this situation myself several times).

>
> * Of the 5 committed reviewers who made this review possible, only 3 submitted a review, and 2 of those were late. I'm wondering whether this new rule should be preserved, as the review can be accepted without the committed reviewers' reviews.

Trying to use committed reviewers was a new thing for this review (it
was proposed by others in http://tinyurl.com/48tdjs). Now that it has
been tried out, it would be good to reflect on how things went.

For the reviews that were late, the reviewers very diligently kept me
informed of their changing time constraints and offered times that
would work for them, which I found acceptable. Out of
the two committed reviewers that did not submit, one also diligently
worked with me on setting up an alternative timeline, but in the end
contacted me saying that a full review would not be possible after
all, and offered a brief report of looking into the library and a
brief rationale for acceptance (as the points matched those of other
reviewers, I didn't forward it to the list). I was not able to
contact the final reviewer after a certain point.

I think the major benefit of the committed reviewers is the high
probability that they actually will submit a review. Things always
happen, and Boost contributions are all volunteer time, but I found
that overall the committed reviewers *very responsibly* succeeded in
keeping their commitment as well as they could. Were they not
committed reviewers, I suspect many of them would have considered
giving up. In addition, the extensions that the committed reviewers
negotiated gave other reviewers a chance to contribute.

So, I feel that committed reviewers are a good way of reasonably
ensuring that a certain number of reviews will be submitted. I'm not sure
that the library acceptance / rejection should be influenced by the
number of committed reviews (reviews from others are a perfectly good
substitute).

While we're on this topic... The one thing that the number of
committed reviewers should definitely influence is whether the review
happens in the first place. We were fortunate to get 5 people to sign
up, which was the only threshold proposed, so we followed through with
the review. I was wondering what to do if the threshold wasn't
crossed, and I think what I would have proposed is some sort of a
holding pattern, where the review schedule page is updated to say
something like "signals2 - reviewers needed, contact review manager to
sign up as a reviewer". Once the threshold is crossed, the reviewers,
manager and author can schedule a time.

Another thing... many of the committed reviewers reported that they
were first time reviewers (and the reviews they submitted were
impressively detailed and very valuable). Personally, I was thrilled
by this, since getting new participation in Boost is critical to
sustaining its quality. Perhaps some combination of allowing
reviewers to commit beforehand, and explicitly encouraging reviews
focused on the user perspective (another suggestion from the
http://tinyurl.com/48tdjs thread), helped in getting first-time
reviewers (perhaps the reviewers can comment on this?).

>
> * There were some reviews that came to this list via the intermediation of the review manager, with a delay between the reviewer's posting and the forward from the RM. One negative review was posted the 4th and received on this list the 11th; another, positive, was posted the 2nd and received on this list the 3rd. I think the review manager should not encourage reviewers to send their reviews to him directly; this would avoid this kind of delay. So I propose that only the reviews sent to this single mailing list be taken into account.

Yes, this particular delay is entirely my fault. I can't explain why
I didn't notice this email until so late (which is the point where I
contacted the poster notifying him I would like to forward the mail,
and forwarded it the following morning).

Allowing reviews to be sent to the review manager is straight from
http://www.boost.org/community/reviews.html (Introduction paragraph).
There are two valid reasons for this, IMO:

* the reviewer is only subscribed to one of the {boost, boost-user,
boost-announce} lists, and would like the review to be forwarded to
both the boost and boost-user lists (if there were a dedicated review
list, like you suggest, that was also open to all posters, then this
reason would go away)

* the reviewer would like to remain anonymous to the list (granted,
this could also be accomplished by sending from an anonymous e-mail).

If this stays as it is, the RM should be more diligent about
monitoring their personal mail, which I apparently wasn't.

>
> * Even though the review was over on the 10th, there were 2 accepting reviews coming from the committed reviewers just hours before the review result announcement on the 19th. I think the review manager must state clearly when the review is over and not accept any review after that. This does not mean that the RM cannot change this date, but it should be announced clearly.

I tried to keep my timeline as transparent as possible. In my
review-closing email, I asked people to let me know if they were
considering writing a review. Those that contacted me were informed
of the timeline of pending reviews. When the last promised review was
submitted, I set a hard deadline. Given the uncertainty in people's
schedules, I decided to approach this in a flexible way. That was
just my personal preference - I'm sure setting a hard deadline earlier
would have been a fine choice as well.

>
> I hope this will help to improve future reviews,
>

Thanks again for starting this discussion and for your suggestions - I
also hope good improvements will come out of it.

Stjepan


Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk