
Subject: Re: [boost] Summer of Code 2010
From: Darren Garvey (darren.garvey_at_[hidden])
Date: 2010-03-07 21:56:26


Hi Andrew,

Thanks for taking this on!

On 7 March 2010 20:25, Andrew Sutton <andrew.n.sutton_at_[hidden]> wrote:

> It looks like we are not doing a very good job keeping up with past SoC
> work :) but the results are encouraging (I think, given the nature of
> Boost).
>

I have kept an eye on the past projects by means of the Boost SVN:

https://svn.boost.org/trac/boost/browser/sandbox/SOC

Just from the activity there, it's hard to tell how far along any of the
projects actually are. Given the perfectionist nature of Boost, it's not
always easy for the people involved to know exactly where things stand
either... ;)

At a minimum, it would be good if there were an "official" round-up of each
SoC a couple of months after the pens-down date.

> The number of slots allocated to Boost has been declining each year. From
> 2006 to 2009 we had 9, 9, 8, and 7 slots to fund students. This is not a
> particularly good trend, but that number seems to depend on the number and
> quality of ideas and the availability of mentors.
>

... add to that the success rate of previous years?

> <snip>
>
> First, it's tough for students to get up to speed with Boost and meet the
> exacting requirements of reviewers. Second, [by] the time the students are
> really up to speed, it's back to school and the work basically stop[s].
>

An accurate observation. The SoC is not very long at all, and if it takes a
student a week or two to get up to speed with Boost.Test, SVN and
Boost.Build, that is a big chunk of time lost right at the start of the
project. Students also tend to do crazy things like go away on holiday or
have relatives come to stay, so the projects really need to be tight if they
are going to succeed.

I think it would help if there were a rule requiring a commit to SVN each
week. Students may be hesitant to display incomplete code publicly, but it's
almost impossible to gauge how things are going otherwise.

> One way to improve those numbers is to develop a set of goal projects with
> precisely specified requirements... We want to have this data structure,
> that implements these operations, and no, you can't use libgmp. These
> projects should define a clear path to acceptance (testing, docs, formal
> review, etc.).

If students were encouraged to write tests at the start of the SoC with
their mentor, they would have a specific set of goals to work towards.
Mentors should be well placed to help the students define the precise
requirements of the project. The path to acceptance from there is simple:
get all the tests passing and document how the library does it. Even if that
doesn't happen by the end of the SoC, there is still a definite goal.
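
To make that concrete, here is a minimal sketch of a day-one "goal test"
using Boost.Test, taking Andrew's gmp-free bignum as the example. (big_int
and its header are hypothetical placeholders for whatever interface the
project is actually meant to deliver.)

  #define BOOST_TEST_MODULE soc_goal_tests
  #include <boost/test/included/unit_test.hpp>

  // Hypothetical interface the finished project is expected to provide;
  // this header doesn't exist until the student writes it.
  #include "big_int.hpp"

  BOOST_AUTO_TEST_CASE(construct_from_decimal_string)
  {
      big_int n("12345678901234567890");
      BOOST_CHECK_EQUAL(n.to_string(), "12345678901234567890");
  }

  BOOST_AUTO_TEST_CASE(addition_is_commutative)
  {
      big_int a("12345678901234567890");
      big_int b("98765432109876543210");
      BOOST_CHECK(a + b == b + a);
  }

Every test fails on day one, naturally, but "make these pass" is exactly the
kind of unambiguous target a student can be measured against week by week.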

> > If anybody who worked on a project as a student or mentor has
> > information about it or its status, please let me know.
>
> Still a valid request :)
>

Just chiming in with a brief status update on the CGI library: it is in
working order and the interface is largely stable now. It needs some more
housekeeping and a more complete test suite before it is ready for review,
and there is some internal refactoring I'd like to do to support more
advanced uses.

Support for sessions is included (finally) and I've been rewriting the
documentation over the last couple of weeks, as there have been significant
changes since last year's (ultimately unsuccessful) SoC project. I plan to
post to the vault once that's ready.

Cheers,
Darren

