Subject: Re: [Boost-docs] [quickbook] Does import work?
From: Rene Rivera (grafikrobot_at_[hidden])
Date: 2011-10-18 03:46:03
On 10/17/2011 3:35 PM, Daniel James wrote:
> On 16 October 2011 20:09, Rene Rivera<grafikrobot_at_[hidden]> wrote:
>> On 10/16/2011 11:38 AM, Daniel James wrote:
>>>
>>> So you don't want to include the whole headers, just the documentation
>>> embedded in them?
>>
>> Right.
>>
>>> This feels like a bit of an odd thing to do in quickbook
>>> itself,
>>
>> Is it any odder than the existing import facility?
>
> Yes. The existing facility extracts snippets of code for inclusion in
> a quickbook document - so it's still quickbook driven. The C++ files
> are normally examples that are part of the documentation. You want C++
> driven documentation which is something else - especially for C++
> files that are part of the implementation.
I want "source coupled documentation". What I'm working with are
actually neither C++ nor C. I know, weird for a Boost library :-) They
are actually CPP headers only (i.e. meant to work with both C and C++ --
and maybe Objective C if I can stomach doing that). Because of the high
number of totally modular files it would be logistically problematic to
separate the documentation from the headers. In reality the
documentation isn't driven by the "source code" content, but only
coupled to it. This is in distinct contrast to, for example, Doxygen, where
the documentation is predominantly driven by the "source language", which
means coupling of both source and semantics.
>>> I'd probably write a script to extract the documentation first and
>>> then include the extracted documentation from quickbook.
>>
>> That sounds much less maintainable than either extending quickbook or
>> manipulating Doxygen to give the results I want. Especially given that I'd
>> have to then deal with writing multiple scripts, sh/BAT, or use Python.
>
> Then write it in C++, which would be more maintainable than doing it
> in quickbook (actually writing it in python would be as well).
Well.. I am writing it in C++. Just that the C++ is inside quickbook ;-)
I'll have to see about maintainability, though.
> Consider adding glob support to quickbook. First it will need to
> integrate with quickbook's path system.
I would be willing to consider extending the path system to allow this
use case in general, as it seems like it would be useful in other contexts.
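To sketch what I mean -- purely hypothetical syntax, nothing that exists
today -- the import would take a glob and expand it against quickbook's
include path:
====predef.qbk (hypothetical)====
[section Reference]
[import boost/predef/*.h]
[endsect]
====
With each matching header imported as if it had been listed individually.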
> An external tool wouldn't have
> to do that. And whichever way you do it is going to be a bit fragile
> since it'll be easy to accidentally pick up files.
That seems like a weak excuse to not do something. If the semantics are
simple and limited it's much easier for people to rely on, and work with,
those semantics.
> Then when it's
> enshrined as a feature, someone will want to do a more sophisticated
> glob, they'll want to control the order or exclude certain files, or
> who knows what else.
Probably, yes. But I don't think that's a good reason to not do
something. One of the ideals of quickbook from the start was to provide
for 90% of the use cases as simply as possible, and either ignore the
other 10%, or not make it impossible but just painful.
> If you look at the way quickbook parses C++ files to get the snippets,
> the parser just looks for specially marked up comments and generates
> templates from them which are only parsed as quickbook on expansion.
> It isn't hard to do something like that in an external tool.
Right.. But it's already done in quickbook. So why would I want to write
another tool to do it, rather than just extend the current tool?
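For reference, the snippet mechanism we're both talking about looks
roughly like this (names made up for illustration):
====example.qbk====
[import stub.cpp]
[foo_snippet]
====
====stub.cpp====
//[ foo_snippet
int foo()
{
    return 42; /* appears in the docs */
}
//]
====
The import turns each marked snippet into a template, which is only
parsed as quickbook when it's expanded.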
> And quickbook is already complex enough. There's plenty that needs
> fixing already. Adding features that should be implemented elsewhere
> will just make things worse.
I guess we disagree on what makes for good quickbook features :-(
>>> I suppose you could use special comments to strip out the parts you don't
>>> want to appear, but I expect it'd get messy.
>>
>> Is it any messier than the existing import facility? Note, IMO it's not only
>> not messier than the existing import, it's actually less messy.
>
> I meant it's messy to use special comments to ignore the parts you
> don't want to appear, e.g. '//<-' and '//->' pairs. You'll also find
> that if you want to define code snippets but not have them appear
> inline it'll run into difficulties. Implementation files tend to be
> messy, much of the doxygen documentation has preprocessor directives
> for controlling what doxygen sees vs. what the compile sees, doing
> something similar for quickbook would either involve making the
> headers messier or making quickbook more complex.
I guess you misunderstood the extent of what I want and the context of
the example I posted. To clarify..
* I don't want to consider any form of parsing or coupling of the source
code semantics to the documentation content.
* The only parsing involved here is what quickbook already does.
* What I want is to automate what I would otherwise have to do to couple,
at the file level, the documentation to the code it corresponds to.
Perhaps if I make the example simpler it will be obvious:
====predef.qbk====
[article Predef ...]
[section Reference]
[import boost/predef/version_number.h]
[endsect]
====
====boost/predef/version_number.h====
/*`
[heading BOOST_VERSION_NUMBER macro]
Blah, blah:
* One
* Two
*/
====
Note, the only parts I removed are the CPP code parts, i.e. the parts I
want quickbook to ignore.
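For completeness, the unstripped header would look roughly like the
following -- the macro body is illustrative only, not necessarily the
real definition. Everything outside the /*` ... */ comment is what I
want quickbook to skip over:
====boost/predef/version_number.h====
/*`
[heading BOOST_VERSION_NUMBER macro]
Blah, blah:
* One
* Two
*/

#ifndef BOOST_PREDEF_VERSION_NUMBER_H
#define BOOST_PREDEF_VERSION_NUMBER_H

/* Illustrative definition only. */
#define BOOST_VERSION_NUMBER(major,minor,patch) \
    ((((major)%100)*10000000)+(((minor)%100)*100000)+((patch)%100000))

#endif
====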
>> Also note, I'm not
>> talking about needing this for a handful of files.. I'm dealing with
>> documenting dozens of individual files.
>
> That sounds like a good reason to use something with good file
> handling capabilities.
I guess you are saying that the "main" / "only?" reason not to do this
is that the current quickbook path handling is terribly written? That
seems like just another bad excuse :-\
--
-- Grafik - Don't Assume Anything
-- Redshift Software, Inc. - http://redshift-software.com
-- rrivera/acm.org (msn) - grafik/redshift-software.com
-- 102708583/icq - grafikrobot/aim,yahoo,skype,efnet,gmail