
Boost :

From: Cristianno Martins (cristiannomartins_at_[hidden])
Date: 2008-04-02 17:52:47


>In my experience, there are very few instances where parallelism can be
>usefully concealed behind a library interface. OpenMP has very high
>overhead and will help only for very long-running functions -- at least
>under Microsoft's compiler on x86 and x64 platforms.

Right, but this depends on the algorithm. For example, take the regular
expression library implementation in Boost (Xpressive). Pattern matching is an
expensive task, and it is exactly the kind of thing that ships in a library
collection like Boost. Researchers have tried to parallelize regular
expression matching, so I would guess there is research based on the
construction of parallel libraries.
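To make the idea concrete, here is a minimal sketch (not Xpressive's actual
API) of one naive way to parallelize matching: split the input into
independent chunks and search each one concurrently. It assumes matches never
span chunk boundaries (e.g. the input was split on line breaks), which is
precisely the hard problem the research has to solve. `count_matches_parallel`
is a hypothetical name introduced here for illustration.

```cpp
#include <cstddef>
#include <future>
#include <iterator>
#include <regex>
#include <string>
#include <vector>

// Count pattern matches in each chunk, searching the chunks in parallel.
// Assumption: no match crosses a chunk boundary.
inline std::vector<std::size_t> count_matches_parallel(
    const std::vector<std::string>& chunks, const std::string& pattern) {
    std::regex re(pattern);
    std::vector<std::future<std::size_t>> futures;
    for (const std::string& chunk : chunks) {
        // Each task scans one chunk independently of the others.
        futures.push_back(std::async(std::launch::async, [&re, &chunk] {
            auto begin = std::sregex_iterator(chunk.begin(), chunk.end(), re);
            return static_cast<std::size_t>(
                std::distance(begin, std::sregex_iterator()));
        }));
    }
    std::vector<std::size_t> counts;
    for (auto& f : futures) counts.push_back(f.get());
    return counts;
}
```

Even this toy version shows why the library, not the caller, should own the
decision: the chunking strategy and the thread count are matching-internal
details.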

>Algorithms that
>are likely to be applied to very large datasets could have the OMP
>pragmas inserted optionally but they would need to be protected by
>#ifdef logic (or given distinct names) because otherwise the overhead
>will destroy programs that make more frequent calls on smaller datasets.

OK, but a feasibility study of the parallel version of an algorithm should be
part of the parallelization task itself. Not every parallelizable task is
actually worth running in parallel.
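The #ifdef-plus-threshold guard described in the quote above can be sketched
as follows. `BOOST_USE_OPENMP` and the cutoff value are illustrative
assumptions, not anything Boost actually defines; real code would measure the
crossover point.

```cpp
#include <cstddef>
#include <vector>

// Scale every element of `data` by `factor`. The OpenMP pragma is compiled
// in only when BOOST_USE_OPENMP is defined (hypothetical macro), and even
// then it fires only above a size threshold, so frequent calls on small
// datasets never pay the fork/join overhead.
inline void scale_all(std::vector<double>& data, double factor) {
#if defined(BOOST_USE_OPENMP)
    const std::size_t kParallelThreshold = 100000;  // assumed cutoff
    if (data.size() >= kParallelThreshold) {
        #pragma omp parallel for
        for (long i = 0; i < static_cast<long>(data.size()); ++i)
            data[i] *= factor;
        return;
    }
#endif
    // Serial fallback: the only path when OpenMP support is compiled out.
    for (std::size_t i = 0; i < data.size(); ++i)
        data[i] *= factor;
}
```

The feasibility study would then amount to choosing (or rejecting) the
threshold per algorithm and per platform.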

>Also consider that, in an application that is
>already parallelized, there are no extra cores for the library to use

Well, but what if the library itself is already parallelized? Then the extra
work of parallelizing the application on top of it might turn out to be
unnecessary.

>On a smaller scale, adding "vectorized" and/or "streaming
>producer-consumer" interfaces for selected libraries may help a lot by
>encouraging use of vector instruction / execution units, unrolling of
>loops and improving instruction and data locality.

This is a pretty good idea.
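As a rough illustration of the contrast, assuming hypothetical names: a
per-element entry point versus a batch ("vectorized") entry point for the
same computation. The batch form gives the compiler one tight loop to unroll
and auto-vectorize, and it walks the data sequentially, which helps locality.

```cpp
#include <cmath>
#include <cstddef>

// Per-element interface: one call per value, no loop for the
// compiler to vectorize across calls.
inline double norm_one(double x) { return std::sqrt(x * x + 1.0); }

// Batch interface over the same computation: a single tight loop
// that the auto-vectorizer can unroll and map to vector units.
inline void norm_batch(const double* in, double* out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = std::sqrt(in[i] * in[i] + 1.0);
}
```

A streaming producer-consumer variant would similarly hand the library runs
of elements rather than single values.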


Cristianno Martins
Master's student in Computer Science
State University of Campinas
skype: cristiannomartins
gTalk: cristiannomartins
msn: cristiannomartins_at_[hidden]
