Subject: Re: [boost] [NVL++]: new library and utilities
From: Manfred Doudar (manfred.doudar_at_[hidden])
Date: 2010-09-01 12:47:13
Hello Christian,
On Wed, 1 Sep 2010 11:26:32 -0400
Christian Henning <chhenning_at_[hidden]> wrote:
> Hi Manfred, I tried something similar a while ago when developing a
> GIL ( generic image library ) extension to make OpenCV functionality
> available for GIL data types. It's not multi-threaded.
>
[snip..]
Admittedly, I could have benefited from GIL, but most of my work had
been done by the time GIL was released.
I've looked at your extensions; they are nice and crisp. However, what
I've published goes considerably further, and I would love to hear your
opinion if you could spare the time to look under the hood. Many of the
unit tests illustrate the functionality.
** Here's an example of face-detection:
camera_device< > camera;
video_server< > server(camera);
video_client< > client(&server);
// algorithm
face_detect ftor("haarcascade_frontalface_alt.xml");
// run it
client(ftor);
** Here's how to thread and connect multiple streams:
using namespace boost;
controller< > ctrl1;
video_server< > server1(&ctrl1, "video1.avi");
video_server< > server2(&ctrl1, "video2.avi");
video_client< > client1(&ctrl1, &server1, &server2);
video_client< > client2(&ctrl1, &server1);
controller< > ctrl2;
video_server< > server3(&ctrl2, "video1.avi");
video_client< > client3(&ctrl2, &server3);
thread ctrl1_thd(ref(ctrl1));
thread ctrl2_thd(ref(ctrl2));
thread server1_thd(ref(server1));
thread server2_thd(ref(server2));
thread server3_thd(ref(server3));
// algorithm
play_stream< > ftor;
// run it
thread client1_thd(bind(&video_client< >::operator()<play_stream< > >,
cref(client1),
ref(ftor)));
// sync off server1 (redundant, only 1 server)
// play stream 2secs from now, then 800ms thereafter
posix_time::ptime now = posix_time::microsec_clock::local_time();
thread client2_thd(bind(&video_client< >::operator()<play_stream< > >,
cref(client2),
cref(server1),
now + posix_time::seconds(2),
posix_time::milliseconds(800),
ref(ftor)));
// sync off server3 (redundant, only 1 server)
thread client3_thd(bind(&video_client< >::operator()<play_stream< > >,
cref(client3),
cref(server3),
now + posix_time::seconds(5),
posix_time::seconds(0),
ref(ftor)));
client3_thd.join();
client2_thd.join();
client1_thd.join();
ctrl2_thd.join();
ctrl1_thd.join();
** Here's an example of algorithm chaining:
- below we do histogram equalization on the image stream,
then phosphene rendering of the stream, and display the result
[note: I've not released either the histeq or phosphene algorithms, but
the library doco details how you'd write such algorithms of your own].
std::vector<variant< pipe::visitable<histeq::streamable> *
, pipe::visitable<phosphene::streamable> *
>
> algo_stream;
// histogram equalization algo
streamable::histeq_streamable<depth_8u, bgr_p> heq_stream;
algo_stream.push_back(&heq_stream);
// phosphene algo
streamable::phosphene_streamable<depth_8u, bgr_p>
p_stream(gain_control,
gamma_correction,
x_count,
y_count);
algo_stream.push_back(&p_stream);
// set up the pipe
pipe::pipe_stream2< depth_8u
, bgr_p
, histeq::streamable< depth_8u
, bgr_p
>::base_type::result_type
, pipe::visitable<histeq::streamable>
, depth_8u
, bgr_p
, pipe::display_tag::type
, pipe::visitable<phosphene::streamable>
, pipe::visitor
, std::vector
> pipe(std::move(algo_stream));
camera_device< > camera;
video_server< > server(camera);
video_client< > client(&server);
// run it.. feed the pipe to client
client(pipe);
Here it is again: http://users.cecs.anu.edu.au/~manfredd/nvl++.tar.bz2
Cheers,
-- Manfred
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk