Subject: [Boost-users] (no subject)
From: Johnny McCullough (johnny.mccullough_at_[hidden])
Date: 2014-10-30 19:50:32
I'm new to boost, and I'm wondering about the best / most efficient way to
transfer a large dataset via boost::asio.
I've retrieved a dataset from a database (Cassandra), and I'm using
Boost.Asio to provide a network service that responds to queries.
So I have a client socket and a dataset. I have to iterate the dataset,
perform some simple formatting (e.g. CSV, JSON, XML), then write it out to
the client socket. But there's a lot of data - perhaps gigabytes.
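For concreteness, the per-row work is roughly this shape (Row and
format_csv are made-up stand-ins for the driver's actual result types,
just to show what I mean by "simple formatting"):

#include <string>
#include <vector>

// Hypothetical stand-in for one row of the Cassandra result set.
struct Row {
    std::vector<std::string> columns;
};

// Format one row as a CSV line - the per-row work that has to happen
// somewhere between the resultset and the socket.
std::string format_csv(const Row& row) {
    std::string line;
    for (std::size_t i = 0; i < row.columns.size(); ++i) {
        if (i != 0) line += ',';
        line += row.columns[i];
    }
    line += '\n';
    return line;
}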
So my initial thought was the producer/consumer pattern - the producer
writing to a streambuf and the consumer async_write-ing it out to the
client's socket.
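Concretely, the shape I had in mind was something like this (untested
sketch, error handling omitted; all the names are my own):

#include <boost/asio.hpp>
#include <mutex>
#include <ostream>
#include <string>

using boost::asio::ip::tcp;

class Session {
public:
    explicit Session(tcp::socket socket) : socket_(std::move(socket)) {}

    // Producer side: called from the thread iterating the resultset.
    void produce(const std::string& formatted) {
        std::lock_guard<std::mutex> lock(mutex_);
        std::ostream os(&buffer_);
        os << formatted;
    }

    // Consumer side: drain whatever has accumulated to the socket.
    void consume() {
        std::lock_guard<std::mutex> lock(mutex_);
        if (writing_ || buffer_.size() == 0) return;
        writing_ = true;
        // NB: async_write keeps reading from buffer_ until the whole
        // operation completes, outside the lock - appending concurrently
        // from produce() is exactly the thread-safety problem I describe
        // below.
        boost::asio::async_write(socket_, buffer_,
            [this](boost::system::error_code ec, std::size_t /*n*/) {
                std::lock_guard<std::mutex> lock(mutex_);
                writing_ = false;
                // on success: schedule another write if more data arrived
            });
    }

private:
    tcp::socket socket_;
    boost::asio::streambuf buffer_;
    std::mutex mutex_;
    bool writing_ = false;
};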
But I'm not sure this is a good approach - I know that streambuf isn't
thread-safe, so managing it with locks may well become a performance
bottleneck.
Obviously, I don't want to replicate what may be gigabytes of data from a
resultset into a buffer just so it can be written out to a socket. So
should I write this data synchronously, straight to the socket, instead?
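i.e. something like this instead (again an untested sketch, reusing the
hypothetical Row / format_csv from above), which only ever holds one small
chunk in memory rather than a second copy of the whole dataset:

#include <boost/asio.hpp>
#include <string>
#include <vector>

using boost::asio::ip::tcp;

// Format rows into a small chunk, block until it's on the wire, reuse it.
void stream_resultset(tcp::socket& socket, const std::vector<Row>& rows) {
    std::string chunk;
    chunk.reserve(64 * 1024);
    for (const Row& row : rows) {
        chunk += format_csv(row);
        if (chunk.size() >= 64 * 1024) {   // flush roughly every 64 KiB
            boost::asio::write(socket, boost::asio::buffer(chunk));
            chunk.clear();
        }
    }
    if (!chunk.empty())
        boost::asio::write(socket, boost::asio::buffer(chunk));  // tail
}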
What's the best approach to solving this kind of problem?
Thanks for your consideration.