
From: Andrzej Krzemienski (akrzemi1_at_[hidden])
Date: 2023-08-18 10:46:59


On Fri, Aug 18, 2023, 03:36 Klemens Morgenstern via Boost <
boost_at_[hidden]> wrote:

> On Fri, Aug 18, 2023 at 6:51 AM Andrzej Krzemienski via Boost
> <boost_at_[hidden]> wrote:
> >
> > Thu, 17 Aug 2023 at 23:55 Ruben Perez <rubenperez038_at_[hidden]>
> > wrote:
> >
> > > > One more question. This interface of async::generator<Out, In>,
> taking
> > > two
> > > > parameters, where one can not only generate values from the
> generator,
> > > but
> > > > also obtain values: is there a real-life use case for this?
> > >
> > > I'd say major languages like Python and JS allow for this, too.
> > > So if you're coming from these, it makes sense.
> > >
> >
> > Thanks, but still,
> > could someone show a plausible real-life example of this written in
> > Boost.Async?
> > I am not familiar with Python's or JS's coroutines. But do they have an
> > *identical* interface?
>
> Not identical, you need to call `send` in python, instead of operator().
>
> >
> > When I was trying to come up with an example, I found the results
> > surprising:
> >
> > auto output1 = co_await generator(input1);
> > auto output2 = co_await generator(input2);
> >
> > I expected that this instruction would mean "take input2, suspend, and
> when
> > resumed return value computed from input2". But because the
> implementation
> > in the coroutine has to read:
> >
> > auto next_input = co_yield compute(input);
> >
> > The consequence is that the co_awaits actually mean "take input2,
> suspend,
> > and when resumed return value computed from input1".
>
> generators have this "weird" kind of overlap by their nature.
>

I agree with the observation. I mean, the weirdness comes into play when we
employ the mechanism for injecting values into the generator. This is why I
am asking for any use case that would be served by this feature. My
hypothesis is that it is useless. Useless in Python and JS, and now it is
copied into Boost.Async. I may be wrong about this. This is why an example
of a plausible use case would help prove me wrong.
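
To make the overlap concrete, here is a minimal sketch of the call pattern I
have in mind, written against the interface as I understand it from this
thread. The header path, the namespace alias and the driver's type
(async::promise<void>) are my assumptions, so please treat it as a sketch
rather than a verified program:

#include <boost/async.hpp>   // assumed umbrella header
#include <iostream>
#include <string>

namespace async = boost::async;   // assumed alias

// Eager generator: the body runs up to the first co_yield on creation,
// before any value has been pushed in.
async::generator<std::string, int> echo()
{
    int input = 0;
    for (int i = 0; i < 3; ++i)
        input = co_yield "computed from " + std::to_string(input);
    co_return "computed from " + std::to_string(input);
}

async::promise<void> driver()      // the driver's type is an assumption
{
    auto g = echo();
    auto output1 = co_await g(1);  // returns "computed from 0"; the pushed 1
                                   // only feeds the *next* co_yield
    auto output2 = co_await g(2);  // returns "computed from 1"
    std::cout << output1 << " / " << output2 << std::endl;
}

The outputs lag the pushed inputs by one step, which is exactly the overlap
described above.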

> They can be made lazy but then the inner workings get utterly
> confusing too, because where does the input1 come from before the
> co_yield?
>
> i.e. in your example:
>
> async::generator<Result, Task> client(Task t)
> {
>     for (int i = 0; i < 100; ++i)
>     {
>         std::cout << "processing: " << t.value << std::endl;
>         t = co_yield Result{ std::format("result-{}-{}", i, t.value) };
>     }
>     co_return Result{ std::format("result-{}-{}", 100, t.value) };
> }
>
> when I do the first co_await g(t) - where does `t` go? You're in the
> co_yield using the t passed in through the argument list, so it's not
> clear what's going on either.
> I actually did this in asio::experimental::coro, and I found it worse.
>

I agree that my example is confusing. I was trying to find any application
for the feature (of injecting values into a generator) and I failed. Hence
my hypothesis that it serves no use case.

> There might be an option to support this with a runtime_option, e.g.:
>
> async::generator<Result, Task> client()
> {
>     auto t = co_await async::this_coro::initial; // wait for the first co_await
>                                                  // & make the generator lazy.
>     for (int i = 0; i < 100; ++i)
>     {
>         std::cout << "processing: " << t.value << std::endl;
>         t = co_yield Result{ std::format("result-{}-{}", i, t.value) };
>     }
>     co_return Result{ std::format("result-{}-{}", 100, t.value) };
> }
>

You are describing a potential new feature, right?
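
If so, and if I read the sketch above correctly, the caller would then pair
each pushed value with the output computed from it, without the one-step
overlap. A hypothetical caller (this feature does not exist today, and the
driver's type and the way Task is initialized are my assumptions):

// Relies on the proposed async::this_coro::initial above, which is not
// part of the library today.
async::promise<void> use_client()                // assumed driver type
{
    auto g = client();                           // lazy: the body has not started yet
    auto r1 = co_await g(Task{ .value = "a" });  // assuming Task is an aggregate with
                                                 // a string member; "a" arrives at
                                                 // this_coro::initial, so r1 holds
                                                 // "result-0-a" -- no overlap
    auto r2 = co_await g(Task{ .value = "b" });  // r2 holds "result-1-b"
}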

>
>
> >
> > Maybe I am doing something wrong, I would like to be corrected. The
> > argument that other languages have it is not a valid one for me. I would
> > still like to know if this has a use case when implemented as it is with
> > C++ coroutines.
>
> I think you're just looking for a lazy generator and I made it eager.
> There's no reason I couldn't support both.

I wasn't really requesting a lazy generator (but maybe it is useful). I
just want to understand whether there is any known use case for an eager
generator that has values injected into it. (Because if there isn't, this
would be a basis for criticizing this portion of the library interface.)

Regards,
&rzej;

> >
> > I enclose my example, where I tried to model a producer and consumer
> > situation, and concluded that I couldn't.
> >
> > Regards,
> > &rzej;
> >

