Solved, by moving the decode() inside the lambda and adding `.template` for GCC (MSVC was happy w/o it...).
A miraculous inspiration, I guess. Unusual for the ML to be this silent. Must have been a question asked the wrong way. --DD

    static void decode(Decoder& d, std::variant<Ts...>& var) {
        const size_t index = d.decodeUnionIndex();
        if (index >= sizeof...(Ts)) {
            throw avro::Exception("Invalid Union index");
        }
        // mp_with_index maps the runtime union index to a compile-time constant,
        // so the matching alternative can be emplaced and decoded in place.
        boost::mp11::mp_with_index<sizeof...(Ts)>(
            index, [&](auto Idx) {
                constexpr size_t alternative_index = Idx;
                // `.template` is needed for GCC to parse emplace<> on the dependent type.
                avro::decode(d, var.template emplace<alternative_index>());
            }
        );
    }
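
For completeness, here's a sketch of the full codec_traits specialization this decode() sits in, with the encode side handled via std::visit (encodeUnionIndex() being the Encoder counterpart of decodeUnionIndex()). Not tested as-is, just to show the shape:

    #include <variant>
    #include <boost/mp11.hpp>        // boost::mp11::mp_with_index
    #include <avro/Specific.hh>      // avro::codec_traits, avro::encode/decode
    #include <avro/Encoder.hh>
    #include <avro/Decoder.hh>
    #include <avro/Exception.hh>

    namespace avro {

    template <typename... Ts>
    struct codec_traits<std::variant<Ts...>> {
        static void encode(Encoder& e, const std::variant<Ts...>& var) {
            // Write the zero-based union index, then encode the active alternative.
            e.encodeUnionIndex(var.index());
            std::visit([&](const auto& value) { avro::encode(e, value); }, var);
        }

        static void decode(Decoder& d, std::variant<Ts...>& var) {
            const size_t index = d.decodeUnionIndex();
            if (index >= sizeof...(Ts)) {
                throw avro::Exception("Invalid Union index");
            }
            // Turn the runtime index into a compile-time constant, then
            // emplace and decode the corresponding alternative.
            boost::mp11::mp_with_index<sizeof...(Ts)>(
                index, [&](auto Idx) {
                    constexpr size_t alternative_index = Idx;
                    avro::decode(d, var.template emplace<alternative_index>());
                }
            );
        }
    };

    } // namespace avro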

On Wed, Jun 30, 2021 at 3:27 PM Dominique Devienne <ddevienne@gmail.com> wrote:
Hi. I'm modernizing some code to use std::variant instead of ad-hoc pseudo-variant structs.
These structs need to be encoded/decoded via avro::codec_traits specializations, but given
that std::variant is variadic, I'm struggling a bit. I've done my research and found Peter's [1], and also looked at [2],
which I thought might work but doesn't, and I don't understand the errors I'm getting from MSVC 2019 in C++17 mode.
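
In case it helps anyone finding this thread later, here is a minimal round-trip sketch using the binary encoder/decoder with a variant specialization like the one above (the alternative types are arbitrary examples, and it assumes the codec_traits<std::variant<Ts...>> specialization is visible, e.g. from an included header):

    #include <cstdint>
    #include <string>
    #include <variant>
    #include <avro/Encoder.hh>
    #include <avro/Decoder.hh>
    #include <avro/Stream.hh>
    #include <avro/Specific.hh>

    int main() {
        std::variant<int64_t, std::string> v = std::string("hello");

        // Encode into an in-memory stream with the schema-less binary encoder.
        auto out = avro::memoryOutputStream();
        auto enc = avro::binaryEncoder();
        enc->init(*out);
        avro::encode(*enc, v);
        enc->flush();

        // Decode it back into a fresh variant from the same bytes.
        auto in = avro::memoryInputStream(*out);
        auto dec = avro::binaryDecoder();
        dec->init(*in);
        std::variant<int64_t, std::string> w;
        avro::decode(*dec, w);
    }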