Set-to-Sequence Methods in Machine Learning: a Review

Abstract

Machine learning on sets towards sequential output is an important and ubiquitous task, with applications ranging from language modelling and meta-learning to multi-agent strategy games and power grid optimization. Combining elements of representation learning and structured prediction, its two primary challenges are obtaining a meaningful, permutation-invariant set representation and subsequently utilizing this representation to output a complex target permutation. This paper provides a comprehensive introduction to the field as well as an overview of important machine learning methods tackling both of these key challenges, with a detailed qualitative comparison of selected model architectures.
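To make the first challenge concrete, the sketch below shows one common way of obtaining a permutation-invariant set representation, in the spirit of Deep Sets: each element is embedded independently and the embeddings are pooled with a symmetric sum operation, so the result does not depend on the order in which the elements are presented. This is a minimal illustration assuming PyTorch, not code from the paper; all class and variable names here are hypothetical.

```python
import torch
import torch.nn as nn

class SetEncoder(nn.Module):
    """Minimal Deep Sets-style permutation-invariant set encoder (illustrative)."""
    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # phi: applied to every set element independently
        self.phi = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        # rho: applied to the pooled, order-independent representation
        self.rho = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, set_size, input_dim); sum pooling over the set dimension
        # makes the output invariant to any permutation of the elements
        return self.rho(self.phi(x).sum(dim=1))

# Sanity check: permuting the set elements leaves the representation unchanged
enc = SetEncoder(input_dim=4, hidden_dim=8)
x = torch.randn(2, 5, 4)
perm = torch.randperm(5)
assert torch.allclose(enc(x), enc(x[:, perm]), atol=1e-5)
```

The second challenge, turning such a representation into an output permutation, is typically handled by pointer-style decoders that repeatedly attend over the input elements; the paper compares several such architectures.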

Publication
Journal of Artificial Intelligence Research
Mateusz Jurewicz
PhD fellow

Mateusz's research focuses on neural set-to-sequence models and catalogue/sequence optimisation.

Leon Derczynski
Associate professor

My research interests include NLP for misinformation detection and verification, clinical record processing, online harms, and efficient AI.