Model developers often use back-translation (BT) to train machine translation (MT) models on target-side monolingual data. BT generates pseudo-parallel data by using a backward model to translate target-language monolingual text back into the source language. Unfortunately, training a good backward model is itself difficult, slow, and computationally expensive. Observing that lower-quality but more diverse pseudo-parallel data yields better translation quality, Pham et al. propose a meta-learning algorithm that generates pseudo-parallel data from a pre-trained back-translation model. The backward model is trained so that the pseudo-parallel data it generates helps the forward-translation model perform well on a validation set. Unlike existing approaches, which keep the trained backward model fixed, the authors continue updating the backward model throughout the forward model's training to improve the forward model's performance.
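The loop below is a minimal, runnable sketch of this training scheme, assuming toy bag-of-words "translation" models and a simple policy-gradient (REINFORCE-style) surrogate for the meta-gradient. The model class, sizes, and reward definition are illustrative assumptions for clarity, not the authors' exact formulation: the backward model samples pseudo-sources for target-side monolingual text, the forward model takes a gradient step on the resulting pseudo-parallel batch, and the backward model is then rewarded by how much that step reduced the forward model's validation loss.

```python
# Sketch of a meta back-translation style loop (illustrative, not the paper's exact algorithm).
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM, LEN = 32, 16, 5  # toy vocabulary size, embedding size, sequence length


class ToyTranslator(nn.Module):
    """Toy stand-in for an MT model: predicts every output position from a bag-of-words encoding."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.out = nn.Linear(DIM, LEN * VOCAB)

    def forward(self, tokens):                   # tokens: (batch, LEN) token ids
        h = self.embed(tokens).mean(dim=1)       # (batch, DIM)
        return self.out(h).view(-1, LEN, VOCAB)  # per-position logits


def nll(logits, tokens):
    """Token-level cross-entropy over a batch of sequences."""
    return F.cross_entropy(logits.reshape(-1, VOCAB), tokens.reshape(-1))


forward_model = ToyTranslator()   # source -> target: the model we ultimately care about
backward_model = ToyTranslator()  # target -> source: generates pseudo-parallel data
fwd_opt = torch.optim.Adam(forward_model.parameters(), lr=1e-3)
bwd_opt = torch.optim.Adam(backward_model.parameters(), lr=1e-3)

# Toy data: a target-side monolingual batch and a small parallel validation set.
mono_tgt = torch.randint(0, VOCAB, (8, LEN))
val_src = torch.randint(0, VOCAB, (8, LEN))
val_tgt = torch.randint(0, VOCAB, (8, LEN))

for step in range(100):
    # 1) Backward model samples pseudo-sources for the monolingual targets.
    bwd_logits = backward_model(mono_tgt)
    dist = torch.distributions.Categorical(logits=bwd_logits)
    pseudo_src = dist.sample()  # (batch, LEN), non-differentiable sample

    # 2) Forward model takes a gradient step on the pseudo-parallel pair.
    with torch.no_grad():
        val_before = nll(forward_model(val_src), val_tgt)
    fwd_opt.zero_grad()
    nll(forward_model(pseudo_src), mono_tgt).backward()
    fwd_opt.step()

    # 3) Reward the backward model by how much that step helped on validation
    #    (a crude REINFORCE surrogate for the meta-gradient).
    with torch.no_grad():
        val_after = nll(forward_model(val_src), val_tgt)
    reward = val_before - val_after  # positive if the pseudo-parallel batch helped
    pg_loss = -(reward * dist.log_prob(pseudo_src).sum(dim=1)).mean()
    bwd_opt.zero_grad()
    pg_loss.backward()
    bwd_opt.step()
```

The key design point the sketch tries to convey is the feedback loop: because the backward model keeps receiving a signal derived from the forward model's validation performance, it adapts its generated data to whatever the forward model currently needs, rather than remaining frozen after pre-training.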