The T5 library, released in October 2019, enables ML practitioners to fine-tune Transformer models pretrained on a large text corpus. Specifically, the T5 framework recasts every language task into the text-to-text format, so a single sequence-to-sequence model can handle translation, summarization, classification, and more. More recently, Google Research has released a new and improved version of this library, T5X, which is implemented in JAX and Flax. With T5X, users can train models at many scales on TPUs.
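To make the text-to-text idea concrete, here is a minimal sketch of how different tasks are recast as (input text, target text) pairs by prepending a task prefix, in the style described in the T5 paper. The helper function and example sentences are illustrative, not part of the T5 or T5X API.

```python
# Sketch of T5's text-to-text recasting: every task becomes a pair of
# strings, with a task prefix on the input. Prefixes follow the style
# used in the T5 paper; the sentences are illustrative examples.

def to_text_to_text(task: str, text: str) -> str:
    """Prepend a T5-style task prefix so one seq2seq model serves all tasks."""
    prefixes = {
        "translate_en_de": "translate English to German: ",
        "summarize": "summarize: ",
        "cola": "cola sentence: ",  # grammatical-acceptability classification
    }
    return prefixes[task] + text

# Translation: the model learns to emit the German sentence as plain text.
translation_input = to_text_to_text("translate_en_de", "That is good.")

# Classification: even labels are produced as text (e.g. "acceptable").
classification_input = to_text_to_text("cola", "The course is jumping well.")
```

Because inputs and targets are always plain strings, the same model, loss, and decoding procedure apply to every task; only the prefix changes.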