Although GPT-3, with 175 billion parameters, achieves state-of-the-art performance on NLP tasks such as text generation, summarization, and conversation, it does not generalize well to other languages, since its training corpus is roughly 93% English. To enable downstream Chinese NLP tasks, researchers from Tsinghua University have open-sourced the Chinese Pre-trained Language Model (CPM). With 2.6 billion parameters trained on 100 GB of Chinese text data, it is the largest Chinese pre-trained language model to date.
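As a rough illustration of how the released checkpoint might be used for a downstream task such as Chinese text generation, here is a minimal sketch based on the Hugging Face transformers library; the model identifier TsinghuaAI/CPM-Generate and the generation settings are assumptions for illustration, not details from the original announcement.

```python
# Minimal sketch: loading an open-sourced Chinese LM checkpoint with
# Hugging Face transformers. The model ID "TsinghuaAI/CPM-Generate" and the
# sampling parameters below are assumptions; CPM's tokenizer also expects the
# jieba and sentencepiece packages to be installed.
from transformers import AutoTokenizer, AutoModelForCausalLM, TextGenerationPipeline

tokenizer = AutoTokenizer.from_pretrained("TsinghuaAI/CPM-Generate")
model = AutoModelForCausalLM.from_pretrained("TsinghuaAI/CPM-Generate")

generator = TextGenerationPipeline(model=model, tokenizer=tokenizer)

# Generate a short continuation from a Chinese prompt ("Tsinghua University").
print(generator("清华大学", max_length=50, do_sample=True, top_p=0.9))
```

Note that a 2.6-billion-parameter checkpoint occupies roughly 10 GB in full precision, so in practice one would typically load it in half precision or on a GPU.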