SOTA deep learning models are getting bigger (for example, the largest Facebook SEER model, described above, has 10 billion parameters), so model developers need better tools for efficient training. Alibaba has released EasyParallelLibrary, which lets developers apply annotations that automatically transform a local model into a distributed implementation. EasyParallelLibrary decouples algorithm modeling from system design, so users can apply different parallelism strategies (e.g., data, pipeline, and tensor model parallelism) with just a few lines of code. It also includes memory-saving techniques and an optimized communication library for greater scalability and efficiency.
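To make the annotation idea concrete, here is a minimal, self-contained sketch of annotation-driven data parallelism. The names (`replicate`, `train_step`) and the thread-based "devices" are hypothetical illustrations, not EasyParallelLibrary's actual API: a decorator splits each batch across simulated devices, runs the unchanged local step on every shard, and gathers the results, leaving the model code itself untouched.

```python
# Conceptual sketch of annotation-driven data parallelism (hypothetical names,
# not EasyParallelLibrary's real API): the decorator shards the batch, runs the
# local step per shard in parallel, and concatenates the outputs.
from concurrent.futures import ThreadPoolExecutor

def replicate(device_count):
    """Annotation: run the wrapped local step on `device_count` shards."""
    def wrap(step_fn):
        def parallel_step(batch):
            # Split the batch into one shard per simulated device.
            shard = max(1, len(batch) // device_count)
            shards = [batch[i:i + shard] for i in range(0, len(batch), shard)]
            with ThreadPoolExecutor(max_workers=device_count) as pool:
                results = list(pool.map(step_fn, shards))
            # Gather: concatenate the per-device outputs.
            return [y for part in results for y in part]
        return parallel_step
    return wrap

@replicate(device_count=4)
def train_step(batch):
    # The local model code is unchanged; a stand-in computation here.
    return [x * 2 for x in batch]

print(train_step(list(range(8))))  # [0, 2, 4, 6, 8, 10, 12, 14]
```

The key design point mirrored here is that the annotation carries the parallelism strategy, so switching strategies means changing the decorator arguments rather than rewriting the model.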