To protect sensitive user information while still enabling sophisticated modeling and analysis, many companies are exploring differential privacy (DP). To support the adoption of DP in the machine learning community, Facebook has released Opacus, a scalable, high-speed library for training PyTorch models with DP. Opacus is designed to preserve the privacy of each training sample without significantly compromising the accuracy of the final model. It does so through an algorithm based on differentially private stochastic gradient descent (DP-SGD), which clips each sample's gradient and adds calibrated, zero-mean (unbiased) noise to the aggregated gradients at every iteration, so that the model cannot memorize individual training examples. To train models with DP, Opacus users simply attach the PrivacyEngine abstraction, which clips and noises gradients during training and tracks how much of the privacy budget (a core accounting concept in differential privacy) has been spent.
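
To make this concrete, here is a minimal sketch of what a DP training loop looks like with Opacus, assuming the Opacus 1.x `PrivacyEngine.make_private` API; the model, data, and hyperparameter values are illustrative placeholders, not recommendations:

```python
import torch
from torch import nn, optim
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Toy model and synthetic data; shapes are illustrative only.
model = nn.Sequential(nn.Linear(10, 2))
optimizer = optim.SGD(model.parameters(), lr=0.05)
dataset = TensorDataset(torch.randn(256, 10), torch.randint(0, 2, (256,)))
loader = DataLoader(dataset, batch_size=32)

# Wrap the model, optimizer, and data loader so that every optimizer
# step clips per-sample gradients and adds calibrated Gaussian noise
# (the DP-SGD algorithm described above).
privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.0,  # noise std relative to the clipping norm
    max_grad_norm=1.0,     # per-sample gradient clipping threshold
)

# The training loop itself is unchanged from ordinary PyTorch.
criterion = nn.CrossEntropyLoss()
for x, y in loader:
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

# Query how much of the privacy budget (epsilon) has been spent
# so far, at a chosen delta.
epsilon = privacy_engine.get_epsilon(delta=1e-5)
print(f"privacy spent: epsilon = {epsilon:.2f} at delta = 1e-5")
```

Note that the wrapped training loop is identical to standard PyTorch: the PrivacyEngine intervenes inside `loss.backward()` and `optimizer.step()` to compute per-sample gradients, clip them, and add noise, while its accountant records the privacy cost of each step.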