Recently, the ML community has expressed increased interest in causal models, including for applications like reinforcement learning, fairness, safety, and other use cases where explanations are important. Although researchers often use causal Bayesian networks (CBNs) to represent causal dependencies, these models cannot capture context-specific causal dependencies, i.e., cases where one variable influences another only under certain values of a third. Genewein et al. discuss discrete probability trees as an alternative to CBNs. By presenting concrete algorithms for causal reasoning in finite probability trees that cover conditions, interventions, and counterfactuals, they expand the domain of causal reasoning to a general class of discrete stochastic processes.
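To make the idea concrete, here is a minimal sketch of a discrete probability tree with conditioning and a simple intervention. This is not the authors' implementation: the `Node`, `leaves`, `condition`, and `intervene` names are illustrative, and the intervention shown is a simplified version that reroutes probability mass where a variable is resolved (the paper's algorithms operate on min-cuts over the tree, which this sketch does not implement).

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    statement: dict                                  # assignments made at this node, e.g. {"X": 1}
    children: list = field(default_factory=list)     # list of (probability, Node) pairs

def leaves(node, prob=1.0, env=None):
    """Enumerate (probability, total assignment) pairs over root-to-leaf paths."""
    env = {**(env or {}), **node.statement}
    if not node.children:
        yield prob, env
        return
    for p, child in node.children:
        yield from leaves(child, prob * p, env)

def condition(tree, event):
    """Observational query: keep paths consistent with `event`, then renormalize."""
    kept = [(p, env) for p, env in leaves(tree)
            if all(env.get(k) == v for k, v in event.items())]
    total = sum(p for p, _ in kept)
    return [(p / total, env) for p, env in kept]

def intervene(node, var, value):
    """do(var := value), simplified: where `var` is resolved among the children,
    route all probability mass to the branches consistent with the intervention."""
    if node.children and any(var in c.statement for _, c in node.children):
        forced = [c for _, c in node.children if c.statement.get(var) == value]
        node.children = [(1.0 / len(forced), c) for c in forced]
    for _, child in node.children:
        intervene(child, var, value)

# Example tree: X influences Y only when X = 1, a context-specific
# dependency that a single CBN graph cannot express.
root = Node({}, [
    (0.5, Node({"X": 0}, [(1.0, Node({"Y": 0}))])),
    (0.5, Node({"X": 1}, [(0.2, Node({"Y": 0})), (0.8, Node({"Y": 1}))])),
])

print(condition(root, {"Y": 1}))   # condition on observing Y = 1
intervene(root, "X", 1)            # intervene: do(X := 1)
print(list(leaves(root)))          # distribution after the intervention
```

Note how the tree encodes the context-specific structure directly in its branches: under `X = 0` the node for `Y` has a single deterministic child, while under `X = 1` it splits stochastically, which is exactly the kind of dependency the tree representation makes explicit.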