Research on “lottery tickets” suggests that neural network subnetworks (“winning tickets”) can be trained in isolation to match the performance of the original network. This research indicates that neural network performance hinges on the lucky random initialization of these winning tickets; consequently, it implies that randomly initialized networks must be large to make it likely that they contain a winning initialization. Springer and Kenyon evaluate this phenomenon: they study how weight initialization and the distribution of training data affect a network’s ability to learn, by examining how effectively small neural networks can predict future steps of Conway’s Game of Life, a 2D cellular automaton. Through this exploration, they find that networks require substantially more parameters than anticipated to converge. They also discover that the probability of convergence is highly sensitive both to small perturbations of the initial weights and to properties of the training data.
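For reference, the function the networks are asked to learn is simple to state: each cell's next state depends only on its current state and its eight neighbors. A minimal NumPy sketch of one Game of Life step (assuming a toroidal grid that wraps at the edges; this is an illustration, not the authors' code) looks like:

```python
import numpy as np

def life_step(grid: np.ndarray) -> np.ndarray:
    """One step of Conway's Game of Life on a 2D binary grid (toroidal wrap)."""
    # Count each cell's eight neighbors by summing shifted copies of the grid.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # A cell is alive next step if it has exactly 3 live neighbors,
    # or if it is alive now and has exactly 2 live neighbors.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(grid.dtype)

# Example: a "blinker" oscillates between a horizontal and a vertical bar.
grid = np.zeros((5, 5), dtype=int)
grid[2, 1:4] = 1           # horizontal bar of three live cells
after = life_step(grid)    # vertical bar; one more step restores the original
```

Because this update is a local 3x3 rule, it fits naturally into a small convolutional network in principle, which is what makes the paper's finding that small networks struggle to learn it noteworthy.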