To support Project Lighthouse, an initiative to ensure more equitable outcomes for guests and hosts on Airbnb’s platform, the company released a new system designed to measure discrepancies in user experience caused by discrimination and bias. The system uses anonymized perceived-race data that is not linked to individual Airbnb accounts, both to protect users’ privacy and because discrimination is often based on perception of race rather than self-identified race. In the first post of a two-part series, Skyler Wharton explains how Airbnb uses p-sensitive k-anonymity to calculate acceptance rates by guests’ perceived race while preserving user privacy. P-sensitive k-anonymity mitigates the risk of attribute disclosure, reducing the chance that a user’s perceived race is exposed to bad actors. Wharton also explains why Airbnb uses an external research partner, rather than an ML system, to label perceived race.
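To make the privacy property concrete, here is a minimal sketch of what a p-sensitive k-anonymity check looks like in general: every equivalence class (the set of records sharing the same quasi-identifier values) must contain at least k records and at least p distinct values of the sensitive attribute. The field names and toy data below are illustrative assumptions, not Airbnb's actual schema or implementation.

```python
from collections import defaultdict

def satisfies_p_sensitive_k_anonymity(records, quasi_ids, sensitive, k, p):
    """Return True if the dataset satisfies p-sensitive k-anonymity:
    each group of records sharing the same quasi-identifier values has
    >= k records (k-anonymity) and >= p distinct sensitive values
    (p-sensitivity), limiting attribute disclosure."""
    groups = defaultdict(list)
    for record in records:
        key = tuple(record[q] for q in quasi_ids)  # equivalence class key
        groups[key].append(record[sensitive])
    return all(
        len(values) >= k and len(set(values)) >= p
        for values in groups.values()
    )

# Hypothetical example: "market" is a quasi-identifier, "perceived_race"
# is the sensitive attribute (labels here are placeholders).
records = [
    {"market": "west", "perceived_race": "A"},
    {"market": "west", "perceived_race": "B"},
    {"market": "west", "perceived_race": "A"},
]
print(satisfies_p_sensitive_k_anonymity(
    records, ["market"], "perceived_race", k=3, p=2))  # → True
```

With k=3 and p=2 the single "west" group passes (3 records, 2 distinct sensitive values); raising p to 3 would fail it, because all members of the class would need to span at least 3 distinct perceived-race labels before any statistic about the class could be released.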