App Store review insights from Twitter's former head of trust and safety

Yoel Roth, who stepped down this month as Twitter's head of trust and safety, recently published an op-ed in The New York Times detailing his experiences with content moderation at the company. One passage describes the challenge of complying with Apple's App Store review process, whose enforcement can at times seem arbitrary and capricious.

Apple's guidelines for developers are reasonable and plainly stated: they emphasize creating "a safe experience for users" and stress the importance of protecting children. The guidelines quote Justice Potter Stewart's "I know it when I see it" quip, saying the company will ban apps that are "over the line."

In practice, the enforcement of these rules is fraught. Roth writes:

In my time at Twitter, representatives of the app stores regularly raised concerns about content available on our platform. On one occasion, a member of an app review team contacted Twitter, saying with consternation that he had searched for "#boobs" in the Twitter app and was presented with ... exactly what you'd expect. Another time, on the eve of a major feature release, a reviewer sent screenshots of several days-old tweets containing an English-language racial slur, asking Twitter representatives whether they should be permitted to appear on the service.

Reviewers hint that app approval could be delayed or perhaps even withheld entirely if issues are not resolved to their satisfaction -- although the standards for resolution are often implied. Even as they appear to be driven largely by manual checks and anecdotes, these review procedures have the power to derail company plans and trigger all-hands-on-deck crises for weeks or months at a time.