Toward Best Practices around Online Content Removal Requests
In the Spring of 2022, the Berkman Klein Center hosted a nine-week Research Sprint in collaboration with the Center’s longest-running project, Lumen, with the goal of developing a set of concrete best practices for transparency regarding content takedown requests sent to Online Service Providers (OSPs).
(Illustration by dpict.)
The Sprint was motivated by the experience that the Lumen project has had over the past two decades running the Lumen Database—a repository of content takedown requests sent to OSPs. Funded by Arcadia, a charitable fund of Lisbet Rausing and Peter Baldwin, Lumen partners with Google, YouTube, Twitter, Wikipedia, Medium, Automattic, and other OSPs to aggregate content takedown requests received by these OSPs. The project’s database offers opportunities to researchers, scholars, academics, journalists, and others to study patterns in content takedown requests. Various OSPs voluntarily share copies of takedown requests with Lumen, subject to each OSP's internal guidelines and shifting priorities.
The lack of standardized transparency around takedown requests leads not only to opacity about takedowns generally, but also to difficulty in understanding how takedown processes may be misused by individuals, organizations, and powerful institutions. Hence, the focus of this Research Sprint was the creation of best practices for takedown transparency.
A cohort of sixteen early- to mid-career professionals from over twelve countries across five continents was selected to participate in this Sprint. Over the course of the Sprint, speakers including Daniel Seng, Brandon Butler, evelyn douek, and a representative of Google’s Trust and Safety team shared their views and expertise on content takedown and transparency through the lens of their work in academia, civil society, activism, and with OSPs. Drawing on each participant’s experience in areas related to platform governance, the cohort collectively explored the global notice and takedown environment and contributed to ongoing efforts to standardize content moderation and transparency best practices through two outputs.
The first output was a working draft of a Statement of Best Practices (SOBP) with respect to transparency regarding takedown requests that OSPs receive from external parties. While there are a few indispensable and effective best practices for online transparency and content moderation, such as the Santa Clara Principles, there is a lack of industry standards among OSPs regarding content takedown request transparency, i.e. clarity regarding who sends content removal requests to OSPs and why. The SOBP output was envisioned as a first step in a process of creating best practices for transparency concerning content takedown requests, while acknowledging and balancing tensions such as the data privacy of users, content sensitivity, and user safety.
The second output was supplementary to the first—a documentation of the process of creating an SOBP, grounded in the experience creating one within the Sprint. This process documentation was envisioned as supporting mindful building of the draft SOBPs and was also intended as a "how-to" for creating future SOBPs—a starting point for communities looking to engage in creating SOBPs of their own.
To harness the cohort’s wide breadth of individual experience and knowledge most effectively, the participants were initially divided into four small groups to think through what they considered essential elements of draft SOBPs. In the latter half of the Sprint, the four groups were merged into two larger “supergroups” so that the participants could strengthen their initial ideas through the addition of new perspectives and opportunities for deliberation.
Identifying relevant stakeholders and considering tradeoffs
The first group focused on visually mapping critical stakeholders and their competing transparency interests. The system mapping was intended as groundwork for the best practices community when considering the stakeholders involved, their role in the community, and their impact. They also accounted for the goals and tradeoffs to consider when building the best practices.
Systematically documenting the interests of each stakeholder revealed the need for best practices to be tailored to the specific needs, incentives, and capacities of each entity (e.g., government, academic researchers, platform end-users, advocacy organizations, etc.). Similarly, the mapping of priorities highlighted the various tradeoffs to be considered when making transparency decisions at a granular, notice-by-notice level rather than in a “big data” aggregate, such as how much detail can be shared about an individual takedown request and the actions taken.
The first group’s final recommendation prioritized an approach to transparency premised on a framework that balances competing rights, weighing factors such as the type and sensitivity of the content and the originator of the request. For instance, when categorized by sender, notices sent by governments were deemed to be of high public interest. When categorized by content type, however, material like non-consensual intimate imagery will generally have relatively lower public-interest value for granular transparency and carry higher risks for those involved. What makes this balance especially difficult is that, although varying levels of transparency can be predicated on notice senders and notice types, the two categories may overlap rather than being mutually exclusive.
Overall, the recommendations focused on the need for clarity regarding the decision-making process behind takedown, including the takedown criteria, and advocated for further regulation of automated decision-making processes.
(Visual mapping of the stakeholders and priorities involved (Supergroup blue).)
While the first group mapped relevant stakeholders, drafted general recommendations, and identified the need for specific best practices for each stakeholder, the second group engaged deeply with one specific stakeholder: researchers. Their focus on the research community stemmed from the recognition that researchers may be underrepresented in existing transparency best practices for OSPs. The planned merger of these two groups’ visions led to a draft SOBP that used the visual stakeholder mapping to lay out best practices for takedown transparency from researchers’ point of view, including recommendations for providing notice details such as the type of content requested to be removed and information about the action taken by the OSP on each takedown request.
We tried not to view transparency as necessarily a good or a bad thing, or as an intrinsic value in itself, but more as a springboard to hold OSPs accountable.
–Snigdha Bhatt, Sprint participant
Phased procedural transparency regarding content removal decisions
The third group of Sprint participants approached takedown transparency through a three-pronged system, creating an overarching framework for content moderation decisions with three phases: pre-decision, decision, and post-decision.
This group also focused on identifying the disparate incentives and interests that shape transparency decisions, some of which strongly overlapped with the concerns that came up in other groups, such as human rights for internet users and privacy-related concerns. Their recommendations considered the size of different OSPs – determining that smaller platforms should be held to less onerous standards than the biggest market players – as well as the various jurisdictions that platforms operate in. The group recommended clarity and uniformity amongst OSPs’ Terms of Service and amongst jurisdictions in which they operate with respect to the definitions of terms such as “hate speech,” “defamation,” and others that are the basis for takedown requests.
Group four added to group three’s phased, three-pronged system by offering its own three-pronged approach, which blended seamlessly into the supergroup’s draft SOBP. First, group four recommended general best practices, such as publishing regular reports regarding takedown transparency. Second, the group suggested that OSPs publish whether a content takedown decision based on a notice was automatic or subject to human review, with additional information concerning the adjudication process and any balancing by reviewers.
Finally, group four also offered recommendations in the form of operational best practices to increase the accessibility of transparency reports, such as offering translations, providing use cases or examples, and supplying metrics suitable for non-expert audiences.
(Three-tiered phased transparency (Supergroup red).)
Scoping the land of takedown transparency and balancing tensions
Even though the resulting SOBPs differed in their approaches, the groups largely converged on the necessity of adopting a “balancing” framework to navigate the tensions that inevitably emerge between transparency and competing rights or interests. The global reach of the notice and takedown framework also militates against a singular, universal approach: jurisdiction, the scale of the company involved, and the specific facts and context of a notice can each affect the appropriate course of action with regard to transparency.
For example, participants largely agreed that it may be difficult to reconcile the norms of one jurisdiction with those of another, given that most countries have their own laws regarding content removal and their own definitions of the categories of content that warrant removal by OSPs, such as hate speech and defamation. This tension is exacerbated by the variety of governments globally. While increased transparency may be acceptable in certain (often more liberal) nations, others (especially increasingly authoritarian ones) require that takedown requests made by the government remain opaque and that OSPs remain discreet about the actions taken as a consequence of such requests.
Further, OSPs operate at varying scales, and while certain granular transparency requirements may be too onerous for smaller OSPs, they may be practical and feasible for larger ones to provide. The participants largely concurred with the model proposed in the EU’s Digital Services Act, where OSP transparency mandates depend on the number of users and the scale of operations.
The balance between transparency requirements and user privacy was also identified as a delicate one, since greater transparency may expose potentially sensitive personal data or de-anonymize a user. A proposed resolution came in the form of a tiered access system for takedown requests, where access is granted only temporarily and only after satisfactory verification of bona fide research intentions in the public interest.
While a few groups sought to resolve these tensions by focusing on minimum common denominators in the form of general takedown transparency best practices for all OSPs, others focused on a single stakeholder, such as researchers, and engaged more deeply with the complex best practices that would meet their needs.
Any statement of best practices is incomplete without a broad and meaningful stakeholder engagement as best practices are informed by human rights and pluralistic perspective, so it is imperative to be very intentional with each best practice.
–Eren Sözüer, Sprint participant
What's next?
With its unofficial motto "good data informs good policy," Lumen attempts to ensure content takedowns do not happen in the dark. This effort to enable meaningful transparency was also the guiding principle behind the outputs of this Sprint.
The draft SOBPs are iterative and will serve as a starting point for a longer consultative process that Lumen will spearhead with the support and insight of its partnering OSPs and other interested stakeholders.
The hard work of the Sprint cohort will be the seed that Lumen and the cohort members can plant in their respective networks, nurture, and grow into something meaningful that perhaps the entire global ecosystem of takedown transparency could take advantage of.
–Adam Holland, Project Manager, Lumen Project
What started out as an exercise in creating a unique set of best practices in the previously uncharted territory of takedown transparency has now resulted in working drafts of best practices that individuals and organizations globally can continue to build on. While the effort to turn the draft SOBPs into adoptable best practices continues, we are eager for interested stakeholders with relevant expertise to get involved with this work and push it forward. If you are one of them, start by reaching out to us at team@lumendatabase.org.
The Sprint’s living syllabus and lightning presentations are available as additional resources for more detailed insight into this Sprint.
(Header image by Scott Macpherson, CC BY-SA 2.0.)