Statement of Best Practices towards Takedown Transparency
Jointly prepared by Working Groups 3 and 4 (Supergroup Red), as part of the Research Sprint on Takedowns and Transparency: Global Norms, Regulation and the Nature of Online Information, hosted by the Berkman Klein Center for Internet & Society at Harvard University.
Team members/Authors:
- Berdien Bernarda Erika van der Donk (Working Group 3)
- Charles Culioli (Working Group 3)
- Eren Sözüer (Working Group 4)
- Inika Serah Charles (Working Group 4)
- Snigdha Bhatta (Working Group 4)
- Tavishi Ahluwalia (Working Group 3)
- Torsha Sarkar (Working Group 3)
Table of Contents
Statement of Best Practices towards Takedown Transparency
Foundational Principles of the SOBP
PHASE 1. Transparency on TDRs: Request visibility
PHASE 2. Transparency on OSP takedown procedure: Decision-making visibility
PHASE 3. Transparency post-takedown decisions: Decision visibility
INTERNAL REFLECTIONS ON THE CREATION OF AN SOBP
Reflection on the groups’ process
1.1. Reflection on working group 3’s project
1.2. Reflection on working group 4’s project
The Tripartite Model: General, Phase-wise, and Operational Best Practices
Unresolved Questions and Looking Ahead
Post-mortem improvements: what’s next?
In drafting this statement, the attempt was to build on the foundational principles relied on by existing frameworks and institutions. Along with universally accepted principles such as the Universal Declaration of Human Rights and the United Nations Guiding Principles on Business and Human Rights, as well as instruments with industry buy-in such as the GNI Principles on Freedom of Expression and Privacy and mechanisms for algorithm accountability, such as those of the Algorithm Accountability Agency Framework (A3 Framework), further attention might need to be paid to the following principles:
- Due Process: The overarching foundational principle is that of Due Process, one that should permeate each stage of the content takedown process.
- Predictability: OSPs should aim to adopt rules that create a sense of predictability for users. Creating a predictable environment would entail clearly defining each and every act that is considered a violation by the platform.
- Explainability: OSPs should publish clear, accessible, and readable transparency reports at frequent time intervals. Explainability would also entail that these reports be made available in the language(s) of each country, so that takedown and appeals decisions reflect the language, culture, political and social context, as well as the legal framework of the country in which content is removed.
- Stakeholder-specificity: Transparency should be made context-specific, i.e. a maximum transparency approach to government requests, but a minimum transparency approach to content involving CSAM, NCII, etc.
- Accessibility: Takedown notices should not only be brought within closer public view; to ensure actual accountability, users and relevant stakeholders should also have access to details of globally applicable or jurisdiction-specific local laws relevant to the specific content, details of any formal or informal working relationships and/or agreements the company has with state actors when it comes to flagging content or accounts or any other action taken by the company, and other granular information relevant to assessing accountability.
- Contestability: A robust grievance redressal mechanism should be in place to allow alleged infringers to file counter notices. A fixed time frame must be set within which OSPs should respond to such users, with a recourse to further appeal.
In the pre-decision phase:
The pre-decision phase covers the period before an OSP takes a takedown decision and focuses on the information on permissible content and behavior that is available to the users of OSP services. Before a takedown decision is taken, users must be transparently informed of the rules applicable to the platform. The terms of service of any OSP service should clearly state the types of content that will be removed and the types of behavior that can lead to (permanent) exclusion from the platform.
In the takedown phase:
The takedown phase deals with the internal mechanisms through which OSPs process TDRs and arrive at decisions. It spans the time between the receipt of a TDR and the OSP's ultimate action. Important components of this phase include the composition and constitution of internal institutions and organizational hierarchies, resource allocation, internal guidelines on dealing with different categories of TDRs, internal deliberations on decisions, the extent of use and efficacy of automated tools, efficacy and training of human reviewers, etc. Consequently, meaningful transparency in this phase should allow decision-making processes to be assessed against fairness and non-discrimination, consistency, and predictability. Similarly, internal institutions should be transparent to enable representation, especially of historically marginalized communities.
In the post-decision phase:
The post-decision phase covers the period after the takedown decision and focuses on the transparency requirements that allow checks-and-balances. At this stage of the takedown process, respecting due process means granting users a right to appeal. A balance of interests has to be struck between the right to privacy and transparency on the right to appeal. Transparency also ensures that due process is respected. As a result, transparency in this phase should include redress mechanisms, such as appeals both by OSPs and other stakeholders, as additional checks on takedowns.
Applicability
The best practices apply whether OSPs take down content proactively or reactively, as long as the takedown process is required by external actors or triggered by external rules. Unless otherwise indicated, the best practices apply to all OSPs, irrespective of size or type. Best practices marked out with a (🔺) only apply to Large Scale OSPs, as defined in the glossary. OSPs that do not qualify as such are strongly encouraged to comply with these obligations. Best practices marked out with a (●) pertain to transparency afforded to researchers, as defined under the best practice, “Researcher Access.”
Methodology
PART I. GENERAL BEST PRACTICES
These best practices are applicable across all phases or tiers. They aim to answer questions regarding what is removed, the norms applicable to removals, and how instances of abuse of the takedown notice mechanism are tackled.
PART II. PHASE-WISE BEST PRACTICES
These best practices are specific to the phase in the takedown process: (1) Transparency on the TDRs themselves; (2) Transparency on the decision-making procedure on how the TDR is handled; and (3) Post-hoc transparency, or transparency on practices after the decision on the TDR has been made by the OSP.
PART III. OPERATIONAL BEST PRACTICES
The focus of these best practices is to ensure that the transparency report is readable, periodic, and drafted in a manner that sets out the data in the most user-friendly format possible; that the object of scrutiny can be seen by the relevant parties; and that stakeholders can observe the conduct of those that govern.
Legend
Symbol | Meaning |
🔺 | Applicable for Large Scale OSPs |
● | Part of Researcher Access |
STATEMENT OF BEST PRACTICES
I. General Best Practices
- Providing information on norms/rules applicable to TDRs
- Separate reporting for different categories of TDRs
- Jurisdiction-specific reporting
- Countering misuse of TDR procedures 🔺●
- Providing information on OSP’s policies regarding specific requesters 🔺●
Segregating reports, or providing separate sections to address different types of TDRs received and processed, is a good practice that will allow the data released by OSPs to be better appreciated. Different types of TDRs, such as copyright, government requests, and RTBF requests, involve different considerations. The provision of segregated information would be of great help to users and researchers alike.
Keeping in mind that the takedown processes and numbers for different jurisdictions may differ widely, OSPs should provide jurisdiction-specific data. This would not only demonstrate the number of countries from which TDRs are received, but also bring to light other insights, such as which countries' governments are more active in the removal of content from the internet. This could include:
- A list of countries from which TDRs are received.
- The number of TDRs that are received from each country.
- Providing information on measures taken to prevent or counter abuse of TDRs in general, as well as measures that relate to specific takedown procedures. Such measures should be supported by sufficiently detailed explanations of representative cases.
- Good practice: In the copyright delistings transparency report, Google indicates that they reject TDRs which, upon investigation, are found to be deceptive.[1]
- Providing data on the internal systems in place to identify bot accounts that send automated TDRs.
- The requesters for whom such special policies are in place, along with examples of such trusted flaggers and reporting agencies.
- The procedures in place for TDRs sent by all such requesters.
- OSPs should publish aggregate data on all TDRs, which should be segregated based on the category of request. For example, requests pertaining to copyright, the RTBF, legal demands, and local laws should all be reported on separately.
- Categories of requests should be as refined as possible, depending on the type of service and product. For instance, the RTBF and defamation requests both pertain to local laws, but should be reported on separately. In such cases data should be provided for both the broader category and sub-categories.
- The following categories, as a minimum, are recommended: copyright, trademark, counterfeit, legal demands, local laws, RTBF, defamation, CSAM, NCII, emergency requests (a minimal illustrative sketch of such segregated reporting appears after this list).
- OSPs should define and explain the scope of each category, referring to applicable laws or policies, and provide representative examples of requests.
- OSPs should report on the normative basis of TDRs (e.g. DMCA, GDPR, German Network Enforcement Law), both in the aggregate and in relation to each TDR type.
- In addition to categorizing requests based on type, large-scale OSPs should report on the issues that TDRs relate to, e.g. drug abuse, terrorist content, national security, both in the aggregate and in relation to each TDR type. This should include the total number of TDRs per issue along with sufficiently detailed and intelligible explanations of the scope of each issue. 🔺●
- TDRs submitted by individuals:
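To make the category-segregated, jurisdiction-specific reporting recommended above more concrete, the sketch below shows one possible way such aggregate data could be structured in a machine-readable transparency report. It is a minimal illustration only: the field names, categories, and figures are hypothetical assumptions, not a schema prescribed by this SOBP.

```python
# Minimal sketch (hypothetical schema and numbers): one way an OSP might
# structure category-segregated, jurisdiction-specific aggregate TDR data
# in a machine-readable transparency report.
from dataclasses import dataclass, asdict
import json


@dataclass
class CategoryReport:
    category: str            # e.g. "copyright", "RTBF", "defamation"
    parent_category: str     # broader bucket, e.g. "local laws"
    normative_basis: str     # e.g. "DMCA", "GDPR", a national law
    jurisdiction: str        # country code of the requesting jurisdiction
    requests_received: int   # total TDRs received in the reporting period
    requests_actioned: int   # TDRs that led to removal/restriction
    requests_rejected: int   # TDRs rejected (including abusive/deceptive ones)


report = [
    CategoryReport("RTBF", "local laws", "GDPR", "FR", 1200, 640, 560),
    CategoryReport("defamation", "local laws", "national defamation law", "FR", 300, 90, 210),
    CategoryReport("copyright", "intellectual property", "DMCA", "US", 45000, 41000, 4000),
]

# Publishing both sub-category and parent-category figures, per jurisdiction,
# lets users and researchers aggregate or drill down as needed.
print(json.dumps([asdict(r) for r in report], indent=2))
```

Reporting at both the sub-category and parent-category level, as in this sketch, would let readers reconcile broader totals (e.g. "local laws") with the finer-grained breakdowns recommended above.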
II. Phase-wise Best Practices
PHASE 1. Transparency on TDRs: Request visibility
- Segregating and defining the categories of TDRs
- Good practice: Twitter’s division between “legal demands” and “local laws”[2]
For Large Scale OSPs, this should include the total number of TDRs per law along with sufficiently detailed and intelligible explanations of the law. 🔺●
- Good practice: Google’s report “”
- Reporting on requester profile
- As a rule, only high-level, aggregate data should be published, and individuals sending the TDRs should not be identified. Only the percentage and total number of requests by individuals should be reported.
- Where applicable, and depending on the type of request, individual requests may be broken down into categories. This would depend on the status of the individual, for instance, in the case of a public figure and an RTBF request. 🔺●
- If the requester is a public figure or the request pertains to an issue of public interest, the identity of the requester should be reported except where such reporting is outweighed by other interests (e.g. privacy or limitations imposed by local law). 🔺●
- TDRs submitted by non-individuals:
- As a rule, names of non-individual requesters, e.g. state/government, NGO, trusted flagger, reporting agency, and corresponding percentage and number of submitted TDRs should be reported (except where such reporting is prohibited by limitations under local law).
- If a request is from the government, the specific agency from which the request originated should be reported, e.g. judiciary, military, law enforcement.
- With regard to converging requester profiles, a distinction should be made depending on the origin of the request. For example, if the TDR was originally triggered by an individual’s demand, it should be primarily classified as such, irrespective of the entity submitting the request to the OSP. When there is such a convergence, this should be reflected in the data.
- Reporting on the object of the TDR
- Reporting on specialized policies
PHASE 2. Transparency on OSP takedown procedure: Decision-making visibility
- Reporting on initiation of takedown for Large Scale OSPs 🔺●
- Reporting on actions that may be taken
- Good practice: Distinctions made by Google (“remove v. block”), TikTok (“remove v. restrict”), and Twitter (“removal, restriction, disable access, cease communicating, cease making available”).
- Good practice: Google’s “highlights” in the RTBF transparency report[3]
- Granular reporting on actions taken
OSPs should report aggregate data on actions taken pursuant to TDRs, both in percentage and total numbers, broken into request categories. This report should also include cases where a TDR is rejected.
- Providing information on the make-up of the team of decision-makers ●
- What decision-making directives are in place to help committee members make accurate and consistent decisions
- Whether such decisions are consensus-based or vote-based, or whether one member of the team is given a veto
- Reporting on standard checks and operational guidelines in place to review the TDRs
- OSPs should report on the standard operating procedures in place to determine if the TDR submitted complies with all the procedural or formal requirements, e.g. checking if government notices are duly authorized.
- OSPs should report the checks (if any) in place to evaluate the substantive aspects of the notice, e.g. analyzing the government takedown notices that are in conflict with local law/user guidelines of the platform/international human rights, or checking if DMCA notices target content permitted under fair use.
- Reporting on means of reviewing TDRs
- Scope of each method and accompanying safeguards, if any.
- Information on the type of content that is sent automatically to automated takedown mechanisms versus content that is sent to human reviewers; how it is decided to send a TDR to a human reviewer versus to an automated tool. 🔺●
- OSPs should report the ratio of each method used per request category and the outcome. ●
- Providing demographic data on members of the decision-making board would allow users and researchers to appreciate the data on how TDR decisions are made, and give context for why certain content was or was not removed. Accordingly, OSPs should release aggregate data on:
- Reporting obligations regarding human reviewers for Large Scale OSPs 🔺●
- Number of staff employed in a jurisdiction for content review
- Qualifications of staff (spoken languages, etc.)
- Location of staff
- Other particulars, such as age, nationality, race, and gender of staff
- Whether human reviewers are outsourced to third-party firms and if yes, what working conditions, along with training, exist to facilitate the moderation of content.
- OSPs should also release data on: 🔺●
- The selection process pertaining to the make-up of the decision-making board and whether the members of the decision board are the same as those of the appeal board.
- What processes are in place to help moderators make consistent and “accurate” decisions, especially in borderline/ambiguous cases.
- The extent to which human oversight is documented and what mechanisms exist to provide better training to human moderators, as well as the type of internal support in place to assist them in making consistent decisions.
- OSPs should publish a list of all types of automated tools (e.g. filters, digital hashes) that may be used to detect, review, or take down content.
- OSPs should publish sufficiently detailed information/explanations on automated tools. This should include, at the minimum:
- Between jurisdictions: How are TDRs from jurisdictions with no on-ground platform presence prioritized and handled?
- Between categories of notices: Are certain categories of TDRs prioritized over others, e.g. government notices or court orders?
- Between categories of flaggers: Are notices by trusted flaggers prioritized over other users?
- Reporting obligations for automated review tools for Large Scale OSPs 🔺●
- How the tools work
- Whom the tools are available to
- Development process of the tool, e.g. whether there was input from civil society
- How regionally, demographically, linguistically diverse the data are and what kind of results the automated models generate in response to such diverse notices
- Good practice: Explanation of Microsoft’s PhotoDNA
- Length of decision-making process for Large Scale OSPs 🔺●
- Reporting on prioritization of TDRs for Large Scale OSPs 🔺●
- Reporting on geographical scope of takedown
- Additional disclosure mandates for large-scale OSPs 🔺●
- OSPs should disclose error rates for both automated and human reviewers.
- OSPs should provide information on the accuracy of takedown processes, which can include granular information on the following (a minimal worked illustration follows this list):
- Data on false positives: content that was labeled infringing/violating but was not actually a violation/infringement.
- Data on true positives: content that was labeled as violating/infringing and was in fact violating/infringing.
- Data on false negatives: content that was labeled as not violating/infringing but actually was a violation/infringement.
- Data on the number of erroneous decisions that were left un-appealed and how frequently OSPs proactively correct such erroneous decisions.
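As a minimal worked illustration of how the accuracy figures named above relate to one another, the sketch below derives an overall error rate, together with precision and recall, from counts of reviewed decisions. The numbers and variable names are hypothetical assumptions for illustration only; the SOBP does not prescribe a particular metric or reporting format.

```python
# Minimal illustration (hypothetical counts): deriving accuracy figures from
# reviewed takedown decisions. "Violating" means the content actually breached
# the applicable rule; "flagged" means the reviewer (human or automated)
# labeled it as violating.
true_positives = 9_400    # flagged as violating and actually violating
false_positives = 600     # flagged as violating but not actually violating
false_negatives = 1_100   # not flagged but actually violating
true_negatives = 88_900   # not flagged and not violating

total_reviewed = true_positives + false_positives + false_negatives + true_negatives

# Overall error rate: share of decisions that were wrong in either direction.
error_rate = (false_positives + false_negatives) / total_reviewed

# Precision: of everything taken down, how much actually violated the rules.
precision = true_positives / (true_positives + false_positives)

# Recall: of everything that violated the rules, how much was actually caught.
recall = true_positives / (true_positives + false_negatives)

print(f"error rate: {error_rate:.2%}, precision: {precision:.2%}, recall: {recall:.2%}")
```

Reporting such figures separately for automated and human review, as recommended above, would allow readers to compare the two and identify where erroneous decisions concentrate.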
PHASE 3. Transparency post-takedown decisions: Decision visibility
- Informing users on takedowns
OSPs should also inform users on whether they may be subject to further action in relation to the takedown, e.g. termination of account.[4]
- Reporting on redress mechanisms for users
- whether users can respond to TDRs, e.g. if any explanation can be provided to defend the content
- how to appeal takedowns
- how such an appeal would be adjudicated
- by whom would it be adjudicated
- how long the user can reasonably expect to wait for the outcome of the appeal
- whether the decision from the appeal is further appealable
- Large Scale OSPs should publish data on the appeals and outcome, broken into request categories: 🔺●
- number of appeals/counter notices received from stakeholders (with separate reporting for separate stakeholders)
- speed at which appeals/counter notices are reviewed
- success ratio of appeals per type of request
- both number and percentage of unsuccessful appeals
- place of origin of appeals
- number/percentage of such appeals originating from takedowns by automated tools or human reviewer
- number of erroneous decisions that were actually not appealed
- data on whether material was removed instantly and later restored in response to a counter-notice, or whether the OSP waited for a counter-notice to be filed.
- On appealing TDRs issued by courts or government agencies, OSPs should report, at a minimum, on the following:
- Clearly outline the reasons for sending a particular set of appealed cases for external review, if any (for instance, to the Facebook Oversight Board), accompanied by the procedure adopted by such an external board while reviewing takedown decisions. 🔺●
- Reporting on appeals by the OSP
- Percentage and total number of appeals made per TDR category
- Reasons for appealing the TDRs
- The appeals process and outcome
- Good practice: Google’s practice of updating original court order notice to include the result of the appeal
- Whether there are processes in place to send appealed notices to a specific decision-making committee/board, and whether this process is random or deliberate.
- If a board is making the decision, what decision-making directives are in place? Is the decision-making consensus-based or vote-based, or is one member of the board given a veto?
- With consideration to data protection and other legal requirements, OSPs should maintain an archive of removed content of public interest as well as that of rejected requests (including the content) and provide researcher access thereto.
- Researchers should be given access to data on the type of content that reviewers (both automated and human) tend to flag as “ambiguous” and the length of time taken to process such content.
- Sharing takedown data with researchers and repositories 🔺●
In favor: The transparency requirements on what is removed by OSPs can serve as a basis to analyze the need for future legislation. For example, when OSPs are open about removing ‘sexually explicit content’ (as with OnlyFans), law makers can pick up the discussion on whether such content is illegal or not. It could also spark a discussion on whether (large-scale) OSPs should only be allowed to remove illegal content and whether a ‘must-carry obligation’ is necessary for non-illegal content.
Potential overarching aim: transparency in this first stage creates legal certainty for all parties.
Create transparency in what can and cannot be removed from OSPs’ services and limit the endlessly broad normative power that OSPs can exert over takedowns on their services.
Can OSPs decide to remove non-illegal content from their platforms based on their terms of service? What requirements should those terms of service have in that case?
This stage does not concern problems with (the enforcement of) ‘illegal content’, as those are governed by law (illegality), but rather content that is removed without a legal basis, such as misinformation, or content the platform itself has decided is ‘harmful’ (such as sexually explicit content on OnlyFans).
Stage 2:
The takedown decision
This stage covers the takedown decision and focuses on the procedural
and organizational requirements for taking a decision to remove content
OSPs:
Against transparency: An OSP’s independence in running its businesses according to profit imperatives will be impacted. More scrutiny can also lead to more criticism and more liability, as every decision made by the platforms can be contested. Finally, more transparency can lead to slower/weaker internal processes due to the high risks attached to decision making. This can increase transparency costs and be difficult for smaller/newer OSP(s) to implement.
In favor: Clearer processes help build trust and legitimacy among user communities. Clearly defined and publicly stated processes for takedown notices can potentially save platforms from taking difficult decisions with long-term public consequences in silos and based on shifting goalposts (like determining what constitutes hate speech/disinformation).
It can also help in initiating a more nuanced conversation and understanding of issues of online speech and building collaborative solutions.
Conflict between these interests: OSPs might want to share some data that give a higher level overview (percentage of decisions taken) but any more procedural transparency can be seen as a threat to the freedom of running their private business.
End-users:
Against Transparency: Editorial Transparency can have the negative implication of platforms taking decisions to please the regulators or the dominant political power in a particular jurisdiction. Potentially, too stringent transparency standards can hamper platform’s ability to take decisions in emergency circumstances. Finally, there can be privacy concerns for users in practices around data-sharing, as even anonymized data can be traced back.
In favor: Transparency in how notices are processed and decisions are taken can provide accountability to users. These will also act as a check against the disproportionate incentives to take down content and act as a check against discrimination where some users/communities become easy targets of takedown.
Transparency standards can also improve the review process behind each takedown notice by increasing the number of human reviewers, building better automated tools and increasing the presence of legal experts/ moderators/ reviewers from different jurisdictions. Finally, transparency will act as a check against:
- Platforms prioritizing certain jurisdictions while maintaining little to no mechanisms/ human reviewers/ automated tools in certain languages and dialects
- Platforms following different mechanisms or taking arbitrary decisions in response to public backlash or political circumstances.
- State interference/ backdoor channels of communication and censorship
Conflict between these interests: Increasing freedom of expression and right to know versus right to privacy
Commercial users:
Against Transparency: Editorial Transparency can have the negative implication of platforms formulating procedures/ policies for takedown that favor bigger players due to business interests.
In favor: Transparency in how notices are processed and decisions are taken can:
- Act as a check against the OSP’s incentives to over-classify content as copyright violations
- Provide better information to raise counter-notices based on fair use
- May also be a step in addressing the power asymmetry between small content creators and powerful players using automated anti-piracy tools.
Researchers:
Against transparency: There might be ethical considerations behind standards that mandate data-sharing, especially for sensitive content that can breach user privacy.
In favor: 💎♛ Provide insights into how/ why platforms make decisions on takedowns. This can lead to better understanding in many areas including:
- Which users are more likely to be targeted in which categories of TDRs?
- What kind of content is more likely to be taken down?
- Whether the content taken down is actually illegal?
- Are laws empowering governments to censor content being misused?
Trusted flaggers:
Against transparency: Presence of procedural transparency reduces the otherwise disproportionate incentives for platforms to take down content.
In favor: More clarity on how content takedown decisions are taken would result in more certainty and predictability with respect to flagged content.
Policy makers:
Against transparency: States might be interested in maintaining the confidentiality of certain state notices, since better transparency might reduce the space for backdoor channels to platforms. It can also potentially expose the state as a party to litigation against platforms’ decisions.
In favor: Transparency might provide checks against arbitrary decision making by platforms. As a result, states can better assess platforms’ compliance with local laws.
Potential overarching aim: Transparency in the second stage can provide an insight into how/ why takedown decisions are taken and also have an effect of improving these internal processes to be more equitable and fair.
- Clearly stated policies on how takedown notices in each category and each jurisdiction are processed.
- How are takedown notices prioritized
- Between Jurisdictions (if there is no on-ground presence for processing notices)
- Between categories of notices (for instance are court/government orders prioritized?)
- What part of the decision-making process is automated, and for what category of notices. Similar granularity for notices that require human intervention/review mechanisms. Under this further information on the following can be provided:
- The error rates of such automated tools.
- The availability of tools across jurisdictions and languages
- If a notice is found to be procedurally incorrect what happens next?
- What checks (if any) does the platform apply to check if the substantive part of the notice is correct?
- Whether the government’s takedown notice is in violation of local law/ user guidelines of the platform/ international human rights?
- What circumstances prompt platforms to make such reviews?
- What processes/ communication channels/ appeal mechanisms do platforms follow in such instances?
- The reasons for the action taken should be recorded clearly and unambiguously (for instance, stating whether the content was in violation of local laws, platform community guidelines, terms of service, or a court order).
- Archiving and sharing content that has been taken down (if possible) or metadata of content (including engagement, demographic details of users) with researchers
- Transparency on composition of internal institutions dealing with takedown notices.
- Providing users whose content is taken down with a copy of the notice.
Stage 3:
Post-decision transparency
This stage covers the period after the takedown decision and focuses
on the transparency requirements that allow checks-and-balances
OSPs:
Against transparency: Transparency might prevent OSPs from taking business-oriented decisions that would encroach upon stated terms and conditions. It also creates more opportunities for criticism of the OSP decision-making process and end-decisions, hampers speedy decision-making in case of emergency, and potentially scares off users by underlining over-removal of content before appeal.
In favor: Post-decision transparency guarantees end-users that the rules established by the platform are respected, accordingly, building trust with user communities. This can empower users to avoid self-censorship and allow for the creation of more user-generated content. Finally, established appeals processes might reduce risks of liability and judicial redress by providing users with information on appeals.
End-users:
Against transparency: If not well-designed, appeals processes might threaten user privacy.
In favor: Better appeals processes provide users with data to assess the respect of due process and the right to appeal by OSPs; they allow users to avoid self-censorship due to fear of the lack of due process; they ensure legal certainty as to which type of content can be freely published on the OSP; and they act as a pedagogical tool and a hands-on guide on what to expect when appealing an OSP takedown decision.
Conflict between these interests: Transparency on the right to appeal is to be balanced with the right to privacy. In particular, it is important to consider how to use the concept of “meaningfully public data” to determine which data publication could be subject to less privacy scrutiny and more explainability on takedown decisions.
Commercial users:
Against transparency: Commercial users might have vested privacy interests in appeals processes, given the “name and shame” effects of transparency. This is because commercial users, who on average have a larger audience, might be categorized as “meaningfully public,” which means more information on content takedown processes surrounding them might be published.
In favor: Good appeals processes would reassure commercial users that user-generated content cannot be deleted from the platform without due process. This, in turn, also enhances freedom of expression by outlining the procedure used to prevent access to online infrastructure.
Researchers:
Against transparency: N/A
In favor: Better disclosures would allow researchers to study the breadth and usefulness of appeal procedures, evaluate the number of internal appeal procedures being litigated in courts, and assess OSPs’ due process.
Trusted flaggers:
Against transparency: Appeals might disavow TDRs made by trusted flaggers.
In favor: Reports on the efficiency of TDRs submitted by trusted flaggers.
Policy makers:
Against transparency: Appeals might undo TDRs made by policymakers and governments. These processes might also underline reckless and voluminous requests made by governments and reduce their influence on platform decision-making.
In favor: Transparency around appeals processes might inform the public on the number of internal appeals that end up being litigated in courts.
Potential overarching aim:
Transparency in the third stage ensures that due process is respected. Including litigation in courts following internal appeals acts as an additional check on OSPs.
Some members conducted a preliminary exercise, as recommended by the Sprint Team. This exercise proved to be beneficial for understanding the advantages and shortcomings of existing best practices, identifying gaps, and forming the structure of the SOBP in terms of granularity, methodology, language, etc.
II. Forming the SOBP
Following the brainstorming phase, the group came to the understanding that awareness of sufficiently detailed information on existing transparency practices and requirements is a prerequisite in order to build on them and to work out deficiencies. Accordingly, the group examined a number of existing transparency reports and compared them against each other in order to identify areas where there was scope for improvement. In our attempt to answer the questions mentioned above, the overarching aim was to build on leading best practices such as the Santa Clara Principles on Transparency and Accountability in Content Moderation and the New America Transparency Toolkit. Therefore, while the foundational principles overlap significantly with those outlined in the Santa Clara Principles, we found ways in which OSPs could be more meaningfully transparent, specifically to users and researchers, about content takedown. Against this backdrop, we moved on to discussing the specifics of the SOBP. In order to delineate the scope, some questions we pondered were:
- Should we focus on quantitative or qualitative transparency, or both?
- What should be the applicability of the best practices in terms of OSP types, jurisdiction, and so on? In this regard, what are the criteria for classifying OSPs?
- How granular should the SOBP be?
- What are the different transparency contexts? e.g., sending notices to users and databases like Lumen, transparency reports, researcher access, internal bodies.
- Should there be best practices specific to certain categories of takedown requests and if so what are these categories?
Before answering how best to hold OSPs accountable, we mulled over whether platforms are the right actors to govern the internet. Challenging the adjudicatory role of OSPs was an ambitious endeavor and one that did not squarely fit into the scope of this research exercise. However, the question became a springboard for us to assess other questions, such as the limits of their role as an adjudicatory body. The potential discriminatory, arbitrary application of guidelines and rules and the increasing use of automated tools to monitor online content were approached from all possible angles. We found building user trust to be a building block of our SOBP. We emphasized that OSPs should commit to respecting human rights and embed such commitment across all phases of takedown. Such commitment must be assessed and periodically reported, and any harm arising from over-judiciousness must be addressed by way of a redressal mechanism. The sufficiency of grievance redressal must be tracked and reported and, more importantly, made available to relevant stakeholders in an accessible manner. Accountability to this commitment requires OSPs to abide by such foundational principles both procedurally and substantively, and to that end, the report has tried to be as granular as possible and erred on the side of maximum transparency, while taking into account certain risks of over-transparency. This paved the way for us to create a tripartite model: general, operational, and phase-wise best practices.
III. The Tripartite Model: General, Phase-wise, and Operational Best Practices
The Tripartite model, as we envisioned it, was surprisingly similar to how Working Group 3 had approached their work. Our aim was to segregate the TDR process into buckets that could then be examined separately to assess transparency best practices. We drafted the general and operational best practices to be applicable across all phases of the takedown process, with the aim of ensuring comprehensive, granular, and reader-friendly reporting. The phase-wise best practices are aimed at tackling each stage of the takedown process separately. Our phase-wise approach focused on (1) transparency on the TDRs themselves; (2) transparency on the decision-making procedure on how the TDR is handled; and (3) post-hoc transparency, or transparency on practices after the decision on the TDR has (and hasn’t) been made by the OSP.

Being cognizant of the fact that a one-size-fits-all approach may not be useful for our purposes when working with OSPs that offer different products and differ in size, network effects, and so on, we strived to design the SOBP to address these differences. Accordingly, we have marked out transparency requirements that may be followed by OSPs of larger sizes. We also acknowledged that the general body of users does not require the kind of comprehensive data that researchers would appreciate. For this reason, we have separately marked out the information that could be made available only to certain selected researchers.

The tripartite model found synergy with Working Group 3’s three-tier model, and merging the two was a seamless exercise, as both teams approached transparency through a stakeholder-specific lens and worked with similar foundational principles. Their three-tier approach focused on pre-removal, during-removal, and post-removal phases, and carved out transparency goals for each tier. Our working group adopted a similar structure, as the three-tier model was subsumed by the phase-wise best practices; the difference was that while Working Group 3 deliberated on specific goals for each stage, our working group focused on addressing two overarching objectives: the process of decision-making and the decision itself (i.e. process and decision transparency), and aimed to get OSPs to reveal as much information as possible about the procedure, the means taken to apply said procedure, and the end-decision arrived at after applying said procedure. Therefore, both working groups’ ethos were similar and the pillars used to construct the SOBP were also similar, while, metaphorically speaking, the bricks that made up the pillars differed without contradicting each other. Accordingly, the merging was a coordinated exercise of discussing which bricks to place under which pillar and why. Additionally, our working group aimed to incentivize OSPs, especially large-scale OSPs, to provide granular information on takedown while ensuring that small-scale OSPs were not negatively impacted by the onerous nature of the obligations. With regard to methodology, our working group decided to refer to existing good practices of OSPs where possible.

IV. Lessons Learned
Our research and analysis of existing transparency practices and requirements led us to the understanding that any attempt to draft an SOBP is incomplete without broad, meaningful stakeholder engagement. We found that this was all the more true as our best practices were informed by a human rights-based and pluralistic perspective. Although we strived to be intentional with each best practice, we also concluded that close engagement with stakeholders is indispensable to do so. This requires being thoughtful about who the constituents of the “best practices community” are and may be in the future. Significantly, any stakeholder engagement must include OSPs from early on and, ideally, enable active communication with other stakeholders. For instance, in order to assess the efficacy of and overcome challenges regarding existing best practices, as well as to identify and bridge gaps, we felt the need to know whether and to what extent OSPs make use of these best practices and to get granular feedback. The question of incentives for OSPs to adopt the SOBP, which we grappled with throughout our work, could also be addressed through such engagement. Another key lesson has been that we need diversity in all aspects; diversity in stakeholders who have varying interests and expectations is a must. A major takeaway in this regard has been that risks arising from over-transparency must be thoroughly investigated. We believe that one way of doing this is through stakeholder input. Diversity in OSP types is also critical, which we addressed by marking out transparency requirements for OSPs of larger sizes. However, we believe that criteria other than size should also be considered in future efforts. Finally, introspectively, we believe that the diversity of the team in charge of preparing the SOBP is also crucial. Specifically, in our experience, we found that technical expertise, e.g. in the context of automated review, along with experience from the “field,” e.g. civil society/activist experience, is essential.

V. Unresolved Questions and Looking Ahead
Given the time and resource constraints, there are some issues that the group was not able to address, or could do so only partially. One pressing issue is the incentives for OSPs to adopt the SOBP, which is crucial to ensure its widespread adoption. The group observed that while OSPs have various incentives to comply with TDRs, whatever the underlying motivations may be, they also appear inclined to be transparent about takedowns, especially if a takedown originates from external requests. For further work, the group would ask whether this is in fact the case and, if so, what the incentives are. How can we make use of these incentives to shape the best practices so as to attain broad OSP buy-in?
Another crucial matter, which was also prompted by the Research Sprint team, is ensuring the applicability of the SOBP across jurisdictions. As much as the group strived to address this through adopting a general approach that was not focused on any one jurisdiction, the group would question if a jurisdiction-neutral transparency model is even possible/feasible and ask how widespread adoption by OSPs can be attained given potential clashes between the voluntary best practices and legal obligations. In this context, the group would also pose broader questions of ensuring the flexibility of the SOBP as well as “future-proofing” it considering the evolving tech and the dynamics between actors.
In addition, the group’s attempt to address specific challenges arising from the use of automated systems felt incomplete due to the black-box nature of AI systems. During our research, we realized that the challenges faced by small-scale OSPs and large-scale OSPs were vastly different, i.e. the types of errors made by human reviewers are not the same as those made by automated tools. However, a lack of access to information on how automated models take down content, or on what different models exist to perform this task, limited our research on large-scale OSPs. Going forward, we would explore scholarly work in this area and flesh out the specific transparency obligations needed to make automated takedown mechanisms more transparent. Finally, the group would like to note some aspects that should be dealt with in further detail:
- Each best practice should specifically address relevant limitations and trade-offs and provide guidance on how the latter can be resolved.
- Where applicable, best practices should incorporate considerations for different categories of TDRs (e.g. government requests) and content/issue (e.g. RTBF, CSAM, government secrecy). In relation to TDR types, the group makes note of the necessity of further research regarding the impact of network shutdowns on the takedown ecosystem.
- Where applicable, best practices should be further refined according to how much transparency would be afforded to whom. Accordingly, the SOBP should recognize audiences other than researchers.
- Future efforts should build on our initial work regarding takedown abuse and safeguards. Specifically, retaining content that has been taken down and ensuring researcher access thereto, as well as transparency on rejected TDRs, should be further investigated. With regard to the former, the group makes note of digital repository efforts.[6]
- Further research should be conducted to determine criteria for “Large Scale OSPs” and the extent of “Researcher Data.”
2. Post-mortem improvements: what’s next?
Our work has focused on the overarching values necessary to establish a common framework for transparency. It leaves pending a wide array of questions regarding adaptability to quickly evolving platforms and changing moderation rules across jurisdictions. Specific statements of best practices for platforms ought to address relevant limitations/trade-offs and provide guidance on how they can be resolved. It is essential to underscore that specific SOBPs will differ in how much transparency would be afforded to whom; notably, the scope of transparency under each category could be crafted according to the relevant audience. As a result, we ought to get into the specifics of creating a data taxonomy and attributing each specific data point to determined actors given their interests. Moreover, our SOBP did not interrogate automated takedowns (method, programming, results) in as much detail as we would have liked due to a lack of technological capability; this is something that ought to be tackled in the future. Finally, our work could be confronted with transparency regulations mandated by states and would benefit from participation and feedback from stakeholders.
Annex: Glossary of terms
General terms
- Online service providers: Online service providers (OSPs) include a broad range of entities that provide electronic communication services or remote computing services.[7] Thus, entities providing network access, peer-to-peer messaging, email, online news broadcasting, online video or audio streaming, search engines, e-commerce, online banking, social media, etc. are classified as OSPs.
- Commercial users: A commercial user uses the service of an online service provider to generate economic gain. Included in this category are for example social media influencers, and third-party resellers on online marketplaces. Excluded are individuals occasionally selling their products by the means of online service providers.
- Trusted flaggers: “Trusted flagger” refers to a status accorded to an individual or organization by an OSP for having valuable expertise or experience in content moderation. It can come with access to additional resources and responsibilities in the content moderation ecosystem of that OSP.
- Community standards: Community standards refers to the terms of service applied by online service providers. In practice, a variety of different terminology is applied, including user terms, user agreements, community standards, policy guidelines, and house rules. These community standards include the platform’s policy on content moderation and describe which content is allowed on the platform’s service.
Stage 1
- Opportunistic takedown requests: An opportunistic takedown request refers to the practice where trusted flaggers or third party notifiers send a takedown request to an online service provider without certainty that the content is infringing.
- Large Scale OSP: Refers to large-scale platforms with social functions. In German case-law (Bundesgerichtshof, III ZR 179/20 and III ZR 192/20) a distinction is made between social media platforms that significantly impact users’ social life and those who do not. As a result of the third-party effect of fundamental rights related to this ‘social function’, the former are subjected to stricter procedural rules on content moderation.
Stage 2. In correspondence with Phase 2: Transparency on OSP takedown procedure (Decision-making visibility)
- Procedural transparency: An important component of transparency is open decision–making such that the processes/ tools used to arrive at decisions are subjected to oversight. Procedural transparency in the context of OSPs’ content moderation decisions will include disclosure of information on internal institutions, resource allocations and processes followed to arrive at takedown decisions.
- Editorial transparency: Editorial transparency means disclosing information pertaining to the OSP’s editorial decisions and operations. It can include information on the platform’s internal editorial policies that determine what content can be taken down, information on the algorithms and efficacy of automated tools that are used in such decisions, justifications and explanations of specific decisions, or even disclosing aggregate numbers about editorial decisions.
- Metadata of content: Metadata provides additional information about the content under a takedown request. This can include information on the level of engagement, date of posting and removal, demographic information of the content author, geographic and demographic information detailing content engagement etc. This data can be of special interest to researchers.
Stage 3. In correspondence with Phase 3: Transparency on the takedown decision: Decision visibility
- Public user: A public user is a user who benefits from a large audience or engagement on the platform and creates viral posts on a regular basis, or a public figure such as a government official, a political figure, or anyone benefitting from a broad reach because of their achievements in the arts or in business in a certain country or region.
- Meaningfully public data: Meaningfully public data is data related to a public user which can be subject to less privacy scrutiny in order to favor explainability and accountability for decisions affecting content that reaches a large audience.
[1] In one case, “the [requesting] individual’s site had been created and then back-dated for the purpose of filing this takedown request.”
[2] See, e.g., .
[3] See https://transparencyreport.google.com/eu-privacy/overview?hl=en.
[4] See Department of Commerce DMCA Multistakeholder Forum: DMCA Notices and Takedown Processes, I(A)(7).
[5] See Council of Europe Recommendation on the impacts of digital technologies on freedom of expression, para. 4.5.
[6] See and.
[7] See