
Using the Tech Policy Design Lab Playbook to co-design platform governance policies and counter malicious flagging

July 18, 2023
Deceptive Design
OGBV
General

Dr. Carolina Are, Innovation Fellow at the Centre for Digital Citizens, Northumbria University

Dr. Carolina Are is a content moderation expert focusing on social media platforms’ governance of nudity, sex and LGBTQIA+ expression. Thanks to both her research at Northumbria University’s Centre for Digital Citizens and her personal experiences of de-platforming, Dr. Are is one of the many individuals performing an unfunded service to platforms: connecting them with users who have been targeted and harassed, to help expedite those users’ requests to the companies.

From January to May 2023, Dr. Are ran a Tech Policy Design Lab (TPDL) using the TPDL Playbook to further her research on malicious flagging and its relationship with the de-platforming of marginalised communities, journalists and activists. This blog post from Dr. Are shares that work and advances the mission of a more open, safe, trusted, and empowering web that is for everyone.

What would platform governance look like if users affected by it could get a seat at the table to design the policies that will govern their social media experience? As a user with personal experiences of online abuse and de-platforming and as a platform governance researcher, I was interested in re-designing platform policies surrounding de-platforming, also known as account deletion, and malicious flagging, or the misuse of the reporting tool to silence specific accounts.

Today, as part of the postdoctoral research project I am leading at Northumbria University’s Centre for Digital Citizens and in collaboration with The World Wide Web Foundation and Superbloom, I am launching a report detailing policy recommendations to tackle platform governance inequalities in the enforcement of flagging and de-platforming on social media. Co-designed with users affected by these forms of governance, these policy recommendations were gathered through a series of workshops structured according to The World Wide Web Foundation’s Tech Policy Design Labs (TPDLs) playbook.

This report provides social media platforms with user-centred and research-informed recommendations to improve the design and effectiveness of their flagging and appeals tools. Research has shown these affordances to be inadequate at tackling online abuse, and to provide malicious actors with opportunities to exploit strict platform governance to de-platform users with whom they disagree. This has disproportionately affected marginalised users such as sex workers, LGBTQIA+ and BIPOC users, nude and body-positive content creators and pole dancers, as well as journalists and activists.

My research has found that being de-platformed from social media often leaves users unable to access work opportunities, information and education, as well as their communities and networks, with devastating impacts on their mental health and wellbeing. Yet, in my research and personal interactions with social media companies, I have learnt that the time, resources and attention allocated to engaging with the stakeholders directly affected by technology are too often awarded sparingly.

The idea underpinning this report is that content moderation often sidelines the human experience in order to prioritise speed and platform interests, lacking the necessary empathy for users who are experiencing abuse, censorship, loss of livelihood and network, as well as emotional distress. As a result, this report is a free resource both for users, to feel seen in a governance process that often erases them, and, crucially, for platform workers, to ensure they do not sidestep stakeholder engagement in the drafting of their policies.

The workshops were supported by Northumbria University’s Policy Support Fund and by the Engineering and Physical Sciences Research Council. The World Wide Web Foundation’s TPDL Playbook was crucial in developing an accessible and open-ended format to harness the power of a diverse set of participants from different walks of life. Thanks to the structure provided by the Playbook, and to Superbloom’s co-facilitation during the workshops, the TPDLs I ran became a space in which we could draw on participants’ experience and expertise, learning directly from those affected by technology so that policymakers, tech companies and researchers can take their stories into account.

Download the full report here.

Participants and co-designers

The policy recommendations in the report were co-designed with 45 end-users, who are too often ignored in the drafting of the rules governing the spaces they depend on for their social and work lives.

As in previously documented experiences of censorship, women were over-represented among workshop participants. Those who took part included both cis and transgender women and men, as well as non-binary people. Participants were both heterosexual and from the LGBTQIA+ community, and were located in and/or from a diverse set of locations: Europe (the United Kingdom, France and Germany); the United States; South America (Chile and Brazil); Asia (India, China and Hong Kong); and Africa (Ghana, Kenya, Nigeria, South Africa and Uganda).

With these communities, I wanted to reimagine content moderation policies outside of Big Tech, and to challenge social media platforms to instil empathy into content moderation. Instead of turning them into passive research subjects, these users became co-designers and co-rule-makers. After asking participants to reflect on a set of personas drawn from my previous research, which presented stories of flagging and de-platforming in the shape of re-imagined, queer, body- and sex-positive tarot cards, we aimed to answer the following questions:

  1. How might we protect users who think they are being censored and/or de-platformed because of mass reports?

  2. How might we help de-platformed users after they experience censorship and/or de-platforming?

Image: Re-imagined, queer, body- and sex-positive tarot cards.

Policy recommendations

The solutions participants proposed to tackle malicious flagging and de-platforming focus on improving transparency, safety and equality, education, design, fairness, due process, contextual knowledge, community support and signposting in platform governance.

Many of these recommendations require action by platforms, but participants also shared recommendations for governments and civil society organisations to step in and help users affected by governance inequalities. These can be found in more detail in the full report, and largely revolve around the following three changes:

  1. Communication: Transparency and detailed information about content moderation educate and empower users, improving due process.
  2. Recognising vulnerability: Allowing users to report themselves as members of protected categories can mitigate current platform governance inequalities, fast-tracking appeals and helping in situations where users face abuse.
  3. Granular content controls: Making all content visible to everyone causes more governance issues than it solves. Empowering users to choose what they see, through more granular content controls and periodic check-ins about opting in or out of certain posts, can help mitigate this.

However, to achieve real, systemic change, users pushed for radical transparency and, crucially, for a duty of care from platform conglomerates, demanding information, workers’ rights and compensation when platforms fall short of protecting their users from censorship and/or abuse. In making these demands, they found that current legislation and societal approaches fall short of protecting them: without recognising content creation and platform work as labour, and without recognising that users have the right to express themselves sexually and to work through their bodies, platform governance will continue to replicate the offline inequalities it is marred by.

This report is a first step towards ensuring that the needs of users who are too often designed out of Big Tech spaces are heard and make it into the rooms where decisions about their online lives are made. Please share it far and wide. 

For more information, contact: Carolina.are@northumbria.ac.uk
