Regulating Inequality on Digital Platforms

This project aims to find legal, ethical, technical, and commercial opportunities to counter systemic inequality online.

Digital platforms – including search engines, social media, peer-economy services, and news platforms – are extremely influential in organising commerce, coordinating behaviour, and shaping the information that people see and share. They are also frequently used in ways that are deeply sexist, racist, and biased against disadvantaged groups. Digital platforms might not aim to discriminate, but technology is never neutral; when new tools and services are deployed in a deeply unequal world, they will often reproduce and exacerbate existing social inequality.

Digital platforms are under increasing pressure to address systemic inequality as it manifests in their networks. These debates include the moderation of hate speech and incitement to violence, the amplification of lawful but harmful content, the visibility and representation of marginalised voices, and the discriminatory social and economic impact of digital platforms on consumers, creators, and producers. Our research asks two main questions:

  • What works to address inequality online? We use computational statistics and machine learning to understand the impact of efforts to tackle misogyny, racism, and other forms of structural discrimination online.
  • What responsibilities should platforms have to address inequality and discrimination? We examine whether and when private sector digital platforms can be expected to monitor and regulate the actions of their users, what responsibilities they have to avoid contributing to discrimination, hatred, intolerance, and abuse, and how the law should develop to ensure that our digital environment is more equal and fair.

Our immediate areas of focus include improving the visibility and popularity of accessible listings on peer-economy accommodation sites; promoting inclusivity and visibility for LGBTQIA+ content creators; and countering the ordinary hateful and divisive speech that has become a normal part of social media.

Project team


Professor Nic Suzor is globally recognised as a leading scholar in technology law. He is a Professor in the Law School at Queensland University of Technology, one of the leaders of QUT’s Digital Media Research Centre, and one of 20 international experts on Facebook’s Oversight Board. He is also a Chief Investigator of the ARC Centre of Excellence for Automated Decision-Making and Society, and the author of Lawless: the secret rules that govern our digital lives.
Laura Vodden is a data scientist with a focus on machine learning, data storytelling, programming, and statistics.
Lucinda Nelson is a PhD researcher studying how social media platforms can address online misogyny and improve platform cultures for young women.
Suzy Wood is a lawyer and legal researcher specialising in intellectual property law.


Project funding

This project is funded by Professor Suzor’s Australian Research Council Future Fellowship (FT210100263) and integrated with the ARC Centre of Excellence for Automated Decision-Making and Society.


Image credit: Speech bubbles icon by Dryicons.
