Offline/Online Project Highlights How the Oppression Marginalized Communities Face in the Real World Follows Them Online

People in marginalized communities who are targets of persecution and violence—from the Rohingya in Myanmar to Native Americans in North Dakota—are using social media to tell their stories, but finding that their voices are being silenced online.

This is the tragic and unjust consequence of content moderation policies at companies like Facebook, which decide on a daily basis what can and can't be said and shown online. Platform censorship has ratcheted up in these times of political strife, ostensibly to combat hate speech and online harassment. Takedowns and closures of neo-Nazi and white supremacist sites have been a matter of intense debate. Less visible is the effect content moderation is having on vulnerable communities.

Flawed rules against hate speech have shut down online conversations about racism and the harassment of people of color. Ambiguous "community standards" have prevented Black Lives Matter activists from showing the world the racist messages they receive. Rules against depictions of violence have removed reports about the Syrian war and accounts of human rights abuses against Myanmar's Rohingya. These voices, and the voices of Aboriginal women in Australia, Dakota pipeline protesters, and many others, are being erased online. Their stories and images of mass arrests, military attacks, racism, and genocide are being flagged for takedown by Facebook. The powerless already struggle to be heard; online censorship further marginalizes vulnerable communities. This is not OK.

In response, EFF and Visualizing Impact launched an awareness project today that highlights the online censorship of communities across the globe that are struggling or in crisis. Offline/Online is a series of visuals demonstrating that the inequities and oppression these communities face in the physical world are being replicated online. The visuals can be downloaded and shared on Twitter, Facebook, and Snapchat, or printed out for distribution.

In one, the displacement of nearly 700,000 Rohingya Muslims from Myanmar by state violence is represented in a photo showing Rohingya children trying to board a small boat. Rohingya refugees, many of them women and children, are arriving in Bangladesh with gunshot and burn wounds, according to the United Nations.

And online? Facebook is an essential means of communication in Myanmar. Activists there and in the West have documented the violence against the Rohingya online, only to have their Facebook posts removed and accounts suspended.

Inequity offline, censorship online.

The EFF/Visualizing Impact project exposes this pattern among Palestinians, Aboriginal women in Australia, Native Americans, Dakota pipeline protesters, and black Americans. We believe this is just the tip of the iceberg. We are already far down the slippery slope from judicious moderation of online content to outright censorship. With two billion Facebook users worldwide, there are likely many more vulnerable communities being subjected to online censorship.

Our hope is that activists, concerned citizens, and online communities will share the Inequity Offline/Censorship Online visuals (found here) widely, raising awareness about the impact of censorship on marginalized communities—a story that is underreported. Sharing the visuals is a step all of us can take to combat online censorship. It may help restore the speech and voices being erased online.
