“We The (Social) Media” Project: Research on Social Networks, Activism, and Content Moderation
30 March 2026
Since 2023, the research project We The (Social) Media (WTSM) has been examining how social movements use social networks and what challenges arise in the process. The project is funded by the Federal Ministry of Research, Technology and Space (BMFTR), runs until 2026, and is led by Prof. Dr. Andreas Breiter.
At the center of WTSM is the question of how social media shape communication, organization, and visibility for social movements. Early on, the research revealed that digital participation is not equally accessible to everyone. Research associate Anna Ricarda Luther explains: “The starting point of my research was the question of how social movements use social media for their purposes. In conversations with activists, it quickly became clear how much hate speech burdens their work, ties up resources, and ultimately influences who is able to participate in digital public discourse.”
These insights led to a shift in the project’s focus: “These findings shifted the focus of our research toward questions of safety, content moderation, and the design of digital spaces from the perspective of those affected.”
The topic of content moderation is particularly pressing in light of current developments on major platforms. As Luther emphasizes: “Content moderation is currently undergoing major changes: social platforms are revising their moderation policies, cutting jobs in Trust & Safety, and increasingly relying on automated systems that often fail.” According to her, these developments have far-reaching consequences “for freedom of expression, safety, and the protection of marginalized groups—and make critical academic engagement more urgent than ever.”
WTSM is a subproject of DataNord, the interdisciplinary data competence center for the Bremen region, and fosters collaboration among computer science, political science, and communication studies. Recent findings were presented and published at the Conference on Human Factors in Computing Systems (CHI) 2025 in Yokohama. In addition, the project regularly engages in public discussions on content moderation and platform responsibility, most recently in the CAISzeit podcast and in a talk at re:publica 2025.
Find more information on the project page.
