
The Weaponization of Social Media: How social media can spark violence and what can be done about it, November 2019

Countries: World (+ 4 more)
Source: Mercy Corps
Publication date: November 2019

Executive Summary

Social media has emerged as a powerful tool for communication, connection, community and, unfortunately, conflict. It’s created new, highly accessible channels for spreading disinformation, sowing divisiveness and contributing to real-world harm in the form of violence, persecution and exploitation. The impact social media has on real-world communities is complex and rapidly evolving. It stretches across international borders and challenges traditional humanitarian aid, development and peacebuilding models. This new paradigm requires a new approach.

Mercy Corps has partnered with Do No Digital Harm and Adapt Peacebuilding on a landscape assessment to examine how social media has been used to drive or incite violence and to lay the foundation for effective, collaborative programming and initiatives to respond quickly and help protect already fragile communities.

This assessment explores how weaponized social media can contribute to offline conflict by examining real-world case studies. These examples are not exhaustive. Rather, they surface a range of concepts and implications that can help humanitarian, development and peacebuilding organizations — as well as technology companies and policymakers — understand what’s happening and develop effective responses.

Case studies

Information operations (IO): Coordinated disinformation campaigns are designed to disrupt decision making, erode social cohesion and delegitimize adversaries in the midst of interstate conflict. IO tactics include intelligence collection on specific targets, development of inciteful and often intentionally false narratives and systematic dissemination across social and traditional channels. The Russian government used such tactics to portray the White Helmets humanitarian organization operating in Syria as a terrorist group, which contributed to violent attacks against the organization.

Political manipulation (PM): Disinformation campaigns can also be used to systematically manipulate political discourse within a state, influencing news reporting, silencing dissent, undermining the integrity of democratic governance and electoral systems, and strengthening the hand of authoritarian regimes. These campaigns play out in three phases: 1) the development of core narratives, 2) onboarding of influencers and fake account operators, and 3) dissemination and amplification on social media. As an example, the president of the Philippines, Rodrigo Duterte, used Facebook to reinforce positive narratives about his campaign, defame opponents and silence critics.

Digital hate speech (DHS): Social media platforms amplify and disseminate hate speech in fragile contexts, creating opportunities for individuals and organized groups to prey on existing fears and grievances. They can embolden violent actors and spark violence — intentionally or sometimes unwittingly. The rapid proliferation of mobile phones and Internet connectivity magnifies the risks of hate speech and accelerates its impacts. Myanmar serves as a tragic example, where incendiary digital hate speech targeting the Muslim Rohingya minority has been linked to riots and communal violence.

Radicalization & recruitment (RR): The ability to communicate across distances and share usergenerated, multimedia content inexpensively and in real time have made social media a channel of choice for some violent extremists and militant organizations, as a means of recruitment, manipulation and coordination. The Islamic State (ISIS) has been particularly successful in capitalizing on the reach and power of digital communication technologies.