
From Likes to Violence: How Big Tech is Helping Fuel Extremism

Could Big Tech's failure to moderate social media be fueling violence and extremism in Kenya and beyond?

About the Episode

Deep Dish delves into how the unchecked power of social platforms in the Metaverse, woven into the fabric of modern communication, is fueling conflicts, disinformation, terror, and hate. Experts Kristina Wilfore and Leah Kimathi confront the chilling reality of Big Tech's role in fueling extremism in Kenya, where profit often trumps content policing, and share what Kenyans are doing about it.

[Lizzy Shackelford: INTRO: This is Deep Dish on Global Affairs— going beyond the headlines on critical global issues. I’m your host, Lizzy Shackelford, with the Chicago Council on Global Affairs.

Today, we delve into an urgent concern: extremist groups harnessing social media's power to breed conflict and hate. In countries where social media, like Facebook, is a primary source of communication, information, and entertainment rolled into one, the consequences of lax content moderation are dire. Our focal point: Kenya. A vivid case study of what transpires when Big Tech prioritizes profit over policing, leaving a trail of devastation. So, does the allure of profit blind us to the grave repercussions in this interconnected age?

Here with me to expose the perilous connection between online radicalization and digital giants' negligence is...

Leah Kimathi, a social researcher and activist with a deep interest in governance in Africa, elections, peace and security, and diversity and inclusion. She is also a founding member of the Council for Responsible Social Media.

Also joining us is Kristina Wilfore, the co-founder of #ShePersisted, the only global initiative uniquely dedicated to tackling gendered disinformation and online harms against women in politics.

Now this issue captured Kristina’s attention when she was in Kenya during the 2017 elections working on peace and security. And she discovered that all the tools in that toolbox that we need to help advance society don't work the same in an era of mis- and disinformation...]

Kristina Wilfore: I’ve become an unwilling, I would say begrudging, expert on countering disinformation, because the normal things that I've spent my lifetime doing -- trying to strengthen democracy at home and abroad by supporting civil society, helping women enter politics, dealing with identity-based conflict -- don't work the same in an era of mis- and disinformation. The assumption that social media merely reflects society's ills is really not what's going on anymore. In fact, I think we can look at the situation as one of conflict profiteering, in the way that Big Tech and its platforms are eroding essentially all the things we've done to support social cohesion across the globe.

Lizzy Shackelford: Yeah, Leah, can you give us some examples of how this is playing out specifically in Kenya today? What are some examples of the real-world dangers of this extremism playing out on social media?

Leah Kimathi: Obviously Kenya is one of those countries where the internet has penetrated quite a bit; we are now talking about 20 percent of the population being on social media, and we know that Kenya has a highly youthful population. We also know that we are a fragile democracy, and we are found in the Horn of Africa. Now, of course, the one thing about the Horn of Africa is the prevalence of terrorism, especially given what's going on in Somalia and its effects across the borders, and Kenya is right next door. We have been on the receiving end of terrorism, concerns around recruitment, and other security concerns. What we have then seen, based on our porous borders and the relationships across geographies and demographics that we have with Somalia, is that when Kenya went into Somalia around 2011, we began to see the rise of terrorism in Kenya in response to what was seen in some quarters as Kenya's invasion of Somalia.

Lizzy Shackelford: Just for our listeners, Kenya was part of the African Union force that went in to try and support the Somali government.

Leah Kimathi: Absolutely. And initially we didn't even go in as part of AMISOM, but later we were re-hatted and joined AMISOM. At that very moment, we began to see reprisal terrorist attacks in Kenya. Initially, perhaps, a lot of the attacks were coming from Somalia, but with time we began to see the growth of our own, what we call homegrown, terrorists in Kenya, responding partly to Kenya's work in Somalia, but also to the many governance challenges found in Kenya. So from then I think we've seen a rise of radicalism and of terrorism, both offline, physical, and in the online spaces.

Lizzy Shackelford: So, what are these extremist groups doing with social media? How are they using social media in Kenya?

Leah Kimathi: The most obvious evidence of the use of social media by these extremist groups was the DusitD2 attack that happened in 2019, and there is evidence that that attack was partly planned on Facebook. So yes, we have extremist groups using social media to organize, to plan, and to recruit, especially from among the Kenyan population.

Kristina Wilfore: Let me step in to say what was discovered, because this has been an interesting journey. I had been doing work across different parts of the world around elections especially, but focused on the way that gender is being weaponized in different contexts, in the US, in East Africa, in parts of Europe. And in the process of sorting through what's going on with gender issues, we discovered this massive network where Facebook is failing, and I think still appears to be failing, to catch Islamic State and Al-Shabaab extremist content in posts aimed at East Africa. They essentially let an entire news ecosystem develop on Facebook, and it's been very troubling to me not to see a reaction from European and US policymakers who have invested millions of dollars in hard security but then don't consider the online aspect of this. People who are susceptible to messages coming from terrorist organizations are typically in those communities in some way, but when you have a mainstream platform serving up, delivering, and building audiences to socialize more people toward terrorist-related content, attacks against the state and its institutions, undermining of elections, everything we saw from what was being produced, this to me goes into an entirely different category. There's no way we can get these platforms to deal with the stickier issues of, say, gender if they're wholly ignoring terrorist accounts on their platforms, with products they are responsible for overseeing and that their community standards say they will shut down. They have a provision for dangerous actors, with a list of known terrorist organizations that they have told governments in Kenya, the US, and everywhere else they will monitor and not allow on their platform, and this is a clear violation of that. This network had been up for two years, producing 15,000 videos, and there was not even any transparency about the takedown efforts within the week of its being exposed. And now I'm told by online extremism researchers that they see the accounts coming back up. So we really have no confidence whatsoever that this is being monitored or overseen, and this is one of the most egregious examples of Facebook and Meta letting bad actors use its products.

Lizzy Shackelford: So, you've raised a couple of good questions here. I mean, Meta has committed to certain standards with its own guidelines, right? With what it's told different governments it's committing to do, and they're falling short on that. But what are their legal responsibilities? And who are the entities that can enforce this if they're not enforcing their own standards?

Leah Kimathi: I think for us, sitting here in Nairobi, in Kenya, or in fact in other countries in Africa, this is a reality that we are facing as the African market of Big Tech. We feel that there is systematic underinvestment in African markets. We are treated as peripheral markets. We are treated as extractive markets where these companies can come in, do business, and invest as little as they can, with very little care about our safety and dignity online. We are looking at this as really a continuation of an extractive economy that has been there for centuries. I mean, you're looking at colonial economies in Africa, so this is a continuation of that extractive model, where African markets are treated as peripheral markets that are not accorded the same safety as markets elsewhere, especially in the West. So whether you're looking at issues around terrorism and radicalization or issues around women's health, we have harms proliferating online, openly, against these companies' own community guidelines. We are calling them out. They rarely listen to what we say, but we call them out and we provide evidence. And we are saying that the way these social media companies based out of America are treating Africa is extremely concerning, and it's time that African countries come together, form a formidable bloc, and begin pushing back in very specific ways against these companies.

Kristina Wilfore: What we've really seen is that the era of self-regulation should be over, because that's the model that has existed. If I look at it from the US election, as we gear up toward another contentious election in what I would describe as a very fragile democracy in the US right now, there are literally no new rules since 2020. There's nothing that will prevent another insurrection, with full evidence that social media, the building of audiences, the growth of advocating violence as a way to hold power, is being fomented online. Nothing has changed, literally, since 2020. And then consider markets outside of the US. I saw this in the Brazil election too, and I would argue no country tried harder than Brazil to curtail online harms and to get the platforms to adhere to their terms of service. Yet you still had the Brazilian version of an insurrection, and you have a hotly contested debate going on there about new rules. The existing rules require the platforms to monitor themselves, and that is the fox guarding the hen house. There's nothing different than that. They don't have an incentive to root out fraud. They don't have an incentive to be transparent about foreign influence operations and fake accounts. In fact, their stock prices are based on monthly active users, which is eyeballs on content, so there's a natural incentive to keep engagement high. Ultimately, this system is absolutely broken. And while there may be no one immediate set of solutions, a well-defined problem is in fact half of the solution, and that's the conversation we are seeing in marketplaces outside of the US, where it's even more egregious: they don't have local language, and they don't have content moderation that is effective. Even if they had more effective and less abusive content moderation, that doesn't get at the algorithmic promotion of incendiary content for profit. Those are the core business-model problems that are difficult for a country outside the West to leverage control over. This is why you see no real meaningful legislation in the US, while the European Union has moved forward with the Digital Services Act, which requires platforms to do things like risk audits and to actively monitor their platforms in a way that they otherwise aren't, with real punitive responses if they don't. But these are very difficult policy positions, and instead most of the work in the South, let's say, has been about worries over access and internet shutdowns, rather than understanding the dangers of the status quo, given the way the social media platforms are organized.

Lizzy Shackelford: Now, in much of the world we're expecting these companies to self-regulate, and that raises problems for accountability. That's one of the places where I'm so interested in what we're seeing in Kenya. Leah, can you talk a little bit about the court cases that have developed in Kenya as an attempt to press some form of accountability here?

Leah Kimathi: Nairobi is a hub for a lot of these social media companies, and with the amount of online harm affecting not just the content moderators themselves but everyday users, Kenyans asked: what do we do? What has happened is that right now we have four groundbreaking court cases, three of them against Meta and one against TikTok, and they are basically around content moderation. The cases argue, in principle, that the investment in content moderation is so poor, whether through the poor working conditions of content moderators, the very small number of content moderators, or the little to no investment in local languages, that the content the public then gets exposed to causes harm, across various spaces. Look at one of the court cases, around the war in Ethiopia, against Meta: the accusation is that Facebook was used to propagate and spread hate that led to the war in Ethiopia, because they were unable to moderate their own content. So what has happened is that the Kenyan courts are using existing law to begin to see how to rein in, or how to begin engaging with, the challenges emanating from unregulated, ungoverned social media spaces and the fact that the companies themselves seem unwilling to self-regulate.

Lizzy Shackelford: Yeah, and you've mentioned the war in Ethiopia, which I think is a very interesting case, because, as you say, Nairobi is a hub, but it's the hub responsible for Meta's moderation of much of the East African region, right? And one of the problems is a resource question: they have moderators, but the moderators have, as you say, poor working conditions, and there are simply not sufficient numbers of them to moderate the scale of content, nor do they have the right resources, language skills, and that type of thing. Facebook in particular spends the vast majority of its global budget for classifying misinformation, around 87 percent, in North America, even though that's a small, small percentage of global users. Why is that? And what can be done to press these companies to increase their capacity so that they can actually moderate content around the world effectively?

Kristina Wilfore: It's structural racism, I would say, simply, in the way that they treat marketplaces, while we get the cleaned-up version of the internet, for goodness' sake. So I will pull out the world's smallest violin to empathize with Meta about the cost of content moderation. And content moderation is already a downstream problem; to address things further upstream, you've got to look at the way that data is collected and audiences are built: surveillance capitalism. What Leah has spoken to is exactly the opportunity in these marketplaces that are getting the worst of the worst: strategic litigation. It's not just laws and policies; those should come at a point at which there's discourse in the country and where the platforms aren't manipulating that discourse. You know, I've heard the Meta representative in Kenya; Mercy is her name. Mercy is the same profile as the Meta representative I came across in Ukraine. The Mercy of Kenya is the Katya of Ukraine is the Flavia of Brazil: all actually young women who have no power in terms of the company and how it operates, but who are on the front lines of making promises that sound good and are in fact empty and don't do anything. So the points of leverage and opportunity are where a country like Kenya is able to take existing laws, work them through its court system, and pursue strategic litigation, and also to look at market pressure. As advertisers begin to understand in a more savvy way how their advertising dollars are in fact subsidizing this egregious kind of content and the growth of these dangerous communities, that's another opportunity. I think one of the lessons from this is also how the international community, which is working to counter disinformation in different ways, can invest more strategically. Most of the investments in places like Kenya are focused on fact-checking, and in fact that's a system that is also subsidizing what should be done by the platforms and buying silence in some ways. It's a soft capture of civil society that makes them less apt to be critical of the platforms, because they're trying, in a stopgap emergency, to get things shut down that are very dangerous for their country. So all of those things have to be addressed in a continuum, and we should not be so dystopian that we think we can't do these things, but it requires really sorting through what is being said about what the companies can do and what is expected of them from a regulatory viewpoint. It's the same situation as Big Oil or Big Pharma or any big industry that is essentially lying about its value to society and then trying to avoid actual accountability.

Lizzy Shackelford: Now, some countries have responded to these challenges. Particularly with TikTok, we've seen different countries banning the app. With Facebook and social media more broadly, we've seen blackouts of the internet during sensitive times like elections. I know there's not a lot of enthusiasm for that, particularly among many publics, because it blocks off these opportunities for communication. But Leah, what do you think about the approach of banning or cutting off the internet at different times? And alternatively, what are some policy decisions you'd like to see your government make to address this better while you're taking the litigation route in the courts?

Leah Kimathi: I must begin by looking at what's happening in Kenya right now. As we speak, we have an ongoing conversation, a petition really, from some quarters in Kenya around banning TikTok because of the really bad content that children, especially, are being exposed to. So there have been legitimate concerns from sections of the country saying that we need to ban TikTok, but from where we sit, we are saying that that's not the way to go. You ban TikTok today, and then tomorrow, what happens when another platform comes? Are you going to keep banning left, right, and center? We are saying that we need more sustainable ways of addressing the challenges of social media platforms and the harms that proliferate online. And that is why we are saying that, first, we need to address issues around systemic racism: the fact that there is so little investment by these platforms to ensure platform safety for Kenyan users. We need to engage with that, whether we are talking about content moderation or about the political will of these companies to implement and enforce their own community guidelines. But then we are also saying that we need to go a step further as Kenyan users, and that step is looking at our laws. Yes, we agree that they are progressive, because that is how we find ourselves with four court cases ongoing. But we do need to look at the existing legislation and ask ourselves, can we make it better? Can we, for example, borrow from what's happening in Europe with the Digital Services Act? Can we look further afield, borrow, and strengthen our regulatory environment? All of this, obviously, is based on an effective demand structure. That is why we believe Kenyans must rise up and demand to be treated with dignity by these social media companies, because the only time they will listen is when Kenyans demand, as part of the community of nations, the same dignity with which these companies treat other countries. But we also recognize that it's a business; they are looking to get as much as they can out of us with as little investment as possible. These are trillion-dollar companies. And that is why we are saying that, to strengthen our bargaining power, we need to come together as a region. Europe has its Digital Services Act as a continent. So, because of the nuances of the African continent, whether we are speaking about languages or other things, we need African countries to come together and begin developing continent-wide legislation that speaks to our nuances and brings a more formidable demand structure to these social media companies.

Lizzy Shackelford: Now, Leah, you're also saying that the Kenyan public, broadly speaking, is pretty aware that this is a problem.

Leah Kimathi: Yes, absolutely. We did a national opinion poll on social media in 2022, and what we found was very interesting: 70 percent of respondents said they have come across social media harms online. Going beyond that, we had again about 70 percent saying that social media companies are not doing enough to regulate themselves. And out of that, we had a further 59 percent saying that government needs to step in and regulate social media companies, without overreach. Now, for the Kenyan population to say that they need government to step in and protect them shows you that there is a serious problem that they acknowledge and are able to see with these social media companies. That is why, as the Council for Responsible Social Media, what we are doing is bringing different sectors together, spearheading the conversation around social media harms in Kenya, and building this effective demand from the ground up. Because we believe that's the only time social media companies will listen, and perhaps then also work with public regulators to strengthen the regulatory environment.

Lizzy Shackelford: I have to second you on that. Having lived in Kenya myself, the fact that there is public demand for government action on this means the public is quite concerned with it, which might be a good sign for organizing around it. Kristina, a question for you: do you think the EU is the leader so far in how to address this? Or are there other countries with examples the rest of the world can learn from on how to either press for self-regulation or otherwise enforce better standards on social media companies?

Kristina Wilfore: We have a lot of experimentation going on, and to be clear, this is a David and Goliath fight. I mean, look at the gaslighting, especially in the US, which has been part of the reason why there's no substantial legislation curbing the behavior of the platforms: they've spent upwards of 650 million dollars lobbying against it, with ads that say they want to be regulated, right? And buying up everybody in town, every lobbyist there is. That makes it difficult for any government of any size and shape to carve out specific remedies, but there are lots of efforts going on. I think it is only a matter of time before policymakers wise up, and before the kind of public Leah was talking about, as consumers, gets smarter about how it is that they are the product and are being targeted. There's another investigation that #ShePersisted and Fumbua in Kenya were involved in, exposing the way that dangerous products aimed at women, products that can make them infertile, are thriving on Instagram and Facebook. At least YouTube admitted it and moved to demonetize some of this, so good for Google there, but we're still waiting for Meta. I think when people understand this is not just debates around elections and misinformation coming from your crazy uncle, but life-and-death matters that affect the health, well-being, future independence, and autonomy of a country, then you see movement. But it's a difficult path. We can look to Australia and Canada, which passed provisions to provide economic support back to the journalistic entities, the news properties whose content Facebook shares as a company, having essentially an amount of the profit gained from using that material go back to those institutions. And in fact, what you've seen in response is manipulation by Facebook and punitive efforts to really put the pain on the public. So it's one step forward, two steps back. But the Digital Services Act, now that it's law, is really the gold standard and test case, because of the way it shifts the burden of responsibility onto the platforms to show that their tools and their products are in fact not causing harm. We'll see how that unfolds in the pending European Parliament elections in 2024, as Kremlin-related mis- and disinformation is monitored and shut down, and what this says about those commitments related to bigger issues, of course, around the war and conflict that are also hitting the European marketplace.

Lizzy Shackelford: Well, just as we wrap up this conversation, Leah, I've got one more question for you: what tools are available for activists and advocates who are concerned about this issue to try and minimize these dangers in their own states?

Leah Kimathi: I think the first notion we need to disabuse ourselves of is that the platforms are benign actors merely providing a space for communication, while the users are the bad actors who post nefarious information. We need to disabuse ourselves of that and recognize that the platforms, through their algorithms, amplify the bad material, the harms that are then found on their spaces. That recognition is extremely important. The second is that there is a real relationship between online harms and offline, physical harms. I think we saw this during elections in Kenya, and we continue to see it even with terrorism. So countries must seriously treat the prevalence of online harms as a security risk, as a national security risk. Now, the third thing is that we are not dealing with small companies; we are dealing with trillion-dollar companies, and especially in Africa, we are dealing with trillion-dollar companies that will ignore us for as long as they can continue to ignore us. We need to surface evidence of these harms, and then we need to create alliances that are both local and global, so that pressure can be brought to bear on these social media companies, both from here and externally. They may not listen to us here in Kenya, but if we have alliances, perhaps with other actors in the Global North, they may listen to those.

[Lizzy Shackelford OUTRO: Leah Kimathi and Kristina Wilfore -- I want to thank you both so much for coming on Deep Dish to explore how Big Tech is failing the Global South and helping fuel terrorism and many other societal problems across the world.  

And thank you for tuning in to this episode of Deep Dish.  

A reminder that we want to hear more from you, our listeners. So, send us an email or, better yet, a voice memo, to deepdish@globalaffairs.org -- You can suggest issues you’d like us to cover, guests you’d love to hear from, or you can just let us know how you think we’re doing.    

If you are looking for more Deep Dish in your podcast diet, tap the “follow” button in your podcast app so you get each new episode as it’s released. And if you think you know someone who would like today's episode, please "share" it with them!  

As a reminder, the opinions you heard belong to the people who expressed them and not the Chicago Council on Global Affairs.  This episode is produced and edited by Kyra Dahring and mixed by Frank McKearn from Aphorism Productions. 

Thank you for listening. I’m Lizzy Shackelford and we’ll be back next week with another slice of Deep Dish.]

About the Experts
Leah Kimathi
Social Researcher and Activist
Leah Kimathi is based in Kenya as a social researcher and activist with a deep interest in governance in Africa, elections, peace and security, and diversity and inclusion. She is also a founding member of the Council for Responsible Social Media.
Kristina Wilfore
Co-founder, #ShePersisted
Kristina Wilfore is a seasoned international development, elections, and civic advocacy professional who has worked in over 25 countries for more inclusive and responsive democratic movements. She is also the co-founder of #ShePersisted, the only global initiative uniquely dedicated to tackling gendered disinformation and online harms against women in politics.
Elizabeth Shackelford
Former Senior Fellow, US Foreign Policy
Elizabeth Shackelford, a former career diplomat who served with the US Mission to Somalia and the US Embassy in South Sudan, focuses on building awareness and understanding of a "restraint" approach to foreign policy, which seeks to limit the use of force to core US security interests and favors diplomatic engagement.
