Dark Arts

How are authoritarians using artificial intelligence for political repression? Steven Feldstein on the global spread of anti-democratic applications.

In March, the Reuters news agency published a [review]( of more than 2,000 Russian court cases showing security-camera footage and facial-recognition technology having been used in the arrests of hundreds of people. Initially, authorities were using the technology to identify and detain people who'd joined various anti-government demonstrations, but after the invasion of Ukraine last year, they started using it to intercept protesters and prevent them from demonstrating at all. Now they're using it to spot and whisk away opponents of the Kremlin whenever they want.

It's a remarkable story, and just one in a developing pattern of autocratic regimes using technologies powered by artificial intelligence to clamp down on their populations. What's the extent of all this?

Steven Feldstein is a senior fellow at the Carnegie Endowment for International Peace and the author of [The Rise of Digital Repression: How Technology Is Reshaping Politics, Power, and Resistance](. As Feldstein explains, repression-enabling AI applications have become key elements of the authoritarian repertoire globally. Autocrats have invested heavily in them because, although they've insulated their power and rolled back democratic movements in recent years, they still understand that the biggest enduring threat to that power in the contemporary world is their own people, either rising up in revolutions or voting them out in elections. And the biggest emerging opportunity to control people is by connecting the digital environments they increasingly live in to state surveillance systems powered by AI.
This article is part of a [series]( in partnership with the [Human Rights Foundation](. Feldstein will be a speaker at the [Oslo Freedom Forum]( in June.

J.J. Gould: How have autocratic authorities been using artificial intelligence?

Steven Feldstein: There was a moment in the early 2010s when new digital-information and -communication platforms, social-media applications especially, had started to play this remarkable role in helping civilians around the world mobilize and challenge the autocracies they were living under. We saw this from the color revolutions in post-Soviet Eurasia through the Arab Spring. And it led to a lot of optimism that these liberation technologies, as they were called, would help propel a new wave of democratic revolutions globally.

What's actually happened, alas, is that autocratic governments have figured out how to use new digital technologies, AI applications especially, to repress their citizens more effectively, undercut emerging liberation movements, and reinforce autocratic political power. We've seen this above all in China, but also in Russia, the Gulf states, and other authoritarian and illiberal regimes.

The range of applications has been expanding, but a few use cases stand out. One is tracking popular discontent and, when it comes to it, controlling mass protest. That can work in a number of different ways. It can work on a mass scale through automated social-media monitoring, interpreting what people are thinking from what they're saying online. It can work through public-surveillance cameras and other ways of seeing when and where people are gathering, and then preempting political demonstrations or arresting people who participate in them. We've seen this increasingly in Russia, for example, in the techniques Moscow uses to pick up and neutralize anti-war protesters.

A second use case is maintaining control in an area of a country where the state is experiencing unrest.
China's Xinjiang region, where there's ongoing dissatisfaction and pushback against Beijing among the Uyghur population, is a prominent example. Here, Chinese authorities continue to use traditional autocratic repression tactics, including brutal reeducation camps. But they're also supplementing these traditional tactics with advanced machine-learning technologies, facial-recognition platforms, biometric scanning, genomic surveillance, and so on, which the regime can integrate with an information-management system that enables predictive policing carried out by tens of thousands of security officers.

A third use case is super-enhancing propaganda and disinformation. Where an autocratic regime has a constitutional obligation to hold formal elections, for instance, and where it might conventionally have rigged those elections through methods like ballot-stuffing or voter suppression, it's now more and more likely to augment those methods with AI technology. It can collaborate with bot and troll armies to spread approved messaging. It can identify and engage key social-media influencers. It can leverage social-media platforms to push out automated, hyper-personalized disinformation campaigns. And it can use deep-fake technology to generate ever-more-realistic audio and video forgeries to discredit challengers.

Gould: Why the demand for repression-enabling AI technology? Haven't traditional autocratic means of repression worked just fine in preventing a new wave of democracy globally?

Feldstein: From the end of World War II through the late 1980s, the most common way for an autocrat to lose power was a coup, which is to say, by being forced out by elite competitors internal to the regime. After the Cold War, the threat started to shift toward popular challengers external to the regime: namely, toward mass revolts or electoral defeats.
The implication for autocrats has been a need to focus on controlling those challengers, by repressing popular civic movements and manipulating elections. There's a very direct logic aligning this perceived need with AI technologies. Applying these technologies to political repression can be expensive, but in the end it is not as expensive, in terms of either resource costs or political risks, as relying wholly on the manual work, as it were, of old-fashioned security forces. So in the context of the shifting threat to autocrats, AI technologies have improved both the effectiveness and the efficiency of political repression.

More from Steven Feldstein at The Signal:

"Now you have governments all over the world wanting to buy the technologies that enable this monitoring and management, and you have companies all over the world manufacturing and selling them. Autocrats from the Gulf, like Qatar, or from the Middle East, like Egypt, or from South Asia, like Pakistan, are all to some extent importing Chinese technologies. But they're also importing technologies from companies in other countries, many of which are democratic. The world has become a lot more complicated in that way."

"There's understandably a lot of concern in democratic societies about the kinds of social control the large language models that power AI will potentially enable, both at an industrial scale, in spreading bad information, and in ways that are remarkably customized for persuasion at the individual level too.
There's also a lot of concern about the ways large language models are starting to power surveillance techniques in criminal detection and law enforcement, with the use of these techniques already, in some cases, racing ahead without regard to any regulations that lay out standards and norms for what's private or secure and what isn't."

"You can, to some extent, deny the most advanced technologies, which are the hardest to acquire, to those who'd do the most damage with them against their citizens. As an example, in the U.S., an executive order on spyware recently came out … that would deny any kind of market ability within the U.S. to companies selling these tools to regimes with bad human-rights records. That creates a big incentive to stop doing it. Does this mean those countries won't be able to attain the tools at all? Probably not. They can probably get something like them from someone else. But it makes it harder, it makes it more expensive, and it probably thwarts these countries from being able to use the most top-of-the-line capabilities they'd otherwise want to use."

[Continue reading ...](