Podcast

The State of Authoritarian Tech | Steven Feldstein

Authoritarian regimes are upgrading their playbook — from surveillance cameras and spyware to algorithmic censorship and AI-driven policing.
Oct 2, 2025 · 59 min read

David:
[0:03] Steve Feldstein is a senior fellow at Carnegie's Democracy, Conflict, and Governance program, where he focuses on the intersection of technology,

David:
[0:09] human rights, and global politics. He's also the author of two books: The Rise of Digital Repression: How Technology Is Reshaping Power, Politics, and Resistance, which came out in 2021, and an upcoming book, Bytes and Bullets: Global Rivalry, Private Tech, and the New Shape of Modern Warfare, coming out next year. We wanted to get Steven on the show today to inform us about a subject that we think is just important to keep our eyes on, which I'll call the state of authoritarian technology, or repression technology,

David:
[0:39] technology that assists authoritarian states in the repression of their citizenry. Steven, to start off this interview, what is the outlook for repression technology? I'll admit, when I wake up in the morning, I don't really think about how mature or sophisticated this tech sector is on net. How capable is this tech sector today? How fast is it progressing?

Steven:
[1:00] It's progressing fast. First of all, thanks for having me on as well. I really appreciate it. The sector is progressing fast. It's an interesting arc that we've seen, I think. If you go back about 15 years, there was a moment where there was a real emphasis on liberation technology and the idea that different digital tools could really liberate and empower people to fight back against state control and so forth. And I think there's been a real reversal since then, that governments have gotten clued into the power of these technologies. These technologies continue to get more effective and cheaper. And things like surveillance, biometrics, and so forth have become ubiquitous around the world, no matter where you are. So I think what's interesting is that they've become more common, they've become cheaper to use. And, you know, this technology is all around us to a much greater degree.

Ryan:
[1:52] Steve, is there like a taxonomy that you think about when you think about, you know, repression, digital repression technology? Like, what are the different categories in your mind?

Steven:
[2:02] Yeah, there's a few. So, I mean, I would start with surveillance technologies being one. And within that, there's even a few different types. You have mass surveillance technologies, things like public facial recognition cameras that are used to see who is gathering where. You also have social media surveillance technologies that try to mine through millions of different messages to see what types of sentiments or political challenges are happening. And then you have targeted surveillance, things like spyware, where you have governments that are able to use software to eavesdrop on conversations, on emails, and so forth. So that'd be one category. And then another, sort of related to that, is censorship, the idea that you're trying to stop and control information from spreading or being produced. And so that could be everything from China and its Great Firewall stopping Western platforms or media from coming in. Or it could also be something where internally you're trying to police what people are able to say, and you take keywords that people are using and you essentially downrank or delete those messages in an automated way. I think the third thing is actually disinformation. So it's the idea, and you see this especially in a lot of different states, where they try to manipulate information for political ends. I wrote a chapter in my book about the Philippines being a prime example of that, where you had the former president, Duterte, really use and create an entire disinformation network that was meant to tarnish opposition politicians,

Steven:
[3:29] tarnish civil society, journalists, anyone who would challenge the government narrative about what he was doing.

Steven:
[3:36] And then internet shutdowns is another one I would also put in this kind of category. It's essentially the idea of trying to disrupt or cut off connectivity, especially when you see instances of large protests or momentum gathering behind challengers. And to some extent, the ban on social media platforms that just happened in Nepal last week, which then galvanized mass protests and the fall of the government there, is an example of an attempted use of shutdowns.

Steven:
[4:06] That backfired and actually led to significant political change. So that's kind of in a nutshell, the universe.
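
To make the censorship category concrete: the keyword mechanism Steven describes can be sketched in a few lines of Python. This is a purely illustrative toy, assuming a feed that ranks posts by a score; the terms, weights, and behavior are invented, not drawn from any real platform.

```python
# Toy sketch of automated keyword censorship: posts with blocked terms are
# deleted outright; posts with "sensitive" terms are quietly downranked so
# few people ever see them. All terms and weights here are invented.

BLOCKED_TERMS = {"protest tomorrow", "vpn download"}    # hypothetical
SENSITIVE_TERMS = {"corruption", "nepo babies"}         # hypothetical

def moderate(post: str, base_rank: float) -> tuple[str, float]:
    """Return an action and the rank the feed should actually use."""
    text = post.lower()
    if any(term in text for term in BLOCKED_TERMS):
        return "delete", 0.0
    if any(term in text for term in SENSITIVE_TERMS):
        return "downrank", base_rank * 0.05  # buried, not visibly removed
    return "allow", base_rank

print(moderate("Another minister's kid flaunting wealth #corruption", 1.0))
# -> ('downrank', 0.05): the post stays up, but almost nobody sees it
```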

Ryan:
[4:13] Steve, can you actually talk about that, the Nepal case? Dave and I were kind of tracking this and understand some of the anti-authoritarian technologies that were used by some of the protesters, and I guess, you know, revolutionaries we should call them, because they basically led a revolt against the incumbent government. So they were using, you know, encrypted communication technology, things like Signal, things like Telegram. There are reports of them using cryptocurrency, because some of this censorship and repression obviously doesn't just shut down communication networks. It also debanks people and shuts them out of the financial system. But can you just talk about that? Because that's been in the headlines recently, although I don't think mainstream media has covered it much; there's a lot going on in the Western world right now to report on. So I feel like the craziness that's going on in Nepal has maybe been underreported. But what are the facts there, and what's the situation like?

Steven:
[5:10] Yeah, it's been really interesting. And I've actually commissioned a couple pieces via a network I run at Carnegie to write more about it. So that's coming out in the next few days. But essentially what you saw is that for a number of weeks you saw a rising level of anger. And this is really Gen Z oriented. So that was one of the things that was really important: the bulk of the people protesting, spreading messages, and so forth were of a younger generation that were finally sort of coming into their own and were really angry about a couple things. They were angry first and foremost about corruption, and they were particularly incensed, they had this hashtag, Nepo Babies, looking at politicians' children and all the kind of wealth that they were flaunting. In a country that is extremely poor, where people are forced to leave and become migrant workers in other countries, you had, you know, these children of the very small elite of wealthy politicians going out in fancy cars, showcasing luxurious lifestyles and so forth. So that starts.

Ryan:
[6:07] Chronicling all of this on social media, I imagine, right? Just kind of like influencer culture, but you can imagine this for the Nepo baby elite, you know, the children of the politicians.

Steven:
[6:17] And so that started to gain momentum. And, you know, what happened was that there had already been an attempt by the government: they had sort of passed a law that would essentially force platforms, all sorts of external platforms, from Meta to Instagram to X and so forth, to register. And so then they decided to enforce this law and immediately suspended 26 or so platforms from operating in Nepal. Now, the government claimed at the time that they were doing this strictly because they were trying to enforce the law. What everyone in the country more or less saw was that they were trying to silence this growing anti-corruption movement that was starting to look very dangerous for the government. So they suspended these platforms fully. And then immediately what happened was people became even angrier about that. And they started coming out on the streets. And then you started seeing violence occur. And so day one, I think 19 people were killed.

Steven:
[7:13] Immediately, the government backtracked. They said, okay, we're not going to suspend these platforms. But at that point, it was too late. You know, people were so angry. More and more people joined. The second day, more people died. Buildings started to be burned. Immediately, the prime minister resigned. And then all of a sudden, you had sort of a change. And what was interesting is that one of the platforms that was used, I think for the first time I've seen it used as a way to organize a mass amount of people online, was Discord. And so Discord was being kind of leveraged, and, you know, it was kind of open, not encrypted. There was a lot of use of encrypted apps, Signal, Telegram, and so forth, for different people to communicate with one another. But when it came to actually trying to make political decisions amongst everyone, thousands of different activists and their supporters, they, from what I understand, used Discord as a way to sort of make key political decisions and to figure out who would sort of step in to run the country in this vacuum that had occurred. And I would assume at this point, they're still using Discord to make decisions on a day-to-day basis.

Steven:
[8:13] Which is wild.

David:
[8:14] I know we're at the very beginning of this episode, and it's all about authoritarian repression technology, but I could imagine in an alternative universe, or maybe Steve, we just have you back on, we talk about the inverse, which is, what would you call this, freedom technology?

Steven:
[8:26] Liberation.

Steven:
[8:27] Liberation technology.

David:
[8:27] Liberation technology, yeah. And like it turns out, Discord might actually just be in the toolkit for preserving one's freedoms. Using this example of Nepal: you just gave us a taxonomy of four different categories of repression technology: surveillance and spyware, censorship and firewalls, disinformation, and internet shutdowns. Can you talk about the government of Nepal before it was overthrown, the actions that it took, and which categories those map to?

Steven:
[8:53] Yeah. And you know, what's interesting about Nepal. So in terms of what Nepal primarily used, I would say surveillance less, because surveillance tends to be something that can be used in more sophisticated ways. Nepal is not that well-resourced of a country, so it tends not to have access to the most sophisticated, advanced tools. We're not looking at a Russia. We're not looking at a China. We're looking at a place that's very poor, even amongst its government. And so censorship, first and foremost, using kind of lawfare in combination with more technological censorship means, was classically used in the Nepal case, right? So essentially, you know, they said, look, we have a problem in terms of people saying things that are challenging the government online. A, we're going to use a law that clearly isn't very democratic, and we're going to weaponize that to try to stop communication. And then we're going to actually sever the people's ability to access these apps completely, using this kind of internet shutdown type of thing. To the extent that they have been able to use disinformation techniques, it doesn't seem that that has been particularly effective. I mean, that's something that actually takes a lot of investment and time. It's not something you can just sort of throw in the mix with zero followers, or maybe one influencer who you're paying off, and all of a sudden, you know, have an impact. So in this case, I would say it was primarily censorship, first and foremost, with internet shutdowns as their modus operandi.

David:
[10:18] So between these four categories, again, surveillance, censorship, disinformation, internet shutdowns, is there any one category that scares you the most? Or is it more about the effective coordination of leveraging all of them in concert with each other? If we're talking about the archetypal authoritarian state, it's not really ever just going to be one. It's more about leveraging these all skillfully in a coordinated fashion to actually do successful repression. Is that kind of the idea?

Steven:
[10:50] Yes, right. It's a system of repression, right? It's like a structure that you put in place, and you use and balance different tools, because each one can serve a different purpose, but on their own, they don't necessarily bring about the effect you want. In combination, they can be very potent. So if you think about it, you know, surveillance can be helpful to understand what specific people are saying, on the one hand, or to understand more broadly if there are challenges bubbling up from a population. Like, you know, is there a large amount of discontent when it comes to corruption issues? Okay, that actually can be very useful for a regime to understand: oh, there's a vulnerability, we need to address it. On the other hand, if they say, oh, these four or five ringleaders are really leading the charge online or offline when it comes to organizing protests, let's pick them up. Let's imprison them, right? So there you have a combination of, kind of,

Steven:
[11:42] surveillance together with, you know, frankly, just basic repression measures like imprisonment. Censorship is also useful then, once you kind of get a sense that, okay, we have this bubbling discontent. Well, what do you do about it? I mean, one way to deal with it is to try to stop it from coming about at all. And in China, that's a very typical go-to strategy. We have dissent bubbling up. Could be about environmental issues. Could be about corruption. Let's try to stop it, quash it, either by just taking those accounts offline or using other kinds of means. Or another option is to use disinformation. Distort the message. Try to turn it back on the people who are pushing this out to begin with. Claim that the people who are saying that the government is corrupt are actually being paid by foreign governments themselves, and that they themselves are corrupt. But use different means to distort the message and to make it so people have a tough time discerning what's true and what isn't. So that's how the combination of effects works. It's not any one thing per se, but it's sort of adapting. It's kind of using physical and digital repression techniques. It's trying to kind

Steven:
[12:49] of work them through in a dynamic way.

Ryan:
[12:51] Yeah. So it almost seems like it is the case, Steve, that the scariest thing is the level of sophistication with which a country might apply this digital repression stack, right? And there are various levels of sophistication. So it seemed like Nepal was using more of the blunt instrument of, like, oh, just censorship. Hey, you know, like, we don't like dissent. We're going to shut off the internet. But this is incredibly obvious to protesters what's going on. You shut off the internet, everyone in the country gets pissed. And it's very clear what you as the government are trying to do.

David:
[13:25] Exactly. If the government just does that, it's like, oh, yeah, we know we're doing bad things.

Ryan:
[13:29] Right, exactly.

Steven:
[13:30] It's an admission of defeat. Exactly.

Ryan:
[13:32] It's the more subtle things that seem kind of scary: where if you can just sort of bend the conversation that's already happening online in a certain direction, or if you can put a tinge of fear out there so that individual citizens self-censor, or if you can plant some sort of message out there that distracts people from what's really going on. And it strikes me that that's what a more sophisticated authoritarian nation state would actually do. They're not going to straight up ban the internet or ban particular apps. They're going to just shape the direction of the population in very subtle ways towards their coordinated end. And that's how a country like China might use this sort of tech stack.

Steven:
[14:19] Yeah. No, I mean, I think what it first and foremost comes down to is, I would say, look at the underlying system, look how well it's resourced, and look how coercive it is. And China really, you know, fits the bill in all respects. It's sophisticated, it's well-resourced, it's highly motivated to repress. It has a tradition of it as well, being an authoritarian country. So it can bring all these elements together in a pretty significant way. And what it does is, you know, a little bit of both. So you don't see internet shutdowns happen anymore in China. I don't think there was a single one that occurred in the last few years, which is sort of interesting given how much censorship there is. But what they have done is, first of all, they made a decision 20 years ago or so to not allow Western platforms in, at least without using a VPN, right? So the ordinary citizen, without sort of taking extra efforts,

Steven:
[15:07] can't access outside information. So that was sort of step one. And then what that allowed them to do is, well, you have this vacuum, because people still want to communicate and they want to do things. So create our homegrown substitutes, create an alternate ecosystem of Chinese apps that we can control. And we can control through very quietly coercive means, you know, we pressure companies, or even just blunt means: we essentially put in place filtering algorithms that say this type of speech is not allowed and this type is. And if we start getting too many hashtag-corruption messages, we just make sure that we delete them or downrank them, right? And then you find other ways, as you mentioned, in terms of using subtle disinformation and distortion techniques, kind of pushing out pro-CCP messages and so forth. And the combination of all that has made it almost impossible for China's ecosystem to actually be something that's open or democratic or pluralistic. And you see that similarly play out in places like Russia. Iran is trying to do the same thing as well.

Steven:
[16:06] You know, some of

Steven:
the Gulf states have been pretty successful in terms of creating closed environments. But, you know, these are all places that are highly motivated, have a lot of resources, and are willing to deploy different types of techniques

Steven:
[16:17] in combination to push back against kind of a more open digital environment.
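
A rough sketch of the volume-based suppression Steven describes, where individual posts are tolerated until a topic starts surging, might look like the following. The threshold is invented for illustration; a real system would layer this with manual review and far richer signals.

```python
# Hypothetical sketch of trend suppression: hashtags are left alone until
# their hourly volume crosses a ceiling, and only then are the posts that
# carry them downranked or deleted. The threshold is invented.

from collections import Counter

SURGE_THRESHOLD = 1_000  # hypothetical posts-per-hour ceiling per hashtag

def surging_tags(posts_this_hour: list[str]) -> set[str]:
    """Return hashtags whose hourly volume crossed the ceiling."""
    counts = Counter(
        word for post in posts_this_hour
        for word in post.split() if word.startswith("#")
    )
    return {tag for tag, n in counts.items() if n > SURGE_THRESHOLD}
```

The design point is the subtlety Ryan raised earlier: nothing is visibly banned, and the system only reacts once discontent starts to coordinate.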

David:
[16:22] Is there like a central hub of this industry, in the same way that Silicon Valley is the center point of social media platforms and more recently AI, or you have Virginia, Maryland, Washington, D.C. for the military-industrial complex? Where is the locus of this technology, or is it just kind of spread out all over the world?

Steven:
[16:45] Yeah. That's an interesting question. I mean, I think on the one hand, you can make a fair argument in two respects. You can make an argument, one, that there's a lot of authoritarian learning taking place. So people look to models, and China has been the preeminent model. And they say, oh, how did they do that? Do we want to have a great firewall? What are some of the other techniques that China has used to kind of control its environment? And so different aspects of that can be borrowed. And I would say, in addition, China has been known to export a lot of its authoritarian technology to other countries. And so you do have a situation where facial recognition systems are used in places that may not have much of a business having such a system. Places like Uganda, Kenya, or Serbia have Chinese facial recognition systems that are used.

Steven:
[17:30] And so you do have a sort of export strategy. And so you can make a fair argument that, you know, China itself and some of the different cities within it, Hangzhou, Beijing, and so forth, are kind of places where this repression technology lives. But the other point that I've tried to make, and that I think people have figured out, is that it's not just Chinese technology that's used. A lot of Western companies also have been responsible not only for powering China's own repression, but for exporting different parts of the stack that are then used for repressive means. And, you know, certainly when we're talking about something like disinformation or propaganda as a tool of repression, like in the Philippines, that was all Meta.

Steven:
[18:08] That's a U.S. company. You know, it had nothing to do with China. In fact, in my case study, everything there really centered around a couple of social media platforms, frankly, largely Meta to begin with. So, you know, the idea that it's strictly something that emanates outward from an authoritarian country, and that as liberal democracies or quasi-democracies we're sort of vulnerable to that, that's not really how it works. It really kind of goes out in different ways from different sources, both Western or U.S. and democratic, as well as authoritarian.

Ryan:
[18:38] In the case of something like Meta in the Philippines: the way we experience Facebook in the US is basically as a reasonably open platform. Of course, they have some algorithmic selection as far as which items are pushed down to our timeline, that kind of thing. But to the best of our knowledge, it's not censored in any way. When they export Meta or Facebook to something like the Philippines, what, do they just give the government basically a toolkit, some toggles to switch on, like, you know, authoritarian mode or something like this? Is that basically what they're doing? They're taking the structure, but then they're customizing it, giving the government the ability to customize it. And I guess in defense of that, on the Meta and Facebook side, they would say, well, every nation has its own specific laws, and we can't take our American Western values and enforce them on another country. So of course we have to localize our platform for the laws in a particular nation state. And somebody could listen to that argument and reasonably agree, and they could say, yeah, this is just Facebook, I guess, giving nation states the ability to kind of customize things for their local traditions or culture or laws or whatever. But on the flip side, what if the laws are authoritarian? Then they're just basically arming governments with a toolkit to censor, to tweak the algorithms. Is that basically how it works?

Steven:
[20:06] Yeah, you know, it's a complicated issue, because it's really kind of cat and mouse. And I think it's probably overly simplistic for me or for others to say, well, the reason why you had propaganda that spread out in the Philippines is due to Meta. Meta was the means by which that happened. Meta was the intermediary, but the source and the motivation came from the government. I mean, I think the first and most interesting thing people point out, when it came to Meta's experience in the Philippines and with Duterte, the prior president, in particular, was that they went to his campaign when he was sort of a long-shot presidential candidate. They offered this to all the presidential candidates, and he was the only one who took them up on the offer. They said, look, we can be really helpful to you. You can advertise politically here. You can use our platform, these different tools, to promote your message. And then, you know, if you were to win, we can continue working with you to show you how Meta can be helpful to your own political agenda. And so he said, yeah, this sounds great. He met with the Meta representatives. He sort of incorporated their ideas. He was able to wield Facebook really successfully. In fact, some people argue that it was because of that collaboration, and because of his adept use and leveraging of Meta, that he won the presidential campaign, period.

Steven:
[21:21] And then from there, he obviously said, well, this is a potent tool. Let me keep using it. So, you know, part one was the fact that they went to him and they went to other political candidates. He was smart enough to use it. And Meta was fine with him manipulating or sending out bad messages, because they said, look, that's more eyeballs. That's more revenue for us. So we'll take a hands-off approach and, you know, just see what happens out there. That was one issue. And then I think the second thing is that once you started seeing an ecosystem that became pretty poisoned by really violent rhetoric, by harassment and so forth, particularly geared in a political way to tarnish and smear enemies in an anti-democratic fashion, Meta did very little, right? And if anything, I would say the fact that he was able to so adeptly manipulate algorithms, bring on influencers, and orient viewers towards his messages was something that maybe Meta would have wanted to think about, or at least think about those big influencer users who were spreading those messages, and whether that's something they wanted to allow to remain on the platform. So at any rate, that happened. We can sort of debate to what extent they were culpable, to what extent you should just allow for an open environment and kind of let things go. Either way, without Meta there, that would have taken away a tool that person used in order to spread his brand of politics.

Ryan:
[22:43] I guess the question, though, Steve, is: what should they have done? What should Meta have done, right? Should they have just not played ball at all and withdrawn the Meta platform, Facebook, entirely from the Philippines? This sort of reminds me of the discussion from, I don't know, over a decade ago now, of Google withdrawing from China. And basically what happened is Baidu became the default search engine for China, because Google withdrew, because they didn't want to customize their search results for the CCP. Is that what you're essentially advocating? Is that what a Western tech company should do: just stand firm on open-internet Western liberal values, not allow the customization of these platforms, and just withdraw if they have to?

Steven:
[23:29] Yeah, I would. I would do that. I mean, I think you got to

Steven:
draw a line.

Steven:
[23:32] I mean, first of all, the argument about local laws, I remember hearing this argument quite a bit when I served in government as the democracy and human rights official for Africa. I would travel around the continent, and everywhere I would go, they would say things like, look, you're not respecting our local laws and traditions, which are authoritarian, and your liberal values don't comport well in our countries. And it's funny, because I would talk to many different, you know, I'd talk to college students, young people, ordinary citizens, civil society groups in those same countries. They'd say, what are these guys talking about? Of course we want to be able to express ourselves as we want. So this idea that, somehow, local laws or different cultures are pro-censorship and pro-authoritarian, I mean, I think that's a fig leaf that is made to put people on their back foot. What I always said then was that it's about universal values, not about US-imposed values.

Steven:
[24:21] Generally speaking, everyone universally, you know,

Steven:
[24:23] believes in freedom of expression, freedom of association, and so on. And so that's what we're trying to espouse. And I think if you go into a place and there's a gun to Meta's head, or to another platform's, and they say, conform or leave, I would say, follow the universal principles. Say, we're going to follow these principles; if you don't want us there, we'll find ways to work around your restrictions, but at the end of the day, we can't do it. I would rather not be complicit with the CCP and get out, as Google did, and let Baidu go in there, because ultimately I think China would have replaced Google with Baidu anyway, as opposed to sort of trying to play ball with a clearly authoritarian, repressive enterprise that over time would boot you out once it had the opportunity to do so.

David:
[25:08] One of the reasons why I feel this problem is so pervasive, so difficult to go up against, is that the customers of repression technology are the world's largest, most wealthy entities ever: nation states. Not only are they the most wealthy entities ever, they can all print money, and if your customer can print money out of thin air, it's a pretty good customer to have. And so I'm assuming there's a dynamic here like the military-industrial complex: the government wants a very powerful military, and it contracts out to the free market to service and provide the world's most deadly weapons. I'm assuming there's a very similar industrial complex between repression technology and nation states. But Steven, other than the United States companies that manufacture weapons, Northrop Grumman, Lockheed Martin, all of these, and then Meta as well, which is repression-adjacent at the very least, I couldn't really name you one other company that's dedicated to being a country's vendor of repression technology. Are they out there? Am I just naive to them? Are they just not common knowledge as much as other companies? Like, what's out there?

Steven:
[26:29] Yeah, there's a lot. Part of it is that it's a pretty diffuse marketplace. So it kind of depends on what type of technology you're looking at. I'll give you a classic example for spyware: NSO Group, the Israeli manufacturer. I mean, they've been implicated, and if you're in this world, NSO Group is known to everyone. They're bad. They've been implicated in everything from spying on slain Washington Post journalist Jamal Khashoggi's wife to many other scandals like that, spying on all sorts of politicians, heads of state, and so forth in illegal ways. So they're an example. To me, they're a quintessential example of spyware, targeted surveillance, being used, and of the need to put in place rules to restrict their products. Hikvision is another example, a Chinese company. They make facial recognition cameras. They're a key part of safe cities. They often work very closely together with Huawei, which is the one that installs the safe cities completely. So if you look around the world at different places where you have repression and you have public surveillance goals, those companies are right out there in terms of selling repression technology.

David:
[27:36] Are they like dedicated repression technology sellers, as in that's their one thing? Or is it also like, Apple has fantastic facial recognition technology. That's how I unlock my iPhone. That's a service to me. I value that. That's good. I like that. But there's a double-edged sword there. They're also pioneering facial recognition technology. I don't know if Apple sells this technology to nation states, but you can see how there's a good side and a bad side to the same product, which makes it a little bit harder to point a finger at a company and say, well, you're bad, because you only sell repression technology to nation states. You just named three which are like, okay, maybe their only customers are the authoritarian sides of nation states. But I would guess it's also a little bit more diffuse, in the sense that it's just technology, and sometimes a purchaser like China could just use it for bad purposes. It makes it a little bit harder to pinpoint the actual nefariousness.

Steven:
[28:34] Yeah, no, a couple of points on that. It's a really interesting point that you raise. So on NSO Group: it is just a spyware manufacturer, so there's no dual-use kind of aspect to it.

David:
[28:43] That's their slogan. We make spyware.

Steven:
[28:45] We make spyware really well. And you can do bad things with it. So there's not much.

Ryan:
[28:50] Or you could do good things. I mean, is it so bad for an intelligence agency in a country to deploy spyware to just go disrupt a terrorist organization and infiltrate their group, right?

Steven:
[29:01] Well, that's the argument they make. I mean, there are these narrow exceptions. I mean, they're very narrow. They're very small. There's a whole legal framework: there has to be legal necessity, it has to be proportionate to the threat out there, it has to be, you know, finely tailored towards that threat.

Steven:
[29:19] And by and large, if you just look empirically at how it's used, the answer is that's not how it's used. So everyone uses the national security exception, but then they don't conform to the national security exception. So to me, that's kind of, you know, a bit of propaganda by NSO Group. But to the other point, let's look at Huawei or Hikvision and so forth. You're right.

Steven:
[29:37] It's dual-use. Huawei's a huge company, right? I mean, they make everything from semiconductor chips these days to phones, operating systems, and so forth; all up and down the stack, Huawei products are there. And frankly, a good many of them are used for benign reasons, for economic productivity and so forth. So to say Huawei is a digital repression company per se is not quite accurate. I think what a lot of people would argue, and this gets to the TikTok issue that we've seen come up, and other sorts of Chinese companies, is that because China's political establishment is so tightly linked to its companies, because they have a national security law, and they control how the products are used, and they can build in backdoors and surveillance as they'd like, and that's been sort of proven in different respects, there's an element where people say, well, if it's an advanced Chinese digital product, we can't trust it, period, in a liberal democracy; over the long term, it represents a threat to our ecosystem. And you're seeing that debate play out right now in the US, what to ban, how much to ban, whether to divest TikTok, among other sorts of platforms, because even if we haven't seen the threat manifest now,

Steven:
[30:50] there's such a possibility of it, because their system is so resolutely opposed to ours and they have such control over their companies, that we might as well assume the worst and go in that direction. That's the argument.

David:
[31:03] It's as if China has a controlling board seat over every tech company that comes out of China. And so if we adopt that tech company, TikTok, in the United States, maybe they don't have a backdoor now, but China's on the governing board. And so if TikTok influences the United States, well, that's actually just China influencing the United States.

Steven:
[31:25] Not only do they have board seats or implied board seats, I mean, they can talk to any board member there and say, hey, vote this way. But they subsidize them. They give billions of dollars. When Huawei sort of faced a death sentence because it was blacklisted, first under Trump and then under Biden, the reason the company was able to stay afloat was because the CCP gave them billions of dollars to keep them going. And there's a number of interesting books and reporting out there that sort of shows this. So it's not just that they're even coercing, they are handing over money and basically saying, well, we own you and we can decide what you do. So when the time comes, do X, Y, or Z. And that's certainly what happens.

Ryan:
[32:06] Who are some of the worst offending countries, Steve? So we talked a lot about China. Maybe we think of them when we think of authoritarian tech. But this is pervasive. It's not just China. What regions of the world are most authoritarian from a digital tech deployment perspective?

Steven:
[32:25] Yeah. Well, what's interesting is that when I first started researching digital repression, there wasn't as much of a neat overlap between just being authoritarian and using these techniques, in part because it was still pretty new, and you had a lot of countries that weren't kind of up to speed in terms of having a lot of people online or having acquired a lot of these technologies. But things move quickly. I mean, we live in a digital world now. People are connected all over the place. And so one thing I would say is just that, in general, any authoritarian country will also be digitally repressive; the two go hand in hand. I can't think of a country that is, you know, restrictive of political rights and things like that, that somehow has a free and open digital environment. I mean, sometimes you see a little bit of a discrepancy, and Russia is kind of an interesting case, although it's also closing things down completely in its digital ecosystem. But at this point, by and large, I would say find an autocratic country, find an area that has lots of authoritarians, and you will also see digital

Steven:
[33:30] repression. So, you know, we can look at a few places that are out there. The Gulf is one: highly authoritarian in terms of repressing civil and political rights, and also big purveyors of everything from spyware to surveillance to Chinese authoritarian tech and so forth. Russia, North Korea, you know, countries like that, that are classic dictatorships.

Steven:
[33:52] Rwanda is another good example if you're looking at Africa. The Middle East overall. You know, all these countries that repress rights, that are authoritarian, that incarcerate prisoners, also rely on these tools to try to control their populations.

Ryan:
[34:06] How about in democracies? How about in sort of the West? How about those of us who- Couldn't be.

David:
[34:12] Could never.

Steven:
[34:13] I don't know what you're talking about. It's not happening here.

Ryan:
[34:15] Is it? Right? It's just in the authoritarian countries?

Steven:
[34:17] Yeah. Look, these tools are used,

Steven:
[34:20] right? I mean, the argument that democracies would give is that in general, they are used under the rubric of the rule of law, so that however these tools are deployed, there's checks and balances, you have the right to appeal, you have to go through due process when it comes to prosecutions and so forth. But that doesn't mean that there certainly aren't abuses, and those come out quite a bit. And there's a lot of edge cases. I mean, that's the thing. One of the big criticisms is that the law hasn't caught up to the technology. So when it comes to social media surveillance, no matter what the type of investigation is, it could be the NYPD looking for a certain type of individual, it could be some other kind of law enforcement issue, and this particularly pertains when it comes to the border, there aren't clear rules in terms of what's invasive, what's a violation of privacy, and what is acceptable, particularly when it comes to online. And that's where I think we've really lagged when it comes to having privacy legislation, that kind of thing. So certainly in the United States and many other democracies as well, there is much more accountability in place. There's much more of an ability to push back and have redress when abuses occur. But the law isn't quite there in terms of how these advanced techniques are being deployed.

David:
[35:34] The rate of progress in this sector of repression technology, I'm assuming, is accelerating; technology everywhere is accelerating. And my understanding is that democracies just move so slowly. There's got to be a gap there between the rate of capability growth in repression technology and the ability of democracies to contend with these things, right?

Steven:
[35:54] Yeah, yeah, no, there is. And I think it's one of those issues where, you know, it's hard to create laws when you don't have consensus about what those laws should be. I mean, generally one follows from the other. You reach a kind of societal consensus and you say, this is bad, whatever it is, and now let's create a law to codify it. I think people are still trying to figure out what this stuff means. They're trying to figure out what role does artificial intelligence play in my life? What guardrails do we need to have in place around that? If law enforcement uses this technology for certain activities, what does that mean for me? Do I like it? Do I not like it?

Steven:
[36:29] What's the balance between addressing terrorist threats versus protecting privacy? People are still debating this quite a bit. And frankly, in the political environment that we're in right now, we're not going towards consensus. We're going towards more division and, you know, more pointing at one another. And so that's part of the issue. How do you create a law in a society that really is kind of moving away from consensus and moving more towards division and disagreement, especially when it comes to new tools?

Ryan:
[36:57] There's maybe two big explainers here as I look at this issue, Steve, and I'm wondering if this resonates for you. So the one explainer of why we're getting increasing digital repression across the world, even in our democracies, is the economics of it. It is so cheap to deploy digital repression at scale. It's so inexpensive and effective relative to the old analog ways of doing this, through some sort of secret police spying on everyone's dinner table conversation. We can have machines and databases, and we haven't even talked about AI, which we're going to, but all of these things can be listening in and deployed at scale. So there's an economic narrative, I think, with respect to this story, and that's got to be a big driver. The other piece of it, too, is I feel like even in the West, our civil liberties were basically dreamed up for an analog world, where we look at something like the First Amendment.

Ryan:
[38:00] Social media platforms, where do they fit into that? Where does the right to privacy sort of fit in? I mean, maybe the framers and the founding fathers didn't create privacy amendments in the Bill of Rights because spying on everyone was just impractical from an economic perspective, and we didn't have anything that could do that. There was no contemplation of the digital world and the world we live in today. So even in our Western democracies, in this transformation from the analog and the physical to the digital, we just haven't adapted the toolkit, the legal toolkit. And I think governments, I mean, when they get big, do they even have an incentive to do that? Or is their incentive more to kind of creep into those areas of our lives? So to what extent do you think it's those two stories, the economic story and just the transformation from this analog era to the digital, that explain a lot of the direction of travel here?

Steven:
[39:03] Yeah, I think that you've really nailed it. So there's the political economy issue, right? The fact that you can do something cheaply at scale, and so why not? It's like the idea that if you're trying to search for a suspect, why would you, you know, call phone companies and look for cell phone records when you can just go online and sort of immediately look for a specific individual with the characteristics in place and use an algorithm to quickly narrow it down to a small number of people? I mean, it just makes sense to do that. But you've got to have laws in place as well to make sure that you're not getting false positives and to make sure that you're conforming to, you know, basic civil liberty protections. And that's kind of where things fall down. So actually, that gets at both issues that you mentioned: both that the law hasn't caught up, and also that these are tools that are relatively frictionless to use and very effective in terms of what they accomplish. I mean, the other thing I would just also say is that, you know,

Steven:
[39:59] part and parcel of this is that we live in digital societies. The way we as humans interact is far different. So it doesn't make sense to go back to analog tools when, frankly, we're not communicating in analog ways anymore, right? I mean, the idea of kind of looking through phone records, that's not really it. How many people talk on the phone nonstop every day, as opposed to texting and other sorts of things? So what you would actually want to find anyway, given the way we communicate and how we interact with one another, is different. You live in a digital world, you've got to use digital tools. You should use digital tools more accountably. And that's where we're sort of not there yet.

David:
[40:34] Steven, I'm conscious that we've left the actual toolkit that nation states have a little bit nebulous. And I think that's part of the problem: where does repression technology come from? Well, it can kind of come from all over. It doesn't really come from any one particular spot. There are loose categories that we've defined, but the actual specific strategies

David:
I think we've still kind of left hazy for the listener.

David:
Maybe to make it more specific, we can talk about the arcs of certain technologies going into the future. So, AI: we haven't even talked about AI yet. When AI combines with drones, for example, I think we all know that there's something out there that should be really scary to all of us. It's not quite here yet, but something like that is out there. What are some future technologies that we know are coming that give you the heebie-jeebies?

Steven:
[41:24] Yeah, well, I mean, I think this idea of predictive policing is something that's been in place for a while. In large part, it's been based on past actions: you plot them on a map and you say, well, if there's a bunch of crime that's happened here in this kind of grid, then we can assume that more crime will happen here. And that's actually been abused quite a bit in terms of profiling. But, you know, with algorithms, you can really hypercharge that. You can have lots more data points. You can start to make predictions based on, you know, how one posts online, on language and linguistics, and then start to watch those individuals who are flagged. And so there's this idea of automation, where you take out the human and you rely increasingly on computing power and algorithms, if not agents, to do the heavy lifting. And in the meantime, as you're outsourcing or delegating that type of responsibility, you don't have the right checks in place. And so not only are mistakes made, but civil liberties are infringed upon left and right.

Steven:
[42:28] You have very little recourse to kind of push back as a citizen. That bothers me. That worries me. It worries me that as we continue down this course, where we embrace automation at all costs, take humans out of the decision-making matrix,

Steven:
[42:43] that you end up with lots of situations that lead to abuse. I'll give you a more concrete example of where we're seeing this right now. So, you know, I've written a little bit about AI targeting in war. And that's something that we've seen a lot of in the Israeli context, when it comes to relying on algorithms to generate many thousands of suspects that would then be used for strikes. In the past, you'd have to collect this through human intelligence. You'd have to do it through individuals, and it was laborious. And so you'd have much lower numbers of people who were picked up. So in general, even when you had an error, if you're making mistakes at an error rate of, let's say, 1%, so one out of every 100 strikes, it's still only like one person. The overall number is lower.

Steven:
[43:30] Now, let's say you're striking 10,000 people or 15,000 people. Even if you have roughly the same error rate, you just have many more people who are going to be killed, because you're relying on these systems at such a scale. And that's part of the reason we're seeing such a high civilian casualty rate in Gaza: as you rely on automated systems to recommend who to target, and you raise that number to such a high level, more people die as a result. And so extrapolate that same type of thing out to, you know, police and law enforcement tools, where the number of suspects for different crimes is amped up to a much higher level. That also means the number of false positives, the number of potential problems, the number of procedural violations gets amped up. That, to me, is a scenario that I find very concerning, and it's very much an AI-based type of scenario.
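
Steven's scale point is ultimately arithmetic: hold the error rate fixed, and the absolute number of wrongly targeted people grows linearly with volume. A back-of-the-envelope version, using only the illustrative figures from the conversation:

```python
# Back-of-the-envelope version of the scaling argument: with a fixed error
# rate, wrongful targets grow linearly with target volume. The 1% rate and
# the volumes simply echo the hypothetical numbers in the conversation.

error_rate = 0.01  # hypothetical 1% false-positive rate

for targets in (100, 10_000, 15_000):
    print(f"{targets:>6,} targets -> ~{int(targets * error_rate):,} wrongful")

#    100 targets -> ~1 wrongful
# 10,000 targets -> ~100 wrongful
# 15,000 targets -> ~150 wrongful
```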

David:
[44:21] So maybe to bring this home, let me paint an example that is completely hypothetical. It's not happening. But I'm starting to see a little bit of warning signs, like flashing lights, come up on my social media feeds inside the United States, specifically after Charlie Kirk's murder, where the right is very upset about the left celebrating Charlie Kirk's murder. And The Daily podcast brought up this term, a blue scare, which, like the inverse of the red scare, is the idea that the liberals are advocating for political violence. And you could imagine some combination of AI consuming everyone's social media posts, and if one side is in power, if the conservatives are in power, they say, well, this one Twitter account is saying a little bit too many things about political violence, and there's a 95% chance, based on our models, that they actually do commit some crime. So we're going to arrest them today, because they're 95% guilty, but they actually haven't committed a crime. That's a hypothetical future scenario that I just totally fabricated using my imagination. There's no evidence to say that that's going to happen. But that's part of a potential arc that might worry you: oh yeah, 95% guilty is above our line, and therefore we're going to go arrest that person. That's a model of predictive policing that you could imagine.
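
In code, the hypothetical David sketches is nothing more than a threshold rule, which is part of what makes it unsettling. Everything below, the accounts, the scores, and the 95% cutoff, is invented to mirror the scenario above:

```python
# David's hypothetical as a threshold rule: a model assigns each account a
# "predicted political violence" probability, and anyone above the cutoff
# is flagged before any crime has occurred. All names and scores are made up.

ARREST_THRESHOLD = 0.95  # the hypothetical "95% guilty" line

def flag_for_arrest(risk_scores: dict[str, float]) -> list[str]:
    """Return accounts whose predicted risk crosses the arrest line."""
    return [acct for acct, p in risk_scores.items() if p >= ARREST_THRESHOLD]

print(flag_for_arrest({"@moderate_critic": 0.41, "@loud_account": 0.96}))
# -> ['@loud_account'], flagged for a crime that was never committed
```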

Steven:
[45:46] Yeah. So, yeah, that's a really good example, because ultimately, look, the technology can be scary, it can be frightening, but it comes down to the politics underneath it, I think. And so the question there is:

Steven:
[46:03] Do we have a First Amendment that matters? I mean, we had a red scare when we didn't have very much high technology. Now we potentially have a blue scare. I mean, we'll see where that actually goes or doesn't. Let's hope it doesn't go very far. But either way, that's still kind of based on the fact that you see people who are trying to stretch the bounds of the law as we know it, who, at least in practical terms, if you look at predictive policing, are trying to take people offline because of a potential link to crime. Under any jurisprudence I know of, that doesn't compute, right? The thought that something could happen, based on kind of loose circumstances, is not enough of a reason to say that therefore you can't say that, and therefore you should be arrested. That's not how the Constitution ought to work. And frankly, even though the Constitution is old, it's been updated enough through court cases that we can still apply it that way. And so what scares me more about this type of thing is not the algorithms. It's the misuse of those algorithms for political gain by those in power. So I guess it's powerful tools that can be exploited in nasty ways that run against our democratic principles. And, you know, I think that's pretty alarming.

Ryan:
[47:17] Yeah, the example of predictive policing is alarming for our democracies. But at least the end goal of predictive policing is something that I think we want in healthy democracies, which is just safety, public safety, right? And you can imagine our civil liberties, and the institutions that guard against government encroachment on those civil liberties, they have to hold. They really have to hold, so that this technology doesn't just give us safety, or some perception of safety, while stripping away those civil liberties as well. I guess this same predictive policing concept, though, if you apply it to an authoritarian nation state, right? You could call it predictive political dissent.

Ryan:
[48:01] And you could gather all sorts of information about individual citizens to predict which ones might be anti-government, or which ones might have a trajectory of leadership in anti-government movements. You could predict protests before they happen, and you could dismantle those protests. You could find out who the, you know, the disturbers are in any political movement and take some action to censor or constrain them, or launch a disinformation campaign against them. I mean, with this predictive political dissent capability, when you start to track all information across your society, in the hands of a government authority that has all of this power and AI at its disposal, it almost seems like the authoritarian regime, those that are trying to stay in power, how could they be disrupted at that point?

Ryan:
[48:55] You know what I mean? Like, once you have this tech stack, and if you have marshaled it at scale across your society, aren't you basically a god? Like, aren't you basically immune from any kind of revolution or political dissent? The amount of power that this tech stack gives to existing governments is just astounding, and I don't think the world has ever seen it.

Steven:
[49:18] I mean, look, I agree. I think it's super alarming. And, you know, one of the main things it does is that it breaks the ability to mobilize for collective action, right? I mean, ultimately, that's what it takes. Whether it's Nepal, which we talked about earlier, or other contexts, it's the ability for people to find a way to connect with one another, to reach a common decision about taking an action, and then to do that, even if it means taking that challenge to the streets. And the reason why certain protests work and others don't is the ability for mass numbers of people to get out in the streets. I think social scientists say something like 3.5% of the population is a magic number. So if you get 3.5% of the overall population to protest or out in the streets, there's a good chance that you will then topple the regime. So in China or Russia, which already have these systems of control, as you mentioned, which have a great way of monitoring and regulating the ability of people to communicate things that challenge the authority of the Chinese Communist Party or Putin, can you get 3.5%? Can you get 1%? I mean, look, we've had a million people die in Russia for a pointless war, and yet you can barely summon a thousand people, because immediately they're identified.

Steven:
[50:36] Their families are doorknocked afterwards, they're arrested, and people know that. So the ability to collectively come out and protest is severely limited. So while I personally think that there's very little love for Putin's regime among a lot of people, I think the ability of that government to break the will for collective action in the country keeps it in power for an indeterminately long period of time. And I think the same thing probably holds true in China.

David:
[51:05] My intuition is that if you're an authoritarian state and you know the magic number is 3.5%, so your goal is like, okay, never allow 3.5%. But allowing 0% is also bad because you want to allow some expression. So, like, you don't want to go too far to the other end. You want the people to be like, yeah, you know what? Like, I'm going to go protest about tea.

Ryan:
[51:28] Something innocuous.

David:
[51:29] Tea, as in the Boston Tea Party, you know? And I'm going to dump the tea overboard. And that's going to make me feel good. And I'm going to be like, yes, I expressed my desires as a citizen. And I'm going to go home and I'm going to just be happy. So, like, if I was trying it, I'd be like, you know what? 0.5%. Let's encourage 0.5% of the population to express their needs, but no more. And I would use my AI and my propaganda and my disinformation tools to encourage, and allow, 0.5% of the population, but no more, to go out and have a peaceful protest. You could finely tune those parameters. Yeah, exactly.

Ryan:
[52:05] David, you would make a very evil dictator, my friend.

Steven:
[52:08] I'm already a little chilled by that. Yeah. No, but I think that's right. I think the idea is that you want to have a little bit of a valve, like an outlet, so that people have a way to vent. And look, the more they can channel their outrage, or just channel their emotions, towards something that's benign... I mean, it could be, you know, a shopping day. It could be towards consumerism. It could be towards, you know, sports fandom in a team. But as long as you channel it.

Ryan:
[52:35] Even towards an enemy of the state, right? So some sort of international foe, right? You can channel in those directions.

Steven:
[52:41] Yeah. I mean, that's a time-honored tradition. I mean, look at Iran, right? Going back decades, anytime you want to distract people from their own situation, you know, death to the United States: find a foil outside and use that. Although the thing is, people aren't fooled either. They kind of know. The question is whether you can take, I mean, frankly, even in China, it's not like people are brainwashed. They're very savvy. They're very connected. They understand a lot more than maybe people give the citizens credit for. But the question is, can you tie that to action? Is it worth it for them to individually risk losing everything to challenge the government, when they know all around them there are other

Steven:
[53:20] citizens, no matter how repressive things are, who aren't going to join them. And, you know, frankly, life can be okay for a lot of Chinese. Is it worth it to stick your neck out when you know your fellow citizens aren't going to do that? I mean, that's part of the problem, this sort of collective paralysis. When you have a society that is getting richer, that seems like it's doing well, and a very strong, coercive state in the meantime, why not go through door A, where you get the benefits, live your life, and don't talk about politics, as opposed to door B, where you challenge the state and find yourself in prison for life? I mean, door A sounds a lot better for most Chinese citizens and, frankly, for a lot of citizens in other countries as well. That's part of gaming the incentive structure, I think.

David:
[54:02] Yeah, I suppose this kind of goes back to where Nepal failed and why China is good at what it does. Nepal failed because it used a blunt instrument and took down otherwise law-abiding citizens alongside the protesters, because they just turned off the internet for everyone. Like, all right, none of you get the internet. Which just encourages the moderates to join the protesters: well, you guys just lumped me in with the protesters, so I'm going to go join them. Whereas I think what China probably does pretty effectively is smartly identify the biggest bang for the buck, which is: does this action coerce the greatest number of people in the most invisible way possible? And so law-abiding citizens are otherwise undisturbed, but we can just, like, yoink this one loud dissident and disappear them. And no one's really going to miss them, because it's just one person. That also does the whole chilling-effect thing, but otherwise normal people just don't care, and it's ultimately invisible at the end of the day. So that's the smart decision: do the big bang-for-the-buck thing. And that's probably the difference between an effective authoritarian state like China and an ineffective one like Nepal. Is that the line?

Steven:
[55:19] I think that's fair. I mean, look, Nepal is frankly, I mean, Nepal is sort of a mix anyway. It has a democratically elected president. It's a kleptocratic kind of regime, right? One that's sort of unrepresentative and small, but they have elections that are relatively free and fair. And so in one sense, even the tools that China has weren't necessarily available to Nepal anyway. And so they used a blunt tool at the very end, by which point the die had already been cast. But I think what's also interesting about Nepal, and different from China, is that for a lot of citizens, especially the many who are in poverty, who are dispossessed, who are forced to look for work outside the country, the question becomes: what do I have to lose by going out in the streets?

Steven:
[56:00] What's the cost to me?

Steven:
[56:01] Life is already difficult. Corruption is already endemic. I have very little chance to actually rise up and make a good living for my family, let alone to have a voice in the system. So why not go out in the streets? I mean, you saw the same dynamic play out in the Arab Spring with all the anti-corruption protests, including millions of Egyptians gathering in places like Tahrir Square saying, I don't want this anymore, because the system delivers nothing for me. I don't think the ordinary Chinese citizen would say the same thing in Beijing. And I think one of the things that the Chinese Communist Party has done very well is that it has really put a dent in poverty. It's a flourishing country. There's lots of innovation. There's lots of opportunities and pathways to succeed. And so for someone who does have a lot more to lose, going out in the face of this coercive state is a really hard decision to make. In Nepal, though, given how much more difficult life was for the ordinary person, I think going out in the streets and saying, you know what? Screw this regime. And by the way, they're inept, so they probably aren't that coercive anyway. And now they took away the one thing I did have, my access to communicate with other people. Well, now I'm doubly angry, so I'll definitely go out. And you see that playing out in places like Indonesia. In the Philippines, you're seeing growing protests as well. And so that is common. You know, most countries aren't China. I think that's one thing that's also important, that China is pretty unique in terms of what it's able to do.

David:
[57:19] Yeah, I suppose there's a huge difference between authoritarian repressive regimes that are economically poor versus ones that are economically wealthy. And I guess if you are an authoritarian regime and you control your citizenry, but you also make them wealthy, that starts to raise a different question about what is good or bad here. Steve, what do you think about Curtis Yarvin? Do you know him?

Steven:
[57:46] I do. I don't know his work well enough to sort of comment publicly. So I'm going to...

Steven:
[57:53] We're going to skip that one.

David:
[57:55] For the listener, Curtis Yarvin is this guy who thinks that, basically, a monarchy is the best style of governance. And so the question that I threw to Steven was, well, what do you think about that? But again, we'll move on.

Steven:
[58:08] Well, I can broadly comment on it politically. I believe strongly in the democratic experiment, as flawed as it is. I think giving people a voice, and not vesting yourself in the whims of a particular individual, even one you would hope is benign... if you just look at the aggregate, in terms of which systems deliver better over the long term, it's always democracies, not autocracies. So I'll leave it there without commenting specifically on Curtis Yarvin's opinions, but I'm very much in the democracy camp.

David:
[58:38] Sure. What can we say about the pipeline between China and Iran? My understanding of Iran is that it's kind of a testbed for a lot of Chinese products, because Iran is the more authoritarian of the two. They have a much stronger grip on their populace, and so they are applying repression technology to a larger degree than China is, but Iran gets a lot of its technology from China. Is all of that correct? And what can you say about Iran's relationship with China?

Steven:
[59:07] Yeah, it's interesting. I would actually flip it around the other way. I think Iran is a follower of China. I think they're trying to do what China has done. So one of the things, for example, that they're trying to establish is a national internet that would be closed off from the outside world, and they're still struggling to put that in place. China's done that, right? So, on the technology, Iran is using some Chinese technology, but it's not exclusive. I think there are other types of devices and components that they're also using, from a range of countries. But, yes, sure, they buy from China, and I think Iran is trying to do what China has done. I think in some ways you can make an argument that Iran is more repressive, in the sense that they've locked up more people, that they are willing to kill more people in the streets. And that's partly because they have to resort to that, because they have a much looser grip on power, and the economy is in free fall. And, you know, it's a theocratic state in a country where most people no longer believe in the state religion, right? So they have a much looser link to legitimacy than the Chinese Communist Party does in China. Therefore, they have to resort to much more rudimentary physical tactics to get people to do what they want, because that's the only way out there. And they're also kind of getting attacked left and right, pretty successfully, by their neighbors. So they're losing wars, their economy is suffering, the price of oil is falling, and everything's kind of a mess. But I think they're following what China is doing. I don't think it's the other way around.

David:
[1:00:34] Okay, yeah. Yeah, that checks out to me. The reason why I ask is I kind of want to understand the arc of how repression technology is going to proliferate. If I understand your literature and everything you've written, you think repression technology is going to proliferate further than it has today. Like, in the future, there will be more repression technology than there is today. Do you believe that?

Steven:
[1:00:55] Yeah, well, look, I think in part it'll follow where autocratic patterns are going. Like, are more countries becoming authoritarian? If you think the answer is yes, and so far it looks like it is. If you just look at Freedom House and the number of countries that are democracies versus autocracies, we're in something like 18 or 19 straight years of lower numbers of democracies and higher levels of authoritarianism. So, point one is just that if you have more authoritarian countries, there will be increasing demand to get these systems. And I think that's sort of part of it. Point two is that I found that some people really focus a lot on the export side, the supply side. I tend to think a little more about the mix between supply and demand. So it's as much about China having certain types of repression technologies out for purchase, or, frankly, Western countries like the United States and Israel having certain types of technologies that people can purchase, as it is about who actually wants it. You know, do they have a motivation to use it? Do they have the resources, so that it makes sense for them to institute a system of surveillance that'll actually help them control who says what in the country? And, you know, some places want to do that. Other places, even if they are autocratic, say, well, we have other ways to handle things, so why bother with it? It sounds really expensive. Maybe China will give us a discount, but what's the point?

Steven:
[1:02:21] But imprisoning people works just as well. Or shooting three or four people and making them symbols of what happens when you challenge the state, that works pretty well. So we'll see. I mean, there's lots of different ways. But I think the demand side, and looking at that, is really important.

David:
[1:02:38] Do you have an opinion on the way that this repression technology proliferates around the world? Do you think that it will go bottom up, and it will start from the weaker countries who need to resort to authoritarianism first? So the weaker countries that are starting to lose control lean towards authoritarianism in order to constrain their populace before they lose control entirely. Maybe it starts that way, at the weaker countries, and it moves up and up and up to the bigger and bigger countries. Or does it start with the superpowers? China, Russia, the United States. You know, China has that relationship with Iran that we just talked about. And then it kind of works its way out, it proliferates into these repressive spheres around the superpowers. Is it directional in the way that this gets adopted?

Steven:
[1:03:26] I think it's really context specific. I think it's hard to generalize, but I think we can look at a few patterns that will probably play out. So one pattern we'll see is that rich countries that are highly autocratic, you know, countries like Saudi Arabia, the UAE, maybe Egypt to some extent, that have a lot of resources and also have a really high...

Steven:
[1:03:48] interest in procuring these authoritarian technologies will get them, and will continue to crowd them in as much as possible. You know, it's interesting: when it comes to countries that are weaker, I think you have to recognize your weakness pretty far in advance in order to do something significant with this, because it's not like you can put a surveillance system in place overnight, know how to use it, and have your police force able to deploy it. One of the things that I looked at was the ability of a state's security forces to be highly coercive and organized, or not. You already need to have a system in place that can leverage these technologies in order to do something with them. So if you're a country that's already, like Nepal, kind of in free fall, quasi-democratic, or Bangladesh, another place that had a change in government last year because of protests, it's kind of too late at that point. I mean, A, you don't have the resources to do that much with it. B, your forces haven't been trained in how to use it. And C, you have immediate problems.

Steven:
[1:04:42] These technologies are better for medium- to long-term problems. I mean, again, the classic example is a place like Russia, which over the long term is just slowly entrenching greater levels of control over society. Slowly, it's booting out different Western platforms and trying to control what is said on the ones that remain, whether it's YouTube or Telegram. Slowly, it's instituting facial recognition systems in all its major cities and even periphery cities. And then it's supplementing that with high-profile killings of dissidents like Navalny. You know, that's how you do it. You have a slightly longer time horizon, you have the resource base necessary, and you have the motivation to get these tools. And that's your kind of perfect storm. So I would say, think about which countries hit that composite, which fall in that frame, and those are the places where you'll see more and more purchasing of it.

Ryan:
[1:05:36] I guess the best deployment, you know, the most effective deployment of repression technology is when the population doesn't notice that it's happening.

Steven:
[1:05:44] Sure. Or they choose to look elsewhere.

David:
[1:05:48] Right. The best time to plant a tree was yesterday.

Ryan:
[1:05:50] Yeah. It's definitely a boiling-the-frog type of effect. One other piece here, Steve, I want to talk about, because, of course, this is a cryptocurrency podcast as well, is the element of financial repression that also happens. I think a lot of your work on the digital repression side of things focuses on the communication layer. So it's First Amendment stuff, you know, repression technology that restricts the right to assemble or freedom of speech, that kind of thing. But there's also debanking that can happen once you fully deploy digital repression technology across your society. This hit the headlines this week with Vietnam. Vietnam instituted a government ID, a national ID, with a biometric app on all smartphones, right? So they've got that piece of it. And now they have rolled out a requirement that all bank accounts use that biometric data before they're opened, or before you can send a payment or a transfer of funds above $750, for instance.

Ryan:
[1:06:56] And that way they can control and digitally repress the entire financial system. And of course, whether we're talking about communication or about moving money around, it's all coordination technology. And so if you're an authoritarian regime, you want to prevent your citizens from coordinating. You can do that by censoring their communication. You can also prevent it on the economic side of things, right? If they can't pay for different literature, or they can't transact in the economy, you have complete control.
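[As a sketch of how a rule like the one Ryan describes could work mechanically: the $750 threshold comes from the conversation, while the data model, names, and verification flag below are hypothetical, purely for illustration.]

```python
# Illustrative sketch of a biometric gate on transfers, modeled loosely on
# the Vietnamese rule described above. The $750 threshold is from the
# conversation; everything else here is a hypothetical simplification.

from dataclasses import dataclass

BIOMETRIC_THRESHOLD_USD = 750  # transfers above this require a biometric match

@dataclass
class Transfer:
    sender_id: str
    amount_usd: float
    biometric_verified: bool  # did the sender pass the national-ID app check?

def authorize(transfer: Transfer) -> bool:
    """Approve small transfers outright; larger ones require biometrics."""
    if transfer.amount_usd <= BIOMETRIC_THRESHOLD_USD:
        return True
    return transfer.biometric_verified

print(authorize(Transfer("citizen-123", 1_000.0, biometric_verified=False)))  # False
print(authorize(Transfer("citizen-123", 500.0, biometric_verified=False)))    # True
```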

David:
[1:07:27] They can't donate to opposing political parties.

Ryan:
[1:07:30] Exactly. So to what extent are you seeing a crackdown on the financial side of repression as well?

Steven:
[1:07:35] Yeah, no, I think it's a great point. And we've already seen a large number of instances of that. So one of the examples people raise quite a bit is the social credit system in China, where essentially you're trying to tie different types of records and databases all into one, to get a composite picture of an individual, and then to assign them a score based on the things that they do. And if that person, you know, flunks a few different categories, then that can mean potentially not allowing them to open bank accounts in certain areas, or to do other sorts of things. And in fact, in China, you've even seen...

David:
[1:08:08] But isn't that also a way to restrict movement too? Like preventing you from getting on the subway?

Steven:
[1:08:13] You can do any of those, right? I don't know to what extent that's been enforced. I mean, there's a little bit of a mix of hyperbole, or prospective this-could-happen, versus actual implementation. So that's one of those things that I think is a little bit fluid. But certainly you could. I mean, the state can apply any kind of financial leverage it likes against an individual. And so it doesn't have to be just locking them up. It can be denying them access to benefits, or denying them access to their money. Economic coercion is hugely important in the arsenal of what repressive governments want to do.
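[To make the mechanism concrete, here's a hedged sketch of the composite-scoring idea Steven describes: several record types merged into one profile and reduced to a score that gates access to services. The categories, weights, and cutoff are invented for illustration and are not a description of the actual Chinese system.]

```python
# Hypothetical sketch of composite scoring: merge several record types into
# one profile, reduce them to a single score, and gate services on it.
# Categories, weights, and the cutoff are invented for illustration only.

RECORDS = {
    "citizen-123": {
        "financial": 0.9,      # e.g., loan repayment history
        "legal": 1.0,          # e.g., no court judgments
        "online_speech": 0.1,  # e.g., posts flagged by censors
    },
}

WEIGHTS = {"financial": 0.4, "legal": 0.3, "online_speech": 0.3}
CUTOFF = 0.7  # below this, a service like opening a bank account is denied

def composite_score(citizen_id: str) -> float:
    record = RECORDS[citizen_id]
    return sum(record[category] * weight for category, weight in WEIGHTS.items())

def may_open_bank_account(citizen_id: str) -> bool:
    return composite_score(citizen_id) >= CUTOFF

# Flagged speech alone drags an otherwise clean profile below the cutoff.
print(round(composite_score("citizen-123"), 2))  # 0.69
print(may_open_bank_account("citizen-123"))      # False
```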

Ryan:
[1:08:47] Can you imagine something like this, Steve? So somebody says something on social media that the government doesn't like. So you just teach them a lesson and you freeze their bank account for 30 days.

Steven:
[1:08:58] I mean, in a country without the rule of law, sure. In the United States, I think that would be hard, right? But in other countries, certainly, where you either don't have regulatory protections in place, or you have rules with such large loopholes, national security loopholes, which every country seems to have? Yes. I mean, why not? And in fact, there is a pretty burgeoning human rights community that has looked at cryptocurrencies as a way around that. One of the examples that's been cited quite a bit is Myanmar, where you had an illegal military junta that came in, deposed the democratically elected government, threw them all into exile or into an insurgency, and now you have a civil war. That's an example where you're just trying to get people money across the border, and you can't send normal transactions, because the government can stop that. Using crypto is a really great way to get around those controls. And so there will probably be more and more instances of that, where governments try to leverage their ability to control regular financial networks, and crypto can be a way around that, like a liberation technology, frankly, for dissidents.

Ryan:
[1:10:08] That was very much the story in Nepal, right? There were protesters' bank accounts getting frozen, and they were actually using stablecoins like USDT to get around this, other cryptocurrencies as well.

David:
[1:10:18] I will say, Ryan, that the issuers of every centralized stablecoin, USDT, USDC, can freeze your funds if they are ordered to by the legal due process of the United States. So they could.

Ryan:
[1:10:31] Yeah, they could. And this is not true of some of our crypto-native assets, which we're definitely advocates of, so the Bitcoins and the Ethers of the world. So Steve, as we get to the end of this episode, this has been a fantastic discussion, I think. Yeah.

Steven:
[1:10:45] I don't know.

Ryan:
[1:10:46] I'm feeling kind of down, honestly. It's kind of, I mean, how do you look at

Ryan:
[1:10:51] all of this and not get pessimistic, I suppose? That would be one question. Or maybe a more optimistic way of asking the same question is: how do we avoid all of this?

David:
[1:11:02] What about our kids? Yeah.

Ryan:
[1:11:03] What are we going to do here?

Steven:
[1:11:05] Yeah. I have a couple of kids who are young, so I do think about that quite a bit. Look, I think there's a few things. One of which is that I don't believe in a technologically deterministic future. I don't think that just because we have powerful technologies, they're going to be misused and exploited by governments. I think ultimately it does come down to politics, and how people choose to vote, and how they choose to organize, and what decisions they make. And so for every situation like China, which is pretty grim and pessimistic when you look upon it, there are other situations where you do see the possibility for change, like Nepal, like Indonesia, like Bangladesh.

Steven:
[1:11:40] Like even in our own country. And so I think that, at least in democracies and places that lean democratically, there's a lot of openness. I don't think the future is written. And so in some ways, looking at authoritarian countries can be a bit of a warning. It can show us what's coming down the pike. It can tell us these are the possible issues that can arise if we don't guard our liberties carefully, if we don't think about ways to push back, if we don't think about adding friction into the use of these technologies. In other words, I don't think it's foreordained at all that powerful algorithms will take over the world and entrench governments. At the same time, I think we have to be careful, because if we kind of sleep on it and look the other way, we could wake up one day and all of a sudden say, wait a second, we live in a surveillance state. We supposedly have a constitution. How did I end up in a situation where I don't know what my rights are? It seems like the government can do a lot of things. So we've got to push back constantly. And I think, you know, using law, using collective action, and using models and examples from other places around the world, if nothing else to learn what not to do, can be really instructive and important for future generations.

David:
[1:12:43] Steve, is there anything that you do or don't do on a daily basis that relates to, like, I don't know, I guess we'll call it repression technology hygiene? Like, you know, I can generally tweet out all of my opinions on Twitter. And I'm also aware that in the future, people will read those opinions. And so sometimes I think about that when I tweet. Is there anything you do on a daily basis where you're like, oh, better not do that? Or, maybe I should do this just in case, you know, just in case?

Steven:
[1:13:09] Quite a few things. So I communicate more and more with friends of mine, especially groups, over Signal and not text messaging. And in fact, you know, as things have gotten a little bit heated politically, I've noticed in several of the groups that the moderators have changed the disappearing messages to a day, as opposed to longer. It's not that people aren't willing to say anything, but there's just a little bit of apprehension. I started self-censoring myself a while ago in terms of what I say publicly on social media, so it's pretty anodyne, it's pretty boring. It's not necessarily how I feel or think in a given situation, but I've also learned that while I might have a very specific emotional reaction to something, maybe it's better just to take a beat, and certainly not to put it out publicly. So I just say things to people, like my friends, but I don't put it out there. I mean, in some ways, it's a shame, right? Because I know very well the idea of the chilling effect with self-censorship and why that's a bad thing for a healthy society. On the other hand, you've got to be careful.

David:
[1:14:11] But you doing this actually makes the authoritarian's job easier.

Steven:
[1:14:14] Right.

David:
[1:14:16] If everyone decides to be a little bit more quiet.

Steven:
[1:14:18] Yeah. So, you know, I think about the social media side a lot. Those are some of the major things. But the move to encrypted apps, I think that's a really interesting shift. And that's happened across many of my different friend groups, you know, people have sort of moved to that more. Yeah.

David:
[1:14:36] So you're saying that if we were to get a beer together off the air, I might hear some different takes.

Steven:
[1:14:42] Perhaps, if there are no hot mics nearby, or someone discreetly using their iPhone to videotape you. Yeah.

David:
[1:14:50] I think if Ryan and I were in your chair, our answer would be, you know, you don't need to put your savings into crypto, but I would establish a pipeline from your bank account to the crypto world, so that if you ever need it, it's there for you. Figure that out before it's too late. I think that's what we would say to add to the arsenal of individuals.

Steven:
Yeah, it's not a bad idea.

Ryan:
[1:15:14] Yeah, I think we would, right? Encryption in all areas: encryption for

Ryan:
[1:15:18] your communications, and, I guess, cryptocurrency, encryption for your money, too. Steve, this has been a pleasure. You've, I think, elucidated a lot of things today, and it's just really been a fascinating conversation. So thank you so much for joining us.

Steven:
[1:15:30] Yeah, thanks for having me on. Your questions were great, and I learned a lot just talking and hearing your perspective. It kind of forces me to think things through.

Ryan:
[1:15:38] Oh, last thing, Steve. The new book. So that is coming out in 2026, right? Called Bites and Bullets: Global Rivalry, Private Tech, and the New Shape of Modern Warfare. So give us a tease. What's that book going to be about?

Steven:
[1:15:49] Yeah, it's about how countries and different militaries are using technology for geopolitical competition. So it looks at everything from the tech war between the U.S. and China, and how that'll play out and determine power, to the use of drones on the battlefields in Ukraine and elsewhere, in terms of how that's rebalancing, you know, how militaries win or lose when it comes to warfare. So it's taking a little bit more of an external view, in terms of thinking about the balance between technology, power, and military might, and what that might mean for the future.

Steven:
[1:16:23] Yeah, it's definitely a driving force too.

Ryan:
[1:16:25] So when that book comes out, we'll have to read it, and maybe get you back on. But thanks for joining us today.

Steven:
[1:16:28] That'd be great.

Ryan:
[1:16:30] Bankless Nation, got to let you know: crypto is risky. Of course, you could lose what you put in, but we're headed west. And encryption is the defense against losing what you put in. This is the frontier. It's not for everyone, but we're glad you're with us on the Bankless journey. Thanks a lot.

Not financial or tax advice. This newsletter is strictly educational and is not investment advice or a solicitation to buy or sell any assets or to make any financial decisions. This newsletter is not tax advice. Talk to your accountant. Do your own research.

Disclosure. From time to time I may add links in this newsletter to products I use. I may receive commission if you make a purchase through one of these links. Additionally, the Bankless writers hold crypto assets. See our investment disclosures here.