A Pastor and a Philosopher Walk into a Bar

Re-Release - Outrage Porn, Echo Chambers, and the Seduction of Clarity: Interview with C. Thi Nguyen

Randy Knie & Kyle Whitaker Season 5 Episode 14

Text us your questions!

This is a re-release of an episode from our second season, when we spoke with philosopher C. Thi Nguyen. We think it bears re-listening in our current moment.

=====

What happens when we seek simple answers in a complex world? Philosopher C. Thi Nguyen takes us into the machinery of belief, understanding, and value formation, exploring how we navigate information landscapes designed to manipulate us.

Thi introduces the concept of "moral outrage porn"—representations that give us the satisfaction of moral righteousness without requiring meaningful action. We discuss conspiracy theories and his notion of "the seduction of clarity"—the powerful feeling we get from explanations that seem to make everything simple. This feeling is particularly dangerous because we're limited beings who need mental shortcuts to navigate the world.

We also tackle echo chambers and why perfectly rational people can end up in them. Thi distinguishes echo chambers (where we systematically distrust outside sources) from filter bubbles (where we simply aren't exposed to contrary views), explaining that people inside echo chambers often follow logical procedures based on who they've decided to trust. This challenges the dismissive assumption that those with radically different beliefs are simply stupid or lazy.

Weaving through discussions of game design, social media metrics, and institutional incentives, Thi reveals how our values are increasingly captured by simplified scoring systems that reshape our priorities according to what can be easily measured. The result? We outsource our complex human values to technologies and institutions that weren't designed to handle them.

Uncomfortable yet?


Content note: this episode contains profanity.

=====

Want to support us?

The best way is to subscribe to our Patreon. Annual memberships are available for a 10% discount.

If you'd rather make a one-time donation, you can contribute through our PayPal.


Other important info:

  • Rate & review us on Apple & Spotify
  • Follow us on social media at @PPWBPodcast
  • Watch & comment on YouTube
  • Email us at pastorandphilosopher@gmail.com

Cheers!

Kyle:

Well, welcome to A Pastor and a Philosopher Walk into a Bar. Today we have a guest that I may be more excited about than any guest we've had.

Kyle:

I don't want to oversell it, but I've been following this guy's work for a long time. He writes basically all the things I wish I had written. Yeah, you said that he publishes in all the coolest philosophy journals. All of his arguments are novel and interesting and get a lot of traction, and he's also a pretty up-and-coming public intellectual, so he's able to somehow get op-eds in all the coolest places as well. He's just so interesting, I don't know. I've heard him on a couple podcasts, and he just really seems like a fun guy to talk to. So I'm really pumped about this.

Randy:

Well, now you just set us all up for disappointment, no matter what he says. No, I've listened to a couple podcasts as well, a couple of him speaking, and holy cow, he grabbed me within three minutes. I mean, I was in. Super, super fascinating, interesting, philosophical but really relevant stuff for our culture.

Kyle:

Yeah, so we're talking to a guy named Thi Nguyen, and he's an epistemologist like myself except a real one, I guess.

Randy:

He works at the University of Utah and has graciously agreed to appear on our podcast, even though he doesn't really work on anything religious. Yeah, go figure. I mean, maybe he's Mormon, he's in Utah.

Kyle:

Maybe we're going to find out. That'll be the first thing I ask. Well, Thi Nguyen, thanks so much for being on A Pastor and a Philosopher Walk into a Bar.

Thi:

Thank you. It's great to be here.

Randy:

So Thi, can you tell us, just tell our listeners, about who you are, what area of philosophy you specialize in, and just your whole world.

Thi:

I'm Thi Nguyen. I'm an Associate Professor of Philosophy at the University of Utah. Apparently my specialization in philosophy is so weird that when some of my colleagues were trying to describe it to someone else, they just gave up in convulsions of laughter. I work in philosophy of art, so, okay, stuff I've written about recently: games as an art form, gamification, echo chambers, trust, porn. Yeah, some of our listeners just did a double take.

Thi:

I mean, I think it's all related. One way to put it is that a lot of philosophy has historically been interested in the individual out of context, and I get super interested in the individual in context. A lot of the ways that people work on the individual in context just involve talking about, say, communities or governments, and I think that's right, but technology is part of it too.

Thi:

Technology is part of the way we communicate, and I ended up working across two fields that I think not many people work in. One is social epistemology, which is the study of how people know things in communities, as groups, and the other is the philosophy of art, which to most people is a weird connection. But to me, the philosophy of art has always been the study of the relationship of communication and technology, because the philosophy of art studies things like: oh, what happens when we get photography?

Thi:

What happens when we get film? How does each of these change the ways that we connect to each other and express things, express subtle things? So the last seven years of my life were obsessed with trying to get a better theory of what games could communicate that was unique. That's what I work on.

Kyle:

Awesome. Were you a gamer prior to that?

Thi:

Oh my god, was I a gamer? Yes. Um, though I barely play computer games these days, partially because I'm married with kids and I don't have time, and partially because I feel like mainstream computer games are getting better and better at the technology of addiction, which I'm sure we'll talk about.

Kyle:

Yeah. So, just for the gamers who are listening, what are like two, three games that you lost years of your life to?

Thi:

Oh my god, the Civilization games. I'm never allowed to touch any Civilization game again; I've never played one specifically for that reason. I know myself too well. And then, I mean, I was really into the old-school D&D computer world. Before Skyrim there was, you know, Ultima and Baldur's Gate and that era. Or anything very tactical, like XCOM, from the original. I mean, I had an original Atari 2600. Anything that involves leveling and grinding is really dangerous.

Randy:

This sounds like a dance club to me. I have no idea what you're talking about.

Thi:

You have no idea what I'm talking about? Leveling and grinding? You lucky dog. You lucky...

Randy:

So Thi, you've talked about what you've written about recently, and I've listened to a few things that you've talked about, the things that you've written, and was fascinated. So let's just dive right in. You talk about porn, and specifically what I loved was you talking about moral outrage porn. So can you first give us your definition of porn, whether that's sexual or whether that's food or whether that's real estate porn, all that stuff? Fit our listeners in there, and then tell us about moral outrage porn and what you see, what you think.

Thi:

So, this paper I wrote with Becca Williams. This paper is a product of our thinking together, and the conversation actually started as a drunk late-night conversation between the two of us on somebody else's Facebook post. It started as a joke, like, you know what? We don't have a good philosophical account of porn in the general sense, like food porn or real estate porn. So my wife loves this Tumblr called Things Neatly Organized. It's just close-up, loving pictures of, like, pencils arranged by color, or corks neatly stacked.

Thi:

If I told people, oh, this is organization porn, everyone knows what that means, right? So it seems like there's a general notion of porn that we get. Actually, one of my favorite examples: we were writing this paper, and Becca called me, and she was like, oh my god, I was watching Saturday Night Live. It was, I don't know, like five months after Trump had gotten elected, and one comedian was like, just stop it with all this impeachment porn. And everyone laughs, because they know exactly what this means.

Thi:

And, funnily enough, if you look at all the discussion of porn in philosophy, the definitions of what porn is are all inherently sexual. So we were trying to figure out: no, we know what this means; what could it be? And Becca actually found this paper by Michael Rea, who's a really good philosopher, about sexual pornography.

Kyle:

We're hoping to get him on the podcast. Oh yeah, he's awesome.

Thi:

But his definition of sexual pornography is basically that sexual pornography is images of sexual content used for immediate gratification while avoiding sexual intimacy or the development of a relationship. Because what he was really interested in, and I think this is deeply right, is that two people in a loving relationship can exchange erotic nude photos with each other, and that's not porn, right? And I think this is really deeply interesting: that mere nude, sexual, erotic content isn't porn if it's part of a relationship, if it goes somewhere. And so we were like, oh my god, this seems right.

Thi:

We can generalize this. And so our definition of porn, of any kind of porn, philosophers call it X porn, where X is like an algebraic signifier, right? So X porn is any case where you have representations of X which are used for immediate gratification while avoiding the costs and consequences of genuine entanglement with X itself. That's so good. So like food porn, right? Food porn is pictures of food you use for immediate gratification, but you don't have to cook it or deal with the calories or deal with nutrition or try to go to a restaurant. Or real estate porn: you just see images and it makes you feel good. You don't have to buy it, you don't have to maintain it, you don't have to keep it clean.

Randy:

Right. SNL had this amazing skit this last year about how Zillow has turned into, like, the 30- and 40-year-olds' 900 call-in line, and it's exactly what you're talking about: all the gratification without any of the property taxes, without any of the upkeep and maintenance.

Kyle:

It's porn. This is HGTV, yeah, yeah.

Randy:

Continue, Thi.

Thi:

Oh yeah. So we were like, okay, if you have this definition, and it's a good definition, it should help you. It should be useful and help you identify new kinds of porn. And so here's one kind of porn: moral outrage porn. Moral outrage porn is representations of moral outrage used for immediate gratification while avoiding the costs and consequences of actual, genuine moral outrage.

Thi:

Oh, so many listeners are just slayed right now. I mean, I think people immediately know what this is. I think in the paper we were like, maybe 50% of Twitter is moral outrage porn. But I just want to say something. This paper of ours, we should have seen this coming, has already been abused, and I want to tell you about the abuse that annoys me the most, because it's the opposite of what we want to say.

Thi:

What annoys me the most is people who want to be like, oh, this means that any moral outrage is bad; don't express moral outrage. No. Genuine moral outrage is oriented at the morally terrible and the unjust; that is what gets you to act. So I believe philosophers like Martha Nussbaum, who say things like: well-tuned emotions are perceptions of moral states of the world. Well-tuned anger and outrage is a rendering of how unjust the world is. That's the important stuff.

Thi:

The problem with moral outrage porn is that it undermines the importance of genuine moral outrage. Either moral outrage porn is something you enjoy without taking action, or it's something you enjoy because you want the pleasures of it: you simplify your morality, you get attracted to simple expressions of outrage, instead of trying to figure out the nuanced, complex, nauseating thing which genuine moral life actually is. So this was never intended to be an attack on moral outrage. It's intended to say something like: moral outrage is so important that the pornification of it threatens the ability to make genuine moral progress against injustice.

Randy:

Yep. And would you say a facet of moral outrage porn is the act of feeling righteous, like my voice will be heard, without actually doing anything about it?

Thi:

Well, yeah. I mean, okay, this is again where I just want to be so cautious, because it's not that every feeling of righteousness is a case of moral outrage porn. You can be righteous because you're on the side of right. Abolitionists in the antebellum South were righteous; they should have been, and feeling good about being righteous in that case is really valuable, right? It keeps you going. The worry is that if you just want the feeling of righteousness, devoid of actual moral truth, then what you're motivated to do is find the simplest, easiest position you can that just lets you get moral outrage all the time. And it won't lead to action, because action is hard, right? Just the pleasures of simplified moral outrage are easy.

Randy:

Yeah. There's this little nuance between this idea of moral grandstanding and moral outrage porn, and you'd probably say there's a difference between the two, right?

Thi:

I mean, they're really similar in a certain way. The notion of moral grandstanding is basically using expressions of morality for social status, and the idea of moral outrage porn is using judgments of morality for pleasure. I think both of them have the same shape: what moral expression is supposed to do is track genuine morality and get you to take genuinely moral action, and both of these are, you might say, perversions of morality, either aimed outwardly or aimed inwardly. But you also see the parallel problem. One of the things that really irritates me about the current social uptake of moral grandstanding is that people have used it to make accusations against anyone that makes any moral claim, and they're like, oh, you're just grandstanding.

Elliot:

And it's like no, no, no, no, no, no, no no.

Thi:

It's not grandstanding if it comes from a genuine moral belief. It's grandstanding if it's been re-aimed at status, and it's often really hard to tell.

Thi:

One of the things I notice is that a lot of people make these accusations. I've seen people using this stuff about moral grandstanding and moral outrage porn to attack anyone that expresses a moral stance. And I keep thinking, you know what? It's really funny that people keep accusing their political opponents of grandstanding and moral outrage porn, but never their own side, and that itself is a sign that something has gone funky and wrong.

Kyle:

Yeah. So if this is too nerdy, we can take it out, but in your conception of moral outrage porn, is it deontic, consequentialist, virtue-theoretic, or none of the above? Here's a case. Let's imagine that Kim Kardashian gets really pissed off about something, genuinely morally outraged, posts about it to her however-many million Twitter followers, and because she has that many Twitter followers, actually changes the situation for the better. Is that an instance of moral outrage porn? Why or why not?

Thi:

So in this account of pornography, the essential part is a particular user's intention in taking it up.

Thi:

So the same thing might be porn for one person and not for another, and this is actually built into the original Michael Rea account. This is a lovely point of his: a couple in a relationship can exchange nude photos and that's not porn, and then someone can take that out of context and use it as porn. And so the fact that someone's expression of moral outrage was genuine doesn't mean that somebody else couldn't also use it as moral outrage porn. That's the first question. We tried to write this thing so it was compatible with almost any moral theory; it didn't tie you to anything. We were trying to write it so that, basically, as long as you think that moral belief should track actual states of affairs, you should get on board.

Randy:

Okay, yeah, these sound similar to me, or seem similar to me. You've also written and talked about the seduction of clarity, and what comes from the seduction of clarity: understanding as orgasm. Fascinating stuff. Can you bring our listeners and us into these ideas and concepts?

Thi:

This is another case where there's this thing that's really good, clarity, and then people give you, I don't know, a fake version of it, and that's the thing that...

Elliot:

I'm reading.

Thi:

I got really interested.

Thi:

So I mean, I started thinking about this because, when I was working on some other stuff, I ended up spending a lot of time on conspiracy theory websites, just hanging out, reading forums, trying to understand how things were working. And one of the incredibly interesting things about conspiracy theories is that they offer you a single potent explanation for everything.

Thi:

And people say things like: it's very powerful, and it's very clarifying. And actually, I don't think the world is that simple, right? But it feels so good to have a single explanation that just takes everything into account. So I was getting really interested in how this might work, and basically my thought is something like this. We're limited cognitive beings, and we can't investigate everything forever, for all time, so we need to basically guesstimate what's worth investigating and what's not worth investigating, because you can't think about everything; you could go down the rabbit hole about any question forever, right? So you need to make this estimate. And it seems like there's a lot of empirical evidence in various forms of sociology that a lot of us are using, as our heuristic for when we should stop investigating, a feeling of clarity. Right?

Randy:

Can you explain heuristic for us non-philosophers?

Thi:

So in many cases, getting a calculation exactly right is really, really difficult, and a heuristic is a really simple rule of thumb that gets you through things. Let me give you an example of a heuristic. At one point I used to eat like crap, and I tried to eat better, and the first thing I did, which is what a lot of people do, is start trying to track the calories and nutritional content of every single thing you eat. And this is a dead end. No one can do this; people burn out.

Thi:

And then you start looking for really simple rules of thumb. One I ended up using was: don't eat processed carbs. Don't eat stuff that has flour in it. This is not a perfect rule of thumb, and you can move beyond it, but as a beginning maneuver it's a really useful first step. So, because we're limited beings, we need a way to quickly estimate what's worth investigating or not, because we can't investigate everything to figure out if it's worth investigating. And so one thought is, we use the sense that something is clear, right?

Elliot:

So what's that?

Thi:

So at this point, there's actually a really interesting amount of philosophy about this, about what actual clarity is, what actual understanding is, and one of... thank you, I've just been brought a cocktail.

Randy:

Good interruptions.

Thi:

And a bottle of booze. Thank you; my spouse understands me and what I need for a podcast. So in this literature, in the philosophy of science and the philosophy of education, there's this idea that what it is to really understand is to get a coherent model that can explain as much as it can, that can explain as many things and find as many connections between things as possible, and that can be communicated easily. For those of you who have some philosophy background: old-school philosophy and epistemology used to think the goal is knowledge, having true beliefs, and what a lot of people ended up saying was, that's not enough, because under that model you can have a ton of individual knowledge but no coherency or overall picture. What do you want in science? You don't just want a bunch of true facts; you want a model that can unite all the true facts, explain them, make predictions, and connect new phenomena. So my thought was: okay, if that's what it is to actually understand, what would it be like to fake that feeling? How do you give people that feeling without actual understanding? What you want is to give them a really powerful model that seems to fit and explain all kinds of things, and that can just handle any new phenomenon.

Thi:

And one suggestion is that's why conspiracy theories are so compelling, right? Because they're like a cartoon, simplified, powerful version of understanding. It just explodes out and can give you everything. And I think it's interesting in particular because, if you believe in science and the specialization involved in science, no actual human being can explain everything, right? And that's kind of a sad place to be in. But a lot of these systems, the conspiracy-theoretic systems, make it feel like everything is suddenly in your grasp and you have a model that can explain everything. So it feels more like understanding than the actual world, because if you believe in science, you can never actually get that feeling.

Randy:

You can be totally honest, because I've got a really thick spiritual or religious skin. Would you put religions in that category as well?

Thi:

Yes. But let me say more; it's interesting. One of my favorite undergraduate teachers was this English professor named Richard Marius, who was just a lovely guy. I remember him.

Thi:

We were sitting outside one day during office hours, talking about Thomas Pynchon, and he said: what Thomas Pynchon makes me think about is that there's a deep similarity between the aesthetic of mystery novels and the aesthetic of religion, because in both you have all these seemingly random events, and then you find out this thing that provides a unifying explanation where everything makes sense. One caveat: it's not like there aren't true theories that make sense of a lot of things. That's what science is trying to get us, and in many cases the whole point is that you want a theory that can explain everything. So the fact that a theory has explanatory power and great unificatory power doesn't necessarily mean it's bad. My worry is that there are certain systems that have been optimized for the feeling of understanding and not actual understanding. One thing I should say, though, that might soften the bite here: I think another place where you find exactly this effect is in a lot of bureaucratic systems of justification.

Elliot:

That has nothing to do with religion.

Thi:

I see this constantly in administrative life in universities. We have simple metrics, and we're trying to create justificatory systems in which you can explain any action in terms of a couple of simple metrics. And sometimes I think there are a lot of intellectual systems whose appeal is that you can get everything from a single principle, utilitarianism, libertarianism, right? You might also make such an accusation against some versions of those systems.

Randy:

Yeah, sure. I mean, as a pastor, I will say: if you're a Christian or a spiritual person who enjoys the quick and easy answer, or your church leader has a habit of giving you a pat answer for every profound question about the universe, question that. Question the motivations behind it, and question your own motivation for feeling good about simple answers to complex questions, because it's usually not the truth, and there's a whole lot of mystery within, or there should be a whole lot of mystery within, our spirituality. So I'm fully endorsing what you're saying: we should be a little bit suspicious when we hear easy answers to really heavy questions.

Kyle:

Yeah, like where did the world come from? It's like that old joke, right? The right answer to every Sunday school question is Jesus. There's a reason that's funny.

Thi:

One thing I say at the end of the seduction of clarity paper is something like: what's the response? And it's something like: heuristics are good until people know what they are and start gaming them. For example, in our evolution we probably evolved to have an instinctual heuristic, and that heuristic is: consume as much sugar and fat as you can. And that heuristic works in what the evolutionists call the environment of evolutionary adaptedness, because there's not that much sugar and fat around; if you just cram into your mouth as much fruit and animal as you can, you'll be fine. But then that heuristic gets gamed, and what that gaming looks like is Cheetos and Nilla Wafers. And so I think what you have to develop, what I have to develop, because I speak as someone that's capable of taking down a Costco-sized bag of kettle chips in a single go... yeah, baby.

Thi:

I think, now that we know that there are people out there trying to game our sense of deliciousness, we have to develop something. It's not that deliciousness is bad, but when you eat something and it's just so addictively tasty, you have to maybe be like: wait, wait, wait, wait, wait.

Thi:

Hold on a second. Let me look at this bag; what's going on? I think there's something similar here. It's not, again, that things that make sense and are easy are necessarily wrong. It's that we are in a world in which people are trying to game our heuristic of easiness of understanding. If something just feels good, you should immediately be suspicious. You shouldn't just accept it; the response to ease of understanding should be suspicion, in an environment where people are trying to game you.

Randy:

Yeah, yeah. Oh man, politically, can you imagine how much that would change things? Our QAnon... whatever, I'm not going to go there. So this is kind of speaking to that: can you put on your philosophical life-coaching hat? I don't know if you've ever put on a hat like that before, but yeah. I've heard you speak on whether or not we actually love the truth, whether or not we're dedicated to the truth, or if we just want to have everything that we already think and believe affirmed, you know, confirmation bias, whether or not it's true. Philosophy is all about loving the truth and following where the argument leads and trying to get to the bottom of it. So, putting on that coaching hat, can you tell us how to actually seek and love the truth, rather than living in constant confirmation bias? How do we examine ourselves and our own motivations when we're in an environment where people are trying to give us cooked versions of the truth that go down easy?

Thi:

Okay, so let me try it this way. If you were trying to manipulate people and get them to accept what you wanted them to believe, a good strategy would be to make believing your chosen belief more pleasurable and easy, right? So, given that, I think we need to be suspicious of belief systems that are pleasurable and easy, which is not to say that pleasure and ease are always false. There are plenty of good things to eat that are delicious, and there are plenty of truths that are incredibly pleasurable to grasp. But given that we know there are manipulators who have a lot of motivation to get a lot of power by using the manipulation of pleasure and belief to get us on their side, we should immediately be suspicious and ask what's going on.

Randy:

That's good. Yep.

Kyle:

Yeah. So I heard you say something similar on a different podcast interview, and I had this thought, and I'm curious what you think about it, because it kind of sums up my psychology in some ways. It seems like, on reflection, that maybe the best answer, the most convincing answer, the one that I feel in my gut is convincing, that feels clear to me, to any big question, like why should I care about truth, or why should I care about a good method of gathering information, or why should I care about morality, or what's right or wrong, or what's healthy, or whatever, why should I do any of that stuff, maybe the most convincing answer to me is: because it's hard, and the other thing is easy.

Kyle:

Now, maybe that's something peculiar to my psychology, that that's what hits home for me the most. But I've heard religion summed up in that way. Kierkegaard kind of sums up religion in that way. Christianity, right, his version of it, is very hard, maybe even impossible, and that's why it appeals to me. So does that say something deep about human nature, and, if so, can we weaponize it to actually combat conspiracy theories or misinformation?

Thi:

I think that theory is too easy. Why do the hard thing? That's... I mean, I've actually said that to people before, but I don't... I mean...

Kyle:

Let me complexify it a little. So I don't think that it's actually effective on people who just want the easy thing, but I think it's a very good way of weeding out who wants the easy thing and who actually wants the truth. Right? If you're someone who actually is drawn to the truth, or maybe has the capacity to desire it or something, coming to the recognition that this is going to be hard, and I'm going to have to dig deep to get it, is maybe the best sales pitch for that kind of person.

Thi:

Yeah, maybe it is a good sales pitch. But in the background, I think there's a big difference between that and a theory that says all falsehoods are easy and all truths are hard. That's too easy, right? Then you find the truth by just doing the hardest thing. That's a meta-easiness for someone who's trapped in some Protestant work ethic of "the hardest thing is always the best thing; be more productive," right?

Thi:

Be more overworking, right? I don't... yeah, I don't think so. I mean, I think the nutritional equivalent is "eat the most disgusting thing." That's not actually it. There's incredibly beautiful, delicious, wonderful food that's actually deeply nutritious.

Kyle:

Yeah, but balancing it is more difficult than just eating the most disgusting thing, right? So I mean, you're still aiming at what is actually going to challenge me.

Thi:

Right, yeah. I think one thing I might accept is: given the presence of manipulators who are trying to game you, it's very unlikely that the easiest path is the right path. But that's really different from saying, always do the hard thing, right?

Thi:

Actually, I think the hard thing is starting from the fact that some easy things are true and some easy things are false. If all easy things were false, then this would be trivial: don't believe any easy things. But that's too easy.

Randy:

This is fun. Being able to sit in on a philosophy debate is very enjoyable.

Kyle:

This is what after-conference drinks feel like. Nice, very good.

Randy:

So, switching gears: you've spoken about epistemic traps, epistemic filter bubbles, and echo chambers and the differences between them, and it's all very fascinating to me, especially being a church leader. So can you bring us into it: what are echo chambers? And in particular, what are epistemic filters and filter bubbles, all the words that you use?

Thi:

The words are important. I mean, I started writing on this because I got irritated by the way people use words.

Thi:

So, basically, for me there are two different concepts that people have been blurring together, and it's really important to keep them separate. One concept is an echo chamber, and the other is sometimes called a filter bubble, but I want to call it an epistemic bubble, for complicated reasons. The bubble concept is the one most people have become obsessed with lately. A bubble is some kind of social phenomenon where you don't hear the other side, or you don't get exposed to the other side's arguments. This got really famous from Eli Pariser's book The Filter Bubble, and he was really interested in the fact that, you know, if all your friends on Facebook share your politics, you'll just never hear the other side. You'll never be exposed to the evidence. Lately, people have been using the term echo chamber and the term bubble synonymously to refer to that.

Thi:

But if you actually look at the early research that leads to this concept of an echo chamber, in particular a book called Echo Chamber by Kathleen Hall Jamieson and Joseph Cappella, they have a different concept of an echo chamber. An echo chamber, for them, is a community where people distrust everyone on the outside, and the difference between never being exposed to the ideas of people on the outside and systematically distrusting everyone on the outside is enormous. These are different concepts. So the first thing I want to say is, people blur these things together a ton, and there's a lot of research that says, oh, there's no such thing as echo chambers or filter bubbles, which is all really showing that conservatives know what the liberal arguments are, liberals know what the conservative arguments are, and climate change deniers know what the climate change arguments are. And I'm actually fairly sympathetic to the idea that there aren't many filter bubbles or epistemic bubbles in this world.

Elliot:

Right now, given the media environment we're in.

Thi:

Most of us know what the other side's arguments are. I'm a progressive; I know what Trump's arguments are. We just inherently distrust the other side.

Randy:

Yeah.

Thi:

It's that we think that the other side is systematically biased.

Kyle:

So what's a runaway echo chamber?

Thi:

A runaway echo chamber is a case where the following happens: you pick all your advisors based on your estimation of who's expert or good, but if your notion of who's expert or good is flawed, then you're going to pick bad advisors. For example, if you're a white supremacist, you're going to pick moral advisors who are other white supremacists, and they're just going to confirm your white supremacy, right? Similar thing, actually, and I think people have found this interesting, although it has a lot of political implications: I actually started thinking about this by thinking about art. I was interested in artistic echo chambers because I was in one, and my version of this was, I was raised on European classical music, and all the people I trusted were people who were good at European classical music, and all of them thought rap was shit. So I grew up having no ability to understand rap, and also having picked, because of my classical background, only advisors who, you know, thought that European classical was the highest form.

Thi:

It turns out that not only is rap amazing, but part of the problem is that rap is rhythmically complex in a way that's kind of skew to the complexities of European classical, so that if you're raised on European classical you won't have the rhythmic skill to hear what's going on in rap. But you can maintain your belief if everyone you trust is like, oh, that rap stuff is crap, don't spend any attention on it, right?

Kyle:

So there's a standpoint epistemology here too, yeah. So let's keep with that analogy, then: what brought you out of that echo chamber?

Kyle:

What was it that enabled you to appreciate rap?

Thi:

That's an interesting question.

Thi:

So I think in my case there were two things that happened, and I think this can be generalized. One was, at some point I looked at my shelf of music and I was like, everyone here is white; there's probably something wrong with me. Like, I'm Vietnamese.

Thi:

And it's important that I'm from the kind of Vietnamese family that was wealthy enough to go to French schools and had a conception of French culture as superior. It's probably not that white people are just better at culture; it's that some systematic racial bias has gotten into your education. The other thing was someone I trusted.

Thi:

I met someone who knew a lot of classical, and they were like, no, you should listen to rap, listen to this. And they had really complex, subtle views about classical, so I trusted them, and they got me to pay attention to rap. I think that's a similar thing across all echo chambers. One of the interesting things is, when you see stories of people leaving echo chambers, it does seem to be because of personal relationships of trust. There it is.

Kyle:

Yeah, and this, honestly, is the most terrifying fact about echo chambers to me, because it's not a scalable solution. To depend on the patience and dedication of someone outside your chamber, who has also taken the time to understand your chamber, is not a scalable solution. We're not going to fix climate change if that's what it takes.

Thi:

Since I've written this stuff, people keep asking me, what's the large-scale policy solution? I don't know. Maybe there's not one.

Kyle:

Yeah, no seduction of clarity there. So let's talk about how rationality works in an echo chamber, because one of the things you've pointed out beautifully, which I try to convince other people of all the time, including in a talk I gave just yesterday, is that being in an echo chamber does not make you irrational. Writing people off as stupid or lazy or irrational or uneducated, or fill in the blank with your favorite easy dismissal, for one thing runs afoul of the evidence, and it's just not going to be helpful with any social problem facing us. But it's the easy thing to do, right? So explain to us how it's possible for an echo-chambered person to be acting rationally.

Thi:

To understand this, we have to get rid of a profoundly false conception of how knowledge works that seems to affect a lot of us, though it shouldn't. That profoundly false conception is that we're capable of knowing everything we need to know individually, that we have the ability to know everything that matters on our own. That's the ideal of intellectual autonomy, and it's just obviously false. In the current era, given the sheer proliferation and size of science, no human being can master even one millionth of the amount of knowledge that's out there. I mean, I had a kind of conversion experience because of a book by a philosopher named Elijah Millgram called The Great Endarkenment, and the book's argument is basically that the essential epistemic condition of our era is that knowledge is so hyper-specialized that every single genuine practical conclusion comes from crossing so many fields that no person can actually master the whole argument. Right? Chemical engineers trust statisticians trust physicists. There are these huge, long chains of trust that run totally out of our control.

Thi:

I think the way we start our intellectual life is that we end up trusting other people about tons of things. We trust large institutional structures. I trust my doctor with my life. Literally, my doctor says take this pill, and I'm like, okay. I don't understand any of it. And not just that: if I asked my doctor to explain, my doctor probably couldn't explain all of the chemistry and all of the statistical modeling behind it. My wife is a chemist, and I asked her, you know, how much of chemistry can you explain? And she was like, look, I understand maybe one hundred-thousandth of chemistry. There are neighboring fields of chemistry I know nothing about; it's just so complicated. I just know my little patch of chemistry: I'm good at reading organic chemistry on one set of instrumentation. That's my specialty, and it took me ten years to learn.

Thi:

So our essential position is one in which we're born into the world and have to start trusting people without understanding them, right? We can't monitor and check all the people we trust; we trust large-scale institutions. I mean, I believe that climate change is real, but can I give you the evidence? Nope. I actually did this as an experiment once: I checked out climate change models, and I could understand maybe the first hundred words. It's complicated statistical modeling of meteorological events, and I'm like, I've got no clue, right?

Elliot:

why do I trust them?

Thi:

I trust them because they're professors from places like Princeton and Yale who are published in Science, right? So what I'm trusting is large-scale institutions. And if you grow up with your trust settings set to the wrong large-scale institutions, you can go through the same procedure, using the people you trust and the knowledge gathered from your trust networks to check on new things.

Kyle:

Yeah. So for the average person who believes that, you know, the election was stolen from Trump and handed to Biden, how can that be rational? What large-scale institutions are they trusting, in a way that's kind of blameless?

Thi:

Yeah, this is the point where I genuinely don't know if that position is blameless. I can imagine blameless positions. But at that point you might start to worry about how much blatant counter-evidence is available that people are dismissing.

Kyle:

On the other hand, you could tell another story.

Thi:

That story goes like this: a person might arrive at the view that large-scale mainstream media is corrupted and that only a small news source is to be trusted, and that's not an inherently false view. I mean, people always make fun of philosophers for talking about Nazi Germany, but imagine you're a resistance fighter in Nazi Germany. There it's true that most news sources are corrupt and that only a tiny, tiny fraction of the news sources are trustworthy.

Thi:

That's an available position. But yeah, I don't know if I'm willing to say that that particular case is one in which there's a clearly rational procedure to enter it. I mean, it's just so hard. Because, I mean, think about my views about the importance of vaccination and the deadliness of the COVID epidemic, which are denied by the other side.

Thi:

Right? My views come from believing in the deadliness of COVID and the effectiveness of vaccines. But where do I get that information? I get it from the New York Times and the New England Journal of Medicine, right? I haven't collected it for myself. That information comes from a pre-existing trust in a large-scale set of institutions. So I don't know.

Kyle:

Yeah, so this will be my last question about echo chambers. Do you think there is some kind of moral flaw somewhere in the causal history of all echo chambers? Is there a liar back there somewhere?

Thi:

I don't know; I suspect not. That's not necessary, right? You can get echo chambers without that. All you need is someone to generate a plausible-sounding explanation that's a little too easy: someone who gets too excited by an easy explanation, comes up with one, and is convinced by it. Conspiracy theorists and whatnot.

Kyle:

And there's this, I don't know, method, this tendency of manipulation that a lot of them have that seems remarkably effective, and I want to hear you riff on it a little bit.

Kyle:

So I don't know if you'd call it just information overload or what, but they have this way of dumping really expert-sounding information in front of you and then kind of making it appear as though the ball's in your court: what are you going to do with that, right? And so either you just give up, because you don't have the time to engage or whatever, and of course that's going to make it seem like you lost the argument, and everybody else watching is going to be like, oh look, our guy won. So they make the costs of engaging too high, which makes all the reasonable people self-select out, and you're just left with the people who can't see through it, or are too lazy to see through it, or something like that. So, do you want to talk about that at all? What can the average person who encounters something like that do about it?

Thi:

The thing you're talking about, I think, is a really interesting and subtle strategy, and I was working it out with a guy named Aaron who runs a podcast called Enter the Void. We were talking about, basically, the philosophy of spamming. The essential idea of spamming is that you're generating content where the cost of generating it is really low for you and the cost of dealing with it is quite high. So I think one bad-faith debate strategy is to just generate an incredibly large number of theories and responses that would cost a huge amount of energy to reply to but that you can generate easily. That's a really hard debating strategy to deal with, and one of the reasons we want good-faith conversations is that in good-faith conversations we don't spam each other, right, we don't just overload each other. And I don't know how you deal with a bad-faith arguer who's spamming you.

Kyle:

Or even as somebody who just wants to... You're a good-faith arguer, and you know the other side is a bad-faith arguer, but you also know there's a large audience here that has a number of good-faith arguers in it, or at least a number of good-faith observers, and you want to communicate with them, because you know you're not going to communicate with the other side. What's the strategy for dealing with this kind of information dump?

Thi:

I honestly don't know. If I did, I would tell people, but I don't.

Randy:

Yeah. Damn it, Thi. We want that.

Kyle:

We want the seduction. In my younger days, you know, when I was a certain kind of fundamentalist, I would engage, and I would do my best to humiliate the other person, and I would chase down every rabbit hole. The last person to give up wins, and I was the last person. I get the last word, and therefore the people watching think that I won, and that vindicates my view.

Thi:

One of the things that's interesting, just as a background thought: a lot of the stuff I've been working on in this space is about the problem of expert recognition. This is a problem that's as old as Socrates, right? If you're not an expert, how do you pick the right expert? In particular, how do you tell a real expert from someone who's posing, trying to present as an expert?

Kyle:

The sophist.

Thi:

Yep, the sophist. A lot of the work I do is based on pessimism about this problem; I don't think there's a good solution to it. And I got into it partially because of work about how juries respond to expert witnesses. It turns out that juries tend to treat as experts the people who say clear, unqualified, confident things, but actual experts are often like, these things are super complicated. They qualify things. They'll say that things are unsure for this and that reason.

Thi:

And because of that, juries typically treat them as not being experts, as not knowing what they're talking about. So I mean, I think part of the problem is, if your audience members are already inclined to take clear, confident statements as signs of expertise, and worrying and fussing about details as signs of the opposite, then you're already fucked. I don't know what to do about that, except somehow teaching people ahead of time that clear, confident statements are not actually inevitable signs of expertise. I don't actually know how to do that. Maybe more philosophy classes.

Randy:

Nice. So, changing directions: you've done a lot of research on the philosophy of games, and you've tossed around the word "gamed" as a verb. Can you tell us what the philosophy of games is, what gamification is, and why you're interested in it? And after that, I'll ask you what gamification, QAnon, conspiracy theories, and cults have in common, and how they gamify things.

Thi:

I mean, I literally just wrote a book about this, and you're like, quickly, tell me about it.

Elliot:

Okay, I can do this in 30 seconds. Okay, really briefly.

Thi:

What is the philosophy of games? So I got into this question partially because I was irritated at people who were talking about video games as an art form as if they were a kind of movie, and I was like, games aren't just a kind of movie, they're something special, they're different. So I went down a rabbit hole for five years, and I ended up with this theory. The theory is that games are unique as an art form because they work in the artistic medium of agency itself. What a game designer is doing is not just making an environment or telling a story; a game designer is creating an alternate self for you to be, an alternate agency, designing that agency, and then you pick that agency up and enter into it. Part of that is the designer giving you abilities. I mean, I think everyone recognizes that a game designer tells you, oh, you can run and jump, or oh, you have a portal gun, or oh, you can trade money or bid. But most importantly, a game designer tells you what to want in the game. I got this idea from one of my favorite game designers, Reiner Knizia.

Thi:

He's this German board game design genius; he's been called the Mozart of game design. And in an interview he says the most important tool in his toolbox as a game designer is the point system, because the point system tells the players what to care about. It tells the players whether they're cooperating or on a team or against each other, whether they're trying to collect money or trying to kill each other. It tells you what to want. As a philosopher, though, when you hear something like this, you're like, holy shit, that's right, a game does tell you what to want. That's the core of the theory: a game designer is not just creating a world but creating a self with an alternate value system, one that cares about competing or killing, or building an efficient railway network, or collecting gold.

Thi:

So the basic theory is that game designers, through the point system, specify an alternate value system, and you enter into it. This gives you certain pleasures, and one of the biggest pleasures for me is that games give you a sense of value clarity. In our normal world, values are complex and plural and nauseating and unclear, but in games, for once in your life, you know exactly what you're trying to do, you know exactly what counts as success, and you know exactly where you stand. That's great in games.

Thi:

That's fantastic in games, because in games it's a temporary system: you step into it for a moment and you step back out. So now let's move to gamification. A standard view in the industry is that games are good, so gamification is good. Gamification is any process where we take a normal activity and add points and levels to it. Fitbit gamifies fitness, Duolingo gamifies language learning, a lot of educational software does it. An early gamification I had as a child, I think it was in my elementary school: we got a certificate for a free Pizza Hut pizza for every 500 pages we read.

Randy:

Yes, I did that too. Okay, that's a lot of pages, right? You're getting points and clear levels and clear rewards. Those personal pan pizzas paid off.

Thi:

So here's my worry. Gamification increases your motivation by simplifying the value system. You're not reading for pleasure or richness or curiosity, but just for the numbers. Pages.

Kyle:

You're probably going to read pulp, because it's easier to get the page count up. Right, right.

Thi:

I mean, if you want pizza, you should read the dumbest shit possible, right? So, similar thought with Twitter. There are all kinds of complex values for communication, but Twitter doesn't measure all of them; it just measures short-term popularity. So the worry with gamification, for me, is that in a lot of gamified systems you get an increase in motivation, but you get it by being pegged to, and allowing yourself to be motivated by, a simplified value system, where that simplified value system often has to meet the requirement of being easily instantiable in a mass-producible technology. Twitter can't measure empathy or understanding.

Randy:

It measures people punching the like button, which is much simpler. Yeah. So now I feel like almost everything we've been talking about is crashing down on itself: the seduction of clarity, moral outrage porn, gamification. All of it happens within social media in really potent ways. And I'm scared to ask, because I don't really want to change my social media habits, but can you just tell us: how is social media messing with and fucking up our brains?

Thi:

Right, let me go simple and then I'll go philosophical. The simple version is that it's capturing your motivations and redirecting them along pre-established lines. By the way, this is not a guarantee; it's not like this will happen instantly if you put on a Fitbit or start using Facebook. But insofar as you're motivated by likes, you are now motivated to aim at a pre-established value system that is there partially because it's easily instantiable in a mass technology. In the games book, I call this phenomenon value capture, and value capture is any case where your values are rich or subtle or in the process of becoming more developed, and you get put in a place where the world gives you a simple, often quantified value system, and the simple version takes over.

Thi:

Since I've written the book, I've been working more on this stuff, and I have a better way to put it.

Thi:

I think what's going on is that you're outsourcing your value system. You should be figuring out, in response to the rich emotional experience of being in the world, what you care about, but instead you're outsourcing what you care about to Facebook or Twitter. And I just want to be clear: there's a technophobic way of putting this, and that's not what I mean. I think these are just new wrinkles on something that's been going on for a while. One of the best and most empirically well-studied examples is the coming of law school rankings and university rankings in the US News and World Report. Wendy Espeland and Michael Sauder have a book, Engines of Anxiety, that has a lot of carefully researched empirical evidence for what I would describe as people outsourcing their values about their education and their career to the US News and World Report. And the US News and World Report is really insensitive to any particular person's cares about their life or their legal education; it tracks a few simple, easily accessed data points.

Elliot:

Yeah.

Kyle:

Yeah, and easily manipulated data points too, right? The last university I taught at limited class sizes to 15 students, but only in the fall, because that's the only time the US News and World Report looked at it. In the spring it was double.

Thi:

Yes, filthy gaming. If you read this book, it's just horrifying. One of the main things US News and World Report tracks is the employment rate at the nine-month mark after graduation, so law schools started telling their students to take any job, including at a nail salon, at nine months out, because it would make the ranking go up.

Randy:

Yeah, so good universities are changing their best practices and their value systems in order to rank higher on US News and World Report.

Kyle:

Even though it might be a shittier method of education. And it's not even just US News and World Report, which is at least an organization that hires people who try to be professional. They outsource that shit to, like, one guy who has a blog that's influential. You know who I'm talking about.

Thi:

For people not in philosophy: a lot of philosophy, for a while, was ruled by the equivalent of the US News and World Report, which was just one dude ranking journals and universities, and which became incredibly influential. There's a footnote in my paper about this, by the way. You see the same thing in academic research with things like citation rates and impact factors. I think it's really broad, and I just want to say that if you read the history of bureaucracy and quantification, it's clear this isn't just social media. Social media is one instantiation of a really long trend toward hyper-simple metrification. Other examples might be familiar: grade point averages, right? Grades.

Randy:

Yeah, you're right.

Kyle:

Or, to bring it home: church.

Randy:

Well, yeah, I was just going to say, for you churchy people listening, gamification happens in churches. The way many churches measure success is by the number of baptisms, because the number of baptisms equals the number of people who follow Jesus. So, literally, churches will have a baptism service and they'll have people planted there who've already been baptized, who already know, and they'll say, when we invite people to be baptized, you stand up and come down to get baptized again, because that'll motivate everybody else to go get baptized. Instead of saying, following Jesus is a lot of hard work, it's going to be a lifelong thing, we just want you to get dunked so that we can have it in the stats on our website: this is how many people we've baptized this year. Oh my God. It happens, it happens.

Kyle:

I had no idea. The church I grew up in had a sign hanging on the wall with numbers on it.

Thi:

Yeah: this is the number of people who committed this week, this is what we got in the collection plate last week. Yeah, that is amazing. I mean, another version of this is, get any journalist in a room and start asking them about clicks, and how trackable clicks have completely changed everything about journalism.

Kyle:

Yeah, I mean, we're on a fucking podcast; we care about that. Trackable downloads, it's a big deal to us. People ask. So we've kind of already tackled this a little bit, but just segueing from my comment about church there: if you have any insight about this at all, great; if not, no worries. I don't know how much religious epistemology you're familiar with, but do you think, as an expert on echo chambers, that there's a good explanation of why religious people seem to be so prone to them? And there's some empirical data on this too, so it's not just my sense of it. I just came across a paper the other day, the title of which was "Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism, and reduced analytic thinking." So there's a pretty good amount of empirical data for this. Any insight as to why religious people might be particularly prone to living in echo chambers?

Thi:

I should offer a proviso here, or a qualification, which is that I'm not a scholar of religion, so this is a raw guess. I think for many people, one of the appeals of religion is having a comprehensible understanding of the world. One of the interesting things for me, thinking about conspiracy theories, is the way a conspiracy theory is like a parody of certain scientific and Enlightenment values. This is a thought from Elijah Millgram, the philosopher who got me into this stuff about the size of scientific knowledge.

Thi:

The whole thing that started science was this Enlightenment ideal of intellectual autonomy: we should all be thinking for ourselves, not trusting, not just taking things up on authority. But what it created is a world with so much information that no one can think for themselves, and everyone has to trust this vast realm of experts. One of the appeals of a certain kind of conspiracy theory is that you get to throw away the experts, put it all back in your head again, and explain everything from something you can hold. And I think, not all, but in my experience some religions, or at least some expressions of some religions, offer something like a complete, holdable explanation of the world.

Kyle:

But I want to push on that just a little bit, because in both cases that's an illusion. The average QAnon conspiracist or the average vaccine denier can no more explain to me the deep state, or what precisely is wrong with the Pfizer vaccine, than I could explain general relativity. We're both accepting it on authority, so it's just an illusion that I've got this simplistic explanation.

Thi:

Yeah, I mean, that's the point. It's an illusion, and if you're attracted to that illusion, and that illusion is behind a particular brand of religiosity, then certain conspiracy theories, which offer another version of that illusion, should also be appealing.

Randy:

Sure, yeah, yep, so T? You've referenced a number of books already. Where can we find your stuff? We're going to put links on our show notes to your books, but what's the easiest way to find your stuff?

Thi:

My website is objectionable.net. There are links there to all my papers. My book is called Games: Agency as Art, and I'm still on Twitter as @add_hawk. You can find me in any of those places, along with a lot of increasingly weird papers. I've been getting emails from people saying they're finding my papers more and more disturbing.

Kyle:

Like the newest one on transparency. What is that about?

Thi:

It's called "Transparency Is Surveillance," and it's the claim that institutional transparency is also a form of monitoring that undermines expertise and trust.

Kyle:

Okay, I am immediately suspicious, so I'm going to go read that. Excellent. Awesome.

Randy:

Well, Thi Nguyen, this has been super fun, hilarious, and really insightful. We really appreciate you spending time with us. Thank you so much.

Thi:

Thanks for having me.

Elliot:

It's been a good time.

Thanks for listening to A Pastor and a Philosopher Walk into a Bar. We hope you enjoyed the episode, and if you did, please rate and review the podcast before you close your app. You can also share the episode with friends or family members with the links from our social media pages. Gain inside access, extra perks, and more at patreon.com/apastorandaphilosopher. We're so grateful for your support of the podcast. Until next time, this has been A Pastor and a Philosopher Walk into a Bar. Bye.
