Episode Transcript
[00:00:00] Speaker A: In this episode, we discussed some things that stood out in the 2024 book Filterworld, which looks at what we may be losing, from a personal and a cultural standpoint, as we've handed over so much of our decision-making about what we see to recommendation algorithms, like the ones behind services like Instagram, TikTok, and Spotify.
Hello, welcome to the Call It Like I See It podcast. I'm James Keys, and rolling with me today is a man whose most shocking takes truly have no limit: Tunde Ogunlana. Tunde, you ready to break him off something today?
[00:00:47] Speaker B: Of course, man. Let's do it.
[00:00:49] Speaker A: All right.
[00:00:49] Speaker B: All right.
[00:00:50] Speaker A: Now, before we get started, if you enjoy the show, I ask that you subscribe or like the show on YouTube or your podcast app. Doing so really helps the show out.
We're recording on February 24, 2026, and we continue our culture series today by doing some reading between the lines in the 2024 book Filterworld: How Algorithms Flatten Culture by Kyle Chayka. In this book, Chayka looks at the recommendation algorithms that control what we see: where they came from, how they work to change us and change the trajectory of our culture, where it all may be going, and whether there's any way we can change where it's going if we don't necessarily like it. He's looking at it mainly from a cultural standpoint, not from a news or political standpoint and so forth, even though he acknowledges those kinds of things; he's really drilling down on how this influences our culture. So, Tunde, I know we're not gonna be able to touch on everything in the book today, but to get us started, what was your reaction to the general premise of the book? That algorithmic recommendation has a huge impact on our culture and operates largely to flatten it, or as he describes it, reduce it into simplicity: the least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most. What's your thoughts on that general premise?
[00:02:10] Speaker B: I thought it was very interesting in terms of the idea of flattening.
And you posed the term least disruptive. And I think that made me really realize that for those, let's say, corporate interests, people that want to do marketing and sell stuff to the population, the least disruptive avenue is to try and create a global culture, because then we're all kind of, as consumers, interested in the same things, and that's less disruptive to those who want to disseminate their stuff on us. So I found that concept of things that are least disruptive being easier for.
I don't want to say those in power, like in a conspiratorial way, but I'm thinking those in the corporate industry that want to sell things to us and control the flow of economies, I think that makes life easier for them. So I found that interesting and I agree with it. I think it's happening as, you know, we're going through this period of the early information age.
[00:03:22] Speaker A: Yeah, yeah. I mean, with the culture piece, like with news and politics, I feel like it's been easier to see, you know, whether it be people being put into these groups where they're kind of just in these loops and all they see is one ideology or whatever. That's easier to see. But to me, the focus really has to be on how the algorithms create the incentives for us to conform. If you're a creator, it creates the incentive for you to want to conform without being told to. Because the way these recommender systems work, basically, is their goal is to maximize attention. And one of the quotes from the book is that attention becomes the metric which everything is judged by: does it generate attention? And if you're a commercial enterprise, then being able to generate attention a lot of times is directly correlated to your ability to market, to be able to bring people in your door. So how do you generate attention? Well, 30, 40 years ago, you buy ad space, and that's just a pretty standard thing: you buy the best ad space you can afford, and yada, yada, yada, more people see it. It wasn't personalized to the kinds of people that you want; it was just, okay, we want to do that. The difference you'll see now, though, is that in addition to being able to buy sponsored ads in Google Search or whatever, the ability to organically go viral or pop up on a social media feed or in a Google search is very, very, very valuable. And so what you'll want to do is.
Or what the recommendation algorithm that drives what you see is doing, you know, is looking at all of the things that people are looking at, how much time they're spending looking at it, and then looking at your past activity and trying to determine what to show you next, and what to show you next. And just a brief detour: it's not that I think anyone makes the point that these things shouldn't be there at all. Because where this comes from is the fact that there's so much content being created at all times. How do you decide? If there are 3 billion people on Facebook, they can't just put up everything that everybody's posting; there has to be some way to figure out what to prioritize, what to show people. It could have been just your followers, or, excuse me, just the people you follow. Or it could have been chronological, for just people in a geographic location with you. There were many possible decisions. What was decided upon were these recommendation algorithms, which are tuned to be self-learning, meaning they measure what works and what doesn't, and then adjust their own kind of weights and stuff based on that.
And so these recommender algorithms, though, create an incentive. And we know humans respond to incentives. They create an incentive for you to do what's already popular in order for your stuff to get seen. And if you don't do something that's pleasing, briefly, to people, something that can get into their feeds organically, then you're going to be behind the eight ball compared to everybody else. And so to me, what's really interesting about this is the pressure. Not the pressure that comes from Starbucks or McDonald's, where they own the franchise, or at least the franchise has signed some agreement to say, we're going to be just like you in this other place, with minor tweaks. No, actually, and the second part I want to get to is how this works the other way too, but on the actual merchant or the content creator, these pressures push the creator towards, you know, self-homogenization. Like, I've got to be like whatever the popular stuff is, in hopes, fingers crossed, that this algorithm will serve me up to more people.
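The loop described here, rank everything by attention, surface only the top few, then re-weight whatever held attention, can be sketched in a few lines. This is a hypothetical toy model for illustration only; the function names, fields, and weights are invented here, not any platform's actual system:

```python
# Toy engagement-driven recommender: rank a pool of items, then adjust the
# ranking weights based on whether the recommendations held attention.
# All names and numbers are illustrative assumptions, not real platform code.

def score(item, user_history, weights):
    """Mix global popularity with similarity to this user's past activity."""
    popularity = item["watch_time"]                    # aggregate attention the item gets
    similarity = len(item["tags"] & user_history)      # overlap with what this user engages with
    return weights["popularity"] * popularity + weights["similarity"] * similarity

def recommend(items, user_history, weights, k=2):
    """Surface only the top-k; everything else effectively goes unseen."""
    return sorted(items, key=lambda it: score(it, user_history, weights), reverse=True)[:k]

def update_weights(weights, held_attention, lr=0.1):
    """Self-tuning step: whatever held attention gets weighted more heavily next round."""
    key = "popularity" if held_attention else "similarity"
    weights[key] += lr
    return weights
```

The flattening incentive falls out of `recommend`: already-popular items dominate the few visible slots, so a creator's rational move is to imitate them, which makes them more popular still.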
[00:07:17] Speaker B: Yeah, man, that's actually a fascinating idea, this kind of, now it's like a loop, where the content creator is kind of chasing the flattened culture that's evolved, let's put it that way.
[00:07:31] Speaker A: And flattening it more as a result.
[00:07:33] Speaker B: Yeah, and flattening it more, because they're saying, well, if everyone's doing that, I gotta run in there. And then we're all looking, as the consumer, saying, well, I guess everyone's doing that, and we gotta kind of behave the same way. And that's where, to me, it's very interesting where we're going with this, because, also, the Internet allows the speed of this to take place at a rate which we haven't seen yet in human history. That's why a lot of this, to me, is unpredictable, how this plays out. Because if you look at what I was just saying earlier, the windup of how we got here, to the Internet and these algorithms, it's really based on technology.
So, you know, I could just quickly go through. Let's say, you know, Columbus got lost trying to find a new spice route to India from Spain and wound up in the Americas. Think about the cultural difference between the Incas, the Aztecs and the Europeans, for example.
But there was the technology of sailing ships and all that by that period of human history that made that happen. Then we fast-forward to the 20th century: you could say newspapers and TV and movies brought more people together, closer. And now we've got the Internet. So I think this just continued. That's why I'm always fascinated by the unification theory in the book Sapiens, because I just feel like, yeah, it seems like human culture is unifying closer and closer. And another one, James, I'll mention, that has less to do maybe with technology on the surface, but how it got spread obviously involved technology, would be religion. Because if you look at the European continent or the area of the Middle East and Northern Africa prior to a thousand, two thousand years ago, they were extremely diverse culturally.
And now, you know, Europe's got Protestants and Catholics mostly, and the Middle East and Northern Africa have got Islam. But if you look at Europe prior to Christianity: Goths, Visigoths, Vikings, Jews, you know, Franks, Rus, all these different cultures that had all their Germanic tribes or their different religions and all that. So I think this is something that's been happening for thousands of years. And as you mentioned, because it's culture, it's hard for us to recognize it. But if you look at humanity over these blocks of time, all we're doing is becoming more and more similar, in a sense. And that's why I said I think the Internet now makes it fascinating, because it's speeding it up. And we have all these things in the palm of our hands. And like you're saying, both as consumers and, some of us, as creators, the way that the system now works, the incentives keep us coming closer and closer. And that's why I say that kind of pressure is new. It's like a new world order.
[00:10:15] Speaker A: But that pressure right there is new. Like, that hasn't been happening for thousands of years. Where.
[00:10:20] Speaker B: No, I know, but as I said, it's the new technology. That's why I want.
[00:10:23] Speaker A: No, not the technology part. The technology part is a normal progression, you know, like from ships to television. Yes, ships allow you to see how somebody else dresses or whatever, and then you can decide if you like that, or eventually maybe that might get integrated into your culture in some other way by some creative person or whatever. Television, same kind of thing: everybody sees blue jeans on their TV screen, and they and their tastemakers can decide whether to integrate that into their culture or not. But what's happening now is that the decision on what gets seen is being made by artificial intelligence, so to speak. And it's based on the idea of keeping people in front of the screen, keeping people more engaged, because more engagement is better. So while you can look at it broadly and say it's kind of the same thing, one, yes, the Internet can speed it up, but two, what's happening here is these decisions on how this stuff is going to be integrated, and what's going to be integrated and what's not, aren't being made by human beings. Okay, well, the decision to harmonize is being made by a human being, but based on incentives that are being created by something, and this is where I wanted to go next, something that nobody really understands and very few people have any level of control over. The way the algorithm is making decisions is very opaque. It's very black-box. And one of the concepts talked about in the book was this idea of algorithmic anxiety, which can be experienced by content creators in the sense of: I don't know what I need to do in order to make whatever I'm creating fly, you know, be recommended by the algorithm. And so I'm guessing on one hand, and then I'm.
But I'm trying to do my creative thing, but then I'm trying to tailor it one way, and you don't know. And this has been an observed phenomenon, algorithmic anxiety. And then on the other side, the receiver-of-the-information side, it's been observed in the sense that people don't always know. People who are conscious of these things, who think about it, are like, hey, do I like blank because that's just something I really like? Or do I like it because the algorithm has influenced me over time by showing it to me enough? So now I like this, but did that originate because the algorithm put it in my head? Is that inception from an algorithm, or did I actually genuinely like that, or begin to like it in a more organic way?
[00:12:52] Speaker B: I don't know, on your thoughts on algorithmic anxiety, you know, I was going to say, you sound like a guy with a podcast trying to figure out how his YouTube channel works, or a guy with
[00:13:04] Speaker A: a song trying to figure out how to get it on Spotify, or a person with a painting trying to put it on Insta. It's anything. Any creator.
[00:13:13] Speaker B: No, that's what I'm saying. The fact that you refer to the algorithms as kind of a black box, as opaque, I think is very.
Again, these are things that we need to stop and recognize: that we are operating, all of us as humans right now in the world, me and you. That's why I joke. You and I have a podcast together and we have a channel on YouTube, and honestly, we don't know how the algorithm works. So we are also playing this game as we're talking and doing this show, meaning we do tinker and try and see what works and what doesn't work. And maybe something works for a little while, and then, we don't know, YouTube changes their algorithm, Google decides, we want to just shake it up, guys, and then what we were doing doesn't work.
[00:13:53] Speaker A: You know, like, that does happen, right? Yeah. And the anxiety people experience, by the way, sometimes people will be like, oh, I've been shadow banned. Well, that's a form of algorithmic anxiety right there, because you don't know. You know? Whether or not, you don't know.
[00:14:05] Speaker B: Exactly. Really. And the other thing, because you're right. Like, I see people that have certain words edited out of their discussion because they don't want to be, quote-unquote, shadow banned, maybe, from topics they think the lords of the Internet don't want them talking about. And then I'll see someone else with a video that doesn't have the exact same words edited out, and they got five times more viewers. So I don't know.
[00:14:27] Speaker A: Right.
[00:14:27] Speaker B: And so, but that's what I'm saying too, James. You throw on top of this what you just said about artificial intelligence, and then a discussion we had a few months ago on this show about the dead Internet theory, which is the idea that over 51% of all Internet traffic now is bots. It's not humans.
Again, that's where I agree with you that the analogies I was making about historical technological advances that force humans together more and more, you made a good example of ships and then TV, allowing people in different parts of the world and all that to see each other and begin to mimic and all that.
And you're right, this is different, because it is the first time that humanity, us as human beings, have another form of intelligence, let's call it artificial intelligence here, that is also an actor on the stage with us, and that is doing things on the Internet. Because think about it, James, I've said this to you, joking about some of the comments I've seen on our own YouTube thing.
And I think about making a comment back. And then I got to remind myself, this could be a robot. Like, I might start arguing with a robot. Let me not do that.
And so I think most of us, though, don't stop and think, because people are triggered, and it's normal to be triggered. And when you're triggered, you respond very quickly. And I feel like we've all been in this kind of collective mass psychosis over the last decade or so, because as humans, we're sitting here arguing, having our natural fight-or-flight responses, the cortisol, all these endorphins, emotions, adrenaline, all these things coursing through our veins because of something we're seeing on the screen of our phone.
And it might not be a human on the other side. It could just be, like you're saying, a bot that's algorithmically programmed just to argue with us, because conflict and all that creates activity online, which means money for big tech. So, you know, this is interesting. I don't know where we go from here as a society, but we'll find out.
[00:16:38] Speaker A: You said something there that I think was very on point, because, like, I don't think this book is going to solve the problem, so to speak. I don't think that's the purpose of this book. The purpose of the book is to try to identify something that's going on now. It may be two or three iterations down the road of people looking at it and trying to figure out what to do about it, people who may care, may not, you know; it may be a while down the road before you get to the solution stage. But this, like you said, is about stopping and recognizing what's happening. And I've talked about how the creator will self-police and try to conform, to try to reach the algorithm and be favored by the algorithm, please favor me, you know. And what was interesting in the book, it talked about how, like, Airbnb, you had these hosts that would have these rituals and these things. They didn't know if it helped, but it's like, oh, well, I'm gonna do this, and at 2 PM I'm gonna log in and type this, and they'll do all these little rituals to be like, okay, maybe that'll tickle the algorithm a little bit and make my listing show up more. And it's like, man, this is what algorithmic anxiety is, because you don't understand it, because you don't know. And we've talked about it. Like, I always give the example with weather: before humans understood the way weather worked, they were doing all types of crazy stuff to try to get the gods to bring the rain or whatever.
And so the algorithms are like that now, basically. It's like, oh yeah, I gotta write this word in this place, or I gotta type this here, and I gotta do a little dance, and then the algorithm will show favor on me. People might be doing a rain dance on YouTube. So I think that part about it is interesting, but the consumer side of it is very interesting to me, the consumer of the data, and how it is so well established on the big tech side that we can be influenced and changed, you know, that our minds can be changed, that they can control our minds, basically, by showing us enough stuff. This goes back, we talked about this years ago, like, what was it called? The imperceptible changes in behavior. You know, like, that the algorithms are able to.
The feeds that are fed by the algorithms, by showing us stuff at a certain level of frequency, walking us down a path, can change us. And so you put that here and say, okay, well, so the algorithm can program us: here's what we want, or here's what we don't want; here's what we believe in, or here's what we don't believe in. And it's really interesting to me, going to just another part of the book that played on that. What was the concept called? I'm going to pull this up right now, I'm sorry. It was corrupt personalization, the term that he used, the phrase that he used. And it was how one of the things that's sold to us with the algorithms, whether it be Netflix or Spotify or anything like that, is that they can personalize your experience and say, okay, based on what you have liked, or rather what you have done, we can give you things that would also be acceptable to somebody like that. Now, setting aside the part about whether you should be defined, in their mind, by what you have done, setting that part aside, what's also been observed is that they're kind of gaming you on that. Because it'll be like, okay, yeah, we can show you things that may be the kinds of things that you would like, but we're also going to mix into that our own commercial interests. Like the example that was given on Amazon: okay, they'll kind of learn what your shopping habits are, and then they'll start recommending you Amazon products instead of what you may actually like, something you may like a little bit more than the Amazon product. But they put the Amazon product in front of you first, because it's like, hey, this is an Amazon product.
And so, like, what was your take on, or what stood out to you in, that idea of corrupt personalization, just from the consumer-of-the-information side, and how the algorithms influence us? And again, like you said, this loop that is playing on both sides.
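The corrupt personalization pattern described here, ranking by your taste but quietly favoring the platform's own products, reduces to a one-line tilt in a scoring function. A hypothetical sketch, with invented product names and an invented boost value:

```python
# Toy "corrupt personalization": rank products by the user's own affinity,
# plus a hidden boost for house brands. Illustrative only; the boost value
# and product names are made up for this sketch.

HOUSE_BOOST = 0.5  # hypothetical bump for the platform's own label

def personalized_rank(products, user_affinity):
    """Looks like pure personalization, but the hidden tilt decides near-ties."""
    def effective_score(p):
        base = user_affinity.get(p["name"], 0.0)  # what the user actually prefers
        return base + (HOUSE_BOOST if p["house_brand"] else 0.0)
    return sorted(products, key=effective_score, reverse=True)
```

The point of the sketch: a user who genuinely prefers a name brand 0.8 to 0.6 still sees the house brand on top, because 0.6 plus the hidden 0.5 wins, while the ranking keeps tracking the user's taste everywhere the platform has no stake.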
[00:20:53] Speaker B: No, it's very interesting because I think there was a portion, they call it the disruption of personal taste as well, which I found very interesting. And so here's my true story, which I didn't plan on sharing until you said what you just said.
Which is my dirty little secret, which will be a dirty little secret no more, which is that that happened to me looking for some pants, right? That was on Amazon. So then, of course, Amazon recommends me the Amazon pants, right? Like the brand.
So I'm looking at the pants look good on the site and I'm telling you, this is probably like two years ago, so I don't know what they cost now, but these Amazon brand pants were like 18 bucks.
And so I bought the regular ones.
[00:21:33] Speaker A: Had to have been like 30 or something, right?
[00:21:35] Speaker B: No, they were like friggin 50, 70 bucks. You know, it was like Kenneth Cole pants or something, you know, because it was like slacks, you know, for business.
So, so I bought. So first of all, because I'm tall, I'm six foot four, for the audience, and I'm thin. So I've got a 34, 35 waist, but then my leg, like a 34 length, you know. So it's sometimes, you know, it's hard.
So Amazon I found had every single.
[00:22:04] Speaker A: So instead of a 30 goes out to you, man.
[00:22:06] Speaker B: No, for sure.
[00:22:07] Speaker A: Tough life you live, man.
[00:22:09] Speaker B: I know.
Let's have another show about my life and how I can cry to you.
That might be for, like, one of those specials that the audience has to pay for, you know, to get access to. So we'll do that paywall thing later. Congratulations to Tunde. Yeah, yeah. So you're gonna make me forget about my pants story here?
[00:22:32] Speaker A: No.
[00:22:33] Speaker B: So but what happens, you know how when you've got like. I think normally I see size like by 2, like it's always even numbers. They're size 32, 34, 36. Well Amazon, all of a sudden I noticed it was like every number, right? So it was like size 33, 34, 35. So I decided let me try a pair of 35 waist and you know, the length and all that. I bought one, man, that was the most perfect fit I've ever had in my life, bro. I realized I'm a 35. I was like, wow, I didn't know this. I don't even know we need to wear a belt. I wear a belt now just for the fashion of it. But I don't really need one cause the pants are so perfect and I'll juggle my wife, I gotta stop drinking and eating crack. I can't put on, you know, even a millimeter on my waist now. Cause pants fit so Good. So, bro, so my dirtiest secret is I end up ordering like seven pairs in different colors and stuff. And I just, Dude, I just went out somewhere yesterday and I had on the khaki ones I have, man, some dude gave me compliments stuff. And I was thinking in my head, I could have told this guy I bought him at like, you know, some, some, some, you know, department store of something famous, spent $200. He would have believed me. Like, oh, man, those look good. So, so, James, I, I, I'm sorry to go on this tangent. It's probably way too long for this discussion on this podcast, but yeah, I
[00:23:44] Speaker A: was gonna say, I don't. It's an example of how personalization they actually did find. They, they, they met your needs. They, they figured out what you got.
[00:23:52] Speaker B: But here's where I'll get serious again. Think about what we just talked about, though. I went on looking for pants, and I was looking at other companies' stuff, right? I think Kenneth Cole and Calvin Klein, I remember some of those brands, right?
So then, to your point, the master of everybody in this, you know, the guy who owns the general store, who also owns the coal mine next door and who owns the housing, meaning Jeff Bezos, right? He's got Amazon, he's got the grocery store, he's got the pharmacy, he's got the defense contracts and the cloud system, all that. He has enough power, through the financial prowess that he has, to have this system where he can follow up after I'm looking for the pants.
And even though he has a platform that offers all these other retailers the chance to come in and offer me and you stuff, like pants, he looks and finds out what kind of pants I want, and his algorithm then sends me the same kind of product from his brand, Amazon, his general store, but at a very steep discount. So that's what I mean. You're right, James.
There's no marketing for the Amazon brand like there is for the other brands. So Amazon got to sneak through its algorithm system to see what I like first.
And once it figured out what I like, it sends me its own brand at a lower cost, and I fell for it and bought it.
[00:25:16] Speaker A: Well, that's a little different, though, because Amazon's been doing that a long time. What they do there is they actually look at what performs, what items sell. They do it with clothing; they do it with anything that can be generic. And this isn't much different than the generic product at Walmart or anything like that. But Amazon knows, granularly, which products sell, and those are the products they create the Amazon brands for, the ones that sell the most. Like, some of the more successful retailers have complained about that, you know, or the successful brands have complained about that, where Amazon's taking whatever they have that sells a lot, replicating it, and then, with no marketing budget, they can sell it for a lot less, using the information they're gleaning. That's what I'm saying. But, you know, the example in the book that was interesting to me on this was Netflix, a couple of things that they would do, like how they play around with the thumbnails, which I wasn't aware of. I'm very old school with this, apparently, because I don't just surf and see what it serves up to me. When I go to Netflix, I'm looking for something, and if I don't find it, I just leave. But apparently, and this will probably be very familiar, when you go scrolling through the thumbnails for whatever content you're looking at, what's shown to different people can change. And so there have been reports where, if it's a black user, and there's a black character, even if they have a minor role, the thumbnail had that black character there, to make that person more likely to watch it.
And even though it's the same content, they're just trying to make it seem like it's something that might appeal to you by trying to show you that familiarity. And the other part of that as well was a test they talked about in the book, where Fast and Furious was served to everybody by Netflix. And the person's like, well, but my account is basically for a romantic, you know, love stories and comedies. And they were like, but Netflix is trying to show that to you anyway. They're saying that's personalization, but they're giving that to everybody, and whether that's because of the deal they signed for Fast and Furious, or because their numbers say that if they show it to people enough, they'll watch it anyway, it's not something that's in line with what those people have shown they would want from a personalization standpoint. So the thing that stands out to me about this is the amount of control. On so many occasions, people talk about how they want freedom and they want to be able to make their own decisions, and it seems like in these instances, people don't want to make their own decisions in a lot of cases. And maybe the point of this book is to just say, hey, just so you know, this is us saying we don't want to make decisions anymore. But it's like, I don't know if making the decisions is too taxing, or there's nervousness when it comes to it. The author talked about having to pick a bar for his friends. He was all nervous about it because he didn't know, and he'd rather just look something up on Yelp, because then, if it didn't work out, he could just say, oh, that's what Yelp said.
But there's something to that, to me: that we say we want to make decisions and we want to have autonomy, but then we human beings seem to be very happy to hand that over to an artificial system. Hey, you tell us what we like.
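The per-viewer thumbnail swap described a moment ago amounts to selecting, for each title, the artwork whose featured attributes best overlap a viewer's history. A minimal sketch, assuming a simple overlap count; the field names are invented, and the real system reportedly uses far more sophisticated testing:

```python
# Toy per-viewer thumbnail selection: same title, different artwork depending
# on what is predicted to feel familiar to this viewer. Hypothetical data shapes.

def pick_thumbnail(thumbnails, viewer_affinities):
    """Choose the artwork with the most feature overlap with the viewer's history."""
    def familiarity(thumb):
        return len(set(thumb["features"]) & set(viewer_affinities))
    return max(thumbnails, key=familiarity)
```

So a viewer whose history leans toward romance gets the couple-centric artwork for the very same title that an action-leaning viewer sees fronted by the ensemble cast.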
[00:28:24] Speaker B: Let me follow on that, because this is something I thought as I was preparing for this discussion, man. And we did a show like, two, three years ago about the biologist who wrote a book basically saying that human beings don't have free will.
And I've been intrigued by that ever since, because at first I really disagreed with it. And then the more I learn things like this, the more I've come around to thinking that guy was onto something. And I think that's because I don't like the idea of thinking that I don't have free will.
[00:28:51] Speaker A: Right.
[00:28:52] Speaker B: I like to believe that I have agency and autonomy. And the way our culture is, we're all, you know, especially in America, very individualistic. We don't want to believe that we don't really, you know, control how we behave. But I think that, based on all of these things we've learned about how the Internet works, and you used this term earlier, which I know we've used in prior discussions too: imperceptible changes over time.
And I think many people listening can appreciate over the last decade, whether it's politics, whether something cultural, whatever, they've seen people that they know go down different rabbit holes, and over time, slowly, there's these imperceptible changes that end up kind of sticking, and those people end up in different directions today than they might have been 15, 20 years ago.
[00:29:39] Speaker A: They're imperceptible in the moment, but they cumulatively add up and they're very precise.
[00:29:44] Speaker B: Yeah, that's what I'm saying.
And that's why it was hard for me to accept that at first when I read this book, because I like to think that right now I have free will. I can decide to start cursing you out and end this show or something like that. There's different ways I can choose to act in the moment. But what I've learned more and more in this last decade, with the Internet and books like this, is that we get triggered. And when we're not calm, when we're triggered and stressed, which all of these things on the Internet tend to do to us, we then act irrationally, in an impulsive way, kind of like a herd or mob mentality. And I think that's what the Internet, especially social media, kind of pushes. You see all these culture wars and things like that because...
[00:30:35] Speaker A: Well, but that's looking at it from the in-the-moment perspective, though. Because from a cultural standpoint, remember, it's not even like that. It's that they'll show you things repeatedly over time, and that will...
[00:30:45] Speaker B: Yeah.
[00:30:48] Speaker A: You're not liking those because they made you mad. You're liking them because you kept seeing Amazon brand this, Amazon brand that, and then eventually it's like, oh yeah, Amazon brand, because you've just seen it a bunch of times without even really perceiving it for so long.
[00:30:59] Speaker B: But think about it. That's not my choice then, right?
[00:31:02] Speaker A: No, I understand what you're saying. I'm not arguing with your premise. What I'm saying is that in the context of culture, it plays out differently than in the contexts we've been discussing, and it's less noticeable, as we talked about earlier.
[00:31:14] Speaker B: One more thing.
Yeah, one more thing, and then I know we want to get out of here, because you made this comment about Netflix, and I've noticed that too, 100%. So, for example, I had to get appointed with a company for my insurance license, and what surprised me was it had this thing I had to fill out about myself, and the dropdown already had me selected as Black or African American.
And it just reminded me, oh yeah, these systems have already made an assumption about who someone is based on the data brokers behind the scenes; all this information is out there. So, James, you know me, I'm a big nerd, I watch a lot of history stuff, so my YouTube is set to feed me a lot of that. And the World War II in Color thumbnail for me, I noticed, always showed a black American soldier. I don't know what it looks like for other people, but maybe people who are listening can go put World War II in Color on Netflix and see what it shows to you. And as you're talking, James, it reminds me, this is, I think, one of the biggest things for us to get comfortable with in this new Internet era.
We're used to this idea of ownership contracts. This is mine, this is yours, whatever.
And really, it reminds me that we don't own Netflix. We rent Netflix. Netflix makes the decisions on everything. Like you said, the algorithm's opaque. They're going to decide: if I say I want personalization and they want to put Fast and Furious in front of me, that's their right. It's their company. So this is what I think the ownership...
[00:32:53] Speaker A: The ownership piece is the significant piece, though, because you didn't own Blockbuster either. When you'd go rent something from Blockbuster, you didn't own it. But you at least...
[00:33:01] Speaker B: But the technology didn't allow Blockbuster to come in my house. And that's what I'm saying.
[00:33:05] Speaker A: You understood the layout of the store, though. Okay, I want to go see war documentaries, I go to this section. I want to see best new releases, I go to that section. So you felt a level of control, at least: hey, I can get to where I want to get to, and they're not marketing to me and trying to brainwash me every step I take in the store. And the documentary we keep referencing with the imperceptible changes in behavior was The Social Dilemma, a documentary from 2020 that we did a show on; that might have been pre-video, though. But yeah, we definitely need to wrap this up. One of the things I'll say about the book is that in many cases, you could say it made the case for human curation as the solution. As you get further into it, it talks about, you know, the globalization that the algorithm drives. It talks about, as I mentioned before, how it introduced algorithms and how they date back thousands of years, and what they are at their core and so forth. But then looking forward, it's like, okay, here are the benefits, almost trying to sell the idea that human curation still would have a role, and should have a role, in culture.
What's interesting to me, though, is that if you're looking to this book for the solution, like, hey, here's what's going to make everything better from a society-wide perspective, I just don't think that's there right now. I think we're in the stage, and this book represents a stage, of trying to get a handle on what has happened. Because in many respects, I think a lot of people, myself included, don't recognize all of the ways in which we're being influenced all day, every day. When it was a billboard or television commercials, it's like, okay, I get it. When I'm driving down the highway and I see a billboard, things are trying to influence me. But when I was sitting at home playing with a toy with my kid, nothing was sitting there trying to bombard me. Now you can be sitting at home on your phone, or looking at Netflix, or even the TV. Okay, when I'm watching my show, they might try product placement or whatever, and then there's commercials, and we know what those are; they're not subtle. Product placement is probably the most subtle of all that. But now it's subtle and it's everywhere. Every step you take, every direction you look, it's someone trying to get into your head. And then on the creator side, there's the fine line he talks about between trying to be creative and trying to make it so your creative stuff will even be seen. That's the world we've entered now.
And now that we've entered this world, things move fast, but ideally we'll see the things we like and the things we don't like, and not fall into thinking that what is now is what's always going to be. I think that's the biggest thing we want to avoid. When you read something like this, it's just, all right, let's be aware of what's happening and be on the lookout for ways we can change it if we feel the need, but not feel like it's inevitable, that this is just what's going to be forever, and that all this control that other people, or not even people but machines, exert on our tastes and our opinions is just perpetual. So, yeah, it's interesting, though.
[00:36:22] Speaker B: Welcome. In other words, welcome to the Matrix.
[00:36:25] Speaker A: There we go.
[00:36:26] Speaker B: Like when you said machines and control, I'm thinking, oh, we've seen that movie.
[00:36:30] Speaker A: Hey, hey. I thought you should have just dropped the mic at that point, man. But I think we can wrap from there. That's as terrifying as it is funny. But no, we appreciate everybody for joining us on this episode of Call Like I See It. Subscribe to the podcast, rate it, review it, tell us what you think. Send it to a friend. Till next time, I'm James Keys.
[00:36:48] Speaker B: I'm Tunde Ogunlana.
[00:36:49] Speaker A: All right, we'll talk soon.