Episode Transcript
[00:00:14] Speaker A: Hello, welcome to Call It Like I See It, presented by Disruption Now. I'm James Keys, and in this episode of Call It Like I See It, we're going to react to a piece from 2021 Nobel Peace Prize recipient Maria Ressa that suggests that the level of disinformation and manipulation in today's media environment may be reaching a point where things like democratic elections and the rule of law are in serious jeopardy of continuing, hard stop.
And later on we're going to take a look at common behaviors that could be considered to be sapping our spirit, you know, and just unhelpful to our day to day lives and our ability to keep trucking on. And we'll look at that and consider some of the suggestions made in the article as far as how we can improve on these things.
Joining me today is a man who knows what it takes to rock a rhyme that's right on time.
Tunde Ogunlana. Tunde, are you ready to show the folks how you handle things that get, let's say, tricky?
[00:01:19] Speaker B: Of course, man.
[00:01:20] Speaker C: All right.
[00:01:21] Speaker B: I don't even know how to respond to that, so I'm looking forward to a good show.
[00:01:26] Speaker A: Now we're recording this on April 11, 2022, and last week we saw a piece in the Atlantic that really piqued our interest. It was written by, as I said, 2021 Nobel laureate Maria Ressa, who is a Filipino American journalist and author.
And in it, Ressa goes into great detail on how our modern media and tech environment is systematically making it difficult for things like truth and facts to prevail in the information ecosystem, and ultimately how this could be pretty damning, as you may imagine, for our ability to continue to maintain democratic societies, to say the least.
She explains how many of the pressures we are seeing are relatively new, stating at one point that we've had the equivalent of an atomic bomb hit our information ecosystem, and goes into how our focus on content moderation, which dominates our discussions on what is happening right now, may be missing the big picture on what's going on and what we need to do to address it. So Tunde, to get us started, what did you think about the author's point about how systematized the distribution of disinformation is, and how our focus on just content moderation, that is, having the outlets pull things down when somebody says something that is mean or whatever, kind of misses where most of this action is happening?
[00:02:52] Speaker B: Yeah, it's very interesting.
I think that's why we decided to do it for the show. Out of everything we read last week, this one stood out for that point in particular.
[00:03:04] Speaker A: Because everybody usually looks at, oh, ban this person from Twitter or do this or do that, and that's a part of it. But to read someone lay out a case as to why that's actually not where most of the action is happening. Yeah, it definitely.
[00:03:19] Speaker B: Well, and also I think what intrigued me too is the author had a positive outlook on the future. And I found that to be, or let's put it this way, to me, that differentiated this article from a lot of others that I've seen, because most of it is all just doom and gloom and fear about what this is going to lead to down the road, which I can appreciate.
So it's nice when someone who's not only steeped in this information, but has been a victim and a target of some of the kind of, I guess, the darker side of what the Internet can do when they gang up on you type of thing.
[00:03:57] Speaker A: And she's part of the people out there trying to fact check and do all that stuff too. She's actually working against it.
[00:04:02] Speaker C: But yeah, yeah.
[00:04:03] Speaker B: And that's what I'm saying. Like, it's nice to see that someone who's been at all these angles still says, hey, there's a positive light at the end of the tunnel here. And so that's, you know, I know we'll get into all this in the conversation, but that's what stuck out to me as well, is that there's a way out if we want it, I guess.
[00:04:22] Speaker A: Yeah. You know, it was interesting to me. This reminded me, her focus, a lot of this, is on the tech aspect of this, the tech platforms and the social media.
And she references, specifically, that what we're living in is a behavior modification system. Which sounds a lot like when we've done things on The Social Dilemma, or different deep dives on social media and what it's doing to us, what it's trying to do to us, how it monetizes us and our data and so forth. Behavior modification comes up a lot, and we don't think about it all the time because it's intended to be subtle, it's intended to be very, very short increments where it can move you. But hearing that, and then hearing her break it down into factors, in terms of what's going on with the algorithmic amplification, which is, that's the one where I think all the action is. The algorithms are trying to figure out how best to hook you, and they're doing so in a relatively dispassionate way. They're looking for the greatest reaction, so they're going to tap into your emotion for that. And so I've always looked at that and looked at the social media companies and said they are curators of information through their algorithms. And to see her go into that, and some of the other aspects she mentioned in terms of antitrust, how there are just a few big players, and that ends up giving them an outsized influence and not a lot of competition as far as doing things different ways. And then going along with the algorithmic amplification is the surveillance-enabled personality modeling. So it's not just the algorithm; it knows you better than you know yourself in terms of how it's trying to present information to you and lead you down certain paths. So it sounded like somebody who had studied this stuff really well and really laid it out in a way that, okay, I get how these different factors are coming together to create a behavior modification system, which sounds about as scary as it can sound.
[00:06:25] Speaker B: Yeah. And you bring up a few good points here, because just to break it down, the idea that these computer systems and the artificial intelligence know us better than we know ourselves, that's a very interesting and a very intimidating thing to think about. Because we've talked about this on several recent shows, actually, when we talk about psychology, right. The idea that as humans we want to have control of ourselves, of our surroundings. And not being in control is one of the greatest causes of anxiety in a human being, and depression and fear and all that kind of stuff, where you don't feel like you're in control.
And we talked about.
[00:07:05] Speaker A: That's the interesting thing about that, it's either actual control or even just perceived control.
[00:07:08] Speaker B: That's kind of where I think we're getting at. Because what we've discussed in recent, just random discussions when we've gone down this road, is that we're in so much less control of our lives and our outcomes than we want to believe. And I think the one that comes to mind is the show we did in recent weeks about those people that can sleep with just four or five hours.
They show no negative signs over decades of those patterns. And they're just wired differently.
[00:07:40] Speaker A: And like we will be discussing here pretty soon with The Righteous Mind, the book.
[00:07:43] Speaker B: Or even things like when we talked about the way that our muscles behave differently at different ages. I mean, there's just certain things that we can't control. So getting back to this topic, this article I should say, what I'm thinking of is things like the metadata, and the computers at this point do know us better than we know ourselves, as much as we don't want to admit it. So I'll give you an example from me, so I don't pick on you.
I like to delude myself with how much screen time I have.
[00:08:13] Speaker D: Right.
[00:08:13] Speaker B: I like to think, oh, I don't spend that much time on screens, this and that. You got to love Apple, because every week my iPhone or my iPad sends me this thing saying your screen time was down 30% this week. And I'm thinking, wow, that much? Maybe I was only on 20 minutes a day.
Somehow I was still on four and a half, five hours in a day. You know what I mean? Like, it's never what I would have thought it was if you were to ask me, because I'll delude myself.
[00:08:37] Speaker A: Yeah, well, and let me jump in real quick, because I want to add something to what you're saying, because you mentioned the metadata. And so just in a nutshell, what we're talking about here, as far as how they know you better than yourself, is just like what you said with the screen time: the machine is tracking your activity. And so if you're, for example, on a social media platform and you log in, you have your account, everything you look at, how long you look at it, what you click on, what you like, what you react to, what you comment on and so forth, all of that's tracked. And it's put into simulators and models and so forth to create a profile, a personality profile, which, like I said, we did a show on a movie that talked about this, I think it was The Social Dilemma, and it talked about how it creates a model. And so from that model, it determines what it's going to show you, how often it's going to show you, how it's going to prioritize what it's going to show you, based on the feedback you've given it over the months or the years or whatever, as far as what you react to. So that's what we mean as far as the metadata: that's data it's building up over time, tracking on you, and then again modeling that to figure out what you're going to react to. So that's what we mean when we say it knows you better than you know yourself, so to speak, because that's objective. That's not subjectively what you think. That's just what the data says, so to speak. And that data is tracked from your clicks, your, you know, your eyes and so forth.
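To make that loop concrete, here is a minimal, hypothetical sketch of the kind of engagement tracking, profile building, and ranking being described. The topic names, signal weights, and update rule are illustrative assumptions for the sketch, not the actual code of any real platform.

```python
# A minimal, hypothetical sketch of the engagement-tracking, profiling, and
# ranking loop described above. The signal names, weights, and update rule
# are illustrative assumptions, not any real platform's implementation.

from dataclasses import dataclass, field


@dataclass
class UserProfile:
    # Running interest score per topic, accumulated from tracked behavior.
    interests: dict = field(default_factory=dict)

    def record_engagement(self, topic: str, dwell_seconds: float,
                          clicked: bool, reacted: bool) -> None:
        # Every tracked signal (time spent, clicks, reactions) nudges the
        # model of what this particular user responds to.
        signal = 0.01 * dwell_seconds + (0.5 if clicked else 0.0) + (1.0 if reacted else 0.0)
        self.interests[topic] = self.interests.get(topic, 0.0) + signal


def rank_feed(profile: UserProfile, candidate_posts: list) -> list:
    # Posts predicted to provoke the strongest reaction from *this* user are
    # shown first; the curation is personalized, not generalized.
    def predicted_engagement(post: dict) -> float:
        return profile.interests.get(post["topic"], 0.0) * post["emotional_intensity"]
    return sorted(candidate_posts, key=predicted_engagement, reverse=True)


# Example: after a user lingers on and reacts to outrage-flavored posts,
# the feed starts prioritizing more of the same.
profile = UserProfile()
profile.record_engagement("election_fraud_claims", dwell_seconds=120, clicked=True, reacted=True)
profile.record_engagement("gardening", dwell_seconds=5, clicked=False, reacted=False)

posts = [
    {"id": 1, "topic": "gardening", "emotional_intensity": 0.2},
    {"id": 2, "topic": "election_fraud_claims", "emotional_intensity": 0.9},
]
print([p["id"] for p in rank_feed(profile, posts)])  # -> [2, 1]
```

The point of the sketch is that the "profile" is just accumulated behavioral data, and the ranking simply favors whatever that data predicts will get the strongest reaction out of that specific user.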
[00:09:57] Speaker B: And I think, you know, we got to be clear here, too. I think a lot of times when you talk, like, not you personally, but when people have these kinds of conversations, it can come off sounding like the people that get manipulated by these algorithms and these tactics are idiots, or somehow they're not that smart. And I think we got to caution against that, because the point is that everybody gets manipulated by these things.
[00:10:23] Speaker A: It's a human thing.
[00:10:23] Speaker B: You know, I'm sure you and I included, more than we want to know. And the point, and the reason I say it like that, is it reminds me a bit of tobacco, or some parts of the food industry, you know, when these things are new.
[00:10:36] Speaker D: Right.
[00:10:37] Speaker B: It took a long time for the tobacco companies to acknowledge that they were putting chemicals into the cigarettes, on top of the nicotine, that made them more addictive.
[00:10:45] Speaker A: Yeah.
[00:10:46] Speaker B: So somebody who started smoking cigarettes, their willpower might not be enough to quit.
And then with food, we learned how the food companies put in the right amounts of salts and sugars to make us addicted to certain types of food that aren't good for us in the long run. And I think that with this modern technology of the 21st century, all we're seeing is a shift from that, truly, from kind of ingesting something or inhaling it to getting it in your eyeballs and going to your brain. So now we're just talking about the ability for companies to manipulate not our eating behaviors, or let's say smoking behavior, but actually our mental behavior. And that's what you mean when you're talking about the incremental changes, the behavior modification.
[00:11:27] Speaker A: No, I mean, it's a good example with the cigarettes. So just let me add to that, because I want to tie that back, because a lot of times this stuff kind of talks over what you mean. Because what you're saying there, like with the cigarettes, that was generalized research. That was like, okay, yeah, if you have this combination of chemicals, then it makes it more addictive with the nicotine. Or the food, that's generalized research. You're doing a thousand people and you're seeing how it affects their cravings and so forth. But the interesting thing with the social media and the tech stuff is that that's actually not generalized. It's not, oh, yeah, people who like this or are part of this group generally like this also. It's like, no, no, no. We know because of the things you've clicked on.
Part of that is the groups you're a member of, but also you in particular, we know that you're going to be more susceptible to this message or that message and so forth. So it's the same thing, it's all about trying to get people to buy things, or with social media, it's trying to get people to spend more time there. And so, yeah, I'm glad you took the time to say that so we can make that connection, because it's really about trying to tap in and get more of whatever they want from you, in this case time and attention, and use how our brains work to get there.
[00:12:34] Speaker B: Yeah, and that's what I mean. Like my iPad and my iPhone know me better than I know myself, because I want to delude myself that I don't. I must not spend that much time on it. But like you said, these systems know exactly when we check in and check out of them, what websites we're going to. So my memory can't keep up with that. So at the end of the day, these things do know us better than we know ourselves. And, you know, this all reminded me of one of the shows we did a while ago, probably two years ago, and it was about a mother who had joined a pregnancy group on Facebook, remember?
[00:13:05] Speaker C: Yeah.
[00:13:06] Speaker B: And it was one of these groups that does the natural births. And they don't trust the hospital system, you know, the kind of modern medicine way of giving birth and all that. And remember how I talked about that? These types of groups, they also get, then the algorithm pushes them into QAnon, pushes them into other areas. And so you have people that otherwise, remember, it was slowly incremental.
[00:13:30] Speaker A: Incremental. It's not right away.
[00:13:31] Speaker B: Yeah, yeah. And so that's my point, is that this also has created an environment where a lot more people are getting exposed to things that in prior times they didn't, without this technology. And so I think, you know, one of the things the article cites is a 2018 MIT paper that says that lies laced with anger and hate spread faster and further than facts. And you know what that reminded me of, though? This is where, you know, it is probably fixable long term. And this is about humanity.
The famous quote that I've shared with you in the past from Mark Twain, and his quote was, a lie can make it halfway around the world before the truth can get its shoes on.
[00:14:11] Speaker C: Yeah.
[00:14:12] Speaker B: And so my point is, if a guy in the 1800s came up with that quote, that tells us that this kind of thing about disinformation, and people being more susceptible to kind of fake negative information.
[00:14:25] Speaker D: Right.
[00:14:26] Speaker B: Than they are to truth and facts. This isn't something new, and other people have noticed this in prior periods. It's just, how do we deal with it when it's in your pocket, when.
[00:14:34] Speaker A: It's in this current iteration.
[00:14:36] Speaker B: Yeah. And I'll tell you, this goes through in literally nanoseconds, this information, and then it takes all this extra time to get the truth out.
[00:14:43] Speaker A: Well, and I'll tell you, if you look at the next sentence, she talks about the impact of that, because she says, this is my 36th year as a journalist, and she spent that entire time learning how to tell stories using facts in a way that makes you care. So learning how to craft what happens into something that actually makes you care. So that was a skill. But she said when you're up against a lie, you can't win, because the facts are boring, and the lie, particularly lies laced with anger and hate, is automatically interesting. It doesn't have to be skillfully put together and woven together in a way that's so artful and everything, like what Ressa is trying to do as far as being a journalist. It can be thrown together, and if it hits those certain emotional cues, then it's going to pick up in a way that the truth, the facts, really can't. So, yeah, I jumped on that part as well. And we've heard things like that before, like, oh, lies on the Internet can spread X times faster than the truth or whatever. But to see kind of an explanation with it was really interesting.
So I got a question then. I mean, and this naturally leads to it, and the author, you know, Ressa, talks about this a little bit. But I just want to cut to the chase. Seeing all this, do we have to wonder, or do we need to consider, whether free speech in our modern media environment is actually on its way to becoming a tool of oppression? And the reason I say that is that she says specifically, and I quote, that bad actors are using free speech in order to stifle free speech. So, I mean, what do you make of that?
[00:16:29] Speaker B: I mean, I think it is true. I think that it's pretty obvious that there are actors out there that are using the kind of loopholes that we have in our beautiful system of the Constitution and the First Amendment. And by loopholes, what I mean is the fact that we have a First Amendment where the government generally cannot curtail speech, you know, unless it's causing violence or something.
There's people out there that willfully know they're lying. I mean, let's just put it just what it is. And the government can't tell them to stop lying. And, you know, that's why to me, based on just what we're talking about here, it goes back. I keep thinking about birtherism only because to me, that was such a symbolic thing, that it was a lie that had no proof, but that also the person or those that maybe wanted to derail the lie had to make a choice. Do we spend time on this even though there's no proof of it, and then that means we can't really prove that it's not true either in a sense, or do you just keep working? And I think, well, I mean, that was the thing.
[00:17:41] Speaker A: They rejected the proof, you remember, it was like, oh, well, that's not a real.
[00:17:44] Speaker B: That's what I mean, at some point it's like, how much time do you spend on that if you're trying to work and do your job? And then my point is, what I think gets missed is, even though people at the top weren't talking about this, let's say on the mainstream news networks and all that, it was simmering down in the bellies of the social media and the Internet. And so what I'm saying is that it's just an example to me of how something that otherwise, in prior parts of our history, at least in our lifetime, no one would believe, just someone making a blanket statement like that about a President of the United States, that they weren't born in the country or something.
[00:18:20] Speaker A: Well, someone would. People would believe it, but it wouldn't have been widespread. Yeah, not something like more than half of a political party. It wouldn't have spread like that.
[00:18:29] Speaker B: Yeah. And that's why I don't want to stay on that topic. But the idea is that that's an example where maybe in another system, not in the United States, the government itself would have put a quash on that and just said, you know, you can't go saying this about the leader of this country without solid proof. So all you Internet companies, all you cable news companies, all of you, we're going to tell you to stop saying this.
[00:18:51] Speaker D: Right.
[00:18:52] Speaker B: That doesn't happen in the United States because the government can't do that. So that's what I mean. You take that example, and now we've had another thousand lies since then.
[00:19:02] Speaker D: Right.
[00:19:02] Speaker B: And whether it be the big lie of the 2020 election, whether it be the Russian propaganda that seeped into certain parts of our ecosystem domestically, the model, the.
[00:19:12] Speaker A: Fire hose of falsehood model.
[00:19:15] Speaker B: That's my point of saying that, because of our freedom of speech, it's like, and that's what I'm saying, it's this dilemma. Because I'm going to pick on me and you. I'm not going to say who else out there.
I believe you and I as individuals really appreciate the First Amendment and literally think of it as sacred.
So if me and you are leading this country, so let me just pick on us.
[00:19:38] Speaker D: Right.
[00:19:39] Speaker B: And we were in power, we had people saying we weren't born here or lying like that.
[00:19:43] Speaker D: Right.
[00:19:44] Speaker B: We would be faced with a tough dilemma. We would say either we gotta let it go, or we gotta try and change this Constitution. And let's just say by some magic wand we had enough people in our party that we controlled the Senate, the Congress, and two-thirds of the state houses, and we could really do a constitutional overturn of the First Amendment.
I believe you and I wouldn't do that personally, because you and I appreciate the First Amendment for what it is and understand that people have a right to believe in birtherism and say it. People have a right to, I guess, rehash Russian propaganda. And I guess people have a right, even politicians, I'm learning, to lie about elections. Because what I believe is, the minute you give someone the authority, the power, or you take away the First Amendment, all bets are off. At some point there's going to be a leader that, really, we would regret that we didn't have it.
[00:20:40] Speaker A: Oh, go ahead.
[00:20:41] Speaker B: But that's just to finish off, and then I'll pass it back. I think that's where there's always this deficiency, especially when there are these new technologies in a period of time. Because the people that appreciate living in a pluralistic environment, with people that think differently, and can handle people that think differently, that are saying, okay, I don't believe in the Ku Klux Klan or Nazis, but I believe they have a right to exist.
[00:21:05] Speaker D: Right.
[00:21:06] Speaker B: Because I believe in this country, in the system, we're always going to be at a deficiency to those who would want to usurp it. And I think that's what we're getting at when we're saying that freedom of speech, this kind of freedom of speech in this environment can also be weaponized against the environment.
[00:21:25] Speaker A: Yeah, I mean, well, that's, I think, as you said, kind of the dilemma.
It's the founding dilemma, so to speak, because you're correct in pointing out that if you try to restrict it, the people who would weaponize its restriction are the same people that are weaponizing its existence. Whatever the system is, they're going to turn it on its head, and with it restricted, they could do more damage than with it being something where everybody has freedom of speech. You know, we're not looking at what's happening in the information ecosystem in Russia or China and pining for that. So we can see a real example where the government does control what can be said by anyone and so forth, and that's worse. So what's gonna have to happen here, it's not that we have to get rid of freedom of speech or that we have to curtail freedom of speech in any meaningful way. And there are limits now. It's not a loophole, it's just a feature, not a bug, as far as the way that it works. Because unpopular opinions sometimes are the right opinions. You know, if you're in 1900 and you're saying that Jim Crow is bad, that's an unpopular opinion, but ultimately it became something that we would look at as the popular opinion now. So you cannot stifle, so to speak; people can't only say what's popular or what's generally accepted at a given moment. But ultimately we have to figure out better ways. And that's, I think, what is really the takeaway from this piece, which I found to be very insightful. It was actually trying to say, okay, we're not going to get rid of freedom of speech, we're not going to get rid of technological innovation. So how can we design what we're doing to give us a better chance to combat the people who want to weaponize the system against itself, for their own personal gain or for their own personal grip on power? And that's what has to happen here. I've said it before in other contexts: the people who want to live in a world where facts do matter just have to work harder. They're just gonna have to go out and be more creative and so forth, because you're not going to be in a place where people aren't trying to use freedom of speech in ways that are deceptive. I mean, we have defamation laws; you can't just say anything about anybody. But for people who are public figures, for example, the legal standard is very high as far as what constitutes defamation. It has to be actual malice, and I hate to use official legal terminology, but that means you have to know that it's wrong, like unequivocally know it's wrong, and then keep publishing it. And that gives a lot of wiggle room, because it's difficult to prove somebody absolutely knows something.
So ultimately, well, let me just wrap it up, because what I wanted to say with that is, ultimately you're correct when you point to this: oftentimes when new types of technology are introduced, there is this period, and that's pretty much what we're living in now, this time period where we have these new types of technology introduced, and the immersion, the ability to immerse someone into a behavior modification system.
And so now we have to figure out what's happening, and then figure out ways so that not everyone gets roped into that. Basically, figure out ways to pull more people out and expose them to factual information and so forth, in ways that can also be compelling, or at least that they recognize as important to them.
[00:24:39] Speaker B: Yeah, and I think, you know, the reason I jumped in there at that point was, it's a good observation, because we've talked about this as well: the courts are the ones that have forced the truth to come out.
You know, the courts, whether a certain percentage of this country is paying attention or wants to believe it or not, it was the courts that really proved that the election wasn't stolen by Biden and that, you know, the facts.
[00:25:05] Speaker C: Yeah, yeah.
[00:25:06] Speaker B: That he won.
[00:25:07] Speaker D: Right.
[00:25:08] Speaker A: It wasn't even a close call. Like, it was.
[00:25:10] Speaker B: Like, the courts also were the ones that proved that Dominion Voting and some of these other voting systems and all that weren't fraudulent. Because all those lawsuits, when those companies started suing certain media outlets and certain individuals that were out there touting this, it all crumbled. So like you're saying, if they had proof, where was it? And so again, I mean, thankfully, we have a strong legal system in this country, which I guess is part of the whole Constitution, which is great. The other thing, and this is what I wanted to just give you the props for, is you've said this in private conversations, you know, because, let's say again, let me just pick on me and you, if we ran the whole show, you know, because we would not want to give up.
[00:25:58] Speaker A: Grandeur is coming off you today, man.
[00:26:00] Speaker B: No, I just don't want to pick on anyone. I'm joking with you.
But also because if I do say anything else in terms of identifying any political party or politician, someone might put me in a mental filing cabinet and not listen to the rest of our show. So I think you and I are less threatening than some of the boogeymen out there. But, no, that's my point. Right. People like us that want to defend the First Amendment, even for those that we disagree with.
[00:26:24] Speaker D: Right.
[00:26:25] Speaker C: Yeah.
[00:26:25] Speaker B: I think that that's the message for people like us out there who don't like where this is all going and want to see it change: we just got to be more creative. Because the risk is the desire to try and quash it. Like, let's say we were in power for real, right?
Someone like us that says, well, I'm just gonna do it, and then once I've got a lid on this thing and we've figured it out, then we'll just bring freedom of speech back.
Remember, that's what Palpatine said when he was given the emergency powers.
[00:26:59] Speaker A: I knew this was going to Star Wars.
[00:27:02] Speaker B: Remember, as he went from being Chancellor Palpatine to Emperor Palpatine, it was, oh, I'll give these emergency.
[00:27:07] Speaker A: Powers back as soon as we defeat the separatists.
[00:27:09] Speaker B: Yeah. And it never went that way, remember?
[00:27:12] Speaker A: And he was the one provoking the separatists, but we don't have to go down that road.
[00:27:16] Speaker B: But no. So to end this section off, I just want to actually cite something from the article, because this is, to me, where it all comes together as well, from maybe our domestic ideas of freedom of speech and then the technology. Because I've never seen this before, but this author, this journalist, did a great job, and I see why she's a Nobel laureate, of tying together Stop the Steal and where it came from. I didn't know this till reading this article.
[00:27:40] Speaker C: Yeah.
[00:27:41] Speaker B: So I'll quote here and stop the steal. And for the listeners that's specifically Talking about the 2020 election lie, you can see the narrative of election fraud was ceded in August 2019 on RT.
So RT is Russian television, Russia Today.
[00:27:59] Speaker D: Correct.
[00:28:01] Speaker A: Which is Russia's propaganda network.
[00:28:01] Speaker B: Yep. Correct. So that's over a year before the election.
[00:28:05] Speaker D: Right.
[00:28:06] Speaker B: Then picked up by Steve Bannon on YouTube. This is where I'm getting at with the technology.
Then Tucker Carlson picks it up, and then QAnon drops it on October 7th of 2020. And then President Trump comes in, top down. At that point is when he was starting to yell it from the rooftops. So what it shows is there's a coordination, whether everyone's in cahoots or someone sees something the other did and just picks it up. I mean, I'm not going to sit here.
[00:28:31] Speaker C: Yeah, yeah.
[00:28:31] Speaker B: I'm not. I don't know which one it is, but it doesn't really matter which one, correct? But that's what I mean: there is a collusion of some sort, whether active or passive, with this type of stuff. And that's what I'm getting at with the technology. Because if this was 1955, I believe that under a different set of circumstances, Steve Bannon might have just violated espionage laws. Because if the Russians had some dude in Washington, D.C. in 1955, and he drew something with a piece of chalk on a mailbox and dropped an envelope, and the FBI had a camera on it, and they saw Steve Bannon pick up the envelope, and then he went and typed something in his newspaper, and the next day they found out it's the same thing the Russians told him to write, that would have been espionage. He would have been a traitor.
[00:29:16] Speaker C: Yeah.
[00:29:17] Speaker B: And so this is where it is. Because now, number one, the government can't stop Steve Bannon from being on YouTube, because that's freedom of speech, right? And then we don't know, honestly, like I said, did Steve Bannon get a call from the Russians saying, hey, man, why don't you put this out on the thing? Or did he just naturally see it on RT and decide, hey, I'm gonna gum up the works because I feel like it, and do this? And that's where, again, it takes investigation, it takes time. And as we've learned with the Mueller report, and now with this January 6th investigation, by the time these investigations play out two, three years later, it's too late.
[00:29:51] Speaker A: And that's part of the game plan.
[00:29:53] Speaker B: No, exactly. And that's my. That goes back to the Mark Twain quote, that the lie already made it not only halfway around the world, it made it around the world eight, nine.
[00:30:00] Speaker A: Times before the truth.
There's 30% of people who just buy into it. It's like, so the investigation is too late, and it's not going to come up with anything that will get these people who have staked this as their position, alpha of it. And just one point that I want to make before we move off of this is because we've seen this, is that the government does not tell YouTube that they have to take this down. But private companies like YouTube or Facebook can moderate their own content. They can say who's allowed to put what on their platforms. It's just, it can't come as a directive from the government. So we do see that commonly, as far as what the terms of service, so to speak. Now, it may be enforced sporadically or whatever. But again, that's not. That's not the government. There's no due process in Facebook. You know, there's no due process with YouTube. It's like, they're private companies. You can always just start your own, you know, distribution hub if you want to do that. So. But it's not the government that it comes from. And so it's still consistent with the First Amendment and so forth, because that's a restriction on what the government can do.
[00:30:57] Speaker B: But you bring up something about that. I just want to touch on what you just said. It's interesting, because I never thought of it till you said it this way. So I'm thinking as you're talking, well, why wouldn't YouTube then say, let's not show Steve Bannon's thing, because, you know, it's not factual? And the reason why they wouldn't do it is because they would lose market share. I mean, it's another proof, because they're.
[00:31:15] Speaker A: In the attention business.
[00:31:16] Speaker B: Yeah. Human beings want bullshit, you know what I mean? Like, that's.
[00:31:19] Speaker A: Some do, some do. They want to be, well, enough do, let's put it this way.
[00:31:23] Speaker A: And so, yeah, it has to be really bad for them to take stuff off, because they want everybody's attention, people who believe that and people who don't, and they'll put somebody saying the opposite on their platform, too. And generally speaking, ultimately, we would want them to be generally neutral as far as this. But when people are out there spreading absolute falsehoods, or falsehoods that were seeded in Russia Today, a propaganda network, you would hope that there'd be something done with that. At least, again, from the private company, not from the government, because, yeah, absolutely, you wouldn't want it run from the government in any form. But in light of this, you bring up Stop the Steal, which I noted as well, because she talked about how she saw the same kind of spread pattern with things attacking her. Like, there's a way that it'll happen when it comes from certain ideologies, so to speak, as far as how they'll attack someone trying to get information out there or undermine a democratic institution or whatever. Because remember, and we know this from the bipartisan Senate report from years ago, Russia's goal ultimately is just to sow discord here, to undermine people's faith in our system. And Stop the Steal, that being the case, that plan they hatched in 2019 that got picked up, did a great job, because it did both of those things, you know, their stated goals.
[00:32:38] Speaker B: Yeah. And I think, you know, the big thing I actually got out of reading this article, and kind of just preparing for the day, was to be reminded that there's a profession out there that's actually supposed to deal with this, called journalism.
[00:32:50] Speaker D: Right.
[00:32:50] Speaker C: Yeah.
[00:32:51] Speaker B: And I think that's where, like we've mentioned a couple times already, the technology explosion comes in.
I mean, I think this will probably begin to get fixed over the next generation or so. But, you know, going back to what I said, the example I gave of the information starting on RT, Russia Today, leading to Steve Bannon picking it up and showing it on a show on YouTube, and then it hitting Tucker Carlson.
[00:33:19] Speaker D: Right.
[00:33:19] Speaker C: Yeah.
[00:33:20] Speaker B: That's a direct line from Russian propaganda all the way to an American news.
[00:33:25] Speaker A: Outlet that has the most popular nightly news show.
[00:33:28] Speaker B: Yeah. Fox News is a legitimate news outlet, a mainstream news outlet in the United States. So my point is that the middleman being Bannon.
[00:33:37] Speaker D: Right.
[00:33:37] Speaker C: Yeah.
[00:33:38] Speaker B: YouTube is not a news outlet or anything, right? It's just a platform. So my point is that before all this technology, journalists were the ones guarding the information, in a sense, right?
[00:33:49] Speaker A: Well, no, remember, Ressa says in her piece in the Atlantic, she points to the reason why this happened being when journalists lost their gatekeeping powers over what information really made it out into the public sphere and what information was relegated to just being whispered around at the back of bars or whatever.
[00:34:08] Speaker B: And that's what I'm saying. Like, there may have been, because I don't know if a journalist at Fox News would have picked up something from Russia Today themselves and just blatantly said it, because they've got an editorial room, they've got certain checks and balances. But when it's put on YouTube and made to appear legitimate, then it's not like, I'm just picking this up from the Russians.
[00:34:30] Speaker A: Here's what it is, man. It's behavior modification. And so, no, they wouldn't have picked something up off of Russian propaganda directly in 2012 or 2006, but they might now, because now these barriers are down even more. And so what's happened, basically, is it's a race to the bottom: who can capture and retain the most attention.
And so Fox News, and we saw this with the coverage of the election, they're competing against people that will say even crazier stuff than they will. And so when Fox News initially tried to pooh-pooh the idea that the election was stolen, after the election, they had to reverse course, because they were losing viewers by the tens or hundreds of thousands. And so it creates this situation now where, no, they may not have done it before, but their behavior has even been modified because of everything that's happening. And so, you know, ultimately I think we can move on from there. Like, we hit the point pretty hard. But we're going to have to, as you kind of said, as our society matures with these newer forms of technology, there's going to have to be some other way to figure out how gatekeeping, or trust gatekeeping, can happen, or how trust can be built, where people can rely on the information that they have, or that they're being given, coming from sources they trust, people who at least require that information be truthful for them to really want to run with it. Because, and I'll leave it with this quote from Ressa: without facts, you can't have truth. Without truth, you can't have trust. And without these, you have no shared space, and democracy is a dream. So that's what's at stake, basically. And that's from somebody, like I said, who thinks that, or like you said, somebody who thinks we can get the toothpaste back in the tube, or at least figure out a new tube for the toothpaste to go into.
[00:36:09] Speaker B: Hey, look, once this whole metaverse thing is really good, and it looks really good, then I'm fine. I can just live in a fantasy the rest of my life. I'll just put that little VR thing on my head and I'll be gone.
[00:36:21] Speaker A: We're running from reality.
[00:36:22] Speaker B: Maybe I'll be. Maybe I'll be not. I won't be the president. I'll be like the leader of the world. How about that?
[00:36:26] Speaker A: Your world. You'll be a leader in your own world.
[00:36:28] Speaker B: Fantasy land.
[00:36:29] Speaker A: Yeah. So. But we can jump from there.
I don't want to spend too much time on this, but there was just an interesting article, I think it was in Prevention, and they listed seven unhealthy habits. And these are things that people do, behaviors that are common and so forth, and they just list them out. It's kind of fun, you know. Look, I sent it to you. I'm sure you and I do one or some of these things, or used to, and got through it. So I wanted to ask you, which of these are you guilty of, man? Which one do you have the hardest time with? Or did you, and now you're doing better?
[00:37:07] Speaker B: How much time we got?
The audience, you want to stay with us for two weeks, or where are you at here?
I think it's funny. There's several. I mean, for me, I think there's a little bit of each one in all of us. I would say, for me, definitely, I had to learn how to delegate better.
And the notice who comes through one.
I used to get hurt a lot by friends and stuff. So I like the quote, that injustice collecting causes us to see the glass half empty. I've never heard of that term, injustice collecting.
[00:37:43] Speaker A: So the first one you said, the old habit, is shouldering responsibility for everything. And they say instead, you need to delegate. And so that was one. And the other.
[00:37:52] Speaker B: Well, you know what? Let me stop on that. Because it's easy to say, oh, just delegate. Like, you know, like, you and I are professionals, right?
[00:37:58] Speaker C: Yeah.
[00:37:58] Speaker B: So we might have, like, an assistant that does some of the administrative stuff we might not want to do. That, to me, is the traditional kind of Delegation 101. Or, I remember when you told me you finally hired a lawn person, because you used to like doing your own lawn, but when you had kids, you know, you wanted to spend the time. So that's delegating that responsibility.
[00:38:17] Speaker C: Yeah.
[00:38:17] Speaker B: I think I realized, as I'm getting older, there's more and more. Like, put it this way: me selling my car last year and deciding to go all Uber.
[00:38:27] Speaker C: Yeah.
[00:38:27] Speaker B: I realized that a part of that was I delegated my time of driving to the Uber driver. You see what I'm saying? That's what I'm saying, in my mind, really, I'm learning that I want to keep delegating like that, everything that I really don't want to do. I'm thinking about hiring someone just to do my laundry, honestly, because it takes.
[00:38:44] Speaker A: A lot of time, and it's always.
[00:38:46] Speaker B: Sloppy, and then it sits in the hamper half the time because I don't want to do it. So I'm like, well, if I can find someone for 50 bucks to show up for an hour or two and just make sure once a week all my shit's folded, why not?
So to me, that was an interesting one.
[00:38:59] Speaker A: And then the other one. And then I want to. I want to.
[00:39:01] Speaker B: Yeah, go ahead.
[00:39:02] Speaker A: No, but the other one that you noted was the old habit being tracking who disappoints you. So you keep this running list: oh, this person did this to me, this person did that to me. And instead, the suggestion is to notice who comes through for you.
[00:39:16] Speaker B: Yeah.
[00:39:16] Speaker A: Yeah.
[00:39:17] Speaker B: Well, you know, and that's the key, I think, for me, too, with business, because in my field, you're conditioned to kind of chase people.
And, you know, it's true. You can get caught into kind of having a target, like the big buffalo or elephant, and, you know, you might be chasing them and wasting your time.
And then you look back behind you and there's a stable full of horses that are already with you. You know what I mean?
[00:39:41] Speaker C: Yeah.
[00:39:41] Speaker B: Yeah. So that's kind of what I had to learn, too. Like, let me go back to the horses staying in the stable. They seem to like me, and they're pretty healthy, so let me just continue to feed them and make them healthier, and I'll get healthier.
[00:39:51] Speaker A: Yeah, I mean, that's one of those I think anyone can either start doing, or always make an effort to do better, because.
[00:40:00] Speaker B: You know, it's a tendency.
[00:40:01] Speaker A: Well, it's a tendency in our humanity, it seems, like they say, the grass is greener on the other side, so to speak. So you're looking at what didn't go the way that you wanted it to go, or, oh, well, that's that person, or this one got away or whatever, and not focusing on the things that are going well or the people who deliver, as they put it.
[00:40:25] Speaker B: And so I'm going to say we know some men that act like that in their marriage.
Let's stay silent.
[00:40:31] Speaker A: Everyone, everyone knows men that act like that in their marriage.
[00:40:33] Speaker B: I was going to say that act like that. I'm just saying everyone acts like that. I don't act like that.
[00:40:37] Speaker A: No, no, everyone knows. Everyone knows someone. That's what I'm saying. This is one of the more human ones. Like, I don't think everybody tries to shoulder all responsibility; that's more of a certain type of personality, which you happen to be, and I happen to be that kind of personality as well, but I don't think everybody does that. But this other one, well, let me jump in, because another one that I wanted to mention that I do think is pretty.
This is one of those that touches almost everyone, if not everyone; even if you've gotten a handle on it, you can always do better. The old habit is comparing yourself to people around you, the new habit being thinking about what makes you special. And this is another one of those things where we do measure ourselves against what's around us or what we see and so forth. This is one of the things that makes social media very popular: people are on there checking what other people are doing. And while people don't necessarily admit that's why they're there, that's a lot of it. Now, this is one, though, that I can honestly say I've made an effort for a long time in my life to really focus on. And part of it is, I learned early on about myself that I don't like to just incessantly worry. I like to relax my mind. When there are things I've got to worry about or things I've got to knock out, I like to be able to focus on them. And I just learned that I couldn't keep up. I couldn't worry about what this person's doing, what that person's doing, this and that. I'm just like, yo, I'm always worrying, I'm always thinking about somebody else. And so, almost out of necessity, because of the one I'm gonna get to after I throw it back to you, I had to just focus on myself more in terms of, okay, well, what am I doing? What do I do well? Let me lean into what I do well and not try to be amazing at everything, or go, oh, this person's doing that, this person's doing that, let me try that too. So just out of necessity, I had to, and I still have to, focus on that and make it a point to keep my blinders on, so to speak, and keep moving forward, and not have my head turning side to side seeing what everybody's doing. For an example, it's like running a race. I was coaching my kids' soccer team, and I was telling the kids the other day, if you're running after something and somebody else is running after it too, if you keep looking side to side and seeing where they are, that slows you down. You better look where you're going and run. And so it's kind of like that.
[00:42:52] Speaker B: Yeah. And it's funny because, you know, some of these are tied together. So I could see the overthinking tied together.
[00:42:58] Speaker A: That was going to be my next.
[00:42:59] Speaker B: Grievance, you know, and because part of what you're saying is also about, like we talked about again, going back to the. Our favorite book from last year, the Power of Now, about being present.
[00:43:11] Speaker D: Right.
[00:43:12] Speaker B: Because if you're always worrying and thinking and anxious, that means you're just not present with what you're doing. Like you're saying, if you're worrying about who's chasing the soccer ball with you and you're looking back, you're not present on the soccer ball.
[00:43:23] Speaker A: Yeah.
[00:43:23] Speaker B: So it's all kind of slow down.
[00:43:26] Speaker A: And the overthinking one, the old habit is overthinking; the new habit, or what they suggest instead, is focus on what you can control.
[00:43:33] Speaker B: Yeah, that's it. And that's what I was going to say. Like, the notice who comes through one stuck out to me. When I go back to that example of paying attention to the stable I already have, I realized in reading it that it is kind of tied to the first one, of thinking about what makes you special.
Because the quote they have in that section says, by learning how to focus on ourselves instead of others, we can decrease our stress and anxiety, increase our happiness and self-esteem, and live more purposeful and authentic lives.
[00:44:06] Speaker C: Yeah.
[00:44:06] Speaker B: And I used to look at that the wrong way. I used to think, because people would say, oh, Tunde, you know, it's not about the company, you got to sell yourself as the brand, you know, all that. And I always thought, well, if I just talk about myself, that's arrogant, and I don't want to be arrogant. And so when it says learning to focus on ourselves instead of others, I would have thought that would mean that I'm arrogant. And I never liked arrogance.
[00:44:28] Speaker A: Actually, a true dilemma: how do you practice humility but focus on yourself? That's a real dilemma.
[00:44:35] Speaker B: Yeah, because. Because I don't like being around arrogant people and narcissists. So it's like, all right, I don't want to be like that. But then like you're saying that's. I mean, some of this I think takes maturity and time.
When I was thinking about not chasing anymore and focusing on what I got and allowing me to recreate a system that might attract people to me and me not being outbound, always chasing, then that also had to coincide with my own building of my self esteem in a certain way.
[00:45:02] Speaker C: Yeah.
[00:45:03] Speaker B: Where I had to say, look, dude, it's okay. You are that good at what you do. And it's not about being arrogant; you put in the time and the effort, you're a smart guy, and people seem to like you. So why not talk about what you do for them and all this? It just was a way for me to come at it from a different angle.
And it's just interesting, like, you know, just. Just how these are all kind of related to each other.
[00:45:24] Speaker A: Yeah, yeah. Well, some of these, like I said, are natural tendencies in almost all of us. And what you're talking about there, with the maturity and so forth, there's a thing called imposter syndrome, which sometimes gets confused with humility. Imposter syndrome being that you don't necessarily truly feel like you deserve what you've achieved or where you are and so forth, and people confuse that for humility. So it probably does take, for at least some people, some learning and some balance to be able to understand that you've earned where you are and that you can talk about that without being arrogant, but at the same time still maintain your humility. So there's a balance. This stuff they're throwing out isn't necessarily easy. These aren't end goals or end points; these are targets to work towards. Even if you're good at one or two of them or whatever, it's still something that you continually have to work at, because you're counteracting, a lot of times, some of your natural tendencies.
[00:46:26] Speaker B: Yeah. And it's, you know, that's why it's interesting with just all this like focusing on what you can control. Like you mentioned, I'm thinking, wow, that's probably why I got rid of cable news and got off social media.
Because you're right, being plugged into those specifically those two things we just spent time on the first section discussing, you know, had my mind feeling chaotic.
[00:46:50] Speaker D: Right.
[00:46:51] Speaker B: Had me, had me realizing I was caught up in.
[00:46:54] Speaker A: Things that you have no control over.
[00:46:55] Speaker B: Correct. That's exactly it. So the one thing about focusing on what I could control was, well, you know what I can control? I can control turning all that stuff off.
[00:47:03] Speaker C: Yeah.
[00:47:04] Speaker B: And that's what I did.
[00:47:05] Speaker A: And it also, I guess, is the same thing which I mentioned briefly before: it immerses you in comparing yourself to what other people are doing, because that's what people do there.
[00:47:13] Speaker B: And so. But the one, before I know we got to wrap it up, that stuck out to me a lot was, the old habit they say is shopping for happiness. And they say instead, revel in non-material joys and experiences.
[00:47:26] Speaker A: Yeah, that's a good one.
[00:47:27] Speaker B: And you know, it's just been the last two years, kind of post the pandemic shutdown, with us being at home more and all that.
That one stuck out to me a lot, because remember when the shutdown first happened, when we were all really shut down back in, like, March, April of 2020.
And I would see these articles after a few weeks coming out about how children were having such a good time at home with their parents.
[00:47:53] Speaker C: Yeah.
[00:47:54] Speaker B: And you know, we were all paused from the rat race, kids weren't being chauffeured around to all these different stressful activities and stuff. And again, anyone with kids knows, the non-material joy is just having your five-year-old jump on your lap and hang out. And that's what got me thinking about it when I read this. It's like, that doesn't take anything, right? It's just non-material, but it's just so joyful, just being with your family.
[00:48:25] Speaker A: Well, it does take presence, though. That's the thing. And so that was almost forced presence at that moment. But that's something, again, that's a target that you can set: that's something that you can appreciate, your loved ones, and being with them and engaging in things with them. And for me, with that one, the thing I always like to keep in mind is that, one, nothing is promised, and two, the things that are happening now, there'll be other things, hopefully, Lord willing, in the future, but you got to experience these things now, because once they're gone, that phase is gone, so to speak. Like, I joke with my wife sometimes about certain things that our kids used to say, like the cute little way they said it when they couldn't talk that well, and I'm like, yo, we'll never hear that again. We used to hear it and smile, like, oh, that was so cute, and we'll never hear that again. And so you want to take time to enjoy that stuff. And obviously, Amazon has made a huge business based on people wanting to get joy from shopping, shopping for happiness. You can do it from home, do it on your phone, whatever. And there's nothing wrong with that. It's a consumer society; I'm not here to rail against that. But there is more that you can get and more that you can cultivate, you know, beyond that. And so, yeah, that's a good reminder.
[00:49:39] Speaker B: Again, a good start.
So it made me think of things that I've done in my life too, like, you know, experiences, right? Boating, golf, traveling. You think about it, yeah, it might be material to pay for your golf round or your clubs, but when you're out there hitting the balls, that's kind of a non-material.
[00:49:58] Speaker A: Friends or you know, enjoying people's company and so forth.
[00:50:00] Speaker B: Yeah, so that's, I think, a good one that a lot of us often forget.
[00:50:05] Speaker D: Right.
[00:50:06] Speaker C: Yeah.
[00:50:06] Speaker B: That it's just, you know, sometimes you need to take the you time, and, like we joke sometimes about playing PlayStation, just take the time and do nothing. And that can be enjoyable.
[00:50:16] Speaker A: So, yeah. So, I mean, we can wrap up from there, but we appreciate everybody for joining us on this episode of Call It Like I See It. You can get us wherever you get your podcasts. Subscribe to the podcast, rate it, review us, tell us what you think, and share it with a friend. And until next time, I'm James Keys.
[00:50:32] Speaker B: I'm Tunde Ogunlana.
[00:50:33] Speaker A: All right. And we'll talk to you next time.