Episode Transcript
The following is a computer-generated transcript.
WEBVTT
1
00:00:14.130 --> 00:01:12.100
Hello. Welcome to the Call It Like I See It podcast. I'm James Keys, and in this episode of Call It Like I See It, we're going to discuss the upcoming move in the clothing space to use AI-generated people to model and display clothing. And we'll get into, more generally, the rapid growth and implementation of AI, growth that's so explosive that recently we've seen tech leaders call for a six-month pause on the training of AI systems. And later on, we're going to react to the recent EPA survey that was released, which found more than 9 million lead pipes in our water supply systems, which can lead to lead in our drinking water in the United States. Joining me today is a man who didn't need to eat a mushroom to become a big man: Tunde Ogunlana. Tunde, are you ready to pull some fire out of a flower for today's show?
2
00:01:12.130 --> 00:01:22.030
Yeah, man. I'm going to say I like that Mario Brothers reference, because I thought you were going to go somewhere else with mushrooms, so I'm glad you didn't.
3
00:01:22.570 --> 00:02:36.500
There we go. There we go. All right. Rest easy, sir. Rest easy. Now, we're recording this on April 10th, 2023, and in the past week we saw Levi's, the jeans clothing company, announce that they were going to be supplementing human models with AI-generated ones to wear their clothing, like on the app and the website, for the purpose of showing people the fit of the clothes in different types of sizes and so forth, or, you know, different shapes, sizes, everything. So they're going to work with a company, Lalaland.ai, to do this. And the stated goal, like I said, is to provide more diversity as far as shapes and sizes and colors, so people can kind of envision how clothes may fit on them. Now, many are uncomfortable about this, and not only just the people that have been uncomfortable about pretty much everything AI is doing or the advancements, but also people even in the fashion industry are saying, well, hold on, if you wanted to be able to show how the clothes fit on different people, just hire models. We have models for that. But nonetheless, either way, it created a lot of waves when this announcement happened. So to get us started, Tunde, what was your reaction to Levi's announcing that it's going to start using AI-generated models, you know, for modeling clothes on the website and on the app?
4
00:02:36.890 --> 00:04:09.520
Um, you know, it was kind of like, whoa, that's new. But then my quick follow-up in my mind was, okay, well, that's inevitable. I just think it's another example of where technology can begin to make inroads at will. And now we have the technological ability to create, for lack of a better term, people out of thin air. You know, at least computers can do it. I just mean, clearly, just from an image perspective, not a real physical being. Not so far. But that's what I'm saying, and I appreciate that we're going to get into all this stuff in the conversation, the nuances, the ethics, things like that. But that's what I'm saying: I don't think we should be surprised going forward as things like this continue. Yeah. And to the point you just made, maybe one day they create an actual living being out of all this technology we've created. But for now, the fact that artificial intelligence itself is good enough now to create images of people, and I can't say of human beings, right, of fictitious people that look like real human beings and like real photographs of human beings, is amazing. And we're going to go a lot of directions in the conversation on it. But I'd say my real answer is it's inevitable. That's why I'm not really that surprised. Yeah.
5
00:04:09.520 --> 00:05:15.980
Yeah. I mean, because I think with the stated goal, you can kind of understand where they're coming from, from the standpoint of, they use the term sustainability. Now, I don't think that's usually what people mean when they say doing something sustainably; they're saying it'd be too expensive to hire models of all shapes and sizes and everything like that to model the clothing. And so in this case, they can show all of these different types of looks at a lower cost to them. And they think it's going to better serve people, because they can say, okay, well, this is what this cut looks like on a person of this size, or a person with these kinds of dimensions, or this is what this color may look like depending on your skin tone, and different things like that, because that is admittedly an underserved area. You know, that's something that's difficult to do in terms of really trying to understand a lot of clothes, like how do they look and so forth. So you get it; now that the capability is here, yeah, okay, you see that somebody is going to fill that need, fill that void, in a way that doesn't, in a substantial way at least, undercut the profitability, so to speak. So I'm with you.
6
00:05:15.980 --> 00:06:23.990
As far as the inevitability, to me, I think people are very uncomfortable with this, though, and understandably so, because we're going down this path basically where everyone becomes obsolete at a certain point, you know. And so now even being a human being and looking like a human being is not something that you can kind of hang your hat on anymore. Like, oh, yeah, you look like a human being. Well, we got a computer that can do that, too, so to speak. And so you get it from that standpoint. But ultimately, and I use this term loosely in this case, this is what, quote unquote, progress looks like, in the sense that they're saying, okay, we can do more with less, so let's do that. It's a lot different if you look at it from a granular standpoint, but from a big-picture standpoint, it's not much different than saying, hey, we can move more stuff if we put a wheel under a wagon than if we just put it on a sled and drag it. All the advancement has always been about, okay, how can we do more with less? So this is more with less.
7
00:06:23.990 --> 00:07:19.040
Yeah, no, you're right. And because here's the thing, right? The article does a good job with this, and this is why I say it's inevitable, because they speak to the company, like you said, Lalaland.ai, that's providing the service for Levi's. And they say, you know, quote, they're not just hiring models, they're hiring photographers, hair stylists and makeup artists for those models. And that's true. I mean, that's my point. You're either hiring a modeling agency, or you're outsourcing that to them, which is not free, right? The agency has to make money and make a profit and pay all these people and stay in business. Or you're dealing with it yourself, which means you also got more activity in your HR department, because you've got more employees. You got sets where people are filming, where, again, we've had all these issues in recent years being in the news, things like the MeToo movement and all that, and you've got to deal with.
8
00:07:19.220 --> 00:07:22.820
The risk of people doing things they shouldn't be doing, correct? That's what I mean.
9
00:07:22.820 --> 00:08:36.880
So now a business can go ahead and offer something. And it's funny, I went to Lalaland.ai's website, and I guess they're British, because everything's in pounds. Their most popular plan on the site, for businesses, for all these services, is 480 pounds a month, and I know the British pound and the dollar are close, so we can say it's probably somewhere between 450 and 550 dollars a month in US dollars. My point in saying that is, if you think about Levi's probably spending hundreds of thousands, if not millions, of dollars a year for what I just said, the infrastructure to have models and all that stuff available at any given time for photo shoots, versus potentially paying as low as $6,000 a year for all that stuff to be right there at their beck and call, and not have the back-end issues like we just discussed. God forbid somebody says the wrong thing on the set or touches someone inappropriately; the employer is going to be brought in as part of that lawsuit, right? So that's why I say it's inevitable. I'm not blaming people that are on sets and get harassed and bring up those things at work, like they shouldn't. I'm not saying any of that.
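For anyone who wants to sanity-check the subscription math Tunde walks through here, a quick back-of-the-envelope in Python. The 480-pounds-a-month figure is the one cited in the conversation; the exchange rate is an assumption based roughly on where the pound traded in early 2023, not a quoted figure.

```python
# Back-of-the-envelope for the subscription math described above.
MONTHLY_GBP = 480    # cited price of the most popular business plan
GBP_TO_USD = 1.25    # assumed early-2023 exchange rate, not a cited figure

monthly_usd = MONTHLY_GBP * GBP_TO_USD
annual_usd = monthly_usd * 12

print(f"~${monthly_usd:,.0f}/month, ~${annual_usd:,.0f}/year")
# ~$600/month, ~$7,200/year -- the same order of magnitude as the
# "as low as $6,000 a year" ballpark in the conversation, versus the
# hundreds of thousands a traditional model/photo-shoot pipeline can cost.
```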
10
00:08:37.180 --> 00:08:39.790
Yeah, the fault there is with the person who did the wrong thing.
11
00:08:39.790 --> 00:09:04.890
Yeah, exactly. I'm not saying that there's anybody to blame as to why this is happening, other than that the technology is allowing it to happen. And as you just said, from the sled to the wheel and all that, just like nature, people want, and businesses want, efficiency. And to kind of cut to the chase, the shortest distance between two points is a straight line, right?
12
00:09:05.010 --> 00:10:10.930
So let's cut to the chase now, man, because here's what it is. Essentially, we are on a path, and we've been on this path for 10,000 years, where technology is going to make people more and more obsolete, eventually, you know, in terms of what we need to do, or what needs to be done and how it can be accomplished. And we've seen, in certain areas at certain times, jumps in that. You look at the industrial revolution and how there was a jump then, in terms of, okay, we needed 1,000 people to do this, now we can do it with 100. And then you go to the assembly line and stuff like that, and where you needed 100, now you might only need 50. Those are big jumps, you know? And what it looks like right now is we're entering another one, with AI becoming more and more practically helpful at solving discrete problems, real things that people want to do. AI is becoming more and more helpful at being able to do that.
13
00:10:10.930 --> 00:11:08.830
Then we're getting to the point where, again, more and more people, and people who might have thought that they were safe, may not be safe anymore. And so it'll lead to some very interesting societal discussions, because ultimately, you know, people need things to do, more or less. And we're going to continue to develop and develop to where technology can do more and more of the things that we used to have people doing. Even in this space or a similar space, we've seen stuff recently as far as image generation, whether it be DALL-E and things like that. And those are getting better; you can describe something and it'll create an image for you, and then boom, done. You don't need a photographer. All of this stuff is AI replacing things that people normally did, in the same way the assembly line did. And so you're going to have the consternation. But as we saw with the assembly line, there is no going back. It's not like we just say, okay, we'll just shut this stuff down, you know?
14
00:11:08.830 --> 00:11:28.270
So it's interesting, too, because as you say that, I can't help but think of the physical aspect of this too, which is 3D printing. You know, while we're talking about digital and visual, 3D printing will only get better, too. And I think they'll probably merge together at some point, where we can create beings through a printer, you know what I mean?
15
00:11:28.960 --> 00:13:04.200
So, yeah, I'm not ready for that one. I'm not either. I'm just saying it's probably coming. Yeah. I mean, well, no, that's what I'm saying, though. We look at this from the standpoint of, this is the direction things are going; we can see that it's happening. And so honestly, the conversations we need to have are less about complaining about it, so to speak, and more about, okay, well, what type of guardrails are possible? What type of guardrails are needed? And so that's the next place I wanted to go with you here: are you concerned, or how concerned are you, maybe, is a better way to say it, about how fast these things are being rolled out and becoming practical solutions offered on the market, with so little in place as far as ethics, as far as how it should be done, or even legality? Because one of the things that stood out to me in a lot of these pieces was that they train the AI systems, whether it be to generate people or to generate images of, say, a lake or something like that, by showing the AI thousands or tens of thousands or millions of images. So they show the AI a lot of stuff, and the AI is then understanding what it's supposed to look like, so to speak, and then creating its own. But it's deriving from other things; it's not just creating it out of thin air. So the legality of it is something that I'm wondering about, like, well, hold on, where's the line there? So what do you think, as far as your level of concern about how fast this is going relative to how little we have in terms of guardrails?
16
00:13:04.380 --> 00:14:31.150
Yeah, I mean, I'm concerned. I don't know, you know, this is one of those things where you made a good point when you talked about the assembly line and you said there is no going back. Once we got technology where robots were building the cars in Detroit and the other parts of the country, we weren't going back to hiring humans for those jobs, right? And for the reasons we just gave for Levi's using AI for model shoots: there's a lot less cumbersomeness in infrastructure when you're not dealing with so many employees and people. So, am I concerned in general? Yeah, because you made an important statement just a couple of minutes ago in this discussion, which was that technology will make people more obsolete. And as we've joked around in various discussions in recent years on the show, one of the jokes we have is that the way our society deals with itself internally, all our infighting and culture wars, is a symptom of being at the top of Maslow's hierarchy, that we just have too much time on our hands because we're not dealing with survival. And those who have been exposed to Maslow's hierarchy understand that being at the very top means all our minds have time to do is worry about trivial stuff. Yeah, and we're not.
17
00:14:31.150 --> 00:14:51.430
Worried on the bottom. At the bottom, it's kind of like your most baseline needs, you know? Yeah. And so as you go up: okay, well, yeah, your most baseline needs, clean water, food, air. Then you go up from there, you've got shelter, things like that. But you get up to the top, and yes, it's like, oh, is she going to say yes when I ask her out for a date? Yeah. You know, things like that.
18
00:14:51.430 --> 00:14:57.420
But this is why I think it's worth pausing and taking a look at this. And that's why I say.
19
00:14:57.430 --> 00:14:58.660
You sound like a tech leader, man.
20
00:14:58.660 --> 00:15:13.390
Yeah, I definitely don't sound like one of those. No, but, no, because I'm going to go through some stats here. You know, in 2019, GPT-2.
21
00:15:14.770 --> 00:15:18.700
And this is the text generator. Yeah.
22
00:15:18.880 --> 00:15:43.480
And so just the ChatGPT that everyone is raving about lately is really the third one, because the second one, in 2019, consisted of 1.5 billion data parameters, which is one way of measuring its capabilities. GPT-3 was born last November, in 2022, so that's the one that just caught everyone's attention, right?
23
00:15:43.480 --> 00:15:45.610
That's the one that caught our attention. We're on four now.
24
00:15:45.610 --> 00:15:53.650
No, that's true. That's what I'm getting at. So the one that came out in '22 comprises 175 billion data parameters. So think about that.
25
00:15:53.650 --> 00:15:55.690
That's 100 billion more, right? Yeah.
26
00:15:55.690 --> 00:16:17.480
Well, no, it's over 100 times more than the one in 2019. And so now, to your point, GPT-4 is releasing this month. So we go from 2019, 1.5 billion, to November of last year, so in about, what's that, two, three years, you've got 1.5 to 175 billion data.
27
00:16:17.480 --> 00:16:20.210
Parameters, 1.5 billion, correct, 1.5.
28
00:16:20.210 --> 00:17:54.730
To 175 billion. Now, between November of last year and, think about it, we're in April, so if it releases in May, call it six months, this one will have 100 trillion parameters. Wow. You see what I'm saying? I can't even fathom it. Whatever was out before the one in November clearly wasn't as good as the one in November, and that one, from 1.5 billion to 175 billion, was a huge leap. And that thing, I mean, my kid actually typed into his iPad that he wanted Drake and, I think, I can't remember, Drake and Tupac, to write a rap about some topic, and ChatGPT rolled out a whole song. And I read the transcript and I was like, damn, this is pretty good. And I have a friend of mine, who will go unnamed for why I'm going to say this, who has a business in the financial compliance world, right? Like SEC stuff, right? Yeah. He said he was getting audited and he had to do something for one of these retirement plan things. So he said he was in a rush, and he went to ChatGPT and said, hey, write me something on this for the regulator, about this and that. He said it wrote four paragraphs, he submitted it, and he passed with flying colors. So think about that. We're talking about going from 175 billion to 100 trillion. I can't even fathom what that might be like. Like our brain, right? Something that can actually really figure stuff out. So that's why I'll pass it back. But that's what I'm saying.
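Taking the numbers in this exchange at face value, the jumps are easy to sanity-check. A quick sketch in Python; the parameter counts are the ones quoted in the conversation, and the 100 trillion figure for GPT-4 was a rumor circulating at the time rather than a confirmed spec.

```python
# Sanity check on the parameter counts cited in the conversation.
counts = {
    "GPT-2 (2019)": 1.5e9,                            # 1.5 billion
    "GPT-3 (per the conversation)": 175e9,            # 175 billion
    "GPT-4 (rumored, per the conversation)": 100e12,  # 100 trillion
}

names = list(counts)
for prev, curr in zip(names, names[1:]):
    factor = counts[curr] / counts[prev]
    print(f"{prev} -> {curr}: ~{factor:,.0f}x more parameters")
# GPT-2 -> GPT-3: ~117x  (the "over 100 times more" mentioned above)
# GPT-3 -> rumored GPT-4: ~571x
```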
29
00:17:54.980 --> 00:18:28.150
It will do that, though, because of the way it's training. As it gets better, it gets more proficient at training itself, so to speak. And so this is not something that's going to slow down; this is something that's going to speed up. And now I'll say this as to how concerned we should be: we should be very concerned. And I don't say that in a draconian way or any kind of alarmist way. But the normal order for these things is innovation, then regulation and guardrails. And you see no reason that that's going to be different now, you know, like we're.
30
00:18:28.570 --> 00:18:33.910
I think you're right, because, as we talked about on a recent show, you can't regulate something that doesn't exist yet. And you also don't.
31
00:18:34.570 --> 00:18:55.950
Even once it first comes onto the scene, you don't really know what it's capable of. So you can't really regulate it, because you don't know what you're trying to prevent. It's all theoretical. And I saw this in one of the things we were looking at this week: what if you had tried to regulate social media in 2008? You wouldn't have known what the vulnerabilities were, or all that stuff. It would have been.
32
00:18:56.320 --> 00:19:00.610
In the dark. Now, me, I might have tried it. Maybe we should go back and try that one.
33
00:19:00.970 --> 00:19:02.980
Well, but see, now we don't know if it could.
34
00:19:02.980 --> 00:19:04.390
Have worked out any worse. That's all I'm.
35
00:19:04.390 --> 00:19:08.950
Saying. Well, but this is my point: there's going to be pain and.
36
00:19:09.760 --> 00:19:12.100
That'll be the rare one. I'll say. Yeah, regulate that sh*t first.
37
00:19:12.610 --> 00:19:27.820
Well, until that time, there's going to be the innovation, and then, lagging behind that, once there's enough societal harm that there is political will, people decide, okay, we need to do something about this. Like, let.
38
00:19:27.820 --> 00:19:42.910
Me stop you on that, because that's a very important point. I just want to swim in that one for a minute here, because you make an excellent point, which is, we have to let things play out before we can figure out, like, where they are. Kind of intelligence.
39
00:19:42.910 --> 00:19:46.510
At least. Like, we can try, but we're going to fail that way as well. Even like.
40
00:19:46.660 --> 00:20:03.260
You know what, I realized what I was thinking of: when we had the recent show on the bank, you know, Silicon Valley Bank, we talked about credit default swaps, how they were a new financial product in '93, but they didn't get regulated until 2010, the Dodd-Frank Act, because it took the collapse of '08. Right. But think about.
41
00:20:03.260 --> 00:20:14.980
The collapse of '08 to crystallize it and to solidify the political will. That's what I'm saying. Yeah. You need the intel, and you need the political will. And a lot of times that comes from pain.
42
00:20:14.980 --> 00:20:39.100
Correct. And think about that. Those are the types of examples we're used to seeing. Now, social media, this is where I wanted to stop you, is a good one that you mentioned, because in '08, you're right, no one could really see what this looked like. The first iPhone was made in '08. Most of us were accessing MySpace and Facebook through our desktops, right? We had no idea that all this other stuff would happen, where in the palm of our hand we could have all this activity, and then what it would do.
43
00:20:39.190 --> 00:20:42.460
Pinging us all day, and we become addicted to the pings.
44
00:20:42.640 --> 00:20:57.760
I remember the stat that in 2018, data surpassed oil as the number one commodity in the world. So all these things about, again, where value would be found, because we didn't understand back then, at least the laymen among us, that data would be that valuable. So. Well, no.
45
00:20:57.780 --> 00:21:11.980
You recall, you know, at the beginning of the social media thing, everybody was saying, well, how is this ever going to make money? That was an open question at that time. No, it's fascinating. Now we look and say, oh, well, it's obvious, but that was an open question in the MySpace and early Facebook days, it's like.
46
00:21:11.980 --> 00:21:55.890
Well, here's why I'm concerned, really, and it speaks to everything we're talking about. It's because now we've learned, after only what, a decade of this really, that Facebook bought Instagram in 2012, and we learned about a year or two ago about the statistical damage that is done to teenagers, especially teenage girls. Right. Depression, suicide, all that. But now the political will isn't there in the same way, because we have the bureaucracy and the, well, no, think about lobbying, and the ability to call someone like Mark Zuckerberg 50 times in front of Congress over the last five, six years, and he keeps lying and they don't do anything about it. So there's the regular inertia of money and power that gets in the way.
47
00:21:55.890 --> 00:22:03.420
Well, yes, that's what I'm saying, though. Bad things have to happen for us to eventually overcome that, and we just aren't there.
48
00:22:03.510 --> 00:22:52.690
Here's my concern, this is where I'm getting at: we only have partial, scripted AI in social media now, which is kind of like the bots, right? Yeah. They're not actually able to make decisions on their own on a dime and say, okay, well, now I'm going to say this because this person tweeted that. They're still kind of programmed and then let out. When we're talking about these 100 trillion data points, we might be at the point where someone hits a button and these bots go out and they actually are like human brains, and they really are able to manipulate and things like that. And so what I'm saying is, the reason why this doesn't end well, to me, is because, number one, with partial, scripted AI, with bots and everything we have now, one reason why the political will isn't there to fix this is because the people in power are manipulating this stuff to stay in power, right?
49
00:22:52.690 --> 00:23:04.780
So it's become very disorienting. And so people don't really know, a lot of people, I should say. Some people do, but a lot of people don't know what's up and what's down anymore. No, no.
50
00:23:04.870 --> 00:23:20.110
Think about, with 100 trillion data points, when you're talking about things like deepfakes and all that, the ability to make things quickly and too real. I mean, really, this is where we could see reality starting to fracture in society in a bigger way, more so than it has now.
51
00:23:20.170 --> 00:24:25.540
But this is what I'm saying, though. My point is that that's the process we have to go through. The only way to the other side, whatever the other side is, is for the pain to happen that's severe enough that we then have the political will to deal with it in some kind of constructive way. And we're not there yet for just social media. And now, to your point, we're about to stack AI on top of that. Social media has been proven to make it very easy to manipulate people on a large scale already; we're going to add AI into that now. And like you said, GPT-4: we thought we were impressed with number three a few months ago, and we'll see what number four gets out there, and when it starts interacting like that with us on social media. And so what tech has done, and this is a good point that you raised, I'm talking about a normal kind of ebb and flow, where you have the innovation, and then following behind that, you have, like, the factories and stuff, and you got kids dying in the factories and all that.
52
00:24:25.540 --> 00:25:08.710
And then it's like, okay, a couple of decades later, hey, how about no child labor? But we had to go through that. We don't get political will without going through that, at least if history is any teacher. And so in this instance, we're going to have issues, things that are going to be terrible, and we're going to go through them. And then ideally, if we maintain through that, we'll have the political will to address it after the fact. But the reason I'm concerned is because there is no other way to addressing it but to go through the pain. And so we basically are looking at it like, okay, there is a tornado on the road in front of us; we have to drive through the tornado in order to get to the other side, and there's no other way to do it. So that's kind of my point with that.
53
00:25:09.810 --> 00:25:12.300
Let's see if we survive that drive. That's.
54
00:25:13.290 --> 00:25:14.370
Yeah, that's. That's the thing.
55
00:25:14.380 --> 00:25:19.800
I get a feeling my car will get sucked up and spit out into another state, you know, by the time that tornado is done. All you got to.
56
00:25:19.800 --> 00:26:02.400
Do is click your heels, man. There's no place like home, and you'll be all right. So, well, where do you come down? I mean, all of this, you know, in some respects, what we're saying is the obvious. It may not be something you think about right away, but, you know, you have people who are on the cutting edge of innovation a lot of times, right now tech leaders, that are saying, hey, whoa, whoa, whoa, we got to stop training AI systems. Anything more than GPT-4, which is the one you just talked about, 100 trillion data points and so forth, anything more than that, we got to chill, like just pause. Six months. They want a six-month voluntary pause. And they're saying if people won't do that, governments need to put on some type of moratorium. I mean, it's. I would say.
57
00:26:03.030 --> 00:26:08.460
That's not going to happen, right? Yeah. You're going to go in someone's lab and stop them somehow, you know? Yeah.
58
00:26:08.460 --> 00:26:35.620
It may be, though, that they're trying to bring more awareness to this, like they're trying to bring this to the forefront. Like, hey, things can go south really quickly here, you know, if we're not careful. And so I don't know if this is more crazy-like-a-fox type of thing, like, of course they're not going to stop, but if we can get the conversation going, hey, let's get the conversation going. So where do you come down on this proposal, beyond the fact, like you said, that it's kind of farfetched to think people would actually do it?
59
00:26:36.670 --> 00:27:11.050
I mean, where do I come down on it? That probably doesn't really matter; I don't have an opinion on it in that way. I would say this: it concerns me that the people that are creating this stuff are now scared of it. That's it. No, I'm on a serious note, right. It reminds me of a few years ago, when there were some EPA regulations on methane that were being jettisoned, right, and the fossil fuel industry told the government at the time: hey, we don't want this, we didn't ask for this. Whoa, whoa, whoa.
60
00:27:11.050 --> 00:27:15.340
And because it was the Trump administration that was doing it. Yeah, you remember that? Yeah. And I just.
61
00:27:15.340 --> 00:27:28.620
Thought, that's what I thought. I was like, hold on. When the fossil fuel industry is saying, hold on, this is a little bit too much deregulation, we didn't say do all this much, that makes me worry, because they know what they're putting into the rivers and streams. Right. So, yeah.
62
00:27:28.870 --> 00:27:32.530
No, real quick, I want to piggyback on that, because you make the.
63
00:27:32.530 --> 00:27:47.610
Industry never says, oh, hold on, you know, come on, regulate me more. They don't say, we have this great new invention, but don't let us research it and put it out yet, you know. Well, but here's the thing: if they do that, that worries me, like.
64
00:27:47.860 --> 00:28:11.000
Because they know more than we do. The fossil fuel companies, remember, knew about the global warming stuff back in the '70s. Yeah, Facebook, you said society didn't know how Instagram was messing with teenage girls? No, Facebook knew, you know. So see, if the industry is like, hey, guys, hold on, then yeah, you're right, because they know more than what we know.
65
00:28:11.150 --> 00:28:33.170
But see, here's the thing, to the point we were making earlier: I think that we're probably correct that no matter what the government says, or they say to other people, some smart computer guy is going to be in some lab somewhere, you know, in the smoke-filled room type of thing, working on his plan. Right. And these are things.
66
00:28:33.170 --> 00:28:39.830
Too, that, hold on, because of that, the people who don't want to fall behind are going to be in there working, doing the same thing. So, yeah, correct.
67
00:28:39.830 --> 00:29:14.310
And remember, ChatGPT-3, not four, but three, allows laymen like me to actually write code, because I can just tell it what I want. You tell the computer what to write. So who's to say that these computers aren't out there right now already writing more code and more code? No one's going to stop that. And so the bottom line is, I think that train has left the station. My concern is everything we're talking about, right, that it's already been shown. Number one, you brought up a great thing between the fossil fuel industry and Facebook, and then I'll bring in tobacco.
68
00:29:14.340 --> 00:29:17.130
Yeah, that's another one. Yeah, they knew. Tobacco.
69
00:29:17.370 --> 00:29:39.660
Tobacco knew it was bad, and they were still selectively selling ads to children, trying to make future addicts. Just like you said about the fossil fuel industry in 1977 being told that they're going to create more climate change, and they kept doing it. And just like Facebook being told by their own internal research that they're causing more depression and more suicide in teenagers. So the bottom line, Facebook.
70
00:29:39.660 --> 00:29:44.100
Is like, well, what's doing it? Oh, it's the notifications.
71
00:29:44.100 --> 00:30:59.270
Yeah. But another thing, as I say, remember, we always used to be kind of surprised about how every single executive in a social media company prevents their kids from getting on social media. So again, it's another example that, okay, when people from the industry are concerned about something, we should listen, because, like you said, they have an inside view that none of the rest of us have. So, yes, that concerns me. And that's what I'm saying, man. What really concerns me is that, number one, the industry, like we said, without being stopped, won't stop, because why would they? And number two is that people have already been shown the way that they can manipulate a population, for power, political ends, all that kind of stuff, as relates to, like I said, partial, scripted AI, like bots online through social media, disseminating information that's false and all that. Imagine when this GPT-4 comes out, 100 trillion data points. That's what concerns me, that we may not get time for the regulators and those who can maybe do something about this to catch up, because those who would be able to stop that may be able to do that through manipulating.
72
00:30:59.290 --> 00:31:01.560
Disaster has to come first, is what I'm saying. What I'm.
73
00:31:01.560 --> 00:31:03.840
Saying is the disaster may be irrecoverable, is my.
74
00:31:03.840 --> 00:31:06.120
Point. That may be the case. It may be the case.
75
00:31:06.120 --> 00:31:09.000
This could fracture countries, is my. Well, the.
76
00:31:09.000 --> 00:32:43.660
One thing I would say, one area where we seem to have gotten ahead of it, was nuclear arms. It seemed like societies around the world, because the bombs were dropped twice and not dropped any more, and, you know, even though more and more people got them, so far people haven't been dropping them. That's one area where it seemed like society was like, okay, hey, we don't need the total catastrophe in order to try to scale this back. So I'm not saying it's impossible. I'm saying, based on what we've seen, normally the pain has to come first. And I would say this, though: it's not that they're not going to stop because, why would they? They have reasons to stop, but they have reasons that they can't stop, because nobody knows that everybody else is going to stop. And because of that distrust around the whole globe, basically nobody's going to stop, for fear of falling behind. The incentive structure is set up to where people can't stop. And that's why I think the tech leaders, I mean, I commend them for saying something; if they see something or think something, say something. But I think they're just talking about delaying the inevitable. If they want to be constructive here, they need to try to help governments, help people be more attentive to and responsive to their governments, so to speak, help the systems actually work the way that they're supposed to work and not be bastions of manipulation. Because if our governments work, then conceivably there can be moves like what was made with nuclear proliferation.
77
00:32:43.660 --> 00:33:55.660
But right now our governments don't really work, and that's in many places around the world, most places around the world. It's only crisis that gets governments to do anything. Nobody can be proactive, because half of the population is always off on some wild goose chase, chasing something or emotionally mad about something, and so forth. And it's the tech leaders whose tools are creating the environment for that, in large part. So they can do things to try to empower a more sober discussion and empower people who want to deal with issues in a more constructive way. It may be to their detriment financially, because, like you said, nobody watches C-SPAN; watching a leader just try to steward their country through tough times is not very interesting, whereas saying that they're all the devil or something, and, you know, evil this and Jesus that, all that stuff is very interesting. So I think they could do something. But this, them saying, hey, let's everybody take a pause, I think that's ridiculous. That's not going to help the situation at all. But there are things, like I said, because there is an example within the last 100 years where the world has come together and said, all right, we probably shouldn't do this anymore if we all want to be here.
78
00:33:55.870 --> 00:34:13.410
So I'm going to give you a quote, because you mentioned someone's name just then. Okay. So this quote, and I quote: Do not worry for tomorrow, for tomorrow will worry about itself. Each day has enough trouble of its own. There you go. Close quote. Do you know who said that? I do not.
79
00:34:13.420 --> 00:34:14.500
Jesus Christ.
80
00:34:14.740 --> 00:34:17.770
You know, so even Jesus predicted this problem. Jesus.
81
00:34:17.920 --> 00:34:20.350
Even, even Jesus was wrong, basically.
82
00:34:20.350 --> 00:34:24.610
Yeah. No, even Jesus predicted that we're going to have some problems tomorrow. Jesus knew about.
83
00:34:24.700 --> 00:34:26.050
Yeah, He must have known about.
84
00:34:26.080 --> 00:34:51.160
I know, man. Well, I guess he knew a lot about a lot of things. But no, here's the thing, and I know you want to jump to part two, but I'll close with this, bro. I think that if this continues, let me just put it that way, because who knows how far this can go and all that. And it's funny, sometimes I think, at the end of the day, all this technology just needs electricity. So if we kind of pull the plug globally, we can reset and go back to hunter-gatherer and still protect humans.
85
00:34:51.160 --> 00:34:58.440
But I take it you got some electromagnetic pulses in your garage? Yeah. In case you need to start walking around.
86
00:34:58.440 --> 00:35:55.660
With those things. Yeah. So, no, but I'm going to quote this article, one of the ones that I know you'll have up in the show notes. Quote: ChatGPT is hard proof, in other words, of the shocking capabilities of a particular species of artificial intelligence called unsupervised machine learning, driven by generative adversarial networks, or GANs. That shocked me, because it was the first time I heard the use of the word species. Literally, the capabilities of a particular species of artificial intelligence. I've never heard it talked about like that before. And then when I look at what we just talked about earlier, about GPT-2, 3 and 4, you think about the evolution of a species. And that's what I'm saying. For the first time ever, this kind of language that we're seeing in these articles makes me think about the worst parts of our sci-fi history. Like, you know, you know.
87
00:35:56.180 --> 00:35:56.480
Because The.
88
00:35:56.480 --> 00:36:04.340
Matrix, the Terminator, you know, meaning that if these things can become species and evolve on their own without, you know.
89
00:36:04.370 --> 00:36:28.710
In there you heard species, but you also heard self-learning. Yeah, so this is, that's what I mean. This train has left the station. It's all about, can we get ahead of it? Right now, it looks unlikely. It looks like we're going to have to go through the pain. Otherwise, I mean, we go through the pain, and then ideally, hopefully, as you said, there's still something there; we can pick up the pieces afterwards.
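Since the quoted article names generative adversarial networks, here is a minimal sketch of the adversarial training idea, assuming PyTorch; the tiny layer sizes and single step are illustrative only. Worth noting: ChatGPT itself is a transformer language model rather than a GAN, but GANs are one of the techniques behind AI-generated imagery like the virtual models in the first topic.

```python
# Minimal GAN sketch: two networks trained against each other.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64

# Generator: turns random noise into a fake sample (e.g., an image vector).
G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
# Discriminator: scores how "real" a sample looks.
D = nn.Sequential(nn.Linear(data_dim, 128), nn.ReLU(), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(real_batch: torch.Tensor) -> None:
    batch = real_batch.size(0)
    fake = G(torch.randn(batch, latent_dim))

    # Discriminator step: learn to call real samples 1 and fakes 0.
    opt_d.zero_grad()
    d_loss = (loss_fn(D(real_batch), torch.ones(batch, 1)) +
              loss_fn(D(fake.detach()), torch.zeros(batch, 1)))
    d_loss.backward()
    opt_d.step()

    # Generator step: learn to fool the discriminator into calling fakes 1.
    opt_g.zero_grad()
    g_loss = loss_fn(D(fake), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()

train_step(torch.randn(32, data_dim))  # stand-in for a batch of real data
# The adversarial pressure between the two networks is what pushes the
# generated samples to look more and more like the real ones.
```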
90
00:36:29.430 --> 00:36:32.670
Yep. So in closing, I think we're going to need more Jesus Christ.
91
00:36:33.540 --> 00:37:37.710
More quotes from Jesus Christ. There you go. So I think we can get to the next topic from there. The second topic we wanted to discuss is, I mean, in terms of just your immediate health, probably a bigger threat than AI, and that is lead pipes that are used in our drinking water system. The water comes from the water treatment plant, or wherever the source is, through lead pipes to your house. And they found over 9 million setups like that. So people are getting their drinking water, or at least getting water into their homes, through lead pipes. So what was your reaction to seeing this? Now, this came from, you know, there's been infrastructure work going on over the past couple of years, and this survey that the EPA did was a part of that. But what is your reaction to this? Is it, I'm glad we know, let's address it? Or is it kind of like, how did this happen? I'll leave it open to you. What comes to your mind when you read that?
92
00:37:38.580 --> 00:37:44.670
I don't know. I'm still freaking out trying to figure out, is this worse than AI or not? But that'll be my problem.
93
00:37:44.820 --> 00:37:46.020
In the short term. It is.
94
00:37:46.140 --> 00:38:18.540
I won't bother the audience with that. Um, I'll call you at four in the morning when I can't sleep. No. So, look, I feel like this isn't something new, right? I mean, we got 9.2 million lead service lines in the US that are corroding. I think the system clearly must have known in some capacity. Maybe what the new study did was bring it all together. Maybe the EPA at the top level didn't know, but obviously the cities, the municipalities.
95
00:38:18.570 --> 00:38:22.280
A full analysis allows you to see it. Like, you might know it's there, but you don't know how much.
96
00:38:22.290 --> 00:39:48.400
That's what I'm saying. So, look, this is the real stuff to me, what government's for, right? Making the trains run on time type of stuff, making sure that, number one, your population has good infrastructure to deliver certain services, like fresh drinking water. And then secondly, like you said earlier with the regulation stuff, once an issue is identified, which, over time, back in the day, was that lead pipes corrode and the lead goes into drinking water, and it's bad for humans, especially children who are developing, then you expect the government to do something about it and fix it, right? Especially when the technology is there to fix it, with other resources like PVC pipes and other materials that don't corrode and don't have lead in them. So that's why, to me, this is pretty straightforward. This is what I would like to see the government spending more time on, meaning both Republicans and Democrats, solving problems like this that we know are there. And we did a show, I think, two years ago, on the Hoover Dam and all that, meaning there's never been any major improvement on that dam since it was built in the '30s; there's been regular maintenance just to keep it from falling apart. But the fact is, we don't look and say, okay, it's been 90 years since that dam was built in this country, with all the efficiencies we now have. With what we just talked about, maybe when that GPT-4 comes out, it can help us solve this. Yeah.
97
00:39:48.850 --> 00:39:50.260
Ask it what we need to do with the Hoover Dam.
98
00:39:50.260 --> 00:40:35.500
Yeah, because our government officials are too worried about, like you said, being on their social media accounts. So maybe the AI will actually do us a favor and keep them occupied, and it'll do their job for them and keep us all actually happy and living well. So yeah, man. But that's what I'm saying: to me, it's just good to see that they found this out and that we have a study done. This is, to me, part of infrastructure, so I'm glad it was done, and to me, we got to fix it. And I'm already thinking, okay, I want to know which companies are getting the contracts to fix this thing, because then I want to invest in their stocks and all that. This, to me, is just how money moves. Let's get this stuff fixed, and make sure that our fellow citizens and kids in our country can grow up as healthy as they can be, and without. Yeah, I mean, I think.
99
00:40:35.500 --> 00:41:46.440
That we should, I mean, this is kind of stating the obvious, but I think we should prioritize not poisoning ourselves. And in the kind of world I would want to see, this would be the kind of thing that's the headline, that is on top of everybody's news and on the scroll and everything like that. I know this isn't sexy like a lot of topics that are more gossipy, you know, even historic stuff, but again, those aren't about us being poisoned. To me, avoiding poison is kind of a priority that I think we could all get behind. And so I was encouraged to see the survey, to actually say, okay, here's the scale of the problem. And so now we have the EPA estimating that the nation's going to need 625 billion to revamp the drinking water infrastructure. And that's like a third increase from the estimate in the last assessment, which was four years prior to this one. Now, here's the thing: we need to do this right away, because if we wait another four years, you're going to poison a lot more people, and then it's going to be another 50%, it's going to be 50% higher, and so forth.
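For readers who want to check the math behind the "third increase" James describes, a quick sketch in Python. The $625 billion figure and the four-year assessment cycle are from the conversation; the implied prior estimate and the next-cycle projection are purely illustrative, not EPA projections.

```python
# Rough arithmetic behind the EPA numbers discussed above.
current_estimate = 625e9                      # 2023 assessment, USD

# "About a third increase" implies the prior assessment was roughly:
implied_prior = current_estimate / (1 + 1/3)  # ~$469B four years earlier
print(f"Implied prior estimate: ~${implied_prior / 1e9:,.0f}B")

# If costs kept growing by a third every four-year cycle (illustrative only):
next_cycle = current_estimate * (1 + 1/3)
print(f"Naive next-cycle estimate: ~${next_cycle / 1e9:,.0f}B")
# ~$469B -> $625B -> ~$833B: the same fix gets more expensive the longer
# it waits, which is the urgency point being made here.
```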
100
00:41:46.440 --> 00:42:44.050
So to me, what I see here is that we need to have a level of urgency in dealing with something like this, because this is one of those kinds of problems that is bad every day it continues, and it gets more expensive the longer it takes to fix it. And we borrow money all the time. I mean, you're a homeowner: you got a hole in your roof, and if you just say, you know, hey, it's all good, I'm just going to go on vacation, or it's all good, I'm just going to do this and that, then, you know what, that hole in your roof is going to get worse, and it's going to cause other problems and stuff like that. That's the kind of thing you need to prioritize. And so, as a nation, I don't know how else to say this, this should be a high, high, super high priority. And it's like, okay, well, the Biden administration says we'll use 15 billion to start working on it, and it's like, well, that's a big difference between what that is from the bipartisan infrastructure bill and the 625 they're saying it needs to fix it.
101
00:42:44.050 --> 00:43:30.010
But think about it. As you're talking, this is just another, I'm sorry, I'm going to beat this drum again because I can't stop, I guess: leadership. Seriously, think about it. Now I'm going to beat up everybody in our government, right, meaning both parties, because I know they both have their messages and they both have their priorities, and they both seem to say that they're so important to the American people, each side, right. This, to me, is something that I would think the majority of Americans, if asked fairly, without any, you know, flowery political language on the questions, just, do you think fresh drinking water is important for you and your fellow citizens? I think it'd be very difficult for me to find somebody to say no. Right. Um, I.
102
00:43:30.010 --> 00:43:42.490
Don't know, man. There have been studies that showed, for example, there are a lot of people that don't want the government to provide things if other people who they don't like get them as well. We've seen that with health care.
103
00:43:42.610 --> 00:45:31.670
We've seen that. That's why I'm saying leadership is important. That's my point on leadership, right? Because, number one, leadership worked, and I say this favorably to the leaders at the time, with something like the CARES Act, right? When we needed to spend 5 trillion in an emergency and push out all this money, it worked, right? Now, there were unintended consequences, inflation and all that, I get it. But we plugged a hole that was needed at the time, and everybody got on board to do it immediately, and it worked in terms of the efficiency of it and the speed at which it got out. Now, everybody could do the same thing and say we need to deal with the infrastructure of our drinking water, which we haven't dealt with, from Flint to Jackson. There's been plenty. I know that there's different reasons why some of these cities have issues with the water, but we have infrastructure issues in a country that shouldn't have these issues, right, meaning the wealth of our country. But, and I will say this, let me say it a different way: our former president wanted to spend a lot of money, it could have been this high, on building a wall across the border, and immediately 50% of Americans were with it. So my point is, leadership does matter, because the right leader could move this country to a position where people would see their pipes and drinking water not being poisoned, and a certain percentage of our population drinking this water, and the future of kids who are growing up drinking this stuff and not developing properly, and they could get behind fixing that, too. But because we don't have leadership in this country that wants to come together and focus on actually fixing problems like this, leadership that only focuses on being on TV and their social media accounts, here we are. And I think you're right, 15 billion.
104
00:45:31.680 --> 00:45:41.900
I think that's not a fair appraisal of the situation, though. The Biden administration is actually the reason why we know this; it's because of the Biden administration. And they do, they are out there talking about trying to fix it. And so that's.
105
00:45:41.900 --> 00:45:44.330
That's one group of leaders. But they need.
106
00:45:44.330 --> 00:46:50.460
Another problem is that, whether to their fault or the fault of their political opposition, they can't get everybody on board with this. And again, you can blame who you want to blame, whether they can't get them on board because they're not persuasive enough, or the other side is just intransigent and doesn't want to get on board with anything that Biden has to do. But we got the debt issue coming up, the debt ceiling, and it's like, okay, well, everybody just put their hands in their pockets, and, hey, we're just not going to agree on this, we're not going to try to move forward. I agree with you that it's a leadership issue to some degree. But see, ultimately, we can't pass the buck like that. This is a democratic society, and these are elected representatives. We have a lot of people who are in Congress who aren't there to govern. And you can rail on the people that care more about their social media accounts, but that's what their people want, the people that voted them in there. They don't hire legislative aides; they hire communication aides.
107
00:46:50.460 --> 00:47:52.320
And so that's a bigger problem: we have people in our government who aren't there to govern, and so this kind of stuff is a waste of their time; this doesn't trend on Twitter. And so that's a bigger issue. But ultimately, right now, I don't think the level of urgency or priority is there; we don't see people go to the mat over something like this. And that's what I would like to see, honestly. This is the type of issue where, if you do want to expose someone as not being interested in governing, then maybe use an issue like this. Don't use a highly charged issue; expose them with something like this. Say, look, hey, let's not have lead in our pipes, you know. And some people still won't get behind it. Like I said, there are plenty of people in our country that would drink water out of lead pipes as long as the people they don't like also have to do it, you know, and that's a problem in itself. But luckily, we can usually outvote those kinds of people. But I think the issue is bigger than that. Again, who's going to go to the mat so that we don't poison ourselves with our water? That's the question I have, really.
108
00:47:52.860 --> 00:48:45.200
Yeah. I mean, look, I guess as you make your eloquent statement, I'll say, I guess the American people need to go to the mat, because that's exactly what... But see, that's my problem. I mean, I didn't plan on this, but I can see some connections here. That's my concern about our first topic, honestly, this AI stuff. And not to get on a rant about politics and all that, but the reality is, we both know most of the public is not getting this information, right? Because they're either, like you said, not getting any of it, or they're being told why it's somebody's fault that this is happening, not being told that, okay, we've got 100 years of lead pipes in this country, and that's just how it is. And the interesting thing, James, for me and you: the number one state with the highest amount is Florida. Yeah, I.
109
00:48:45.200 --> 00:48:46.850
Know. You sent me an article.
110
00:48:46.880 --> 00:49:01.190
25% of it. Yeah. So that's my point. We could be drinking this tap water, and it's not even just drinking; you take a shower with this stuff, right? I mean, I don't want lead in my system, whether on my skin or down my throat, whatever. So the bottom line told us.
111
00:49:01.190 --> 00:49:03.470
That that's bad. I know, but poison.
112
00:49:03.470 --> 00:49:31.710
But, you know, my governor is out there fighting Disney, right, and trying to make bones about his bona fides, about how tough he is against the woke crowd. And so my point is, I recognize our governor can walk and chew gum at the same time, right? He can have his culture wars and all that and also deal with infrastructure and things like that. But it would be nice if our leadership would kind of lead off with things like this as well sometimes, and not just always the pointing fingers. And so that's really my point.
113
00:49:31.710 --> 00:50:44.220
But also, see, this is where you talk about the political leadership, and this is where the tech leaders are foolish, because if they wanted to emphasize this kind of stuff on their platforms, versus the stuff that makes them more money because it's more engaging and it gets people angry and stuff like that, they could do that. But that's not their priority. Their priority isn't creating a society where we can be sober about things like AI, and we can be sober about things like, hey, let's not drink poison. They would rather make a lot of money and drink bottled water and let everybody else figure it out, and then call on people to voluntarily hold off on doing everything, knowing good and damn well that they're not going to hold off on doing it at all. So that's the other kind of sleight of hand that's going on there as well. So nonetheless, the ability to solve our problems we do have; we don't have the will. And as with the previous topic, as with this topic, clearly we need more pain in order to get the will, and I say that collectively, as a people. And that's just really unfortunate, because a lot of these things are foreseeable. And yes, a hallmark of good leadership is that they can avoid foreseeable issues, because there's going to be plenty of issues that are unforeseeable that you got to deal with anyway.
114
00:50:44.520 --> 00:51:03.170
Yeah, no, that's why I don't have much hope, between what we're talking about now in terms of important issues that we would want our leadership to be focused on, like infrastructure. And, you know, let's not even get into all the roads and bridges that are collapsing, right, and the trains that are derailing and all that kind of stuff. That is just, again.
115
00:51:03.660 --> 00:51:07.620
Not to mention carrying all these toxins that we don't even know what's going on with.
116
00:51:07.620 --> 00:51:55.670
I'm not here to beat anyone up. The point is that, you know, if we take this stuff seriously and we have the right guardrails that we should have, then these things should be constantly addressed and discussed with the public. Because think about it: infrastructure is constantly corroding, because nature doesn't stop. And so these are things that we just have to understand are part of our spend and our cost and our way of living. But again, I didn't plan on these two topics actually having some sort of connection, but what we discussed in the first topic is what concerns me, because the incentives aren't going to be there to focus on things like this; they're going to be there to continue to get people looking in other directions, so that the current people in power can stay in power.
117
00:51:55.680 --> 00:52:24.850
Yeah. Yeah. Well, and I'll say this: I mean, I do commend the administration for this, you know, for this study coming out, and the assessment, and all the information. I'll give them credit for that. But now, like I said, I'd like to see them spring into action and try to make this an issue, you know. Just like you got to appraise the situation first. So, yes, you got step one done, but steps two through ten involve solving the problem, and I'd like to see some movement there. So that's good.
118
00:52:24.850 --> 00:52:28.900
I'd like to just know who's getting those contracts so I can buy their stocks. But yeah, we'll keep it moving.
119
00:52:30.070 --> 00:52:39.850
So we can wrap it up from there. We appreciate everybody for joining us on this episode of Call It Like I See It. Subscribe to the podcast, rate it, review us. Tell us what you think. Send it to a friend. Until next time, I'm James Keys.
120
00:52:39.850 --> 00:52:41.170
I'm Tunde Ogunlana.
121
00:52:41.350 --> 00:52:42.580
All right, we'll talk to you next time.