Hands-on With Gemini Live
August 21, 2024 · 1:02:13

[00:00:00] This is AI Inside, episode 31, recorded Wednesday, August 21st, 2024. Hands-on with Gemini Live.

[00:00:11] This episode of AI Inside is made possible by our wonderful patrons at patreon.com slash AI Inside Show.

[00:00:17] If you like what you hear, head on over and support us directly, and thank you for making independent podcasting possible.

[00:00:34] Hello, everybody. Welcome to another episode of AI Inside, the show where we take a look at the AI that is layered like a nice, delicious lasagna through so many things that we use and hopefully do not eat.

[00:00:46] I am one of your hosts, Jason Howell, joined as always by my co-host, Jeff Jarvis. How are you doing, Jeff?

[00:00:52] Hey there. So I've started using the lasagna thing all over now.

[00:00:56] Yeah. I mean, it makes sense.

[00:00:58] Or Napoleon, but that might be a little more confusing.

[00:01:01] Or Napoleon or Neapolitan.

[00:01:05] That's ice cream. You could do that, too.

[00:01:07] Napoleon is the epic history thing, yeah.

[00:01:09] But I think lasagna works.

[00:01:11] I think lasagna works.

[00:01:13] For some reason, we feel the need to describe our AI as if it's food.

[00:01:17] Yes.

[00:01:17] I don't know. Maybe that has something to do with being hungry.

[00:01:20] It's a Big Mac.

[00:01:24] It's a Big Mac.

[00:01:26] Okay. For a second there, I thought you said Big Mac, and I was like,

[00:01:30] where is he going with this? It's a Big Mac. That's true.

[00:01:34] You've got the bread in the middle, the cheese.

[00:01:36] Right. You've got the cheese layer, the sauce layer, the whole lettuce layer.

[00:01:41] So many delicious layers.

[00:01:42] Pickles. Pickles. What's the pickle layer? I like this. I might switch. I don't know.

[00:01:46] There we go.

[00:01:46] Well, every week we can have a different analogy.

[00:01:48] Yeah.

[00:01:49] That's the beauty of having a weekly show. It's really up to us to choose.

[00:01:53] Good to see you, Jeff. We've got a lot of fun stuff to talk about today.

[00:01:56] Real quick, though, before we get started, a huge thank you to those of you who support us each and every week on Patreon.

[00:02:02] We could not do this show without you.

[00:02:04] Patreon.com slash AI Inside Show.

[00:02:07] And you could have your name called out at the top of the show like Tom Roughly.

[00:02:12] Thank you, Tom, for being a supporter on the Patreon.

[00:02:15] Patreon.com slash AI Inside Show.

[00:02:18] And if you are not a subscriber to the podcast in general yet, do it right now.

[00:02:22] Go over, you know, open up your podcatcher, subscribe.

[00:02:24] Even if you enjoy watching it live on the YouTube channel when we stream it, you can still be subscribed to the podcast for those random weeks where you miss the video live stream or you miss the video download a little bit later.

[00:02:38] So there we go.

[00:02:40] So, Jeff, this is one of those rare moments where, in the world of journalistic parlance, there is a thing called an embargo.

[00:02:54] And an embargo has lifted for a thing that I got last week when I was at the Made by Google event.

[00:03:01] I got my hands on the Pixel 9 phone.

[00:03:04] Oh, FOMO jealous, FOMO jealous.

[00:03:06] I know.

[00:03:07] It was pretty nice.

[00:03:08] I had to, of course, sign a loan agreement.

[00:03:11] So this is not my phone.

[00:03:12] I'm giving it back to Google when I'm all done with it.

[00:03:14] Probably a while, a ways down the road, though, because one of the big things about the Pixel 9 and just, you know, what Google is doing in general, and we've certainly talked about it a lot on this show, is they're really going all in, like head first on AI everything.

[00:03:31] And that's why we're even talking about it on today's show at all.

[00:03:35] And I thought it'd show off, if you're interested, Jeff, some of the stuff that I've seen and interacted with so far in my short amount of time.

[00:03:44] It's only been six days.

[00:03:45] I actually published a video just moments ago, at the time of this recording, to my YouTube channel with a long kind of preview of the Pixel 9, from a hardware perspective, yes, but also from an AI perspective.

[00:04:04] It's not a review because, like I said, I've only had this device for six days now.

[00:04:08] But there's a lot of really cool AI features on it.

[00:04:12] And some of the stuff that we've talked about in the past that I thought, you know, we could kind of show off.

[00:04:17] And I don't know.

[00:04:18] I'm curious to hear what you think about it.

[00:04:22] So let me see here.

[00:04:23] Let me see if I can rank.

[00:04:24] And probably, for audio listeners, this might be a little bit more interesting if you are watching the video.

[00:04:30] So youtube.com slash at techsploder if you want to check out the video of this stuff.

[00:04:36] But I thought.

[00:04:37] Let me ask you a bottom line first.

[00:04:39] Yeah.

[00:04:39] Real quick.

[00:04:40] Since I'm considering, I've got a six myself.

[00:04:42] I'm considering it.

[00:04:43] After a week on it.

[00:04:44] Oh my goodness, you're on six.

[00:04:45] I'm on six.

[00:04:46] After a week, just bottom line, if you were me, would you buy the nine?

[00:04:51] I feel like you are so dialed in on technology.

[00:04:57] So you're not the everyday user, let's say.

[00:05:00] If you were an everyday user and you had the six, do you have to upgrade to the nine?

[00:05:05] Maybe not.

[00:05:06] You also have an interest in artificial intelligence and everything that lies there.

[00:05:11] That's why we do this show.

[00:05:13] I would say for you, sure.

[00:05:14] And I think that they have some pretty good trade-in deals.

[00:05:18] Even if you have the six, you're probably going to get a good trade-in value for it through Google, especially if you do that now.

[00:05:24] But I mean, these devices, they have seven years of update promise.

[00:05:27] So you get this.

[00:05:30] The nine has the Tensor G4, which the rest of the Pixel 9 series has.

[00:05:36] So it's not like because this is the baseline model, you get less processor.

[00:05:40] You do get less RAM.

[00:05:41] You get 12 gigs instead of 16 gigs.

[00:05:44] But still, it would be an excellent step up from the six.

[00:05:48] The six is now...

[00:05:49] Will you write a note to my wife justifying this?

[00:05:54] Well, I have a feeling we're going to convince her.

[00:05:57] Yeah.

[00:05:57] You're going to be able to convince her a lot easier than I am.

[00:06:00] That's my gut.

[00:06:02] But yeah, I think it's a really wonderful phone and I've really enjoyed it so far.

[00:06:08] So one of the things that I think, Jeff, that you are probably going to be interested in, if I can get this up here.

[00:06:14] I've got my little demo set up here, so hopefully things look okay. It's Gemini Live, which is part of the premium Gemini Advanced subscription.

[00:06:25] If you buy any of the Pixel 9s other than this baseline model, you get a year's worth of Gemini Advanced, with Gemini Live, included.

[00:06:34] Which would cost $20 a month, right?

[00:06:36] That's right.

[00:06:37] That's right.

[00:06:37] Otherwise, it would cost $20 a month.

[00:06:39] But so this is going to be interesting.

[00:06:42] This is one of the demos that I did in my video, but I'll see if I can kind of recreate it here and do it in a way that you can hear all the audio and everything.

[00:06:50] So I'm going to kick into Gemini Live mode and I'll have a conversation.

[00:06:55] Okay.

[00:06:56] Hi, Gemini.

[00:06:57] How are you doing?

[00:07:00] Well, thanks for asking.

[00:07:02] It's always nice to hear from someone new.

[00:07:04] What's on your mind today?

[00:07:05] Well, I have a phone sitting on my table and I'd like to describe it to you and see if you can help me identify it.

[00:07:11] I have no idea what phone this is.

[00:07:12] Can you help me?

[00:07:15] I can definitely help you figure out what phone you have.

[00:07:18] Try describing it to me and I'll do my best to narrow down the possibilities.

[00:07:22] Feel free to tell me about its size, shape, color, and any features you notice.

[00:07:26] Okay.

[00:07:27] Well, I've got a lot of information for you.

[00:07:29] It is a standard kind of rectangular smartphone.

[00:07:33] You know, on the front, it looks pretty nondescript.

[00:07:35] It has a Windows logo, a back arrow button, a search button.

[00:07:39] When I flip it over, the camera bump says 41 megapixel PureView Zeiss.

[00:07:45] It's kind of a circular black camera bump.

[00:07:48] And on the middle of the back of the phone has an insignia that says Android.

[00:07:54] Oh, and by the way, this phone is bright yellow.

[00:07:57] It is a screaming yellow color.

[00:07:59] With all this information, what can you tell me about this phone?

[00:08:02] What is it?

[00:08:05] Okay.

[00:08:05] Based on your description, it sounds like you might have a Nokia Lumia 1020.

[00:08:09] It definitely matches the bright yellow color, the high resolution PureView camera with Zeiss

[00:08:14] optics, and the Windows phone interface.

[00:08:17] Is that a possibility?

[00:08:18] That's a possibility.

[00:08:19] I'm curious to know why this has Windows logo on the front and an Android logo on the back.

[00:08:26] Can you tell me why that is?

[00:08:30] Yeah, that's a great question.

[00:08:31] Historically, Nokia used to make phones with Windows Phone, but at some point they switched to Android.

[00:08:36] So it's possible this phone is a bit of a mix.

[00:08:38] Maybe it was one of the last Windows models.

[00:08:40] What year did it come from?

[00:08:42] ...to run Android.

[00:08:42] Without knowing more, it's hard to say for sure.

[00:08:45] Can you tell me what year this phone was released?

[00:08:49] Nokia Lumia 1020 was released in July 2013.

[00:08:53] July 2013.

[00:08:54] All right.

[00:08:55] Thank you so much.

[00:08:55] It was fun talking with you.

[00:08:58] Welcome.

[00:08:59] Thanks for letting me help.

[00:09:00] Have a great day.

[00:09:01] Okay.

[00:09:02] Could you have taken a picture of it, Jason?

[00:09:05] No.

[00:09:05] So that's the thing.

[00:09:06] Because there is no multimodal support right now, I can't do that.

[00:09:09] Oh, there isn't yet.

[00:09:10] Ah, okay.

[00:09:11] That's the Project Astra that's coming somewhere down the line.

[00:09:15] When I did this demo for the video that I just published, I also identified some other phones and everything, kind of put it through its paces.

[00:09:24] It had a sort of a different answer as far as why this had an Android on the back and Windows on the front, that it could be dual booted.

[00:09:33] It was a little more specific.

[00:09:34] It was like, yeah, this is a Windows phone, but the kind of unique quality of this is that you could actually set it up to be dual booted into Android.

[00:09:42] And this one didn't quite go there, but I guess that just shows you that it's always kind of different depending on the information.

[00:09:48] But I get how conversational it is.

[00:09:50] And you've obviously been using this enough already that the way you're interacting with it changes too.

[00:09:56] Right?

[00:09:57] We're used to a more search box mechanism.

[00:10:00] Uh-huh.

[00:10:01] And you were more conversational and it was more conversational.

[00:10:03] Totally.

[00:10:04] Yeah, and I think that's kind of the beauty of something like that.

[00:10:07] The latency is still long for my preference.

[00:10:11] And I think that's, you know, over time that's going to come down.

[00:10:13] We saw a very short latency when OpenAI did its GPT-4o demonstration, however many months ago.

[00:10:19] And that's going to be really important too.

[00:10:21] So being able to be conversational with this in a way to where I don't have to think about syntax and I also don't have to have those long, awkward pauses is going to go really far.

[00:10:32] Once that happens and then Project Astra, that multimodal capability comes along, I'm super curious to see kind of how this technology builds out because it's pretty promising.

[00:10:42] I'm amused you're complaining about the latency.

[00:10:44] Oh, I had to go through the entirety of all knowledge about phones and I came back to you and you had to wait two seconds?

[00:10:51] You schmuck.

[00:10:52] Jeez.

[00:10:53] What do you want?

[00:10:53] Yeah, I know.

[00:10:54] We get spoiled.

[00:10:56] It's silly.

[00:10:57] Like totally.

[00:10:58] Yeah, what is that?

[00:10:59] I'm thinking of some comedian that did the thing about, you know, we still find a way to complain about all things.

[00:11:05] Yeah, you're on the plane and the Wi-Fi goes out and you're strapped to a piece of metal going through the sky and you're complaining that you can't get in connection with your grandmother for two minutes.

[00:11:15] Jeez.

[00:11:16] Exactly.

[00:11:17] Yeah.

[00:11:17] I mean, but my complaint isn't that, oh my goodness, I can't believe it can't give me the entirety of knowledge in two seconds.

[00:11:25] It's more from the perspective of when I talk to a human being, that perceived latency that we have in our communication is much shorter.

[00:11:34] And if there is a target that I think, you know, companies like Google are putting in their crosshairs as far as technology is concerned, they're aiming to, in some ways, replicate that type of experience.

[00:11:47] And one of the ways that they're going to have to do that is that latency.

[00:11:51] And we've seen OpenAI prove that it can be done.

[00:11:54] It certainly can be improved, you know.

[00:11:56] So we'll see it.

[00:11:58] I mean, it didn't slow, you know, it didn't ruin the second version of this demo, maybe the first one.

[00:12:04] But I think it will improve the whole experience.

[00:12:08] If they're going to add in fluff words.

[00:12:11] Well, gee, let me look that up, while it's doing something, so that it placates you.

[00:12:16] Oh, yeah.

[00:12:17] No, no.

[00:12:18] What you'll hear is, you know, you ever make those calls and you do the voice, you know, you answer with your voice.

[00:12:27] Well, one of my standard shticks as a teacher is, somebody will ask me a good question and I will say, oh, that's a good question, which is how professors buy time to figure out an answer.

[00:13:03] Yeah.

[00:13:05] It's like, good question. Which brings us to Pixel Studio, which is essentially kind of like a Midjourney.

[00:13:10] It's not Midjourney, you know; it's AI generation, image generation.

[00:13:14] It's not using Midjourney or Stable Diffusion or anything like that.

[00:13:17] It's Google's own thing.

[00:13:19] But it's the same type of thing, right?

[00:13:21] Like you could create a dog wearing a Googler hat or something like that.

[00:13:31] And I'm assuming it'll come back.

[00:13:33] It's usually pretty fast.

[00:13:35] It's all Google.

[00:13:37] Oh, you know, it doesn't get the words quite right.

[00:13:40] But it's image generation on your device.

[00:13:43] It's not sending any of that to the cloud.

[00:13:45] It's all happening on device.

[00:13:46] Actually, the first time you launch this app, it downloads the model to your phone right there.

[00:13:52] If you turned off your Wi-Fi and cell right now, it would still work?

[00:13:57] Yeah, it should.

[00:13:58] But we can we can go ahead and demonstrate this.

[00:14:01] So let's go into T-Mobile.

[00:14:03] Turn off my mobile data.

[00:14:05] We'll go back over here and I'll hit the try again button.

[00:14:10] Oh, no, you're right.

[00:14:11] Not connected to the Internet.

[00:14:13] I had someone at the event tell me that this was all done on-device, not in the cloud.

[00:14:23] But this puts some doubt on that.

[00:14:25] Oh, that's so interesting.

[00:14:26] OK, so then I want to know if that's the case.

[00:14:30] Thank you for having me do that.

[00:14:31] How local?

[00:14:32] I want to know what's going on.

[00:14:33] That's the yeah.

[00:14:35] What?

[00:14:36] Because it downloads the model when you run the app for the first time.

[00:14:41] It actually says it's downloading the generation model to your phone.

[00:14:46] I'm so happy you had me do that because that is really interesting.

[00:14:51] That totally goes counter to what I was told.

[00:14:53] Oh, so maybe I misunderstood.

[00:14:57] Maybe there's some sort of subtlety.

[00:14:59] Maybe the image generation is happening, but there's some sort of query happening for part of it.

[00:15:04] I don't know.

[00:15:05] I'm really confused.

[00:15:06] And now I feel like I need to reach out to Google and get some clarification.

[00:15:09] And I wonder, too, when it comes to my mind, because I just don't understand.

[00:15:12] We're so used to a database front end and database world where that's the interaction.

[00:15:17] And this is not the case.

[00:15:18] Right.

[00:15:19] But what is it going?

[00:15:21] I mean, what does it need to return something to you?

[00:15:24] Does it need to say, I don't know what a dog looks like.

[00:15:27] Can you send me one so I can add that in here?

[00:15:29] I know it's a dog.

[00:15:29] It's all I'm going to ask you for is a dog.

[00:15:31] I don't know.

[00:15:32] I mean, yeah.

[00:15:32] No idea how that operates.

[00:15:34] Yeah.

[00:15:35] Yeah.

[00:15:35] Neither do I.

[00:15:36] I'm so perplexed.

[00:15:37] I truly and completely expected for that to work based on what I was told at the event.

[00:15:43] We learned something.

[00:15:45] We learned something new together.
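Here's a minimal, purely illustrative sketch in Python of the kind of hybrid on-device-plus-cloud logic that could explain what the demo showed: a generation model gets downloaded locally on first launch, yet the app still refuses to generate without a connection. The model path, function names, and fallback order are all hypothetical assumptions, not how Pixel Studio is actually built.

```python
# Illustrative sketch only: NOT Google's Pixel Studio implementation.
# It shows one hybrid design that would explain a local model download
# plus a "not connected to the internet" error when offline.

import os
import socket

MODEL_PATH = "models/image_gen.bin"  # hypothetical location of the downloaded model


def model_is_downloaded() -> bool:
    """Pretend check for the on-device generation model."""
    return os.path.exists(MODEL_PATH)


def network_available(host: str = "8.8.8.8", port: int = 53, timeout: float = 2.0) -> bool:
    """Cheap connectivity probe: can we open a TCP socket to a public DNS server?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def generate_image(prompt: str) -> str:
    """Decide where generation could happen under this hypothetical design."""
    if model_is_downloaded() and network_available():
        # Plausible hybrid: local generation for the pixels, a cloud call for
        # prompt rewriting or safety checks, which would explain the offline failure.
        return f"hybrid generation for: {prompt!r}"
    if model_is_downloaded():
        return f"fully on-device generation for: {prompt!r}"
    if network_available():
        return f"cloud-only generation for: {prompt!r}"
    raise RuntimeError("Not connected to the internet and no local model found.")


if __name__ == "__main__":
    print(generate_image("a dog wearing a Googler hat"))
```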

[00:15:47] But there's a bunch of other features, something called Reimagine, which is a Google Photos kind of part of the editing suite in Google Photos.

[00:15:55] So you could take one of your photos that's, like, crooked, you know, maybe it's lopsided. That was the demo that they showed.

[00:16:00] The foot, like the shoe, was cut off because the photo was cropped incorrectly.

[00:16:05] And you can say, you know, auto frame this and it will automatically correct it and generate the missing part of the shoe.

[00:16:13] You could turn the ground into an ocean or into daffodils or whatever, things like that.

[00:16:18] Have you used the, you know, again, reality becoming more about imagination than anything else.

[00:16:23] Sorry.

[00:16:23] Have you used pixel screenshots?

[00:16:26] Oh, yes, I have.

[00:16:27] Actually, pixel screenshots.

[00:16:28] That's the one that everybody is eager for.

[00:16:30] Yeah.

[00:16:32] Pixel screenshots is something that I'm really, really excited about, actually.

[00:16:36] So I'm happy you brought it up.

[00:16:38] While I was at the event and I was talking to the guy who, you know, the product manager, and I don't have his name right in front of me.

[00:16:45] Um, I brought up to him that it reminded me of like an on-device personalized Pinterest.

[00:16:51] And he was like, oh my goodness, that's a great comparison.

[00:16:54] That's exactly what it is.

[00:16:55] Because what I've noticed with Pixel Screenshots and using it is that it really does kind of change how you use screenshots on your device.

[00:17:03] So instead of screenshots being this disposable, like one time and move on with my life sort of thing, now I start using screenshots as a way to remember things long term instead of just short term.

[00:17:15] Because you can go ahead and throw things into collections.

[00:17:20] So, you know, this microphone that I was looking at, you know, might go into music production.

[00:17:25] And then I can remember it later because I ran across an ad on Facebook for it.

[00:17:30] And, you know, if you're getting to these things from different places, you can see the little Facebook logo down at the bottom of that.

[00:17:39] It remembers the linkages.

[00:17:41] So given that source still exists, it can take you back to the source from where you got that information.

[00:17:47] It just really kind of changes my or has changed my relationship with screenshots on my device and allows me to organize them.

[00:17:56] And not only that, you know, like this is a gardening or I can't remember what this was.

[00:18:00] This was a random screenshot from my history that I reprocessed.

[00:18:04] Oh, okay.

[00:18:05] Couldn't process that screenshot.

[00:18:06] Well, you're also offline right now, right?

[00:18:08] Did you go back online?

[00:18:11] Oh, well, yeah.

[00:18:13] But some of the stuff I already processed is the thing.

[00:18:16] See, like this, this was a screenshot of a recipe.

[00:18:20] But meanwhile, the screenshot doesn't have any of the recipe.

[00:18:22] It has that QR code.

[00:18:23] This is actually a really interesting example.

[00:18:27] But yet it was able to come up with the name of the recipe, Tacos Gobernador, which is a great recipe.

[00:18:34] I need to make that again.

[00:18:35] The author, the guy who created the recipe, you know, it describes the image.

[00:18:40] A person is chopping green peppers on a wooden board.

[00:18:42] That might be great for accessibility use cases.

[00:18:46] And then it has the QR code, which it describes.

[00:18:49] But it contextualizes it more than just a simple image.
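Here's a minimal, purely illustrative sketch of the pattern Jason is describing: OCR the screenshot, remember which app it came from, file it into collections, and search it later. It assumes the pytesseract and Pillow packages (plus a local Tesseract install), and the file name and record structure are made up; it is not how the on-device Pixel Screenshots feature actually works.

```python
# Illustrative sketch only: a tiny, source-aware screenshot index.

from dataclasses import dataclass
from typing import List

from PIL import Image
import pytesseract


@dataclass
class ScreenshotRecord:
    path: str
    source_app: str          # e.g. "Facebook", so the entry can link back to where it came from
    collections: List[str]   # e.g. ["music production"]
    extracted_text: str = ""


def index_screenshot(path: str, source_app: str, collections: List[str]) -> ScreenshotRecord:
    """OCR the screenshot and keep the text alongside its source metadata."""
    text = pytesseract.image_to_string(Image.open(path))
    return ScreenshotRecord(path=path, source_app=source_app,
                            collections=collections, extracted_text=text)


def search(records: List[ScreenshotRecord], query: str) -> List[ScreenshotRecord]:
    """Naive keyword search over the extracted text and the collection names."""
    q = query.lower()
    return [r for r in records
            if q in r.extracted_text.lower() or any(q in c.lower() for c in r.collections)]


if __name__ == "__main__":
    # "mic_ad.png" is a hypothetical screenshot of a Facebook ad for a microphone.
    records = [index_screenshot("mic_ad.png", source_app="Facebook",
                                collections=["music production"])]
    for hit in search(records, "microphone"):
        print(hit.path, "->", hit.source_app, hit.collections)
```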

[00:18:54] The one thing I'm curious about, Jason, is that this is for screenshots.

[00:18:57] That is to say a freeze drying of your screen.

[00:19:01] I would almost want to be able to do that seamlessly with photos, too.

[00:19:06] That I'm seeing through the, I'm seeing, in essence, through the phone to something.

[00:19:12] I'd want to interact with that the same way I would interact with something that's on the screen.

[00:19:18] But that's not the way it operates.

[00:19:19] It only does screen saves.

[00:19:22] Well, the Screenshots app only does that.

[00:19:25] I do wonder...

[00:19:26] I guess I could have my camera on and do a...

[00:19:29] That would be stupid.

[00:19:29] Do a screenshot of my camera.

[00:19:30] You just do an image.

[00:19:33] Yeah, but if what you're looking for is kind of the descriptor of what it's looking at.

[00:19:39] I'm looking around my space to try and find something that I can throw in front of it to read.

[00:19:46] Let's see here.

[00:19:48] Pardon my...

[00:19:49] God, I don't even know.

[00:19:50] Okay, so here's a live streaming tool.

[00:19:55] So if I went to my camera...

[00:19:56] Oh, and actually, I do need to put on...

[00:19:58] I need to put my internet back on, right?

[00:20:01] I need to give it data again.

[00:20:03] So we'll go ahead and do that so I don't confuse it.

[00:20:06] Okay, so now if I go to my camera,

[00:20:10] and I position the image on my camera,

[00:20:14] and I take a screenshot of it,

[00:20:18] boop, you can see that it flashed.

[00:20:21] That usually means...

[00:20:22] That little thing that came up,

[00:20:24] that usually means that it's in the Screenshots app,

[00:20:27] and it's processing.

[00:20:28] Let's see here.

[00:20:30] So yeah, so now it's working.

[00:20:31] Okay.

[00:20:31] So I use the camera's pass-through to get it into the Screenshot app.

[00:20:35] It's now working.

[00:20:36] It doesn't work 100% of the time.

[00:20:38] Sometimes it comes back,

[00:20:39] and it's like, I can't figure it out.

[00:20:41] And there's no way when it does that,

[00:20:43] that I've found to re-trigger it.

[00:20:45] It's just like, yeah, I don't know what to do with this.

[00:20:46] But in this case, it did.

[00:20:48] It says the product is the G-Stream M2 dual HDMI switcher.

[00:20:52] Box design, black with white text and a red logo.

[00:20:56] Does it have a red logo?

[00:20:57] I don't...

[00:20:58] Oh, I see.

[00:20:59] Nah, it's more like purple.

[00:21:01] Buttons on the switch.

[00:21:02] Oh, so it identifies the buttons on the switch,

[00:21:05] or at least it tries to.

[00:21:06] It doesn't get it perfect.

[00:21:07] Image on the box.

[00:21:09] What does that mean?

[00:21:10] A person with a yellow hat and a black jacket is sitting on a couch.

[00:21:16] Wait a minute.

[00:21:18] Oh, wait.

[00:21:19] Hold on.

[00:21:20] Could that be...

[00:21:21] No?

[00:21:22] Oh, that's so weird.

[00:21:23] I wonder how it came up with that.

[00:21:25] Wow, that was off the mark.

[00:21:29] Available modes.

[00:21:30] Oh, because it's also...

[00:21:32] Okay, so this is describing...

[00:21:34] I don't know what the whole couch thing was,

[00:21:36] but it's also describing the camera elements in the screenshot.

[00:21:39] So, yes, it's that, but it's also looking at this stuff,

[00:21:43] which is the control for the camera itself.

[00:21:46] Photo, portrait, night sight.

[00:21:48] I mean, it's describing...

[00:21:49] Attempting to describe everything.

[00:21:51] So it says available modes.

[00:21:52] Portrait, photo, night sight.

[00:21:56] So, I mean, it's everything on the screen.

[00:21:58] You know, it doesn't know what you're trying to identify, I guess.

[00:22:02] But anyway, so yeah, you can do that.

[00:22:04] It's probably not perfect.

[00:22:07] Yeah.

[00:22:07] Yeah.

[00:22:08] Anyways, there's other features.

[00:22:09] We don't need to go into all of them right now,

[00:22:12] but, you know, I did a full kind of walkthrough

[00:22:15] on the TechSploder YouTube channel.

[00:22:17] So, youtube.com slash at TechSploder.

[00:22:19] You get a little insight into the Pixel 9 itself.

[00:22:24] Jeff, I think if you watch it, you'll probably be like,

[00:22:26] okay, I can be down for that.

[00:22:28] Do you like a large phone or a small phone?

[00:22:31] Yeah, so this is a little bit smaller.

[00:22:33] You probably then want to go for the Pro or the Pro XL.

[00:22:36] Well, did they give you a loaner of the watch too?

[00:22:39] No.

[00:22:40] No.

[00:22:40] This was it.

[00:22:42] This was it.

[00:22:42] And I'm not entirely certain, but if I want to review the other things,

[00:22:47] I might have to buy them, which is interesting,

[00:22:50] because I don't know that I have the money to buy them.

[00:22:52] But I want to play around with them,

[00:22:54] and I'll keep asking Google for a review unit.

[00:22:56] I mean, at this stage, it's the tool that I have

[00:22:59] is to get a review unit from them

[00:23:01] and borrow it for a bit to give it a spin.

[00:23:03] Yep.

[00:23:04] Good.

[00:23:04] Thank you, Jason.

[00:23:05] So we'll see.

[00:23:07] But, yeah.

[00:23:08] So everybody, check that out,

[00:23:10] and thank you for that.

[00:23:12] And in a moment, we're going to take a quick break,

[00:23:14] and in a moment, we've got a bunch of newsy stuff

[00:23:17] that has nothing to do with Gemini or anything like that.

[00:23:21] That's coming up.

[00:23:25] All right.

[00:23:25] From products to politics.

[00:23:28] Isn't that a fun transition?

[00:23:30] California's SB 1047 AI safety bill.

[00:23:34] We discussed this on the show last week.

[00:23:37] Got some pretty big adjustments,

[00:23:39] I think, shortly after the show.

[00:23:41] This happened on the 15th of August,

[00:23:42] so almost a full week ago.

[00:23:44] So right after we talked about it.

[00:23:47] Basically, they're looking at the concerns

[00:23:48] of AI companies and open source developers,

[00:23:51] and they've made some key changes,

[00:23:53] things like removing the attorney general's ability

[00:23:55] to sue AI companies preemptively,

[00:24:00] they're also relaxing the certification requirements,

[00:24:03] they're adjusting the safety standards

[00:24:05] from, quote, reasonable assurance to reasonable care.

[00:24:10] I don't know, I don't understand too deeply what the difference between those two things is.

[00:24:15] Maybe you do, but, yeah.

[00:24:18] Well, so Kevin Werbach, who's a really good professor at the University of Pennsylvania's business school, whose name I'm suddenly forgetting... Wharton.

[00:24:29] And he has a substack called

[00:24:34] The Road to Accountable AI.

[00:24:36] Okay.

[00:24:37] So Kevin wrote about this,

[00:24:39] how I learned to stop worrying

[00:24:41] and love the world's most controversial AI bill.

[00:24:43] And I saw it because David Weinberger, who's somebody I admire greatly, came in and said, oh, can you convince me, Kevin? I'm still not convinced.

[00:24:50] But there are changes in the bill

[00:24:53] that on the one hand seem better,

[00:24:54] on the other hand only make it more vague.

[00:24:57] So for example,

[00:24:58] you used to have to sign a statement saying that you're certifying that your model is safe, under penalty of criminal perjury.

[00:25:08] That's no longer the case.

[00:25:09] Good.

[00:25:11] Well, so now you've got to exhibit

[00:25:12] a duty of care.

[00:25:14] Well, that sounds easy,

[00:25:15] but the problem with that is it's vague.

[00:25:17] Same problem that exists in the UK's Online Safety Act, formerly known as the Online Harms Bill: a duty of care.

[00:25:24] It means, well, you should have tried harder.

[00:25:25] You should have seen this coming.

[00:25:28] Where's the line there?

[00:25:29] Yeah, it's really hard to enforce a law like that,

[00:25:32] which to my mind will probably make it unconstitutional,

[00:25:35] but I'm not a lawyer,

[00:25:37] because it's kind of a jumble,

[00:25:41] trying to figure out

[00:25:42] how do we hold somebody responsible for this stuff?

[00:25:45] And again, what we've talked about in the show before

[00:25:48] is this still does it at the model layer,

[00:25:50] rather than the application layer

[00:25:52] or the user layer,

[00:25:53] in the lasagna,

[00:25:54] in the Big Mac,

[00:25:55] that is AI responsibility.

[00:25:59] And I think we've got to have

[00:26:01] a more wholesome discussion.

[00:26:03] I don't mean wholesome as in pure and lovely.

[00:26:05] I mean in the whole of it.

[00:26:07] Realistic.

[00:26:10] Of where responsibility might lie,

[00:26:14] in given kind of a wholesome lie,

[00:26:17] lay,

[00:26:17] I'm getting all my language screwed up,

[00:26:20] you know,

[00:26:21] in AI.

[00:26:22] And I'm not sure

[00:26:23] that we've thought this through enough

[00:26:26] to be able to figure that out yet.

[00:26:28] So,

[00:26:29] a lot of people were saying

[00:26:30] the bill's a lot better.

[00:26:31] I think they had to change it

[00:26:33] so that they could get it

[00:26:36] through the governor,

[00:26:37] who was dubious.

[00:26:40] But there's still a lot of opposition to it,

[00:26:42] and I don't know what's going to happen.

[00:26:43] California legislature ends this month,

[00:26:45] so one way or the other,

[00:26:47] it's either going to get passed or not.

[00:26:49] Mm-hmm.

[00:26:50] Yeah.

[00:26:50] Nancy Pelosi also issued a press release

[00:26:53] coming out against the bill,

[00:26:55] calling it well-intentioned

[00:26:59] but ill-informed,

[00:27:01] more harmful than helpful,

[00:27:04] stifling innovation.

[00:27:06] Also mentioned pushing for a moonshot mentality

[00:27:10] to act as a model for the country

[00:27:13] to enable small entrepreneurs and academia,

[00:27:17] not big tech to dominate.

[00:27:19] Moonshot mentality.

[00:27:20] What does that mean?

[00:27:20] I'm not sure I know what that means.

[00:27:22] I think, you know,

[00:27:23] in a California sense,

[00:27:24] probably the people who are talking to her

[00:27:26] are leave us alone,

[00:27:28] effective accelerationism,

[00:27:29] let us just do our job,

[00:27:31] and you'll figure it out later.

[00:27:32] And each worldview has its issues,

[00:27:36] obviously.

[00:27:37] Regulate the hell out of it,

[00:27:38] well, you really can't.

[00:27:39] Let it go,

[00:27:40] no, you really shouldn't.

[00:27:40] There is a middle ground to be found.

[00:27:43] It's not a bad discussion to have,

[00:27:44] but I think to rush into a law,

[00:27:48] to think,

[00:27:49] because there's a certain hubris

[00:27:51] to think,

[00:27:52] well, I understand enough about AI now.

[00:27:53] I can make a law that regulates it.

[00:27:55] I don't think anybody does yet.

[00:27:58] And that will be seen,

[00:28:00] some people can say to me

[00:28:01] for what I just said,

[00:28:02] well, how irresponsible of you, Jarvis,

[00:28:04] to just let it go then?

[00:28:06] But no,

[00:28:08] there are still lots of laws on the books

[00:28:10] with lots of liability potential

[00:28:11] for all kinds of bad things

[00:28:13] that can happen and stuff.

[00:28:14] I don't know that AI is yet so special

[00:28:18] that it needs its own laws.

[00:28:20] The other interesting thing was,

[00:28:21] so what was in the story, Jason?

[00:28:23] I lost it.

[00:28:26] One of the funny things to me is that it got rid of, oh, sorry here, folks, it got rid of, like, a commission and instead has a committee.

[00:28:35] I can't remember what it was, right?

[00:28:37] It no longer creates

[00:28:42] the Frontier Model Division,

[00:28:44] a new government agency

[00:28:44] formerly included in the bill.

[00:28:46] However, the bill still creates

[00:28:47] the Board of Frontier Models.

[00:28:51] What?

[00:28:52] What's the difference?

[00:28:55] And it's all inside

[00:28:56] the existing government operations agency.

[00:28:58] Interesting to me in this story, too,

[00:29:00] is that the way TechCrunch positions it,

[00:29:04] the legislators took advice from Anthropic.

[00:29:08] So I don't know what made them so influential,

[00:29:11] except maybe they just tried and lobbied more

[00:29:13] and had more suggestions.

[00:29:14] But I found that really interesting.

[00:29:17] Very.

[00:29:18] Very.

[00:29:19] Okay.

[00:29:20] Well, that's fascinating.

[00:29:23] It's fascinating.

[00:29:23] It's fascinating.

[00:29:23] It also has to do with, you know,

[00:29:26] kind of politics,

[00:29:27] which don't come up on the show

[00:29:29] a huge amount.

[00:29:31] But boy, you know,

[00:29:32] for those of you who are allergic

[00:29:33] to political and technological stories overlapping,

[00:29:37] I'm sorry,

[00:29:38] but you might want to mute the next story, then.

[00:29:41] There's election deepfakes

[00:29:44] that we've talked about many times.

[00:29:45] That thing that we've all been told

[00:29:47] is a big threat to our democracy,

[00:29:49] that bad actors are going to use this

[00:29:51] against the will of the people

[00:29:53] to trick them into believing things

[00:29:55] about our, you know,

[00:29:56] our governing bodies,

[00:29:57] our politicians that aren't real,

[00:29:59] that capitalizes on the reactive nature

[00:30:01] of social media,

[00:30:02] all this stuff that we've heard many times.

[00:30:04] But in this case,

[00:30:05] it's one of the candidates

[00:30:07] for the president of the United States

[00:30:10] spreading AI-generated imagery,

[00:30:13] showing Taylor Swift support materials

[00:30:15] for Donald Trump.

[00:30:16] So we've got kind of like an eye-roll bingo card

[00:30:20] right here for certain listeners of this show.

[00:30:23] We've got Taylor Swift,

[00:30:24] we've got Donald Trump,

[00:30:25] we've got deepfakes, politics.

[00:30:28] You could practically spell out the word bingo

[00:30:30] on your card.

[00:30:31] But I think it's, you know,

[00:30:33] it's important to talk about the idea that,

[00:30:36] you know,

[00:30:37] for all of the negative reaction

[00:30:40] or kind of warnings

[00:30:43] around how AI can be used

[00:30:46] to generate things

[00:30:47] that might be impactful

[00:30:49] in a negative sense on the election,

[00:30:51] that we have some people very high up engaging in this.

[00:30:56] And yes, it's Taylor Swift,

[00:30:58] so it's easy to roll your eyes

[00:30:59] and be like,

[00:30:59] oh God,

[00:31:00] why are you guys even talking about this?

[00:31:02] But it is,

[00:31:03] I mean,

[00:31:04] that does have sway

[00:31:06] with a large amount of people.

[00:31:08] And everything in the images

[00:31:09] from pictures of Taylor Swift,

[00:31:12] you know,

[00:31:12] superimposed with the thing

[00:31:13] that says Taylor wants you

[00:31:14] to vote for Donald Trump

[00:31:15] to pictures of women

[00:31:17] in large crowds

[00:31:17] wearing Swifties for Trump shirts.

[00:31:20] I mean,

[00:31:20] it was all generated.

[00:31:21] And if it was generated,

[00:31:23] then it didn't exist.

[00:31:24] Yet he felt the need to share it.

[00:31:26] I don't know.

[00:31:26] It's just stupid.

[00:31:28] Yeah,

[00:31:28] they also have a picture that Trump forwarded of the back of Kamala Harris addressing what would seem to be the convention.

[00:31:38] And there's a huge

[00:31:39] red communist

[00:31:41] hammer and sickle

[00:31:42] and red all around.

[00:31:44] It's obviously fake.

[00:31:47] Yes.

[00:31:47] Well, that one's,

[00:31:48] yeah, for sure.

[00:31:48] It's clearly made up.

[00:31:50] It's cartoony.

[00:31:50] But, you know,

[00:31:51] an artist could have

[00:31:52] just as well sat down

[00:31:52] with a pen and ink

[00:31:53] and made the same thing.

[00:31:55] But because it was made by AI,

[00:31:58] people see more threat in it.

[00:31:59] There was another story this week

[00:32:00] which I didn't put up

[00:32:02] in the rundown,

[00:32:02] but I think it's related

[00:32:04] just to this extent.

[00:32:05] Iran was found out

[00:32:07] to be using OpenAI

[00:32:08] to generate fake stuff

[00:32:10] for a few websites.

[00:32:12] And OpenAI found it

[00:32:14] and stopped it.

[00:32:15] Okay, so at the one level

[00:32:16] you think,

[00:32:16] uh-oh,

[00:32:17] foreign interference

[00:32:18] in the election,

[00:32:19] Iran,

[00:32:20] OpenAI,

[00:32:21] oh my God,

[00:32:21] what's going on?

[00:32:22] On the other hand,

[00:32:23] we all know how easy it is

[00:32:24] to go into OpenAI

[00:32:25] and say,

[00:32:26] write me a blank,

[00:32:27] draw me a blank.

[00:32:29] And just because AI

[00:32:31] did it for them

[00:32:32] doesn't make it

[00:32:33] substantively different.

[00:32:35] Except perhaps

[00:32:36] for scale and speed

[00:32:38] from what you could

[00:32:39] already do

[00:32:40] and what you should

[00:32:41] already be

[00:32:42] suspicious of

[00:32:44] anywhere

[00:32:45] in the world

[00:32:46] of social media.

[00:32:49] Again,

[00:32:49] not because the social

[00:32:50] technology is wrong,

[00:32:51] but because humanity

[00:32:53] has some liars in it.

[00:32:55] So, um, I think because this, um, Big Mac gets the pickles of AI in it.

[00:33:04] Oh, I'm sorry.

[00:33:04] I was trying there.

[00:33:05] Um,

[00:33:06] uh,

[00:33:07] I like it.

[00:33:08] I like it.

[00:33:08] I like that we continue

[00:33:09] on with this.

[00:33:10] Because you got extra

[00:33:11] onions on your Big Mac

[00:33:13] because it has AI

[00:33:14] onions in it.

[00:33:15] Uh,

[00:33:15] you think,

[00:33:16] oh my God,

[00:33:16] it's awful.

[00:33:17] It's poison.

[00:33:18] It's still a hamburger.

[00:33:20] And, um, so, yes, we've got to be cautious.

[00:33:24] Yes, people have to be ever more suspicious of what they see anywhere.

[00:33:28] Uh,

[00:33:29] but the source matters

[00:33:30] more than anything else.

[00:33:32] Yeah.

[00:33:32] And for Donald Trump

[00:33:34] to act as if,

[00:33:35] uh,

[00:33:35] he's about to get an

[00:33:37] endorsement from Swifties,

[00:33:39] it doesn't matter

[00:33:40] what the image is.

[00:33:41] You should use your logic

[00:33:43] there and say,

[00:33:44] hmm,

[00:33:45] what's wrong with this

[00:33:46] picture metaphorically

[00:33:47] speaking?

[00:33:47] Yeah.

[00:33:48] And so,

[00:33:49] yeah,

[00:33:49] I think it's,

[00:33:50] I won't go so far as

[00:33:51] calling this moral panic,

[00:33:52] but I do think that

[00:33:54] we've got to draw back

[00:33:56] on this discussion

[00:33:56] because once again,

[00:33:57] it's going to lead

[00:33:58] to legislation.

[00:33:59] It's going to lead

[00:34:00] to somebody saying,

[00:34:01] you can't do this.

[00:34:01] Now,

[00:34:01] on the other hand,

[00:34:02] another story that's

[00:34:03] not at all the rundown

[00:34:04] is that Elon Musk

[00:34:05] and Twitter are in

[00:34:07] some,

[00:34:08] um,

[00:34:09] deep trouble

[00:34:10] because of the riots

[00:34:12] that have been going

[00:34:12] on in the UK,

[00:34:15] uh,

[00:34:15] because there people

[00:34:18] in that case,

[00:34:19] it was both Elon Musk's

[00:34:21] AI and Elon Musk's

[00:34:23] social network involved

[00:34:24] and people are already,

[00:34:26] you know,

[00:34:27] getting hauled into court

[00:34:28] because of what they're

[00:34:29] sharing.

[00:34:30] Um,

[00:34:31] and there's freedom

[00:34:32] of expression issues there.

[00:34:33] There's all kinds

[00:34:34] of things there.

[00:34:35] So it's a really

[00:34:36] complicated world.

[00:34:37] And I think if we jump

[00:34:38] to the conclusion that,

[00:34:39] oh my God,

[00:34:39] AI is doing stuff,

[00:34:40] it's got to be worse.

[00:34:42] It's still humanity.

[00:34:43] It's still all the problems

[00:34:44] we've already had.

[00:34:46] Yeah.

[00:34:46] Yeah.

[00:34:46] That's,

[00:34:47] that's very true.

[00:34:47] Do you think, uh, Taylor Swift has... you know, like, is there going to be a Swift suit coming from Taylor on something like this?

[00:34:56] Cause we know,

[00:34:57] I mean,

[00:34:57] cause if you know

[00:34:58] anything about Taylor Swift,

[00:34:59] you know,

[00:35:00] she's pretty opposed

[00:35:01] to Donald Trump.

[00:35:02] She is definitely

[00:35:03] not a supporter.

[00:35:05] Um,

[00:35:06] you know,

[00:35:06] she's also happens

[00:35:07] to be one of the most,

[00:35:08] you know,

[00:35:08] visible famous people

[00:35:10] on planet earth

[00:35:11] right now.

[00:35:12] Um,

[00:35:13] does she have,

[00:35:14] would she have a

[00:35:15] strong case?

[00:35:16] Do you think if she

[00:35:17] was to say,

[00:35:18] Hey,

[00:35:18] don't do this.

[00:35:19] I'm not a lawyer

[00:35:20] and don't play one

[00:35:21] on the socials.

[00:35:23] Neither am I

[00:35:24] an epidemiologist.

[00:35:26] Neither am I

[00:35:26] a war expert,

[00:35:27] which we all tend

[00:35:28] to become instantly

[00:35:29] on social media.

[00:35:30] Um, there could... a use of likeness...

[00:35:35] Yeah.

[00:35:36] ...could be an issue here.

[00:35:37] Uh,

[00:35:38] certainly if you used

[00:35:39] her song without authorization,

[00:35:40] you see a lot of people

[00:35:41] are going after him

[00:35:41] for using songs

[00:35:42] they wish he didn't use.

[00:35:44] A very amusing one is Celine Dion.

[00:35:46] And the song he picked

[00:35:47] was the Titanic song.

[00:35:49] Um,

[00:35:50] uh,

[00:35:51] I,

[00:35:51] I missed that one.

[00:35:53] and she went on social media

[00:35:55] saying,

[00:35:55] no,

[00:35:56] I don't authorize you

[00:35:56] to do this.

[00:35:57] And that song,

[00:35:58] right?

[00:35:59] So,

[00:35:59] um,

[00:36:00] there's copyright issues.

[00:36:02] There's likeness issues.

[00:36:04] Um,

[00:36:04] uh,

[00:36:06] but I don't think

[00:36:07] it'd be worth

[00:36:07] a court case.

[00:36:09] No.

[00:36:09] Yeah,

[00:36:10] right,

[00:36:10] right.

[00:36:10] Is the juice worth the squeeze, as it were?

[00:36:14] Um,

[00:36:15] real quick

[00:36:16] before we move on,

[00:36:17] I do want to

[00:36:18] highlight the Ozone Nightmare.

[00:36:20] Thank you so much

[00:36:21] for your super chat.

[00:36:22] Thank you,

[00:36:22] Ozone.

[00:36:22] Ozone Nightmare,

[00:36:23] as always,

[00:36:24] says Jeff,

[00:36:25] saw your Blue Sky post

[00:36:26] about how AIs

[00:36:27] do not learn

[00:36:28] on their own.

[00:36:29] An excellent survey underscoring, sorry, my eyesight, the importance of directing criticisms at those who control the tools, not the tools themselves.

[00:36:40] This actually ties in perfectly with exactly what we're talking about.

[00:36:44] Right, right.

[00:36:44] Who's pulling the trigger,

[00:36:45] not necessarily what,

[00:36:46] what tool.

[00:36:47] And that story

[00:36:48] is in the rundown.

[00:36:49] Uh,

[00:36:50] should we,

[00:36:50] should we switch the order

[00:36:51] of the next two?

[00:36:51] Yeah,

[00:36:52] let's do it.

[00:36:52] Yeah, yeah.

[00:36:53] Set that story up.

[00:36:54] Um,

[00:36:55] go for it.

[00:36:55] It was really interesting.

[00:36:56] It was a story,

[00:36:57] uh,

[00:36:57] out of,

[00:36:57] uh,

[00:36:59] Bath University,

[00:37:00] I think it is,

[00:37:01] uh,

[00:37:01] of three

[00:37:03] or four researchers

[00:37:05] who wanted to look

[00:37:06] at the question

[00:37:06] of existential risk.

[00:37:07] And they came at it

[00:37:08] a different way

[00:37:09] to me,

[00:37:10] um,

[00:37:11] in which they said

[00:37:12] that,

[00:37:13] um,

[00:37:14] they,

[00:37:14] they say straight out

[00:37:15] there's no existential

[00:37:16] threat to humanity.

[00:37:17] The reason is

[00:37:18] because AI

[00:37:19] is not going to go off

[00:37:20] on its own.

[00:37:22] It's not going to go

[00:37:23] make up something

[00:37:24] and come up with

[00:37:25] its own goal

[00:37:25] to do things.

[00:37:26] And part of the reason

[00:37:27] that this leads

[00:37:28] to the fear

[00:37:28] besides the general TESCREAL, macho AI boys, I-can-destroy-the-world chest-thumping, is

[00:37:36] that people think

[00:37:37] that there were

[00:37:38] skills that they

[00:37:39] found in AI

[00:37:40] that are emergent,

[00:37:41] that this wasn't

[00:37:42] programmed in.

[00:37:43] How the heck

[00:37:44] did it come up

[00:37:44] with this?

[00:37:45] It came up

[00:37:45] with something

[00:37:46] completely different.

[00:37:46] If it could do

[00:37:47] that,

[00:37:48] all powerful,

[00:37:49] what else could

[00:37:49] it do?

[00:37:50] Right?

[00:37:50] Right.

[00:37:51] We get to the paperclip.

[00:37:52] It has a mind

[00:37:52] of its own.

[00:37:53] It will,

[00:37:54] it will,

[00:37:54] it will become

[00:37:54] single-minded

[00:37:56] and decide the

[00:37:57] paperclips are the

[00:37:58] only priority on

[00:37:58] earth and everything

[00:37:59] else should be

[00:38:00] destroyed so I can

[00:38:01] make more paperclips.

[00:38:02] Right?

[00:38:02] And what they said

[00:38:03] in the research here

[00:38:04] is,

[00:38:05] uh,

[00:38:06] no,

[00:38:07] that these systems still learn, and we know how they learn, and they stop at that, and they don't go off on their own.

[00:38:15] And if you're

[00:38:15] trying to think

[00:38:16] of something emergent,

[00:38:17] no,

[00:38:17] there's actually,

[00:38:17] you can,

[00:38:18] you can kind of

[00:38:18] track that back.

[00:38:20] So I was glad

[00:38:21] to see this

[00:38:22] and I'm sure

[00:38:22] that there'll be

[00:38:23] plenty of argument

[00:38:23] that says,

[00:38:24] well,

[00:38:24] you don't know

[00:38:24] what could come

[00:38:25] next.

[00:38:26] Um,

[00:38:27] and,

[00:38:28] um,

[00:38:30] should I play

[00:38:31] the clip?

[00:38:31] Sure.

[00:38:31] I think I,

[00:38:32] I think I,

[00:38:33] you,

[00:38:33] you added a,

[00:38:34] a YouTube link.

[00:38:36] It's,

[00:38:36] it's a pretty

[00:38:36] short clip

[00:38:37] and I think

[00:38:37] I've skipped

[00:38:38] skipped forward

[00:38:39] far enough in it

[00:38:40] for us to,

[00:38:41] to be able to

[00:38:41] hear what we need

[00:38:42] to hear.

[00:38:42] Let me unmute

[00:38:43] it and see

[00:38:43] if you can

[00:38:44] hear this.

[00:38:45] We've come

[00:38:46] to three

[00:38:46] conclusions.

[00:38:48] One,

[00:38:48] there's no

[00:38:49] existential threat

[00:38:50] that's posed

[00:38:51] by these

[00:38:51] language models.

[00:38:52] Two,

[00:38:53] the best way

[00:38:54] to use these

[00:38:55] models is to

[00:38:56] make the prompts

[00:38:57] as explicit

[00:38:58] and clear

[00:38:58] as possible.

[00:39:00] And three,

[00:39:01] further scaling

[00:39:02] up these models

[00:39:03] is unlikely

[00:39:04] to change

[00:39:05] either of these

[00:39:06] two things.

[00:39:07] In other words,

[00:39:08] the bigger

[00:39:08] and more powerful

[00:39:09] they get,

[00:39:10] the better they

[00:39:11] get at following

[00:39:12] instructions,

[00:39:13] but they do not

[00:39:14] move towards

[00:39:15] being able to

[00:39:16] pose a threat

[00:39:17] to us.

[00:39:18] And suddenly

[00:39:18] having a mind

[00:39:19] of their own

[00:39:20] and they're

[00:39:21] taking over.

[00:39:22] Right.

[00:39:23] So I found

[00:39:23] this good

[00:39:25] and hopeful

[00:39:25] because it

[00:39:31] makes us

[00:39:32] question,

[00:39:33] at least,

[00:39:34] this X risk,

[00:39:36] existential risk

[00:39:37] presumption

[00:39:38] that comes out.

[00:39:39] A,

[00:39:40] I mean,

[00:39:40] there's so many

[00:39:41] logical leaps

[00:39:42] here.

[00:39:43] We're going

[00:39:43] to get to AGI,

[00:39:45] to artificial

[00:39:45] general intelligence,

[00:39:46] that it can do

[00:39:46] anything.

[00:39:47] That's the key.

[00:39:48] It can do

[00:39:48] anything.

[00:39:49] Oh,

[00:39:49] well,

[00:39:50] uh-oh.

[00:39:50] Anything.

[00:39:50] How is it

[00:39:51] going to decide

[00:39:51] what you're

[00:39:52] doing?

[00:39:52] It doesn't

[00:39:53] really decide,

[00:39:53] but okay,

[00:39:54] so it can

[00:39:54] make paperclips.

[00:39:57] And then

[00:39:57] ergo,

[00:39:58] that is an

[00:39:59] existential threat.

[00:40:00] That's the leap that all the TESCREAL boys are making.

[00:40:04] That's what

[00:40:05] they argue

[00:40:06] has to be

[00:40:06] controlled.

[00:40:08] And the

[00:40:09] paradox there

[00:40:09] is,

[00:40:10] let us make

[00:40:11] this all

[00:40:11] powerful machine,

[00:40:13] but recognize

[00:40:14] how powerful

[00:40:15] we are in

[00:40:15] doing so.

[00:40:16] And so I

[00:40:17] like to see

[00:40:18] this leavening

[00:40:18] on that.

[00:40:19] So I thought

[00:40:19] it was worth

[00:40:20] sharing for

[00:40:20] that reason.

[00:40:21] Yeah,

[00:40:22] it benefits

[00:40:23] them to

[00:40:24] say that

[00:40:25] these things

[00:40:25] could become

[00:40:26] all powerful

[00:40:27] because it

[00:40:27] puts them

[00:40:28] into the

[00:40:29] light of

[00:40:30] being so

[00:40:31] smart and

[00:40:32] so good

[00:40:33] at what

[00:40:35] they do

[00:40:35] that they're

[00:40:36] even capable

[00:40:36] of creating

[00:40:37] something that

[00:40:38] could do

[00:40:38] something this

[00:40:39] powerful.

[00:40:39] Right.

[00:40:39] And it

[00:40:40] just kind

[00:40:40] of turns

[00:40:40] out that's

[00:40:41] not the

[00:40:41] case.

[00:40:42] So somewhat

[00:40:42] related to

[00:40:43] that, if

[00:40:43] we're talking

[00:40:44] about disaster

[00:40:44] and existential

[00:40:45] risk and

[00:40:46] everything, there

[00:40:46] was another

[00:40:47] paper that

[00:40:47] came out

[00:40:48] reported by

[00:40:49] livescience.com

[00:40:51] that said

[00:40:52] that an

[00:40:52] algorithm can

[00:40:53] predict tipping

[00:40:54] points for

[00:40:54] future disasters.

[00:40:56] So what struck

[00:40:57] me is, yes,

[00:40:58] there could be

[00:40:58] disasters, but

[00:41:00] those disasters

[00:41:01] can come from

[00:41:01] humanity, they

[00:41:02] can come from

[00:41:03] nature, they

[00:41:04] can come from

[00:41:05] other things

[00:41:05] where what

[00:41:07] AI does, I

[00:41:08] think we have

[00:41:08] to remember

[00:41:09] this, this

[00:41:10] made me think

[00:41:10] of it, is

[00:41:11] that it

[00:41:11] predicts based

[00:41:13] on prior

[00:41:15] experience, based

[00:41:16] on what it's

[00:41:17] been taught, it

[00:41:18] predicts word

[00:41:19] order, here's

[00:41:19] the next word,

[00:41:20] or predicts

[00:41:21] behaviors of

[00:41:22] markets, or

[00:41:24] it predicts

[00:41:26] events.

[00:41:27] And it does

[00:41:28] that based on

[00:41:29] history, it's

[00:41:29] all it can

[00:41:30] have, it's

[00:41:31] all it has.

[00:41:31] And so it'll

[00:41:32] be associated

[00:41:33] with disaster.

[00:41:34] The AI says

[00:41:35] the world's

[00:41:35] going to blow

[00:41:36] up this way,

[00:41:37] right?

[00:41:37] But only

[00:41:38] because it's

[00:41:38] predicting what

[00:41:39] already happens

[00:41:40] on Earth, either

[00:41:42] through humans or

[00:41:42] nature.

[00:41:43] So I think

[00:41:44] that there's

[00:41:44] this association

[00:41:45] of AI with

[00:41:47] danger and

[00:41:48] disaster that

[00:41:49] I find

[00:41:50] fascinating.

[00:41:50] I'm trying to

[00:41:51] get my head

[00:41:51] around why

[00:41:54] that is.

[00:41:55] And I don't think... everybody hates social media, but it's not... and yeah, it could ruin an election, it could even ruin democracy, but it's less doom-based.

[00:42:08] When in

[00:42:09] fact, some

[00:42:09] of the

[00:42:09] technologies

[00:42:10] we've had

[00:42:10] before, internal

[00:42:13] combustion

[00:42:13] engines can

[00:42:15] cause more

[00:42:15] doom in the

[00:42:16] future than

[00:42:16] anything else.

[00:42:18] The doom

[00:42:19] from AI may

[00:42:20] come from

[00:42:20] its consumption

[00:42:22] of electricity.

[00:42:23] But I'm just

[00:42:24] trying to figure

[00:42:25] out that

[00:42:25] association that

[00:42:26] I think is

[00:42:27] starting in the

[00:42:28] culture around

[00:42:29] AI and bad

[00:42:30] stuff, which I

[00:42:31] just find

[00:42:31] fascinating.

[00:42:32] AI and

[00:42:33] bad things.

[00:42:35] Yeah. Yeah, I mean, the predicting-the-tipping-point-for-future-disasters study, I find that interesting because that kind of ties in, at least it seems to me anyways, on the real strength of AI, which is, like you said, taking a look at large data sets, information.
[00:42:57] I like the way you put it, like looking at history, looking at what's happened before and predicting based on that.
[00:43:04] And, you know, in this case, it was 20 years of satellite data from three regions in Central Africa. What the model was attempting to understand was the transformation of tropical forests into savannah, using that previous, you know, like that rich history of data points to predict it for the future, using information on rainfall, on tree coverage of the two other regions.
[00:43:32] And they say, you know, that data accurately predicted, or the AI accurately predicted, in the third region when this would happen.
[00:43:42] 81% of the system nodes were used to come up with, or was it 81% of the system nodes went unobserved? Okay, in this case, chunks of that land, to come up with its inference and to come up with its understanding.
[00:43:57] And so, you know, it seems to me that's what AI is really good at. It's taking all this information, getting some sort of a collective understanding of what it all means, given we've asked it the right questions.
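As an aside for readers, here is a minimal sketch of the kind of history-based prediction described above. It is not the model from the study reported by Live Science; the yearly rainfall and tree-cover numbers are fabricated, and the region and feature names are invented, just to show the pattern of training on two regions' histories and scoring a third.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def synthetic_region(n_years=20, tipping=False):
    """Fabricated yearly [rainfall_mm, tree_cover_pct] observations for one region."""
    rainfall = rng.normal(1200, 100, n_years)
    tree_cover = rng.normal(70, 5, n_years)
    if tipping:  # drier years and a thinning canopy in the run-up to a transition
        rainfall -= np.linspace(0, 300, n_years)
        tree_cover -= np.linspace(0, 30, n_years)
    X = np.column_stack([rainfall, tree_cover])
    y = np.zeros(n_years, dtype=int)
    if tipping:
        y[-5:] = 1  # label the last few years as "approaching a tipping point"
    return X, y

# Train on two regions with known histories, then score a third, unseen region.
X_a, y_a = synthetic_region(tipping=True)
X_b, y_b = synthetic_region(tipping=False)
model = RandomForestClassifier(random_state=0)
model.fit(np.vstack([X_a, X_b]), np.concatenate([y_a, y_b]))

X_c, _ = synthetic_region(tipping=True)
print(model.predict(X_c))  # 1s mark years that resemble past pre-tipping conditions
```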

[00:44:09] Yeah, and then it can find inferences that we just don't see.
[00:44:13] Maybe overlook. Yeah, absolutely.
[00:44:16] I mean, we've seen this. There were some stories this year about breast cancer, that it could detect it much sooner than we otherwise would detect it.
[00:44:23] On the one hand, that can mean, are we going to over-treat, or does it mean instead that we can be more confident of our treatment as we go? But all it's doing is finding inferences, making predictions based on past data, that's it, that our little minds alone couldn't do.
[00:44:41] Yeah, interesting stuff.

[00:44:44] Okay, and then we have Eric Schmidt, ex-Google CEO Eric Schmidt, talking to Stanford.
[00:44:50] It's all out of his mouth every other day these days.
[00:44:52] Oh, boy. Yeah, right.
[00:44:54] Basically, he was talking, there was a video on the Stanford YouTube channel that has since been removed, by the way. The transcript has been preserved, where Schmidt says a number of things that were kind of newsworthy.
[00:45:08] He said that it was removed because of a comment he made about the pandemic, which was, quote, working from home was more important than winning the AI race for Google, according to Schmidt. The reason the startups work is because the people work like hell.
[00:45:24] Maybe there's some truth to that, I suppose. But he's saying that that's why the video was removed. But there were other things in there, also surmising that AI startups can, as The Verge puts it, steal IP and hire lawyers to clean up the mess.
[00:45:38] That's one way of summarizing exactly what he was doing. You want to create a hot new AI startup, or no, I think it had to do with TikTok: if TikTok is going away, direct your AI model at the TikTok business model, learn everything you can, steal the users, steal the music, and then ask it to produce a program in the next 30 seconds around that, release it, and in one hour, if it's not viral, do something different along the same lines. That's the command.
[00:46:16] Anyways, essentially making the argument that stealing the IP is one way to go for people to get a successful tech career. I don't know. Am I missing something?

[00:46:27] Yeah, it was weird. He was talking to Erik Brynjolfsson in the Stanford class, and I think he was kind of just showing off how to be Stanford University entrepreneurial macho, right?
[00:46:36] And he got in trouble, I think, very much so with Google, for basically accusing all the employees there, I will summarize it in a way he didn't say it, of being lazy and working at home. And then on the other hand, he's saying, if you guys really work hard, here's what you could do.
[00:46:53] And it's funny to me, Schmidt comes from a certain time in Silicon Valley. And I'm not sure they don't prefer that time to what we have today, but his time, I think, is gone.

[00:47:06] Yeah. Yeah. He also said, I found it interesting, whether it was the same class or not, he said that the main winner in AI is going to be NVIDIA because of the chips.
[00:47:16] Oh, right.
[00:47:17] But in that, he said, and I found this rather gutsy to say, or inconvenient: when Microsoft did the OpenAI deal, it was the stupidest idea I've ever heard.
[00:47:29] I think... Did he give a reason for that?
[00:47:32] Well, because, yeah, because they were going to make OpenAI the leader in AI where Microsoft should be, or should have itself thought it would be, on its own. And he kind of didn't get the deal.
[00:47:42] I think Microsoft would agree to some extent, because they're hedging their bets, right? They hired Mustafa Suleyman, they're doing their own division, they're doing other things, because I think they recognized that their fate was in Sam Altman's hands.
[00:47:56] And from the moment when Sam Altman got kicked out, I think people in Microsoft said, oh, we're a little vulnerable here, not in our control. So Schmidt's not wrong, but it was a little undiplomatic.
[00:48:08] He also said at one point during this whole conversation, don't quote me on that. I'm sorry, it's a little late for that, you're kind of on a video.
[00:48:18] It's the nature of what you're doing here, Eric. I don't know what to tell you there.

[00:48:22] Got a few things to round things out real quick here. A pretty significant legal development, with artists getting a pretty big victory in their copyright infringement lawsuit against AI art generators.
[00:48:37] This is U.S. District Judge William Orrick allowing some key copyright and trademark claims to proceed against companies like Stability AI, Runway, Stable Diffusion, all of that.
[00:48:51] And the judge found that Stable Diffusion may have been, quote, built to a significant extent on copyrighted works and created with intent to, quote, facilitate infringement.
[00:49:02] Said the way the product operates necessarily invokes copies or protected elements of those works, and this could implicate other AI companies that have nothing to do with building the model but happen to be using it, like DeviantArt's DreamUp, as one example of a firm that's using it that had no part in its creation but could be swept up in something like this.

[00:49:32] Before we get to the last one, let me mention that Tommy D in the YouTube comments has answered my question.
[00:49:41] Seems like you need to enable offline mode in the developer settings for your phone. I guess that's what it's referring to.
[00:49:50] Enable the option to run on-device Gemini Nano right on your Pixel: go to the developer settings, search for the AICore settings, press it, activate, enable. Which I'm sure you don't want to do all that right now, but offline, to experiment with and see... it would be really interesting to me to see what it's capable of doing on the device alone.
[00:50:11] Yeah. For not only the power, but also all the privacy questions.
[00:50:16] For sure. Yeah, well, like I said, I was really surprised to see that that didn't work, because I had the guy who was part of the product lead, and again, I can't remember his name, but when I was at the event he was showing it to me, and I asked him specifically, is this running on device or off? He said, this is all entirely on the device.
[00:50:35] So that's really interesting. Thank you, Tommy D. I will absolutely try that, and I appreciate that. I'll look for that. I do have developer options enabled, so that'll be pretty easy to do.

[00:50:48] And finally, a couple of things that are more fun than anything. First of all, Runway Gen-3 Alpha Turbo, which to me sounds like a Street Fighter game.
[00:50:59] I was, I was thinking it was a race car, but yeah.
[00:51:04] Latest version of its video generation model, seven times faster generations. The Runway co-founder and CEO Cristóbal Valenzuela tweeted, it now takes me longer to type a sentence than to generate a video.
[00:51:20] You can go from prompt to a 10-second video in somewhere around 30 to 45 seconds.
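For anyone curious what using a model like this programmatically tends to look like, here is a rough sketch of the general submit-then-poll pattern for image-to-video generation. The endpoint URL, field names, and header below are placeholders invented for illustration, not Runway's actual API; check Runway's own documentation for the real interface.

```python
import time
import requests

API_BASE = "https://api.example.com/v1"              # placeholder endpoint, not Runway's
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}   # placeholder auth scheme

# Hypothetical request: a source image plus a text prompt for a ~10-second clip.
task = requests.post(
    f"{API_BASE}/image_to_video",
    headers=HEADERS,
    json={
        "image_url": "https://example.com/photo.jpg",
        "prompt": "the man looks up from the composing stick and smiles",
        "duration_seconds": 10,
    },
).json()

# Poll until the generation finishes (the turbo model reportedly takes ~30-45 seconds).
while True:
    status = requests.get(f"{API_BASE}/tasks/{task['id']}", headers=HEADERS).json()
    if status.get("state") in ("succeeded", "failed"):
        break
    time.sleep(5)

print(status.get("video_url") or status)
```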

[00:51:28] And you played around with this a little bit. It's very fast.
[00:51:30] So I just went in and looked for any photo, and I picked a stupid photo of myself, when I was at the Rare Book School learning how to set type in the old-fashioned way, letter at a time.
[00:51:41] So I just took that photo in, because it was the nearest I thought that you could animate in some way. And when I put it in, I had to crop the hell out of it, because they wanted a very horizontal image.
[00:51:51] But so we're going to play the video in a second, and what you'll see is that image. And then you see there, play it again.
[00:51:58] It started, uh, we kind of, it skipped. Yeah. See if you can. Nope. Um, I'm trying to see if I, if I can. There you go.
[00:52:06] So this is here. I got one hand in it holding the type, and then it moves up, and suddenly the type reveals itself to be a circuit board and another hand comes up.
[00:52:17] But note that my first hand now has only three fingers and one thumb, and I'm now holding a circuit board, and it's moving my mouth and lips as if I'm talking. There's no sound, so it isn't.
[00:52:29] Um, and it's so funny, because I know you so well, Jeff, and even just watching your lips in this thing move, I'm like, that's not Jeff. Like, that doesn't look like Jeff at all.
[00:52:38] But I love how it goes to what it knows, right? The idea of, of, of Gutenberg movable type, it has no sense of what this is. So if it's going to animate it, it turns it into a circuit board.
[00:52:49] Oh, it totally did. It just like completely morphs, right? It's a line of type, and then boop.
[00:52:54] It's now what it imagines is under the surface of this thing: it has to be green with parts on it. It must be modern technology. There's no way on earth. I don't even know what that is.
[00:53:05] Yeah, I wonder how much of the data set is, you know, old-school Linotype stuff.
[00:53:09] Not much. Not much. Not much.
[00:53:11] So, but, it was, I think it's cool that you can animate any photo, A. B, it is really fast. C...
[00:53:19] Yeah, so it didn't take very long for you.
[00:53:21] No, you may lose a finger along the way.
[00:53:23] Ouch.
[00:53:24] But, actually, I didn't lose it so much as it merged two of them. It became this one big whale-like bulbous finger. And I had to watch it about 10 times before I reckoned, what's wrong with it? Oh, yeah.
[00:53:36] Because you have to think to look at what it does with hands. And then it made up a hand for me, which was interesting too. So, anyway.
[00:53:45] Yeah.

[00:53:46] Yeah. It's another cool tool out there. And I hope soon...
[00:53:51] And it creates long clips too, as I understand. Like 10 seconds, is that the length of these clips?
[00:53:56] That's what these are, yeah.
[00:53:58] So, I just wrote a syllabus for a class that I might be teaching at another university coming up, about creativity and AI. And so, I'm now starting, I'll be coming to you, Jason, asking for a lot of your advice on this, because I'm looking for tools like this the students can use to create for their own self-expression, to do things they couldn't otherwise have done.
[00:54:19] So, I'm really interested when I see it coming across these tools these days, especially the ones that are free, you can try it, because students can't pay.
[00:54:25] Yeah, yes, and that is important to point out, that that tool does have kind of a free tier. You get a certain number of free credits that you can play with and test it out. So, very cool. I appreciate that.

[00:54:38] I love that you created something. You also created something with this next and last thing, Google's Imagen 3, I think that's how you pronounce it, or is it Imogen? I think it's Imagen 3, text-to-image generation tool.
[00:54:52] It got an update, a big update that we first heard of at Google I/O earlier this year. You can get to it via AI Test Kitchen, which is a Google site that has a bunch of its AI kind of projects, kind of works in progress at this point.
[00:55:07] Better detail, richer lighting, fewer distracting artifacts, according to the article. And you can also, by the way, highlight a part of the image and tell it what you want to change with that particular part. So, that's kind of neat.

[00:55:21] And you created something. What was the...
[00:55:23] I was in Gutenberg mode, not surprising for me. Have a drink, everybody.
[00:55:29] And so, I just asked it to... I said, a man hand-setting type, letter by letter, in a composing stick for the letterpress, long before the Linotype and computerized typesetting. I was thinking of the other image. He is a middle-aged... He's middle-aged with a white, short beard, wearing a blue printer's apron and a paper printer's hat.
[00:55:51] And then it didn't get the composing stick. It just had you...
[00:55:54] So, I went back and I added: when using a composing stick, a printer stands at a type case and picks up each letter with his right hand and puts it in the composing stick, which he holds in his left hand. It couldn't get that. It just didn't understand that.
[00:56:08] So, it's just a rather portly guy wearing a blue apron with a white beard. The funny thing was, I should have shown you all the images. It made a hundred different views of a printer's paper hat.
[00:56:20] This, for those of you who are on video, is a printer's paper hat, which in the old days used to be made out of a piece of newsprint, right? And they would wear this to keep their head clean while they were working on the presses and stuff.
[00:56:33] And then when kids would come to the newspapers, they would make this to wow them, and they could walk away with it. So, that's the thing. So, this is a printer's paper hat.
[00:56:41] So, the computer made really creative ones. This one has kind of a swoop on it. And, you know, in text we'd call these hallucinations.
[00:56:49] It's got a visor. This one has a visor.
[00:56:51] Yeah, I don't know how the hell you'd fold that. It's a hell of an origami.
[00:56:55] Yeah, you could wear this outside and it'll protect you from the sun. Maybe it's a good adventure. We should patent that, I think.

[00:57:01] But, here too, I think the image quality was good. It was fast. It's getting better.
[00:57:08] But the thing about all AI that we have to keep in mind: it does not, cannot, know reality. It has no way to tie to reality, to understand what would be truthful and not.
[00:57:20] This is not a hallucination, it's that it doesn't have any experience. And it can't have any experience.
[00:57:26] Maybe what's actually going on, and we just haven't figured it out yet, is that AI, generative AI like this, it's not that it's creating things like it doesn't know reality, it's creating a different reality that we don't know exists.
[00:57:40] Matrix time.
[00:57:42] Yes, exactly. Somewhere on the other side, there's not just a red pill and a blue pill, there's a green pill.
[00:57:52] Thank you. That's so amazing. I love it. The hat makes me laugh and makes me happy.

[00:57:59] Jeff, thank you so much for doing the show with me each and every week. This was a lot of fun. Gutenberg Parenthesis, of course. I feel like, did you have some news that's hitting here?
[00:58:12] Oh, well, so Monday, I hope to... I have a book. It's The Web We Weave. There's a green book over my shoulder over there. It's way too far back to see.
[00:58:25] Why We Must Reclaim the Internet from Moguls, Misanthropes, and Moral Panic. It will be out in October, but the discount code for pre-ordering just came up.
[00:58:43] And I would feel so grateful. You could order it and just use the code WEB20 to get 20% off.
[00:58:50] Nice. Excellent. That's what I had heard of. Thank you. Love it.
[00:58:55] Thank you, Jeff.
[00:58:56] Thank you, Jason. Always fun.

[00:58:58] And I want to plug Jason once again, that he covered the Google event with the Pixel and had all kinds of videos about the individual features, but also did a great, I don't know, I'd call it a travelogue. What would you call it? It was the whole day.
[00:59:12] It was a vlog-style approach. I really, when I was kind of going into my Made by Google day, I had this thing, I showed Jeff in pre-show, which is maybe a little, you know, a little too behind the scenes for some of the viewers, but it's an Osmo Pocket 3.
[00:59:28] It's like a 4K handheld, really easy, on-the-go camera for vlog-style recording, and I used it for the event.
[00:59:40] And so, you know, it's really, a lot of it is kind of what it's like, well, I'm showing an ad now, but what it's like to go to an event like that and see what I saw and be involved with it and everything.
[00:59:52] And yeah, it was just a lot of fun. It was a different style of video, and I enjoyed it, and I got a lot of positive feedback from it.
[00:59:58] So, if you do want to check that out, you can find that at the TechSploder YouTube channel, YouTube.com slash at TechSploder, as well as my video that published today about the Pixel 9 and all the AI features. I went into detail on a lot of that stuff there. So, just go to that page, search for it.

[01:00:14] Of course, you can also get our video, the video version of this show, on that page, which we record live every Wednesday, 10 a.m. Pacific, 1 p.m. Eastern.
[01:00:26] And if you don't want to see this in video and you just want the audio, I recommend you go to AIinside.show. That is the webpage with all the information for the podcast, so you can subscribe to the audio podcast, and we appreciate it if you do that.
[01:00:43] And finally, I would be remiss if I didn't point out the Patreon. That's patreon.com slash AIinsideshow.
[01:00:50] You can join so many patrons who support us each and every week, who get special perks like ad-free shows, early access to videos that have something to do with AI on the TechSploder YouTube channel, a Discord community, regular hangouts.
[01:01:03] You also get, by the way, an AI Inside t-shirt. That's something that you want. You can become an executive producer at that level, and you'll get a t-shirt, and you will also get your name read at the end of the show, like Dr. Do, Jeffrey Maricini, WPVM 103.7 in Asheville, North Carolina, and Paul Lang.
[01:01:23] Yay, thank you all.
[01:01:24] Yay, thank you all so much, and thank you so much to everyone, even outside of the Patreon, for supporting and watching this show. We will see you next time on another episode of AI Inside. Bye, everybody.
[01:01:38] See y'all.