Gemini Live Pumps Up The Volume
August 14, 2024
59:12

[00:00:00] [SPEAKER_01]: This is AI Inside, episode 30, recorded Wednesday, August 14th, 2024. Gemini Live Pumps Up The Volume.

[00:00:10] [SPEAKER_01]: This episode of AI Inside is made possible by our wonderful patrons at patreon.com slash AI Inside Show.

[00:00:17] [SPEAKER_01]: If you like what you hear, head on over and support us directly, and thank you for making independent podcasting possible.

[00:00:30] [SPEAKER_01]: What's going on, everybody? I'm Jason Howell, and welcome to another episode of AI Inside, the show where we take a look at the AI sprinkled inside so many things, Google edition.

[00:00:40] [SPEAKER_01]: Because Google had an event yesterday, which we're definitely going to talk about, and it sprinkled AI throughout everything.

[00:00:48] [SPEAKER_01]: But we already kind of knew that they were doing that. Who's we? Well, it's me, Jason Howell, and my co-host, Jeff Jarvis. How you doing, Jeff?

[00:00:56] [SPEAKER_00]: Hey there, boss. How are you? Welcome back. Jason had a very long day yesterday covering Google, but he knows all about it now, so that's good.

[00:01:04] [SPEAKER_01]: Yeah, no, it was a lot of fun. It was my first opportunity to attend one of Google's hardware events. I've covered them for years.

[00:01:13] [SPEAKER_01]: You did live coverage online for TWiT as well. I've done that thing so many times, it was kind of nice to be in the room, you know, in the room where it happened.

[00:01:23] [SPEAKER_01]: Not switching cameras and stuff as it goes on, you know.

[00:01:26] [SPEAKER_01]: Yeah, totally. And to get, you know, kind of the, there were a couple of moments which we can talk about when we get there, but a couple of moments where, you know, the audience reaction to certain things that were happening on stage to kind of get that, you know, the ability to get a window into that instead of like the full mix that you get when it's streamed and everything.

[00:01:42] [SPEAKER_01]: So, it was cool.

[00:01:44] [SPEAKER_01]: Yeah.

[00:01:44] [SPEAKER_00]: It was fun to be there.

[00:01:45] [SPEAKER_00]: Was there any swag besides phones or besides gadgets? Any new beanies or anything?

[00:01:51] [SPEAKER_00]: No, nothing like that. I mean, yeah.

[00:01:55] [SPEAKER_00]: The old fun days are gone.

[00:01:56] [SPEAKER_01]: There was, yeah, I got a paper bag that says Google on it, so that's neat, I guess.

[00:02:02] [SPEAKER_01]: How was the food? Do you have food?

[00:02:05] [SPEAKER_01]: Oh, my goodness. Okay, food. We don't have to spend a lot of time on this, but it is notable.

[00:02:11] [SPEAKER_01]: If they communicated something about food, I have no idea where they communicated it, because suddenly it was like 2 o'clock and I hadn't eaten since like 6:30 in the morning.

[00:02:22] [SPEAKER_01]: And I was like, okay, I can't finish here until I eat some food.

[00:02:26] [SPEAKER_01]: And thankfully, some friends who were there, Rich DeMuro, Ron Richards, Mishaal Rahman, went scoping.

[00:02:33] [SPEAKER_01]: Apparently, they had food in a very specific area, but as far as I could tell, they didn't tell people.

[00:02:39] [SPEAKER_01]: And it was not very good.

[00:02:40] [SPEAKER_01]: Like, I got this Reuben and the bread was like sticky on the outside, you know?

[00:02:45] [SPEAKER_01]: It was like the wetness from the inside had penetrated through to the outside.

[00:02:49] [SPEAKER_01]: But you were so hungry.

[00:02:50] [SPEAKER_01]: I didn't eat it.

[00:02:51] [SPEAKER_01]: But I did find a peanut butter and jelly sandwich, which was something.

[00:02:55] [SPEAKER_01]: And that was actually okay.

[00:02:57] [SPEAKER_01]: You need to Google what it used to be, man.

[00:02:59] [SPEAKER_01]: No, no, no, big time.

[00:03:01] [SPEAKER_01]: I mean, at that event.

[00:03:02] [SPEAKER_01]: I've had great food at other Google events.

[00:03:04] [SPEAKER_01]: So who knows?

[00:03:05] [SPEAKER_01]: Who knows?

[00:03:07] [SPEAKER_00]: So what did you think of the event?

[00:03:08] [SPEAKER_00]: I'm eager to hear you because you're the expert in Android and stuff.

[00:03:11] [SPEAKER_00]: I'm eager to hear yours first.

[00:03:12] [SPEAKER_00]: That'll give you mine.

[00:03:13] [SPEAKER_01]: Okay.

[00:03:14] [SPEAKER_01]: Should we get to that here in a second?

[00:03:16] [SPEAKER_01]: Oh, do your stuff.

[00:03:17] [SPEAKER_01]: Yeah, sorry.

[00:03:17] [SPEAKER_01]: I want to do a few top of the show stuff.

[00:03:20] [SPEAKER_01]: That's coming up here in just like 60 seconds.

[00:03:22] [SPEAKER_01]: Before we get started, thank you to those who support us directly on Patreon.

[00:03:27] [SPEAKER_01]: Patreon.com slash AI Inside Show.

[00:03:29] [SPEAKER_01]: Robert Friske.

[00:03:31] [SPEAKER_01]: Friske?

[00:03:31] [SPEAKER_01]: Frisk?

[00:03:32] [SPEAKER_01]: I'm sorry.

[00:03:33] [SPEAKER_01]: I don't know how to pronounce your last name.

[00:03:34] [SPEAKER_01]: But regardless, you are awesome no matter how you pronounce it for supporting us.

[00:03:38] [SPEAKER_01]: Patreon.com slash AI Inside Show.

[00:03:41] [SPEAKER_01]: And in general, rate, review, subscribe wherever you can.

[00:03:46] [SPEAKER_01]: But I do have some new information along these lines.

[00:03:48] [SPEAKER_01]: Pocket Casts, I found out, which is according to the listener survey that we just did,

[00:03:54] [SPEAKER_01]: the podcatcher that we have the most listeners subscribed through.

[00:03:59] [SPEAKER_01]: Pocket Casts now has the ability to review podcasts or rather not give a full review, but

[00:04:05] [SPEAKER_01]: at least give a rating.

[00:04:06] [SPEAKER_01]: So if you go into your app, you should be able to find for AI Inside this little pulldown

[00:04:11] [SPEAKER_01]: that will give you the description of the show.

[00:04:13] [SPEAKER_01]: And right next to it is rate it.

[00:04:16] [SPEAKER_01]: I'm happy to say we have 25 ratings so far, 5.0 stars.

[00:04:20] [SPEAKER_01]: So that's amazing.

[00:04:22] [SPEAKER_01]: But that's just another area that you can go into to leave your impressions of the show.

[00:04:27] [SPEAKER_01]: Love to get that number up there.

[00:04:29] [SPEAKER_01]: Whatever you feel about it.

[00:04:30] [SPEAKER_01]: I'm not telling you you have to give us 5 stars, but go in there and share your rating.

[00:04:35] [SPEAKER_01]: That tells Pocket Casts that we have a live, active, engaged audience.

[00:04:39] [SPEAKER_01]: And it tells other people, hey, this might be a good podcast to subscribe to.

[00:04:43] [SPEAKER_01]: So we appreciate that.

[00:04:46] [SPEAKER_01]: All right.

[00:04:47] [SPEAKER_01]: And with that out of the way, all right.

[00:04:48] [SPEAKER_01]: So let's get to the fun stuff.

[00:04:49] [SPEAKER_01]: I was in Mountain View for Made by Google, their latest hardware event.

[00:04:56] [SPEAKER_01]: You know, they do this every year.

[00:04:58] [SPEAKER_01]: It's a little earlier than normal this year.

[00:05:00] [SPEAKER_01]: I think often this is in October, but this year it's in August.

[00:05:03] [SPEAKER_01]: They wanted to beat Apple.

[00:05:05] [SPEAKER_01]: Yeah, it kind of feels that way.

[00:05:07] [SPEAKER_01]: It definitely feels that way.

[00:05:09] [SPEAKER_01]: But they announced, you know, their full new lineup of Pixel 9 devices.

[00:05:14] [SPEAKER_01]: They announced their latest Fold device, Pixel Buds Pro 2 or Buds 2 Pro.

[00:05:23] [SPEAKER_01]: I can't get the name straight.

[00:05:25] [SPEAKER_01]: It's really hard.

[00:05:26] [SPEAKER_01]: And then what else?

[00:05:28] [SPEAKER_01]: The watch.

[00:05:29] [SPEAKER_01]: Oh, the Pixel Watch 3.

[00:05:31] [SPEAKER_01]: And it was, you know, yes, new devices.

[00:05:35] [SPEAKER_01]: But as you can imagine, the entire event was sprinkled through and through with artificial intelligence

[00:05:42] [SPEAKER_01]: hooks everywhere you looked.

[00:05:44] [SPEAKER_01]: I mean, it was just a big gallery for Gemini features, essentially, and the Tensor chip

[00:05:49] [SPEAKER_01]: and the A1 chip and the headphones that all kind of interplay with their advancements

[00:05:55] [SPEAKER_01]: in artificial intelligence.

[00:05:57] [SPEAKER_01]: And you covered it for TWiT.

[00:05:58] [SPEAKER_01]: I was there in the room while it was all going on.

[00:06:02] [SPEAKER_01]: And yeah, I mean, I thought it was a pretty good event.

[00:06:05] [SPEAKER_01]: And I'm curious, having been in the room, how you felt about it kind of at a distance watching

[00:06:11] [SPEAKER_01]: it on the screen the way that I'm used to experiencing those events.

[00:06:15] [SPEAKER_00]: What struck me was how much the hardware was not an afterthought, but was later in the

[00:06:23] [SPEAKER_00]: show.

[00:06:24] [SPEAKER_00]: That at the beginning, it was a lot of AI talk, but it was also trying to put the device

[00:06:31] [SPEAKER_00]: into a stack.

[00:06:33] [SPEAKER_00]: And it's a stack that Google has all the parts of, right?

[00:06:36] [SPEAKER_00]: And it has the online stuff, but it also has the things that can now happen on the device.

[00:06:39] [SPEAKER_00]: And it's all related to Gemini, and it's all related to AI.

[00:06:43] [SPEAKER_00]: And so this is just one piece of the puzzle, one layer of the Napoleon, where it was going.

[00:06:49] [SPEAKER_00]: So I found that really interesting that they spent the first, what, 40 minutes, 50 minutes

[00:06:54] [SPEAKER_00]: before they even got to the hardware.

[00:06:57] [SPEAKER_00]: So it was really more of a, and I wouldn't say it was a platform because so much of what's

[00:07:03] [SPEAKER_00]: occurring is going to occur on your device again.

[00:07:05] [SPEAKER_00]: So to me, it was a stack show.

[00:07:09] [SPEAKER_00]: And then the device, I think was good.

[00:07:13] [SPEAKER_00]: They said, sometimes you wonder if they're telling the truth.

[00:07:15] [SPEAKER_00]: They said that it was going to be live demos.

[00:07:18] [SPEAKER_00]: And then when it screws up twice, you know it's a live demo.

[00:07:22] [SPEAKER_01]: Yeah.

[00:07:23] [SPEAKER_01]: I mean, they did have the first demo that they showed.

[00:07:26] [SPEAKER_01]: Was it the first?

[00:07:26] [SPEAKER_01]: I want to say it was an earlier demo.

[00:07:29] [SPEAKER_01]: It was like, all right, here we go.

[00:07:30] [SPEAKER_01]: You know, anytime.

[00:07:31] [SPEAKER_01]: And they called it out with a little thing up at the top, said live demo.

[00:07:34] [SPEAKER_01]: And, you know, we're talking AI.

[00:07:36] [SPEAKER_01]: They pointed out multiple times, like, this is, you know, this is Gemini.

[00:07:40] [SPEAKER_01]: It's AI.

[00:07:41] [SPEAKER_01]: So we aren't actually entirely sure what it's going to spit out.

[00:07:45] [SPEAKER_01]: But in that demo that you're talking about, and I can't remember off the top of my head.

[00:07:50] [SPEAKER_01]: Oh, it was Gemini Live.

[00:07:52] [SPEAKER_01]: It was Gemini Live.

[00:07:53] [SPEAKER_01]: Yeah.

[00:07:53] [SPEAKER_01]: It was a demonstration of Gemini Live, which is really, you know, their kind of,

[00:07:58] [SPEAKER_01]: their voice chat interface.

[00:08:01] [SPEAKER_01]: It's Gemini, it's multimodal, talk to it, converse with it, interrupt it if you need to,

[00:08:06] [SPEAKER_01]: point it at a calendar or it was like a concert calendar or something like that.

[00:08:11] [SPEAKER_01]: And they wanted it to pull out information.

[00:08:13] [SPEAKER_00]: And compare it to your own calendar and know what each was so that it understood the end of scenes.

[00:08:18] [SPEAKER_00]: So it was impressive.

[00:08:19] [SPEAKER_01]: Yeah, yeah.

[00:08:20] [SPEAKER_01]: It points out the kind of interconnected web of what their AI, what they envision their AI is going to do,

[00:08:26] [SPEAKER_01]: which is we've got all these different products.

[00:08:29] [SPEAKER_01]: How can we interconnect them with your voice through this chat bot so that it can do the heavy lifting of kind of,

[00:08:36] [SPEAKER_01]: you know, taking this task, moving it over into Keep or moving it over into Calendar and creating an event,

[00:08:41] [SPEAKER_01]: doing all this stuff.

[00:08:41] [SPEAKER_01]: Really, I think, very useful stuff when it works, but it didn't work.

[00:08:45] [SPEAKER_01]: It didn't work at first.

[00:08:46] [SPEAKER_01]: Two tries in the first live demo.

[00:08:48] [SPEAKER_01]: And then the third try with a different phone, it did work.

[00:08:52] [SPEAKER_01]: He was so relieved.

[00:08:54] [SPEAKER_01]: Yeah.

[00:08:54] [SPEAKER_01]: I mean, I was feeling for the guy like this.

[00:08:56] [SPEAKER_01]: You know, I was there in the room and, you know, every time it failed, it was just like, oh, no.

[00:09:02] [SPEAKER_01]: Like, I felt bad for him.

[00:09:03] [SPEAKER_01]: Hey, Gemini, will I have a job tomorrow?

[00:09:05] [SPEAKER_01]: Yeah.

[00:09:05] [SPEAKER_01]: You know, and of course it wasn't his fault.

[00:09:09] [SPEAKER_01]: But, you know, but the question did come up in some of the people that I was talking to.

[00:09:13] [SPEAKER_01]: It was like, well, should it be, you know, should a company like this opt for live demos when there's a potential for failure?

[00:09:19] [SPEAKER_01]: Or should they do prerecorded?

[00:09:21] [SPEAKER_01]: And I'm like, you know, I think where I've come down on this is always go for the live demo when you can.

[00:09:25] [SPEAKER_01]: Yeah.

[00:09:25] [SPEAKER_01]: Because products are imperfect and people are going to discover that anyways.

[00:09:30] [SPEAKER_01]: And what's worse is a prerecorded thing that makes things look as, you know, like they're bulletproof.

[00:09:38] [SPEAKER_01]: And then your experience is counter to that.

[00:09:40] [SPEAKER_01]: At least this puts it into a realistic light.

[00:09:43] [SPEAKER_01]: That AI, generative AI doesn't always work.

[00:09:46] [SPEAKER_01]: We know this.

[00:09:46] [SPEAKER_00]: Right, exactly.

[00:09:47] [SPEAKER_00]: And the other thing that struck me, I think you're going to do a demo of one of these or a video of one of these, is that up till now, AI is just fancy Alexa.

[00:09:56] [SPEAKER_00]: Right?

[00:09:57] [SPEAKER_00]: And you've got to think to ask it a question and get something back and the answer may not be right or may be wrong.

[00:10:02] [SPEAKER_00]: But what we saw yesterday, I think, was a little more practical application level of what you could do.

[00:10:09] [SPEAKER_00]: So you could change photos.

[00:10:10] [SPEAKER_00]: You could also, I like the screenshot thing a lot.

[00:10:14] [SPEAKER_01]: I did too.

[00:10:16] [SPEAKER_01]: Yeah, right now what I'm showing is the add me demo.

[00:10:19] [SPEAKER_01]: And this was actually a really cool feature that I think is getting a lot of buzz.

[00:10:24] [SPEAKER_01]: There's my pal Ron Richards stepping into the photo second.

[00:10:29] [SPEAKER_01]: So basically what it is, is you take a photo with the camera of, you know, a group of people.

[00:10:34] [SPEAKER_01]: Let's say you're behind the camera taking the picture.

[00:10:36] [SPEAKER_01]: And then you get someone to hold the phone.

[00:10:38] [SPEAKER_01]: And using ARCore, so it's like an AR overlay of the previous photo, you can find your spot in the photo and take that second picture.

[00:10:49] [SPEAKER_01]: And then it uses AI to kind of merge these two images together in a realistic way.

[00:10:56] [SPEAKER_01]: And it looked really good.

[00:10:57] [SPEAKER_01]: The final product actually works really well, according to, you know, a couple of the demos that we did.

[00:11:04] [SPEAKER_01]: And that was a really good kind of example of Google taking all of its efforts.

[00:11:08] [SPEAKER_01]: It's different efforts here.

[00:11:10] [SPEAKER_01]: I'm trying to get it without blurring.

[00:11:12] [SPEAKER_01]: You know, Ron wasn't in there for that picture.

[00:11:14] [SPEAKER_01]: He walked in at this part of it.

[00:11:16] [SPEAKER_00]: What did you have to do as a command?

[00:11:17] [SPEAKER_00]: Did you have to tell them which person was going to disappear or anything?

[00:11:19] [SPEAKER_00]: No.

[00:11:20] [SPEAKER_00]: No.

[00:11:21] [SPEAKER_01]: No.

[00:11:21] [SPEAKER_01]: So there is in the camera app, there is a specific mode just like, you know, camera or pro or night photography or whatever the other modes are that says add me.

[00:11:31] [SPEAKER_01]: So you're telling the camera, all right, we're going into add me.

[00:11:34] [SPEAKER_00]: Somebody is going to disappear.

[00:11:35] [SPEAKER_01]: Somebody is going to.

[00:11:36] [SPEAKER_01]: Yeah.

[00:11:36] [SPEAKER_01]: And it doesn't need to know.

[00:11:38] [SPEAKER_01]: What you need to know as a user is to make space for someone to come in later in the photo.

[00:11:44] [SPEAKER_01]: Oh, okay.

[00:11:45] [SPEAKER_00]: That's all it needs.

[00:11:45] [SPEAKER_00]: All right.

[00:11:46] [SPEAKER_01]: Yeah.

[00:11:46] [SPEAKER_01]: That's all it needs.

[00:11:47] [SPEAKER_01]: Because all it's doing is it's taking that first photo that you took and then basically overlaying it.

[00:11:53] [SPEAKER_01]: Let's see if I can get a screenshot for.

[00:11:55] [SPEAKER_01]: Yeah.

[00:11:55] [SPEAKER_01]: You can kind of see Ron Richards is walking in there overlaid on top.

[00:12:00] [SPEAKER_01]: Essentially, what it's doing is the AI carves out the background so the subjects are floating over the video image behind it.

[00:12:10] [SPEAKER_01]: What was also really cool.

[00:12:11] [SPEAKER_01]: Walks in.

[00:12:12] [SPEAKER_01]: When she leaves, it looks like she's leaving her body.

[00:12:16] [SPEAKER_01]: Yeah, totally.

[00:12:17] [SPEAKER_01]: That was kind of weird.

[00:12:17] [SPEAKER_00]: Because there's two of her for a brief moment, which is fun.

[00:12:21] [SPEAKER_01]: Right.

[00:12:21] [SPEAKER_01]: Right.

[00:12:22] [SPEAKER_00]: And then, yeah.

[00:12:23] [SPEAKER_00]: Then I see.

[00:12:24] [SPEAKER_00]: So Ron, the final.

[00:12:25] [SPEAKER_00]: So right before when Ron goes in there, whoever is sitting next to him does the thumbs up.

[00:12:30] [SPEAKER_00]: But the final photo doesn't have her with the thumbs up because she didn't have her thumbs up in the original photo.

[00:12:35] [SPEAKER_00]: Right.

[00:12:36] [SPEAKER_01]: This was the original photo.

[00:12:37] [SPEAKER_00]: So she thinks that she's posing again.

[00:12:39] [SPEAKER_00]: This is the second photo.

[00:12:40] [SPEAKER_00]: But it's only Ron who's posing.

[00:12:42] [SPEAKER_00]: Only the added person.

[00:12:42] [SPEAKER_01]: And then this is the final photo.

[00:12:44] [SPEAKER_01]: Exactly.

[00:12:45] [SPEAKER_01]: Exactly.

[00:12:45] [SPEAKER_01]: The original photo has Ron and her with thumbs up.

[00:12:49] [SPEAKER_01]: Or sorry.

[00:12:50] [SPEAKER_01]: The original photo just has.

[00:12:52] [SPEAKER_01]: Right.

[00:12:52] [SPEAKER_01]: Oh, I had it.

[00:12:53] [SPEAKER_01]: I had it right the first time.

[00:12:54] [SPEAKER_01]: But yes, you're absolutely right.

[00:12:55] [SPEAKER_01]: It takes the original kind of configuration of the three people in this case and then superimposes or stitches together Ron's pose.

[00:13:04] [SPEAKER_01]: That's fun.

[00:13:05] [SPEAKER_01]: The second time.

[00:13:05] [SPEAKER_01]: And it keeps every iteration of those photos as well.

[00:13:09] [SPEAKER_00]: So it's kind of a reverse Stalin where he airbrushed people out when he killed them.

[00:13:16] [SPEAKER_00]: Oh, now you can add yourself in instead.

[00:13:19] [SPEAKER_01]: There you go.

[00:13:21] [SPEAKER_01]: But they did not call it reverse Stalin.

[00:13:23] [SPEAKER_01]: They called it add me, which is probably a good marketing approach.

[00:13:27] [SPEAKER_01]: Yeah.

[00:13:27] [SPEAKER_01]: And you mentioned screenshots, which I had a really wonderful conversation.

[00:13:32] [SPEAKER_01]: I wish I could remember his name, but he's one of the leads on the screenshot team.

[00:13:35] [SPEAKER_01]: So I'm going to be releasing an interview with him and kind of a demo of this at some point this week on the Techsploder YouTube channel.

[00:13:44] [SPEAKER_01]: But I did not expect to be as blown away by the idea of this app, the screenshots app, as I was.

[00:13:51] [SPEAKER_01]: Because it's really easy to think about screenshots as like this disposable thing, right?

[00:13:55] [SPEAKER_01]: Like, oh, yeah, I take a screenshot because I want to draw an arrow on it and point to something that's important.

[00:14:01] [SPEAKER_01]: But I've realized over time, like I've taken tons of screenshots.

[00:14:04] [SPEAKER_01]: I'm usually taking screenshots of things that I want to remember or things that I want to share or things that I want to refer to later or whatever.

[00:14:13] [SPEAKER_01]: Having, A, a place to go to where it's organized.

[00:14:17] [SPEAKER_01]: And, B, having the capability of AI to parse information from these places and allow me to organize it.

[00:14:27] [SPEAKER_01]: The comparison that I ended up making after playing around with it is that it's like a personalized on-device Pinterest.

[00:14:34] [SPEAKER_01]: It's a place for me to catalog all of these important, very visual things for myself and keep them organized through the power of AI.

[00:14:44] [SPEAKER_01]: I thought it was really impressive.

[00:14:45] [SPEAKER_00]: Yeah, and I think it's, you know, the phone as your eyes.

[00:14:49] [SPEAKER_00]: And we think about this when it comes to AR, where you use the phone camera to look at something and superimpose on it.

[00:14:55] [SPEAKER_00]: This is different.

[00:14:56] [SPEAKER_00]: This is something that I've used in my day.

[00:15:00] [SPEAKER_00]: I now freeze it.

[00:15:01] [SPEAKER_00]: I take a screenshot of it.

[00:15:03] [SPEAKER_00]: And then that has data value going forward for me.

[00:15:06] [SPEAKER_00]: I can organize it.

[00:15:07] [SPEAKER_00]: I can get information out of it.

[00:15:08] [SPEAKER_00]: I can seek it again, right?

[00:15:09] [SPEAKER_00]: If that's all true.

[00:15:10] [SPEAKER_00]: Yeah.

[00:15:11] [SPEAKER_00]: Oh, yeah, absolutely.

[00:15:12] [SPEAKER_00]: That's all because of the AI, which really is nothing to me but an extension of Google Photos.

[00:15:17] [SPEAKER_00]: Yeah.

[00:15:18] [SPEAKER_00]: I can ask for all the cats in my Google Photos.

[00:15:20] [SPEAKER_00]: Now I can ask for all the shoes I'm looking at in the screenshots.

[00:15:25] [SPEAKER_00]: And these are the things that have passed by my eyes that I can now get access to in a new way thanks to AI.

[00:15:31] [SPEAKER_01]: Totally.

[00:15:31] [SPEAKER_01]: And it was really impressive.

[00:15:33] [SPEAKER_01]: You know, one of the examples that they showed was this, like, this music festival calendar.

[00:15:39] [SPEAKER_01]: And on the calendar, it said pre-sale on this date.

[00:15:43] [SPEAKER_01]: And then it had in bigger, bolder letters a date that's, like, two weeks in advance.

[00:15:47] [SPEAKER_01]: As a human, we would look at that and we would say, okay, pre-sale on that date.

[00:15:52] [SPEAKER_01]: The event is that date.

[00:15:53] [SPEAKER_01]: And the AI was able to infer without it specifically calling that out.

[00:15:58] [SPEAKER_01]: You know, he did the command to be like, what day is this event on?

[00:16:02] [SPEAKER_01]: And it knew that the larger font, it could infer that the larger font was a representation of the actual day of the event and not the pre-sale date that was also on the screen.

[00:16:11] [SPEAKER_01]: So, yeah, it's pretty impressive stuff.

[00:16:14] [SPEAKER_01]: That is.

[00:16:14] [SPEAKER_00]: So this is becoming more practical AI.

[00:16:18] [SPEAKER_00]: Yeah.

[00:16:19] [SPEAKER_00]: This is trying to apply AI to your life in a way that starts to feel a bit useful, more than I've seen before.

[00:16:26] [SPEAKER_00]: ChatGPT is cool, but not really useful.

[00:16:31] [SPEAKER_00]: NotebookLM is useful, but in a very specific way of research.

[00:16:36] [SPEAKER_00]: This is useful just in the everyday.

[00:16:38] [SPEAKER_00]: Now, the question is, will you think to use it?

[00:16:42] [SPEAKER_00]: Yeah.

[00:16:42] [SPEAKER_00]: That was the problem with the chat interfaces.

[00:16:44] [SPEAKER_00]: And back to Alexa is, I don't think to ask the machine something.

[00:16:47] [SPEAKER_00]: It's just not my habit.

[00:16:49] [SPEAKER_00]: But it is my habit to take screenshots.

[00:16:51] [SPEAKER_00]: Yeah.

[00:16:52] [SPEAKER_01]: And if you're taking screenshots, I'm guessing on the Pixel phone, like, I don't know this specifically.

[00:16:57] [SPEAKER_01]: I didn't see it.

[00:16:59] [SPEAKER_01]: But I'm guessing as you take your first couple of screenshots, it might say, your screenshots are saved in the new screenshots app.

[00:17:05] [SPEAKER_01]: You know, to kind of alert you that it exists and to go in there and then to recognize, like, what's going on there.

[00:17:13] [SPEAKER_00]: You know, there's other stupid stuff that I wonder whether it changes my habits.

[00:17:15] [SPEAKER_00]: I often, often take pictures, especially when I was still working in a regular office, of whiteboards.

[00:17:24] [SPEAKER_01]: Oh, yeah.

[00:17:25] [SPEAKER_01]: There you go.

[00:17:26] [SPEAKER_01]: That's a rich thing.

[00:17:26] [SPEAKER_00]: And how could this – I wonder what it could do with that.

[00:17:29] [SPEAKER_00]: Can it translate it?

[00:17:30] [SPEAKER_00]: Can it read it back to me?

[00:17:31] [SPEAKER_00]: Can it summarize it?

[00:17:32] [SPEAKER_00]: My handwriting, probably not.

[00:17:33] [SPEAKER_01]: Oh, yeah?

[00:17:34] [SPEAKER_01]: My guess is yes because one of the other examples that he showed was – you know, I've got all these recipes – I think it was like a flan – a bunch of screenshots about flan recipes.

[00:17:44] [SPEAKER_01]: One of the recipes had a little piece of paper about a third of the size of the image with a handwritten, hand-scrawled recipe on it.

[00:17:53] [SPEAKER_01]: And one of the queries that he was able to do reached into that photo, was able to detect the words that were handwritten on there and pull that – the pertinent information and say these are the ingredients you're going to need for the flan.

[00:18:06] [SPEAKER_00]: Sheets.

[00:18:07] [SPEAKER_01]: Wow.

[00:18:07] [SPEAKER_01]: And so if you're able to point it at a whiteboard, absolutely.

[00:18:10] [SPEAKER_01]: I think that's – unless you've got someone with total chicken scratch.

[00:18:13] [SPEAKER_00]: I have really bad handwriting.

[00:18:14] [SPEAKER_00]: Really bad handwriting.

[00:18:16] [SPEAKER_00]: It will be a test for it.

[00:18:18] [SPEAKER_00]: That – people take pictures of where their car is parked.

[00:18:21] [SPEAKER_00]: Yeah.

[00:18:21] [SPEAKER_00]: Yeah.

[00:18:22] [SPEAKER_00]: Yeah.

[00:18:23] [SPEAKER_00]: They take pictures of menus, of food.

[00:18:26] [SPEAKER_00]: Yep.

[00:18:26] [SPEAKER_00]: Obviously, we've seen applications in the past that will analyze the nutritional value of your food and check it against a diet.

[00:18:34] [SPEAKER_00]: My wife sends me to the grocery store, and I'm absolutely awful at it, so I'll send pictures to her saying, which eggs did you want?

[00:18:41] [SPEAKER_00]: There's all kinds of other things that pass in the day that – that's why I'm thinking I'm getting the nine.

[00:18:49] [SPEAKER_00]: Yeah.

[00:18:49] [SPEAKER_00]: Are you tempted?

[00:18:50] [SPEAKER_01]: Oh, yeah.

[00:18:51] [SPEAKER_01]: Of course.

[00:18:52] [SPEAKER_01]: I mean, it's hard for me to not get the next pixel because every year that's what I do.

[00:18:58] [SPEAKER_01]: Yeah.

[00:18:58] [SPEAKER_01]: And I also feel kind of duty-bound to that considering what I do here.

[00:19:04] [SPEAKER_01]: Well, now it's a double duty because now it's so AI-ish.

[00:19:07] [SPEAKER_01]: Yeah.

[00:19:08] [SPEAKER_01]: Yeah, yeah, yeah.

[00:19:08] [SPEAKER_01]: Totally.

[00:19:09] [SPEAKER_01]: So, yes, absolutely.

[00:19:11] [SPEAKER_01]: I do also – I did also want to point out – I thought this was kind of interesting.

[00:19:15] [SPEAKER_01]: So, they pointed out a feature, which I don't have footage for right now anyways, called Reimagine, which is essentially – it's essentially like mid-journey on your device, right?

[00:19:25] [SPEAKER_01]: It's the ability to take a photo that you have in your photo roll and say, eh, you know what?

[00:19:30] [SPEAKER_01]: I want to change the sky to clouds instead of sunny, or make the field full of petunias, or whatever.

[00:19:38] [SPEAKER_01]: And it reimagines this stuff.

[00:19:40] [SPEAKER_01]: And one question that I had that I got the answer on was, like, usually we're used to seeing services like that be free to a certain point.

[00:19:50] [SPEAKER_01]: Yeah, you get about 50 generations.

[00:19:52] [SPEAKER_01]: Then you got to pay for a monthly thing to kind of whatever.

[00:19:55] [SPEAKER_01]: And here that's not the case.

[00:19:57] [SPEAKER_01]: And a lot of that is happening on the device.

[00:19:59] [SPEAKER_01]: I couldn't get a clear answer as far as whether the image generation is entirely on the device or if it's putting part of that into the cloud.

[00:20:07] [SPEAKER_01]: I did have someone say that it was on device, but then Mishaal Rahman countered that a little bit later with some information he got.

[00:20:14] [SPEAKER_01]: So, I'm not entirely sure there.

[00:20:15] [SPEAKER_01]: But I do think that it's notable that some of the stuff that we're used to paying for if we want to play around with it, which this is a feature that's really playground-type stuff.

[00:20:25] [SPEAKER_01]: It's sandbox-type stuff.

[00:20:26] [SPEAKER_01]: It's not important necessarily.

[00:20:28] [SPEAKER_01]: But it's fun and it's enjoyable.

[00:20:30] [SPEAKER_01]: And the fact that you don't have to pay extra for it I think is good.

[00:20:35] [SPEAKER_01]: I'm happy to hear that.

[00:20:36] [SPEAKER_01]: Yeah.

[00:20:37] Yeah.

[00:20:37] Cool.

[00:20:38] [SPEAKER_01]: But you do have to pay extra for some of the other features.

[00:20:41] [SPEAKER_00]: Well, if you want the Mondo Gemini on the phone, I think you have to pay $20 a month, which is not cheap.

[00:20:48] [SPEAKER_00]: Yeah.

[00:20:49] [SPEAKER_00]: $20 a month for Gemini Advanced.

[00:20:52] [SPEAKER_01]: Right.

[00:20:52] [SPEAKER_01]: If you get some of the new phones, though, you get a year's – and this is smart – you get a year's worth of Gemini Advanced.

[00:20:59] [SPEAKER_01]: Oh, you do?

[00:21:00] [SPEAKER_01]: Oh, I didn't hear that yesterday.

[00:21:01] [SPEAKER_01]: And you get two terabytes of cloud storage for a year.

[00:21:05] [SPEAKER_01]: And both of those are really smart for Google to put in there because once you have that stuff and once you start to rely on it and start to use it, it's going to be really hard for you to undo that and go back.

[00:21:18] [SPEAKER_01]: Especially the storage.

[00:21:20] [SPEAKER_01]: Once you're filling that storage with things you couldn't have stored before, you're kind of forced to make the decision.

[00:21:30] [SPEAKER_01]: Yeah.

[00:21:30] [SPEAKER_01]: Like, do I want to continue this?

[00:21:32] [SPEAKER_01]: And you probably –

[00:21:33] [SPEAKER_00]: But eventually, like I just – after how many years?

[00:21:36] [SPEAKER_00]: I don't know.

[00:21:37] [SPEAKER_00]: I just canceled Dropbox.

[00:21:39] [SPEAKER_00]: I hadn't put anything in it.

[00:21:40] [SPEAKER_00]: I hadn't gotten anything out of it in probably five years.

[00:21:42] [SPEAKER_00]: Were you paying on a monthly basis?

[00:21:43] [SPEAKER_00]: I was paying all this time.

[00:21:44] [SPEAKER_01]: Oh, my goodness.

[00:21:45] [SPEAKER_01]: Like my father with AOL.

[00:21:48] [SPEAKER_01]: So I just probably canceled.

[00:21:49] [SPEAKER_01]: You just had – I wonder how much money you've spent over the years not even using it.

[00:21:53] [SPEAKER_01]: I don't know.

[00:21:53] [SPEAKER_01]: Like there's a service that'll do this for you.

[00:21:56] [SPEAKER_01]: It'll take a look at all your subscriptions and allow you to do that.

[00:21:58] [SPEAKER_01]: Also, getting your credit card lost will do that for you.

[00:22:02] [SPEAKER_01]: Yeah.

[00:22:02] [SPEAKER_01]: Right.

[00:22:04] [SPEAKER_01]: Yeah.

[00:22:05] [SPEAKER_01]: So it was cool.

[00:22:06] [SPEAKER_01]: It was interesting to kind of walk around and see all the different examples.

[00:22:09] [SPEAKER_01]: We don't have time to talk about every single thing.

[00:22:12] [SPEAKER_01]: But the Pixel Watch and a lot of the data analytics that they're – I thought that was really impressive.

[00:22:18] [SPEAKER_01]: They're really leaning in heavily on their purchase of Fitbit and all the data that they now are streaming in from the wearable

[00:22:26] [SPEAKER_01]: and going real deep on using AI to do advanced analytics about all these things that people who are really nerdy about their fitness data,

[00:22:34] [SPEAKER_01]: they're going to find a lot to really enjoy based on the demos that I saw on all of the information, how it's parsed, how it's visualized.

[00:22:42] [SPEAKER_01]: I mean it's more than I would want out of my fitness regimen.

[00:22:46] [SPEAKER_01]: Right.

[00:22:46] [SPEAKER_01]: But I know there are people out there that really love that stuff and they're going to get their money's worth there.

[00:22:52] [SPEAKER_00]: Cool.

[00:22:53] [SPEAKER_00]: Cool.

[00:22:53] [SPEAKER_00]: I'm glad you were there.

[00:22:54] [SPEAKER_00]: Yeah, me too.

[00:22:55] [SPEAKER_00]: It's a legit AI story.

[00:22:57] [SPEAKER_00]: It's very much a legit AI story.

[00:22:59] [SPEAKER_00]: Oh, yeah.

[00:22:59] [SPEAKER_00]: It's AI at the consumer level.

[00:23:01] [SPEAKER_00]: And so it'll be interesting to be able to cover not just the high level of models but the application layer.

[00:23:08] [SPEAKER_01]: Yeah.

[00:23:08] [SPEAKER_01]: Yeah, for sure.

[00:23:09] [SPEAKER_01]: Well, and kind of what we're – on the consumer level, it was a couple of months ago that GPT-4o was unveiled and with that voice model that they're like,

[00:23:20] [SPEAKER_01]: everybody's going to get used to talking to voices.

[00:23:22] [SPEAKER_01]: You put in a couple of – or talking to chatbots.

[00:23:24] [SPEAKER_01]: You put in a couple of stories that I labeled, I think I like you, AI-ish.

[00:23:33] [SPEAKER_01]: In three parts.

[00:23:36] [SPEAKER_01]: Sticking first with Gemini Live just for a second, Joanna Stern from The Wall Street Journal spoke with Rick Osterloh,

[00:23:43] [SPEAKER_01]: who is the head of Android and hardware and all things in that department over at Google, about the new chatbot that is Gemini Live.

[00:23:52] [SPEAKER_01]: Headline, of course, says, I almost forgot it was a bot.

[00:23:55] [SPEAKER_01]: And Osterloh in the interview reiterated to Joanna, not meant to become a relationship.

[00:24:01] [SPEAKER_01]: It's more meant to be a collaborator to get stuff done.

[00:24:04] [SPEAKER_01]: But okay.

[00:24:06] [SPEAKER_01]: But time and time again, this idea, this topic comes up.

[00:24:10] [SPEAKER_01]: As these chatbots become more human-like, as the push is to make the chatbots more human in the kind of conversant nature of them,

[00:24:21] [SPEAKER_01]: the more we as humans might find ourselves getting attached to chatbots.

[00:24:28] [SPEAKER_01]: I mean, I don't know if I would ever fall into that category.

[00:24:32] [SPEAKER_01]: I'm not saying I wouldn't.

[00:24:33] [SPEAKER_01]: I don't know for sure.

[00:24:35] [SPEAKER_01]: I don't know anything for sure.

[00:24:37] [SPEAKER_01]: I would guess that I wouldn't.

[00:24:38] [SPEAKER_01]: But somebody out there would and will.

[00:24:40] [SPEAKER_00]: Oh, yeah.

[00:24:41] [SPEAKER_00]: Yeah, they will.

[00:24:41] [SPEAKER_00]: I just don't see it happening at a relationship level.

[00:24:43] [SPEAKER_00]: I see it happening at a help level.

[00:24:46] [SPEAKER_00]: It's like you have a really good assistant.

[00:24:47] [SPEAKER_00]: I get that.

[00:24:49] [SPEAKER_00]: I could get addicted to saying, oh, they do that for me.

[00:24:52] [SPEAKER_00]: That application, that agent does that for me.

[00:24:55] [SPEAKER_00]: I get being kind of attached to that, but not at an emotional level, not at a relationship level.

[00:25:01] [SPEAKER_00]: Yeah.

[00:25:03] [SPEAKER_00]: And I'm not trying to call it.

[00:25:04] [SPEAKER_00]: The word I like least in technological criticism is creepy because it's a meanest word.

[00:25:10] [SPEAKER_00]: It doesn't really say anything.

[00:25:12] [SPEAKER_00]: I really do try to avoid it.

[00:25:13] [SPEAKER_00]: Yeah.

[00:25:14] [SPEAKER_00]: But I start to get that feeling if a company tries hard to replace your friends, as we saw with the Friend device a few weeks ago on the show,

[00:25:25] [SPEAKER_00]: but as Eric used to call it, it does start to tickle the creepy line.

[00:25:29] [SPEAKER_01]: Yeah.

[00:25:30] [SPEAKER_01]: Yeah.

[00:25:30] [SPEAKER_01]: Well, these other two stories definitely dive into exactly what you're talking about.

[00:25:34] [SPEAKER_01]: We've got OpenAI's GPT-4o, which, by the way, Google seems to have beaten to the punch with its wide rollout of Gemini Live.

[00:25:44] [SPEAKER_01]: So there's that.

[00:25:45] [SPEAKER_01]: I think the GPT-4o voice interaction model is still in more of a limited demo or beta state at this point, if I'm not mistaken.

[00:25:55] [SPEAKER_01]: But OpenAI posted on its system card blog about red teaming its chatbot and looking at risk assessment for GPT-4o, one of those being anthropomorphization.

[00:26:14] [SPEAKER_01]: Very good.

[00:26:15] [SPEAKER_01]: Very good.

[00:26:16] [SPEAKER_01]: Yeah, you got it.

[00:26:17] [SPEAKER_01]: Anthropomorphization.

[00:26:18] [SPEAKER_01]: There we go.

[00:26:19] [SPEAKER_01]: I had one extra syllable.

[00:26:21] [SPEAKER_01]: And emotional resilience.

[00:26:23] [SPEAKER_01]: Or, sorry, reliance.

[00:26:25] [SPEAKER_01]: So emotional reliance.

[00:26:26] [SPEAKER_01]: Relying on it for emotional support or connecting with it in that anthropomorphized way to where we begin to potentially or people begin to have some sort of emotional connection to it.

[00:26:40] [SPEAKER_01]: They said during our early testing, we observed users using language that might indicate forming connections with the model.

[00:26:47] [SPEAKER_01]: For example, this includes language expressing shared bonds such as, this is our last day together.

[00:26:54] [SPEAKER_01]: Oh, I pine for the day, dear AI.

[00:26:59] [SPEAKER_01]: Also points out, and I think this is really important, the deferential nature of AI chatbots.

[00:27:05] [SPEAKER_01]: Yes.

[00:27:05] [SPEAKER_01]: And how that could impact users in real life relationships.

[00:27:10] [SPEAKER_01]: Things like interrupting.

[00:27:11] [SPEAKER_01]: Things like, you know, men, and I think this is one example that has been pointed out is, you know, men having a chatbot that will never say no to them.

[00:27:23] [SPEAKER_01]: Right.

[00:27:23] [SPEAKER_01]: And what does that do?

[00:27:24] [SPEAKER_01]: And I guess that goes for anyone.

[00:27:26] [SPEAKER_01]: But, you know, often the example is, like, how does that impact them outside of that relationship with their actual human relationships if this is kind of a practice ground for relationships for some people potentially?

[00:27:40] [SPEAKER_00]: Yeah.

[00:27:41] [SPEAKER_00]: As you're talking about this, what fascinates me is I think back to the early days of using computers at a consumer level.

[00:27:50] [SPEAKER_00]: You had to give the exact right answer every semicolon in the right spot or you wouldn't get what you wanted.

[00:27:57] [SPEAKER_00]: And if you didn't get what you wanted out of the computer, it was up to you to figure out what you did wrong.

[00:28:02] [SPEAKER_00]: Because there was a right answer waiting there for you.

[00:28:04] [SPEAKER_00]: But unless you gave the right command, you would not get what you wanted.

[00:28:10] [SPEAKER_00]: Right?

[00:28:11] [SPEAKER_00]: Now we have machines that are built to always try to give you what you want, even if they actually shouldn't because they're wrong, but they do.

[00:28:21] [SPEAKER_00]: And I don't know.

[00:28:22] [SPEAKER_00]: I got to think that through a little more, but it's fundamentally a different relationship with the machine.

[00:28:26] [SPEAKER_00]: It is.

[00:28:27] [SPEAKER_00]: It is.

[00:28:28] [SPEAKER_01]: Yeah.

[00:28:29] [SPEAKER_01]: It's, yeah, it's really interesting.

[00:28:31] [SPEAKER_01]: I mean, an AI is never going to tell you no.

[00:28:34] [SPEAKER_01]: I mean, unless there is a guardrail built in specifically for a certain thing, an AI is like an LLM or a voice chatbot or whatever.

[00:28:44] [SPEAKER_01]: Their job is to give you an answer.

[00:28:46] Yes.

[00:28:48] [SPEAKER_01]: And in a real human relationship, we could overstep a boundary and that person is going to say, no, you know what?

[00:28:54] [SPEAKER_01]: I'm not going to do that or I'm not going to talk to you anymore.

[00:28:57] [SPEAKER_01]: That's crossing a line.

[00:28:59] [SPEAKER_01]: And AI is just not in the nature of the way these chatbots are built to hit that line unless they're specifically told they need to.

[00:29:07] [SPEAKER_01]: Yes, exactly.

[00:29:08] [SPEAKER_00]: Which we've discussed many times.

[00:29:10] [SPEAKER_00]: It's impossible to imagine every case where you try to lead them astray.

[00:29:13] [SPEAKER_00]: And even if you just tried to ask a database, you know, LexisNexis for a fact and nope, I don't have the fact.

[00:29:23] [SPEAKER_00]: These machines will make it up.

[00:29:25] [SPEAKER_00]: They will always try to please you.

[00:29:27] [SPEAKER_00]: So to an egotistical species such as ours, especially our half of it, Jason, as men.

[00:29:34] [SPEAKER_00]: Yeah.

[00:29:34] [SPEAKER_00]: I think there's some sociological risk.

[00:29:41] [SPEAKER_01]: Yeah.

[00:29:42] [SPEAKER_01]: Some caution to heed potentially.

[00:29:44] [SPEAKER_01]: Yeah.

[00:29:45] [SPEAKER_00]: However, however, I also think in the OpenAI story, OpenAI is constantly playing this X-Risk game where we are doing something wonderful.

[00:29:56] [SPEAKER_00]: Oh, it's so wonderful.

[00:29:56] [SPEAKER_00]: It's dangerous.

[00:29:58] [SPEAKER_00]: Yes.

[00:29:58] [SPEAKER_00]: And we're so powerful.

[00:29:59] [SPEAKER_00]: You can't imagine our power.

[00:30:00] [SPEAKER_00]: But we'll be okay with it because we'll control it.

[00:30:03] [SPEAKER_00]: It's too great.

[00:30:04] [SPEAKER_00]: Yeah.

[00:30:04] [SPEAKER_00]: We'll help you.

[00:30:05] [SPEAKER_00]: Oh, we are so great.

[00:30:06] [SPEAKER_00]: Through its greatness.

[00:30:07] [SPEAKER_00]: We can solve this for you.

[00:30:09] [SPEAKER_00]: So there's a grain of salt the size of Utah now when I see these kind of pronouncements out of OpenAI.

[00:30:17] [SPEAKER_01]: Yeah.

[00:30:17] [SPEAKER_01]: Yeah.

[00:30:18] [SPEAKER_01]: Fair.

[00:30:18] [SPEAKER_01]: Totally fair.

[00:30:19] [SPEAKER_01]: And then there's the company Replika, whose CEO is Eugenia Kuyda.

[00:30:26] [SPEAKER_01]: I'm sorry.

[00:30:27] [SPEAKER_01]: If I got your last name wrong, I probably did.

[00:30:31] [SPEAKER_01]: K-U-Y-D-A, folks.

[00:30:31] [SPEAKER_01]: Yeah.

[00:30:31] [SPEAKER_01]: Yeah.

[00:30:31] [SPEAKER_01]: There you go.

[00:30:32] [SPEAKER_01]: Spoke with The Verge's Nilay Patel on the Decoder Podcast.

[00:30:35] [SPEAKER_01]: Talks a lot about virtual relationships.

[00:30:39] [SPEAKER_01]: You know, as we are right now, there were, by the way, I searched the transcript, 56 mentions of the word relationships.

[00:30:46] [SPEAKER_01]: So this is a real big deal in this side of AI.

[00:30:50] [SPEAKER_01]: And I guess that's, you know, largely what Replika is kind of about is creating these chatbots that you can converse with for all different kinds of things.

[00:31:00] [SPEAKER_01]: That does include, you know, there are some like NSFW chatbots.

[00:31:03] [SPEAKER_01]: They had removed that capability for a while and then apparently put that back in when people, you know, revolted because they wanted that in there.

[00:31:11] [SPEAKER_01]: And I think the reason that they revolted is because removing it was bad for their mental state, you know, which may allude to the fact that they had come to rely on or have some sort of dependence on this virtual relationship.

[00:31:28] [SPEAKER_01]: That once they don't have it, you know, it's bad for my mental state.

[00:31:32] [SPEAKER_01]: All this stuff was discussed in the interview.

[00:31:35] [SPEAKER_01]: Kuyda believes the idea of having marriage-like relationships with chatbots is acceptable given that it improves emotional well-being and happiness.

[00:31:50] [SPEAKER_00]: Until you realize that you're sad that you don't really have a human being and this is all you have.

[00:31:55] [SPEAKER_00]: I'm not sure about that. I don't think there's any data that could possibly let them say they know what the end result is going to be.

[00:32:03] [SPEAKER_01]: Oh, yeah.

[00:32:04] [SPEAKER_01]: At this stage, especially.

[00:32:05] [SPEAKER_01]: I mean, I think there's probably some people who would be okay with that fact.

[00:32:09] [SPEAKER_01]: You know, some people were just like, you know what?

[00:32:12] [SPEAKER_01]: Kind of going back to what we were saying.

[00:32:14] [SPEAKER_01]: A human, there's potential in human relationships for very uncomfortable moments.

[00:32:20] [SPEAKER_01]: A human could say no to me, and I don't like that.

[00:32:23] [SPEAKER_01]: This relationship, I never hear the word no.

[00:32:26] [SPEAKER_01]: I'm never put into an uncomfortable position.

[00:32:29] [SPEAKER_00]: And Nilay asks, how mean can you be to a Replika before it leaves you?

[00:32:34] [SPEAKER_00]: And she replies, I think the beauty of this technology is that it doesn't leave you.

[00:32:39] [SPEAKER_00]: And it shouldn't.

[00:32:39] [SPEAKER_01]: Yeah.

[00:32:40] [SPEAKER_01]: And what does that reinforce?

[00:32:42] [SPEAKER_01]: Right.

[00:32:42] [SPEAKER_01]: What does that reinforce in a person if this is kind of a practice ground for relationships, which we've heard some people say.

[00:32:50] [SPEAKER_01]: Like, oh, this can help people who are uncomfortable in relationships kind of get some practice and maybe eventually transition into the real world relationships with what they learn.

[00:33:00] [SPEAKER_01]: This would instill very bad habits, potentially.

[00:33:04] [SPEAKER_00]: Yeah.

[00:33:05] [SPEAKER_00]: And, you know, I'm getting a foreshadowing of the moral panic to come here because you look at what was thought about violence in movies, violence in television, violence in video games, violence in music lyrics.

[00:33:20] [SPEAKER_00]: Right.

[00:33:20] [SPEAKER_00]: All those things where somehow the machine would corrupt us.

[00:33:25] [SPEAKER_00]: And in this case, it's not the violence in the program.

[00:33:30] [SPEAKER_00]: It's the tolerance it will have for it.

[00:33:33] [SPEAKER_00]: Yeah.

[00:33:33] [SPEAKER_00]: Right.

[00:33:35] [SPEAKER_00]: And so I can't be mean to human beings, but I'm going to be mean as a son of a bitch to you machine.

[00:33:39] [SPEAKER_00]: I didn't mean to start an accent there.

[00:33:44] [SPEAKER_00]: And that's interesting because the thing that fascinates about AI, and this is what I'm working on in other projects in the future, is how it reflects on us.

[00:33:53] [SPEAKER_03]: Mm-hmm.

[00:33:54] [SPEAKER_00]: It doesn't do it itself.

[00:33:56] [SPEAKER_00]: It doesn't have this essence.

[00:33:57] [SPEAKER_00]: It has no intent.

[00:33:58] [SPEAKER_00]: But by what we have it do or by what we do to it, it reflects on us.

[00:34:04] [SPEAKER_00]: Yeah.

[00:34:05] [SPEAKER_01]: Yeah.

[00:34:05] [SPEAKER_00]: We'll see where this goes.

[00:34:07] [SPEAKER_00]: Totally true.

[00:34:08] [SPEAKER_00]: Yeah.

[00:34:09] [SPEAKER_00]: I'm glad I'm already married.

[00:34:10] [SPEAKER_00]: I ain't marrying the Replika.

[00:34:13] [SPEAKER_01]: No, neither am I.

[00:34:14] [SPEAKER_01]: Neither am I.

[00:34:16] [SPEAKER_01]: All right.

[00:34:16] [SPEAKER_01]: We are going to take a quick break.

[00:34:18] [SPEAKER_01]: Then when we come back, we're going to talk a little bit about a California AI bill that people are very divided on.

[00:34:27] [SPEAKER_01]: That's coming up in a second.

[00:34:56] [SPEAKER_01]: All right.

[00:35:01] [SPEAKER_01]: So I'm here in sunny California.

[00:35:04] [SPEAKER_01]: I'm looking out the window.

[00:35:04] [SPEAKER_01]: It's nice and sunny outside.

[00:35:07] [SPEAKER_01]: And I think that's just what you say when here in beautiful California, there is an AI bill called the Safe and Secure Innovation for Frontier Artificial Intelligence Models.

[00:35:19] [SPEAKER_01]: That act.

[00:35:19] [SPEAKER_01]: Which stands for?

[00:35:22] [SPEAKER_01]: S-S-I-M-A-I-M-A.

[00:35:27] [SPEAKER_01]: Is it?

[00:35:27] [SPEAKER_01]: It's not a word at all.

[00:35:28] [SPEAKER_01]: They didn't go through all of that.

[00:35:29] [SPEAKER_01]: They didn't come up with an acronym.

[00:35:30] [SPEAKER_01]: I know.

[00:35:31] [SPEAKER_01]: I'm reading it.

[00:35:31] [SPEAKER_01]: Yes.

[00:35:32] [SPEAKER_01]: I mean, it's S-S-I-F.

[00:35:34] [SPEAKER_01]: Sif.

[00:35:34] [SPEAKER_01]: So Sif.

[00:35:36] [SPEAKER_01]: Sif.

[00:35:36] [SPEAKER_01]: Sif.

[00:35:37] [SPEAKER_01]: Sif.

[00:35:37] [SPEAKER_01]: Sif.

[00:35:37] [SPEAKER_01]: Sifama.

[00:35:38] [SPEAKER_01]: You know Sifama.

[00:35:40] [SPEAKER_01]: Now I'm hearing the word Sifama in a different light.

[00:35:43] [SPEAKER_01]: Anyways, it highlights critical harms, in quotes.

[00:35:47] [SPEAKER_01]: Things like cyber attacks, mass casualties, that line of stuff. It would make developers liable for enforcing safety measures to protect against these critical harms.

[00:36:05] [SPEAKER_01]: People who support this say establishing guardrails ahead of a catastrophe is a good thing.

[00:36:12] [SPEAKER_01]: That includes the Center for AI Safety, which says public opinion polls show the majority of Californians support this legislation.

[00:36:19] [SPEAKER_01]: And opponents say it stifles innovation.

[00:36:22] [SPEAKER_01]: Of course, a lot of people in the world of technology are opposing this because it would slow down their development of the AI systems that they're building.

[00:36:33] [SPEAKER_00]: And it's not just that.

[00:36:35] [SPEAKER_00]: I mean, we've talked about – I've talked about guardrails on this show a lot.

[00:36:38] [SPEAKER_00]: Yeah.

[00:36:38] [SPEAKER_00]: Because as I've thought, this show makes me think about this stuff.

[00:36:42] [SPEAKER_00]: So as I've thought about it, I think more and more that guardrails at the model level are impossible, virtually impossible.

[00:36:48] [SPEAKER_00]: There are certain things that you can train it to understand what child porn is and train it not to make it.

[00:36:53] [SPEAKER_00]: But you can well imagine people who find ways around that and find ways around any guardrail.

[00:36:59] [SPEAKER_00]: And so it's false security to think that you can have a guardrail and then to legislatively mandate the guardrails is even worse false security, one.

[00:37:11] [SPEAKER_00]: Two, it puts upon the developers a requirement that they certify the safety of their model under penalty of criminal perjury.

[00:37:24] [SPEAKER_00]: Yeah.

[00:37:25] [SPEAKER_00]: So what that means, I think, is a few things.

[00:37:27] [SPEAKER_00]: One is I think a lot of the development in AI will leave California, that the headquarters of it will be elsewhere.

[00:37:35] [SPEAKER_00]: And that I think just as if certain – there's two bills now that I've talked about before on news in California.

[00:37:41] [SPEAKER_00]: If either of them passes, Meta will cut off news in the state of California as it did in the country of Canada.

[00:37:48] [SPEAKER_00]: I could see some AIs, ironically, for the headquarters state of technology, being blockaded in California if a law like this passes.

[00:38:00] [SPEAKER_00]: I also see a danger to open source because the belief against open source is, well, you can get around the guardrails.

[00:38:07] [SPEAKER_00]: Well, now if you're making a model and you open source it, you even more than ever can't control what's done with it.

[00:38:15] [SPEAKER_00]: And so you're not going to open source it.

[00:38:17] [SPEAKER_00]: And so it becomes regulatory capture for the big guys.

[00:38:19] [SPEAKER_00]: It is – I hesitate to say well-intentioned, but it's well-intentioned legislation that is ignorant, has unintended consequences out the yin-yang, and just doesn't understand, I think, the stack. Stack is the word of the day.

[00:38:35] [SPEAKER_00]: Yeah.

[00:38:36] [SPEAKER_00]: That the model level, the application level, the user level.

[00:38:39] [SPEAKER_00]: They haven't thought through where the bad stuff could happen.

[00:38:42] [SPEAKER_00]: They presume it's the technology level.

[00:38:43] [SPEAKER_00]: It's like saying to Johannes Gutenberg, nobody better ever do anything bad with this and you're responsible for the rest of eternity, Johannes.

[00:38:50] [SPEAKER_00]: You're going to burn in hell, which he may be doing because Martin Luther came later, for what your tool and technology does.

[00:38:57] [SPEAKER_00]: It's ignorant of history.

[00:39:00] [SPEAKER_00]: And I hate this bill.

[00:39:01] [SPEAKER_00]: So Fei-Fei Li, who's called the godmother of AI, has written against it.

[00:39:06] [SPEAKER_00]: But then you see Gary Marcus is coming back kind of begging her to be in favor of parts of this.

[00:39:11] [SPEAKER_00]: So it's a very real debate that's occurring.

[00:39:14] [SPEAKER_01]: Yeah.

[00:39:14] [SPEAKER_01]: And you mentioned the threat to open source.

[00:39:18] [SPEAKER_01]: And Andrew Ng also called it an assault on open source because of that open and easily modifiable nature of open source.

[00:39:29] [SPEAKER_01]: Right.

[00:39:29] [SPEAKER_01]: It's also very vague is what a lot of people are saying as well as a criticism.

[00:39:35] [SPEAKER_01]: Like where is that line drawn between acceptable and potentially harmful?

[00:39:39] [SPEAKER_01]: It seems like a goalpost that could move in any direction on a whim.

[00:39:45] [SPEAKER_01]: Is there an easy way to really truly define this or even put in controls?

[00:39:50] [SPEAKER_01]: Because like you said, guardrails, it's impossible to cover every step when you're talking about this technology.

[00:39:56] [SPEAKER_01]: So then does that mean that the line is AI is no good?

[00:40:00] [SPEAKER_01]: You can't use it because you can't put in definitives.

[00:40:03] [SPEAKER_01]: Everything becomes an easy target.

[00:40:05] [SPEAKER_00]: Yeah.

[00:40:06] [SPEAKER_00]: And it's a problem with any law in the United States.

[00:40:08] [SPEAKER_00]: Any law that is too vague to understand and follow is unconstitutional by its nature.

[00:40:15] [SPEAKER_00]: And so this idea of AI safety couldn't be vaguer.

[00:40:20] [SPEAKER_00]: Define safety in all realms.

[00:40:24] [SPEAKER_00]: And you make safe cars now to the extent that they have seatbelts and they don't blow up.

[00:40:30] [SPEAKER_00]: It's not the Corvair and so on.

[00:40:33] [SPEAKER_01]: But they still get into accidents.

[00:40:34] [SPEAKER_01]: But they still die in them.

[00:40:36] [SPEAKER_01]: I mean, so where is that line drawn?

[00:40:39] [SPEAKER_01]: Right.

[00:40:41] [SPEAKER_01]: Tomorrow, August 15th, at least at the time of this recording, the bill is sent to the California Assembly floor.

[00:40:49] [SPEAKER_01]: It's expected to pass there.

[00:40:55] [SPEAKER_01]: Then it would go to Gavin Newsom's desk for signature.

[00:40:55] [SPEAKER_01]: But there is no word on his stance with the bill.

[00:40:59] [SPEAKER_01]: If he signs it, it would go into effect no earlier than 2026.

[00:41:04] [SPEAKER_01]: And I'm sure there will be a whole host of legal challenges leading up to that, too, because there's a lot of people with a lot of money and a lot of intention around this technology that is moving so much of the industry right now to not allow something like this to happen.

[00:41:18] [SPEAKER_01]: And, yeah, I think you're right.

[00:41:19] [SPEAKER_01]: If in the end this is just the way it is, it's going to move a lot of this technology and these companies out of the state for sure.

[00:41:27] [SPEAKER_00]: Yeah.

[00:41:28] [SPEAKER_00]: And even though I'm sure the way California operates is that even if it's made elsewhere, you're still liable in California.

[00:41:33] [SPEAKER_00]: But you sure don't want to make it there and use it there.

[00:41:36] [SPEAKER_00]: You want to be able to put a fence around use alone.

[00:41:41] [SPEAKER_00]: Yep.

[00:41:41] [SPEAKER_01]: Yep.

[00:41:42] [SPEAKER_01]: That's absolutely part of it.

[00:41:45] [SPEAKER_01]: Shifting gears a little bit, I just thought this was kind of interesting.

[00:41:48] [SPEAKER_01]: But kind of speaking about Meta, because we've been talking about it a little bit with the open source conversation.

[00:41:53] [SPEAKER_01]: But Meta has a new AI tool for creating 3D imagery called VFusion3D.

[00:42:01] [SPEAKER_01]: And so, you know, think characters, product mock-ups, sculptures, et cetera.

[00:42:07] [SPEAKER_01]: You can turn any image, flat image, into a 3D model.

[00:42:11] [SPEAKER_01]: You just take an image, turn it into 3D.

[00:42:14] [SPEAKER_01]: It makes it spinnable and zoomable and all the things within seconds.

[00:42:19] [SPEAKER_01]: And, yes, it is open source.

[00:42:22] [SPEAKER_01]: That is super cool.

[00:42:23] [SPEAKER_01]: Yeah, it is.

[00:42:24] [SPEAKER_01]: Like, you know, it could be a big deal for game developers, architects, Hollywood characters, product designers, all sorts of things.

[00:42:33] [SPEAKER_01]: Yeah, that's amazing.

[00:42:34] [SPEAKER_01]: That's really cool.
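To make that single-image-to-3D workflow a little more concrete, here is a minimal sketch, in Python, of grabbing the openly released weights from Hugging Face. The repo id "facebook/vfusion3d" and the inference outline in the comments are assumptions based on the model being open source, not steps described in the episode.

    # Hypothetical sketch: download Meta's open-source VFusion3D checkpoint and outline
    # the single-image -> 3D workflow described above. The repo id is an assumption.
    from huggingface_hub import snapshot_download

    # Pull the released checkpoint to a local folder.
    local_dir = snapshot_download(repo_id="facebook/vfusion3d")

    # From here, the model repository ships its own loading and inference scripts; conceptually:
    #   1. load the checkpoint from local_dir
    #   2. feed in one flat image (for example, a show logo PNG)
    #   3. get back a 3D asset you can spin and zoom within seconds
    print(f"VFusion3D checkpoint downloaded to: {local_dir}")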

[00:42:35] [SPEAKER_01]: So I tried playing around with this earlier and I could not get it to work, but they do have a demo up.

[00:42:40] [SPEAKER_01]: I was hoping it was going to be your demo.

[00:42:42] [SPEAKER_01]: So I'm going to try and load the AI Inside logo in.

[00:42:46] [SPEAKER_01]: And, yeah, you see, this application is currently busy.

[00:42:48] [SPEAKER_01]: Try again later.

[00:42:49] [SPEAKER_01]: I wanted to create a 3D image of the AI Inside logo, but apparently it's not going to let me do it.

[00:42:55] [SPEAKER_01]: I've tried earlier today multiple times and I keep getting the same error message.

[00:42:59] [SPEAKER_01]: So even the –

[00:43:01] [SPEAKER_01]: You know what?

[00:43:01] [SPEAKER_00]: I think it saved you from yourself, Jason.

[00:43:03] [SPEAKER_00]: I think it would have been cheesy.

[00:43:05] [SPEAKER_01]: Oh, probably so.

[00:43:07] [SPEAKER_01]: But, you know, it's an easy target to throw our logo in there and see what it comes up with.

[00:43:13] [SPEAKER_01]: Who knows?

[00:43:14] [SPEAKER_01]: Maybe we could start selling like AI Inside bobbleheads of the logo.

[00:43:19] [SPEAKER_01]: I don't know.

[00:43:19] [SPEAKER_01]: Yeah, of us.

[00:43:21] [SPEAKER_01]: Of us.

[00:43:22] [SPEAKER_01]: Oh, that's – oh, see?

[00:43:24] [SPEAKER_01]: We started something.

[00:43:25] [SPEAKER_01]: Yeah.

[00:43:26] [SPEAKER_01]: With the bobbleheads.

[00:43:27] [SPEAKER_01]: Saw a few of those at the studio the other day as they're clearing it out.

[00:43:31] [SPEAKER_01]: Oh, right.

[00:43:32] [SPEAKER_01]: Right.

[00:43:33] [SPEAKER_01]: Yeah, just a couple of – only a few.

[00:43:36] [SPEAKER_01]: It's not like there was a ton of them, but anyways, I did see them.

[00:43:40] [SPEAKER_01]: How's the AI thing doing?

[00:43:43] [SPEAKER_01]: And what I'm talking about is how AI is, as we've talked about many times, layered.

[00:43:49] [SPEAKER_01]: Somebody in chat said layered.

[00:43:50] [SPEAKER_01]: Layered and everything.

[00:43:51] [SPEAKER_01]: I'm like, oh, maybe that's better than sprinkled.

[00:43:54] [SPEAKER_01]: I think it is.

[00:43:54] [SPEAKER_00]: It's kind of the lasagna structure, yeah.

[00:43:57] [SPEAKER_01]: Okay.

[00:43:57] [SPEAKER_01]: I'm going to switch to that from now on until the next word comes along that I like.

[00:44:02] [SPEAKER_01]: We've got a couple of stories that are talking about kind of the brand marketing or the marketing impact of having AI in all of these things.

[00:44:11] [SPEAKER_01]: How is that going?

[00:44:13] [SPEAKER_01]: One study from the Journal of Hospitality Marketing and Management – oh, sorry.

[00:44:19] [SPEAKER_01]: I almost fell asleep there – showed that AI lowers consumer purchase intent, according to their study.

[00:44:29] [SPEAKER_01]: It asks the question, are consumers complacent?

[00:44:32] [SPEAKER_01]: Are they hesitant?

[00:44:34] [SPEAKER_01]: Like do they just not care about AI or are they worried about it?

[00:44:37] [SPEAKER_01]: I think it's probably a little bit of all things.

[00:44:39] [SPEAKER_01]: CNN writes it's about trust or lack thereof, that kind of erosion of trust that builds up when an AI is purported to do something and then someone finally gives it a go and it doesn't.

[00:44:52] [SPEAKER_01]: I don't know how many people I've talked to that have had that experience.

[00:44:55] [SPEAKER_01]: Like let's say the air quotes normals, the people outside of our tech circles.

[00:44:58] [SPEAKER_01]: They're like, oh, yeah, I used that Google Gemini AI thing.

[00:45:02] [SPEAKER_01]: Yeah, it really sucked.

[00:45:04] [SPEAKER_01]: It did – it's like they've heard about it enough to be excited by what it potentially could do so they try it out.

[00:45:10] [SPEAKER_01]: And all it takes is one time for them to realize, oh, that didn't do at all what I wanted it to do.

[00:45:14] [SPEAKER_01]: And so that trust is eroded.

[00:45:17] [SPEAKER_01]: And how is that for all this?

[00:45:22] [SPEAKER_01]: Yeah, anyways.

[00:45:23] [SPEAKER_00]: Or there's a fear of data.

[00:45:26] [SPEAKER_00]: AI is associated with data and transparency and a mind of its own and theft of content and you're stealing my soul.

[00:45:36] [SPEAKER_00]: Right.

[00:45:36] [SPEAKER_00]: And so I find it interesting that at the same time this study comes out, of course, what did Google do, and what's Apple going to do at its event? They're going to start screaming, AI in everything.

[00:45:48] [SPEAKER_00]: And I wonder whether there's a – I don't want to say backlash, but whether there's a hesitancy.

[00:45:54] [SPEAKER_00]: A response.

[00:45:55] [SPEAKER_00]: Yeah, yeah.

[00:45:55] [SPEAKER_00]: Kind of a reaction to that.

[00:45:58] [SPEAKER_01]: Yeah.

[00:46:00] [SPEAKER_00]: Before we get there, I just want to say as an aside here in the CNN story about the AI brand study, I want this job.

[00:46:07] [SPEAKER_00]: I'm now emeritus at CUNY.

[00:46:09] [SPEAKER_00]: I want to become the Taco Bell Distinguished Professor of Hospitality Business Management at Washington State University.

[00:46:16] [SPEAKER_00]: Wouldn't it be killer to be the Taco Bell Professor of anything?

[00:46:21] [SPEAKER_01]: This is a job that was created exactly for you.

[00:46:24] [SPEAKER_01]: For me.

[00:46:25] [SPEAKER_01]: Yeah, yeah.

[00:46:26] [SPEAKER_01]: Yeah.

[00:46:26] [SPEAKER_01]: Either Taco Bell or maybe Chipotle.

[00:46:29] [SPEAKER_01]: I don't know.

[00:46:30] [SPEAKER_01]: I think Taco Bell is cooler right now.

[00:46:32] [SPEAKER_01]: Yeah.

[00:46:32] [SPEAKER_01]: So you get a master's in burrito.

[00:46:35] [SPEAKER_00]: Yeah.

[00:46:37] [SPEAKER_01]: And free burritos for life.

[00:46:39] [SPEAKER_01]: I think so.

[00:46:40] [SPEAKER_01]: I think so.

[00:46:41] [SPEAKER_01]: Yeah.

[00:46:41] [SPEAKER_01]: I would be disappointed if that wasn't the case.

[00:46:44] [SPEAKER_01]: Sorry.

[00:46:45] [SPEAKER_01]: We had a little detour there.

[00:46:47] [SPEAKER_01]: No, honestly.

[00:46:48] [SPEAKER_01]: It's perfect.

[00:46:49] [SPEAKER_01]: It's perfect.

[00:46:49] [SPEAKER_01]: Another study by TV measurement firm iSpot shows how nearly half of tech companies' total TV commercial spend this year is about AI in some way.

[00:47:04] [SPEAKER_01]: And it's a large number, a large amount of money being spent on their advertising.

[00:47:09] [SPEAKER_01]: Half of that messaging all centered around artificial intelligence.

[00:47:14] [SPEAKER_01]: Of course, the Olympics was a really great example of this.

[00:47:16] [SPEAKER_01]: If you watched enough Olympics, you saw a ton of big tech talking about their AI.

[00:47:22] [SPEAKER_01]: Meta had its AI chatbot all over the place.

[00:47:24] [SPEAKER_01]: Google had its kind of notable moment where it had the Gemini writing that heartfelt letter.

[00:47:30] [SPEAKER_01]: Heartfelt letter for the kid to her personal hero.

[00:47:35] [SPEAKER_01]: And everybody responded saying, if it's so heartfelt, why are you going to have your AI write it for you?

[00:47:40] [SPEAKER_01]: Like, teach the kid a lesson.

[00:47:42] [SPEAKER_01]: Write from your heart yourself.

[00:47:44] [SPEAKER_01]: And they ended up pulling the ad in response to that.

[00:47:49] [SPEAKER_00]: But even yesterday, they still demonstrated more letter writing.

[00:47:53] [SPEAKER_00]: Yeah, they did.

[00:47:54] [SPEAKER_00]: Not personal, but memo writing kind of.

[00:47:55] [SPEAKER_01]: Yeah.

[00:47:56] [SPEAKER_01]: Which I'm happy that you bring that up because as I was watching that.

[00:47:59] [SPEAKER_01]: So the two examples that they gave, if I can remember them off the top of my head, one was you're a student in your college course and you're sick and you want to ask for extra time.

[00:48:12] [SPEAKER_01]: So write me a letter to my professor that says I was sick.

[00:48:15] [SPEAKER_01]: Can I have some extra time to finish my assignment?

[00:48:18] [SPEAKER_01]: What was the other one?

[00:48:19] [SPEAKER_01]: The other one was…

[00:48:20] [SPEAKER_01]: Forget the other one because that's the one that stuck with me, of course.

[00:48:22] [SPEAKER_01]: Yeah, yeah, yeah.

[00:48:23] [SPEAKER_01]: Oh, man.

[00:48:24] [SPEAKER_01]: Off the top of my head, I can't remember.

[00:48:27] [SPEAKER_01]: And it will come to me later.

[00:48:29] [SPEAKER_01]: I have it written down somewhere.

[00:48:30] [SPEAKER_01]: But the point that I kind of thought of at that moment is there's a lot of reaction to these things where it's like, oh, it's so inauthentic to have an AI LLM write this letter for you.

[00:48:47] [SPEAKER_01]: Yes, it happens in a couple of seconds.

[00:48:49] [SPEAKER_01]: But we're human beings.

[00:48:50] [SPEAKER_01]: Why can't we just write our own things?

[00:48:52] [SPEAKER_01]: And I guess the question that I'm kind of tossing around is are there certain types of communication that we encounter and involve ourselves with on a daily basis that don't require authenticity?

[00:49:05] [SPEAKER_01]: You know, that really just are more about the message than they are about being authentic.

[00:49:10] [SPEAKER_01]: And, you know, granted, I don't think that the child having an AI write a heartfelt letter to her hero is the right example of that.

[00:49:20] [SPEAKER_01]: But, oh, it was the broken AC.

[00:49:22] [SPEAKER_01]: It was the broken air conditioner by the tenant to the landlord.

[00:49:27] [SPEAKER_01]: And that was the other example.

[00:49:29] [SPEAKER_01]: It was like, you know, my air conditioner is broken.

[00:49:31] [SPEAKER_01]: I need to write my landlord a letter to say, can you fix it?

[00:49:35] [SPEAKER_01]: Can you address this?

[00:49:36] [SPEAKER_01]: Like, maybe I don't need to be authentic in that.

[00:49:39] [SPEAKER_01]: Maybe it really is about the detail.

[00:49:41] [SPEAKER_01]: Like, hey, this happened.

[00:49:43] [SPEAKER_01]: You need to fix it.

[00:49:44] [SPEAKER_00]: Well, and it's a good case where one does very easily imagine AI agent talking to AI agent.

[00:49:50] [SPEAKER_00]: Landlord says, I have a repair agent.

[00:49:51] [SPEAKER_00]: Totally.

[00:49:53] [SPEAKER_00]: Tenant says, hey, my agent, go tell that agent that I need the air conditioner fixed.

[00:49:57] [SPEAKER_00]: And then it schedules and it comes back and says, it'll be three o'clock.

[00:50:01] [SPEAKER_00]: No, that doesn't work.

[00:50:01] [SPEAKER_00]: Two o'clock, whatever.

[00:50:03] [SPEAKER_00]: Negotiates.

[00:50:03] [SPEAKER_00]: Do you have an agreement?

[00:50:04] [SPEAKER_00]: And boom.
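As a purely illustrative toy sketch of that agent-to-agent back and forth, here is one way the scheduling loop could look in Python; every class, method, and time slot below is invented for this example and is not based on any real agent product mentioned on the show.

    # Toy sketch of agent-to-agent scheduling like the tenant/landlord example above.
    # All names and slots here are invented purely for illustration.

    class RepairAgent:
        # The landlord's side: owns the repair crew's calendar.
        def __init__(self, open_slots):
            self.open_slots = set(open_slots)

        def propose(self):
            # Offer the earliest slot the crew has free, or None if nothing is left.
            return min(self.open_slots) if self.open_slots else None

        def confirm(self, slot):
            # Book the slot if it is still free.
            if slot in self.open_slots:
                self.open_slots.remove(slot)
                return True
            return False

    class TenantAgent:
        # The tenant's side: knows when the tenant is home.
        def __init__(self, available):
            self.available = set(available)

        def acceptable(self, slot):
            return slot in self.available

    def negotiate(tenant, landlord, max_rounds=5):
        # "Go tell that agent the air conditioner is broken": loop over proposals
        # until a slot works for both sides, or give up after a few rounds.
        for _ in range(max_rounds):
            slot = landlord.propose()
            if slot is None:
                return None
            if tenant.acceptable(slot) and landlord.confirm(slot):
                return slot
            landlord.open_slots.discard(slot)  # that one didn't work; try the next
        return None

    tenant = TenantAgent(available={"14:00", "16:00"})
    landlord = RepairAgent(open_slots={"13:00", "14:00", "15:00"})
    print("Agreed repair slot:", negotiate(tenant, landlord))  # expected: 14:00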

[00:50:05] [SPEAKER_00]: I can see that being agentic.

[00:50:08] [SPEAKER_00]: Or agentive, by the way.

[00:50:09] [SPEAKER_00]: This is something I learned yesterday.

[00:50:13] [SPEAKER_00]: I think it was Rick Osterloh who used the word agentive.

[00:50:16] [SPEAKER_00]: I've been using the word agentic.

[00:50:18] [SPEAKER_01]: Yeah.

[00:50:19] [SPEAKER_00]: Agentic is more about like a person being an agent.

[00:50:21] [SPEAKER_00]: Agentive is evidently the more proper.

[00:50:24] [SPEAKER_01]: Because it's agent-like.

[00:50:26] [SPEAKER_01]: Yes.

[00:50:27] [SPEAKER_01]: Right?

[00:50:28] [SPEAKER_01]: Yeah.

[00:50:28] [SPEAKER_00]: Or agentive.

[00:50:32] [SPEAKER_01]: It's acting in an agent sort of way.

[00:50:34] [SPEAKER_01]: Yes.

[00:50:35] [SPEAKER_01]: Yes.

[00:50:35] [SPEAKER_01]: Yes, exactly.

[00:50:35] [SPEAKER_01]: So agentive.

[00:50:36] [SPEAKER_01]: Oh, okay.

[00:50:37] [SPEAKER_01]: I like agentic.

[00:50:38] [SPEAKER_00]: It sounds better, but fine.

[00:50:39] [SPEAKER_01]: Yeah, it does.

[00:50:40] [SPEAKER_01]: It does to me too.

[00:50:41] [SPEAKER_01]: But that's interesting.

[00:50:42] [SPEAKER_01]: I didn't catch the differential there.

[00:50:45] [SPEAKER_01]: Yeah, it's interesting.

[00:50:47] [SPEAKER_01]: In the example that you just spelled out of these two agents going toe-to-toe on this problem,

[00:50:52] [SPEAKER_01]: as you're talking about it, I'm like, well, you know what?

[00:50:55] [SPEAKER_01]: If the problem gets solved, I'm open to that.

[00:50:58] [SPEAKER_01]: Amen.

[00:50:58] [SPEAKER_01]: You know what I mean?

[00:50:59] [SPEAKER_01]: Like, no one's time was wasted in that.

[00:51:01] [SPEAKER_01]: It was like, that thing told this thing what the problem was.

[00:51:05] [SPEAKER_01]: This thing made the solution happen.

[00:51:09] [SPEAKER_01]: And the people behind it were free to do.

[00:51:10] [SPEAKER_00]: The key to this is each agent has to be authorized.

[00:51:13] [SPEAKER_00]: Yeah.

[00:51:14] [SPEAKER_00]: Well, the problem is that now when you call into the airline and you know they're reading

[00:51:19] [SPEAKER_00]: the script to you, that means that the person you're talking to has not been given the authority.

[00:51:23] [SPEAKER_00]: They're just kind of trying to wear you down with corporate explanations.

[00:51:27] [SPEAKER_03]: Sure.

[00:51:27] [SPEAKER_00]: So whether you're a human agent or whether you're a computer agent, that doesn't work

[00:51:30] [SPEAKER_00]: because they're not authorized to solve the problem.

[00:51:34] [SPEAKER_00]: Yep.

[00:51:34] [SPEAKER_00]: But the question is, can agents be given the authority, trusted by the company in this case,

[00:51:40] [SPEAKER_00]: and also by the individual to solve the problem and to do that?

[00:51:43] [SPEAKER_00]: And that'll be interesting to see if we advance to that level.

[00:51:47] [SPEAKER_01]: Yeah.

[00:51:48] [SPEAKER_01]: Yeah.

[00:51:48] [SPEAKER_01]: Be curious to see.

[00:51:49] [SPEAKER_00]: Companies don't trust their own employees to solve the problems.

[00:51:53] [SPEAKER_00]: Maybe they'll trust them.

[00:51:54] [SPEAKER_00]: Maybe they trust the machine more because they can see the data and they're not listening

[00:51:57] [SPEAKER_00]: on the phone line.

[00:51:58] [SPEAKER_00]: Maybe that works better.

[00:51:59] [SPEAKER_00]: I don't know.

[00:52:01] [SPEAKER_01]: Yeah, but that's a lot of trust.

[00:52:02] [SPEAKER_01]: I even had a hard time with the whole feature however many years ago where you had your Alexa

[00:52:09] [SPEAKER_01]: in the room.

[00:52:10] [SPEAKER_01]: Apologies if I just fired off anyone's.

[00:52:12] [SPEAKER_00]: Sorry, I think I did that twice already.

[00:52:15] [SPEAKER_01]: Yeah.

[00:52:15] [SPEAKER_01]: But it was that feature years ago where you could, like, tell it to purchase something

[00:52:22] [SPEAKER_01]: for you and it does it.

[00:52:24] [SPEAKER_01]: And, you know, I think you still had to kind of go in after the fact and confirm it.

[00:52:29] [SPEAKER_01]: But the idea initially was like, just use it and it'll buy it for you.

[00:52:33] [SPEAKER_01]: And I was like, God, it's just hard for me to trust it even for that, let alone

[00:52:39] [SPEAKER_01]: these agents talking to each other and buying my plane ticket for the thing that I need to

[00:52:44] [SPEAKER_01]: go to or whatever and making sure that it gets it right.

[00:52:47] [SPEAKER_01]: Like, there's going to need to be a lot of education from

[00:52:52] [SPEAKER_01]: these things to me, and examples of it working, before I trust it.

[00:52:58] [SPEAKER_01]: It took me a long time to put my trust in Google Pay and, you know, feeling comfortable

[00:53:05] [SPEAKER_01]: with leaving the house without my credit card, because I was comfortable that when I

[00:53:09] [SPEAKER_01]: got to the store, my phone would actually work. Like, it took me a while to get

[00:53:13] [SPEAKER_01]: to that trust. Right. And that only came from using it time and time again and it proving it could do it.

[00:53:17] [SPEAKER_00]: Right. This goes back to the discussion about responsibility and AI

[00:53:21] [SPEAKER_00]: at the model, agent, application, or user level. The story we had a couple of months ago with

[00:53:28] [SPEAKER_00]: Air Canada having the bot that told the guy that he could get the bereavement fare when in

[00:53:35] [SPEAKER_00]: fact he couldn't. And then Air Canada tried to blame the bot. Well, that means that Air Canada

[00:53:41] [SPEAKER_00]: has undercut any trust in the bot. It wasn't the bot's fault. In that case, it was Air Canada's

[00:53:46] [SPEAKER_00]: fault. And that's where the responsibility lay. So this is going to be a fascinating negotiation

[00:53:52] [SPEAKER_00]: at all these levels.

[00:53:54] [SPEAKER_01]: For sure. 100%. And finally, in our segment on another company not having any real advertising

[00:54:07] [SPEAKER_01]: going on right now, because they're probably trying to figure out how they do it. Humane

[00:54:13] [SPEAKER_01]: and the AI Pin, the colossal, historic flop, I think it's safe to say. The AI Pin had a lot

[00:54:21] [SPEAKER_01]: of buzz going in and then it did not stick the landing. Saw terrible reviews. The $700 device

[00:54:29] [SPEAKER_01]: has seen more returns than sales, according to that report. That's just not a winning formula.

[00:54:35] [SPEAKER_01]: No, it's not. $9 million in lifetime sales, 1,000 purchases canceled before shipping, more than $1

[00:54:44] [SPEAKER_01]: million in returns. And those returns, this is interesting: those returns cannot be refurbished.

[00:54:50] [SPEAKER_01]: They cannot be resold. They become e-waste because of a T-Mobile limitation: the network is probably

[00:54:57] [SPEAKER_01]: tied to a specific device and there's no way to transfer ownership, or something along those lines.

[00:55:02] [SPEAKER_01]: So all those devices just become garbage, which is a real shame.

[00:55:06] [SPEAKER_01]: There are only 8,000 units left out there in the world.

[00:55:11] [SPEAKER_01]: Collector's item.

[00:55:12] [SPEAKER_01]: Yeah, this was all according to internal sources, by the way. They're not talking about it. They

[00:55:19] [SPEAKER_01]: don't make comments on this stuff. But to put it into a little bit of context, I suppose,

[00:55:25] [SPEAKER_01]: the company had raised more than $200 million from investors. So that makes me feel like they

[00:55:32] [SPEAKER_01]: probably have a nice little kitty that they're sitting on, even though I think it was a couple

[00:55:43] [SPEAKER_01]: of months ago the report was that they were potentially looking for buyers when all the

[00:55:43] [SPEAKER_01]: fallout happened. So who knows what's going to happen with Humane, but it's not looking good.

[00:55:48] [SPEAKER_01]: Yeah. Yeah. Pour one out for Humane AI pin. All right. Well, that's it. We've reached the end of

[00:55:55] [SPEAKER_01]: this episode of AI Inside. Great, great time. I have to say it was a lot of fun talking about Google

[00:56:01] [SPEAKER_01]: and all other things AI, since Google didn't completely take over the AI conversation

[00:56:08] [SPEAKER_01]: this week. But Jeff, you're at Gutenberg parenthesis dot com. And I haven't put up a page with my new

[00:56:16] [SPEAKER_00]: book yet coming out this fall, The Web We Weave. But if you Google The Web We Weave and my name,

[00:56:23] [SPEAKER_00]: you can find the page where you can buy it. And if you use the code Web 20, you get the discount.

[00:56:29] [SPEAKER_00]: The Web We Weave. So I'll get this all up. No, no. You have to Google The Web We Weave and my name.

[00:56:35] [SPEAKER_00]: I don't know what I was thinking. The Web We Weave. I'm going to use Jeff Jarvis dot com,

[00:56:39] [SPEAKER_00]: but I have to negotiate with my son to help me create it. The Hachette.

[00:56:44] [SPEAKER_00]: The Hachette book group. Right. And if you go there and use the code Web 20,

[00:56:48] [SPEAKER_01]: you will get a discount between now and now. Web 20. Excellent. There we go. Hopefully to hit the

[00:56:57] [SPEAKER_01]: Gutenberg parenthesis website sometime soon. Until then. Use that tool we're all so familiar with

[00:57:04] [SPEAKER_01]: called Google and you'll find it. Excellent. Can't wait, man. Good stuff. You're so prolific. I love it.

[00:57:12] [SPEAKER_01]: But that is awesome. We also, I just want to say, we've been trying new things with the live stream

[00:57:19] [SPEAKER_01]: and it's going really well. Today we have close to 1,500 live viewers of this show, which is

[00:57:27] [SPEAKER_01]: pretty remarkable stuff. So my plea to all of you watching right now, please go to AIinside.show

[00:57:35] [SPEAKER_01]: and subscribe to the podcast. Subscribe because if you don't catch this live stream,

[00:57:41] [SPEAKER_01]: because you're not going to always catch the live stream, right? But I have a feeling you like what

[00:57:44] [SPEAKER_01]: you saw and heard today. And if you do subscribe, you won't miss any more episodes in the future.

[00:57:51] [SPEAKER_01]: You'll get them all. We'll be super happy. You can go into Pocket Casts and leave your review or your

[00:57:57] [SPEAKER_01]: ratings. You can go into Apple Podcasts and leave your review. All this stuff helps us grow this show and

[00:58:02] [SPEAKER_01]: it's so great to have those kinds of numbers watching us live. We love having y'all on board.

[00:58:08] [SPEAKER_01]: It's great to have you here. You can support on a deeper level if you wish by going to AI,

[00:58:16] [SPEAKER_01]: sorry, patreon.com slash AIinsideshow. And there you can support on any number of different tiers that

[00:58:24] [SPEAKER_01]: we have set up for patrons. You get ad-free shows, early access to AI videos that I do for the

[00:58:30] [SPEAKER_01]: Techsploder YouTube channel, Discord community, regular hangouts with Jeff and me. We have a, I think

[00:58:36] [SPEAKER_01]: I have a Zoom hangout coming up later on this week. And then we have executive producers of the show.

[00:58:42] [SPEAKER_01]: If you're at a certain tier, you get your name read out at the end of the episode. And you know what?

[00:58:47] [SPEAKER_01]: You can put it on your resume. That's fine. Dr. Dew, Jeffrey Maricini, WPVM 103.7 in Asheville,

[00:58:53] [SPEAKER_01]: North Carolina, and Paul Lang, all supporting us on a deeper level through the Patreon. We can't thank you

[00:58:59] [SPEAKER_01]: enough because that really does enable this show to do what we do. Yeah. Thank you all so much. And

[00:59:08] [SPEAKER_01]: that is really about it. I think that's all we got for this week. We appreciate you. Thank you for

[00:59:12] [SPEAKER_01]: watching and listening. We will both see you next time on another episode of AI Inside. Bye everybody.