Hands-on With Project Astra and Project Moohan
December 13, 2024 · 01:06:57

[00:00:01] Contacting customer service? For a lot of people, that's the best way to ruin a perfectly good day.

[00:00:06] But at Zendesk, we make for a better customer experience.

[00:00:09] Better for the grandmother, better for the florist, better for the young man in house number 3a, better for her, better for everyone.

[00:00:17] Because while some claim the customer is always right, we say customers are always human.

[00:00:22] And since we're human too, we want to do something good for all of us.

[00:00:26] Zendesk. AI-powered customer experience, made for people.

[00:00:29] Do we really need another computer? Probably not all of us.

[00:00:33] But if you're reinventing music with the power of a neural processor, or analyzing migration patterns on the go with an all-day battery,

[00:00:39] or bringing your ideas to life with the AI-powered Cocreator, then a Copilot Plus PC can make a difference.

[00:00:46] Not everyone needs a more powerful AI computer.

[00:00:48] But if you're trying to change the world, even if it's only your own, we built one for you.

[00:00:52] Microsoft Copilot Plus PC with Snapdragon. The fastest, most intelligent Windows PCs yet.

[00:00:57] Battery life varies based on usage and settings.

[00:01:00] Das ist AI Inside, Episode 46.

[00:01:03] Recorded Thursday, December 12th, 2024.

[00:01:06] Hands-on with Project Astra and Project Moohan.

[00:01:11] This episode of AI Inside is made possible by our patrons.

[00:01:15] Patreon.com slash AI Inside Show.

[00:01:18] If you like what you hear, head on over and support us directly.

[00:01:20] And thank you for making independent podcasting possible.

[00:01:30] Hey, everybody.

[00:01:32] Welcome to another episode of AI Inside.

[00:01:34] This is a show where we take a look at the AI that is layered in a nice reality-presented fashion in front of your eyes while you're wearing the proper hardware.

[00:01:44] I'm one of your hosts, Jason Howell, joined, as always, by my friend Jeff Jarvis.

[00:01:49] How you doing, Jeff?

[00:01:50] Hey, how are you? Sorry everybody out there, we're a day later, but it's going to be worth it because Jason has lots of things to report that were embargoed.

[00:01:57] Plus, he had a cold yesterday.

[00:01:59] Yeah.

[00:01:59] Yucky. You don't want that.

[00:02:01] You know, I swear, I had been looking forward to this morning, and we're going to talk about the news in a second.

[00:02:07] It's about Android XR and Gemini and all that.

[00:02:10] I had been looking forward to this morning for a little more than a week.

[00:02:14] And wouldn't you know it, it's like last weekend, I got a stomach bug.

[00:02:18] And then yesterday, I woke up with a really nasty cold.

[00:02:22] I was like, my body is trying to take me down.

[00:02:25] It doesn't want this to work.

[00:02:26] It doesn't want me to stick the landing.

[00:02:29] But it all worked out.

[00:02:30] So yeah, we ended up postponing the show.

[00:02:32] It just wasn't going to happen yesterday.

[00:02:33] I wasn't feeling very well.

[00:02:35] And so thank you for your patience, for being a day late.

[00:02:39] Thank you, Jeff, for you being so flexible to move it to today.

[00:02:43] I'm officially retired, so what do I have to do?

[00:02:46] No, okay. All right, cool.

[00:02:47] Well, I promise not to lean too heavily into that week after week.

[00:02:53] Before we get started, huge thank you to all of you who support us on Patreon,

[00:02:58] patreon.com slash AI Inside Show.

[00:03:01] So like BV Mir, who is one of our supporters.

[00:03:05] So BV, thank you so much.

[00:03:06] Thank you to those of you who I have not named specifically right now,

[00:03:10] but we do really appreciate you.

[00:03:12] Patreon.com slash AI Inside Show.

[00:03:14] And if you happen to be watching live right now,

[00:03:16] a lot of you just watch live when it hits Twitter

[00:03:19] or whatever other platforms that we're streaming live to.

[00:03:23] You might not be subscribed.

[00:03:24] And if you're not, you should.

[00:03:25] AI Inside.show.

[00:03:27] Go there, find all the details, and you won't miss a single episode,

[00:03:30] even if you happen to miss the live event.

[00:03:33] Okay, so what am I excited about in this moment?

[00:03:38] Well, let me tell you, Jeff.

[00:03:39] I saw some things, and it was a lot of fun.

[00:03:43] I also felt kind of special because Google reached out to me and said,

[00:03:47] hey, we've got these prototype hardwares that we want to show you.

[00:03:52] If you come down to Mountain View,

[00:03:53] we'll show you the prototypes that are running our new effort called Android XR.

[00:03:58] So they invited me down to Mountain View,

[00:04:00] and I got to go down there and hang out with the executives.

[00:04:03] And let's stay there for a second.

[00:04:04] I think it's really great that this podcast is less than a year old,

[00:04:07] and you have relationships and expertise around Android.

[00:04:10] You've had a long time,

[00:04:11] but I'm really delighted that you are influential.

[00:04:15] I don't want to call you an influencer because that's tacky,

[00:04:18] but you are influential,

[00:04:20] and they care about your opinion and trust you

[00:04:23] to be an independent journalist doing that.

[00:04:26] So I think it's really cool that you're on their list.

[00:04:28] So that's good.

[00:04:28] Yeah.

[00:04:29] I mean, I completely agree.

[00:04:31] I think it's really cool too.

[00:04:32] I will admit I was kind of surprised,

[00:04:34] not because I haven't worked with Google before.

[00:04:37] Like we, you know, with Android Faithful,

[00:04:39] we have a lot of interviews with different people from Google

[00:04:42] leading different efforts and everything.

[00:04:43] And so I think that is definitely part of it.

[00:04:46] But to realize that I was one of like 10 people

[00:04:49] that actually got to check out prototype hardware,

[00:04:51] like, you know, there were a couple of moments

[00:04:53] where I was feeling that imposter syndrome of like,

[00:04:55] why me?

[00:04:57] Like, I'm honored, but I'm kind of surprised.

[00:05:01] And, you know, at the end of the day,

[00:05:03] I'm a fan of all of this stuff

[00:05:06] as much as I am a journalist and a podcaster and everything.

[00:05:09] So it was a really cool opportunity for opportunity's sake.

[00:05:14] Like it was also just a lot of nerdy fun, you know?

[00:05:17] Like it felt like a roller coaster to me.

[00:05:19] Yeah, that's cool.

[00:05:20] And, you know, I think that the age of,

[00:05:23] this is what I write about all the time

[00:05:24] and everything I write is the age of mass media is over.

[00:05:27] And it's about specialization and trust at a human level.

[00:05:31] And I was talking to an executive

[00:05:32] at another technology company, not Google.

[00:05:35] The other day was kind of,

[00:05:36] how do I get my message out at scale?

[00:05:37] I said, you don't anymore.

[00:05:39] You've got to bring people together.

[00:05:41] I said, so for instance,

[00:05:42] you could support a whole bunch of really cool,

[00:05:44] good, trustworthy tech podcasts.

[00:05:47] And you'd be better off for it.

[00:05:49] I think that's, I think it's a smarter way to go.

[00:05:51] Yeah, yeah.

[00:05:52] Cool stuff.

[00:05:53] Well, I was super excited to make the drive,

[00:05:57] even though on the way back from Mountain View,

[00:05:59] I got caught in like three and a half hours worth of traffic.

[00:06:01] So that wasn't, that wasn't amazing,

[00:06:03] but it was what it was.

[00:06:06] But so what, what did Google actually show off?

[00:06:08] They were showing off Android XR,

[00:06:11] which is their platform for extended reality devices.

[00:06:15] It's really pertinent to this show specifically

[00:06:18] because they made a big deal about the fact

[00:06:22] that this is their first OS that they've created

[00:06:24] with Gemini at its core.

[00:06:27] So essentially building it around Gemini

[00:06:30] from the ground up.

[00:06:31] And you remember, I'm sure,

[00:06:33] the Google I.O. keynote earlier this year,

[00:06:37] they showed off a little glimpse of Project Astra,

[00:06:40] which was the glasses with the see-through camera

[00:06:44] and interacting with the elements in the room

[00:06:47] and everything.

[00:06:47] It made a big, big splash at Google I.O.

[00:06:50] And that's kind of, you know,

[00:06:52] that's a big key piece of what this is all about.

[00:06:55] This is Gemini running on these wearable devices,

[00:06:58] be it a VR form factor,

[00:07:01] be it a glasses form factor like Project Astra.

[00:07:03] And the camera being your eyes along with your eyes

[00:07:09] and, you know,

[00:07:10] the microphone being your ears along with your ears

[00:07:12] so that as you go through life wearing this hardware,

[00:07:17] it brings a lot of contextualization opportunities.

[00:07:20] It brings, you know, a lot of memory.

[00:07:22] If you forget something, it can remember this.

[00:07:26] And all of this stuff was really explored

[00:07:29] during the demos that I got to see.

[00:07:31] Before you go to the demos,

[00:07:33] I'm just trying to set the architectural scene here

[00:07:36] in two ways.

[00:07:38] One is, I wondered when you mentioned this,

[00:07:41] when I read your story about this,

[00:07:45] which you should by all means plug, at Digital Trends,

[00:07:47] and whether the shift to,

[00:07:55] the rumor of a shift from Chrome OS to Android

[00:07:59] for the next Chromebook, a Pixelbook.

[00:08:03] Okay.

[00:08:04] So just a rumor, we don't know,

[00:08:06] but I wonder whether this plays into that,

[00:08:09] whether Android being the key operating system

[00:08:12] and the key operating system that's tied to Gemini and AI

[00:08:16] makes Android now preeminent.

[00:08:18] Android XR being the pathway into the future OSs of Google.

[00:08:24] I just wondered about that.

[00:08:26] And then secondly, we've talked about

[00:08:29] how Google is working on things,

[00:08:31] and you're going to talk about it in a minute,

[00:08:32] where it could watch what you're doing on your screen

[00:08:34] and take that over and so on.

[00:08:36] And that's, tell me if I'm wrong here,

[00:08:38] that's built into Gemini more than Android XR, right?

[00:08:41] Those are changes that are done in Gemini.

[00:08:44] Yeah, I think I, yes.

[00:08:45] The two seem to come together,

[00:08:46] is what I'm saying,

[00:08:47] in a way that seems to be more strategic

[00:08:49] and fundamental than even what they demonstrated.

[00:08:53] Taking the Gemini layer

[00:08:55] and putting it into all of these other places

[00:08:57] to bring that capability that Gemini has

[00:09:00] into those other experiences.

[00:09:03] Yeah, absolutely.

[00:09:03] Right, and through Android XR, right.

[00:09:05] And through Android XR, exactly.

[00:09:08] Yeah, I mean, I've, you know,

[00:09:10] obviously we follow Gemini and these voice assistants

[00:09:14] and LLMs and all this stuff on the show a lot.

[00:09:17] I've also kind of been a pretty casual fan

[00:09:20] of virtual reality and environments like that

[00:09:23] and augmented reality and stuff.

[00:09:24] And I've often felt that the chocolate peanut butter moment

[00:09:28] is when you take the context

[00:09:31] and kind of capabilities

[00:09:33] of these artificial intelligence systems

[00:09:35] and you put them into a virtualized environment,

[00:09:37] that's when things start to really sing.

[00:09:40] Right, and it doesn't have to come from glasses

[00:09:42] and a camera and glasses.

[00:09:43] It can come from what you're doing on your screen.

[00:09:45] It can come from what it hears.

[00:09:46] It can come from what you say.

[00:09:48] It can come from things you've saved.

[00:09:52] We've talked about saving screenshots

[00:09:53] and doing that kind of stuff.

[00:09:54] So it can gain that context

[00:09:57] and then act on it in a lot of ways.

[00:09:59] Yeah, yeah, indeed.

[00:10:01] Multimodal, indeed.

[00:10:02] Well, yeah, and I mean,

[00:10:04] we talk also a lot about multimodality.

[00:10:08] From a kind of like an informational standpoint,

[00:10:12] that's definitely here.

[00:10:14] The Gemini is looking at everything you're looking at.

[00:10:17] There was one point at which I switched on eye tracking

[00:10:20] in Project Moohan,

[00:10:22] which is Samsung's VR headset.

[00:10:24] I could activate eye tracking,

[00:10:25] so it would, after a short configuration,

[00:10:29] the cursor on the screen would actually follow

[00:10:31] wherever my eye was looking.

[00:10:33] It was very accurate.

[00:10:34] And then I could click with my hands

[00:10:36] just sitting in my lap or whatever.

[00:10:38] But you have the ability as a user to tell the system

[00:10:42] whether it shares that information

[00:10:44] or considers that information with Gemini or not.

[00:10:48] You know, because that's kind of like an ethical kind of area

[00:10:52] is like, okay, well, now it knows what you're looking at.

[00:10:54] And that used to seem like only the thing that we know.

[00:10:58] And now technology knows how to follow that too.

[00:11:00] Yeah.

[00:11:00] Do you feel okay with Gemini knowing so much

[00:11:04] about what you're facing and what that says?

[00:11:07] And where you are too, yeah.

[00:11:09] And where you are and like all that stuff.

[00:11:12] It gets really interesting.

[00:11:13] So I interrupted you.

[00:11:15] Talk about what you saw.

[00:11:17] No.

[00:11:17] I mean, you know, there was so much as part of the demo.

[00:11:22] I think the Project Astra demo was really impressive

[00:11:27] because of the kind of memory aspect

[00:11:29] was one thing that really stood out to me.

[00:11:32] There was a portion where I was looking at a shelf

[00:11:35] of liquor bottles and, you know, I looked at it

[00:11:38] and I was like, hey, you know, what can I make with these?

[00:11:40] You know, and I tried to be very generic

[00:11:43] and force it to kind of identify things.

[00:11:45] So I didn't, I tried not to say like,

[00:11:47] what can you make with these liquor bottles?

[00:11:49] But more like, you know, take a look at this table

[00:11:52] and tell me what you can make here.

[00:11:53] And it would tell me and give me a recipe.

[00:11:55] Do I need to make that neat?

[00:11:56] Or on the rocks or whatever.

[00:11:58] And then I went on with my demo.

[00:12:00] And then I asked it, hey, back at the liquor,

[00:12:03] you know, back when I was with the bottles,

[00:12:05] there was a book sitting next to it.

[00:12:06] What's the name of that book?

[00:12:08] And it knew the name of the book.

[00:12:09] So it had that memory.

[00:12:11] There was a point where I was talking with someone

[00:12:13] who was in there talking to me in Spanish

[00:12:15] and I was making eye contact with her.

[00:12:17] But because I had the glasses,

[00:13:55] my right eye was seeing through the Raxium display.

[00:12:22] So I was seeing the translation in real time

[00:12:26] while still making eye contact with her.

[00:12:28] So that was kind of neat.

[00:12:30] Which device were you wearing then?

[00:12:32] That was the Project Astra.

[00:12:34] Right now, this is all Project Astra.

[00:12:36] Yeah.

[00:12:37] You know, I had it identify a painting

[00:12:38] and ask questions about that.

[00:12:40] There was a sign on the wall

[00:12:41] that was in a foreign language.

[00:12:42] I asked it to translate that into English.

[00:12:45] And then I asked it to translate that into Japanese.

[00:12:49] There was a record.

[00:12:51] So I picked up the record and I said,

[00:12:52] hey, I've never heard this record before.

[00:12:54] Can you play me a track?

[00:12:54] And it immediately knew to go to YouTube Music

[00:12:56] and pull up a track from the album

[00:12:58] and play it in the stereo speakers.

[00:13:00] I'm still a little confused though.

[00:13:01] Were you wearing Moohan?

[00:13:02] Were you wearing the black glasses?

[00:13:03] What were you wearing?

[00:13:04] Still wearing the black glasses.

[00:13:06] The black glasses.

[00:13:07] I think the glasses is essentially the Project Astra

[00:13:10] or maybe Project Astra is a component of the glasses.

[00:13:12] But they didn't really give me an aim for the glasses.

[00:13:15] That was very prototype-y.

[00:13:17] That was not a brand yet or a manufacturer yet.

[00:13:19] No brand.

[00:13:20] No brand labeling.

[00:13:22] I saw a pair of the glasses that was monocular,

[00:13:26] so a single display,

[00:13:27] and then one that was binocular,

[00:13:30] so it had a display in each eye.

[00:13:31] So it had that kind of three-dimensionality.

[00:13:33] And then they had a version of the glasses

[00:13:35] that was transparent,

[00:13:36] so you could see all the technology

[00:13:38] and you could see the little display inside

[00:13:39] and the light guide and how it all worked.

[00:13:42] I'm like, you need to sell that one.

[00:13:44] That nerds would love that one.

[00:13:47] Yeah.

[00:13:48] So you and I share the same PTSD,

[00:13:50] though I think mine is worse

[00:13:51] because I spent more on it,

[00:13:53] having been veterans of Google Glass.

[00:13:55] I don't know what you're talking about, Jeff.

[00:13:58] What are you talking about?

[00:13:59] What can be possible?

[00:14:00] I'm so stupid,

[00:14:01] I got prescription lenses in mine.

[00:14:04] You're not stupid.

[00:14:05] You know, the thing that I really...

[00:14:07] Okay, a couple of things.

[00:14:09] The thing that I realized here

[00:14:10] is that Google was really ahead of its time.

[00:14:12] It was ahead of its time with Google Glass.

[00:14:15] The world was different 10 or so years ago

[00:14:18] when Google Glass first came out.

[00:14:20] It's phenomenal how early they were, right.

[00:14:22] People were freaked out

[00:14:23] about the ever-present camera

[00:14:25] and the ingestation

[00:14:28] or whatever you want to call it

[00:14:30] of information through the camera

[00:14:31] and how do I know

[00:14:32] if you're taking a picture and all that.

[00:14:34] Times have changed.

[00:14:35] People are far less skittish about that now.

[00:14:38] There's that.

[00:14:39] And then there's also the fact

[00:14:40] that Google Glass

[00:14:41] and a lot of what I saw

[00:14:43] with the Project Astra demo

[00:14:44] are really very much the same.

[00:14:47] It's just...

[00:14:48] This is just a frame

[00:14:49] with a little like...

[00:14:52] And by this, I mean Google Glass

[00:14:53] is just a frame

[00:14:54] with a little crystal

[00:14:55] that has the display up here.

[00:14:57] And the Project Astra frames

[00:14:59] had a little display

[00:15:01] that put very similar information.

[00:15:04] Like I had maps streaming to it.

[00:15:06] It was like directions.

[00:15:07] And if I kind of tilted my head

[00:15:09] slightly down,

[00:15:10] it faded into this top-down view

[00:15:12] of the immediate area on a map.

[00:15:14] And so I could see

[00:15:15] that I'm turning up ahead.

[00:15:18] And you kind of envision

[00:15:19] just like walking down the street

[00:15:20] and, oh, I don't...

[00:15:21] Oh, yeah, that's where I'm going.

[00:15:22] You turn down and look

[00:15:24] and you see where you're going.

[00:15:25] So it felt reminiscent of...

[00:15:27] of the technology.

[00:15:27] ...but better than Google Glass.

[00:15:30] For sure,

[00:15:30] because now it can do so much more.

[00:15:32] But, you know,

[00:15:33] even with Google Glass,

[00:15:34] we were querying it

[00:15:36] with voice commands

[00:15:37] and things like that.

[00:15:39] And fingers.

[00:15:40] All right, so how much...

[00:15:40] Is there...

[00:15:41] Fingers.

[00:15:42] There's voice commands with this.

[00:15:44] For sure, yeah.

[00:15:45] Is there finger stuff too?

[00:15:47] You can.

[00:15:48] I noticed that it did

[00:15:50] a really good job

[00:15:50] of knowing when I was talking

[00:15:52] to someone else

[00:15:53] and not it,

[00:15:54] which I thought

[00:15:54] was really interesting.

[00:15:55] Oh, interesting.

[00:15:56] Also rejecting other people.

[00:15:58] I was like, how did you guys do it?

[00:16:03] And they've done

[00:16:04] a lot of work

[00:16:04] in denoising algorithms

[00:16:06] and everything to know,

[00:16:07] you know,

[00:16:08] where that audio

[00:16:08] is coming from.

[00:16:10] But you can pause it.

[00:16:11] You can tap on the side

[00:16:12] of the frames.

[00:16:12] There's also a couple

[00:16:13] of other buttons there

[00:16:14] that do different things.

[00:16:15] So in looking

[00:16:17] at the brief demo

[00:16:17] that's online,

[00:16:21] as a New Yorker,

[00:16:23] we get upset

[00:16:24] with tourists

[00:16:25] who go slow

[00:16:26] and people

[00:16:27] on their phones

[00:16:28] who slow down

[00:16:29] to start typing

[00:16:30] on their phones.

[00:16:32] and then almost

[00:16:34] walk into traffic

[00:16:34] and be killed.

[00:16:35] Oh, yeah, yeah.

[00:16:35] Right.

[00:16:36] Right?

[00:16:36] So how does this,

[00:16:38] you think,

[00:16:39] affect your relationship

[00:16:41] with the outside world

[00:16:42] when you're interacting

[00:16:44] with it?

[00:16:45] Yeah.

[00:16:46] Good question.

[00:16:47] I mean,

[00:16:48] hmm.

[00:16:49] I do wonder

[00:16:51] about that.

[00:16:52] Like when the person

[00:16:53] was there,

[00:16:54] so I had kind of

[00:16:56] like my media handler

[00:16:58] during this demo.

[00:16:59] Oh, yeah.

[00:16:59] She was kind of

[00:17:00] walking me around.

[00:17:00] Okay, and here's a painting

[00:17:01] and blah, blah, blah, blah, blah.

[00:17:02] And I'm trying to like

[00:17:04] listen to her.

[00:17:04] Meanwhile,

[00:17:05] there were some misfires

[00:17:06] where Gemini's also talking

[00:17:07] in my ear at the same time

[00:17:08] that she's talking to me.

[00:17:10] And so I had to kind of like

[00:17:11] maneuver that.

[00:17:12] So that was a little

[00:17:13] awkward at times.

[00:17:13] Without being rude

[00:17:14] to the human.

[00:17:15] Yeah.

[00:17:15] Yeah, totally.

[00:17:16] Even though she,

[00:17:17] you know,

[00:17:17] she's on the team.

[00:17:18] She's totally used to it.

[00:17:19] She understands.

[00:17:20] But trying to think about

[00:17:21] in the real world.

[00:17:22] And then the woman

[00:17:23] that was talking to me

[00:17:24] in Spanish

[00:17:26] and that it was translating.

[00:17:28] Even though I'm looking

[00:17:29] at that person

[00:17:30] or at you

[00:17:31] while it's translating,

[00:17:33] I still,

[00:17:34] I know in my mind

[00:17:35] that I'm also,

[00:17:36] I'm reading something.

[00:17:37] So it's not,

[00:17:38] I'm not purely connecting

[00:17:39] with the person.

[00:17:40] I am still looking

[00:17:41] at a layer between me.

[00:17:43] And so there is

[00:17:44] some disconnect there.

[00:17:45] And I don't know

[00:17:45] how that's going to interplay

[00:17:47] when you're out and about.

[00:17:49] Certainly,

[00:17:50] there's not something

[00:17:51] showing on the screen

[00:17:52] at all times.

[00:17:53] It does fade out

[00:17:54] at a certain point.

[00:17:57] You know,

[00:17:57] if you're not using it

[00:17:58] or if you haven't queried

[00:17:59] or whatever.

[00:18:00] So it's not like

[00:18:01] you always have that screen there

[00:18:03] kind of separating you

[00:18:04] from the world.

[00:18:05] But it's definitely

[00:18:05] a consideration.

[00:18:06] What scared me about it

[00:18:08] was the thought

[00:18:10] of people wearing this

[00:16:41] while they're driving.

[00:18:12] Because you're here

[00:18:13] or you're there, right?

[00:18:14] You know,

[00:18:15] the thought,

[00:18:15] idiots will do this.

[00:18:17] But it would be neat

[00:18:18] if you could transfer this

[00:18:19] to a heads-up display

[00:18:21] in your Android Auto

[00:18:22] or something like that

[00:18:24] where you can still interact

[00:18:26] but not here.

[00:18:28] Or, you know,

[00:18:29] Google knows

[00:18:31] when you're in a car

[00:18:32] and you're moving

[00:18:32] although it doesn't know

[00:18:33] if you're a passenger

[00:18:33] versus a driver.

[00:18:34] Right.

[00:18:35] But maybe,

[00:18:35] you know,

[00:18:36] maybe there's some,

[00:18:37] you know,

[00:18:38] this isn't a feature by the way.

[00:18:39] This is just thinking

[00:18:40] about what could be.

[00:18:41] But, you know,

[00:18:41] maybe there's some future

[00:18:42] to detect that

[00:18:43] and know to switch

[00:18:44] to an audio-only version

[00:18:45] versus a screen,

[00:18:47] you know,

[00:18:47] a heads-up version.

[00:18:49] Something along those lines

[00:18:50] because, yeah.

[00:18:52] All right.

[00:18:52] So there was that.

[00:18:54] There was Astra

[00:18:55] through those glasses.

[00:18:57] There was Moohan.

[00:18:58] And then there was

[00:18:59] all the announcements

[00:18:59] about Gemini.

[00:19:01] Yeah.

[00:19:02] So,

[00:19:02] and those came

[00:19:03] a day earlier.

[00:19:03] I want to hear

[00:19:03] about all those.

[00:19:05] Yeah.

[00:19:05] Yeah.

[00:19:05] Well,

[00:19:06] so the Gemini announcements

[00:19:07] came one day earlier

[00:19:08] and of course

[00:19:08] for anyone watching

[00:19:09] or reading through

[00:19:10] the announcements,

[00:19:11] you know,

[00:19:11] there is some overlap here,

[00:19:14] of course.

[00:19:15] Gemini 2.0

[00:19:17] does tie into

[00:19:18] the XR announcement

[00:19:19] at least as far

[00:19:20] as Project Astra

[00:19:21] is concerned.

[00:19:23] The 2.0 update

[00:19:25] improved the latency

[00:19:26] which I actually found

[00:19:27] the latency

[00:19:27] to still be a little

[00:19:29] too pause-y

[00:19:30] in the conversation.

[00:19:32] so hopefully

[00:19:32] they continue

[00:19:33] to work on that

[00:19:34] but improved memory

[00:19:36] taps into

[00:19:37] search,

[00:19:37] lens,

[00:19:38] maps,

[00:19:38] all things

[00:19:39] that I,

[00:19:39] you know,

[00:19:40] experienced personally,

[00:19:41] improved dialogue,

[00:19:42] that sort of stuff.

[00:19:43] But the,

[00:19:44] so that was part

[00:19:45] of what I saw

[00:19:46] when I was down there

[00:19:47] but then there's

[00:19:48] also a bunch

[00:19:48] of other news

[00:19:50] related to Gemini 2.0

[00:19:51] to 2.0 Flash

[00:19:53] just kind of

[00:19:54] a faster,

[00:19:55] more capable

[00:19:56] version of 2.0 Flash

[00:19:57] or of Gemini Flash

[00:19:59] and then

[00:20:00] the rest of them

[00:20:01] I think are all

[00:20:02] agents related,

[00:20:03] agentic.

[00:20:04] Yeah,

[00:20:04] I was trying to figure out

[00:20:05] how they announced

[00:20:07] this as,

[00:20:08] as our new AI model

[00:20:10] for the agentic era

[00:20:11] which also says

[00:20:12] that Google has decided

[00:20:13] on agentic

[00:20:14] not agentive

[00:20:14] so we're solid.

[00:20:18] But I was trying

[00:20:18] to figure out

[00:20:20] what exactly

[00:20:21] that meant.

[00:20:23] How,

[00:20:23] how agentic it is,

[00:20:25] right?

[00:20:25] So they showed

[00:20:26] unlocking agentic

[00:20:28] experiences

[00:20:29] with it

[00:20:30] and

[00:20:32] they included

[00:20:33] Astra with that

[00:20:35] but I was trying

[00:20:35] to figure out

[00:20:36] how it's agentic.

[00:20:38] Do you have any

[00:20:38] sense of that?

[00:20:41] Astra?

[00:20:41] It can help you

[00:20:43] play a game.

[00:20:44] It can give you

[00:20:45] tips.

[00:20:45] Yeah,

[00:20:45] so there's,

[00:20:46] that's one.

[00:20:47] Right,

[00:20:47] totally.

[00:20:47] I,

[00:20:48] like when I think

[00:20:48] of Astra

[00:20:49] I don't,

[00:20:50] I don't automatically

[00:20:50] think of

[00:20:53] agentic necessarily.

[00:20:54] I,

[00:20:54] I think of it

[00:20:55] as being,

[00:20:56] right,

[00:20:56] support,

[00:20:57] you know,

[00:20:57] kind of serving

[00:20:58] the questions

[00:20:59] and interactions

[00:20:59] that I'm looking

[00:21:00] for.

[00:21:00] I think some

[00:21:01] of the,

[00:21:02] some of the

[00:21:02] agentic features,

[00:21:04] even,

[00:21:04] even games,

[00:21:05] I don't,

[00:21:05] I don't know

[00:21:07] that I think

[00:21:08] about that as

[00:21:08] agentic because

[00:21:09] even though they're

[00:21:10] saying in a,

[00:21:11] oh,

[00:21:11] I see,

[00:21:11] in addition to

[00:21:12] exploring agentic

[00:21:13] capabilities in the

[00:21:14] virtual world,

[00:21:15] oh,

[00:21:15] we're experienced,

[00:21:16] experimenting with

[00:21:17] agents that can help

[00:21:18] in the physical

[00:21:18] world.

[00:21:19] So,

[00:21:20] so how do you

[00:21:20] define agentic

[00:21:21] then I think

[00:21:22] is my question

[00:21:23] to Google

[00:21:23] because when I think

[00:21:24] of agent,

[00:21:25] yeah,

[00:21:26] when I think

[00:21:26] of agent,

[00:21:27] I think of it

[00:21:27] does a task

[00:21:28] for you.

[00:21:29] Is a task

[00:21:30] answering a

[00:21:31] question or

[00:21:32] is a task

[00:21:33] actually,

[00:21:35] actually tweaking

[00:21:36] a setting

[00:21:36] on my computer

[00:21:37] when I want it

[00:21:38] to,

[00:21:38] you know,

[00:21:40] turn it into dark mode

[00:21:42] as you recoil.

[00:21:44] You know,

[00:21:45] Project Mariner

[00:21:46] was another one

[00:21:47] of their

[00:21:47] announcements,

[00:21:48] which is an

[00:21:48] experimental

[00:21:49] Chrome extension

[00:21:50] that is basically

[00:21:51] an agent

[00:21:52] for the browser

[00:21:52] that I could

[00:21:53] see like

[00:21:54] and that ties

[00:21:55] into the stuff.

[00:21:56] Right,

[00:21:57] we talked about

[00:21:57] earlier is when

[00:21:58] we heard rumors

[00:21:59] or reports

[00:22:00] that Google

[00:22:01] like Microsoft

[00:22:02] was going to

[00:22:03] look at your

[00:22:03] browser and

[00:22:04] act off the

[00:22:05] context of your

[00:22:06] browser.

[00:22:06] I think that's

[00:22:07] Mariner,

[00:22:07] isn't it?

[00:22:08] I believe so.

[00:22:09] I mean,

[00:22:11] that's what I'm

[00:22:12] led to understand

[00:22:13] based on what

[00:22:14] little they

[00:22:14] shared about

[00:22:17] Mariner.

[00:22:17] Where is it?

[00:22:18] Yeah.

[00:22:19] So it's meant to do

[00:22:20] tasks for you.

[00:22:20] So I signed up

[00:22:21] for the early thing

[00:22:22] and what kind of

[00:22:22] task do you want

[00:22:23] to do?

[00:22:23] I don't know,

[00:22:24] what can you do?

[00:22:25] Yes.

[00:22:26] But I think,

[00:22:26] you know,

[00:21:27] things like scheduling. It shows you a spreadsheet and you can have it do tasks there.

[00:22:32] So it's also

[00:22:33] a little bit like

[00:22:34] a co-pilot

[00:22:35] and having you

[00:22:35] do something.

[00:22:36] This is all mushing together here in interesting ways, which is okay, because it's better that it knows the language we speak than the language Google wants us to speak.
[00:22:48] Yeah. Yeah.

[00:22:51] I think that's a fair question, though. Like, I wouldn't immediately say, oh, Project Astra, that's agentic AI, based on my current feeling or knowledge or understanding of what agentic means in AI. But maybe it is, because I'm asking it to translate something and it's translating it. So is that an agentic action?

[00:23:14] Well, okay. So let's be generous here. I think you're right, Jason. If you're walking down the street and it knows you want to go somewhere, then it is your agent, tipping you off about where to go, right? And it says, well, you want to turn right up the next street. Is that an agent? Not in the way I thought of it, but I guess it is the way you just described it.
[00:23:34] Yeah, it could be, because it's your little helper saying, hello, idiot, turn right.

[00:23:39] So then, along those lines, how is it not agentic if I go to ChatGPT and I say, rewrite this sentence for me? Like, that's agentic because it's carrying out an action. At that point, everything seems agentic, because it's all carrying out actions.
[00:23:55] It's the hot word right now, so they're all going to try to call it agentic, so now we're dubious.

[00:23:59] Oh no, that's where we're at. The lines are going to get blurred.
[00:24:03] Does its awareness of the context matter? Because I know what you're doing, I can help you in a way. Is that agentic?
[00:24:11] Maybe so.
[00:24:12] Oh, that could, yeah, maybe that's the differentiation. I'm not entirely sure. It's a great question. I hadn't really considered that, but that's true.

[00:24:20] When I saw Google's blog post, I kind of had the same reaction initially, and then as I read down, I was like, oh, here's a couple of things that I would think of as agentic. You know, Mariner, the Chrome extension, kind of falls into the bucket that I would consider.
[00:24:34] Right.

[00:24:35] They also talk about Jules, which is an agent to assist developers. So, you know, an AI code agent that integrates into the GitHub workflow. You put in a piece of code, it knows kind of where it goes and how to use it, that sort of stuff.
[00:24:51] But, yeah, I don't know. It kind of seems like terminology that's getting a little, maybe getting a little mushy and gooey and pliable.

[00:24:58] The other thing that's in here as part of the Astra announcement, which you talked about earlier, is better memory, remembering things that you see. Is that agentic? Because it can call on things for you. You can say, hey, remember that thing you saw back there? Do this with it.
[00:25:13] I guess that's agentic.

[00:25:15] Okay.
[00:25:16] Yeah.
[00:25:18] The presumption about agentic is, make an airline reservation for me.
[00:25:22] Yeah.
[00:25:23] Right. Do a task that I otherwise would have to do myself. Order dinner for me.

[00:25:27] Well, okay, but what do you want?
[00:25:30] It's like husband and wife. Whatever you want. No, no, tell me what you want. And then you get it. I didn't want that.
[00:25:37] Right. Oh, yeah.
[00:25:40] I didn't want that, agent. Geez, you should have known me better than that.
[00:25:44] Well, that's just your agent. In other words, your agent is your spouse. Your agent is your backseat driver. Your agent is your judge.
[00:25:52] Yeah.

[00:25:54] So they also mention Jules for developers. Since I'm not a developer, I don't really understand all this, but obviously a hot piece in all of this is helping developers. And it's what they understand.
[00:26:07] It continues to be.
[00:26:08] It continues to be.

[00:26:09] So tell me about Moohan.
[00:26:13] The Moohan?
[00:26:14] Yeah.
[00:26:15] Moohan was really just the VR aspect of this.
[00:26:19] Just like Apple's?

[00:26:21] Well, I personally have not tried Apple Vision Pro, so I can't compare it specifically to that. But it had a lot of similarities in the Gemini aspects of it, with the pass-through camera.

[00:26:33] I think the thing that really struck me about Moohan was that, when I go into a VR environment with a pass-through camera, A, I'm used to the pass-through camera looking like a potato compared to reality. And in this case, things looked very sharp. Not a perfect representation of reality through the camera, but closer than I've ever seen, and very, very convincing.

[00:26:59] Close enough that you could walk around a building and not kill yourself?
[00:27:02] That was exactly it. At no time did I feel the scale was off or anything, or that you were isolated from the world.
[00:27:10] Yeah.

[00:27:11] And part of that also is that the bottom of the goggles actually does let through not only light, but part of the room. And you can block that off if you choose, to get fully immersed, or you can let it through.

[00:27:24] And actually, if you're doing AR kind of stuff, letting it through lends to more of the illusion. The way I liken it is, you know how people have their TV set, and then they put LEDs on the back of the TV set to splash the colors of the TV on the wall, so it makes it more immersive? Have you seen that?
[00:27:42] No, I never have seen that, no.

[00:27:44] Well, it's a trend that people have been doing for a while to kind of broaden the perceivable size of the information on the TV, and this is kind of like that. It lets in more of the room, and so it gives your peripheral view a sense that what you're seeing actually does exist in the real world.

[00:28:02] And then there's the visual acuity of the information: the digital virtual information was very high resolution to my eyes. It looked really sharp, and so they blended really well, and that was pretty impressive.

[00:28:16] And so clearly with the VR glasses you can also do a lot more, because you can have your computer screen and media and other things presented to you. That's the real difference?
[00:28:28] For sure. That's certainly one of the advantages.

[00:28:32] The displays that I was working with, and by display I mean like Chrome browsers. I had a whole row of Chrome browsers, and I could say to Gemini, hey, clean up my view, and it'll sort them for you and everything. But anyways, those individual displays looked like, I've got this giant 4K display in front of me in real life right now, and they looked like that. They weren't mushy or kind of hard to see or whatever. It really was high resolution.

[00:29:01] There was a standing desk area that I went to at one point, and it had a Bluetooth keyboard and a Bluetooth mouse. So with the multi-modality of control, I could just wiggle the mouse and now I'm in mouse mode. I have my display in front of me, and I could have multiple displays if I want to, and they're all high resolution, and it was kind of neat. I would be curious to work like that.

[00:29:27] Absolutely. Right. Yeah. Because you've got your Bluetooth keyboard.

[00:29:32] And then all the apps that they showed off, Photos, Maps, Google TV, Play Movies, all kind of demonstrated how they tweaked their apps to work inside of the virtualized environment, and their capabilities around spatializing 2D footage.

[00:29:52] And so if you have your photos library, this is actually really cool, Jeff, every single photo and video that you've ever taken can be spatialized. And so then suddenly you view it in your VR headset and it's in three dimensions. Even the videos.

[00:30:09] And so it's all done through their AI algorithms, and I didn't see the process. I don't know how long it takes to do this or what the process is there, but I saw, you know, of course, their hand-picked versions of this being done. And you just simply wouldn't know that it hadn't been created like that to begin with.
[00:30:26] How are we going to know anything is real, Jason?

[00:30:30] Now you're standing in the middle of the room.
[00:30:32] I was, in Maps. I ended up going walking, you know how some establishments have a Street View thing where you can go into an actual building? And I did, and they had a restaurant in downtown New York that had been shot by Street View cameras years ago, totally flat footage, and they had spatialized it with AI. And sure enough, I'm standing in the middle of this restaurant. I'm like, wait, this wasn't shot in 3D? Are you kidding me? Like, it was crazy. And I'm looking around, and I was just in the middle of this restaurant. It was really cool.

[00:31:01] You know, we were talking about this on TWiG last night. I said, if only they didn't give us all this AGI BS, we'd be amazed at each one of these steps.
[00:31:15] Yeah, right. This is all really pretty cool.
[00:31:18] But does it think?
[00:31:18] No, but it's not going to. It's amazing on its own.

[00:31:23] So that's cool. It sounds like it was a really cool trip, even with the three hours of traffic.
[00:31:28] Yeah, it's all right. It was worth it. As a technology nerd and fan, it was a lot of fun.

[00:31:34] Also, kind of knowing that I was seeing something that not a lot of people were seeing, that felt nice. Walking into all the rooms of the executives, like, hello, Jason. It's like, oh, I feel so special right now. It was really cool.
[00:31:46] That's cool.
[00:31:47] Yeah. Anyways, it was awesome.

[00:31:49] So people can check out my article on Digital Trends. I'm now a contributor to Digital Trends, actually. So I'll be writing more for them.
[00:31:58] And your video.
[00:31:59] I also put up a whole video. It's a whole hour and ten minutes of me talking about everything from this experience. So if you've got time on your hands.

[00:32:08] Important to say, Jason was not allowed to shoot images or video there.
[00:32:13] No, I was not allowed to shoot any footage. I recorded audio simply because I wanted to remember things, and I knew I'd lose detail if I didn't, and they were fine with that. And so I just kind of recount the whole thing.

[00:32:25] Did you feed it into NotebookLM and have it make its own podcast?
[00:32:30] I did not use it to make a podcast, but I did use NotebookLM in the last handful of days to check my work, to help check facts. And I've realized NotebookLM is really a strong tool for that sort of thing. Here's my data set, here's the thing I'm working on; have I said anything here that's counter to what's in the data set?
[00:32:52] Oh, perfect. Perfect.
[00:32:54] Yeah.

[00:32:54] I don't think we put it in the rundown, but just as a note, the top three product people at NotebookLM have left Google.
[00:33:02] I saw that.
[00:33:03] To work on something related, obviously related, and so it'll be interesting to see the competition between NotebookLM and whatever their startup comes up with. I think it's a really exciting field.
[00:33:12] Yeah. They're like, yeah, we're cool. We made something cool. Let's go do it for ourselves.
[00:33:17] Yeah.

[00:33:19] And that'll keep happening. There's so much happening in this space, and that gives us plenty to talk about. Good work, Jason.
[00:33:26] Thank you. Thank you for the wonderful questions around that. We're going to take a super quick break. We've got a lot more to talk about. Not only did Google have kind of an onslaught of new stuff, but so did OpenAI, and it continues to. That's coming up in a second.

[00:33:45] All right. So OpenAI has been celebrating, I'll put that in air quotes, celebrating the 12 Days of Shipmas, and released a bunch of new stuff in a short amount of time. I think top of that list is probably the fact that they now have a $200 a month tier of ChatGPT.

[00:34:05] Well, thanks a lot, Santa.
[00:34:07] Yeah. That's a real great gift under the tree, called ChatGPT Pro. $200 a month for unlimited o1 access, o1 pro mode.
[00:34:18] You know, you'd better get a lot of perks for $200 a month.

[00:34:22] Yeah. And the business model of all this just keeps on smelling like cheese to me. Because there's so much that's available for free out there, and the increment of what's included in free keeps moving along. And so they're tripping over themselves to try to make that value for consumer revenue. It's going to be hard, I think.

[00:34:50] I'm not spending $200 a month.
[00:34:51] Neither of us is spending $200 a month, unless we get a lot of new members, folks.
[00:34:55] A lot of new members will join. We got it. We got it.
[00:35:00] Yeah. We can only afford to do that if we're making enough that $200 a month doesn't seem like, you know, we're burning through it all.

[00:35:08] And yeah, that makes me wonder, who is doing $200 a month for ChatGPT Pro?
[00:35:16] Well, you know, it's like the Wall Street Journal user. If you've got an expense account, you go in as the AI guy at company X and say, well, I have to know all this, don't I? And the company may not yet be buying an enterprise license, but shouldn't your AI people have been doing this?

[00:35:31] Yeah. That's a good point. And thank you. Oh, Ozone Nightmare, Joe just gave us our first contribution toward ChatGPT Pro. Thank you, Joe.

[00:35:43] We are $190 away from one month of ChatGPT Pro.
[00:35:47] For one of us.
[00:35:49] We actually saw Ozone Nightmare at your book signing.
[00:35:52] Yes, it was wonderful to see you, Joe. Thank you.
[00:35:54] Great to see you, Joe. Thank you for coming in, Jason.
[00:35:57] Yeah, of course. It was a lot of fun. I was really stoked to be there.

[00:36:01] And this is totally a tangent, but it was also cool because so many of the themes that we talk about on this show came up, and it shows me how complementary your writing of the book and these topics are.
[00:36:15] That's why I value the show, because it's what enables me to keep researching this stuff and trying to stay up on it. So every week I've got to go through all the AI stuff I can find to see what's happening.
[00:36:24] Right. So that's helpful. Makes it seem like we're up on it.
[00:36:29] Totally. Totally.

[00:36:30] So, 200 bucks a month for the mondo one. And then just to list them first, and we'll go through them: Sora, Canvas, OpenAI o1, and Apple. And then today, day six, was advanced voice video and Santa mode.
[00:36:51] So I missed that one, because I've had a really busy morning. I just put it in there.
[00:36:55] I just watched it.
[00:36:55] Okay.

[00:36:56] So where do you want to start?
[00:36:59] Oh boy. I don't even know on this one. I mean, well, there was o1 lying a lot. You had put in a couple of articles about o1's lies.
[00:37:09] Its lies. Damn it.
[00:37:11] What is it? What's going on there?

[00:37:13] You know, it's fascinating, and I'm going to tie this for a second into Google's announcement about its quantum chip, Willow. It's a whole new chip, and Google stock went way up. Everybody was happy about this new quantum chip, that quantum might be real.

[00:37:27] And what was interesting about this to me, I'm trying to get my head around this, but to me, computing was always an exact science. There was a question and you had an answer. You had a calculation and you got the answer. And that was that. And I remember years ago one Intel chip had something wrong with it, such that certain kinds of calculations wouldn't be right. And that was shocking, right?

[00:37:48] Well, in neural networks and generative AI, it's more approximate. And in quantum, it's more approximate. It's about whether there is enough to push the balance of prediction to say that this is a useful response, right?

[00:38:08] Yeah.
[00:38:08] And so I'm fascinated by that question. So we talk about hallucinations. One of Google's points was that the more qubits, or whatever the hell they call them, they put in there, the more the error rate went down in quantum.

[00:38:23] So here in generative AI, what's fascinating to me is that what we call hallucinations, which is a crap term, aren't really errors either. They're just things that don't match our expectation of the right prediction, right? And if it's going to keep doing that, it's not useful in many, many contexts.

[00:38:42] So o1 is accused of lying more, but that means that it's not getting right what we think it should be getting in the context of its task. Which makes it less useful. So there was a lot of kind of accusation around that this week, as people have played more and more with o1.

[00:39:05] You know, it just kind of came to mind, and I'm probably not the first person by any means to draw this connection. But as you were talking about the hallucination aspect of AI, and how some people are kind of getting used to the idea that something like that cannot and will not be perfect and correct and accurate and factual and truthful 100% of the time.

[00:39:30] And then you've got quantum computing, and Google's announcement there, and that being kind of playing in the same field, like, yeah, but it's still doing amazing things. It's just not always accurate.

[00:39:39] always accurate.

[00:39:41] All of that combined with

[00:39:43] this moment in time that

[00:39:45] we live in, when fact as

[00:39:47] a thing, it has become so

[00:39:50] mushy and so kind of

[00:39:53] abstract at this point that

[00:39:56] it's, it's just interesting

[00:39:57] that these things are all

[00:39:58] coexisting at the same

[00:39:59] time.

[00:39:59] It's like, it's like the

[00:40:01] perfect technology for a

[00:40:02] moment in time when facts

[00:40:03] matter less to people than

[00:40:05] they ever have.

[00:40:06] Or it's the worst

[00:40:06] technology at this moment

[00:40:07] of time because, oh, if

[00:40:08] it's good enough for

[00:40:09] OpenAI and quantum, it's good

[00:40:11] enough for me.

[00:40:12] It's truly the age of

[00:40:13] alternative facts.

[00:40:15] Yeah, totally.

[00:40:16] And it's, it's like the

[00:40:17] technology is, is up to

[00:40:19] the challenge.

[00:40:20] Right, right.

[00:40:21] So we have Sora,

[00:40:22] and Google also has Veo,

[00:40:25] its fairly

[00:40:26] recently announced

[00:40:27] latest version of its

[00:40:28] video model.

[00:40:29] Uh, some, um, creators

[00:40:31] leaked Sora, uh, that

[00:40:33] peeved OpenAI for a few

[00:40:35] days and they took it down,

[00:40:36] but now it's, it's going to

[00:40:37] be available in general.

[00:40:39] And, uh, the Guardian had

[00:40:40] a moral panic piece about

[00:40:42] we'll never know what

[00:40:42] reality is, but of course

[00:40:43] the same was true with

[00:40:44] photography and with sound

[00:40:46] on film.

[00:40:48] And, and I, that article

[00:40:49] keeps getting written.

[00:40:50] That's one of those things

[00:40:51] that authors

[00:40:53] keep thinking

[00:40:55] that they're the first

[00:40:56] person to come up with the

[00:40:58] idea that, oh, well,

[00:40:59] if this is doing this, then

[00:41:00] we can never believe

[00:41:01] anything ever again.

[00:41:02] It's like, it's kind of

[00:41:04] been that way for a long

[00:41:05] time.

[00:41:05] And I don't know, maybe I

[00:41:07] have my head in the sand,

[00:41:08] but nothing major has

[00:41:10] happened as a result of it

[00:41:11] yet.

[00:41:11] I'm not saying that it

[00:41:13] won't like, I guess the

[00:41:14] potential is there, but I

[00:41:16] don't know.

[00:41:16] I keep seeing that article

[00:41:17] and the more I see it and

[00:41:19] again and again and again

[00:41:20] and again.

[00:41:20] Yeah.

[00:41:21] Okay.

[00:41:23] Um, so Sora was one,

[00:41:25] and this is now going to be

[00:41:28] available,

[00:41:28] I think in general, on

[00:41:30] ChatGPT,

[00:41:30] and it's a creation tool

[00:41:32] both for writing and for

[00:41:35] coding.

[00:41:36] So it's just a more focused

[00:41:38] use of the chat, um, interface

[00:41:42] to do what a lot of people are

[00:41:44] doing with it.

[00:41:45] I think, um, I'm actually

[00:41:48] super curious to play around

[00:41:49] with the canvas because it

[00:41:51] really, it does seem like in

[00:41:54] my experience of using, you

[00:41:56] know, these, these systems,

[00:41:58] Perplexity and ChatGPT and

[00:42:00] everything, it's so locked

[00:42:01] into that singular kind of

[00:42:03] top down, always scrolling

[00:42:05] interface. And what

[00:42:07] am I actually using these for?

[00:42:08] Well, I'm kind of collaborating

[00:42:10] with them to pull out

[00:42:12] information that I can use in

[00:42:15] like, you know, say this

[00:42:16] thing that I'm writing or this,

[00:42:17] this structured kind of

[00:42:19] document.

[00:42:20] And to have that kind

[00:42:21] of tools like that embedded

[00:42:23] into the place where I

[00:42:25] structure the document kind of

[00:42:27] seems like a powerful, um, tool

[00:42:29] set.

[00:42:30] So I'm going to insert

[00:42:31] something here, do something

[00:42:31] in there.

[00:42:32] I think, I think that's really

[00:42:33] useful and, and, and more

[00:42:34] interactive, uh, more

[00:42:36] collaborative.

[00:42:37] Um, so I think that, that

[00:42:39] really makes a lot of sense.

[00:42:42] Um, and then, um, so they're

[00:42:45] going to add OpenAI onto

[00:42:47] Apple, which is really, I think

[00:42:49] Apple confessing they don't have

[00:42:50] a lot of AI.

[00:42:52] Yeah.

[00:42:53] Yeah.

[00:42:53] Well, and we knew that, that

[00:42:54] ChatGPT was going to be, uh,

[00:42:57] part of Apple intelligence and

[00:42:59] that it was going to come a

[00:43:00] little bit later.

[00:43:00] So now we're kind of at that

[00:43:02] stage, uh, to where that's

[00:43:04] going to start happening.

[00:43:05] Yeah.

[00:43:06] Yeah.

[00:43:06] It is curious to see, okay,

[00:43:08] well, what, what is Apple's

[00:43:10] own internal play around AI

[00:43:13] outside of, you know, Siri kind

[00:43:15] of having some of the impacts

[00:43:17] and the capabilities that the

[00:43:19] AI affords, but really leaning

[00:43:21] on ChatGPT.

[00:43:22] Like you said, shows that they

[00:43:24] don't have their own version of

[00:43:25] that.

[00:43:26] And should they, or maybe,

[00:43:27] maybe they shouldn't, maybe

[00:43:28] they just are a different

[00:43:29] company versus a lot of the

[00:43:31] others that are doing this

[00:43:32] sort of thing.

[00:43:33] Um, yeah.

[00:43:34] So Sundar Pichai has

[00:43:35] been having a fun time,

[00:43:38] um, mocking Microsoft because

[00:43:40] well, they don't really have

[00:43:41] their own AI.

[00:43:42] They got to use OpenAI, but

[00:43:43] we have DeepMind.

[00:43:45] We are Google.

[00:43:46] And it gives them the same

[00:43:48] opening with Apple.

[00:43:49] Like, oh, I don't really

[00:43:50] have it.

[00:43:51] They've got to use OpenAI.

[00:43:52] But of course, in both cases

[00:43:53] they're using OpenAI, not

[00:43:54] Google, uh, which is

[00:43:57] interesting.

[00:43:57] Right.

[00:43:58] Um, and Amazon, uh, you

[00:44:01] know, its stock went up a

[00:44:02] week ago because it's adding more

[00:44:04] models.

[00:44:04] Uh, the competition here is

[00:44:07] amazing, but, but you and I say

[00:44:08] this all the time in a way

[00:44:10] there's just tremendous

[00:44:10] competition, but in a way it's

[00:44:12] all a little bit the same.

[00:44:14] Yeah.

[00:44:14] They're

[00:44:15] just leapfrogging with

[00:44:16] little baby steps, uh, along the

[00:44:19] way, which is interesting.

[00:44:20] So if you go to, uh, what is it?

[00:44:22] Line 28 here, uh, the video, uh,

[00:44:26] you'll get the latest.

[00:44:27] So what's happening with, with,

[00:44:28] with 12 days of shipness is that

[00:44:31] they are.

[00:44:32] S H I P.

[00:44:33] Ship Paul.

[00:44:34] Shipness.

[00:44:36] Shipness.

[00:44:37] As I heard you say that, I was

[00:44:38] like, Oh, it could have been a

[00:44:40] different word.

[00:44:41] Uh, yeah, I would, would then, uh,

[00:44:43] yeah, go after me.

[00:44:43] So this is, this is, um, what

[00:44:45] you're going to see here is as, as

[00:44:48] the first video there, uh, where

[00:44:50] you were talking about, we need

[00:44:51] audio, we will need audio.

[00:44:52] You'll need audio if you can.

[00:44:53] Okay.

[00:44:54] Let me, let me get that going.

[00:44:56] Cause that is a while you're

[00:44:57] doing it.

[00:44:57] I'll talk about it.

[00:44:59] So, uh, the memory things that

[00:45:00] Jason talked about with, uh, I'm

[00:45:02] talking like Jason's not in here cause

[00:45:03] he's busy doing something. That you,

[00:45:05] Jason, talked about.

[00:45:06] Um, with, uh, Google, uh, is an

[00:45:10] important part.

[00:45:11] It's antecedent, you know, get me

[00:45:13] that book, get me that record.

[00:45:14] What about this?

[00:45:15] What about that?

[00:45:16] And knowing what that is.

[00:45:17] So you're going to see a demo here

[00:45:19] if you've got about a minute in,

[00:45:20] um, where, uh, it's advanced voice.

[00:45:23] So now the advanced voice, which you

[00:45:25] know is very, very, very chatty to

[00:45:28] a, um, weird, uh, degree will, uh,

[00:45:32] now combine with video so that you

[00:45:34] can talk about things that you're

[00:45:35] seeing with it.

[00:45:36] This is very much like what Jason

[00:45:38] saw in Google.

[00:45:40] So if you advance to a minute in,

[00:45:41] as you said, what you're going to

[00:45:42] see is there.

[00:45:43] Give this one more try.

[00:45:46] So he's going to introduce the

[00:45:48] people around the, have the people

[00:45:49] around the table introduce

[00:45:50] themselves.

[00:45:52] There we go.

[00:45:53] All right.

[00:45:53] So you can see as we, as we go

[00:45:55] into the, there's now a couple more

[00:45:56] buttons on the bottom left.

[00:45:57] In particular, there's this video

[00:45:59] button.

[00:45:59] Um, and so here we are and I can,

[00:46:04] uh, I'm going to introduce myself.

[00:46:05] Hey, Chat, I'm Kevin.

[00:46:06] I lead product at OpenAI.

[00:46:07] Hey, Chat.

[00:46:08] Okay.

[00:46:09] Hi, Kevin.

[00:46:09] It's great to meet you.

[00:46:11] What can I help you with today?

[00:46:12] Well, I'd actually, I'd love to

[00:46:13] introduce you to a few of my

[00:46:14] colleagues.

[00:46:14] Does that sound good?

[00:46:16] Sure.

[00:46:17] I'd love to meet them.

[00:46:18] No, I really don't want to.

[00:46:19] I just like you.

[00:46:24] Hey, Chat.

[00:46:29] Multimodal team.

[00:46:30] That sounds exciting.

[00:46:31] And Chat has to find something

[00:46:32] nice to say in each case.

[00:46:33] I'm Jackie.

[00:46:34] Oh, I know, right?

[00:46:35] It's like, okay.

[00:46:35] I'm on the product team for ChatGPT

[00:46:36] Multimodal.

[00:46:37] Get through the niceties.

[00:46:38] Hi, Jackie.

[00:46:39] It's great to meet you too.

[00:46:41] Sounds like you're all working on

[00:46:42] some really cool projects.

[00:46:43] Really?

[00:46:44] We really are.

[00:46:45] Hey, ChatGPT.

[00:46:46] I'm Rowan.

[00:46:47] This guy didn't get the memo

[00:46:48] about calling it chat.

[00:46:50] Hi, Rowan.

[00:46:52] It's nice to meet you.

[00:46:53] He's embarrassed.

[00:46:53] Multimodal research sounds

[00:46:54] fascinating.

[00:46:55] Okay, so now comes the demo part.

[00:46:57] All right.

[00:46:57] And now I want to see if you

[00:46:58] remember the folks that you met.

[00:47:00] So can I give you a quick quiz?

[00:47:03] Sure.

[00:47:03] I'm ready for a quiz.

[00:47:04] All right.

[00:47:05] What was the name of my colleague

[00:47:07] that was wearing the reindeer antlers?

[00:47:10] That would be Michelle.

[00:47:12] All right.

[00:47:13] And how about the name of my colleague

[00:47:14] that was wearing the Santa hat?

[00:47:17] That would be Rowan.

[00:47:19] All right.

[00:47:20] Good work.

[00:47:21] You're two for two.

[00:47:21] You get an A.

[00:47:22] Thanks, Chat.

[00:47:23] Good job, Chat.

[00:47:25] Was there more?

[00:47:26] Was that it?

[00:47:27] That's enough.

[00:47:27] You can watch.

[00:47:28] You could have it imitate Santa

[00:47:30] if you want later on.

[00:47:31] That may be worth a laugh.

[00:47:33] Maybe we should do that in a minute.

[00:47:34] But what's interesting here is

[00:47:36] if it's a good demo,

[00:47:37] if we know it's a good demo,

[00:47:38] right,

[00:47:38] that A,

[00:47:39] it has memory.

[00:47:41] Yep.

[00:47:41] It remembers there was someone

[00:47:43] who wore antlers.

[00:47:43] B,

[00:47:44] it knows what antlers are

[00:47:45] and that they don't belong

[00:47:46] on a human being

[00:47:47] except in AI.

[00:47:49] C,

[00:47:49] it then remembered

[00:47:50] the name of that person.

[00:47:51] It had data about that person

[00:47:52] that it had stored.

[00:47:53] So this whole question

[00:47:54] of memory

[00:47:55] in your demos

[00:47:56] and here,

[00:47:57] strike me,

[00:47:58] it's the leapfrogging baby steps,

[00:48:00] right?

[00:48:01] They're both demonstrating

[00:48:02] video memory of things,

[00:48:05] which is interesting.

[00:48:06] So,

[00:48:08] now,

[00:48:08] yeah,

[00:48:08] and yet,

[00:48:09] as I watch this

[00:48:11] and as I think about

[00:48:12] kind of

[00:48:13] the demonstration

[00:48:14] that I saw in person,

[00:48:16] I could see this being

[00:48:17] the kind of thing

[00:48:18] that,

[00:48:18] you know,

[00:48:19] in a very short amount

[00:48:19] of time,

[00:48:20] six months or whatever,

[00:48:21] we're kind of like,

[00:48:22] okay,

[00:48:22] yeah,

[00:48:22] memory.

[00:48:23] Enough of that.

[00:48:23] So,

[00:48:24] so what?

[00:48:25] Like,

[00:48:25] what is the next?

[00:48:26] Remember,

[00:48:27] like,

[00:48:28] days ago

[00:48:28] versus minutes ago,

[00:48:31] you know,

[00:48:31] maybe,

[00:48:31] maybe that ends up being,

[00:48:33] you know,

[00:48:33] the big deal

[00:48:34] and I suppose

[00:48:34] we'll only really experience

[00:48:35] that when we have these things

[00:48:36] for days and not minutes,

[00:48:38] but.

[00:48:38] So the other thing about this,

[00:48:39] which goes to Google Glass

[00:48:40] in the old days,

[00:48:41] Google Glass's

[00:48:42] one good thing

[00:48:43] was going to be

[00:48:43] to tell

[00:48:45] a boiler repair person

[00:48:47] how to repair the boiler,

[00:48:48] what are you seeing?

[00:48:49] Right.

[00:48:50] So in their demonstration,

[00:48:50] if you go

[00:48:51] ahead to about

[00:48:53] four minutes.

[00:48:54] Um,

[00:48:54] Oh,

[00:48:55] same,

[00:48:55] same video,

[00:48:56] same video.

[00:48:57] Chat is instructing,

[00:48:59] um,

[00:49:00] the embarrassed guy

[00:49:01] how to make a cup of coffee.

[00:49:04] Okay.

[00:49:05] I'm going to fire up

[00:49:06] a new conversation

[00:49:07] with chat.

[00:49:07] Here,

[00:49:08] I'll show it

[00:49:08] and hit the advanced

[00:49:09] voice mode icon

[00:49:10] on the lower right.

[00:49:12] And then once it connects,

[00:49:13] there's a coffee

[00:49:14] right out there.

[00:49:16] Hey chat,

[00:49:16] how's it going?

[00:49:18] I'm doing great.

[00:49:19] Thanks for asking.

[00:49:20] I see you're wearing

[00:49:21] a Santa hat.

[00:49:22] I am.

[00:49:23] And do you see

[00:49:23] what I have in front of me?

[00:49:25] Yes,

[00:49:26] I see a coffee set up

[00:49:27] with a kettle

[00:49:28] and a dripper.

[00:49:29] Are you planning

[00:49:30] to make some coffee?

[00:49:31] I'd love to.

[00:49:31] Do you think you could

[00:49:32] walk me through the steps?

[00:49:35] Sure,

[00:49:35] I'd love to.

[00:49:36] First,

[00:49:37] place a filter

[00:49:37] in the dripper

[00:49:38] and rinse it

[00:49:39] with hot water

[00:49:39] to eliminate

[00:49:40] any papery taste.

[00:49:41] Uh-huh.

[00:49:43] Right?

[00:49:44] So now finally,

[00:49:46] if you advance

[00:49:46] to 7 minutes,

[00:49:47] 40 seconds,

[00:49:47] you'll get Santa.

[00:49:50] Okay,

[00:49:51] 7 minutes,

[00:49:51] 40 seconds.

[00:49:52] I just had to mute it

[00:49:53] so that it doesn't

[00:49:55] flip all around the place.

[00:49:56] That's going to be

[00:49:56] your festive entry point.

[00:49:58] You can also find

[00:49:59] a new icon

[00:50:00] on the ChatGPT

[00:50:01] settings page.

[00:50:02] They are so festive

[00:50:04] at OpenAI.

[00:50:09] I'm a little scared,

[00:50:11] a little nervous.

[00:50:12] Hey Santa.

[00:50:13] Ho, ho, ho.

[00:50:15] Well,

[00:50:16] hello there.

[00:50:17] It's delightful

[00:50:18] to hear from you.

[00:50:19] Are you getting excited

[00:50:21] for the holidays?

[00:50:22] I'm so excited.

[00:50:24] Santa,

[00:50:24] I have a question for you.

[00:50:25] What's your favorite

[00:50:26] Christmas tradition?

[00:50:29] Ho, ho, ho, ho.

[00:50:30] What a wonderful question.

[00:50:33] My favorite...

[00:50:34] Okay, okay.

[00:50:38] Oh, you know,

[00:50:39] but if my kids

[00:50:41] were just a couple

[00:50:42] of years younger

[00:50:43] than they are right now,

[00:50:44] they'd probably

[00:50:44] really love that.

[00:50:46] Watch.

[00:50:47] I'm guaranteeing you

[00:50:48] within a week

[00:50:48] there'll be a story

[00:50:49] that OpenAI's

[00:50:51] AI Santa

[00:50:52] told kids

[00:50:53] to kill their parents,

[00:50:54] right?

[00:50:54] Can you believe?

[00:50:56] Right, right.

[00:50:57] We're going to be there.

[00:51:00] Interesting.

[00:51:00] So, okay,

[00:51:01] so there's more

[00:51:03] shipmas days left,

[00:51:04] so who knows

[00:51:05] what OpenAI

[00:51:06] has in store?

[00:51:07] Today is day

[00:51:07] six of 12.

[00:51:09] Six of 12.

[00:51:10] Man,

[00:51:10] who knew December

[00:51:11] was just going to be

[00:51:12] so jam-packed

[00:51:13] with AI announcements.

[00:51:15] They're like,

[00:51:15] we got to get it

[00:51:16] out the door

[00:51:16] before the new year.

[00:51:17] There must be

[00:51:17] some sort of

[00:51:18] tax thing

[00:51:19] that they got

[00:51:19] to get this stuff out.

[00:51:20] Or their bonuses.

[00:51:21] Yes,

[00:51:23] that's right.

[00:51:23] Their KPIs,

[00:51:24] as they say.

[00:51:27] We're going to take

[00:51:28] a super quick break.

[00:51:28] We've already talked

[00:51:29] about the quantum stuff,

[00:51:30] so we can probably

[00:51:31] pass that off.

[00:51:31] Or we can mention

[00:51:32] briefly, but yeah.

[00:51:33] Yeah,

[00:51:34] we'll come back.

[00:51:35] We'll talk a little bit

[00:51:35] more about the quantum

[00:51:36] and then a little bit

[00:51:37] about Reddit

[00:51:38] and then round things out.

[00:51:39] That's coming up

[00:51:39] after this break.

[00:51:43] All right,

[00:51:44] we did kind of mention

[00:51:45] the quantum computer thing.

[00:51:48] I screwed up.

[00:51:48] I didn't see it

[00:51:48] in the rundown below.

[00:51:49] I forgot.

[00:51:50] No, no, no.

[00:51:51] I don't have the memory

[00:51:52] of these programs.

[00:51:55] What I realized

[00:51:56] in kind of prepping

[00:51:57] for this story

[00:51:58] is every time

[00:51:59] a quantum computer

[00:52:00] story comes up,

[00:52:01] I have to re-remind myself

[00:52:02] what it actually means.

[00:52:03] I know.

[00:52:05] I think you did

[00:52:06] a pretty good job

[00:52:07] of explaining it.

[00:52:08] So the one thing

[00:52:09] just to mention,

[00:52:10] I think,

[00:52:10] if I can find it

[00:52:11] in here really quickly,

[00:52:11] is the statistic

[00:52:13] that has been out there

[00:52:14] a fair amount

[00:52:16] of how,

[00:52:18] I'm trying to avoid

[00:52:19] the word powerful,

[00:52:20] but how speedy

[00:52:21] I guess this thing is,

[00:52:22] that it can do

[00:52:23] a standard benchmark

[00:52:26] computation

[00:52:26] in under five minutes

[00:52:27] that would take

[00:52:28] one of today's

[00:52:29] fastest supercomputers

[00:52:31] 10 septillion,

[00:52:32] that is 10 to the 25th years,

[00:52:33] a number that vastly

[00:52:34] exceeds the age

[00:52:36] of the universe.
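
That comparison is easy to sanity-check with quick arithmetic. A back-of-the-envelope sketch using only the numbers quoted above (the variable names are ours, purely for illustration):

```python
import math

# Back-of-the-envelope check on the quoted benchmark numbers:
# under five minutes on the quantum chip vs. 10 septillion
# (10**25) years on a classical supercomputer.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.16e7 seconds

quantum_seconds = 5 * 60                      # "under five minutes"
classical_seconds = 1e25 * SECONDS_PER_YEAR   # "10 septillion years"

orders_of_magnitude = math.log10(classical_seconds / quantum_seconds)
print(round(orders_of_magnitude))  # roughly 30 orders of magnitude
```

That gap of roughly thirty orders of magnitude is why the word "powerful" keeps coming up, even when "speedy" is the more careful choice.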

[00:52:38] Yeah.

[00:52:38] So this is what

[00:52:39] got people excited.

[00:52:40] And I think

[00:52:42] for this show,

[00:52:43] what's important

[00:52:44] is imagine

[00:52:45] what this can do

[00:52:47] with AI.

[00:52:50] Is it a complete

[00:52:51] rethink of neural

[00:52:53] network architecture?

[00:52:54] I don't know.

[00:52:55] But the need

[00:52:57] for tremendous

[00:52:59] computing power

[00:53:00] and speed

[00:53:00] in AI,

[00:53:02] if quantum

[00:53:03] really does arrive,

[00:53:06] one can imagine

[00:53:08] that it's more

[00:53:08] than baby steps then.

[00:53:10] Is it AGI?

[00:53:11] No, I don't think

[00:53:12] so, folks.

[00:53:13] I was just going to say,

[00:53:13] you know what we'll have

[00:53:14] then, Jeff.

[00:53:15] We'll definitely

[00:53:15] have AGI by then.

[00:53:17] Oh.

[00:53:19] But I mean,

[00:53:20] who knows?

[00:53:21] I mean,

[00:53:21] the whole quantum

[00:53:22] aspect of this

[00:53:23] just takes everything

[00:53:24] to a completely

[00:53:25] different level

[00:53:26] that, you know,

[00:53:26] again,

[00:53:27] I have a hard

[00:53:27] time understanding.

[00:53:29] But I will say

[00:53:33] that the New York Times'

[00:53:34] Cade Metz

[00:53:34] did a good job

[00:53:35] of kind of

[00:53:37] breaking it down

[00:53:38] again.

[00:53:38] So I'll just read

[00:53:38] that for anyone

[00:53:39] else out there

[00:53:40] who's like,

[00:53:40] okay,

[00:53:41] but seriously,

[00:53:41] this is such

[00:53:42] an elusive term

[00:53:43] to me.

[00:53:44] Quantum bits,

[00:53:44] this is his words,

[00:53:46] or qubits

[00:53:47] behave very differently

[00:53:48] from normal bits.

[00:53:49] A single object

[00:53:50] can behave like

[00:53:51] two separate objects

[00:53:52] at the same time

[00:53:54] when it is either

[00:53:55] extremely small

[00:53:56] or extremely cold.

[00:53:58] By harnessing

[00:53:59] that behavior,

[00:54:00] scientists can build

[00:54:01] a qubit that holds

[00:54:02] a combination

[00:54:03] of ones and zeros.

[00:54:04] This means that

[00:54:05] two qubits can hold

[00:54:06] four values at once,

[00:54:07] and as the number

[00:54:08] of qubits grows,

[00:54:09] a quantum computer

[00:54:10] becomes exponentially

[00:54:11] more powerful.
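
The "four values at once" arithmetic in that explanation can be sketched in a few lines. This illustrates only the exponential state space, not how a quantum computer is actually programmed, and the function name is ours:

```python
# A classical n-bit register holds one of 2**n values at a time.
# An n-qubit register is described by 2**n complex amplitudes at once,
# which is the exponential growth Metz describes.

def qubit_amplitudes(n_qubits: int) -> int:
    """Number of amplitudes needed to describe an n-qubit state."""
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(n, qubit_amplitudes(n))

# Two qubits -> 4 amplitudes (the "four values at once" in the quote);
# 50 qubits already need about 1.1e15 amplitudes to track classically.
```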

[00:54:12] Does that tell you

[00:54:12] everything you need

[00:54:13] to know about

[00:54:13] quantum computers?

[00:54:14] Probably not.

[00:54:15] But that gives me

[00:54:15] an understanding of

[00:54:17] the binary of zero

[00:54:19] and one

[00:54:19] and the importance

[00:54:20] of it in

[00:54:21] modern computing.

[00:54:22] Right.

[00:54:24] That exponential

[00:54:25] kind of possibility.

[00:54:26] And the other part

[00:54:27] of this

[00:54:28] that goes to

[00:54:29] this error question

[00:54:30] is that in my

[00:54:32] amateurish way,

[00:54:33] I would think,

[00:54:33] well, if quantum

[00:54:34] is approximate

[00:54:34] and has errors,

[00:54:35] then the more qubits,

[00:54:36] the more errors you have.

[00:54:37] No, it's the opposite.

[00:54:38] So the Google team

[00:54:40] published in Nature

[00:54:41] results showing

[00:54:42] that, and I'm quoting,

[00:54:43] the more qubits

[00:54:44] we use in Willow,

[00:54:45] their chip,

[00:54:46] the more we reduce

[00:54:47] errors,

[00:54:48] and the more

[00:54:49] quantum the system

[00:54:50] becomes.

[00:54:50] We tested

[00:54:51] ever larger arrays

[00:54:52] of physical qubits

[00:54:53] scaling from a grid

[00:54:55] of 3x3 encoded qubits

[00:54:56] to a grid of 5x5

[00:54:57] to a grid of 7x7.

[00:54:59] And each time,

[00:55:00] using our latest advances

[00:55:01] in quantum error correction,

[00:55:03] we were able to cut

[00:55:04] the error rate

[00:55:05] in half.

[00:55:06] So the more

[00:55:07] quantum it is,

[00:55:08] the more right it is.
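
The halving claim in that quote implies exponential suppression of errors as the grid grows. A tiny sketch of that scaling, using a hypothetical starting rate (the 0.003 figure and function name are illustrative, not Google's measured numbers):

```python
# Each step up in grid size (3x3 -> 5x5 -> 7x7) is said to cut
# the logical error rate roughly in half.

def logical_error_rate(base_rate: float, grid_steps: int) -> float:
    """Error rate after `grid_steps` grid-size increases,
    assuming each step halves the rate."""
    return base_rate / (2 ** grid_steps)

base = 0.003  # hypothetical rate at the 3x3 grid
for steps, grid in enumerate(["3x3", "5x5", "7x7"]):
    print(grid, logical_error_rate(base, steps))
```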

[00:55:11] You know what we need

[00:55:12] in here,

[00:55:13] more quantum.

[00:55:14] That's what we need.

[00:55:15] That's what we need.

[00:55:17] It's not working right.

[00:55:18] Did you give it

[00:55:18] more quantum?

[00:55:20] Quantum baby quantum.

[00:55:21] Think about that.

[00:55:22] That's our new,

[00:55:22] yeah.

[00:55:24] Quantum baby,

[00:55:25] quantum.

[00:55:26] Love it.

[00:55:27] All right.

[00:55:28] That's got to be

[00:55:29] at least in the running

[00:55:30] for a title

[00:55:32] for today.

[00:55:33] And then finally,

[00:55:35] I am

[00:55:36] a real fan

[00:55:38] of Reddit,

[00:55:38] have been for years.

[00:55:40] It's,

[00:55:40] it's,

[00:55:40] in many ways,

[00:55:41] I feel like Reddit

[00:55:42] is one of the unsung heroes

[00:55:44] in social media

[00:55:45] because

[00:55:45] I feel like

[00:55:46] it doesn't get talked about

[00:55:47] nearly as much

[00:55:48] as a lot of the other

[00:55:49] platforms.

[00:55:50] And yet,

[00:55:50] there is actually

[00:55:51] a lot of value

[00:55:52] on the platform.

[00:55:54] And when you're talking

[00:55:55] about value for Reddit,

[00:55:57] you know,

[00:55:57] there's many ways

[00:55:58] to think about that.

[00:55:59] But the data set

[00:56:00] that Reddit has

[00:56:02] is proving to be

[00:56:02] very valuable

[00:56:03] for AI companies.

[00:56:06] Wall Street Journal

[00:56:06] says AI

[00:56:07] is now the source

[00:56:08] or a source

[00:56:10] of key revenue growth

[00:56:11] after it began

[00:56:12] charging for that access.

[00:56:13] And we definitely

[00:56:14] talked about that.

[00:56:14] I think it's $86 million

[00:56:17] in licensing revenue

[00:56:19] in the first

[00:56:20] three quarters.

[00:56:22] At the same time

[00:56:23] that Reddit

[00:56:24] is creating

[00:56:25] its own

[00:56:26] AI

[00:56:26] search engine

[00:56:28] competing with those

[00:56:29] it's licensing to,

[00:56:30] which is really interesting.

[00:56:33] Yeah.

[00:56:34] It's

[00:56:35] 19 years

[00:56:36] of community

[00:56:37] kind of,

[00:56:38] you know,

[00:56:39] fed

[00:56:39] information.

[00:56:42] Human conversation.

[00:56:43] Yeah,

[00:56:44] exactly.

[00:56:46] News organizations

[00:56:47] on one hand

[00:56:48] that have been

[00:56:49] wrestling with

[00:56:50] being okay

[00:56:51] giving,

[00:56:52] you know,

[00:56:52] or offering

[00:56:53] their own

[00:56:54] effort

[00:56:55] to the AI machine

[00:56:57] and everything

[00:56:58] that they've experienced.

[00:56:58] And then you've got Reddit

[00:56:59] really offering

[00:57:00] the work

[00:57:01] of its community

[00:57:02] to the machine,

[00:57:03] which is kind of,

[00:57:05] you know,

[00:57:05] kind of a big deal.

[00:57:06] There's so much

[00:57:07] valuable information

[00:57:08] in there

[00:57:08] and the sentiment

[00:57:10] of the people

[00:57:11] that are offering it.

[00:57:12] There's a rating system

[00:57:13] within Reddit

[00:57:14] that, you know,

[00:57:15] can tell you

[00:57:16] if people have

[00:57:17] voted this highly,

[00:57:18] then it must be

[00:57:19] either accurate

[00:57:20] or desired

[00:57:21] or approved of

[00:57:23] or whatever.

[00:57:24] And that can inform

[00:57:25] these AI models.

[00:57:28] Right.

[00:57:29] Right.

[00:57:29] So, yeah.

[00:57:30] So who would have thought

[00:57:31] that Reddit would be a darling,

[00:57:32] so to speak,

[00:57:33] of AI companies?

[00:57:34] But it's smart of them.

[00:57:35] I'm still waiting

[00:57:36] for a revolt

[00:57:37] from Redditors

[00:57:38] saying,

[00:57:39] where's our piece?

[00:57:40] Free isn't enough.

[00:57:41] Well, that is interesting,

[00:57:42] actually.

[00:57:42] Yeah, totally.

[00:57:44] Like, that Redditors

[00:57:44] aren't like,

[00:57:45] well, you can't.

[00:57:46] Like, I didn't put this

[00:57:47] onto Reddit

[00:57:47] so that you could just

[00:57:48] go and feed it

[00:57:49] into AI systems.

[00:57:50] You really haven't,

[00:57:51] I haven't heard any of that.

[00:57:52] You know,

[00:57:52] I'm old enough

[00:57:53] to remember when

[00:57:55] people contributed

[00:57:55] to Huffington Post,

[00:57:56] and I was one of them,

[00:57:57] because I got more attention,

[00:57:59] I got more audience,

[00:57:59] I got more,

[00:58:00] you know,

[00:58:00] spotlight.

[00:58:02] People said,

[00:58:02] how are you doing that?

[00:58:03] You're doing that for free.

[00:58:05] She's stealing your labor.

[00:58:07] And I didn't think so,

[00:58:08] but that argument

[00:58:09] was there.

[00:58:12] With Reddit,

[00:58:14] people there

[00:58:14] not only create the content,

[00:58:16] they also create

[00:58:17] the norms and structures

[00:58:19] for moderation

[00:58:21] that lead to higher quality

[00:58:23] there than other social networks.

[00:58:25] So without the audience,

[00:58:26] they are truly nothing.

[00:58:28] Reddit doesn't add anything.

[00:58:29] The audience is everything.

[00:58:30] It enables them.

[00:58:31] Totally.

[00:58:32] Fine.

[00:58:33] Yeah.

[00:58:34] I don't want to start

[00:58:34] a revolt here.

[00:58:37] No need to start a revolt.

[00:58:39] There's plenty of that

[00:58:39] going on already

[00:58:40] in this world.

[00:58:42] Anyways,

[00:58:43] we have reached the end

[00:58:44] of this episode

[00:58:45] of AI Inside

[00:58:46] and it was a heck

[00:58:47] of a lot of fun.

[00:58:48] Thank you, Jeff.

[00:58:49] Thank you again

[00:58:49] for being flexible

[00:58:50] and moving a day.

[00:58:52] I really appreciate it.

[00:58:53] So does my cold

[00:58:55] now that I'm feeling

[00:58:55] a little better.

[00:58:57] Jeff,

[00:58:57] what do you want to plug?

[00:58:59] I'm pretty sure

[00:58:59] I know what you would like

[00:59:00] to plug.

[00:59:00] Just jeffjarvis.com.

[00:59:01] Three books out now.

[00:59:02] When I was in San Francisco,

[00:59:03] I went to City Lights Books

[00:59:04] and they had all three.

[00:59:05] The only place

[00:59:05] had all three

[00:59:06] of my current books out.

[00:59:07] So I autographed them all

[00:59:08] and I was filled with happiness.

[00:59:10] That's awesome.

[00:59:11] If you're in San Francisco,

[00:59:12] you can go and get one of them

[00:59:13] and then they'll buy another one

[00:59:14] which is what happens

[00:59:15] which just makes me very happy.

[00:59:17] That's amazing.

[00:59:18] Yeah,

[00:59:19] when you go into a bookstore

[00:59:20] and you find your books,

[00:59:21] is that,

[00:59:22] I guess that's probably

[00:59:23] something that authors do

[00:59:24] is pull it off

[00:59:25] and sign them.

[00:59:25] They do indeed,

[00:59:26] but it's rare.

[00:59:28] Unfortunately,

[00:59:28] it's rare.

[00:59:29] I went with a big publisher

[00:59:29] this time

[00:59:30] and I think it's in

[00:59:31] two Barnes & Nobles

[00:59:32] in New Jersey.

[00:59:33] So that's book publishing

[00:59:35] these days.

[00:59:36] Yeah.

[00:59:37] Well,

[00:59:38] it can be found

[00:59:39] in San Francisco at least

[00:59:40] and occasionally

[00:59:41] you are touring

[00:59:43] to support it,

[00:59:44] the worldwide

[00:59:45] Jeff Jarvis tour.

[00:59:47] Excellent.

[00:59:48] Well,

[00:59:48] it was great to see you

[00:59:49] and really great

[00:59:51] to get my book.

[00:59:51] I got my book signed too

[00:59:52] by the way.

[00:59:53] Thank you for signing.

[00:59:53] Thank you so much.

[00:59:55] Yeah.

[00:59:56] Everything you need

[00:59:56] to know about this show

[00:59:57] can be found

[00:59:58] at our website

[01:00:00] that is

[01:00:02] AIinside.show

[01:00:03] So if you go there,

[01:00:05] well,

[01:00:05] you'll find,

[01:00:06] actually,

[01:00:07] if you go there

[01:00:07] and go all the way

[01:00:08] down to the bottom,

[01:00:09] you'll find a photo

[01:00:10] from said book signing.

[01:00:11] There it is.

[01:00:12] Ah, yes.

[01:00:13] Showing we're both tall.

[01:00:14] That's right.

[01:00:15] We're both tall

[01:00:16] and we've both existed

[01:00:17] in the same place

[01:00:18] at the same time,

[01:00:19] not entirely virtually

[01:00:21] before.

[01:00:21] So there you are.

[01:00:23] Go to AIinside.show.

[01:00:24] You can leave us feedback there. We're considering reading some feedback on the show from time to time, so if you want to, you can go there. I think contact@AIinside.show works as well if you want to send us feedback or, you know, some information about something you messed around with featuring AI, whatever.

[01:00:43] Send us some feedback. We might play it, or we might read it out on the show, is what I'm saying. Do subscribe while you're there.

[01:00:49] Also, go to Patreon.com/AIInsideShow and you can support us directly. We have a bunch of people who support us on a regular basis. They also get special perks, you know: the show with no ads, Discord hangouts. You get an AI Inside t-shirt if you become an executive producer, and we have only a few executive producers, but we appreciate them nonetheless.

[01:01:15] Dr. Do, Jeffrey Maricini, WPVM 103.7 in Asheville, North Carolina, Paul Lang, and Ryan Newell.

[01:01:23] Thank you all so very much for continuing to stick by us and to support the show. Really couldn't do it without you, and yeah, I think that's about it. We'll see you next time on another episode of AI Inside.

[01:01:35] Actually, next episode is our last live episode, and then we've got two holiday episodes already in the can for Christmas and New Year's week. So those will be pre-recorded, but we have a pretty awesome guest lined up for next week at this point anyway. I think you're going to like it. I'm not going to spoil it, just in case things change, but look for that.

[01:01:55] Anyways, thank you again, and we'll see you next time on AI Inside.

[01:01:58] Bye, everybody.

[01:01:59] Bye, y'all.