The Intelligence Age or The Hype Age?
September 25, 2024 · 1:00:41

[00:00:00] [SPEAKER_01]: This is AI Inside, Episode 36 recorded Wednesday, September 25, 2024, the Intelligence Age or The Hype Age.

[00:00:11] [SPEAKER_01]: This episode of AI Inside is made possible by our wonderful patrons at patreon.com slash AI Inside Show.

[00:00:18] [SPEAKER_01]: If you like what you hear, head on over and support us directly, and thank you for making independent podcasting possible.

[00:00:30] [SPEAKER_01]: What's going on, everybody? Welcome to another episode of AI Inside.

[00:00:33] [SPEAKER_01]: The show where we take a look at the AI that is layered like a lasagna.

[00:00:39] [SPEAKER_01]: I like the lasagna reference. We'll stick with that.

[00:00:42] [SPEAKER_01]: Jason Howell, one of your hosts, joined as always by my other host, Jeff Jarvis.

[00:00:47] [SPEAKER_01]: How you doing Jeff?

[00:00:47] [SPEAKER_01]: Hey boss, how are you? Doing all right.

[00:00:50] [SPEAKER_01]: Had a few pre-show kind of setup issues with little old ChatGPT,

[00:00:56] [SPEAKER_01]: which we can talk about in a little bit here, but I've already decided that the ChatGPT guy is annoying.

[00:01:03] [SPEAKER_00]: Yes, it tries so hard to be nice, so hard. He's like that salesperson who will not let you loose in the store.

[00:01:10] [SPEAKER_00]: You just want to say leave me alone.

[00:01:12] [SPEAKER_00]: Oh yes sir, whatever you'd like.

[00:01:13] [SPEAKER_00]: Happy to do sir.

[00:01:15] [SPEAKER_00]: It's just a little too subservient.

[00:01:18] [SPEAKER_00]: Jason is trying to set up the sound for a demo he's going to do.

[00:01:21] [SPEAKER_00]: And he asked it to talk, and it kept on wanting to talk with him and chat with him and say things.

[00:01:27] [SPEAKER_01]: He's annoying.

[00:01:28] [SPEAKER_01]: Which is kind of what it's designed to do, but at the same time it can be a little annoying as well.

[00:01:35] [SPEAKER_01]: Yeah.

[00:01:35] [SPEAKER_01]: Yeah, it's like, yeah.

[00:01:36] [SPEAKER_01]: I think there's a ways to go there to make it still feel even more natural.

[00:01:41] [SPEAKER_01]: You know, two steps forward, two steps back seems to be what we're getting, the way we see it.

[00:01:45] [SPEAKER_01]: We'll show you.

[00:01:47] [SPEAKER_01]: Yeah, we'll show you that before we get started.

[00:01:49] [SPEAKER_01]: Big thank you to those of you who support us on Patreon.

[00:01:53] [SPEAKER_01]: That's patreon.com slash AI inside show.

[00:01:56] [SPEAKER_01]: I want to thank Chris Pichtor for supporting us from the beginning I believe from the very beginning.

[00:02:03] [SPEAKER_01]: We still have so many people who signed up at the very beginning on Patreon that we're getting through.

[00:02:08] [SPEAKER_01]: So I'm really trying to hop around so that, you know, if you joined in the last

[00:02:12] [SPEAKER_01]: couple of months, you aren't waiting literal months until you hear your name.

[00:02:16] [SPEAKER_01]: So anyways, we'll get to you.

[00:02:17] [SPEAKER_01]: But Chris, thank you so much for your support.

[00:02:19] [SPEAKER_01]: And everyone else who supports us on Patreon:

[00:02:21] [SPEAKER_01]: We couldn't do it without you.

[00:02:22] [SPEAKER_01]: Also if you happen to be watching live right now, there are a lot of you watching live.

[00:02:27] [SPEAKER_01]: We do record this show live and you can, you know, catch it through the Techsploder YouTube channel, and

[00:02:33] [SPEAKER_01]: Jeff Jarvis rebroadcasts it on his Twitter/X account, and all sorts of things.

[00:02:38] [SPEAKER_01]: But I would encourage you to go to our website, aiinside.show, and subscribe to the podcast.

[00:02:44] [SPEAKER_01]: Just in case you miss the live stream.

[00:02:46] [SPEAKER_01]: You can still catch the podcast after the fact we are here.

[00:02:49] [SPEAKER_00]: So we can tell the advertisers we want to get that you subscribe; that's what they pay attention to.

[00:02:54] [SPEAKER_01]: So that's totally true.

[00:02:56] [SPEAKER_01]: That is absolutely true.

[00:02:58] [SPEAKER_01]: You know, all the different directions matter.

[00:03:00] [SPEAKER_01]: Probably the one that matters most though is the podcast feed.

[00:03:03] [SPEAKER_01]: That's the one where we can really present it to them and say, hey, see?

[00:03:07] [SPEAKER_01]: Check it out.

[00:03:08] [SPEAKER_01]: We got lots of people.

[00:03:10] [SPEAKER_01]: So we appreciate you doing that.

[00:03:11] [SPEAKER_01]: Appreciate you being here.

[00:03:13] [SPEAKER_01]: Let's start off with OpenAI, which, as we kind of alluded to, began rolling out its Advanced Voice Mode.

[00:03:20] [SPEAKER_01]: This is of course the model that you know had much attention and then much blowback around the fact that it had a voice that sounded

[00:03:29] [SPEAKER_01]: eerily similar to Scarlett Johansson.

[00:03:33] [SPEAKER_01]: And you won't find that voice in this update.

[00:03:36] [SPEAKER_01]: But if you are a Plus or a Team user, you're going to get access to this, probably right now.

[00:03:43] [SPEAKER_01]: You probably already have access to it.

[00:03:44] [SPEAKER_01]: But if not immediately, you're going to get it in stages over the course of the next week.

[00:03:47] [SPEAKER_01]: You'll see an in-app notification.

[00:03:51] [SPEAKER_01]: I did not see that in-app notification.

[00:03:54] [SPEAKER_01]: But when I went to it, let's see here if I can show my screen here real quick.

[00:04:00] [SPEAKER_01]: When I went to the app itself, I just found this little button down on the corner, this little kind of speaking button or waveform button.

[00:04:10] [SPEAKER_01]: I don't know if you can see that very well.

[00:04:12] [SPEAKER_01]: So I didn't get a notification, but it is appearing there.

[00:04:15] [SPEAKER_01]: That's how you essentially kind of trigger that on and play around with it.

[00:04:21] [SPEAKER_01]: And they did kind of include a couple of new features which we can play around with here in a second.

[00:04:28] [SPEAKER_01]: But they revamped the design, integrated custom instructions and memory to make for more personalized interactions,

[00:04:36] [SPEAKER_01]: kind of improved conversational recall.

[00:04:40] [SPEAKER_01]: So that's important: better accent recognition, conversational support for more than 50 languages.

[00:04:47] [SPEAKER_01]: That's super cool to be able to be talking to one of these voice AIs.

[00:04:53] [SPEAKER_01]: And then say, all right, now say that in Mandarin, or whatever.

[00:04:56] [SPEAKER_01]: And it just boom, jumps right into it.

[00:05:00] [SPEAKER_01]: I don't know, that's kind of super cool.

[00:05:01] [SPEAKER_00]: I haven't tried it quite yet... So I use Google and I use Meta and I use others.

[00:05:08] [SPEAKER_00]: When it comes to ChatGPT, was there an audio voice interface at all before?

[00:05:15] [SPEAKER_01]: My understanding, because I'm very new to ChatGPT, the actual app...

[00:05:18] [SPEAKER_01]: I've only just in the last couple of days signed up for it, specifically because, well, it's not like I don't use ChatGPT.

[00:05:24] [SPEAKER_01]: I use it through perplexity, but I figured I should probably know how to, you know, know more about the app itself.

[00:05:32] [SPEAKER_01]: So I've installed that.

[00:05:34] [SPEAKER_01]: I know that some of the voice interactions were there and some sort of a beta format, but this like conversational thing was beta before.

[00:05:43] [SPEAKER_01]: And now it is just rolled out, as far as I know.

[00:05:45] [SPEAKER_01]: That's really good.

[00:05:46] [SPEAKER_01]: That's a different space.

[00:05:47] [SPEAKER_01]: Yeah, it's similar to what Google is doing with Gemini Live.

[00:05:49] [SPEAKER_01]: Now that's rolled out as a free service for all members.

[00:05:55] [SPEAKER_01]: Here, it's still a premium feature, but yeah, everybody has access to it now essentially.

[00:06:02] [SPEAKER_01]: Five new voices.

[00:06:03] [SPEAKER_01]: So it was I think four before.

[00:06:06] [SPEAKER_01]: Actually, it was five to begin with, and then they got rid of Sky, which was the Scarlett Johansson-like voice.

[00:06:11] [SPEAKER_01]: And then now they've added a number of new voices.

[00:06:14] [SPEAKER_01]: Although it was really weird when I was setting it up, I could not change the voice.

[00:06:17] [SPEAKER_01]: It wouldn't allow me to change the voice.

[00:06:19] [SPEAKER_01]: So the one that I had, I was stuck with, and I don't know if it was an app update issue or what was going on.

[00:06:25] [SPEAKER_00]: So you have the annoying guy.

[00:06:26] [SPEAKER_00]: It's all you have.

[00:06:27] [SPEAKER_01]: Yes.

[00:06:28] [SPEAKER_01]: Yes.

[00:06:28] [SPEAKER_01]: The voice that we were kind of playing around with in pre-show.

[00:06:31] [SPEAKER_01]: We could test it out in a second.

[00:06:32] [SPEAKER_01]: I do also want to point out before we can play around with it a little bit.

[00:06:36] [SPEAKER_01]: It's missing the video and screen sharing component.

[00:06:40] [SPEAKER_01]: And there's no timeline yet on the arrival of that.

[00:06:43] [SPEAKER_01]: But if you're working on a math problem and you wanted to point that particular feature at the math problem with the camera,

[00:06:50] [SPEAKER_01]: that's not a feature that you can actually use yet.

[00:06:53] [SPEAKER_01]: So that's coming somewhere down the line.

[00:06:56] [SPEAKER_01]: But you know, I thought I would love to show the overhead shot and show my interaction with it.

[00:07:02] [SPEAKER_01]: There really isn't a whole lot to see.

[00:07:03] [SPEAKER_01]: The problem is my phone apparently isn't loud enough to come through the microphone for you to

[00:07:09] [SPEAKER_01]: actually hear it.

[00:07:12] [SPEAKER_00]: So all you really see is the pulsating voice.

[00:07:15] [SPEAKER_00]: Yeah, absolutely.

[00:07:16] [SPEAKER_01]: You don't need to do much to jump into the mode.

[00:07:19] [SPEAKER_01]: That's really just it.

[00:07:21] [SPEAKER_01]: So instead I can just go, oh yeah, and then you can see there's the kind of colorful avatar of the voice that I'm interacting with.

[00:07:30] [SPEAKER_01]: Who's listening to me right now?

[00:07:32] [SPEAKER_01]: Hello, ChatGPT.

[00:07:33] [SPEAKER_01]: How are you doing today?

[00:07:36] [SPEAKER_03]: Hey there.

[00:07:37] [SPEAKER_03]: I'm doing great.

[00:07:38] [SPEAKER_03]: Thanks for asking.

[00:07:39] [SPEAKER_03]: How about you?

[00:07:40] [SPEAKER_01]: I'm doing okay, but what I really want to know is can you sing me a song?

[00:07:46] [SPEAKER_01]: Go ahead.

[00:07:46] [SPEAKER_01]: Sing me a song, ChatGPT.

[00:07:50] [SPEAKER_03]: I can't actually sing or produce any music.

[00:07:53] [SPEAKER_03]: Is there anything else I can help you with?

[00:07:54] [SPEAKER_01]: I swear I saw you sing in one of the demos.

[00:07:58] [SPEAKER_01]: I don't know why you can't do it now.

[00:07:59] [SPEAKER_01]: Do you know when you're going to be able to sing or can you just do it for me please?

[00:08:04] [SPEAKER_03]: I get why you'd want me to.

[00:08:06] [SPEAKER_03]: But singing is just not in my repertoire.

[00:08:09] [SPEAKER_03]: Even if you heard something like that in a demo, it's not something I can actually do.

[00:08:15] [SPEAKER_01]: Well, that's baloney.

[00:08:15] [SPEAKER_01]: I saw you do it before.

[00:08:16] [SPEAKER_01]: But anyways, okay.

[00:08:17] [SPEAKER_01]: Can you perform Shakespeare?

[00:08:20] [SPEAKER_01]: Can you perform Shakespeare for us?

[00:08:23] [SPEAKER_01]: Please and do it in a sad voice?

[00:08:26] [SPEAKER_01]: Sound really sad as you're performing Shakespeare.

[00:08:34] [SPEAKER_03]: To be or not to be, that is the question.

[00:08:39] [SPEAKER_03]: Whether 'tis nobler in the mind to suffer.

[00:08:41] [SPEAKER_01]: Okay, that's okay.

[00:08:43] [SPEAKER_01]: But I want really sad like weeping kind of sadness Shakespeare.

[00:08:48] [SPEAKER_00]: I want Hamlet holding the skull, and he really feels it.

[00:08:53] [SPEAKER_03]: To be or not to be.

[00:08:56] [SPEAKER_03]: That is the question.

[00:08:58] [SPEAKER_03]: Whether 'tis nobler in the mind to suffer.

[00:09:01] [SPEAKER_01]: Okay, I will give you that.

[00:09:03] [SPEAKER_01]: You were able to do the weeping Shakespeare recital pretty well.

[00:09:07] [SPEAKER_01]: Can you is it possible for you to speak like Darth Vader?

[00:09:11] [SPEAKER_01]: Is that even a possibility?

[00:09:15] [SPEAKER_03]: I can't do specific voice impressions, including Darth Vader.

[00:09:19] [SPEAKER_03]: But I can definitely channel some of his vibe in our conversation if you'd like.

[00:09:24] [SPEAKER_01]: Yeah, so okay, well, what were you saying, Hamlet?

[00:09:27] [SPEAKER_01]: Recite something from Hamlet in a voice that is similar to Darth Vader.

[00:09:34] [SPEAKER_01]: Kind of like what you were alluding to, with an attitude.

[00:09:37] Yeah.

[00:09:39] [SPEAKER_03]: All right, channeling that vibe to be or not to be.

[00:09:44] [SPEAKER_03]: That is the question.

[00:09:46] [SPEAKER_03]: Whether 'tis nobler in the mind.

[00:09:49] [SPEAKER_01]: Okay, all right.

[00:09:50] [SPEAKER_01]: Not bad.

[00:09:51] [SPEAKER_01]: I think I'm satisfied.

[00:09:53] [SPEAKER_01]: Okay, so I'm going to go ahead and stop this because that's interesting and everything.

[00:09:57] [SPEAKER_01]: But again, I guess the question that I have is that's a neat parlor trick.

[00:10:02] [SPEAKER_01]: It's really neat thing you can do that.

[00:10:03] [SPEAKER_00]: Yep, yep.

[00:10:05] [SPEAKER_00]: But what is it that we're saying?

[00:10:07] [SPEAKER_01]: I'm going to pull that out.

[00:10:08] [SPEAKER_00]: Right, I'm thinking of what I want it to do, and I have a problem.

[00:10:11] [SPEAKER_00]: I don't know if anybody who's watching on YouTube has any ideas for what you want us to have it do.

[00:10:16] [SPEAKER_00]: But I, I don't know.

[00:10:19] [SPEAKER_00]: It's a solution in search of a problem.

[00:10:23] [SPEAKER_01]: It's a neat technical kind of demonstration, technological demonstration.

[00:10:28] [SPEAKER_01]: I can see maybe where it's going.

[00:10:30] [SPEAKER_01]: But yeah, I kind of come up empty with, like, okay, here's what I'm really going to use this for.

[00:10:36] [SPEAKER_00]: Let me try this now, because I can ask it an explanatory question, and when I see it in text, it tends to put things in bullets.

[00:10:44] [SPEAKER_00]: So maybe ask it to explain what a large language model is.

[00:10:50] [SPEAKER_01]: Okay, ChatGPT.

[00:10:51] [SPEAKER_01]: I have a, I have a big question for you.

[00:10:53] [SPEAKER_01]: I hope you're listening.

[00:10:54] [SPEAKER_01]: I want you to explain for me what a large language model is and explain it to me like I'm seven years old.

[00:11:06] [SPEAKER_03]: Imagine a huge library, but instead of books,

[00:11:09] [SPEAKER_03]: It's filled with lots of words and sentences from all over the world.

[00:11:14] [SPEAKER_03]: A large language model is like a super smart librarian who has read every single thing in that library.

[00:11:21] [SPEAKER_00]: Wasn't we bought it or not?

[00:11:22] [SPEAKER_03]: When you ask a question or say something to it,

[00:11:25] [SPEAKER_03]: it uses what it knows from all those words to come up with a really good answer or continue the conversation with you.
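The "librarian" analogy above boils down to next-word prediction from patterns in text. As an illustrative toy only, here's a minimal sketch in Python: a bigram counter over a made-up three-sentence "library" (the corpus, function name, and sampling scheme are all invented for illustration; real LLMs are neural networks trained on vastly larger data, not word-pair counts).

```python
import random
from collections import defaultdict

# A made-up tiny "library" of text, split into words.
library = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Count which words follow which, like a librarian who has read everything.
follows = defaultdict(list)
for prev, nxt in zip(library, library[1:]):
    follows[prev].append(nxt)

def continue_text(word, length=5, seed=0):
    """Continue a prompt one word at a time, sampling from observed next words."""
    rng = random.Random(seed)  # fixed seed so the toy is repeatable
    out = [word]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:  # dead end: this word was never followed by anything
            break
        out.append(rng.choice(options))
    return " ".join(out)

print(continue_text("the"))
```

Every word the sketch emits was actually observed following the previous word in the corpus, which is the (heavily simplified) sense in which the model "comes up with" a continuation rather than retrieving a stored answer.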

[00:11:33] [SPEAKER_01]: Okay, thank you.

[00:11:33] [SPEAKER_01]: Do you think that...

[00:11:35] [SPEAKER_01]: Should I ask it?

[00:11:35] [SPEAKER_01]: Do you think, yeah.

[00:11:38] You're welcome.

[00:11:39] [SPEAKER_01]: Oh, sorry, sorry.

[00:11:40] [SPEAKER_01]: Okay, so what were you going to say?

[00:11:42] [SPEAKER_00]: Ask it whether it thinks.

[00:11:45] [SPEAKER_01]: Oh, do you think?

[00:11:46] [SPEAKER_01]: Oh, okay.

[00:11:47] [SPEAKER_01]: I thought this was like a follow-up question.

[00:11:49] [SPEAKER_00]: Oh no.

[00:11:49] [SPEAKER_00]: Do you think? No, does it think?

[00:11:51] [SPEAKER_01]: Does it think?

[00:11:51] [SPEAKER_01]: So, ChatGPT, do you think? Like, are you thinking right now? Is that part of your capability? Can you think?

[00:12:02] [SPEAKER_03]: I don't actually think or have feelings like a person does.

[00:12:06] [SPEAKER_03]: I process information and generate responses based on patterns in data.

[00:12:10] [SPEAKER_03]: It's more like following a complex set of instructions than actual thinking.

[00:12:16] [SPEAKER_01]: All right, now.

[00:12:17] [SPEAKER_01]: Okay, that's not a bad answer.

[00:12:18] [SPEAKER_01]: I just wanted it to respond to you.

[00:12:19] [SPEAKER_01]: Okay, so Nightmare in the chat room asks... this is Ozone Nightmare.

[00:12:24] [SPEAKER_01]: He's a good friend of mine in the world of the internet.

[00:12:27] [SPEAKER_01]: He says what does reasoning really mean to you?

[00:12:30] [SPEAKER_01]: What does reasoning mean to you, as ChatGPT?

[00:12:36] [SPEAKER_03]: To me, reasoning means using patterns and information I've been trained on to provide answers or solve problems.

[00:12:43] [SPEAKER_03]: It's about analyzing the context of your question, drawing from vast amounts of data, and then constructing a response that fits logically.

[00:12:53] [SPEAKER_03]: It's not true reasoning like humans do.

[00:12:55] [SPEAKER_01]: All right, all right.

[00:12:56] [SPEAKER_01]: All right.

[00:12:56] [SPEAKER_01]: Thank you very much.

[00:12:58] [SPEAKER_01]: I'm going to go ahead and interrupt you.

[00:13:00] [SPEAKER_01]: And then it gives you, it does give you, by the way, kind of the, you know...

[00:13:05] [SPEAKER_00]: Oh, so the transcript is there.

[00:13:07] [SPEAKER_01]: So any of the information that you got in that conversation, you can refer to later.

[00:13:11] [SPEAKER_01]: It's not just kind of lost in the in the ether.

[00:13:14] [SPEAKER_01]: So yeah, so there you go.

[00:13:17] [SPEAKER_01]: And it can be, you know... it can't do a perfect Darth Vader, but it can talk.

[00:13:21] [SPEAKER_00]: Okay, it actually wasn't bad.

[00:13:23] [SPEAKER_00]: It did more than you thought it would.

[00:13:23] [SPEAKER_00]: The voice kind of changed there.

[00:13:26] [SPEAKER_00]: Uh, it stayed a little deeper.

[00:13:29] [SPEAKER_00]: Yes, it did.

[00:13:31] [SPEAKER_01]: Interesting stuff.

[00:13:33] [SPEAKER_01]: Yeah.

[00:13:34] [SPEAKER_01]: Yeah.

[00:13:34] [SPEAKER_01]: So percolate on that, and see if maybe over the course of the next, you know,

[00:13:39] [SPEAKER_01]: little while, you encounter a situation where you're like, that would be the perfect tool.

[00:13:43] [SPEAKER_01]: I think that's the real challenge that I'm going to have with it: thinking of those uses.

[00:13:48] [SPEAKER_00]: I've got to follow you, and I'm going to subscribe to it, because after all we have the show and I should do it.

[00:13:54] [SPEAKER_00]: Eat the dog food. But I find myself using Meta AI more than anything else, to go say,

[00:14:00] [SPEAKER_00]: give me a word I can't quite summon.

[00:14:02] [SPEAKER_00]: It's an interactive thesaurus.

[00:14:05] [SPEAKER_00]: And I find it really valuable, because I'm writing and I lose a word, because of a lot of things.

[00:14:09] [SPEAKER_00]: I don't know what I'm going to do.

[00:14:10] [SPEAKER_00]: Um, that's valuable. Is it worth $1 a month?

[00:14:15] [SPEAKER_00]: Is it worth going to it when I remember it can do it?

[00:14:17] [SPEAKER_00]: Yes.

[00:14:18] [SPEAKER_00]: So I now have a habit of doing that.

[00:14:20] [SPEAKER_00]: It's the first kind of wedge in on actually using this stuff.

[00:14:26] Yeah.

[00:14:27] [SPEAKER_01]: Yeah.

[00:14:28] [SPEAKER_01]: It's not much.

[00:14:30] [SPEAKER_01]: It's not much, but at least my experience has been the more I find those, those pieces, those

[00:14:39] [SPEAKER_01]: opportunities where I'm like, oh okay actually it is really good at that.

[00:14:43] [SPEAKER_01]: The more it kind of gets me in the habit, and of course,

[00:14:46] [SPEAKER_01]: you know, they're hoping for this,

[00:14:48] [SPEAKER_01]: that I get into the habit of thinking about it for other solutions as well.

[00:14:52] [SPEAKER_01]: It's kind of like it opens the door a little bit and then it opens the door a little bit more.

[00:14:56] [SPEAKER_01]: And now, yeah, like we've talked about many times, if I went to my recent history,

[00:15:00] [SPEAKER_01]: every day has many multiples of queries, because I use it more than I use Google, if I'm completely honest.

[00:15:09] [SPEAKER_01]: I often go there to kind of brainstorm with it. It's a really great brainstorming tool, and I do a lot of that these days.

[00:15:17] [SPEAKER_01]: So that's probably part of the reason why.

[00:15:20] [SPEAKER_01]: So there we go. You know, I hope you all are playing around with it, and if you have any thoughts on, you know, on that,

[00:15:28] [SPEAKER_01]: contact at aiinside.show, you can send us an email, or find us on socials.

[00:15:34] [SPEAKER_01]: And let us know what you think; maybe we'll read it on the show.

[00:15:38] [SPEAKER_00]: So, I'm just going to change what we were going to do for one second.

[00:15:40] [SPEAKER_00]: Yeah, so I started reading the book.

[00:15:43] [SPEAKER_00]: Well, The Cult of Creativity, because, as I mentioned on the show,

[00:15:46] [SPEAKER_00]: I'm going to be teaching a course, I hope, in AI creativity.

[00:15:50] [SPEAKER_00]: One of the chapters, and what it argues, is that this notion of creativity, and the value we put on creativity, is fairly recent.

[00:15:58] [SPEAKER_00]: And it has an interesting chart of the usage of the word creativity.

[00:16:04] [SPEAKER_00]: But the reason I'm bringing it up right now in this context is it has a chapter, which I just started reading, on the birth of brainstorming.

[00:16:12] [SPEAKER_00]: Brainstorming was not a thing that just happened.

[00:16:15] [SPEAKER_00]: It's kind of like human-centered design.

[00:16:17] [SPEAKER_00]: It was this thing that came out of advertising and that there was a value in throwing out ideas.

[00:16:23] [SPEAKER_00]: It's having lots of ideas and not dismissing any of them because they have value.

[00:16:29] [SPEAKER_00]: Yeah right.

[00:16:30] [SPEAKER_00]: And so we're taking that schtick, it's a schtick, and now putting it on the machine, and it is good at it, because if you want a whole list of ideas...

[00:16:38] [SPEAKER_00]: Yeah it can do that.

[00:16:41] [SPEAKER_00]: But brainstorming is something that came from a cultural notion that we didn't really have a century ago,

[00:16:50] [SPEAKER_00]: which I just found interesting.

[00:16:51] [SPEAKER_01]: So that's fascinating.

[00:16:53] [SPEAKER_01]: Yeah I love thinking about things that we totally take for granted and recognizing that there was a beginning for that thing.

[00:17:02] [SPEAKER_01]: And, you know, like brainstorming, that's so fascinating; it makes sense that it would be a modern thing, probably.

[00:17:09] [SPEAKER_01]: And you know, but prior to that, so much was probably done based on pure instinct and momentary thought or belief.

[00:17:18] [SPEAKER_00]: Or the belief that you'd done research, you'd done your homework or you had some kind of scientific method or there was a logical push here.

[00:17:25] [SPEAKER_00]: And it just said no, be creative.

[00:17:28] [SPEAKER_00]: This all comes out of, well, it's all technology-based: it comes out of Sputnik, to a large extent; it's the fear that the Russians were ahead of us and our science was behind.

[00:17:38] [SPEAKER_00]: And so this whole notion of creativity wasn't focused on entertainment and our arts and writing and such.

[00:17:46] [SPEAKER_00]: It was focused on, do we have the ability to come up with more creative ideas to solve scientific and business problems?

[00:17:55] [SPEAKER_00]: So it's all kind of a little cycle here of how this affected our culture.

[00:17:59] [SPEAKER_00]: That's fascinating.

[00:18:00] [SPEAKER_01]: I think we'll pick up on that whole... speaking of creativity.

[00:18:04] [SPEAKER_01]: Jony Ive, many would argue, is an incredibly creative individual, having driven so much of the creative direction of products for Apple for so many years.

[00:18:16] [SPEAKER_01]: And he's been gone now, I think, for five years, probably a little bit more than five years, but he's been doing his own thing with a design firm that he created called LoveFrom.

[00:18:27] [SPEAKER_01]: And we've been hearing, we've definitely talked on this show about, the budding bromance between Jony and Sam Altman.

[00:18:39] [SPEAKER_01]: As they have been reportedly putting their minds together on creating an AI-driven hardware device of some sort.

[00:18:49] [SPEAKER_01]: And now we have an interview in The New York Times with Jony Ive, where he doesn't go into a whole lot of detail, but he definitely does confirm that the two are working to build a product.

[00:19:01] [SPEAKER_01]: They've raised money privately and could raise up to $1 billion by the end of the year.

[00:19:17] [SPEAKER_01]: And yeah, you know again, not really detailing a lot of, you know, really anything about this other than that it's in the works kind of confirming some of the reports that we've heard.

[00:19:29] [SPEAKER_01]: He did say, quote, for five years I haven't talked to anybody about what we're doing with the design firm.

[00:19:35] [SPEAKER_01]: I don't think that means that he's been working on this particular project for five years; in fact, I'm not at all certain of that.

[00:19:41] [SPEAKER_01]: But you know, very secretive, very behind the scenes and.

[00:19:47] [SPEAKER_01]: Oh, and you know, it's consumer.

[00:19:49] [SPEAKER_00]: So, consumer.

[00:19:51] [SPEAKER_00]: Sam Altman spoke with Ina Fried from Axios at a UN event; the UN General Assembly is here this week.

[00:19:59] [SPEAKER_00]: So there's lots of events going on around that, technology and AI and such.

[00:20:03] [SPEAKER_00]: So one thing that Altman says is, I don't think you should try to do a better phone.

[00:20:10] [SPEAKER_00]: So, data point: okay, it's not going to be a phone.

[00:20:13] [SPEAKER_00]: He also said it's a long way away.

[00:20:18] [SPEAKER_00]: He said noting that it took open AI more than four and a half years to ship its first product and I thought that was fast.

[00:20:26] [SPEAKER_00]: So whenever this magical thing arrives, the problem is they're building it up, so whenever it comes out...

[00:20:32] [SPEAKER_01]: Yeah, it's going to be Valhalla for us. You know what, as I'm kind of thinking about this right now, it kind of gives me

[00:20:39] [SPEAKER_01]: Magic Leap vibes, where it's like, yeah, it has all of this pre-release, you know, reading of the tea leaves for years and years and years.

[00:20:48] [SPEAKER_01]: Oh my goodness, this is going to be the example of the thing that works.

[00:20:53] [SPEAKER_01]: And I'm not sure Magic Leap really delivered on that.

[00:20:56] [SPEAKER_01]: It's, you know, at a certain point it just becomes bigger than its ability to deliver on the promises, or the thoughts that people have about that thing, having had years and years.

[00:21:09] [SPEAKER_01]: And I think there's a build up around it about how great it's going to be and everything.

[00:21:13] [SPEAKER_01]: So I wonder if it would even live up to it, but you've got Jony Ive, who has an incredible, you know, track record.

[00:21:21] [SPEAKER_01]: You've got Altman, who, you know, runs really one of the most noteworthy, or at least best-known, AI companies right now with OpenAI.

[00:21:32] [SPEAKER_01]: So you've got the names. Is that it? Is that going to be enough?

[00:21:36] [SPEAKER_01]: Is there actually a product there that people are going to want?

[00:21:38] [SPEAKER_00]: Right. Right. Yeah. Are they solving a problem?

[00:21:41] [SPEAKER_00]: So Ive said, actually, to expect more cool things from this year.

[00:21:47] [SPEAKER_00]: So that's pretty much the whole hype: they got about ten news stories out of saying, essentially, nothing.

[00:21:52] [SPEAKER_01]: Right.

[00:21:53] [SPEAKER_01]: Oh boy. Yeah.

[00:21:55] [SPEAKER_01]: They know how to play the hype cycle.

[00:21:58] [SPEAKER_01]: And this next segment is kind of perfect for that.

[00:22:01] [SPEAKER_01]: You, you included a segment called Altman's Utopia, which I love.

[00:22:05] [SPEAKER_01]: There's a TechCrunch article that focuses on Altman's proclamations that AI will save the world.

[00:22:13] [SPEAKER_01]: This was from a recent post by him on his site, titled The Intelligence Age.

[00:22:21] [SPEAKER_01]: God mode, as the article puts it. There's just a number of pull quotes in the TechCrunch article, you know, taken from this post.

[00:22:31] [SPEAKER_01]: Many proclamations. I'm curious to know if any of those stood out for you.

[00:22:35] [SPEAKER_00]: Sarah Perez in TechCrunch did a really good job, I think.

[00:22:40] [SPEAKER_00]: She said that he shifted from founder mode to God mode, and then wanted to cut through the hype of all this.

[00:22:48] [SPEAKER_00]: So she, I think, did a good job. I'm just going to give you a few examples.

[00:22:51] [SPEAKER_00]: AI will help us solve hard problems. She decreed that to be believable.

[00:22:58] [SPEAKER_00]: Because you know, look at what it's done with drugs.

[00:23:00] [SPEAKER_00]: Look at what it's done with with the genome and with proteins.

[00:23:04] [SPEAKER_00]: So fine, believable. Check. We'll soon be able to work with AI that helps us accomplish much more than we ever could without AI.

[00:23:11] [SPEAKER_00]: She said: veering into hype, but probably okay.

[00:23:15] [SPEAKER_00]: Yeah, probably. Eventually, we could have a personal AI team full of virtual experts in different areas, working together to create almost anything we can imagine.

[00:23:28] [SPEAKER_00]: Our children will have virtual tutors.

[00:23:31] [SPEAKER_00]: Tutors. Believable, too.

[00:23:33] [SPEAKER_00]: Yeah. Totally. We can have shared prosperity to a degree that seems unimaginable today in the future everyone's lives can be better than anyone's life is now.

[00:23:45] [SPEAKER_01]: Okay.

[00:23:47] [SPEAKER_01]: Um, a lot of promise right there.

[00:23:49] [SPEAKER_00]: Yeah.

[00:23:50] [SPEAKER_00]: This may turn out to be the most consequential fact about all of history so far.

[00:23:56] [SPEAKER_00]: It is possible that we have super intelligence in a few thousand days.

[00:24:01] [SPEAKER_00]: It may take longer, but I'm confident we'll get there.

[00:24:04] [SPEAKER_00]: Hype with a capital H, Perez says.

[00:24:09] [SPEAKER_00]: AI is going to get better with scale.

[00:24:12] [SPEAKER_00]: She says believable, but that's, I think, the thing that we can keep arguing now: is bigger necessarily better with this?

[00:24:19] [SPEAKER_00]: Um, AI systems are going to get so good that they may help us make better next generation systems and make scientific progress across the board.

[00:24:29] [SPEAKER_00]: Hype, she says.

[00:24:31] [SPEAKER_00]: The dawn of the intelligence age: hype.

[00:24:35] [SPEAKER_00]: It will not be an entirely positive story, but the upside is tremendous. First part believable, second part hype.

[00:24:41] [SPEAKER_00]: Um, again, prosperity. This is their argument.

[00:24:44] [SPEAKER_00]: It's almost like a certain presidential candidate talking about, um, tariffs.

[00:24:49] [SPEAKER_00]: A defining characteristic of the intelligence age will be massive prosperity.

[00:24:56] [SPEAKER_00]: Um, many of the jobs we do today would have looked like trifling wastes of time to people a few hundred years ago, but nobody is looking back at the past wishing they were a lamplighter.

[00:25:08] [SPEAKER_00]: Um, I'd say, why throw shade at lamplighters?

[00:25:10] [SPEAKER_00]: Yeah, exactly.

[00:25:11] [SPEAKER_01]: Actually, it sounds like a pretty chill job.

[00:25:14] [SPEAKER_00]: Walking around every night.

[00:25:16] [SPEAKER_00]: Hear ye, hear ye.

[00:25:17] [SPEAKER_00]: You were out there through the night.

[00:25:18] [SPEAKER_00]: You were doing something.

[00:25:19] [SPEAKER_00]: Um, so they're getting carried away with themselves here.

[00:25:24] [SPEAKER_00]: And that's the thing that's been bothering me, Jason. I love doing the show.

[00:25:28] [SPEAKER_00]: I love following this stuff.

[00:25:30] [SPEAKER_00]: But I just have this constant nagging feeling that it's going to implode, not in a market way.

[00:25:38] [SPEAKER_00]: But because the hype is just so high and this AGI BS is just so constant for them.

[00:25:45] [SPEAKER_00]: I just don't feel like we get to reality.

[00:25:47] [SPEAKER_00]: We don't know what intelligence means.

[00:25:50] [SPEAKER_00]: We don't know what safety means.

[00:25:52] [SPEAKER_00]: Um, and Altman, himself not a technologist

[00:25:56] [SPEAKER_00]: uh, but a founder and an investor,

[00:25:58] [SPEAKER_00]: uh, just keeps pushing this so much that I think it's injurious to the field.

[00:26:03] [SPEAKER_00]: What was your reaction?

[00:26:04] [SPEAKER_01]: Yeah.

[00:26:04] [SPEAKER_01]: No, I mean, I think it makes sense, coming from someone like Altman, in the position that he's been in

[00:26:11] [SPEAKER_01]: and with the history that we know, for most of these comments to be uttered.

[00:26:16] [SPEAKER_01]: I think the question that comes up for me is like when it comes to technology and people creating technology,

[00:26:21] [SPEAKER_01]: taking AI out of the equation, are founders, the people who create that technology, usually as high on their own supply as it seems

[00:26:34] [SPEAKER_01]: AI founders are, you know what I mean?

[00:26:37] [SPEAKER_01]: I mean, I imagine that founders will always look at their product and think the greatest of it.

[00:26:45] [SPEAKER_01]: You know, it's their darling and they, they anticipate that they're doing what they do because they see a future and they see possibility.

[00:26:53] [SPEAKER_01]: But is it different with what we're seeing with AI compared to others?

[00:26:58] [SPEAKER_01]: And if it is, is that a symbol of the actual capability or potential of a technology like AI versus other type of technologies?

[00:27:08] [SPEAKER_00]: There's a constant reflex here, because I've been studying the history of technologies in the past.

[00:27:15] [SPEAKER_00]: One of my favorite examples, which I use in my book The Gutenberg Parenthesis, on sale now, is the telegraph.

[00:27:23] [SPEAKER_00]: And when the telegraph, when the first line was laid across the Atlantic and the queen sent a message to the president,

[00:27:33] [SPEAKER_00]: it, by the way, took I think 24 hours, because it kept on breaking, and soon the whole line broke.

[00:27:39] [SPEAKER_00]: But the hype: it's going to bring peace to mankind.

[00:27:43] [SPEAKER_00]: It's going to bring universal understanding, and on and on and on.

[00:27:47] [SPEAKER_00]: And the stories from the New York Times are mind-blowing. People got off work and paraded spontaneously downtown, let off balloons, with bands and cheering and fireworks, because a wire was connected.

[00:28:02] [SPEAKER_00]: So on the one hand you think... but meanwhile, in Britain they made fun of us for going bananas over just a wire, because that's the laconic Brits versus the insane Americans.

[00:28:16] [SPEAKER_00]: And there was also a Washington correspondent who asked whether we really need news faster than the 10 days it took by the mails.

[00:28:26] [SPEAKER_00]: What would the possible uses even be, right? So you saw that same kind of yin and yang in the hype.

[00:28:32] [SPEAKER_00]: And I was just reading some sermons for my Linotype book, a guy's sermon about the Crystal Palace, which was in New York.

[00:28:42] [SPEAKER_00]: It held all kinds of new inventions, like steam-powered presses, in the 1850s, and the hype, as you mentioned, Jason, sounds very similar: this is going to change mankind. And in a lot of ways it did, but not in the ways predicted.

[00:28:57] [SPEAKER_01]: Maybe that's the difference. It's easy to look at it and think of all the possible ways, like the reference to being on the cusp of the Information Age or something along those lines.

[00:29:09] [SPEAKER_01]: And I read somewhere, I can't remember if it was in the TechCrunch article, and I kind of wish I had written it down because I'm not going to remember exactly how they put it, but essentially they were saying maybe we're on the cusp of an entirely different age, where things collapse because of overuse and,

[00:29:26] [SPEAKER_01]: you know, over-reliance on not-so-abundant resources or whatever, and we just don't see it because we're in the middle of it.

[00:29:35] [SPEAKER_01]: You know, that sort of analysis is the only thing you get historically speaking, looking backwards, not when you're in the middle of it.

[00:29:42] [SPEAKER_01]: You don't actually know, and yet here comes the proclamation that this is the beginning of the new age. It doesn't work like that. You only see it historically, looking back.

[00:29:51] [SPEAKER_00]: So I think the TechCrunch analysis of Altman's blog post is a really good framework for discussion going forward, especially on the show.

[00:30:01] [SPEAKER_00]: It's what I think we'll be trying to do here, but much more explicitly: to separate out the hype from the realistic, the hype from the believable.

[00:30:08] [SPEAKER_00]: And so we can judge it accordingly.

[00:30:11] [SPEAKER_00]: I watched, well, I didn't go, but I watched an all-morning, four-hour conference of academics presenting papers from a new issue of AI and Ethics, an academic journal.

[00:30:24] [SPEAKER_00]: And there were some interesting papers in there. One I'll just mention offhand is by a computer scientist, an emeritus professor, who fell some years ago and is paralyzed from the neck down.

[00:30:40] [SPEAKER_00]: And he talked about how he uses AI in his everyday life and you can imagine the ways it's really important.

[00:30:47] [SPEAKER_00]: And his daughter, who helped him, is a PhD candidate, and she's blind.

[00:30:52] [SPEAKER_00]: And she talked about how she uses AI in everyday life.

[00:30:56] [SPEAKER_00]: And they find it legitimately very valuable and useful for getting information and doing things.

[00:31:03] [SPEAKER_00]: They find it sometimes frustrating when it doesn't answer the question right, you know.

[00:31:08] [SPEAKER_00]: But what what bothered them is the hype.

[00:31:12] [SPEAKER_00]: What bothered them was people acting as if it was going to solve all their problems, and suddenly they were going to be, air quotes, normal.

[00:31:19] [SPEAKER_00]: No, it's going to help them in small ways, and that's useful. Let's see it as that. Another paper at the conference looked at the bases of the hype, and I'll do this really quickly.

[00:31:29] [SPEAKER_00]: But it saw three or four key kinds of hype. The first is anthropomorphism, where we think it's going to know us.

[00:31:38] [SPEAKER_00]: The second is experts and exaggerated literacy: everybody thinks they're an expert in AI.

[00:31:44] [SPEAKER_00]: The third is FOMO: every company saying, oh my god, we've got to have it, we've got to have it, so they build it up and think they're doing brilliantly if it's in their company.

[00:31:54] [SPEAKER_00]: And then fourth is the over use of the term AI.

[00:32:00] [SPEAKER_00]: Oh yeah, a lot of things that we've had for years are now suddenly "AI," while on the other hand we kind of ignore some of the things that Google's been doing, for example, for a long time.

[00:32:09] [SPEAKER_00]: So that's the end of our hype segment here.

[00:32:12] [SPEAKER_00]: But I think there were some really interesting connections here in all of this, in how society and media are greeting AI.

[00:32:23] [SPEAKER_00]: Yeah 100%.

[00:32:23] [SPEAKER_01]: Cool stuff. I'm happy you added that in there, because it's a really great opportunity to kind of gut-check and get that perspective.

[00:32:33] [SPEAKER_01]: Before the break just real quick, as I'm not sure there's a whole lot to this story.

[00:32:36] [SPEAKER_01]: Not much beyond the headline, I mean, as I couldn't get past the paywall on the Wall Street Journal's article about it. It's Character.AI's co-founder Noam Shazeer, who Google basically bought back into the company after he left a number of years ago to start Character.AI.

[00:32:56] [SPEAKER_01]: And I believe the article just kind of has a source on the amount, which is $2.7 billion, which is what they paid to bring him back. Is it just him? He's got a team, right?

[00:33:11] [SPEAKER_00]: He was the author of the seminal paper that kicked all of this off.

[00:33:15] [SPEAKER_00]: And he quit Google in 2021 and started Character.AI. As it began to flounder, Google swooped in, says the Wall Street Journal, and wrote Character.AI a check for around $2.7 billion.

[00:33:30] [SPEAKER_00]: The official reason was to license Character.AI's technology, but the deal included another component: Shazeer agreed to come back to work at Google again.

[00:33:39] [SPEAKER_00]: I would like to be that valued in life.

[00:33:43] [SPEAKER_00]: And he got that value in the form of stock.

[00:33:46] [SPEAKER_00]: So he came back, with the software, with employees, and I think he's a really important person in this field.

[00:33:56] [SPEAKER_00]: But it's almost part of the hype too.

[00:33:58] [SPEAKER_00]: It's the financial hype, right?

[00:34:01] [SPEAKER_00]: Can anyone really be that valuable?

[00:34:04] [SPEAKER_01]: That is crazy value.

[00:34:05] [SPEAKER_00]: It really is.

[00:34:07] [SPEAKER_01]: That's crazy. That value of acquisition for talent and companies, man, it just creeps up every year, and we think we've seen the top of it, and then something big like this happens.

[00:34:20] [SPEAKER_01]: Oh well that's where we're at now.

[00:34:22] [SPEAKER_01]: Two point seven billion dollars to bring him back.

[00:34:27] [SPEAKER_01]: Which is, however you value his efforts and his contributions to everything.

[00:34:33] [SPEAKER_01]: No, he's really smart, and he created, he supported this burgeoning movement.

[00:34:41] [SPEAKER_01]: But boy that's a lot of money.

[00:34:43] [SPEAKER_00]: So real quickly, a little bit of the story: he joined Google in 2000 as one of the first few hundred employees.

[00:34:49] [SPEAKER_00]: His first major project was building a system to improve spelling correction.

[00:34:54] [SPEAKER_00]: And this is a great Google example.

[00:34:56] [SPEAKER_00]: Hearkens back to What Would Google Do; I wrote about this in that book.

[00:35:00] [SPEAKER_00]: The normal reflex for how you handle spelling is a dictionary.

[00:35:04] [SPEAKER_00]: But the Google thing to do was to realize that looking at how everyone misspells everything was the way to understand spelling and the way to fix it.
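
That data-driven idea can be sketched as a toy frequency-based corrector, in the spirit of what's described here (and of Peter Norvig's well-known toy spelling corrector), not Google's actual system. The tiny corpus below is a made-up stand-in for real query logs:

```python
from collections import Counter

# Toy corpus standing in for logs of how people actually write
# (assumption: a large sample, misspellings and all).
corpus = "the quick brown fox jumps over the lazy dog the the fox".split()
WORD_COUNTS = Counter(corpus)

def edits1(word):
    """All strings one edit (delete/transpose/replace/insert) away."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [L + R[1:] for L, R in splits if R]
    transposes = [L + R[1] + R[0] + R[2:] for L, R in splits if len(R) > 1]
    replaces = [L + c + R[1:] for L, R in splits if R for c in letters]
    inserts = [L + c + R for L, R in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def correct(word):
    """Pick the most frequently observed candidate spelling."""
    if word in WORD_COUNTS:
        return word
    candidates = [w for w in edits1(word) if w in WORD_COUNTS] or [word]
    return max(candidates, key=WORD_COUNTS.get)

print(correct("teh"))  # "the": the most common corpus word one edit away
```

The point is the one Jeff makes: no dictionary, just observed usage frequencies deciding what the writer probably meant.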

[00:35:11] [SPEAKER_00]: So he went to Eric Schmidt.

[00:35:13] [SPEAKER_00]: He asked him for access to thousands of computer chips.

[00:35:19] [SPEAKER_00]: "I'm going to solve general intelligence by the weekend,"

[00:35:21] [SPEAKER_00]: Shazeer told Schmidt,

[00:35:23] [SPEAKER_00]: The CEO recalled during a 2015 university talk.

[00:35:26] [SPEAKER_00]: The early effort failed, but this was the germ in his brain.

[00:35:33] [SPEAKER_00]: And so in 2017 he published a paper with seven other Google researchers called "Attention Is All You Need."

[00:35:39] [SPEAKER_00]: And that was the beginnings of generative AI.

[00:35:43] [SPEAKER_00]: So yeah, probably a valuable guy.

[00:35:46] [SPEAKER_01]: Yeah, a very valuable guy.

[00:35:47] [SPEAKER_01]: I'm sure Google's really,

[00:35:49] [SPEAKER_01]: really happy to have him back.

[00:35:50] [SPEAKER_01]: We'll see what that amounts to going forward in this, you know, new era with Gemini.

[00:35:57] [SPEAKER_01]: We're going to take a break, but quickly, before we do, I want to touch back on the conversation we had on ChatGPT, because I didn't realize,

[00:36:05] [SPEAKER_01]: but OS and Nightmare, thank you so much for the Super Thanks, had said this about that story:

[00:36:11] [SPEAKER_01]: I appreciate the fact that the software acknowledges the limits of its reasoning.

[00:36:15] [SPEAKER_01]: The distinction matters as human reasoning involves more than databases and patterns.

[00:36:20] [SPEAKER_01]: So I'm glad it notes those limits.

[00:36:24] [SPEAKER_01]: Yeah, yeah, it's a good good point.

[00:36:27] [SPEAKER_01]: I agree.

[00:36:28] [SPEAKER_01]: Yeah, so that it doesn't go the opposite direction.

[00:36:30] [SPEAKER_01]: It's not, I mean, full of itself. Exactly, exactly.

[00:36:35] [SPEAKER_01]: Kind of surprising that they wouldn't.

[00:36:37] [SPEAKER_01]: But, you know, I guess they have to support the theory that they're doing this smartly and, what is it, safely, for the people who are pushing back.

[00:36:49] [SPEAKER_01]: And then I also want to call out, is it Jay Sal 1993? Sorry, the text is really small on my screen. Says: really enjoying the show.

[00:36:56] [SPEAKER_01]: I've been happy with Gemini as my default assistant so far. It's been saving me time when researching, although I hope it's accurate.

[00:37:06] [SPEAKER_01]: Well, yeah, definitely check the accuracy, you always have to do that, but yeah, it can be a really wonderful research tool. I totally agree.

[00:37:15] [SPEAKER_00]: I've been using these tools for work, because when Gemini comes up at the top of search and summarizes some stuff, the links are right there.

[00:37:24] [SPEAKER_00]: And so you can check. It gives you a quick summary, and then it's easy to look at the links and make sure they say what it says.

[00:37:31] [SPEAKER_01]: Yeah, yeah I like that too.

[00:37:33] [SPEAKER_01]: For sure, so thank you for being here. Thank you for the super thanks and thank you everyone.

[00:37:38] [SPEAKER_01]: We're going to take quick break and then when we come back we're going to talk about quantum computing we're going to attempt to talk about quantum computing anyways.

[00:37:45] [SPEAKER_01]: That's coming up. All right, from AI to quantum computing. What could the two together result in? Well, Google Quantum AI computer scientists have made,

[00:38:01] [SPEAKER_01]: apparently, a pretty significant breakthrough. They developed a new quantum memory system that reduces error rates substantially, and this is important.

[00:38:13] [SPEAKER_01]: Mind you, I am no quantum computing expert here. I was just, again, doing some reading, doing some research, trying to understand this, and through that research realized this is important because so far quantum computers have been limited by high error rates.

[00:38:29] [SPEAKER_01]: So if you can reduce the error rates, a quantum computer can ultimately become far more powerful than what you get out of a traditional supercomputer.

[00:38:38] [SPEAKER_01]: And in this case, the system also proved to get better as it grew in size, so theoretically that would mean it's possibly a scalable solution to quantum power. And, you know, what does all this have to do with AI?
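
The "gets better as it grows" claim can be sketched with a toy error-correction model: below an error threshold, the logical error rate falls exponentially as the code grows; above it, growing makes things worse. The formula shape is the standard surface-code scaling, but every number below is an illustrative assumption, not Google's measured data:

```python
# Illustrative model of error-corrected quantum memory "below threshold":
# when the physical error rate p is below the threshold p_th, the logical
# error rate shrinks exponentially as the code distance d grows.

def logical_error_rate(p, p_th, d, a=0.1):
    """Rough surface-code scaling: eps_L ~ a * (p/p_th)^((d+1)/2)."""
    return a * (p / p_th) ** ((d + 1) // 2)

p, p_th = 0.001, 0.01          # physical error well below threshold
for d in (3, 5, 7):            # growing the code *reduces* logical error
    print(d, logical_error_rate(p, p_th, d))

# Above threshold the trend reverses: adding qubits makes things worse.
p_bad = 0.02
for d in (3, 5, 7):
    print(d, logical_error_rate(p_bad, p_th, d))
```

That reversal is why crossing the threshold, so that a bigger memory is a better memory, is treated as the breakthrough here.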

[00:38:56] [SPEAKER_01]: I think the question is: when you've got quantum computing to power quantum artificial intelligence, is that the next generation of AI? And what is that like? When I think of that, I think, oh, if

[00:39:13] [SPEAKER_01]: I don't know exactly how I feel about AI, but is this the thing that makes that more possible, whatever "AI" happens to be, putting that in, you know, a sea of air quotes?

[00:39:23] [SPEAKER_01]: I don't know. What do you think about this quantum stuff?

[00:39:25] [SPEAKER_00]: I'll wait for someone who actually knows quantum computing to weigh in here first. But you bring the two together, and what interested me about this, in my odd way of abstracting things,

[00:39:38] [SPEAKER_00]: Is that quantum computing as I understand it.

[00:39:41] [SPEAKER_00]: Works with.

[00:39:45] [SPEAKER_00]: approximation, close enough. It's not regular computer logic, where one plus one equals two and we know that's the answer.

[00:39:52] [SPEAKER_00]: So much is going on that it gets near to answers, but doesn't necessarily know whether that's the only answer.

[00:39:59] [SPEAKER_00]: Sound familiar.

[00:40:00] [SPEAKER_00]: Sounds familiar, right? Sounds like generative AI.

[00:40:04] [SPEAKER_00]: Totally totally.

[00:40:06] [SPEAKER_00]: And at first you want to condemn that: you're a computer, you should be able to get to the exact right answer all the time.

[00:40:13] [SPEAKER_00]: But our brains work the same way.

[00:40:16] [SPEAKER_00]: Right, we get close enough for jazz. We figure it out, we test it: does that make sense? And we go on. Memory is imperfect.

[00:40:24] [SPEAKER_01]: We're meant to remember moments, you know, from our past, and we recite them as if they happened. Meanwhile we get little details wrong, but it's okay, because the human brain is fallible like that.

[00:40:34] [SPEAKER_00]: Yeah, so it's part of the larger problem, where the ironic thing is we presume that the computer is going to be better at everything than us.

[00:40:43] [SPEAKER_00]: That's what the whole AGI argument is, right? No, it could be just as flawed as we are. But it is as flawed as we are.

[00:40:50] [SPEAKER_00]: And if that's the standard then it is flawed forever.

[00:40:54] [SPEAKER_00]: One of the people at the conference I watched the other day on AI hype said that they'd heard somebody else, I don't know who was saying it, saying that we should stop referring to it as intelligence.

[00:41:05] [SPEAKER_00]: And start referring to it as super math.

[00:41:10] [SPEAKER_00]: That sounds okay too except that it can't do math.

[00:41:17] [SPEAKER_00]: So it's really interesting to try to draw back and ask what are we expecting of it?

[00:41:23] [SPEAKER_00]: It being AI and quantum computing both. And when is close enough good enough?

[00:41:29] [SPEAKER_01]: Yeah, right. Where exactly is that line of good enough? That's a really great question.

[00:41:37] [SPEAKER_01]: By the way, the United Nations declared 2025 the International Year of Quantum Science, for whatever it's worth.

[00:41:44] [SPEAKER_01]: So maybe we are on the cusp of something.

[00:41:48] [SPEAKER_01]: Some are wondering if quantum computing is going to have its chat GPT moment sometime in the near future.

[00:41:55] [SPEAKER_01]: Are we going to be having it in our phones? Yeah, not anytime soon.

[00:42:00] [SPEAKER_01]: Maybe that's what Jony Ive has been talking about.

[00:42:03] [SPEAKER_01]: We're working on quantum computing for our phones with quantum AI.

[00:42:07] [SPEAKER_01]: Be interesting.

[00:42:10] [SPEAKER_01]: Maybe quantum AI would be needed to solve CAPTCHAs? Or no, not at all, because AI apparently can already solve CAPTCHAs 100 percent of the time.

[00:42:19] [SPEAKER_01]: CAPTCHAs, of course, are the tool online that we all love to hate, used to determine if a bot is on the other side of the screen as opposed to an actual human.

[00:42:32] [SPEAKER_01]: Sometimes those challenges can stump even humans and take 20 to 30 seconds to get into a site depending.

[00:42:39] [SPEAKER_01]: Now, according to a New Scientist article, a fine-tuned AI model called You Only Look Once, also known as YOLO,

[00:42:47] [SPEAKER_01]: learned to pass CAPTCHAs 100 percent of the time. But it's important to note that does not mean it could respond correctly 100 percent of the time.

[00:42:58] [SPEAKER_01]: It was able to recognize when it couldn't get one right and then pass on to another one that it could eventually get through.

[00:43:08] [SPEAKER_01]: So it could recognize when it didn't know the answer, pass, and get to one that it could solve, and get through 100 percent of the time.
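
The "skip when unsure" trick described here is just selective prediction: only answer a challenge when the model's confidence clears a threshold, otherwise request a fresh one. A minimal sketch, where the confidence scores are made-up stand-ins rather than the actual YOLO system's outputs:

```python
# Selective prediction over a batch of CAPTCHA challenges: attempt only
# the ones the model is confident about, skip (request a new challenge)
# otherwise. Confidence values here are illustrative stand-ins.

def solve_with_skips(challenges, threshold=0.9):
    """Return (attempted, skipped) counts for a batch of challenges."""
    attempted, skipped = 0, 0
    for ch in challenges:
        if ch["confidence"] >= threshold:
            attempted += 1   # confident: submit an answer
        else:
            skipped += 1     # unsure: click "get a new challenge" instead
    return attempted, skipped

batch = [{"confidence": c} for c in (0.95, 0.50, 0.99, 0.80)]
print(solve_with_skips(batch))  # (2, 2)
```

Since only the attempted challenges are ever graded, a model that knows when it doesn't know can show a perfect pass rate without being right about everything, which is exactly the caveat in the story.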

[00:43:16] [SPEAKER_00]: So this is a case where I think AI becomes our friend, if it makes CAPTCHAs absolutely worthless and they go away.

[00:43:25] [SPEAKER_01]: But then what are we going to use to decipher if there's a robot on the other side of the screen?

[00:43:29] [SPEAKER_00]: That's the interesting thing. It's going to be really hard to know real from computerized.

[00:43:37] [SPEAKER_01]: Yeah, yeah, more and more difficult that is true.

[00:43:40] [SPEAKER_01]: That's just the reality of where we're at right now.

[00:43:44] [SPEAKER_01]: And and I should also mention like it's it's kind of full circle.

[00:43:50] [SPEAKER_01]: CAPTCHAs have been used to train AI models in the first place.

[00:43:56] [SPEAKER_01]: There was, what was the article here,

[00:44:00] [SPEAKER_01]: a Medium post, that is just one example of how, you know, Google has been training its AI

[00:44:06] [SPEAKER_01]: with our help, using CAPTCHAs. So it's, you know, in some ways a snake eating its own tail, I guess.

[00:44:16] [SPEAKER_01]: So this is really interesting. All right, so a little video games.

[00:44:20] [SPEAKER_01]: I know I know video games aren't your your total cup of tea.

[00:46:24] [SPEAKER_01]: But I find this fascinating from a, wow, where are things headed kind of perspective. We've talked about this the last couple of weeks.

[00:44:33] [SPEAKER_01]: And this is a good visual representation of it. Electronic Arts chief strategy officer Mihir

[00:44:39] [SPEAKER_01]: Vaidya spoke at EA's investor day in New York City this week and showed off an "illustration," put that in air quotes because that's what they referred to it as, essentially a concept of AI for gameplay and design. Essentially what it shows is using your voice while you're playing a game to build and change it.

[00:45:04] [SPEAKER_01]: And so, you know, the example that they gave is: I create a multi-level playing field made out of cardboard boxes, and then create a couple of characters and arm them with these,

[00:45:17] [SPEAKER_01]: you know, these rifles or whatever, and now make it a player-versus-player game where we run through the environment. Or add a rule on the fly: okay, you can now only kill people if you throw a grenade, you can't kill people with guns.

[00:45:33] [SPEAKER_01]: And so it's a demonstration, which essentially means this is not, like, real-time footage. It might not even be real; it might have been generated by AI to show off the concept.

[00:45:45] [SPEAKER_01]: But it does kind of illustrate this idea that game studios could potentially get to a point where they're remixing their own assets, their own data libraries, their asset libraries, so

[00:45:59] [SPEAKER_01]: 3D assets code game play hours telemetry events all of that used for training to come up with gameplay environments and gameplay scenarios potentially on the fly.

[00:46:13] [SPEAKER_01]: And I don't know at what point that is a possibility. Maybe it already is; maybe some of these tools are capable of doing that.

[00:46:21] [SPEAKER_01]: But I think it's important to really point out, my understanding is this was not

[00:46:27] [SPEAKER_01]: video of a tool that exists, even though it really looks like it is, and you could easily be fooled into thinking that. There were a lot of skeptics out there when I was researching this story, a lot of people who were just trash-talking it left, right, and center, and I don't know if that's fear.

[00:46:51] [SPEAKER_00]: And, a little bit like a prior show, I don't understand gaming well enough, or game creators, but I've got to believe that there are lots of kids out there

[00:47:07] [SPEAKER_00]: that would love to use tools to make the games that are in their heads.

[00:47:13] [SPEAKER_00]: And I wonder what lessons that teaches.

[00:47:22] [SPEAKER_00]: What is a game? What is the point of a game? What are the ways to get to goals? What are the ways to stop you from getting to goals?

[00:47:32] [SPEAKER_00]: Is there kind of a cynical view of trying to trick people, is that a skill, where you learn to deceive them into doing something? I don't know. But what are the life skills that you get out of creating a game, if the ability to create it collaboratively with the machine were brought to a young person?

[00:47:52] [SPEAKER_01]: Yeah, yeah. What it kind of brings up for me is, you know, I'm not a big gamer now, but when I was younger I was big into gaming. We're talking the old-school era, like Nintendo, Sega, you know, back in the '80s and early '90s, and prior to that the Commodore 64. I was big into gaming back in the day.

[00:48:14] [SPEAKER_00]: Uncle Jason and the old time games.

[00:48:15] [SPEAKER_01]: Yeah, you know, oh, they're so great. I go back to them once in a while and they're a lot of fun.

[00:48:24] [SPEAKER_01]: Okay okay now Jeff.

[00:48:28] [SPEAKER_01]: You know, it's a very nostalgic thing, the games that we grew up playing, but I remember really getting into certain games that did allow you to use their tools to create your own levels. If we're talking Nintendo, there was a game called Excitebike.

[00:48:44] [SPEAKER_01]: You had the ability to create levels, essentially using the same tool set that the developers used, building your own levels out of all the pieces that already existed.

[00:48:54] [SPEAKER_01]: And this just kind of seems like you know to a certain degree kind of an extension of that potentially I think it would teach

[00:49:04] [SPEAKER_01]: kids, let's say, who are using it to create something on the fly. It's a very imaginative thing to come up with an environment and then to immediately see the results of it.

[00:49:17] [SPEAKER_01]: Yeah, yeah, and it's anticipating someone's behavior, trying to manipulate that behavior. I think the question, though, is: if you're creating something like this on the fly and creating it with your voice,

[00:49:30] [SPEAKER_01]: it can look like a game and it can play like a game, but does it have a purpose like a game? Right, is there a purpose there, is there a destination, or a

[00:49:41] [SPEAKER_01]: beginning-to-end story arc that has a payoff at the end? And those are things that I think systems like this will be less adept at creating. Not saying it might not eventually happen, maybe, but I see that as being the real downfall.

[00:49:58] [SPEAKER_01]: Yep, so anyways, it was going to show off shooting. Yeah, totally, exactly right. The mechanics are there, sure, but why am I playing? What is the ultimate goal?

[00:50:12] [SPEAKER_01]: And then I just thought this was neat, because, talking about retro gameplay stuff, Super Mario 64 on the Nintendo 64: we talked about Runway ML with their new video-to-video generation capabilities.

[00:50:27] [SPEAKER_01]: So you can take a source, in this case gameplay footage from the Nintendo 64, run it through Runway ML, basically up-res the quality, and reimagine it in all these different ways: the game suddenly looks almost as if it were shot with a low-quality camcorder, then looks like a claymation character a little bit later.

[00:50:54] [SPEAKER_01]: You know, it becomes Koopa, King Koopa, done in yarn, and looking at this makes me think, okay, we are nowhere near this being a real-time operation at all.

[00:51:08] [SPEAKER_01]: We're talking probably seconds to minutes per frame. But wow, if we get to the point where it would be this easy to reimagine some of these old games, that would be a really unique way to play through some of this stuff. So from an imaginative standpoint, this stuff is very fascinating to me.

[00:51:28] [SPEAKER_00]: Yeah, in a world of giving it commands, it's almost like batch processing.

[00:51:34] [SPEAKER_00]: Yeah, go do this. But this isn't just one frame or one thing; you're giving it a command to, you know, move this character from here to here, and everything in between it'll figure out. Maybe it takes a minute to come back.

[00:51:46] [SPEAKER_00]: But it performed a tremendous number of functions to do what, to the user, seems to be one command.

[00:51:52] [SPEAKER_01]: One thing yeah yeah for sure.

[00:51:55] [SPEAKER_01]: And then finally, Hollywood embracing generative AI, in three parts, it turns out; I found a bunch of stories that really fell into this. Part one: Andy Serkis, who played Gollum,

[00:52:06] [SPEAKER_01]: the star in Lord of the Rings,

[00:52:10] [SPEAKER_01]: teased a new project from The Imaginarium that features AI characters, quote, "working on a narrative-driven story that begins with 2D characters created using voice actors, and then these characters will come out into the augmented reality world as AI characters,"

[00:52:29] [SPEAKER_01]: authored by artists and directors. And I'm having a hard time visualizing the concept, really understanding kind of what that all means, but I'm sure he isn't the only person in Hollywood that's considering this.

[00:52:45] [SPEAKER_00]: It reminds me of the ridiculous NFTs and Gary Vaynerchuk: and then we made a character, but what does it mean? You know, we made this one, that one is the lumberman, whatever.

[00:52:57] [SPEAKER_00]: So you did, but that's not what it's all about, just that you named something and drew it or put it in the script. Characters are about interaction with people. So we'll see.

[00:53:10] [SPEAKER_01]: Yeah yeah we'll see.

[00:53:13] [SPEAKER_01]: Secondly, I think this is actually really interesting news: Lionsgate partnering up with Runway ML, Runway AI. Essentially they're generating an AI model that's trained on Lionsgate,

[00:53:31] [SPEAKER_01]: their content library, essentially. So they're considering offering custom-trained models as templates for individual creators, but essentially they're using Runway. Runway's video generation model is really impressive, and they're making all of these great gains, and often when I see some of the stuff that's generated,

[00:53:51] [SPEAKER_01]: I think, oh wow, if these tools were somehow integrated into the tools that CGI artists use already, I imagine it's going to do some really interesting things in Hollywood. And now we're seeing a pretty big partnership; Lionsgate is a very major film studio

[00:54:12] [SPEAKER_01]: that is essentially going in with Runway to kind of supercharge their own content library and, I'm assuming, create great tools that we'll see in future films.

[00:54:26] [SPEAKER_01]: Jumping right into the AI world.

[00:54:29] [SPEAKER_00]: Yeah, I think on the one hand, training with an artist, understanding the structure of what they do and the vision of it, is a powerful tool, but I can also imagine how it could be misused.

[00:54:47] [SPEAKER_00]: You know, you make the porn version of

[00:54:51] [SPEAKER_00]: something, and

[00:54:56] [SPEAKER_00]: you let loose,

[00:54:59] [SPEAKER_00]: potentially let loose, the grammar you created, the vision you created. So I don't know, it's a little sobering.

[00:55:07] [SPEAKER_01]: Yeah, are you talking about the custom-trained models for individual creators? That sort of...

[00:55:12] [SPEAKER_01]: Yeah, I mean, I have to imagine, you know, especially with the licensing.

[00:55:16] [SPEAKER_01]: I think that's a very interesting aspect of this, man. I have to imagine they'd have that tightened down to where you couldn't do, like, a porn version of

[00:55:24] [SPEAKER_01]: Terminator or whatever. I'm not sure, I don't even know if Terminator is a Lionsgate property, but

[00:55:31] [SPEAKER_01]: yeah, I would hope that they'd, you know... It's kind of like the music industry and music generation, where we've heard them talk about wanting

[00:55:41] [SPEAKER_01]: not necessarily to shut down the kind of AI impersonation of their artists entirely, like they do for unlicensed uses, but "we want to be the ones to bring out the tool that allows you to do that, so that our artists and we make the money." And this kind of

[00:56:00] [SPEAKER_01]: gives me the same vibes. It's kind of like, "we can offer our take on these video generation tools, and we're the ones to benefit because we've curated the content, or we've curated the data set,

[00:56:14] [SPEAKER_01]: and we're controlling it in the way that we feel is comfortable," you know.

[00:56:20] [SPEAKER_01]: Yeah, and then finally, James Cameron. Yes, the Avatar, Terminator, Titanic director. He's actually, over the years, throughout the decades, been very pro-technology when it comes to Hollywood.

[00:56:35] [SPEAKER_01]: Remember, you know, Terminator 2 had that T-1000 that was made out of liquid, and The Abyss was an early one that was really pushing the boundaries of CGI early on.

[00:56:46] [SPEAKER_01]: And he was doing things that other people just weren't doing at that time; now it's very ubiquitous. Now he's joining Stability AI's

[00:56:52] [SPEAKER_01]: board of directors. So, you know, he says the intersection of generative AI and CGI image creation is "the next wave," so that's the bet that he's making.

[00:57:03] [SPEAKER_01]: And, you know, he's of course making all these other Avatar movies, so I do wonder if some of them are going to include some of the fruits of this partnership. I guarantee you he's exploring it.

[00:57:17] [SPEAKER_01]: So it'll be interesting to see how this stuff starts appearing on the screen, and the pushback against it as well.

[00:57:24] [SPEAKER_00]: Cameron's a little, what do I say, full of himself. Nothing is ever sure to be, but...

[00:57:33] [SPEAKER_01]: Yeah, fun times. Yeah, he's definitely full of himself, but he's got quite a resume behind him as well.

[00:57:43] [SPEAKER_01]: I can kind of understand where he's coming from, but yes indeed, that's what I've heard too.

[00:57:48] [SPEAKER_01]: Well, we've reached the end of this episode of AI Inside. Always a lot of fun; I always learn so much about the world of AI as we go through it together. Jeff, I appreciate you, and I appreciate your perspective on all this.

[00:57:59] [SPEAKER_01]: Gutenbergparenthesis.com for people to find what you're up to.

[00:58:04] [SPEAKER_01]: Anything else you want to leave people with?

[00:58:06] [SPEAKER_00]: If you Google "Jarvis, The Web We Weave"... I now have the hardback. The book will be out October 8, and you can pre-order it with the code WEB20 on Hachette's site.

[00:58:20] [SPEAKER_01]: There we go, that's where you go: hachettebookgroup.com. Or just, yeah, Google "The Web We Weave"

[00:58:26] [SPEAKER_01]: Jarvis, and you'll find the top link takes you right there. What is the code again? WEB20.

[00:58:33] [SPEAKER_01]: You can go check it out. Wonderful. Thank you, Jeff.

[00:58:40] [SPEAKER_01]: Everything you need to know about this show can be found at our site. Subscribe links for the podcast player of your choice, video versions of the show,

[00:58:49] [SPEAKER_01]: and live recording details can all be found at aiinside.show. If you do love what we're doing here,

[00:58:56] [SPEAKER_01]: please leave us a review on Apple Podcasts; that really, really helps. I'd love to get some fresh reviews in there, since we've been doing this for a while, and it's always good to have current, up-to-date reviews that reflect how things are now versus a couple of months ago. So go do that on Apple Podcasts; it would really help us out.

[00:59:17] [SPEAKER_01]: And finally, if you really, really love this show, you can support us on Patreon.

[00:59:23] [SPEAKER_01]: That's patreon.com/AIInsideShow. You get ad-free versions of the show every week and a Discord community where we do regular hangouts, and you can get an AI Inside t-shirt by becoming an executive producer. Maybe you're making a statement about the AI that's inside you when you're wearing it, I don't know.

[00:59:43] [SPEAKER_01]: But the executive producers of this show include Dr. Do, Jeffrey Marachini, WPVM 103.7 in Asheville, North Carolina, Paul Lang, and our newest executive producer, Ryan Newell. That's amazing, thank you.

[00:59:59] [SPEAKER_01]: I appreciate you. Great to have all five fingers on one hand filled with executive producers; now we've got to start on the other hand.

[01:00:09] [SPEAKER_01]: It's never enough, you see how this works. Anyway, thank you so much for your support. We appreciate you all so incredibly for your support of what we're doing with AI Inside, and we look forward to doing another episode next Wednesday.

[01:00:22] [SPEAKER_01]: We'll see you then on another episode of AI Inside. Bye, everybody, take care of yourselves.