Jeff Jarvis and Jason Howell examine the shifting landscape of AI partnerships as Microsoft and OpenAI negotiate their future, while discussing Meta's latest language model and NVIDIA's entry into the "Open-ISH" AI arena.
🔔 Support the show on Patreon!
Note: Time codes subject to change depending on dynamic ad insertion by the distributor.
NEWS
0:07:46 - Meta Introduces Spirit LM open source model that combines text and speech inputs/outputs
0:12:47 - Nvidia just dropped a new AI model that crushes OpenAI’s GPT-4—no big launch, just big results
0:21:58 - Microsoft introduces ‘AI employees’ that can handle client queries
0:23:00 - Marc Benioff says Microsoft rebranding Copilot as AI 'agents' shows they're in 'panic mode'
0:29:44 - Motorola reveals ambitious upcoming Moto AI features for its phones
0:37:28 - The $14 Billion Question Dividing OpenAI and Microsoft
0:40:45 - Elon Musk, Tesla and WBD sued over alleged ‘Blade Runner 2049’ AI ripoff for Cybercab promotion
0:51:00 - Sotheby's to auction its first artwork made by a humanoid robot
[00:00:02] [SPEAKER_00]: At Vertex, we know that the speed of global commerce is increasing, which makes tax administration more complex.
[00:00:08] [SPEAKER_00]: And your in-house enterprise systems aren't designed to handle this tax complexity.
[00:00:14] [SPEAKER_00]: That's where our platform comes in, enabling Continuous Compliance.
[00:00:19] [SPEAKER_00]: That gives you more transparency, more accuracy, and more confidence in your tax data.
[00:00:25] [SPEAKER_00]: Want to learn more about Continuous Compliance? Then visit vertexinc.com.de
[00:00:35] [SPEAKER_02]: This is AI Inside, Episode 40, recorded Wednesday, October 23rd, 2024.
[00:00:40] [SPEAKER_02]: The AI Agent Era Has Begun.
[00:00:45] [SPEAKER_02]: This episode of AI Inside is made possible by our wonderful patrons at patreon.com slash AI Inside Show.
[00:00:52] [SPEAKER_02]: If you like what you hear, head on over and support us directly.
[00:00:55] [SPEAKER_02]: And thank you for making independent podcasting possible.
[00:01:04] [SPEAKER_02]: What's going on, everybody?
[00:01:06] [SPEAKER_02]: Welcome to another episode of AI Inside,
[00:01:08] [SPEAKER_02]: the show where we take a look at the AI that's layered inside of so many things in the world of technology.
[00:01:14] [SPEAKER_02]: Today, with an extra dose of agents.
[00:01:17] [SPEAKER_02]: I'm one of your hosts, Jason Howell, joined as always by Jeff Jarvis.
[00:01:21] [SPEAKER_02]: We're not real, we're just agents.
[00:01:23] [SPEAKER_02]: Yes.
[00:01:23] [SPEAKER_02]: You can tell us what to do.
[00:01:24] [SPEAKER_02]: My co-agent.
[00:01:25] [SPEAKER_02]: Yes.
[00:01:26] [SPEAKER_02]: You are my co-agent on the show.
[00:01:29] [SPEAKER_02]: Good to see you, Jeff.
[00:01:30] [SPEAKER_02]: Good to see you, as ever.
[00:01:32] [SPEAKER_02]: Yeah, we've got a lot of news.
[00:01:34] [SPEAKER_02]: I love putting this show together because sometimes these themes kind of fall into place.
[00:01:39] [SPEAKER_02]: Yes.
[00:01:40] [SPEAKER_02]: There's, as we're alluding to, the kind of this theme of agentry in AI.
[00:01:45] [SPEAKER_02]: It's not like we haven't talked about that before,
[00:01:47] [SPEAKER_02]: but this week is a really good kind of representation of where that is headed.
[00:01:52] [SPEAKER_02]: And I'm looking forward to talking about that with you and sharing a little bit about the news there.
[00:01:58] [SPEAKER_02]: But before we get there, just want to call out a big thank you to those of you who support us on Patreon.
[00:02:04] [SPEAKER_02]: Patreon.com slash AI Inside Show.
[00:02:06] [SPEAKER_02]: You too can support us.
[00:02:08] [SPEAKER_02]: And we would so appreciate it because it helps us do the show each and every week.
[00:02:12] [SPEAKER_02]: Susan Barrett is, sorry, Susan Barrett Price is one of our supporters on Patreon.
[00:02:19] [SPEAKER_02]: And Susan, we appreciate you.
[00:02:20] [SPEAKER_02]: Thank you for being there for us.
[00:02:23] [SPEAKER_02]: And also, if you are catching this live, as many of you seem to be doing each and every week,
[00:02:29] [SPEAKER_02]: I encourage you to go to AIinside.show and subscribe to the podcast.
[00:02:34] [SPEAKER_02]: So if you miss it live, you won't miss it for good.
[00:02:37] [SPEAKER_02]: You'll be able to catch what we have to say and talk about in the weeks ahead.
[00:02:42] [SPEAKER_02]: And you won't even really have to do anything about it.
[00:02:43] [SPEAKER_02]: It'll just download to your device.
[00:02:45] [SPEAKER_02]: You know how the magic of podcasts works.
[00:02:47] [SPEAKER_02]: So AIinside.show.
[00:02:48] [SPEAKER_02]: So with that out of the way, let's talk some news this week.
[00:02:53] [SPEAKER_02]: And we'll go ahead and start with Meta.
[00:02:58] [SPEAKER_02]: Because I saw this article late last week, articles talking about Meta being under fire for polluting open source.
[00:03:08] [SPEAKER_02]: The open source initiative, OSI, criticized Meta for labeling its AI models as open source.
[00:03:15] [SPEAKER_02]: They're arguing that Meta isn't releasing something that is truly open, only releasing model weights.
[00:03:22] [SPEAKER_02]: But what does that mean?
[00:03:24] [SPEAKER_02]: That isn't open.
[00:03:26] [SPEAKER_02]: It's keeping the training data, the algorithms private.
[00:03:30] [SPEAKER_02]: It doesn't line up, according to OSI, with their open source criteria.
[00:03:36] [SPEAKER_02]: And that includes usage restrictions that they say violate traditional open source principles.
[00:03:42] [SPEAKER_02]: So I thought that was interesting just because we keep talking about open source and kind of congratulating or at least giving Meta a pat on the back for doing the open source method.
[00:03:53] [SPEAKER_02]: But not truly open source, I guess, but open-er.
[00:03:56] [SPEAKER_01]: We need – and this ties into, away from AI, where there's been quite the kerfuffle at Automattic and WordPress about open source, contributing to open source, who is truly part of it, who can use the brand, and so on and so forth.
[00:04:09] [SPEAKER_01]: I think we need a new nomenclature here.
[00:04:11] [SPEAKER_01]: Because what we praise Meta for is making things available for free, making them open, not necessarily open source.
[00:04:21] [SPEAKER_01]: Yeah, no, you're right.
[00:04:22] [SPEAKER_01]: And I think that's a contribution to the world.
[00:04:27] [SPEAKER_01]: You look at what's being built on Llama, including NVIDIA, which just built something on top of Llama that we'll talk about later.
[00:04:33] [SPEAKER_01]: And the ability for universities and researchers and just plain old folk who know what they're doing to take Llama and do things with it, I think, is a positive contribution to the ecosystem in AI.
[00:04:44] [SPEAKER_01]: And so I think it's worthy to praise.
[00:04:46] [SPEAKER_01]: But the open source people are right.
[00:04:48] [SPEAKER_01]: It is not fully open source.
[00:04:50] [SPEAKER_01]: And the question I would have, Jason, is should it be?
[00:04:55] [SPEAKER_01]: If you put all of that data and everything out in that way, I don't know.
[00:05:01] [SPEAKER_01]: There are those who oppose open source in AI because they say that bad actors can come along and get around guardrails and get around other things.
[00:05:11] [SPEAKER_01]: And just all kinds of other things.
[00:05:12] [SPEAKER_01]: They can do whatever they want it to.
[00:05:13] [SPEAKER_01]: Right.
[00:05:14] [SPEAKER_01]: So do you really want AI to be fully open source?
[00:05:17] [SPEAKER_01]: I don't know what the answer to that is.
[00:05:19] [SPEAKER_01]: And I don't know enough to have an answer on my own.
[00:05:21] [SPEAKER_01]: But I think it's an interesting question.
[00:05:23] [SPEAKER_01]: There's probably a dial here of openness.
[00:05:26] [SPEAKER_01]: And where is the right – because on the one hand, if they opened up fully, I can just hear people saying, oh, my God, Russian trolls will do terrible things and make up bioweapons, right?
[00:05:38] [SPEAKER_01]: Yep.
[00:05:39] [SPEAKER_01]: Which they can do anyway.
[00:05:40] [SPEAKER_01]: No question.
[00:05:42] [SPEAKER_01]: So I think in the discussion of safety, as we had last week with Microsoft – pardon me – and that was a great discussion, by the way, with Sarah Bird.
[00:05:51] [SPEAKER_01]: She talked about what they opened up at Microsoft, too.
[00:05:53] [SPEAKER_01]: So I would love to see more discussion about the proper definitions of openness.
[00:06:01] [SPEAKER_02]: Yeah.
[00:06:02] [SPEAKER_02]: Yeah, nomenclature.
[00:06:03] [SPEAKER_02]: I mean, like you say, the industry as a whole really seeking consensus around what open source AI actually means.
[00:06:11] [SPEAKER_02]: Google and Microsoft stopped using the term open source for partially open models.
[00:06:17] [SPEAKER_02]: Mistral actually uses the term open weight, which is maybe a little bit more accurate.
[00:06:24] [SPEAKER_02]: It doesn't mean much to most people, but is this a discussion for most people?
[00:06:29] [SPEAKER_02]: Right.
[00:06:29] [SPEAKER_02]: And I think maybe that's at the core of this is at least there's some common understanding of what open source means, you know, the openness of code versus the closed off now, you know, you get what we give you sort of approach.
[00:06:43] [SPEAKER_02]: And this is different in that it is distributed freely and openly in the sense that anybody can use it and get in there and use it for whatever they want to.
[00:06:55] [SPEAKER_02]: But it's not quite the same.
[00:06:57] [SPEAKER_02]: It's not like here's everything that constitutes what this is made of and do what you will with it within the confines of the license.
[00:07:05] [SPEAKER_01]: Which I think is really important.
[00:07:06] [SPEAKER_01]: Pardon me.
[00:07:06] [SPEAKER_01]: That a lot of the value of open source is the transparency yields authenticity and accountability.
[00:07:17] [SPEAKER_01]: I see how this was made.
[00:07:19] [SPEAKER_01]: When you see things talk about security, if I can see how it's made in full, I mean, I wouldn't know what I'm looking at, but people who know what they're looking at, then that's part of the advantage of open source is that it leads to that kind of accountability.
[00:07:30] [SPEAKER_01]: In this case, that's true.
[00:07:32] [SPEAKER_01]: It's not.
[00:07:32] [SPEAKER_01]: You don't know how it's made.
[00:07:33] [SPEAKER_01]: You don't have all the stuff that goes into it.
[00:07:35] [SPEAKER_01]: Even if you did, could you really tell?
[00:07:37] [SPEAKER_01]: I don't know.
[00:07:38] [SPEAKER_01]: But yeah, maybe we should invent some new term here on the show.
[00:07:42] [SPEAKER_01]: I don't know.
[00:07:42] [SPEAKER_01]: We'll figure it out next week.
[00:07:44] [SPEAKER_02]: Yeah.
[00:07:45] [SPEAKER_02]: And what we should do is we should invent it right off the cuff and then just commit to it, whatever it happens to be.
[00:07:50] [SPEAKER_02]: Yeah.
[00:07:50] [SPEAKER_02]: And I have no idea what that is.
[00:07:52] [SPEAKER_02]: Now I feel under pressure to come up with something creative.
[00:07:56] [SPEAKER_02]: How about just opener?
[00:07:59] [SPEAKER_02]: Open-ish.
[00:08:00] [SPEAKER_02]: No, open-ish.
[00:08:00] [SPEAKER_02]: I like an open-ish.
[00:08:02] [SPEAKER_02]: That's it.
[00:08:02] [SPEAKER_02]: That's it.
[00:08:03] [SPEAKER_02]: That's an open-ish source.
[00:08:07] [SPEAKER_02]: And speaking of Meta, I should also mention here, because it kind of ties into this, Spirit LM, which – Halloween is right around the corner.
[00:08:15] [SPEAKER_02]: So I wonder if they timed this specifically with that.
[00:08:20] [SPEAKER_02]: But open-source or open-ish in our new definition, multimodal language model text and speech integration.
[00:08:27] [SPEAKER_02]: This is Meta's challenger to GPT-4o.
[00:08:31] [SPEAKER_02]: There are two versions.
[00:08:33] [SPEAKER_02]: There's Spirit LM Base, which has phonetic tokens for basic speech processing and generation.
[00:08:40] [SPEAKER_02]: Then Spirit LM Expressive, which includes pitch and style tokens, which basically enables the capture of emotional tones, things like excitement, anger, surprise.
[00:08:52] [SPEAKER_02]: You know, we're going to have some real emotive agents in the future as a result of things like this.
[00:08:58] [SPEAKER_02]: It can capture essentially and reproduce very nuanced emotional states in voices, which we saw a little bit of in the voice mode demonstration many months ago from OpenAI.
[00:09:11] [SPEAKER_02]: So, you know, this isn't entirely new, but this is Meta's kind of open-ish version of it.
[00:09:17] [SPEAKER_01]: Yeah.
[00:09:17] [SPEAKER_01]: And I think what that – we've talked about this on the show before in relation to audiobooks.
[00:09:20] [SPEAKER_01]: Audiobooks, that if you – I think we see something coming forward where we'll have a markup language for voice and emotion.
[00:09:31] [SPEAKER_01]: Right.
[00:09:31] [SPEAKER_01]: So that you can say, okay, read this book, but there's a joke.
[00:09:35] [SPEAKER_01]: That's a serious part.
[00:09:38] [SPEAKER_01]: Go slow here.
[00:09:40] [SPEAKER_01]: This is an explanation.
[00:09:41] [SPEAKER_01]: I could see a markup language where you have – you're giving the system cues about how it should give you back voice.
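Something close to the markup language described here already exists: the W3C's Speech Synthesis Markup Language (SSML), which most cloud text-to-speech services accept. The tags below are real SSML elements; the passage itself is an invented example of the kind of cues being discussed:

```python
# Minimal sketch: annotating a passage with SSML prosody cues so a
# synthesizer knows how to read it (slow and low for the serious part,
# fast and high for the joke, a pause and an emphasis in between).
def ssml_passage():
    return (
        '<speak>'
        '<prosody rate="slow" pitch="low">That was the serious part.</prosody>'
        '<break time="500ms"/>'
        '<prosody rate="fast" pitch="high">And here comes the joke!</prosody>'
        '<emphasis level="strong">Wait for it.</emphasis>'
        '</speak>'
    )

doc = ssml_passage()
```

An audiobook pipeline could generate this markup from the manuscript and feed it to any SSML-aware voice engine.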
[00:09:49] [SPEAKER_02]: Yeah.
[00:09:50] [SPEAKER_02]: If you need – if that instruction is needed and can't be inferred by the system, which I think is –
[00:09:56] [SPEAKER_02]: Which often are largely what they're –
[00:09:57] [SPEAKER_01]: Yeah.
[00:09:58] [SPEAKER_01]: I still don't have an audiobook deal for The Web We Weave, my new book, which is pissing me off because friends say, oh, yeah, I'll listen to it.
[00:10:06] [SPEAKER_01]: I said, well, no, not unless I read it to you.
[00:10:08] [SPEAKER_01]: You're not going to.
[00:10:09] [SPEAKER_01]: I know.
[00:10:10] [SPEAKER_01]: And you could take away a lot of expense.
[00:10:12] [SPEAKER_01]: I would rather do it myself.
[00:10:13] [SPEAKER_01]: I did it for free for the Gutenberg parenthesis, but they still had a producer.
[00:10:18] [SPEAKER_01]: They still had studio time.
[00:10:19] [SPEAKER_01]: It's an investment.
[00:10:21] [SPEAKER_01]: So if you could take away that investment and use these tools and they get pretty darn good at it, then, yeah.
[00:10:29] [SPEAKER_01]: Our dear friend Ant Pruitt is saying that's why voiceover artists are not happy about AI.
[00:10:35] [SPEAKER_01]: And you're right, Ant.
[00:10:38] [SPEAKER_01]: The agents are coming for them.
[00:10:40] [SPEAKER_02]: Yeah, yeah.
[00:10:42] [SPEAKER_02]: I mean I realize this every day and we have definitely talked about this before.
[00:10:49] [SPEAKER_02]: But along that line, like transcribers, human transcribers used to be – I mean still are a thing.
[00:10:54] [SPEAKER_02]: It's like they still do exist.
[00:10:56] [SPEAKER_02]: But if you needed transcription on something and you couldn't do it yourself, you had to hire someone.
[00:11:04] [SPEAKER_02]: And now it's literally like become completely free in a million different directions and it's good enough for most people.
[00:11:14] [SPEAKER_01]: Did I bore you on this show with my rabbit hole about phonography?
[00:11:20] [SPEAKER_01]: Phonography.
[00:11:21] [SPEAKER_01]: I believe you did.
[00:11:22] [SPEAKER_02]: I did.
[00:11:22] [SPEAKER_02]: Okay.
[00:11:23] [SPEAKER_01]: Not too long ago.
[00:11:24] [SPEAKER_01]: It was that stenographers were – even before you could record things, stenographers were the only way you could record things.
[00:11:30] [SPEAKER_01]: And they were called phonographers and thus the phonograph came from that root, right?
[00:11:35] [SPEAKER_01]: So the funny thing was the stenographers were eager to be replaced by the machinery or be eager to be aided by the machinery because it was really wearing to use shorthand and get things wrong and it was slow and it was expensive.
[00:11:50] [SPEAKER_01]: So they were looking for these aids to help them.
[00:11:53] [SPEAKER_01]: But now transcribers are just gone.
[00:11:56] [SPEAKER_01]: Yep.
[00:11:56] [SPEAKER_02]: Yeah.
[00:11:57] [SPEAKER_01]: It's a horrible job transcribing.
[00:12:00] [SPEAKER_02]: Oh, I mean I've had to do manual transcription throughout my years in media at times.
[00:12:06] [SPEAKER_02]: And yeah, I absolutely hated it.
[00:12:08] [SPEAKER_02]: Absolutely hated it.
[00:12:10] [SPEAKER_02]: But, you know, it's just – it takes a long time.
[00:12:13] [SPEAKER_02]: The kind of – the focus and attention that you need to make sure that you get it right, the millions of like pause, go back, listen to that over again.
[00:12:21] [SPEAKER_02]: Oh, it was dreadful.
[00:12:23] [SPEAKER_02]: But, you know, for some people, that's the bread and butter.
[00:12:26] [SPEAKER_02]: And yeah, that's –
[00:12:27] [SPEAKER_02]: So one more little –
[00:12:28] [SPEAKER_02]: Technology does that.
[00:12:29] [SPEAKER_01]: It's just a part of technology.
[00:12:31] [SPEAKER_01]: I'll go back to another branch of my rabbit hole.
[00:12:33] [SPEAKER_01]: So when the phonograph was invented by – first by Edison and then by another group and they kind of merged, there was no amplifier yet.
[00:12:44] [SPEAKER_01]: Either way, right?
[00:12:45] [SPEAKER_01]: So it was just – it was the cone.
[00:12:47] [SPEAKER_01]: It was just that cone.
[00:12:48] [SPEAKER_01]: The passive cone.
[00:12:49] [SPEAKER_01]: The way it was used at first is, if you were in a court case, the stenographer would now sit there with the cone and say, "Witness Howell says" – and then would repeat you word for word into this.
[00:13:05] [SPEAKER_01]: And then they'd take the tube in the Edison way and a stenographer would then spend time separately, a cheaper person, usually a woman, to retype it.
[00:13:16] [SPEAKER_02]: Well, speaking of, we've been kind of alluding to NVIDIA in a couple of different ways.
[00:13:21] [SPEAKER_02]: So we might as well talk about their new open source or open-ish – you know, that's going to be in the title somehow today for sure.
[00:13:21] [SPEAKER_02]: Open-ish LLM called Llama 3.1 Nemotron 70B Instruct.
[00:13:34] [SPEAKER_02]: That just rolls off the tongue, doesn't it?
[00:13:36] [SPEAKER_02]: Blah.
[00:13:37] [SPEAKER_02]: It just rolls right off the tongue.
[00:13:41] [SPEAKER_02]: And we can actually play around with it if we would like to.
[00:13:45] [SPEAKER_02]: But the model is kind of turning some heads right now.
[00:13:50] [SPEAKER_02]: It's achieving top scores on alignment benchmarks.
[00:13:52] [SPEAKER_02]: It's beating GPT-4o and Claude 3.5 Sonnet in many metrics, not all but many.
[00:13:58] [SPEAKER_02]: It's also a much smaller model at 70 billion parameters.
[00:14:03] [SPEAKER_02]: But, you know, it's essentially competing with the biggies. It can be found on Hugging Face.
[00:14:10] [SPEAKER_02]: And, yeah, you put this in here.
[00:14:12] [SPEAKER_02]: What were your thoughts before I kind of crack it open and we take a look at it?
[00:14:16] [SPEAKER_01]: So I'm just fascinated by NVIDIA's role in the industry.
[00:14:21] [SPEAKER_01]: Because in a B2B world, you would not think that NVIDIA would be competing with its customers and making software.
[00:14:31] [SPEAKER_01]: And using the open-source model from a customer to make its own software.
[00:14:37] [SPEAKER_01]: I don't know how that's being seen.
[00:14:38] [SPEAKER_01]: Is it just, well, we're contributing to progress and everybody, that's good?
[00:14:43] [SPEAKER_01]: Or everybody in the pool.
[00:14:45] [SPEAKER_01]: We can't complain about NVIDIA because we want their damn chips, so shush.
[00:14:49] [SPEAKER_01]: But we wish they wouldn't do this.
[00:14:51] [SPEAKER_01]: I have no idea what people think about this.
[00:14:53] [SPEAKER_01]: But I found it interesting that they're now releasing models to the public.
[00:14:56] [SPEAKER_01]: They released it in a very geeky way.
[00:14:59] [SPEAKER_02]: I mean, it's super geeky.
[00:15:01] [SPEAKER_02]: Yeah.
[00:15:01] [SPEAKER_02]: Yeah, take a look at the kind of the interface here.
[00:15:05] [SPEAKER_02]: I'm trying to expand it so it looks better on the video version if you're watching this on YouTube.
[00:15:09] [SPEAKER_01]: They call it Llama, but in the right there it says from OpenAI, client equals OpenAI.
[00:15:15] [SPEAKER_01]: I'm not sure what that means.
[00:15:16] [SPEAKER_01]: I have no idea.
[00:15:18] [SPEAKER_01]: But you can –
[00:15:19] [SPEAKER_01]: From OpenAI, import OpenAI.
[00:15:20] [SPEAKER_01]: It's like any other model.
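The "from openai import OpenAI" line that puzzles the hosts most likely means NVIDIA's endpoint speaks the OpenAI chat-completions wire format, so the standard OpenAI client library can point at it. A hedged sketch of the request body such an endpoint expects; the model id shown is an assumption based on NVIDIA's public catalog naming, and the prompt is just an example:

```python
# Build the JSON body for an OpenAI-compatible chat-completions call,
# without actually sending a network request.
import json

def build_chat_request(prompt,
                       model="nvidia/llama-3.1-nemotron-70b-instruct",
                       temperature=0.5, top_p=1.0, max_tokens=1024):
    """Return the serialized request an OpenAI-compatible API expects."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,   # the "parameters" pulldown on the page
        "top_p": top_p,
        "max_tokens": max_tokens,
    })

body = build_chat_request("Argue that AI companies may learn from public web data.")
```

In practice you would hand this same payload to the OpenAI client with a custom `base_url`, which is presumably why the demo page imports that library.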
[00:15:21] [SPEAKER_01]: I used it earlier to – I said, give me a blog post justifying the position that AI companies should be able to read and learn from anything they can download on the internet.
[00:15:43] [SPEAKER_02]: And I have entered that in.
[00:15:45] [SPEAKER_02]: We'll go ahead and hit set.
[00:15:47] [SPEAKER_02]: I realized you were acting a stenographer.
[00:15:49] [SPEAKER_02]: Yes, exactly.
[00:15:51] [SPEAKER_02]: I can type pretty fast when I need to.
[00:15:54] [SPEAKER_02]: Generating, taking a little bit of time.
[00:15:55] [SPEAKER_02]: I do like the interface.
[00:15:57] [SPEAKER_02]: It gives you kind of like the Python code on the right-hand side of the screen so you can kind of see, I guess, what the code looks like.
[00:16:04] [SPEAKER_01]: You can see view parameters below.
[00:16:07] [SPEAKER_01]: Below that, it's a pulldown.
[00:16:09] [SPEAKER_02]: Oh.
[00:16:10] [SPEAKER_02]: Oh, yeah.
[00:16:11] [SPEAKER_02]: I have no idea what that means with this.
[00:16:12] [SPEAKER_01]: Yeah, tokens and temperature and –
[00:16:15] [SPEAKER_02]: Yeah.
[00:16:16] [SPEAKER_02]: I know.
[00:16:16] [SPEAKER_02]: These controls, I just want to come in here and start moving things around, but I have no idea what they're actually going to do.
[00:16:23] [SPEAKER_02]: Maybe they would make it quicker because it's still working on my –
[00:16:26] [SPEAKER_01]: Yeah, I think maybe people have discovered it.
[00:16:29] [SPEAKER_01]: On my prompt.
[00:16:30] [SPEAKER_01]: So anyway, so I did that, and then it came back with a decent thing.
[00:16:33] [SPEAKER_01]: And then I said, now argue the opposite.
[00:16:36] [SPEAKER_01]: And it did a decent thing.
[00:16:37] [SPEAKER_01]: Oh, okay.
[00:16:37] [SPEAKER_01]: It was really interesting.
[00:16:38] [SPEAKER_01]: Now, I'm not sure that I should do the same thing on the other models and see what it does.
[00:16:47] [SPEAKER_01]: Yeah.
[00:16:47] [SPEAKER_02]: Always good to compare them.
[00:16:49] [SPEAKER_01]: But the reason to put this in the rundown, I think, was mainly because it's NVIDIA's role with their customers slash competitors.
[00:16:59] [SPEAKER_01]: Will they sell this?
[00:17:00] [SPEAKER_01]: Yeah.
[00:17:00] [SPEAKER_01]: Is it just to do things?
[00:17:02] [SPEAKER_01]: I don't know.
[00:17:03] [SPEAKER_02]: Yeah.
[00:17:04] [SPEAKER_02]: I mean, that's always an interesting question, right?
[00:17:06] [SPEAKER_02]: Like the thing that immediately comes to mind for me is Google has its Android operating system, and it's really largely created from day one to get as many partners and different companies building on top of the platform.
[00:17:22] [SPEAKER_02]: And then at a certain point, Google is like, okay, well, we're going to do hardware too.
[00:17:25] [SPEAKER_02]: And that's been the ongoing conversation.
[00:17:27] [SPEAKER_02]: You're right.
[00:17:28] [SPEAKER_02]: Everybody's in everybody's business.
[00:17:29] [SPEAKER_02]: Yeah.
[00:17:29] [SPEAKER_02]: Yeah.
[00:17:30] [SPEAKER_02]: That's a good point.
[00:17:30] [SPEAKER_02]: Yeah, yeah.
[00:17:31] [SPEAKER_02]: Like I think the fear in that situation was, oh, my goodness, is Google going to upset its partners?
[00:17:38] [SPEAKER_02]: And in the end, I mean, there have been little squibbles here and there, but it hasn't really impacted things dramatically.
[00:17:43] [SPEAKER_02]: It's just another, you know, but mind you, Google isn't a major, major mover of, you know, of things in the smartphone world from its hardware division.
[00:17:54] [SPEAKER_02]: It's not selling numbers nearly like it is some of its competitors and partners.
[00:17:58] [SPEAKER_02]: But anyways.
[00:17:59] [SPEAKER_02]: So what's happening now is it comes up a lot.
[00:18:02] [SPEAKER_02]: But very slowly.
[00:18:04] [SPEAKER_02]: Wow.
[00:18:04] [SPEAKER_02]: Yeah.
[00:18:05] [SPEAKER_02]: Pretty slowly.
[00:18:06] [SPEAKER_02]: This is like 300 baud slow.
[00:18:08] [SPEAKER_02]: Yeah.
[00:18:08] [SPEAKER_02]: 300 baud slow, which takes me back.
[00:18:12] [SPEAKER_02]: But it wrote a headline.
[00:18:14] [SPEAKER_02]: You know, why.
[00:18:15] [SPEAKER_02]: If it'll let me look at it.
[00:18:17] [SPEAKER_01]: Something about how why AI companies should be unfettered.
[00:18:21] [SPEAKER_01]: Yeah.
[00:18:21] [SPEAKER_01]: Then it has argument.
[00:18:22] [SPEAKER_01]: It keeps readjusting.
[00:18:23] [SPEAKER_01]: It lists arguments.
[00:18:24] [SPEAKER_01]: It's almost the same exact thing for me.
[00:18:26] [SPEAKER_01]: Argument one, accelerated innovation and competitiveness.
[00:18:28] [SPEAKER_01]: Yeah, yeah, yeah, yeah, yeah.
[00:18:30] [SPEAKER_01]: Global leadership.
[00:18:31] [SPEAKER_02]: Was it this slow at generating?
[00:18:32] [SPEAKER_02]: No.
[00:18:32] [SPEAKER_02]: It was very fast.
[00:18:33] [SPEAKER_02]: For you?
[00:18:34] [SPEAKER_02]: No.
[00:18:34] [SPEAKER_02]: Yeah.
[00:18:34] [SPEAKER_02]: There must be, I'm guessing, more awareness, more people hitting it up and seeing what they can get out of it and playing around.
[00:18:42] [SPEAKER_01]: So I'm curious on the right, Jason.
[00:18:43] [SPEAKER_01]: It has three tabs, Python, Node, and Shell.
[00:18:48] [SPEAKER_02]: Yeah.
[00:18:48] [SPEAKER_01]: So that's just different.
[00:18:49] [SPEAKER_02]: Python, which we were looking at, Node.
[00:18:52] [SPEAKER_02]: Then you've got Shell.
[00:18:53] [SPEAKER_02]: Yeah.
[00:18:54] [SPEAKER_02]: I mean, this, and you can copy the code.
[00:18:56] [SPEAKER_02]: You can get the API key.
[00:18:58] [SPEAKER_02]: It says, this is all stuff that I feel like is a little bit out of my depth.
[00:19:03] [SPEAKER_02]: You know, when we start getting into the code aspect of things, I'm sure someone else.
[00:19:06] [SPEAKER_01]: Between this and the parameters, it gives you something to play with.
[00:19:09] [SPEAKER_01]: Which will be interesting.
[00:19:10] [SPEAKER_01]: Yeah.
[00:19:10] [SPEAKER_01]: Folks out there.
[00:19:11] [SPEAKER_02]: Yeah, it's an interesting interface.
[00:19:13] [SPEAKER_02]: I mean, I feel like I'm used to a lot of LLM interfaces being straight text unless you query it for code.
[00:19:20] [SPEAKER_02]: And this kind of shows you, I don't know, are we essentially looking at how, like, what is happening underneath the surface?
[00:19:31] [SPEAKER_02]: You know, this is the convenient and easy to understand user interface.
[00:19:37] [SPEAKER_02]: This is exactly what's happening.
[00:19:39] [SPEAKER_02]: Well, not really.
[00:19:40] [SPEAKER_02]: That query is sent.
[00:19:41] [SPEAKER_02]: Is that what we're looking at?
[00:19:42] [SPEAKER_01]: No, because that's code you could just put in somewhere and it would do it.
[00:19:45] [SPEAKER_01]: I don't think we see the, unless you scroll.
[00:19:47] [SPEAKER_01]: Yeah, there's no, it's not showing us the construct or anything.
[00:19:51] [SPEAKER_01]: No.
[00:19:52] [SPEAKER_02]: No.
[00:19:52] [SPEAKER_01]: But it is interesting.
[00:19:53] [SPEAKER_01]: But it would be fun to play with parameters and go up in heat, whatever that means.
[00:19:56] [SPEAKER_01]: I don't even know.
[00:19:57] [SPEAKER_01]: You know, play with it.
[00:19:58] [SPEAKER_02]: Totally.
[00:19:59] [SPEAKER_02]: I would love to know more about that stuff.
[00:20:01] [SPEAKER_02]: You know, and with confidence to be able to like, oh, well, here's why when you open the view parameters and you go to the top P.
[00:20:10] [SPEAKER_02]: I mean, they do have these little call-outs to give you an idea.
[00:20:14] [SPEAKER_02]: But even reading through this, I feel like I need a better, I need a degree in something.
[00:20:18] [SPEAKER_02]: Yeah, read the beginning of that.
[00:20:21] [SPEAKER_02]: The top P sampling mass used for text generation.
[00:20:24] [SPEAKER_02]: The top P value determines the probability mass that is sampled at sampling time.
[00:20:27] [SPEAKER_02]: For example, if top P equals 0.2, only the most likely tokens summing to 0.2 cumulative probability will be sampled.
[00:20:35] [SPEAKER_02]: It is not recommended to modify both temperature and top P in the same call.
[00:20:39] [SPEAKER_02]: Okay, now go to temperature and see what that means.
[00:20:40] [SPEAKER_02]: Or the same call.
[00:20:42] [SPEAKER_02]: Now go to what?
[00:20:43] [SPEAKER_02]: Temperature.
[00:20:43] [SPEAKER_01]: Yeah, I'm curious what that is.
[00:20:44] [SPEAKER_02]: The sampling temperature to use for text generation.
[00:20:47] [SPEAKER_02]: The higher the temperature value is, the less deterministic the output text will be.
[00:20:52] [SPEAKER_02]: It is not recommended to modify both temperature and top P in the same call.
[00:20:57] [SPEAKER_02]: Of course, that's exactly what I want to do now, but for another day.
[00:21:00] [SPEAKER_02]: Yes, exactly.
[00:21:02] [SPEAKER_02]: I'm so curious now.
[00:21:03] [SPEAKER_02]: You tell me not to do it, and I want to.
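The two tooltips the hosts read out describe standard sampling controls, and a small pure-Python sketch makes them concrete. The token probabilities below are made up for illustration; the formulas are the generic temperature-softmax and nucleus (top-p) definitions, not NVIDIA's exact implementation:

```python
import math

def apply_temperature(logits, temperature):
    """Softmax with temperature: higher values flatten the distribution
    (less deterministic output), lower values sharpen it."""
    scaled = [v / temperature for v in logits.values()]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    z = sum(exps)
    return dict(zip(logits.keys(), (e / z for e in exps)))

def top_p_filter(probs, top_p):
    """Nucleus sampling: keep only the most likely tokens whose cumulative
    probability reaches top_p, then renormalize what's left."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, cum = [], 0.0
    for tok, p in ranked:
        kept.append((tok, p))
        cum += p
        if cum >= top_p:
            break
    total = sum(p for _, p in kept)
    return {tok: p / total for tok, p in kept}

# With top_p = 0.2, the single most likely token already covers the mass,
# so everything else is filtered out.
probs = top_p_filter({"cat": 0.5, "dog": 0.3, "fox": 0.2}, top_p=0.2)
```

This also hints at why the tooltip warns against changing both at once: each knob independently narrows or widens the set of tokens the model will sample from, so stacking them makes the effect hard to reason about.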
[00:21:05] [SPEAKER_01]: So there's a frequency penalty, which I guess is not using the same stuff too much.
[00:21:08] [SPEAKER_01]: There's a, what else was down there?
[00:21:12] [SPEAKER_01]: There's a presence penalty.
[00:21:14] [SPEAKER_01]: Presence penalty, which I don't know.
[00:21:15] [SPEAKER_02]: Then you've got your seed amount and stop.
[00:21:20] [SPEAKER_02]: A string or a list of strings where the API will stop generating further tokens.
[00:21:25] [SPEAKER_02]: Yeah, it's a whole lot of controls in here that I know for certain I would need more of an education on
[00:21:31] [SPEAKER_02]: in order to be able to switch those dials with certainty as far as what I'm going to get on the other side.
[00:21:37] [SPEAKER_02]: And I have a lot of respect for people who understand that to that level.
[00:21:42] [SPEAKER_02]: I could get that understanding on systems that have kind of put it into more of a context for something that I understand.
[00:21:49] [SPEAKER_02]: Like Udio, for example, which I've talked about in episodes past, which is music generation.
[00:21:56] [SPEAKER_02]: There's very detailed controls in Udio.
[00:21:59] [SPEAKER_02]: But in a short amount of time, just based on my knowledge of music and then playing around with it, I could kind of figure it out.
[00:22:07] [SPEAKER_02]: This might take me a little bit longer.
[00:22:09] [SPEAKER_01]: Yeah, you'd have to have experiments to see what the impact is of each of those variants.
[00:22:13] [SPEAKER_01]: Totally.
[00:22:13] [SPEAKER_02]: Yeah, very interesting.
[00:22:16] [SPEAKER_02]: And folks can get in there and play around with it and see what they come up with.
[00:22:21] [SPEAKER_02]: Okay, so we've been kind of dancing around agents a little bit.
[00:22:24] [SPEAKER_02]: And I think this is really interesting.
[00:22:27] [SPEAKER_02]: It seems like there's a lot of agent activity happening at this moment.
[00:22:32] [SPEAKER_02]: Microsoft announced agents through two of its channels this week.
[00:22:38] [SPEAKER_02]: Let's see if I can get that on screen here.
[00:22:40] [SPEAKER_02]: There's Copilot Studio, which lets companies create custom agents.
[00:22:44] [SPEAKER_02]: There will be a preview next month for that.
[00:22:47] [SPEAKER_02]: Also, Dynamic 365, which is 10 pre-built agents for very specific business functions.
[00:22:55] [SPEAKER_02]: Some of these agents being included in Microsoft 365 Copilot.
[00:23:00] [SPEAKER_02]: Others priced separately.
[00:23:04] [SPEAKER_02]: And you have Marc Benioff from Salesforce basically getting in there and talking smack, saying, hey, Microsoft rebranding Copilot shows the company is in panic mode, he says.
[00:23:19] [SPEAKER_02]: He's calling its new agents more like Clippy 2.0.
[00:23:25] [SPEAKER_02]: Dropping the stingers right there.
[00:23:27] [SPEAKER_01]: Which is a little ironic because Salesforce, I think two weeks ago, announced their own agents.
[00:23:33] [SPEAKER_01]: That's right.
[00:23:34] [SPEAKER_01]: They did.
[00:23:35] [SPEAKER_01]: All God's children are going to have agents now.
[00:23:38] [SPEAKER_01]: Yeah.
[00:23:39] [SPEAKER_02]: He has something to gain from the trash talk.
[00:23:41] [SPEAKER_02]: Yeah, it's Agentforce.
[00:23:42] [SPEAKER_02]: We're doing it right.
[00:23:44] [SPEAKER_02]: Microsoft's doing it wrong.
[00:23:45] [SPEAKER_01]: I went to, as I've talked before, to a World Economic Forum event in San Francisco for the AI governance group I'm part of.
[00:23:53] [SPEAKER_01]: And a Salesforce executive was there on stage saying that you're not going to release agents unless you trust the software underneath to do what you're doing.
[00:24:05] [SPEAKER_01]: So Salesforce, I guess, trusted it enough to release this stuff.
[00:24:09] [SPEAKER_01]: Microsoft trusted it enough.
[00:24:13] [SPEAKER_01]: But who knows whether you're going to hand over too much of these agents.
[00:24:16] [SPEAKER_01]: But I think it's definitely the hot word of the moment.
[00:24:19] [SPEAKER_01]: I also attended virtually a Meta update on, basically, they're doing this.
[00:24:30] [SPEAKER_01]: It's not unlike what led to their oversight board where they held meetings around the world to understand the dynamics of content moderation.
[00:24:42] [SPEAKER_01]: And then they created the oversight board out of that.
[00:24:44] [SPEAKER_01]: Well, now they're doing similar meetings around the world with real people about AI and what the AI standards ought to be and so on and so forth.
[00:24:52] [SPEAKER_01]: So they held this event for an hour just talking through very PowerPointy of how they're talking people through it.
[00:24:59] [SPEAKER_01]: They're using the deliberative democracy structure out of Stanford University.
[00:25:04] [SPEAKER_01]: But what I found interesting was at the end, when they looked at some of the questions they were asking people, Meta was already assuming agents.
[00:25:13] [SPEAKER_01]: They weren't so much asking about AI.
[00:25:16] [SPEAKER_01]: They were asking about how you're going to feel about agents.
[00:25:20] [SPEAKER_01]: So everybody in this world, I think, is getting a little bit over their skis because we don't necessarily trust.
[00:25:28] [SPEAKER_01]: We've got something to focus on.
[00:25:29] [SPEAKER_01]: Yeah.
[00:25:29] [SPEAKER_01]: And so now they're looking at agents, agents everywhere.
[00:25:32] [SPEAKER_01]: Well, we'll see.
[00:25:33] [SPEAKER_01]: But boy, God, this week it was just amazing how everybody just stormed in with agents.
[00:25:39] [SPEAKER_01]: Everybody.
[00:25:40] [SPEAKER_02]: Yeah.
[00:25:40] [SPEAKER_02]: Well, and just real quick as far as Benioff is concerned, when he announced the Agentforce agents one month ago, he called the company's move into AI agents a hard pivot, which kind of illustrates the need to be there, essentially.
[00:25:59] [SPEAKER_02]: They are all feeling the need to be there.
[00:26:01] [SPEAKER_02]: The previous need to be there was we've got to have an AI play of some sort.
[00:26:05] [SPEAKER_02]: Now it's, oh, well, we've got to put these agents to work, these custom agents that interact with our customers or with whomever.
[00:26:15] [SPEAKER_02]: Anthropic's Claude 3.5 Sonnet, another example of this, they introduced a computer use, in quotes, capability.
[00:26:24] [SPEAKER_02]: Essentially, it can view the screen.
[00:26:26] [SPEAKER_02]: It can move cursors, click buttons, type, complete tasks independently.
[00:26:32] [SPEAKER_02]: And, you know, really kind of feels to a certain degree – I mean, that is very much agentry.
[00:26:37] [SPEAKER_02]: It also feels like this large action model sort of approach, you know, using a system or a machine the way a human uses it without kind of pre-coded stuff is kind of what that seems to allude to for me.
[00:26:51] [SPEAKER_02]: Or rather, pre-coded relationships between these two things, in essence.
[00:26:57] [SPEAKER_02]: If it's moving the cursor, then it's doing what it can to take an understanding of what it sees on the screen and know how to act on what it finds there.
[00:27:09] [SPEAKER_02]: Which is a little freaky, right?
[00:27:12] [SPEAKER_02]: Yeah, it's interesting.
[00:27:14] [SPEAKER_02]: That would be – yeah, that would be unsettling to see my cursor moving around on me and making actions on my screen.
[00:27:21] [SPEAKER_01]: I remember when our son, Jake, was I think only three years old.
[00:27:26] [SPEAKER_01]: We had – and he's more than 30 now.
[00:27:30] [SPEAKER_01]: So this is – you can guess what the technology was at the time.
[00:27:33] [SPEAKER_01]: But it was CDs and Arthur games and stuff like that.
[00:27:37] [SPEAKER_01]: And my parents were up in the computer room because you just had a computer room then.
[00:27:43] [SPEAKER_01]: And they were trying to run the computer and do the game with him.
[00:27:47] [SPEAKER_01]: And he finally just got fed up with them and took the mouse over and did it to their amazement, right?
[00:27:52] [SPEAKER_01]: Like, oh, geez, he could really do this all himself, right?
[00:27:55] [SPEAKER_01]: But it would be like we're being treated like grandparents.
[00:27:59] [SPEAKER_01]: And the machine is taking it over and saying, no, no, no, you idiot.
[00:28:01] [SPEAKER_01]: This is what you should be doing now.
[00:28:02] [SPEAKER_02]: Don't worry.
[00:28:02] [SPEAKER_02]: We got this.
[00:28:03] [SPEAKER_02]: Yeah.
[00:28:04] [SPEAKER_02]: It also –
[00:28:05] [SPEAKER_02]: You just sit back and take – and relax.
[00:28:08] [SPEAKER_02]: We got this.
[00:28:08] [SPEAKER_01]: And what if it does something wrong?
[00:28:11] [SPEAKER_01]: Do you get blamed for it?
[00:28:12] [SPEAKER_01]: Well, it was your machine.
[00:28:13] [SPEAKER_01]: Yeah.
[00:28:14] [SPEAKER_01]: You did this.
[00:28:15] [SPEAKER_01]: Well, no, actually, I didn't do it.
[00:28:16] [SPEAKER_01]: My agent didn't.
[00:28:17] [SPEAKER_01]: Is my agent did it the new – the dog ate the homework?
[00:28:21] [SPEAKER_01]: Yes.
[00:28:23] [SPEAKER_02]: Yeah.
[00:28:24] [SPEAKER_02]: It's your agent after all.
[00:28:27] [SPEAKER_02]: So, you know, you take responsibility.
[00:28:28] [SPEAKER_02]: It's Benioff's agent.
[00:28:29] [SPEAKER_02]: It ain't mine.
[00:28:32] [SPEAKER_02]: Well, if we're adopting it for use, I guess that would be Salesforce passing the responsibility off onto you then.
[00:28:41] [SPEAKER_02]: Exactly.
[00:28:42] [SPEAKER_02]: It's good.
[00:28:44] [SPEAKER_02]: Yeah.
[00:28:44] [SPEAKER_02]: And by the way, it's not getting any of this right.
[00:28:46] [SPEAKER_02]: As far as this is concerned, it completes less than 50% of airline booking tasks.
[00:28:52] [SPEAKER_02]: So not a great stat there.
[00:28:55] [SPEAKER_02]: Often makes errors.
[00:28:56] [SPEAKER_02]: It's kind of slow.
[00:28:58] [SPEAKER_02]: And I'll just put up the little picture that I found to illustrate how this works.
[00:29:03] [SPEAKER_02]: It's essentially taking screenshots of the screen.
[00:29:06] [SPEAKER_02]: It's calculating pixel distances for the cursor movement.
[00:29:11] [SPEAKER_02]: And then it's converting user prompts into specific computer commands based on that.
[00:29:17] [SPEAKER_02]: And so that's kind of what you're seeing here.
[00:29:19] [SPEAKER_02]: So I think we're early days for this type of agentry.
[00:29:23] [SPEAKER_02]: You know, there's a lot more understanding that needs to go into these things for them to be effective.
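[Editor's note: the loop just described, take a screenshot, work out pixel distances, turn the prompt into concrete commands, can be sketched roughly like this. The function names here are hypothetical stand-ins for illustration, not Anthropic's actual interface; only the pixel arithmetic is concrete.]

```python
def cursor_delta(cursor: tuple[int, int], target: tuple[int, int]) -> tuple[int, int]:
    """Pixel offsets needed to move the cursor from its current position
    to a target element located in a screenshot."""
    return (target[0] - cursor[0], target[1] - cursor[1])


def plan_click(cursor: tuple[int, int], target: tuple[int, int]) -> list:
    """Turn a target location into a move-then-click action sequence,
    the kind of low-level command a computer-use agent emits."""
    dx, dy = cursor_delta(cursor, target)
    return [("move", dx, dy), ("click",)]


# Example: cursor at (100, 200), a button spotted at (340, 460) in the screenshot.
print(plan_click((100, 200), (340, 460)))
```

In a real agent, this planning step would sit inside a loop: screenshot, model call to locate the next target, action, new screenshot, which is also why the approach is slow and error-prone today.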
[00:29:27] [SPEAKER_01]: As we've discussed, I think, last week or the week before with robotics, the first most important use I can see is with people with disabilities who can't use screens well.
[00:29:39] [SPEAKER_01]: And you want to tell it, make my airline reservation.
[00:29:42] [SPEAKER_01]: But the problem is this is going to be designed for us to be able to look at it and see it.
[00:29:45] [SPEAKER_01]: And I'm not sure that's going to be – and it's not going to be a good business model, which is a shame.
[00:29:51] [SPEAKER_02]: And, yeah, and people who are less able might have more trust in the system and not be as – might not scrutinize it as closely as a result.
[00:30:03] [SPEAKER_02]: And, you know, it buys the wrong ticket for you or who knows what that looks like.
[00:30:07] [SPEAKER_03]: No, stop!
[00:30:08] [SPEAKER_02]: Right, yeah.
[00:30:10] [SPEAKER_02]: And not quite the same scale, but last night on one of my other podcasts, Android Faithful, we discussed Motorola's Moto AI coming to smartphones.
[00:30:20] [SPEAKER_02]: And it's basically a large action model on phones similar to the Rabbit R1 that's back there on the tabletop somewhere.
[00:30:31] [SPEAKER_02]: And it can do things kind of along the lines of what we're talking about here, tasks like ordering coffee, requesting rides,
[00:30:38] [SPEAKER_02]: setting alarms, kind of doing this all on the device in a similar kind of like action-oriented way.
[00:30:46] [SPEAKER_02]: Although I'm guessing with this one it might be more pre-programmed things would be my guess versus we're virtually touching the screen and moving the virtual finger up to here to –
[00:30:59] [SPEAKER_02]: I don't think that's what's going on here.
[00:31:01] [SPEAKER_02]: But it just kind of illustrates like this is the – this really is the current direction of where everyone in tech, their minds are at when it comes to how can we really put this AI to work and make that money, make that bread.
[00:31:18] [SPEAKER_01]: Yeah, you think about it.
[00:31:19] [SPEAKER_01]: I guess you're – it's interesting.
[00:31:21] [SPEAKER_01]: Is your buy-in greater?
[00:31:22] [SPEAKER_01]: I mean is it – or does it reduce the buy-in?
[00:31:26] [SPEAKER_01]: That is to say I have to be in the Apple ecosystem and use Apple's agents for everything?
[00:31:33] [SPEAKER_01]: Or does an agent let you use anybody's anything?
[00:31:36] [SPEAKER_01]: I don't know.
[00:31:37] [SPEAKER_02]: Well, yeah.
[00:31:38] [SPEAKER_02]: Apple Intelligence, by the way, coming out next week.
[00:31:41] [SPEAKER_02]: And so we'll kind of see over time.
[00:31:45] [SPEAKER_02]: Yeah, I mean I'll have it on a phone but – well, actually no, I won't.
[00:31:49] [SPEAKER_02]: The phone that I have is not getting Apple Intelligence.
[00:31:51] [SPEAKER_02]: The iPhone that I have is not getting it.
[00:31:55] [SPEAKER_02]: Speaking of mobile, Axios' Ina Fried is making the case that on-device AI could make apps obsolete.
[00:32:03] [SPEAKER_02]: And this week was a big – has been a big week for Qualcomm.
[00:32:07] [SPEAKER_02]: Big product announcement and event in Maui of all places.
[00:32:11] [SPEAKER_02]: We'd love to go to Maui, by the way, Qualcomm.
[00:32:13] [SPEAKER_02]: So with this latest premium processor, the Snapdragon 8 Elite with the Oryon CPU inside.
[00:32:20] [SPEAKER_02]: Sam Altman even made a video appearance at the event, a tie-in to statements that he made weeks earlier about the progress toward on-device AI as a complement to what's done in the cloud.
[00:32:31] [SPEAKER_02]: And Ina is arguing that this is bringing the full power of a PC processor into the phone.
[00:32:37] [SPEAKER_02]: It's going to boost on-device AI dramatically.
[00:32:40] [SPEAKER_02]: And, of course, many Snapdragon top-tier processors end up in many flagships in the Android world.
[00:32:47] [SPEAKER_02]: So we're going to see a lot of phones with this capability coupled with the efforts that we're seeing from Google with Gemini and Gemini Nano.
[00:32:57] [SPEAKER_02]: Yeah.
[00:32:58] [SPEAKER_02]: I mean, Agentry probably going to get a big boost.
[00:33:01] [SPEAKER_02]: And I think at the end of the day, that's really kind of the point here is as compute on-device in the smartphone world increases and gets more and more powerful, do – and along with that, we begin to build habits around the agents that maybe potentially do the things that we actually want them to do.
[00:33:25] [SPEAKER_02]: Does that – I don't know.
[00:33:28] [SPEAKER_02]: Does that paint a view of the future where apps are far less necessary slash useful, like a true post-app world where, oh, well, we don't need apps anymore.
[00:33:40] [SPEAKER_02]: We've got the agents.
[00:33:41] [SPEAKER_02]: The agents do the things the apps used to do.
[00:33:43] [SPEAKER_01]: Right.
[00:33:44] [SPEAKER_01]: So I'm going to Germany next month, and I had to go to the United app.
[00:33:49] [SPEAKER_01]: I had to go to the Marriott app.
[00:33:52] [SPEAKER_01]: I had to go to the Newark parking app.
[00:33:55] [SPEAKER_01]: It's not hard to imagine that I just say to an agent that already knows that I'm a United flyer and where I fly from and that I have Marriott points and blah, blah, blah, blah.
[00:34:10] [SPEAKER_01]: I don't have to go to all those apps.
[00:34:12] [SPEAKER_01]: It could do it on its own.
[00:34:13] [SPEAKER_01]: That would be delightful.
[00:34:14] [SPEAKER_01]: And not only the apps but also the web because all I do is say it, and there's this whole visual idea of the interaction and pages and sites becomes just as irrelevant as apps.
[00:34:32] [SPEAKER_02]: Yeah.
[00:34:34] [SPEAKER_02]: Yeah.
[00:34:35] [SPEAKER_02]: Oh, boy.
[00:34:35] [SPEAKER_02]: That's why they're dying for agents.
[00:34:37] [SPEAKER_02]: Right?
[00:34:37] [SPEAKER_01]: So who controls that?
[00:34:41] [SPEAKER_01]: So rather than he or she who controls, the it that controls those transactions means that if I use Google's agent, it could earn commissions from United and from Marriott.
[00:34:55] [SPEAKER_01]: All right.
[00:34:57] [SPEAKER_02]: Thank you, Google, for pointing your users our way when they ask for a plane ticket.
[00:35:02] [SPEAKER_02]: Thank you.
[00:35:02] [SPEAKER_02]: We appreciate it.
[00:35:03] [SPEAKER_02]: Here's your payoff.
[00:35:04] [SPEAKER_02]: Yeah.
[00:35:05] [SPEAKER_02]: Yeah.
[00:35:05] [SPEAKER_02]: You're absolutely right.
[00:35:07] [SPEAKER_02]: That's interesting.
[00:35:08] [SPEAKER_02]: That really tips the financial scales of so much of what we've come to expect and rely upon, I suppose, on the internet as far as how the internet economy works.
[00:35:20] [SPEAKER_02]: That really shifts things.
[00:35:21] [SPEAKER_01]: Yeah.
[00:35:21] [SPEAKER_01]: People jump to that being voice controlled.
[00:35:24] [SPEAKER_01]: Make me a reservation too.
[00:35:26] [SPEAKER_01]: Maybe.
[00:35:27] [SPEAKER_01]: Maybe.
[00:35:28] [SPEAKER_01]: But it doesn't have to be.
[00:35:29] [SPEAKER_01]: But still, agentry bypasses all of those interfaces.
[00:35:38] [SPEAKER_01]: It becomes a single.
[00:35:38] [SPEAKER_01]: And no longer.
[00:35:39] [SPEAKER_01]: Yeah.
[00:35:40] [SPEAKER_01]: Sorry.
[00:35:40] [SPEAKER_01]: No, it just becomes a meta interface.
[00:35:42] [SPEAKER_01]: Go ahead.
[00:35:42] [SPEAKER_02]: Yeah.
[00:35:43] [SPEAKER_02]: I was just going to say, and no longer are our eyeballs seeing these things.
[00:35:48] [SPEAKER_02]: And so there's so much value in where is our attention.
[00:35:52] [SPEAKER_02]: We're paying less attention in that scenario.
[00:35:56] [SPEAKER_02]: We're merely interacting with an agent that's not an actual person paying attention to anything.
[00:36:02] [SPEAKER_02]: So where does that lead?
[00:36:04] [SPEAKER_02]: That's really interesting.
[00:36:06] [SPEAKER_02]: It is.
[00:36:06] [SPEAKER_02]: What does this look like if five years down the line, this is all built out and developed and actually goes in this direction?
[00:36:12] [SPEAKER_01]: I'd be curious to see.
[00:36:13] [SPEAKER_01]: And you think about it.
[00:36:14] [SPEAKER_01]: You know, back, I'm old enough to remember that when I wanted to go on a big trip, I would have to go to the travel agent and sit down and say, this is where I'm going.
[00:36:24] [SPEAKER_01]: And they had the access to the machine.
[00:36:25] [SPEAKER_01]: They could tell when the flights were.
[00:36:27] [SPEAKER_01]: They could tell what the hotel rates were.
[00:36:28] [SPEAKER_01]: They could do all that.
[00:36:29] [SPEAKER_01]: Yep.
[00:36:29] [SPEAKER_01]: So it's an appropriate word, I think.
[00:36:33] [SPEAKER_01]: With these functions being done.
[00:36:35] [SPEAKER_01]: Yeah, so true.
[00:36:35] [SPEAKER_01]: Yeah.
[00:36:37] [SPEAKER_02]: Yep.
[00:36:37] [SPEAKER_02]: I remember, you know, back when the internet was kind of taken off and it became more and more normal to book your travel yourself instead of going to, you know, a travel agent.
[00:36:51] [SPEAKER_02]: And a lot of the same kind of conversations happening.
[00:36:55] [SPEAKER_02]: Like, oh, the internet is putting travel agents out of business, you know?
[00:36:59] [SPEAKER_02]: And it's just another example of like, this is just how technology works.
[00:37:03] [SPEAKER_02]: It comes along, it makes things possible.
[00:37:05] [SPEAKER_02]: And ultimately, it democratizes more.
[00:37:09] [SPEAKER_02]: It seems to shift the power into the user versus the specialized person, which actually we're going to talk about in the second act of this show as well.
[00:37:18] [SPEAKER_01]: As long as our agents give us sufficient control and transparency.
[00:37:23] [SPEAKER_01]: I want to know whether the perplexity agent has a special deal that biases them toward Hilton.
[00:37:33] [SPEAKER_01]: You really do want to know that.
[00:37:35] [SPEAKER_01]: And what are you getting out of this?
[00:37:37] [SPEAKER_01]: And do I get a piece of your deal?
[00:37:39] [SPEAKER_01]: Yeah.
[00:37:39] [SPEAKER_01]: It's going to be interesting.
[00:37:41] [SPEAKER_02]: That is fascinating.
[00:37:43] [SPEAKER_02]: Well, we've got more coming up.
[00:37:44] [SPEAKER_02]: We've got some really interesting stories, including Microsoft and OpenAI's kind of relationship.
[00:37:51] [SPEAKER_02]: How's it doing?
[00:37:52] [SPEAKER_02]: We're going to check in on the two, on the couple here in a second.
[00:37:59] [SPEAKER_02]: OpenAI, as we have discussed in recent weeks, is in the process of transitioning from nonprofit to for-profit.
[00:38:08] [SPEAKER_02]: Although, what do they call it?
[00:38:09] [SPEAKER_02]: A for-profit?
[00:38:10] [SPEAKER_02]: They're thinking about it being a benefit corporation.
[00:38:12] [SPEAKER_01]: Yeah.
[00:38:13] [SPEAKER_02]: Benefit corporation.
[00:38:14] [SPEAKER_02]: Now negotiating with Microsoft over how its $13.75 billion investment will convert to equity.
[00:38:26] [SPEAKER_02]: And, I mean, I can only imagine that is just a puzzle.
[00:38:29] [SPEAKER_02]: That is just not a mess to manage.
[00:38:30] [SPEAKER_01]: A lot of lawyers and bankers are making a fortune right now going over this, trying to figure out the structures.
[00:38:37] [SPEAKER_01]: At the end of the day, it's just a negotiation.
[00:38:40] [SPEAKER_01]: Sure.
[00:38:41] [SPEAKER_01]: But Microsoft gave a lot of the value it gave was not in cash, but in chip access.
[00:38:51] [SPEAKER_01]: Yeah.
[00:38:51] [SPEAKER_01]: And recently, OpenAI complained a little bit publicly that they weren't getting enough of what they wanted.
[00:38:58] [SPEAKER_01]: Well, that's negotiation.
[00:39:00] [SPEAKER_01]: And you have other investors having just come into a big investment.
[00:39:04] [SPEAKER_01]: What was their preference?
[00:39:06] [SPEAKER_01]: How was that structured?
[00:39:07] [SPEAKER_01]: What does this cap table look like?
[00:39:08] [SPEAKER_01]: Look, I don't know of any time this has ever been done like this, where you have a huge entity that is officially a not-for-profit tied to a for-profit, now converting entirely to for-profit, with prior investors under one regime, new investors under this regime, and then another regime coming.
[00:39:27] [SPEAKER_01]: Oh, my Lord, it's complicated.
[00:39:30] [SPEAKER_02]: Yeah, super complicated.
[00:39:31] [SPEAKER_02]: It's complicated.
[00:39:32] [SPEAKER_02]: The best bromance in tech, falling apart under the pressure, the constraints.
[00:39:39] [SPEAKER_02]: I don't think it's falling apart.
[00:39:40] [SPEAKER_01]: I just think they're both negotiating.
[00:39:42] [SPEAKER_02]: Yeah.
[00:39:43] [SPEAKER_02]: Yeah.
[00:39:43] [SPEAKER_02]: At the end of the day, they both have a bottom line that they're trying to hit.
[00:39:48] [SPEAKER_02]: OpenAI is pursuing alternative funding sources because it has a projected loss of $5 billion this year.
[00:39:56] [SPEAKER_02]: You know, it's yet another one of those companies that, for as much attention and clout as OpenAI has gained and been given and everything, it's still losing money.
[00:40:09] [SPEAKER_01]: Yeah, and Microsoft was getting dinged pretty heavily for – on the one hand, they seemed to win big.
[00:40:16] [SPEAKER_01]: They got the primary relationship with OpenAI.
[00:40:17] [SPEAKER_01]: They did all the flashy things.
[00:40:20] [SPEAKER_01]: Google screwed up a demonstration.
[00:40:22] [SPEAKER_01]: Boy, wasn't Microsoft smart?
[00:40:24] [SPEAKER_01]: And the backlash came pretty quickly where, oh, I don't know.
[00:40:28] [SPEAKER_01]: Microsoft is too tied to OpenAI.
[00:40:30] [SPEAKER_01]: They're too at risk then.
[00:40:32] [SPEAKER_01]: Then they hired Mustafa Suleyman.
[00:40:34] [SPEAKER_01]: Then they did – right?
[00:40:35] [SPEAKER_01]: Then they did other things like that.
[00:40:37] [SPEAKER_01]: So it's a fast – is it really a fast-moving field?
[00:40:42] [SPEAKER_01]: Yes, it is.
[00:40:43] [SPEAKER_01]: But it's not as fast, I think, as they're trying to all make believe that it is.
[00:40:49] [SPEAKER_01]: And they've got to try to all stay ahead of the financial vibe here.
[00:40:54] [SPEAKER_01]: And I think that's what's doing a lot of this.
[00:40:57] [SPEAKER_01]: Where the real money is going to be in the future, we still don't really know.
[00:41:00] [SPEAKER_01]: Yeah.
[00:41:01] [SPEAKER_01]: What's worth investing in, it's hard to say.
[00:41:05] [SPEAKER_01]: But, yeah, fascinating to watch.
[00:41:07] [SPEAKER_01]: This will be really interesting to watch.
[00:41:08] [SPEAKER_01]: And I wonder whether there will end up being some suits with kind of discovery and neat stuff to learn.
[00:41:13] [SPEAKER_01]: We'll find out.
[00:41:14] [SPEAKER_02]: Yep.
[00:41:15] [SPEAKER_02]: Indeed, we will.
[00:41:16] [SPEAKER_02]: We will indeed find that out.
[00:41:19] [SPEAKER_02]: Speaking of suits, a few weeks after Tesla's WeRobot event, the producer of Blade Runner 2049 has filed a lawsuit against Tesla.
[00:41:32] [SPEAKER_02]: And this is kind of interesting because I guess we had had this conversation, and then I also talked with Sam Abuelsamid on the Techsploder channel shortly after the event to get his take.
[00:41:46] [SPEAKER_02]: He knows a lot about Tesla and everything, so that was an interesting discussion.
[00:41:50] [SPEAKER_02]: But it came up in both conversations where, wow, these vehicles, they look so much like a Blade Runner aesthetic.
[00:41:59] [SPEAKER_02]: And it turns out that wasn't at all a mistake or a fluke or an accident.
[00:42:07] [SPEAKER_02]: Not coincidental at all.
[00:42:09] [SPEAKER_02]: Tesla had requested permission hours before the event.
[00:42:13] [SPEAKER_02]: A little late.
[00:42:15] [SPEAKER_02]: Yeah, I don't know.
[00:42:16] [SPEAKER_02]: Why does this happen?
[00:42:17] [SPEAKER_02]: Like, I feel like this happened with OpenAI and the Scarlett Johansson thing, right?
[00:42:20] [SPEAKER_01]: Yes, yes, exactly.
[00:42:22] [SPEAKER_02]: Like, how do these things get left two hours before the event?
[00:42:26] [SPEAKER_02]: Or is this just how they're being reported?
[00:42:27] [SPEAKER_02]: I don't understand how that happens.
[00:42:29] [SPEAKER_01]: They don't have enough lawyers with enough power in the company yet.
[00:42:31] [SPEAKER_01]: But after things like this, they will get more lawyers with more power.
[00:42:35] [SPEAKER_02]: Yeah, yeah, undoubtedly.
[00:42:38] [SPEAKER_02]: So Tesla had requested to show an AI-generated image depicting a Blade Runner-esque future with, I'm guessing, with their vehicle.
[00:42:50] [SPEAKER_02]: And it resembled scenes from Blade Runner 2049 close enough that they felt like they needed to reach out and get the permission.
[00:42:59] [SPEAKER_02]: That request was denied.
[00:43:01] [SPEAKER_02]: So this story sounds very familiar to me.
[00:43:04] [SPEAKER_01]: And they went ahead and did it anyway because, hell, it's Musk.
[00:43:08] [SPEAKER_02]: Totally.
[00:43:08] [SPEAKER_02]: And not only did it anyway, my understanding is that they actually showed the image for quite a long time.
[00:43:15] [SPEAKER_02]: It was an 11-second display, as the article puts it.
[00:43:19] [SPEAKER_02]: It constitutes a marketing and advertising eternity.
[00:43:24] [SPEAKER_02]: Leaving that up there for 11 seconds.
[00:43:26] [SPEAKER_02]: Anyways, Musk still included it in the presentation.
[00:43:30] [SPEAKER_02]: Also saying, quote, I love Blade Runner, but I don't know if we want that future.
[00:43:35] [SPEAKER_02]: And the lawsuit alleges copyright infringement and brand misappropriation, saying the use of AI-generated imagery was a bad faith and intentionally malicious gambit.
[00:43:47] [SPEAKER_01]: Yeah, this is not fair use in any possible definition because it's a marketing use.
[00:43:52] [SPEAKER_01]: And you can't just use somebody's likeness or creativity for the sake of marketing your product without their permission.
[00:44:01] [SPEAKER_01]: But this is very, very musky.
[00:44:05] [SPEAKER_01]: Very.
[00:44:06] [SPEAKER_02]: It's getting musky in here.
[00:44:10] [SPEAKER_02]: Yeah.
[00:44:11] [SPEAKER_02]: Now we know why.
[00:44:12] [SPEAKER_02]: So there you go.
[00:44:14] [SPEAKER_02]: Probably in the end not going to amount to very much in the grand scheme of things.
[00:44:19] [SPEAKER_02]: But I do think it's interesting just because the overall vibe of what he's going with.
[00:44:24] [SPEAKER_02]: You know, all the way down to, I might add, like – and it's not like Blade Runner owns the cyberpunk aesthetic.
[00:44:30] [SPEAKER_02]: But, you know, had the music that really kind of sounded very similar and familiar.
[00:44:36] [SPEAKER_01]: What's amazing to me is that they could have – what's Musk's AI called?
[00:44:42] [SPEAKER_01]: Grok?
[00:44:43] [SPEAKER_01]: Yeah.
[00:44:43] [SPEAKER_01]: Are you talking about –
[00:44:44] [SPEAKER_01]: So if Grok were so good, with an hour to go, they could have invented a whole new world.
[00:44:50] [SPEAKER_02]: Yeah, totally.
[00:44:51] [SPEAKER_01]: Right?
[00:44:51] [SPEAKER_01]: They could have invented something entirely different.
[00:44:54] [SPEAKER_01]: You know, if you had to have artists do it, that's harder.
[00:44:57] [SPEAKER_01]: But, hey, we have Grok.
[00:44:59] [SPEAKER_01]: Grok can make up anything.
[00:45:00] [SPEAKER_01]: Let's make up Mars, Saturn, and anything.
[00:45:04] [SPEAKER_01]: Right?
[00:45:05] [SPEAKER_01]: But no.
[00:45:05] [SPEAKER_01]: Lazy SOBs just went and violated the copyright.
[00:45:11] [SPEAKER_02]: Yeah.
[00:45:12] [SPEAKER_02]: Yeah.
[00:45:12] [SPEAKER_02]: There you go.
[00:45:14] [SPEAKER_02]: So this is the aesthetic from Blade Runner 2049 on the left.
[00:45:20] [SPEAKER_02]: This is the image from the live RoboTaxi event.
[00:45:25] [SPEAKER_02]: Just change the color.
[00:45:27] [SPEAKER_02]: Yeah.
[00:45:28] [SPEAKER_02]: I mean, yeah.
[00:45:29] [SPEAKER_02]: Totally.
[00:45:31] [SPEAKER_02]: Yeah.
[00:45:32] [SPEAKER_02]: It's very, very similar.
[00:45:34] [SPEAKER_02]: Yeah.
[00:45:34] [SPEAKER_02]: You know, and hard to deny that it had anything to do with it when you've reached out for permission.
[00:45:40] [SPEAKER_02]: That's the key.
[00:45:41] [SPEAKER_02]: And you talk about Blade Runner.
[00:45:42] [SPEAKER_02]: Just like Scarlett Johansson.
[00:45:43] [SPEAKER_01]: You knew what you were doing.
[00:45:44] [SPEAKER_02]: Totally.
[00:45:45] [SPEAKER_02]: Totally.
[00:45:45] [SPEAKER_02]: You knew what you were doing.
[00:45:46] [SPEAKER_02]: Yeah.
[00:45:46] [SPEAKER_02]: Yeah.
[00:45:46] [SPEAKER_02]: In the end, is it going to matter a whole lot?
[00:45:48] [SPEAKER_02]: Probably not.
[00:45:49] [SPEAKER_02]: But it is unsurprising.
[00:45:52] [SPEAKER_01]: But it's stuff like this that really hurts the new AI industry's reputation.
[00:45:56] [SPEAKER_01]: Yeah, that's true.
[00:45:57] [SPEAKER_01]: With creators.
[00:45:59] [SPEAKER_01]: See?
[00:45:59] [SPEAKER_01]: Yeah.
[00:45:59] [SPEAKER_01]: They're trying to rip us off at every possible turn.
[00:46:01] [SPEAKER_01]: And so Musk does no favors.
[00:46:02] [SPEAKER_02]: See, they obviously don't care.
[00:46:04] [SPEAKER_01]: Right.
[00:46:04] [SPEAKER_02]: They don't care about ownership.
[00:46:06] [SPEAKER_02]: They don't care about copyright.
[00:46:07] [SPEAKER_02]: They just do it anyways because they're entitled and blah, blah, blah.
[00:46:10] [SPEAKER_01]: And I'm arguing for reading material as fair use when it comes to training models.
[00:46:19] [SPEAKER_01]: And that's a very unpopular position in my world.
[00:46:22] [SPEAKER_01]: But I hold it.
[00:46:23] [SPEAKER_01]: But this doesn't help that argument.
[00:46:25] [SPEAKER_02]: No.
[00:46:26] [SPEAKER_02]: No, it really doesn't.
[00:46:29] [SPEAKER_02]: Runway, which is a video generation company that we've talked about in recent weeks,
[00:46:36] [SPEAKER_02]: they had the video, the matching, well, now I can't remember exactly what they called it,
[00:46:45] [SPEAKER_02]: but the text-to-video where you can feed it your own video, and it will interpret that video
[00:46:51] [SPEAKER_02]: in any number of ways depending on how you define it.
[00:46:54] [SPEAKER_02]: So show me this video, but in a crochet aesthetic or animated or whatever.
[00:47:01] [SPEAKER_02]: And so they did some really cool things.
[00:47:03] [SPEAKER_01]: I remember that now.
[00:47:04] [SPEAKER_01]: Yeah.
[00:47:04] [SPEAKER_01]: Okay.
[00:47:04] [SPEAKER_01]: Yes.
[00:47:05] [SPEAKER_01]: Okay.
[00:47:05] [SPEAKER_02]: They now have a new feature called Act-One.
[00:47:11] [SPEAKER_02]: And actually, for video listeners, I'm going to just peel the onion back.
[00:47:16] [SPEAKER_02]: I have to change the technology and share this differently.
[00:47:20] [SPEAKER_02]: Otherwise, you're not going to be able to hear the audio when I play it.
[00:47:23] [SPEAKER_02]: Okay.
[00:47:23] [SPEAKER_02]: So if you're watching video, you'll see what we're talking about.
[00:47:27] [SPEAKER_02]: If you're just an audio podcast, you'll hear it.
[00:47:29] [SPEAKER_02]: But it's really, really kind of interesting what they're able to do here.
[00:47:33] [SPEAKER_01]: I know everywhere on the floor of the coffee shop.
[00:47:35] [SPEAKER_01]: I mean, it's one latte.
[00:47:36] [SPEAKER_01]: It's not that big of a deal.
[00:47:38] [SPEAKER_02]: Everything's fine.
[00:47:38] [SPEAKER_02]: Everything's totally great.
[00:47:39] [SPEAKER_02]: This is going to be the best day ever.
[00:47:41] [SPEAKER_00]: Smashed my phone the other night.
[00:47:43] [SPEAKER_01]: And you know how much I love that phone.
[00:47:45] [SPEAKER_01]: Okay.
[00:47:45] [SPEAKER_02]: All right.
[00:47:46] [SPEAKER_02]: Just breathe.
[00:47:47] [SPEAKER_01]: Okay.
[00:47:48] [SPEAKER_02]: Maybe you don't breathe that hard.
[00:47:50] [SPEAKER_02]: So the idea here is that this is essentially what I was talking about earlier as far as technology democratizing things that once before were incredibly gatekept, difficult, expensive.
[00:48:07] [SPEAKER_02]: And what we're seeing here, if you're watching the video version, is really about anyone taking a recording of themselves or an actor or whatever.
[00:48:17] [SPEAKER_02]: Even with, like, a smartphone.
[00:48:19] [SPEAKER_02]: It doesn't matter what camera you're using.
[00:48:20] [SPEAKER_02]: And recording an emotive performance, let's say, complete with facial expressions.
[00:48:28] [SPEAKER_02]: And then using this system, Act One, to essentially do what required motion capture engineers to do before.
[00:48:38] [SPEAKER_02]: And I mean, we've seen little bits and pieces of this, but when you watch the videos that they've given, the expressiveness of the animations that are tied to this, the AI-generated animations that are happening here when you feed it that video, it's really impressive.
[00:48:52] [SPEAKER_02]: Like I can imagine that out of advancements like this, we're probably going to see a lot of really interesting, and I think this is what gets me interested,
[00:49:06] [SPEAKER_02]: independent approaches to animating stories.
[00:49:10] [SPEAKER_02]: Because now I can just sit in front of a camera and I can be the character, and then I can run through this system and match it up with the character that I'm giving the system.
[00:49:22] [SPEAKER_02]: And it turns into this very expressive kind of, you know, maybe Pixar-like or whatever the aesthetic is that you're going for.
[00:49:29] [SPEAKER_02]: And that's a really strong storytelling platform that no longer requires a motion capture studio.
[00:49:35] [SPEAKER_02]: No longer requires all this expensive technology.
[00:49:38] [SPEAKER_02]: It's really impressive.
[00:49:41] [SPEAKER_01]: Yeah, and I can't draw, let alone animate.
[00:49:44] [SPEAKER_01]: Yeah.
[00:49:44] [SPEAKER_01]: And so to be able to tell the story I want to tell or explain something the way I want to, to use this is tremendous.
[00:49:51] [SPEAKER_01]: Yeah.
[00:49:52] [SPEAKER_01]: And we've imagined chat models as a way for people to tell their own stories in text, but this is really powerful.
[00:49:59] [SPEAKER_02]: Yeah.
[00:50:00] [SPEAKER_02]: Yeah, super cool.
[00:50:01] [SPEAKER_02]: I'm really curious to kind of see what some of the storytelling aspects of this are.
[00:50:06] [SPEAKER_02]: And I'm also prepared to hear from the people who have done this professionally and will continue to, you know, the ones who have built their careers around motion capture and doing these things.
[00:50:22] [SPEAKER_02]: Like, I don't know.
[00:50:23] [SPEAKER_02]: I guess the question, the curiosity that I have is, do both exist and for how long side by side?
[00:50:32] [SPEAKER_02]: You know what I mean?
[00:50:32] [SPEAKER_02]: Or at what point do tools like this become part of the tool set for the people who are professionals?
[00:50:39] [SPEAKER_02]: And what does that enable them to do, taking that tool set and going forward with it?
[00:50:43] [SPEAKER_01]: So I'm flashing on something I don't want to flash on.
[00:50:47] [SPEAKER_01]: Animated NFTs.
[00:50:51] [SPEAKER_01]: Gary Vee would use this: well, we made a new character, and we have three seconds of it, and it's going to be really valuable.
[00:50:57] [SPEAKER_01]: And it's just.
[00:50:58] [SPEAKER_02]: NFTs, is this still a thing?
[00:51:00] [SPEAKER_02]: Like, I never hear about them anymore.
[00:51:02] [SPEAKER_02]: Is it?
[00:51:03] [SPEAKER_02]: No.
[00:51:04] [SPEAKER_02]: No.
[00:51:04] [SPEAKER_02]: It probably is.
[00:51:05] [SPEAKER_01]: Ant had to leave, but we should have asked him.
[00:51:08] [SPEAKER_02]: Yeah.
[00:51:08] [SPEAKER_02]: Yeah.
[00:51:09] [SPEAKER_02]: So someone's yelling at us right now, almost certainly.
[00:51:11] [SPEAKER_02]: Yes.
[00:51:11] [SPEAKER_02]: Saying, yes, they are.
[00:51:13] [SPEAKER_02]: And I'll sell them to you.
[00:51:15] [SPEAKER_02]: Yes.
[00:51:16] [SPEAKER_02]: Please.
[00:51:17] [SPEAKER_02]: Could you buy mine?
[00:51:17] [SPEAKER_02]: Yeah.
[00:51:18] [SPEAKER_02]: Buy these that I already bought for too much money.
[00:51:21] [SPEAKER_02]: I need to unload them over pennies on the dollar.
[00:51:24] [SPEAKER_02]: Sorry, NFT fans.
[00:51:26] [SPEAKER_02]: We think you're great.
[00:51:28] [SPEAKER_02]: But yeah, the NFT thing.
[00:51:30] [SPEAKER_02]: I don't know.
[00:51:31] [SPEAKER_02]: And finally, Sotheby's Auction House having its first ever auction of artwork created by a humanoid robot.
[00:51:41] [SPEAKER_02]: The artist that you see here is named, is it Ada or Ai-Da?
[00:51:47] [SPEAKER_02]: I guess Ai-Da.
[00:51:47] [SPEAKER_02]: It's spelled A-I-D-A.
[00:51:49] [SPEAKER_02]: So probably Ai-Da.
[00:51:50] [SPEAKER_02]: The artwork that's being auctioned is called AI God.
[00:51:55] [SPEAKER_02]: It's a portrait of Alan Turing, of all people.
[00:51:58] [SPEAKER_02]: And it's a mixed media painting.
[00:52:01] [SPEAKER_02]: Do we have the painting?
[00:52:03] [SPEAKER_02]: The actual painting?
[00:52:03] [SPEAKER_02]: There it is.
[00:52:04] [SPEAKER_02]: Yeah, I mean, even if it's a mixed media painting, it still looks very digital.
[00:52:10] [SPEAKER_02]: Yes.
[00:52:11] [SPEAKER_02]: Like it's a digital image and not an actual physical image.
[00:52:15] [SPEAKER_02]: But I'm guessing it is.
[00:52:16] [SPEAKER_02]: It must be a physical image.
[00:52:19] [SPEAKER_01]: And it's supposedly going to go for between $120,000 and $180,000 because the robot made it.
[00:52:25] [SPEAKER_01]: Yeah.
[00:52:25] [SPEAKER_01]: I'm not so sure.
[00:52:28] [SPEAKER_02]: I mean, that's a lot.
[00:52:30] [SPEAKER_01]: But I guess if it, yeah.
[00:52:32] [SPEAKER_01]: Well, what you just showed has more, I'm not going to say value, but creativity and effort than this.
[00:52:45] [SPEAKER_01]: Is this like watching a dog paint with a brush, and you're amazed that a dog can do it at all?
[00:52:50] [SPEAKER_01]: I don't know.
[00:52:51] [SPEAKER_02]: Yeah.
[00:52:52] [SPEAKER_02]: Yeah.
[00:52:52] [SPEAKER_02]: I don't know.
[00:52:53] [SPEAKER_02]: I mean, it's a neat painting.
[00:52:55] [SPEAKER_02]: The value of art is very subjective, I suppose.
[00:53:00] [SPEAKER_02]: I certainly wouldn't see that kind of value in it.
[00:53:03] [SPEAKER_02]: But maybe it's more valuable because of the kind of flag it plants in the ground, you know, as being the first of its kind. Who knows, 20 years down the line we'll look back and be like, oh man, we're so used to seeing humanoid robots creating art.
[00:53:20] [SPEAKER_02]: I wish I could go back in time and buy the very first auctioned artwork created by a humanoid.
[00:53:27] [SPEAKER_02]: I can't find it now.
[00:53:28] [SPEAKER_02]: That's kind of the vibe I get.
[00:53:31] [SPEAKER_01]: Oh, God.
[00:53:32] [SPEAKER_01]: I can't find it right now.
[00:53:33] [SPEAKER_01]: But I remember going to some conference, probably in Germany, because that's where they do these crazy things, and a robot drew a portrait of me.
[00:53:44] [SPEAKER_02]: Oh, okay.
[00:53:46] [SPEAKER_02]: Interesting.
[00:53:46] [SPEAKER_02]: Like, how did that take place?
[00:53:49] [SPEAKER_02]: I think just had-
[00:53:50] [SPEAKER_02]: There was a robot there just drawing portraits of everybody?
[00:53:53] [SPEAKER_01]: Of what you do.
[00:53:55] [SPEAKER_01]: It was just a pen.
[00:53:56] [SPEAKER_01]: It was a very, very spare thing.
[00:53:59] [SPEAKER_01]: I wouldn't make it my new-
[00:54:02] [SPEAKER_01]: Yeah.
[00:54:05] [SPEAKER_01]: But it's not the first robot to draw.
[00:54:08] [SPEAKER_02]: No.
[00:54:09] [SPEAKER_02]: No, it's not.
[00:54:10] [SPEAKER_02]: I actually remember going to Google I/O a handful of years ago, and they had a section where, you know, they were kind of talking it up, saying, this is an AI drawing robot. And it was an arm on kind of a big platform that drew everybody their own unique
[00:54:31] [SPEAKER_02]: kind of artwork, and I don't know.
[00:54:33] [SPEAKER_02]: I mean, the art wasn't that impressive, to be honest.
[00:54:36] [SPEAKER_02]: But I suppose if you understood what was happening to make that art, it was probably impressive.
[00:54:41] [SPEAKER_02]: For me, I was like, eh, okay.
[00:54:43] [SPEAKER_02]: I hung it up in my office at Twit for a couple of years just because I needed things on the wall.
[00:54:49] [SPEAKER_01]: So let me-
[00:54:49] [SPEAKER_02]: Not because it was amazing.
[00:54:51] [SPEAKER_01]: I'm going to see if you get this quick enough.
[00:54:53] [SPEAKER_01]: I'm going to email it to you.
[00:54:56] [SPEAKER_01]: Oh, okay.
[00:54:56] [SPEAKER_01]: I just emailed you.
[00:54:58] [SPEAKER_01]: Share with you.
[00:54:58] [SPEAKER_01]: Okay, I'll look for it.
[00:54:59] [SPEAKER_01]: Share with Jason.
[00:55:00] [SPEAKER_01]: Send.
[00:55:00] [SPEAKER_01]: Yes, outside the organization.
[00:55:01] [SPEAKER_01]: Yes, just do what I say, agent.
[00:55:06] [SPEAKER_02]: We need the agents to be more effective.
[00:55:10] [SPEAKER_02]: Okay.
[00:55:11] [SPEAKER_02]: So while I work on this, is this going to be the artwork?
[00:55:14] [SPEAKER_02]: This is the artwork.
[00:55:16] [SPEAKER_01]: The robot made, yes.
[00:55:18] [SPEAKER_01]: It's not very impressive.
[00:55:19] [SPEAKER_01]: The robot made.
[00:55:20] [SPEAKER_02]: Well, I am looking for it, and hopefully I can find it.
[00:55:25] [SPEAKER_01]: This is not going to work with that.
[00:55:26] [SPEAKER_01]: I apologize, folks.
[00:55:29] [SPEAKER_02]: I'm having a hard time finding it.
[00:55:31] [SPEAKER_02]: If I can find it in post-show, I'll put it up there.
[00:55:35] [SPEAKER_02]: But yeah, I'm not seeing it yet.
[00:55:37] [SPEAKER_02]: It could come in any number of directions.
[00:55:41] [SPEAKER_02]: No, I'm not-
[00:55:42] [SPEAKER_02]: Okay, I just texted it to you.
[00:55:43] [SPEAKER_02]: I'm sorry.
[00:55:44] [SPEAKER_02]: Oh, wait a minute.
[00:55:45] [SPEAKER_02]: Oh, I'm seeing it.
[00:55:46] [SPEAKER_02]: Okay, hold on.
[00:55:47] [SPEAKER_02]: Hold on.
[00:55:48] [SPEAKER_02]: Allow me to move this to there.
[00:55:51] [SPEAKER_02]: Actually, I'm happy we did get this because I think that looks pretty cool.
[00:55:57] [SPEAKER_02]: Yeah, it's not bad.
[00:55:58] [SPEAKER_02]: That's-
[00:55:59] [SPEAKER_01]: Yeah.
[00:56:00] [SPEAKER_02]: It's totally like a line art drawn version.
[00:56:03] [SPEAKER_02]: Like I can totally see you in this art, especially at a smaller, like at a distance.
[00:56:09] [SPEAKER_02]: You know what I mean?
[00:56:09] [SPEAKER_02]: If I'm further away from it, I start to see it a little bit clearer.
[00:56:13] [SPEAKER_02]: But I think that's neat.
[00:56:14] [SPEAKER_02]: And that was done on paper?
[00:56:16] [SPEAKER_02]: Yeah.
[00:56:17] [SPEAKER_02]: Yeah.
[00:56:17] [SPEAKER_02]: Yeah.
[00:56:18] [SPEAKER_02]: Oh, that's super cool.
[00:56:19] [SPEAKER_02]: Cool.
[00:56:20] [SPEAKER_02]: And so that robot was just sitting around like with this purpose for everyone that wanted
[00:56:25] [SPEAKER_02]: a-
[00:56:26] [SPEAKER_01]: Yeah, it was a gimmick.
[00:56:26] [SPEAKER_02]: A portrait of themselves.
[00:56:28] [SPEAKER_02]: Well, that's neat.
[00:56:28] [SPEAKER_02]: Yeah.
[00:56:29] [SPEAKER_02]: So is this worth $120,000?
[00:56:32] [SPEAKER_02]: No.
[00:56:33] [SPEAKER_02]: Well, more than it is.
[00:56:34] [SPEAKER_01]: It's not worth it.
[00:56:35] [SPEAKER_01]: Not worth it.
[00:56:35] [SPEAKER_01]: It's worth more.
[00:56:37] [SPEAKER_02]: That's true.
[00:56:38] [SPEAKER_02]: Yes.
[00:56:39] [SPEAKER_02]: $180,000.
[00:56:40] [SPEAKER_02]: Yeah.
[00:56:40] [SPEAKER_02]: You're right.
[00:56:40] [SPEAKER_02]: You're right.
[00:56:41] [SPEAKER_02]: Thank you for sharing that.
[00:56:43] [SPEAKER_02]: That's really cool.
[00:56:44] [SPEAKER_02]: Sorry, audio listeners.
[00:56:45] [SPEAKER_02]: Yeah.
[00:56:46] [SPEAKER_02]: Because you're missing out.
[00:56:47] [SPEAKER_02]: Yeah, I'm missing out.
[00:56:47] [SPEAKER_02]: It's all right.
[00:56:50] [SPEAKER_02]: Proceeds, by the way, of that auction go to continued development of Ai-Da.
[00:56:56] [SPEAKER_01]: Yeah, Ai-Da is the same robot that you might remember, about a year ago, testified before
[00:57:00] [SPEAKER_01]: the House of Lords or Commons.
[00:57:01] [SPEAKER_01]: I can't remember which one in the UK.
[00:57:03] [SPEAKER_01]: So there was video of Ai-Da speaking to the legislators as part of testimony.
[00:57:09] [SPEAKER_01]: So maybe it was the first robot to testify.
[00:57:11] [SPEAKER_01]: I don't know.
[00:57:13] [SPEAKER_02]: Oh, interesting.
[00:57:14] [SPEAKER_02]: Not sure I remember that news story.
[00:57:17] [SPEAKER_02]: At least before we started our show here.
[00:57:20] [SPEAKER_02]: Yeah.
[00:57:21] [SPEAKER_02]: I'm sure it was on Twitter at some point.
[00:57:23] [SPEAKER_02]: Well, with that, we have reached the end of this episode of AI Inside.
[00:57:27] [SPEAKER_02]: Jeff Jarvis, always a pleasure.
[00:57:30] [SPEAKER_02]: JeffJarvis.com.
[00:57:32] [SPEAKER_02]: The easiest way to find Jeff's new book, The Web We Weave:
[00:57:37] [SPEAKER_02]: Why We Must Reclaim the Internet from Moguls, Misanthropes, and Moral Panic.
[00:57:42] [SPEAKER_02]: I love that you got moral panic on the cover.
[00:57:44] [SPEAKER_02]: Yeah, I did.
[00:57:45] [SPEAKER_02]: Yes.
[00:57:46] [SPEAKER_02]: You had to get that in there.
[00:57:48] [SPEAKER_02]: Definitely grateful to you, Jeff. JeffJarvis.com.
[00:57:50] [SPEAKER_02]: I'm grateful to anybody who buys it.
[00:57:50] [SPEAKER_02]: Thank you very much.
[00:57:52] [SPEAKER_02]: Right on.
[00:57:52] [SPEAKER_02]: Well, you do great work, and I appreciate you, Jeff.
[00:57:55] [SPEAKER_02]: Thank you so much for bringing it to the show each and every week.
[00:57:59] [SPEAKER_02]: Appreciate you.
[00:58:00] [SPEAKER_02]: Thank you.
[00:58:00] [SPEAKER_02]: Thank you.
[00:58:01] [SPEAKER_02]: Also, I appreciate everybody for watching and listening and supporting and subscribing.
[00:58:05] [SPEAKER_02]: So if you go to AIinside.show, you can find all the ways to subscribe to this podcast, so that you don't have to, you know, remember to catch it live or whatever.
[00:58:15] [SPEAKER_02]: It's all here.
[00:58:16] [SPEAKER_02]: Everything you need to know, including, you know, if you go into like last week's excellent episode where we chatted with Sarah Bird from Microsoft, you get access to the video there as well.
[00:58:27] [SPEAKER_02]: So there are a number of different ways.
[00:58:28] [SPEAKER_02]: And I've actually started adding time codes for different points in the podcast.
[00:58:35] [SPEAKER_02]: So if we're talking about news stories, I'll give you an indication of when each news story happens.
[00:58:42] [SPEAKER_02]: So hopefully you'll like some of that extra effort there.
[00:58:46] [SPEAKER_02]: And, you know, maybe it improves your viewing or listening of the show.
[00:58:51] [SPEAKER_02]: And then of course you can go to patreon.com slash AI Inside Show, where you can support the show in any number of different ways.
[00:59:02] [SPEAKER_02]: And you get access to ad free shows.
[00:59:04] [SPEAKER_02]: You get access to discord.
[00:59:06] [SPEAKER_02]: I did a live Zoom hangout with a bunch of patrons yesterday.
[00:59:11] [SPEAKER_02]: That was a lot of fun.
[00:59:13] [SPEAKER_02]: You can also get an AI inside t-shirt.
[00:59:15] [SPEAKER_02]: You do have to become an executive producer of the show in order to do that.
[00:59:19] [SPEAKER_02]: But hey, you'd be in good company.
[00:59:22] [SPEAKER_02]: Dr. Do, Jeffrey Maricini, WPVM 103.7 in Asheville, North Carolina, Paul Lang, and Ryan Newell are all executive producers.
[00:59:31] [SPEAKER_02]: You can be one too.
[00:59:33] [SPEAKER_02]: Thank you so much for your support each and every week.
[00:59:36] [SPEAKER_02]: We literally could not do this show without you.
[00:59:39] [SPEAKER_02]: Thank you once again.
[00:59:41] [SPEAKER_02]: And I know we've got some great news probably on the horizon for next week's episode.
[00:59:45] [SPEAKER_02]: So until next time, take care of yourselves.
[00:59:48] [SPEAKER_02]: We'll see you on another episode of AI Inside.
[00:59:51] [SPEAKER_02]: Bye everybody.