Jason Howell and Jeff Jarvis discuss Google's Project Jarvis browser control AI, GitHub's expansion of Copilot to multiple AI models, Meta's development of its own AI search engine, and Apple's rollout of Apple Intelligence features for select devices.
Support the show on Patreon!
Note: Time codes subject to change depending on dynamic ad insertion by the distributor.
NEWS
0:03:33 - Project Jarvis: Google Preps AI That Takes Over Computers
0:15:55 - More than a quarter of new code at Google is generated by AI
0:20:42 - GitHub Copilot will support models from Anthropic, Google, and OpenAI
0:29:38 - Meta builds AI search engine to cut Google, Bing reliance
0:30:50 - Meta AI gains access to Reuters news content in multi-year deal
0:33:50 - Meta is in fact the most well-placed company to take advantage of generative AI
0:38:25 - Meta takes on Google's NotebookLM with new NotebookLlama
0:42:59 - Apple Intelligence is available today
0:45:38 - Humane bets others need its AIOS
0:49:55 - UNIVERSAL MUSIC GROUP ENTERS INTO A STRATEGIC COLLABORATION WITH ETHICAL AI MUSIC COMPANY, KLAY
0:53:25 - LIGHTNING ROUND
Hospitals use a transcription tool powered by an error-prone OpenAI model
Linus Torvalds slams AI as "90% marketing and 10% reality"
Stability AI Is No Longer Controlled by Former CEO Emad Mostaque
Arcade, a new AI product creation platform, designed this necklace
[00:00:00] This is AI Inside, Episode 41, recorded Wednesday, October 30th, 2024. Google's Project Jarvis Takes Over the Browser.
[00:00:10] This episode of AI Inside is made possible by our wonderful patrons at patreon.com slash AI Inside Show.
[00:00:17] If you like what you hear, head on over and support us directly and thank you for making independent podcasting possible.
[00:00:30] Hello, everybody, and welcome to yet another episode of AI Inside.
[00:00:35] I'm one of your hosts, Jason Howell, here to talk a little bit about how AI is layered throughout so many things right now,
[00:00:44] from agents to coding to all sorts of things that we have on the docket today.
[00:00:48] Who is we? It's me, Jason. Also, Jeff Jarvis, my co-host. How you doing, Jeff?
[00:00:53] Hey, boss. How are you doing?
[00:00:54] Good. Man, this week, you put a wall of links down in the rundown.
[00:00:59] The news just kept coming.
[00:01:01] So what happens is I just put in links, links, links, links, links.
[00:01:05] I put the same ones in on Twig as well.
[00:01:09] They irritate the report there because it's overloaded.
[00:01:12] And then I just copy them over to AI Inside.
[00:01:15] And Jason does an excellent job, I point this out to him often, of picking out the good stuff to go put up into the rundown.
[00:01:21] But man, there was a lot this week.
[00:01:22] I mean, the problem is so much of what you put in is good stuff.
[00:01:26] So I always feel like I'm leaving out something that is worth discussing.
[00:01:30] But we only really have an hour.
[00:01:32] At this point, I think we've kind of realized or I've realized as I've put the show together,
[00:01:36] we have enough time for like eight to ten stories in this show before we completely tap out with discussion and everything.
[00:01:43] And there's so much happening in AI.
[00:01:45] We can't possibly talk about it all.
[00:01:47] But this is probably just a really great opportunity to tell you Jeff does another show where he talks a lot about AI called This Week in Google.
[00:01:54] If you go to twit.tv, you can find that wonderful show.
[00:01:59] And some of these stories that we don't talk about, maybe you talk about there too.
[00:02:03] You never know what's going to happen there.
[00:02:04] You know what we could do at the end of the show?
[00:02:05] We could do an absolute lightning round of going into what's left and saying there's this and this and this and this.
[00:02:10] We'll try that.
[00:02:10] This happened.
[00:02:11] This happened.
[00:02:12] This happened.
[00:02:13] Yeah.
[00:02:13] This is headline reading, though.
[00:02:14] Let's let's play that one by ear.
[00:02:16] All right.
[00:02:17] Maybe we could do a little bit of that one.
[00:02:19] No, you know what?
[00:02:20] It's a democracy.
[00:02:21] If I remember or if you remember to remind me, let's do that.
[00:02:26] Before we get started, big thank you to those of you who support us each and every week on Patreon.
[00:02:32] Patreon.com slash AI Inside Show.
[00:02:35] Our newest patron, Joe Stironis.
[00:02:38] Did I pronounce that correctly?
[00:02:40] Anyways, Joe, good to have you on board.
[00:02:43] Some of the patron numbers kind of increasing.
[00:02:46] So that's nice to see.
[00:02:48] Love to see the support.
[00:02:49] Thank you so very much for that.
[00:02:51] And, you know, if you want to help us out, go to Patreon.com slash AI Inside Show and you can do that.
[00:02:58] We really appreciate it.
[00:02:59] Also, if you are watching as we record live, well, you should subscribe to the podcast if you haven't already.
[00:03:06] That's found at aiinside.show.
[00:03:10] That's the web page.
[00:03:11] You can find all the subscribe links for all the podcatchers, RSS.
[00:03:15] As well as, you know, you can listen to and watch all of our episodes from the website as well.
[00:03:21] But, like, who does that?
[00:03:22] You subscribe to a podcast, right?
[00:03:24] So go subscribe.
[00:03:24] That's the best way to do it.
[00:03:27] And with that, why don't we dive right into our top story?
[00:03:32] And I will admit I had to put this in the top slot.
[00:03:35] Yes, because it feels like an important conversation to have, but also because it's called Project Jarvis.
[00:03:41] And anytime I see your name Jarvis used in a project, I got to put it at the top slot.
[00:03:51] So I feel like Jarvis is a name that isn't just this project, right?
[00:03:57] No, it's not.
[00:03:57] No, it's not because when Mark Zuckerberg some years ago wrote his personal agent.
[00:04:04] But it was named Jarvis.
[00:04:05] Yes, that's what it is.
[00:04:05] It was named Jarvis.
[00:04:06] And then let's not forget a certain movie costume is named Jarvis.
[00:04:11] Right.
[00:04:12] I think the issue is when I was growing up, the joke was that Jarvis was a name for a butler.
[00:04:18] Right.
[00:04:19] Jarvis!
[00:04:20] It's, yes.
[00:04:20] Bring the car around.
[00:04:22] And so I think it feels like something that's in servitude to you.
[00:04:26] Which I am to you, Jason, so you know how that feels.
[00:04:29] I am your servant, Jarvis.
[00:04:33] When I think of Jarvis, not because of you, but in general, I do, it does kind of sound like a butlery type name.
[00:04:40] Yeah.
[00:04:40] The movie that you're talking about, or movies rather, I think it's just the Marvel cinematic universe has Jarvis tied into it.
[00:04:48] I'm not a huge Marvel fan, so, I mean, they're fine.
[00:04:52] It's just I never got nerdy about it, so.
[00:04:54] You probably just lost 10 people from the audience saying that.
[00:04:57] I know.
[00:04:57] Sorry.
[00:04:58] Come back.
[00:04:59] Come back.
[00:05:00] What are we even talking about here?
[00:05:02] Google is reportedly developing a web browser controlling AI agent.
[00:05:07] So here we are again.
[00:05:08] That's definitely.
[00:05:09] Which is just like last week's story where Anthropic said we're going to take over your laptop, your computer, and Google says we'll take over your browser.
[00:05:18] We're just, you don't have to do anything anymore, people.
[00:05:21] The computers are just going to do everything for you.
[00:05:24] In the computer of the future, we just sit down at our desk and we just zone out and stare at the screen while it does all the things we want it to do.
[00:05:32] Yeah.
[00:05:34] With our slack jaw.
[00:05:38] This is called Jarvis, or Project Jarvis, according to a report in The Information, which, I mean, I'll show you, but I'm not logged in, so you get this subscribe-to-unlock sort of thing.
[00:05:51] Anyways, it responds to a person's commands by capturing frequent screenshots of what's on their computer screen and interpreting the shots before taking actions like clicking on a button or typing into a text field.
[00:06:05] This is from the information article.
[00:06:07] So common task assistance would be things like research, shopping, booking flights, the sort of stuff you'd open up a browser to do.
[00:06:19] And this would be, yes, of course, Chrome because it's a Google product.
[00:06:23] I can't imagine they'd have wider ambitions than that in the near term anyways.
[00:06:27] But I do wonder, because I don't think the report mentions it, but if it's Chrome, does that kind of assume that at some point Chrome OS could fall under the guise of this thing that isn't even confirmed to exist other than this report?
[00:06:41] Who the heck knows?
[00:06:42] It's all speculation.
[00:06:43] I mean, what strikes me, Jason, is isn't this similar to the new screenshot features that are available for Android?
[00:06:51] That you can take a screenshot of something on Android, right?
[00:06:54] And then, on the basis of that screenshot, have it organize them and have it do things.
[00:06:59] This sounds to me like it's very similar because it's built around screenshots of what you're seeing and thus doing.
[00:07:08] Yeah, it's understanding in real time what it's seeing on the screen.
[00:07:14] That is definitely similar between what the Screenshots app on Pixel, for example, what you're talking about does.
[00:07:22] Screenshots app on Pixel really is about pulling out pieces of information and then organizing them or making them searchable, accessible, that sort of stuff.
[00:07:31] Not as much about action.
[00:07:33] Like I don't know.
[00:07:33] This is more agentive.
[00:07:35] Yeah, exactly.
[00:07:37] So yeah, it's a component of it for sure, that recognition.
[00:07:40] And I imagine all the research and kind of products that they use to exercise that muscle just makes them all better as they build these things out and iterate on them over time.
[00:07:55] Yeah, last week we talked about how the app could go away because you're just going to say, make my trip for me.
[00:08:02] And it calls up Marriott and it calls up United Airlines and so on and so forth.
[00:08:08] But then I kind of thought last week, well, the page goes away.
[00:08:11] The web goes away too because you're just having to do actions.
[00:08:13] However, this is dependent upon there being a screen.
[00:08:17] Yeah.
[00:08:18] One presumes a web because it's a browser and you're doing things there.
[00:08:23] Yeah, I can't get my head around this fully of how it's going to operate.
[00:08:26] I guess it just becomes predictive and presumptive.
[00:08:30] If you're doing this, surely this is what you want to do next.
[00:08:34] But I don't know how well that works.
[00:08:35] Yeah, and if it's running in the browser, and we're only guessing because there's no concrete information here.
[00:08:41] We have no idea when we'll see this potentially release in 2025.
[00:08:45] It has the potential, according to the article, of being previewed as soon as December 6th, when the company is tipped to launch Gemini 2.
[00:08:56] It's already been almost a year since the first Gemini was released, which is kind of crazy.
[00:09:01] But what I'm wondering is, yeah, along the lines of what you're talking about, if it's in the browser specifically and it's agentic and it's interacting with it the way a user does,
[00:09:12] does that mean that I then have to give the Chrome browser access and control to my mouse control on a system level in order for it to be able to do that?
[00:09:23] Because the browser already knows what page you're on.
[00:09:27] It already has analyzed the content of that page, right?
[00:09:33] Because that's how it does search.
[00:09:35] That's how it scrapes the web.
[00:09:37] But how does it do the action?
[00:09:39] That's what I don't know.
[00:09:39] That's the next step here.
[00:09:40] Yeah.
[00:09:42] Or is it analyzing the code of the web page?
[00:09:45] Does it present you with options?
[00:09:46] Do you direct it to do something?
[00:09:47] This is all we don't know.
[00:09:48] So the long and short of this, folks, is we don't know what's going on here.
[00:09:51] Yeah.
[00:09:52] But it's interesting to think this is where the collective thinking of the AI folks in Silicon Valley is going.
[00:10:01] Mm-hmm.
[00:10:03] It's like the first step was AI can help you do all these things, and now AI is so good that you don't even need to do them at all anymore.
[00:10:11] Just tell it what you want done, and we'll go ahead and take care of it.
[00:10:14] Man, I still have a lot of hesitation around letting a system just take on full action for me.
[00:10:22] I just don't trust it enough.
[00:10:25] So years ago, not Marissa Mayer, former head of browser at Google.
[00:10:37] Marissa Mayer.
[00:10:38] Yeah, Marissa Mayer.
[00:10:38] Was it Marissa?
[00:10:39] Yeah, it's Marissa.
[00:10:40] Yeah.
[00:10:40] So years ago, I was at some event.
[00:10:42] Yes, it was.
[00:10:43] Where I was speaking, and I talked about the wonder (this is long, long ago, when she was still at Google) about how hyperlocal was going to be the next thing in news.
[00:10:52] And she stopped, and she said, no, I think you're wrong.
[00:10:56] I think it's going to be hyperpersonal.
[00:11:00] And ever since then, I think that's one, and she said straight out, you know, we know what you're doing, so we can predict what you want to do before you necessarily have decided you want to do it.
[00:11:11] That's been Google's goal for a very long time.
[00:11:18] Mm-hmm.
[00:11:45] Therefore, we'll present it to you at the right time or whatever.
[00:11:48] I feel like I'm always battling it.
[00:11:50] I'm like, no, I don't want that right now.
[00:11:52] Yeah.
[00:11:52] Like, that's a glimpse of me, but wrong time, wrong place.
[00:11:58] How dare you?
[00:11:59] Yeah.
[00:11:59] Yeah, yeah.
[00:12:00] Well, and I've joked about this before.
[00:12:02] Sure.
[00:12:03] When I'm typing along in a Google product and it guesses the next word I'm going to say, and I think, oh, crap, that was what I was going to say.
[00:12:11] But God damn it, I'm not going to say it now because that makes sense.
[00:12:14] I'm just putting out a cliche, right?
[00:12:16] Yeah.
[00:12:16] Right.
[00:12:17] Horses of a... oh, shoot.
[00:12:19] Right?
[00:12:19] And horses of a feather.
[00:12:22] I'll make it my own.
[00:12:24] And so, yeah, I'm not sure how we're going to react to this idea that it knows what we want before we know it.
[00:12:31] Now, there are cases where I'm typing along phrase and it is the word I want, and it's nice to hit tab and that's the rest of it.
[00:12:39] But it's not that useful.
[00:12:41] I guess that is one example that I have grown to use a little bit more.
[00:12:46] Not wholesale, not start to finish, but when it completes the sentence and like, oh, yeah, well, that is where I'm going.
[00:12:52] I've gotten more and more used to just accepting that, you know, that was the word I was going to say anyways, so I'm not going to feel bad about it.
[00:12:59] Yeah, when I go into Drive, it knows that every Tuesday and Wednesday I pull up the run for this show.
[00:13:10] So it puts it at the top of the list.
[00:13:12] That's useful.
[00:13:14] During the campaign, almost every morning when news comes out, I'm curious how the stock market reacts.
[00:13:20] So I look up stock pre-market and I get to the same page.
[00:13:24] And Google says, you look this up often.
[00:13:26] Thank you.
[00:13:28] Oh, okay.
[00:13:30] Right?
[00:13:30] So there are things, but that's pretty obvious.
[00:13:33] That's just habit.
[00:13:34] It could choose to say, good morning.
[00:13:36] You always look at this, Jeff.
[00:13:37] You want it now?
[00:13:38] Yeah.
[00:13:39] That'd be okay.
[00:13:40] Is that really valuable?
[00:13:42] I don't know.
[00:13:44] Yeah.
[00:13:45] It's interesting that you mentioned the whole Drive thing as far as offering up suggestions of what you might need at any given time.
[00:13:52] I never... like, I'm looking at it right now, and, like, suggested folders: pets.
[00:13:59] I'm like, why?
[00:14:02] I can't even tell you the last time I opened that folder.
[00:14:05] Like what are you thinking about me right now that I want to have immediate access to pets?
[00:14:10] Some of the stuff that's there is appropriate.
[00:14:14] But I think my problem with it is that I can never... I'm never guaranteed that when I look at that spot, I'm going to get the thing that I actually need in the moment.
[00:14:23] So therefore, I never look in the spot.
[00:14:25] True.
[00:14:25] Because if I look there and it's not there, then that's one extra step to get me to where I'm actually going.
[00:14:31] And so my habit becomes I just go right to the thing because I always know what the thing is.
[00:14:35] And that's my complaint around systems.
[00:14:38] And maybe AI will fix this as they develop it further.
[00:14:41] But that's my complaint with these systems: it always... it just feels like it wants to be smarter than it actually is as far as how it relates to saving me time or saving me attention.
[00:14:53] Right.
[00:14:54] And Joe points out, Ozone Nightmare, that people will figure out how to game the results similar to SEO.
[00:15:01] And I think that's right.
[00:15:03] We'll be trying to train the system to give us what we actually want.
[00:15:09] Yeah.
[00:15:10] But that's a pain.
[00:15:11] I mean, I remember back to the early days of Yahoo News and you could have your own Yahoo News.
[00:15:15] And you had to go through 15 minutes of answering questions and then you never went to it.
[00:15:19] We all set up Yahoo News once.
[00:15:22] And when you're setting up...
[00:15:23] You're like, oh, this is great.
[00:15:25] This is going to be so useful, blah, blah, blah, blah.
[00:15:27] You know, like you're actually legitimately excited.
[00:15:29] I can't tell you how many times I've done that, had that experience where I'm like, finally, okay, yes, I'll do the work.
[00:15:36] And then it's just not part of your habit.
[00:15:38] It's not part of your routine.
[00:15:39] It isn't effective, whatever the case may be.
[00:15:41] Ozone Nightmare, thank you for the super thanks.
[00:15:43] Thank you.
[00:15:45] You're incredibly supportive and we appreciate you.
[00:15:48] Yep.
[00:15:48] Thank you for piping up.
[00:15:50] Yeah, interesting.
[00:15:53] Interesting conversation there.
[00:15:54] But we're not done with Google, because speaking of Google, they had their quarterly...
[00:15:58] See, there's more Google on this show than there is on this week in Google.
[00:16:01] Let's just make that clear.
[00:16:04] You guys are going to talk about earnings, I'm sure, right?
[00:16:07] That almost always... maybe, maybe not.
[00:16:09] Which is an AI story too, by the way, because AI is what brought Google's earnings up.
[00:16:14] Well, so that's a good question.
[00:16:16] Like services revenue up 13% year over year.
[00:16:19] Google Cloud, which includes AI infrastructure for companies, up 35% year over year.
[00:16:25] But they're not breaking out like AI numbers in that, right?
[00:16:29] No.
[00:16:30] No, they're not breaking out the revenue.
[00:16:32] But things are rising and AI is undoubtedly integrated into those aspects of the business that are doing well.
[00:16:38] So it's hard to know how much that makes the imprint, you know?
[00:16:43] Yeah.
[00:16:44] Anyway.
[00:16:45] During the call, Sundar Pichai, CEO, mentioned, quote,
[00:16:49] More than a quarter of all new code at Google is generated by AI, then reviewed and accepted by engineers.
[00:16:58] More than a quarter of all new code at Google generated by AI.
[00:17:02] Like that's surprising to me.
[00:17:06] Like I realize, you know, Google loves to dog food their stuff.
[00:17:09] And this is obviously a part of that.
[00:17:11] But like what does that mean?
[00:17:14] Like wholesale code generation, adding comments to code that was already written.
[00:17:19] Like what does that mean?
[00:17:20] More than 25% is a huge number.
[00:17:24] Yeah.
[00:17:24] And I would love to know what the metric is there.
[00:17:26] Is it lines of code?
[00:17:28] Is it?
[00:17:28] Right.
[00:17:29] I don't know that.
[00:17:30] But still, it's impressive.
[00:17:32] It's impressive.
[00:17:34] It is.
[00:17:35] And hey, kids, before you go off and get that degree in computer science,
[00:17:39] maybe you might want to consider another angle on being close to technology.
[00:17:44] Yeah.
[00:17:44] Which is what I'm working on.
[00:17:46] Is that your next book?
[00:17:48] No, it's the next.
[00:17:49] So I have two new affiliations.
[00:17:51] I'm working now.
[00:17:52] So I let this loose this week.
[00:17:55] I now, since I'm emeritus, Latin for old, at CUNY,
[00:17:59] I am now working with Stony Brook University and Montclair State University.
[00:18:02] Right on.
[00:18:03] And in both places trying to work on programs that are about technology and society
[00:18:07] and bringing in other disciplines.
[00:18:08] So yes, that is what I'm working on next.
[00:18:10] Right on.
[00:18:11] That's amazing.
[00:18:13] Congratulations.
[00:18:13] So you can be around AI, you can be part of AI, you can be part of leading AI,
[00:18:16] but you don't have to be an engineer.
[00:18:19] Yeah.
[00:18:20] Well, yeah.
[00:18:21] I mean, the story after this one definitely ties into what we're talking about.
[00:18:25] But it's really interesting to me how the kind of the thing around being a coder,
[00:18:33] being a developer has been so much of the same kind of definition for so long.
[00:18:40] And in the last couple of years, that definition is shifting.
[00:18:44] I mean, and I'm not saying that AI is taking over for development, but it's introducing this new tool set.
[00:18:52] And I'm sure, undoubtedly, there are developers that have a problem with handing over the control and want to use it 100%.
[00:19:01] But there's the new, like you're saying, there's the new up and comers that the industry and the job, as it were,
[00:19:09] will be shaped by the fact that the new people coming into the industry are going to use these tools.
[00:19:16] These tools will be either a supplement or a replacement for parts of the job that are done already.
[00:19:24] And yeah.
[00:19:25] And I mean, Google is putting proof to that.
[00:19:27] I mean, if we trust them at their word when they say more than 25% of all of their code, it's just, that's just crazy to me.
[00:19:35] Like that just blew me away.
[00:19:36] I was like, wow.
[00:19:37] Okay.
[00:19:38] It really is.
[00:19:38] So I did last weekend or last week, I did the Boston Book Festival on Saturday, and I was on a panel with Andrew Smith, who wrote a new book called, trying to find it here, Devil in the Stack, I think it is, in which he spent a year learning how to code.
[00:19:58] And, you know, he got really into it as a, I haven't read the book yet, but learning it as a culture, as a language, obviously, as a worldview, as a way to think.
[00:20:11] And so I wonder how much of that stays in this world, even if you are coding, but you're doing it through these ways.
[00:20:17] Now, he said he uses Copilot and it's useful and it helps him out and that's fine.
[00:20:22] So he's still a coder and still thinks that way.
[00:20:25] But I think it might change the culture of coding in ways that I can't fully understand because I'm not one.
[00:20:31] Totally.
[00:20:32] Yeah, I completely agree.
[00:20:33] I couldn't understand that either.
[00:20:37] Yeah, yeah.
[00:20:38] It's just really, really interesting.
[00:20:39] And I mean, we got to talk about the next story because it totally ties into this.
[00:20:43] GitHub opening its Copilot to be multi-model, not modal, multi-model.
[00:20:50] So not just using OpenAI, ChatGPT, but also its competitors, Google, Anthropic.
[00:21:00] Developers will be able to chat with them in Copilot Assistant eventually.
[00:21:04] But OpenAI's models will reside in the default setting, at least for now.
[00:21:09] So, you know, essentially Microsoft not putting all of its eggs in the OpenAI basket.
[00:21:17] So there's that.
[00:21:18] There's kind of multiple parts to this story.
[00:21:20] But I mean, I guess that speaks more to Microsoft, you know, pulling back a little bit from the relationship with OpenAI.
[00:21:29] Keeping it, but also kind of differentiating a little bit.
[00:21:32] Yeah, which relates to what we talked about last week with the negotiations going on now about the cap table for the for-profit public OpenAI.
[00:21:40] So it really struck me as a shot across OpenAI's bow.
[00:21:45] We don't need just you.
[00:21:47] We can do without you.
[00:21:48] We can do other things.
[00:21:49] It's also obviously good for Copilot.
[00:21:51] It gives people who would prefer Anthropic or prefer Google another path.
[00:21:55] That's good for the product.
[00:21:56] That's smart.
[00:21:57] But, yeah, I think the timing of this fits into the OpenAI-Microsoft negotiations.
[00:22:05] Yeah, 100%.
[00:22:06] OpenAI, you're not that valuable to us.
[00:22:08] You're neat.
[00:22:09] Don't go away.
[00:22:11] You can hang around here.
[00:22:13] We're not saying no.
[00:22:13] We're not saying no.
[00:22:14] We're just saying yes and.
[00:22:18] Let's date more people.
[00:22:20] You know, I don't think we want to.
[00:22:20] Let's open up this relationship.
[00:22:22] Yeah, yeah.
[00:22:24] We're still friends.
[00:22:25] An open relationship.
[00:22:26] Yeah, there you go.
[00:22:27] That's what it is.
[00:22:27] It's an open-ish relationship, as you would say.
[00:22:30] Right.
[00:22:31] And I wonder how OpenAI feels about that, you know?
[00:22:34] Yeah, I do.
[00:22:35] They could be really bummed.
[00:22:36] Sorry, OpenAI.
[00:22:39] GitHub also shared a new AI tool called Spark, which allows for building micro web apps using natural language,
[00:22:48] which, you know, is kind of nothing new in the last couple of years.
[00:22:51] You know, there's a lot of that happening.
[00:22:53] This is meant to enable anyone, even non-coders, to create apps.
[00:22:58] But also, it gives experienced developers the ability to manipulate the code as needed.
[00:23:05] It can all be previewed within the experience compared against other versions, along with the changes that are made there.
[00:23:14] Sounds like a pretty powerful approach to coding.
[00:23:18] You know, and again, this ties in with what we were talking about just moments ago, which is just that really, and I bet you, I'm almost certain that developers hate to hear the phrase,
[00:23:28] this means that anyone can be a developer.
[00:23:31] Because they're like, no, that's not true.
[00:23:33] But more so than without, I mean, I couldn't develop, you know, beyond Commodore 64 BASIC, you know, in my lifetime.
[00:23:41] And using something like this, I could potentially come up with a block of code.
[00:23:45] Did I create it?
[00:23:46] No.
[00:23:46] No.
[00:23:46] But I could, that's a skill that I could actually do something with that I couldn't before without a lot of studying.
[00:23:55] But you know what?
[00:23:55] It goes to what we talked about last week with all the agent stuff.
[00:23:59] What's an app?
[00:24:00] What is an app?
[00:24:01] Why is this creating an app?
[00:24:03] If I can tell it to do something and to replicate that in the future, I don't need to know what the code is.
[00:24:09] I don't need to know what's behind it.
[00:24:11] I don't need to run it, right?
[00:24:12] I don't need to play host to it, potentially.
[00:24:15] I can say, you know that thing I had to do, getting me a plane reservation?
[00:24:19] I want you to do it again.
[00:24:21] Oh, yeah.
[00:24:21] And whatever it did behind the scenes to do that, if it's memorized, it's there.
[00:24:27] And it may be an app in an old nomenclature, but now it's just a command.
[00:24:31] Yeah.
[00:24:32] I mean, I do that, you know, here comes your bingo card perplexity.
[00:24:36] Like, I do that every day with perplexity.
[00:24:38] I've got my commands saved.
[00:24:40] I can't remember what they call it now.
[00:24:42] I think it used to be called Spaces or something, and they call it something different.
[00:24:45] But anyway.
[00:24:46] So do what, for example?
[00:24:48] Well, so for example, like when we're done with this podcast, I'm going to have a transcript of our show.
[00:24:54] And I've got a specific prompt that, you know, breaks out keywords so that I can feed those into YouTube.
[00:24:59] It gives me like five different like versions of like short one to two sentence descriptions of the episode.
[00:25:08] And over time, you know, I've kind of crafted the prompt to have it be more specific, not vague and general, and not too much of the "amazing advancements in AI" stuff it loves to do, you know.
[00:25:20] And I mean, really, when you think about it, you know, I have, it's not an app, but it's, I'm using it like an app.
[00:25:27] I'm going in there and I'm saying, here's my text, hit the button and run the program.
[00:25:33] Right.
[00:25:33] It creates these things that I wanted.
[00:25:36] And that's, yeah.
[00:25:37] That's what we're talking about.
[00:25:38] So it's post app.
[00:25:41] Yeah.
[00:25:41] Yeah.
[00:25:42] And so then to that end, maybe the Jony Ive, Sam Altman hardware device thing makes more sense because it's like, why do you need a phone that runs apps?
[00:25:54] Maybe you just need an AI agent device, but I don't know why that it couldn't just be the phone, you know.
[00:25:59] But, you know, anyways.
[00:26:02] GitHub Copilot, by the way, has a $2 billion revenue run rate.
[00:26:06] So they're doing pretty well.
[00:26:07] More than 1 million paid subscribers.
[00:26:10] And I think just in general, this and the previous story just kind of had my mind going as far as like, when we're looking at generative AI, what are the examples of kind of like the avenues that are winning right now that are showing some real progress?
[00:26:28] And it kind of seems like this is one of them.
[00:26:30] You know, this has the potential to make some serious money.
[00:26:34] Yeah.
[00:26:36] I don't know.
[00:26:36] There's a lot of naysaying around generative AI and like, oh, it's just bluster and, you know, it's just overblown and whatever.
[00:26:43] And, I mean, examples like this just make me go, yeah, but that's pretty cool.
[00:26:47] Like, I don't think that's going anywhere.
[00:26:49] You know, I don't think that's going away.
[00:26:51] Yeah, I was surprised.
[00:26:51] One of the things that's not in the rundown, just to mention real quickly, is that the CFO of OpenAI said that paid ChatGPT subscriptions make up most of their revenue.
[00:27:03] 75% of the overall business comes from consumer subscriptions.
[00:27:06] That surprised me immensely.
[00:27:08] Oh, wow.
[00:27:09] So there seems to be a business in this now.
[00:27:12] Now, is it just a curiosity business?
[00:27:14] Is it everybody in the world has a podcast and so you've got to subscribe to OpenAI and talk about it?
[00:27:18] Right.
[00:27:19] Yeah.
[00:27:20] What are the uses?
[00:27:21] Is that sustainable?
[00:27:22] Does that keep going or not?
[00:27:24] I don't know.
[00:27:25] You know, because everybody's wondering where is the actual business in large language models and generative AI?
[00:27:31] Is this indicative?
[00:27:33] Maybe.
[00:27:33] Yeah.
[00:27:33] Yeah.
[00:27:34] Yeah, it's kind of surprising.
[00:27:37] Yeah, it's interesting.
[00:27:40] Most of their business made up from just paid subscribers.
[00:27:44] Yeah, like you.
[00:27:45] What is the conversion?
[00:27:46] Yeah, right.
[00:27:47] Yeah, absolutely.
[00:27:49] Very interesting.
[00:27:51] See, I snuck a story in there.
[00:27:53] No, it's great.
[00:27:54] At what time does that potentially break down or maybe it doesn't?
[00:28:00] I don't know.
[00:28:00] Right.
[00:28:01] Or at what point do paid subscribers realize they're not getting the benefit that they used to?
[00:28:07] Or I don't know.
[00:28:08] Like, you know, I've mentioned it plenty of times.
[00:28:11] Like, I'm coming up on a year with Perplexity at this point.
[00:28:14] And I could not imagine doing what I do this past year without the assistance of it.
[00:28:20] Wow.
[00:28:20] Like, it has improved my workflow tremendously.
[00:28:25] And so to put that genie back in the bottle, like, you know, if I had to, I would.
[00:28:32] Like, if they went away, if all of these services just went away and I had no choice, like, obviously I'd figure it out.
[00:28:37] But given the choice and given what my experience has been so far, I wouldn't.
[00:28:42] Like, I will pay the premium.
[00:28:44] Can you transfer all those functions to OpenAI, to ChatGPT?
[00:28:49] Well, that's a good question.
[00:28:50] I mean, probably.
[00:28:51] I could probably take the prompts that I'm using into Perplexity, which, by the way, you know, is a multi-model service also.
[00:28:59] So I have ChatGPT through it.
[00:29:02] But if I wanted to just go to ChatGPT and, you know, pay over there and do everything there, sure.
[00:29:07] Or Meta.ai or Gemini or any of them.
[00:29:10] Yeah.
[00:29:10] I mean, I could take those prompts and start using them over there and then start fine-tuning them as need be, you know, if they're not giving me exactly what I want.
[00:29:17] But, I mean, that is also part of the exploration is, like, the more I use it, the more comfortable I get in understanding how to phrase my prompts and my questions in order to get the things that I'm really looking for.
[00:29:29] What this model is good at versus what that model is good at and all that stuff.
[00:29:33] So, yeah.
[00:29:35] Very interesting.
[00:29:38] Meta is developing its own AI-powered search engine.
[00:29:42] This also according to The Information, an effort to reduce its dependency on Google and Bing for real-time information in Meta AI.
[00:29:55] Been building it since earlier this year.
[00:29:58] Meta AI currently sees around 185 million active weekly users across its many apps, you know, WhatsApp and Facebook and Instagram and everything.
[00:30:07] Meta AI is firmly implanted in those experiences, just begging you to interact with it, even though I don't know that I have more than a handful of times.
[00:30:16] But there it is.
[00:30:18] It also goes to the Google antitrust case: if it's not a big deal for Meta to start a search engine, and for OpenAI to become a search engine, as we've discussed before, then search is not a monopoly anymore at the consumer level.
[00:30:35] So, Meta joining in there, one could say it's bad news for Google.
[00:30:39] But again, people use TikTok for search.
[00:30:43] They use Amazon for search.
[00:30:44] They use lots of ways to search.
[00:30:45] Oh, for sure.
[00:30:46] So, Meta to have a search, yeah, not a big deal.
[00:30:48] Yeah, yeah.
[00:30:50] Meta also announced last week a deal with Reuters for current events and for news questions to be, you know, integrated to get more of that, like, real-time kind of quality.
[00:30:59] So, this pissed me off.
[00:31:01] Oh, yeah.
[00:31:02] Yeah, it pissed me off.
[00:31:03] Let me tell you.
[00:31:03] Let me tell you, it pissed me off.
[00:31:04] Because the problem is, for those of us, like... man, I'm a rare beast.
[00:31:09] I'm a news junkie.
[00:31:10] When I put news links into Meta's products now, you know they get downgraded.
[00:31:16] They don't want anything to do with news.
[00:31:18] They're getting ready to get rid of news, I think, if legislation, you know, forces them to.
[00:31:24] And so, we can't talk about news in Facebook anymore.
[00:31:29] So, it looks like Meta is going to be able to serve us news, but we can't serve each other news.
[00:31:33] And that's going to piss me off.
[00:31:36] Meta says, okay, well, we can have this and our AI can serve you news, but you can't do this yourself, folks.
[00:31:42] And in Canada, of course, you absolutely can't do it because news is entirely off the platform.
[00:31:47] So, this actually peeved me in a way that Meta can do what we can't do.
[00:31:53] Interesting.
[00:31:54] Yeah, it's a good perspective.
[00:31:57] So, if I go on there and I share news, like, I can, but it's just not going to reach many people.
[00:32:03] Nobody's going to see it.
[00:32:04] It's going to be ranked down to almost nothing.
[00:32:05] Right.
[00:32:06] And this is what people try to do is they say, you know, link in the comments as if that's going to get around Meta.
[00:32:09] Like, Meta can't figure out, oh, that's what people are doing.
[00:32:12] Right.
[00:32:13] So, if I put up a screenshot... well, what happened in Canada was people started putting up screenshots, and then the Canadian government went after Meta and said, see, you still have news.
[00:32:22] You should be paying for that even though Meta's not doing it.
[00:32:25] So, I suspect they've certainly learned how to downgrade screenshots of text as well.
[00:32:30] So, I think you could probably put... you could put in an image of something and talk about that as an image.
[00:32:36] And you can sneak the news in that way, certainly.
[00:32:38] You can talk about it.
[00:32:39] But can you provide to your readers the link to the same thing you're talking about, which is the basis of this?
[00:32:44] Not really anymore.
[00:32:46] So, if that link is in there, then that's going to... yeah.
[00:32:49] It appears.
[00:32:50] We don't β you know, they never say it.
[00:32:51] We don't know it.
[00:32:52] Right.
[00:32:52] But it's just common knowledge now that if you put a news link in...
[00:32:56] Well, I was just talking with someone the other day kind of somewhat related to this as far as like video strategy and everything.
[00:33:01] And he was saying, you know, if you're sharing your YouTube links into Facebook because you want more people on Facebook to see it, they're probably not going to.
[00:33:11] What you got to do is you got to upload your video to Facebook because Facebook doesn't want to show you other things.
[00:33:16] They want to show you their things.
[00:33:17] Yes.
[00:33:22] But she refuses to have an account, so she couldn't watch it.
[00:33:25] I see TikToks all the time on Twitter.
[00:33:27] Yeah.
[00:33:28] You got to download it first and then upload it as if it's your own video.
[00:33:34] Oh.
[00:33:34] Oh, on Twitter.
[00:33:34] So, if you see a TikTok on Twitter or Facebook, it's because somebody did that.
[00:33:39] Now, the branding is still clear.
[00:33:41] It's TikTok, so TikTok's fine.
[00:33:42] They get the view.
[00:33:43] Yeah, right.
[00:33:44] Yeah.
[00:33:44] They bake that right in there.
[00:33:47] Yeah.
[00:33:48] Interesting.
[00:33:49] Well, so how do you feel about Ben Thompson, the wickedly smart Ben Thompson at Stratechery, who wrote about how he sees Meta as best positioned in the generative AI race right now?
[00:34:02] He says, short-term AI-powered advertising, longer-term content integration, evolving from a social network to a content network, even longer-term.
[00:34:11] You've got your XR projects like Project Orion.
[00:34:15] Do you share his rosy outlook on Meta's potential upside when it comes to generative AI?
[00:34:21] I have argued on this show that Meta is the spoiler in AI, because if you look at the people, you know, 75% of the business of OpenAI is individual subscriptions.
[00:34:31] Well, I can go to meta.ai right now and do it for free.
[00:34:34] Yeah.
[00:34:35] And so, I think that has a huge effect, plus, obviously, Llama being out there as a free alternative.
[00:34:42] So, I think Meta is the spoiler.
[00:34:44] That I absolutely think.
[00:34:45] What Ben argues, and I've got to admit, I love Ben.
[00:34:48] He's brilliant.
[00:34:50] But my habit now, whenever I'm going to read one of them, is to put it in NotebookLM, and it summarized it for me.
[00:34:55] This article from Stratechery by Ben Thompson.
[00:34:58] It's comprehensive.
[00:34:59] Yeah.
[00:34:59] God.
[00:34:59] He makes Mike Masnick look like he just tweets all day, you know?
[00:35:03] And Mike writes really detailed posts, nothing compared to Ben.
[00:35:09] So, he argues that Meta is uniquely positioned to capitalize on the potential of generative AI, blah, blah, blah, blah.
[00:35:14] Right?
[00:35:14] So, you summarized it better than NotebookLM did.
[00:35:21] You still have a job, Jason.
[00:35:24] I don't know.
[00:35:26] No.
[00:35:26] The problem is that I've defended Facebook in the past.
[00:35:30] I don't like it very much now for the reason I just said.
[00:35:33] I can't really share what I want to share.
[00:35:35] It irritates me.
[00:35:37] I don't really use Instagram.
[00:35:38] I'm weird that way.
[00:35:40] I'm an old fart.
[00:35:40] I don't either.
[00:35:41] Not very much.
[00:35:42] Yeah, I hear you.
[00:35:43] No.
[00:35:44] I'm just not as image-based.
[00:35:45] I'm more text-based.
[00:35:48] And, you know, so using it to create content, well, no.
[00:35:53] I don't think we need the machine to do that.
[00:35:56] Now, using it for advertising, yes, that makes sense, right?
[00:36:00] But they're already using AI for advertising.
[00:36:02] They've been using AI for advertising.
[00:36:03] So does Google.
[00:36:04] They use it to try to figure out the best-
[00:36:07] Patterns.
[00:36:08] It's patterns.
[00:36:08] It's prediction, right?
[00:36:09] So, that's not generative AI.
[00:36:12] That's predictive AI.
[00:36:16] So, the one thing that does strike me is I think that there's something probably around
[00:36:20] curation, that if there's a lot of content, it's not so much going to make it, but find
[00:36:24] it and recommend it based on relevance to you.
[00:36:29] I think that's fine.
[00:36:31] Is there a big pot of gold for Meta there?
[00:36:34] I don't know.
[00:36:37] So, I think I agree with Ben, but probably for different reasons.
[00:36:42] I think Yann LeCun is really smart.
[00:36:44] I think Llama is an important leverage point for Meta in the whole AI world.
[00:36:51] And I think they're going to surprise us in ways.
[00:36:54] So, yeah, I just don't know that it's going to come from the ways, the places that Ben thinks.
[00:36:58] What do you think?
[00:36:59] Yeah.
[00:37:01] Well, yeah, I don't know exactly how I think about Meta being the one and only to benefit.
[00:37:07] I mean, I can understand where Ben is coming from, but I think it's just, it's so early.
[00:37:13] It's so hard to tell that, like, I feel like these business models are just shifting every
[00:37:19] week.
[00:37:20] You know, it's such a race right now.
[00:37:22] I don't have an answer on whether I truly believe in what he's saying.
[00:37:27] I did find it kind of surprising because he's been pretty critical in the past, you know,
[00:37:32] and I guess that just kind of shows his ability to kind of ride with the wave of the times and
[00:37:40] be able to say, you know, I've said those things about meta in the past and right now
[00:37:44] things are looking pretty good.
[00:37:45] So, I don't know how I feel, but I do know that you and I have talked a lot about Meta's
[00:37:50] open kind of approach, whether you want to call it completely open source or open-ish or
[00:37:55] whatever we were calling it.
[00:37:57] And I do think that that is going to win them a lot of favor and a lot of success.
[00:38:02] And we'll have an example after the break.
[00:38:05] That's right.
[00:38:06] Love the setup.
[00:38:07] Excellent setup.
[00:38:08] We'll be back in a moment with an example, as Jeff says.
[00:38:14] See, I didn't even have to do that anymore.
[00:38:16] This is getting so easy for me now, Jeff.
[00:38:19] You can just do the transitions for me.
[00:38:21] Excellent.
[00:38:22] Let's see here.
[00:38:23] You planned it this way, boss.
[00:38:26] It just makes sense.
[00:38:28] So, Meta now has a new product that you can... I'm vamping because I'm trying to get
[00:38:35] it up on my screen so I can play you an example.
[00:38:38] But Meta has its own take on NotebookLM called NotebookLlama.
[00:38:44] It's a copy.
[00:38:46] It's a generic version of NotebookLM using Llama.
[00:38:53] Right.
[00:38:55] Exactly.
[00:38:55] Yes.
[00:38:56] It's free.
[00:38:56] It's customizable: you can upload your own PDFs.
[00:39:02] I think right now you can upload PDFs.
[00:39:05] So, it can't do some of the more advanced things that you can do with Notebook LM like adding
[00:39:11] web links, adding audio, YouTube links, that sort of stuff.
[00:39:15] So, for now, it's early days.
[00:39:18] And my understanding is it was thrown together pretty quickly, at least according to the Tom's
[00:39:24] Guide article.
[00:39:24] They said they launched it pretty quickly.
[00:39:28] And I think that when you listen to the example, it's pretty obvious that it's not quite playing
[00:39:34] in the same league as NotebookLM and what we heard in the podcast generation of NotebookLM.
[00:39:40] But it's interesting.
[00:39:41] This week's episode of Lestai, Insights, where we explore the latest developments in
[00:39:46] the field of artificial intelligence.
[00:39:47] The voice isn't as good.
[00:39:48] That's for sure.
[00:39:49] Today, we're going to dive into the fascinating world of knowledge.
[00:39:52] Well, check out the secondary voice.
[00:39:54] So, let's get started.
[00:39:55] Joining me on this journey is my co-host who's through to the topic and I'll be guiding
[00:39:59] them through the ins and outs of new to the topic.
[00:40:02] I'll be guiding them through the ins and outs of knowledge.
[00:40:05] Just let's start it.
[00:40:07] Sounds exciting.
[00:40:08] I've heard of knowledge distillation, but I'm not entirely sure what it's all about.
[00:40:14] Can you give me a brief overview?
[00:40:17] Okay.
[00:40:18] Of course, knowledge distillation is a technique that enables the time source of knowledge.
[00:40:23] It's fun to make fun of systems that don't work.
[00:40:25] I can see their...
[00:40:26] This isn't like sitting next to a...
[00:40:29] You're going to a bar and you're next to a really bad date.
[00:40:33] Yes.
[00:40:33] It's kind of uncomfortable.
[00:40:35] Yeah.
[00:40:35] You want to jump off the top.
[00:40:37] Give it up.
[00:40:37] It's not working ever.
[00:40:38] It's just...
[00:40:38] This isn't working.
[00:40:40] This isn't working.
[00:40:40] None of that is to say that this is horrible, you know, as a first entry into this space.
[00:40:47] It is an interesting first step, right?
[00:40:50] It's free.
[00:40:51] It's open source.
[00:40:51] Similar to what Meta does with their AI models.
[00:40:55] And so that's kind of the big benefit.
[00:40:58] It's also early.
[00:40:58] It can only do the PDF input right now.
[00:41:00] I'm sure over time it's going to, you know, broaden out and allow you to take some of those other sources and hopefully, hopefully improve some of the voice quality because that's a big downside.
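[Editor's note] For listeners who want to peek under the hood, the project is on GitHub, and the pipeline runs in stages: extract text from a PDF, have a Llama model write a two-host script, then synthesize each line with TTS. Here's a minimal sketch of that shape; the function names and placeholder bodies are illustrative, not Meta's actual code, which chains real Llama and TTS models at each stage.

```python
# A simplified, illustrative sketch of a NotebookLlama-style
# PDF-to-podcast pipeline. Each stage is a placeholder so the shape
# of the pipeline is clear; the real repo calls Llama models and a
# TTS model where these stubs return strings.

def extract_text(pdf_path: str) -> str:
    # Stage 1: pull raw text out of the PDF (the real project also
    # uses a small Llama model to clean up extraction artifacts).
    return f"raw text extracted from {pdf_path}"

def write_script(text: str) -> list[tuple[str, str]]:
    # Stage 2: turn the cleaned text into a two-speaker dialogue
    # (the real project prompts a larger Llama model for this).
    return [
        ("Host", f"Welcome! Today we're covering: {text[:40]}..."),
        ("Guest", "Sounds exciting, tell me more."),
    ]

def synthesize(script: list[tuple[str, str]]) -> list[str]:
    # Stage 3: hand each line to a TTS model, one voice per speaker.
    # Here we just tag the lines instead of generating audio.
    return [f"[{speaker} voice] {line}" for speaker, line in script]

def pdf_to_podcast(pdf_path: str) -> list[str]:
    # Chain the three stages end to end.
    return synthesize(write_script(extract_text(pdf_path)))
```

The point of the structure is that each stage is swappable, which is exactly the "slap it together with open models" argument discussed below.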
[00:41:11] Yeah, I think it's more than being about having NotebookLlama, which is the name they're using to be smart-assed.
[00:41:17] Right.
[00:41:17] I think it is a way for Meta to say, see, not just a big deal.
[00:41:22] You're all wowed by NotebookLM.
[00:41:24] Well, with our open source model, look, we can slap it together.
[00:41:28] Is it perfect?
[00:41:29] No.
[00:41:30] Yeah.
[00:41:30] But you could do this too, because open source... open-ish, not quite open source, but our open-ish, trademark Jason Howell, model.
[00:41:40] Llama lets you slap this stuff together, and look how powerful you are and look how powerful Google isn't.
[00:41:46] Yeah.
[00:41:46] Right?
[00:41:47] That's a good take.
[00:41:48] I think that's kind of what they're doing here... I used the phrase already in the show, but it's a shot across the bow.
[00:41:54] This time not of open AI but of Google.
[00:41:57] Well, but I guess OpenAI too, because it says whatever they do, you can do too, because you can use the open-ish Llama.
[00:42:05] There you go.
[00:42:06] So it's on GitHub.
[00:42:08] I presume you can... those of you who know what you're doing can modify it and make it your own.
[00:42:14] And that makes it a powerful demonstration of the power of open-ish.
[00:42:19] Yeah.
[00:42:20] Yes.
[00:42:21] Very interesting stuff.
[00:42:22] Just look for NotebookLlama and, yeah, we'll see how that develops.
[00:42:27] I hadn't thought about the kind of how it... the shot across the bow theory.
[00:42:35] I think that's...
[00:42:36] It's probably pretty spot on.
[00:42:38] It's like...
[00:42:38] Yeah, yeah, yeah, yeah.
[00:42:40] I picked...
[00:42:40] Anything you can do, we can do worse.
[00:42:42] I put a booger out of my nose and it was that easy.
[00:42:44] This is what it came up with.
[00:42:46] That's how easy it is.
[00:42:46] I thought I was a little more gracious than that, Jason.
[00:42:48] Jason.
[00:42:49] Jeez.
[00:42:49] Jeez.
[00:42:50] Oh, sorry.
[00:42:52] Sorry to put it in such crude terms, but there you go.
[00:42:57] Apple Intelligence, for those of you who have iPhones or Macs running either M1 or A17 Pro chips, so definitely not for everyone right now.
[00:43:08] But if you've been waiting for Apple to finally unleash its Apple Intelligence into the world, wait no longer because if you fall into those categories, you can start getting some experience with Apple Intelligence.
[00:43:23] I guess this was just a couple of days ago it started rolling out.
[00:43:27] And it sounds like it's very trim right out of the gate.
[00:43:30] It's not all the features that Apple was touting.
[00:43:34] You get some writing tools.
[00:43:36] You get some improved Siri, some photo cleaning tools, pretty standard stuff.
[00:43:42] December is when some of the more advanced features come like ChatGPT integration, Image Playground, taking a sketch and turning it into an image.
[00:43:50] I think that might be part of Image Playground.
[00:43:52] Genmoji, Visual Search, that sort of stuff.
[00:43:55] So I saw a... I won't say what organization, but a large organization's AI policy recently.
[00:44:01] And they said, you know, without permission, you may not use any device that has AI inside.
[00:44:08] Well, that means you can't use your phone.
[00:44:11] Yeah.
[00:44:11] You know, it's ridiculous.
[00:44:13] Where was this again?
[00:44:13] You can't take it on this.
[00:44:14] You can't use any device that has AI installed inside.
[00:44:19] And where did you see this?
[00:44:21] A certain institution.
[00:44:22] Oh, oh.
[00:44:23] Institution.
[00:44:24] Unnamed institution.
[00:44:25] Got it.
[00:44:25] Okay.
[00:44:26] Large bureaucratic institution.
[00:44:28] Got it.
[00:44:29] And it's amusing because it's just like, well, AI is a bad thing.
[00:44:32] Well, you use spell check.
[00:44:36] Right.
[00:44:37] You use transcription.
[00:44:38] Where do you want the definition?
[00:44:40] Yeah.
[00:44:40] Where do you draw the line?
[00:44:40] Did you use Waze to get to the office today that you're required to come to because your
[00:44:43] bosses are all jerks and make you come back to it?
[00:44:46] Yeah.
[00:44:46] Right?
[00:44:47] It's all... AI is all over.
[00:44:49] It's layered in everything, as we say here.
[00:44:52] And yeah, so I was amused by that.
[00:44:54] So now your phone, both your Android and your Apple phone are going to have AI built in.
[00:44:59] They're going to have AI offs.
[00:45:01] Yeah.
[00:45:02] AI battles.
[00:45:03] Yep.
[00:45:04] Dance battles.
[00:45:06] Definitely feels, you know, a little like a soft launch of the stuff that most more people
[00:45:12] are probably looking forward to.
[00:45:13] But nonetheless, it's been a long time coming and Apple's finally releasing it.
[00:45:20] And by the way, if you're in the EU, you're not going to get any of this until April 2025
[00:45:25] is what I understand.
[00:45:28] So you're going to wait even longer.
[00:45:30] We will get it here in the US before you get any of that stuff.
[00:45:33] Sorry, folks.
[00:45:34] Just the way it goes.
[00:45:36] You put in a link to an article by Om Malik about Humane.
[00:45:41] And Humane, of course, the creators of the infamous AI pin that will forever have the
[00:45:51] word flop attached to it.
[00:45:53] I mean, for better or for worse, it's just the way it goes after such a disastrous launch
[00:45:58] earlier this year in April.
[00:45:59] And according to Om's article, they're shifting the company strategy away from solely hardware
[00:46:07] to focus on licensing its AI OS, CosmOS, whatever you want to call it.
[00:46:16] Yeah.
[00:46:16] Good luck with that.
[00:46:20] So is this a symbol of like you can't recover?
[00:46:24] Yeah.
[00:46:25] Yeah.
[00:46:25] Yeah.
[00:46:25] Well, it's like in the early days of entrepreneurial journalism when I had a lot of students working
[00:46:32] on companies and I dealt with a lot of startups, inevitably they would start wanting to be B2C
[00:46:37] because that's where all the money is.
[00:46:38] And then they realized that's really hard and really expensive to get users and to get an
[00:46:42] audience and to get the revenue.
[00:46:44] So they would inevitably switch.
[00:46:45] Well, no, no.
[00:46:46] We're a B2B company.
[00:46:48] I was part of that with one company.
[00:46:49] I was a founder of a company called Daylife.
[00:46:53] Same exact.
[00:46:53] We were going to be the new Google News.
[00:46:55] No, we're going to help you be the new Google News.
[00:46:58] Right?
[00:46:59] And so this is similar to me is we're going to be this cool hardware thing.
[00:47:03] Yeah, we got some software.
[00:47:06] Well, no.
[00:47:07] The hardware was the whole point.
[00:47:08] The hardware doesn't work.
[00:47:09] So they're trying to rescue it.
[00:47:10] God bless them.
[00:47:11] But it ain't going to work.
[00:47:13] Yeah.
[00:47:13] The Humane founders, Bethany Bongiorno and Imran Chaudhri, said that this licensing move
[00:47:21] was always a part of their plan.
[00:47:22] Yeah, yeah, yeah.
[00:47:23] Yeah, don't call it a pivot.
[00:47:26] No.
[00:47:28] And I don't think you're going to call it a comeback either.
[00:47:31] We don't mean to be cruel about it.
[00:47:33] It's just that you hyped it up as this was going to be this amazing thing.
[00:47:37] It's hard to recover from something.
[00:47:38] That was a spectacular failure out of the gate.
[00:47:43] It wasn't just a normal like, oh, yeah, it's kind of a bad product.
[00:47:46] There was so much ado about that failure, probably because of Marques Brownlee's "worst product I've ever reviewed" thing.
[00:47:58] That certainly didn't help.
[00:47:59] But if it was true, then what can you do?
[00:48:02] Yeah, and you and I are both very grateful we didn't order it.
[00:48:06] Yeah.
[00:48:06] Yeah, although from a historical perspective, maybe.
[00:48:10] But no.
[00:48:11] But what's funny is now when I see... not that it ever looked like an elegant device to begin with,
[00:48:16] but when I see it in product photos now, I feel like maybe my view of it has shifted a little bit further into the...
[00:48:23] oh, God, that thing is ugly.
[00:48:25] But I don't know.
[00:48:27] Every time, every single time, Jason, I see a Cybertruck go by, I laugh.
[00:48:33] Because it's laughable.
[00:48:34] I think it's the same.
[00:48:35] Yeah.
[00:48:36] Yeah, I know that.
[00:48:37] So I feel bad for them and always being actually very nice to them and listening to them.
[00:48:42] But yeah, no.
[00:48:43] Yeah, so they're saying that they've demonstrated the system in other places like car dashboards.
[00:48:49] You know, it's an agent-driven OS, not application-driven, built specifically for AI-first interactions.
[00:48:58] Just kind of feels like selling for parts sort of thing.
[00:49:03] Yeah.
[00:49:04] Yeah.
[00:49:05] So anyways, we'll see.
[00:49:06] Oh, and by the way, there is other news in the humane world.
[00:49:11] Days ago, the company cut the cost of the AI Pin from $699 to $499.
[00:49:19] So they cut a couple hundred dollars off if you still want to get one.
[00:49:22] I don't know.
[00:49:22] So I think-
[00:49:23] $200 less.
[00:49:24] Would you buy it for $499?
[00:49:26] I don't know.
[00:49:27] Honestly, when I first read this, when I first read that part of the article, or the different
[00:49:32] article actually, I think.
[00:49:33] But when I first read it, I misread it and thought that they had cut the price to $200.
[00:49:38] And even there, I was like, oh, no.
[00:49:42] And then I was like, oh, I misread that.
[00:49:44] They cut off $200 from the price.
[00:49:46] It's $500.
[00:49:48] They go, yeah, hell no.
[00:49:49] No.
[00:49:50] That's not happening.
[00:49:52] No.
[00:49:53] So there you go.
[00:49:55] And finally, the AI music world and the music industry, of course, fighting against the
[00:50:03] rise of models like Suno and Udio, which we've talked about.
[00:50:07] Now, while also at the same time looking for a way to be in the generative AI space themselves,
[00:50:13] I think ultimately, as we've talked about before, they're looking for ways that they
[00:50:19] can capitalize off of them so that they can control their IP, control their copyright,
[00:50:23] and still be in on the joke or in on the trend and not have other people making the money
[00:50:33] that they could possibly make off of it.
[00:50:37] And just essentially, ultimately, trying to capitalize on the technology.
[00:50:41] But in the case of this, doing so with respect to IP and copyright.
[00:50:45] And to that end, Universal Music Group is partnering with KLAY Vision, which is an AI music startup
[00:50:50] that I had not heard of before this article.
[00:50:53] But they are entering a, quote, partnership, sorry, pioneering commercial ethical foundational model
[00:51:02] for AI-generated music that works in collaboration with the music industry and its creators.
[00:51:08] So they're really... KLAY specifically is a company that bills itself as an ethical AI music company,
[00:51:15] developing something called KLayMM, or a Large Music Model.
[00:51:21] That was a mouthful.
[00:51:23] So anyways, I don't know what's going to come out of this partnership.
[00:51:26] They're probably going to β
[00:51:28] There's similar stuff happening in the text world, which we talked about.
[00:51:31] ProRata.ai, which is from a friend of mine, Bill Gross, and I was part of the press release,
[00:51:35] Human Native AI, and Tollbit are all trying to do variations of: this is the okay place for you AI people to go get your content,
[00:51:43] because we'll reward the original content creators, and this is good.
[00:51:48] Right.
[00:51:48] So now to do the same thing in music, I think the same thing will happen with images and Getty and such.
[00:51:53] There'll be efforts to do this.
[00:51:55] Though I was at a virtual event today with the CNTI, the Center for News, Technology, and Innovation,
[00:52:02] kind of a think-tanky thing; I'm among the Board of Advisors, and I guess it's Chatham House Rules,
[00:52:09] so I can't say who said it, but somebody from an AI company emphasized again that training is fair use and transformative,
[00:52:17] and we don't have any reason to license content for training.
[00:52:21] We license it only for display, and if we want to try to give you the full display of something,
[00:52:26] then we should license it, but training is not.
[00:52:29] So that's the line; it's going to take the courts to come in here, whether it's text or music or image.
[00:52:37] Oh, it'd be so interesting to see where that comes down.
[00:52:39] Yeah.
[00:52:40] Yeah, to see where that comes down.
[00:52:42] Because I think I agree, and I think that a lot of the people out there that are concerned about AI are blind to that
[00:52:58] or don't quite see it that way.
[00:53:01] Oh, no.
[00:53:02] Whether it is or it isn't.
[00:53:04] And I think in my head, I guess where I'm headed is even if the courts decide that, that isn't going to suddenly satisfy and be like, oh, no.
[00:53:13] Oh, no.
[00:53:14] This fight's going to go on.
[00:53:15] Yeah.
[00:53:16] Yeah.
[00:53:16] Yeah.
[00:53:18] Insistence.
[00:53:18] All right.
[00:53:19] Can I do a lightning round?
[00:53:21] Yeah, sure.
[00:53:22] Let's do lightning.
[00:53:22] We'll do a real quick here.
[00:53:23] All right.
[00:53:24] I asked for four stories.
[00:53:26] One, a hospital was using an OpenAI model to do transcription.
[00:53:31] Not LLM stuff, just transcription.
[00:53:34] And the ChatGPT model, when it came to a silence, it just made up stuff.
[00:53:39] Oh, boy.
[00:53:40] And it added, and this is a hospital.
[00:53:42] You don't want it making up stuff.
[00:53:44] No.
[00:53:45] No.
[00:53:45] And I mean, I've had, and I know this is lightning round, but just real quick on this topic.
[00:53:50] Like, I've had plenty of times where I run our show through it to transcribe, and I always check it, right?
[00:53:56] Check for spellings, whatever.
[00:53:57] And there are, like, literal blocks that come out of nowhere that are-
[00:54:01] No, really?
[00:54:02] Nonsense.
[00:54:02] Oh, really?
[00:54:03] It doesn't happen all the time, but it does happen sometimes.
[00:54:05] And if you don't look through it, you could be putting out total, like, garbage, total sloppy stuff.
[00:54:11] I mean, completely, like, random generated, like, paragraph that has nothing to do with anything.
[00:54:17] It just, weird things happen like that sometimes.
[00:54:19] So, yeah.
[00:54:21] So the hospital shouldn't be using it this way.
[00:54:22] Oh, wow.
[00:54:23] I didn't know that.
[00:54:23] What do you use to transcribe?
[00:54:26] What do I use to transcribe?
[00:54:28] It's a, hold on.
[00:54:29] I have to find the bookmark for it.
[00:54:31] It is Restream.io; they have a site.
[00:54:36] It's a free transcription tool.
[00:54:37] So if you upload an MP3, it'll do a free transcription, kick you a file in, like, a minute.
[00:54:42] And it's usually pretty good.
[00:54:44] But I've also used Revoltev, and that has done it.
[00:54:47] I've used this.
[00:54:48] That's done it a couple of times.
[00:54:49] Wow.
[00:54:50] Sometimes it just happens.
[00:54:51] And I just do it again, and it goes away.
[00:54:54] But, yeah.
[00:54:56] That's amazing.
[00:54:57] Yeah.
[00:54:57] Heads up.
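[Editor's note] The failure mode described here, the model inventing text during silences, is something you can screen for when your transcription tool exposes segment-level metadata. A hedged sketch: OpenAI's open-source Whisper model reports a `no_speech_prob` per segment, and text attached to audio the model itself rated as near-silence is a hallucination candidate. The sample data below is made up for illustration.

```python
# Sketch: flag transcript segments that may be hallucinated.
# Whisper's open-source model emits per-segment metadata including
# no_speech_prob (likelihood the underlying audio was silence);
# non-empty text over near-silent audio deserves a human look.

def flag_suspect_segments(segments, threshold=0.8):
    suspects = []
    for seg in segments:
        # High no_speech_prob plus non-empty text is the suspicious
        # combination: the model "heard" words in silence.
        if seg["no_speech_prob"] > threshold and seg["text"].strip():
            suspects.append(seg)
    return suspects

# Made-up example segments, not real Whisper output.
segments = [
    {"text": "Take two tablets daily.", "no_speech_prob": 0.02},
    {"text": "Thanks for watching!", "no_speech_prob": 0.95},  # likely invented
]

print(flag_suspect_segments(segments))
```

None of this makes an error-prone model safe for medical records, but it's the kind of check a careful workflow, like re-reading your own show transcripts, automates.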
[00:54:57] All right.
[00:54:57] So that's one.
[00:54:58] Lightning round.
[00:54:58] Keep going here.
[00:55:01] Yann LeCun, we talked about earlier, blasts Musk as the biggest threat to democracy.
[00:55:05] I just wanted to say that.
[00:55:06] That's all.
[00:55:06] We don't need to say anything more.
[00:55:07] I found that enjoyable to say.
[00:55:09] This one is interesting.
[00:55:10] Linus Torvalds slams AI as 90% marketing and 10% reality.
[00:55:18] Which, when Linus says this, you know, he's the basis of so much with Linux, I think that's somebody to listen to.
[00:55:26] And I think he's right.
[00:55:26] I think that we've talked about this in the show last week.
[00:55:28] When you're just hyping to raise money, that's the marketing.
[00:55:34] And you're overpromising, and it's not going to work.
[00:55:37] Yeah.
[00:55:37] I can continue by saying, I think AI is really interesting, and I think it is going to change the world.
[00:55:42] And at the same time, I hate the hype cycle so much that I really don't want to go there.
[00:55:46] Right.
[00:55:47] Which I only point out to say, like, it's not like he's anti-AI.
[00:55:52] No, no, no.
[00:55:53] But he's right.
[00:55:53] He's pulling down the hype level.
[00:55:55] Yeah, exactly.
[00:55:57] Stability, which we don't talk about very often.
[00:55:59] Their former CEO, Emad Mostaque, has given up his 75% equity stake because the company is in such bad shape, I guess, that it's pretty much worthless, and he's gone anyway.
[00:56:08] And finally, I thought this was interesting, Arcade is a new AI product creation platform.
[00:56:14] And the shtick here is that you can go in and design jewelry with the AI, and then I think that there's kind of a competition, and they'll make it and sell it.
[00:56:27] So it becomes interesting, how you get from that point of imagining something. Because before with products, right,
[00:56:36] if you wanted to start a new product five years ago, the world was entirely different, because you could use Chinese factories to make it.
[00:56:46] You could use Amazon to stock it and fulfill it.
[00:56:51] You could use Facebook to market it.
[00:56:57] Well, now you go one step back in the process, and you can put design in there, and you can use AI to design.
[00:57:04] You don't even have to be good at designing something and creating something.
[00:57:07] You can have it do it for you, and then you end up with a product that goes along that chain.
[00:57:13] That's cool.
[00:57:14] That's it.
[00:57:15] That's neat.
[00:57:16] I mean, working with something on the screen, not having the skills, the CAD skills or whatever,
[00:57:25] that create 3D imagery or whatever, to work with a tool like AI and then at the end of the day have a real tangible like, oh, and here it is.
[00:57:33] Here's the thing that I worked with the system to create.
[00:57:37] That's super empowering.
[00:57:38] Of course, I'm an idiot, and I put the wrong link in, so you couldn't show it.
[00:57:41] It's called Arcade.
[00:57:41] I found it.
[00:57:42] You did?
[00:57:43] Okay.
[00:57:43] Yeah.
[00:57:44] Yeah.
[00:57:45] Yeah, I was showing it off, and then I had to sneeze, so I had to cut away.
[00:57:48] Oh, is that it?
[00:57:51] But yeah, that's really neat.
[00:57:53] If you go to arcade.ai, I'm pointing to my screen right now like you can see it right there.
[00:57:58] Yes, that's right.
[00:58:00] Hey, agent, do this.
[00:58:02] So it seems that you can put things in there, and then I guess they get voted on, or if they think it's worthwhile to make it, they can make it.
[00:58:12] Okay.
[00:58:13] What does this do to not eBay?
[00:58:16] What's the other one?
[00:58:16] What's the crafty one?
[00:58:20] Oh, shoot.
[00:58:21] Why am I suddenly blanking?
[00:58:22] I am too.
[00:58:22] Yeah, it's not eBay.
[00:58:24] It's a...
[00:58:25] Etsy.
[00:58:25] Etsy.
[00:58:26] Etsy.
[00:58:26] Thank you.
[00:58:27] Yeah.
[00:58:28] So anyway, I thought it was a big deal.
[00:58:30] And then they have other materials coming up.
[00:58:32] If you go to the bottom, they'll next be doing things in platinum and acrylic and cufflinks.
[00:58:39] And cufflinks.
[00:58:40] Finally, cufflinks.
[00:58:41] Just what you need, Jason.
[00:58:42] I know.
[00:58:43] That's what I was waiting for.
[00:58:44] I keep seeing the sleeves of your T-shirt there flopping.
[00:58:48] Yeah, you need the cufflinks.
[00:58:50] Oh, these floppy T-shirt sleeves.
[00:58:52] They just need shiny one-of-a-kind cufflinks.
[00:58:55] I think I just invented something.
[00:58:56] I want to make the French cuff T-shirt.
[00:59:01] Love it.
[00:59:02] Yeah.
[00:59:02] Yes.
[00:59:03] Make that.
[00:59:03] Go to arcade.
[00:59:05] Exactly.
[00:59:05] They probably don't do textiles yet, but maybe someday they will, and you can make that.
[00:59:11] I love lightning round.
[00:59:13] Let's do that again sometime.
[00:59:14] If we have enough.
[00:59:15] If we have enough.
[00:59:16] Yeah, if we have enough and we have the time, which we totally did today.
[00:59:19] It all worked out.
[00:59:19] Yeah.
[00:59:20] Love it.
[00:59:21] Enjoy doing this show with you, Jeff.
[00:59:23] Jeffjarvis.com is where you should go.
[00:59:27] Not where you can go, but where you will go to check out everything that Jeff Jarvis is up to,
[00:59:33] including his most recent book, recently released, The Web We Weave,
[00:59:36] Why We Must Reclaim the Internet from Moguls, Misanthropes, and Moral Panic.
[00:59:42] And The Gutenberg Parenthesis and Magazine.
[00:59:44] Of course.
[00:59:45] And you?
[00:59:46] We've got a lot on offer there.
[00:59:49] Well, for me, just go to AIinside.show.
[00:59:53] That is our web page, our home on the web that has all of our episodes,
[00:59:59] has big pictures of us, AI cleaned up and everything.
[01:00:04] You know, he had to go the AI route.
[01:00:06] So I had to run it through AI and give it a little bit of that glisten.
[01:00:11] Yeah.
[01:00:11] Everything you need to know about this show can be found there.
[01:00:13] How to subscribe.
[01:00:15] All the different ways that you can subscribe.
[01:00:16] Suddenly I'm getting like, I must be getting allergic or something.
[01:00:19] I just sneezed and everything.
[01:00:21] Sorry about that.
[01:00:21] But AIinside.show.
[01:00:23] Go there.
[01:00:24] Subscribe to the podcast.
[01:00:25] And then, of course, you can go to our Patreon.
[01:00:29] Patreon.com slash AIinsideshow.
[01:00:32] There you can support us on any number of levels.
[01:00:36] And we do appreciate that you do that.
[01:00:39] If you're at the top level, which is called the executive producer level,
[01:00:42] where you get all of the other stuff, ad-free shows, Discord community, hangouts.
[01:00:49] You also get a t-shirt, by the way.
[01:00:51] So an AIinside shirt for just you.
[01:00:55] And you get called out at the end of the show, whether you like it or not.
[01:00:58] Actually, if you didn't want to, you can just let me know and I wouldn't do it.
[01:01:01] But I've heard no such thing from Dr. Du, Jeffrey Maricini, WPVM 103.7 in Asheville, North Carolina, Paul Lang, and Ryan Newell.
[01:01:10] Y'all are awesome.
[01:01:12] Thank you for your support each and every week.
[01:01:14] And thank you for watching each and every week and listening.
[01:01:18] We love having you here.
[01:01:19] Yes, Jeff.
[01:01:20] And next week, I will not be here because I'll be in Mainz, Germany, giving a talk at CONCON, the content convention.
[01:01:25] But you're going to be in good hands because?
[01:01:28] Because Mr. Mike Elgin will be joining me.
[01:01:30] So it'll be me and Mike talking about the news of the week, having a little bit of an election hangover.
[01:01:37] So we'll see how that goes.
[01:01:39] I may ask for asylum depending upon how things happen.
[01:01:42] Yeah.
[01:01:43] Yeah.
[01:01:43] Yeah.
[01:01:44] It could be kind of an interesting day.
[01:01:46] But anyways, I'm looking forward to hanging out with Mike wherever in the world he's going to be at that point.
[01:01:51] I'm not entirely sure, but he'll be joining virtually and it'll be a lot of fun.
[01:01:55] We'll miss you, though, Jeff.
[01:01:57] So next time we have Mike on, I'll have to make sure that Mike's on when you're on as well.
[01:02:02] Thank you, everybody, for watching and listening.
[01:02:04] Y'all are awesome.
[01:02:04] We'll see you next time on AI Inside.
[01:02:06] Bye.