The Line Between Not Human and Human
October 02, 2024 | 1:03:08

[00:00:00] [SPEAKER_02]: This is AI Inside, episode 37 recorded Wednesday October 2nd, 2024. The Line Between Not Human and Human.

[00:00:11] [SPEAKER_02]: This episode of AI Inside is made possible by our wonderful patrons at patreon.com slash AI Inside show.

[00:00:18] [SPEAKER_02]: If you like what you hear head on over and support us directly and thank you for making independent podcasting possible.

[00:00:29] [SPEAKER_02]: What's going on everybody? Welcome to another episode of AI Inside, the show where we take a look at the AI that is

[00:00:35] [SPEAKER_02]: layered throughout the world of technology. Everywhere you look, it's got the letters A and I in it.

[00:00:42] [SPEAKER_02]: I'm one of your hosts. Jason Howell joined us always by my co-host and friend Jeff Jarvis. What's going on, Jeff?

[00:00:49] [SPEAKER_00]: Hey buddy, how are you?

[00:00:50] [SPEAKER_00]: I was just talking to Jason. I live here in temperate New Jersey, but it's been getting very hot where Jason is.

[00:00:57] [SPEAKER_00]: And he was on TikTok last night, just obviously just sweating, just melting.

[00:01:03] [SPEAKER_02]: I had all the blind, everything shut in the house because we don't have air conditioning here.

[00:01:09] [SPEAKER_02]: And for, you know, I mean, it's warm in a lot of places, but in the Bay Area for it to get to like 100 and some odd degrees.

[00:01:17] [SPEAKER_02]: I think yesterday in Petaluma, it was something like 105, which for some areas they're like, oh yeah, that's just every Tuesday in the summer.

[00:01:24] [SPEAKER_02]: But here that doesn't happen. And so when it does, we all just melt. Like I don't have air conditioning.

[00:01:28] [SPEAKER_00]: And so I was fearing that he was going to say, I don't know, lights on the podcast?

[00:01:34] [SPEAKER_02]: No, I think we're okay. Because my plan, and we were talking about this before the show, my plan is to get out of dodge today

[00:01:41] [SPEAKER_02]: and through tomorrow I'm going to take the kids to go camping for the night.

[00:01:46] [SPEAKER_02]: So we're going to get out of, although that doesn't really save me from the heat.

[00:01:50] [SPEAKER_02]: I'm just going to a different area where there is heat, but still it gets me out of the non-air conditioned house.

[00:01:56] [SPEAKER_02]: Take ice.

[00:01:58] [SPEAKER_02]: Yeah, yeah, definitely take some ice and nice cold drinks. It's going to be a lot of fun. I'm looking forward to it.

[00:02:04] [SPEAKER_02]: Before we get started with all the AI news that we've selected for today's show,

[00:02:09] [SPEAKER_02]: just want to remind you that we do have a way that you can support us directly.

[00:02:12] [SPEAKER_02]: If you like what Jeff and I are doing with AI Inside, you can go to patreon.com

[00:02:16] [SPEAKER_02]: slash AI inside show there.

[00:02:18] [SPEAKER_02]: You can support us directly financially.

[00:02:20] [SPEAKER_02]: Number of reasons why you might want to do that, but I'll leave it up to you to go check it out.

[00:02:24] [SPEAKER_02]: You also get your name read at the top of the show.

[00:02:26] [SPEAKER_02]: Ken Hayes, thank you for your support, Ken.

[00:02:29] [SPEAKER_02]: Good to have you as part of the AI Inside crew.

[00:02:32] [SPEAKER_02]: I feel like every show has a name that they give their community.

[00:02:37] [SPEAKER_02]: I don't know that we've gotten there yet. Are you all the lasagnas?

[00:02:41] [SPEAKER_02]: No, that's weird.

[00:02:43] [SPEAKER_02]: Oh no, no.

[00:02:44] [SPEAKER_02]: Yeah, I don't know. We'll figure it out.

[00:02:47] [SPEAKER_02]: Our lasagna community.

[00:02:49] [SPEAKER_02]: I don't know what it is.

[00:02:50] [SPEAKER_02]: Maybe maybe y'all can let us know what you think.

[00:02:54] [SPEAKER_02]: Contact at AI inside dot show.

[00:02:56] [SPEAKER_02]: Send us your emails and your thoughts there.

[00:02:58] [SPEAKER_02]: Sure.

[00:02:58] [SPEAKER_02]: And if you're watching live while we record,

[00:03:01] [SPEAKER_02]: because we do get a lot of people watching live actually

[00:03:03] [SPEAKER_02]: while we do this recording,

[00:03:05] [SPEAKER_02]: you can subscribe to the podcast so you don't miss a future episode

[00:03:09] [SPEAKER_02]: when you aren't able to watch live.

[00:03:11] [SPEAKER_02]: That's AI inside dot show.

[00:03:13] [SPEAKER_02]: All the details that you need to subscribe are there.

[00:03:16] [SPEAKER_02]: Yeah, go check it out.

[00:03:18] [SPEAKER_02]: Subscribe.

[00:03:19] [SPEAKER_02]: You won't miss an episode that way.

[00:03:20] [SPEAKER_02]: All right, let's get into it.

[00:03:22] [SPEAKER_02]: A lot of news this week

[00:03:24] [SPEAKER_02]: and we'll start off with OpenAI, big surprise.

[00:03:28] [SPEAKER_02]: A lot of OpenAI in the news this week,

[00:03:31] [SPEAKER_02]: much related to some big name shakeups at the top,

[00:03:36] [SPEAKER_02]: which feels like a recurring theme.

[00:03:38] [SPEAKER_02]: Constant.

[00:03:39] [SPEAKER_02]: It is constant.

[00:03:41] [SPEAKER_00]: And I don't know yet whether people leaving

[00:03:43] [SPEAKER_00]: is a commentary on going for profit,

[00:03:47] [SPEAKER_00]: on Sam Altman, on opportunities outside,

[00:03:50] [SPEAKER_00]: on office politics.

[00:03:52] [SPEAKER_00]: I just really don't have a sense of what's happening in there.

[00:03:56] [SPEAKER_02]: It is really interesting, right?

[00:03:58] [SPEAKER_02]: And it's not like it's just a few little moments,

[00:04:03] [SPEAKER_02]: few little people.

[00:04:05] [SPEAKER_02]: We've got Andrej Karpathy in February,

[00:04:08] [SPEAKER_02]: Ilya Sutskever in May, Greg Brockman in August.

[00:04:12] [SPEAKER_02]: And now we've got Mira Murati, who's the former,

[00:04:16] [SPEAKER_02]: at this point, CTO of OpenAI.

[00:04:19] [SPEAKER_02]: She announced her decision to leave.

[00:04:21] [SPEAKER_02]: She's been with the company six and a half years.

[00:04:23] [SPEAKER_02]: These are long-term players with the company.

[00:04:26] [SPEAKER_02]: We have the ouster and then the reinstatement

[00:04:29] [SPEAKER_02]: of Altman last year, you know,

[00:04:31] [SPEAKER_02]: and the shakeup that that's caused.

[00:04:33] [SPEAKER_02]: I mean, no matter how you look at it,

[00:04:35] [SPEAKER_02]: like I don't know the cause of all this,

[00:04:37] [SPEAKER_02]: maybe it's opportunity,

[00:04:39] [SPEAKER_02]: but it's really hard to look at it and not wonder

[00:04:42] [SPEAKER_02]: if it's something deeper,

[00:04:43] [SPEAKER_02]: because these are big names that are choosing

[00:04:45] [SPEAKER_02]: to step away from the brand.

[00:04:47] [SPEAKER_00]: Everyone who was on the OpenAI cover of Wired

[00:04:50] [SPEAKER_00]: except Altman is now gone.

[00:04:52] [SPEAKER_00]: Yeah, I saw that.

[00:04:54] [SPEAKER_00]: So the information,

[00:04:56] [SPEAKER_00]: which does very good inside reporting on technology,

[00:04:59] [SPEAKER_00]: their headline is behind OpenAI staff churn,

[00:05:02] [SPEAKER_00]: Turf Wars Burnout Compensation Demands,

[00:05:05] [SPEAKER_00]: which sounds like a little bit from column A,

[00:05:07] [SPEAKER_00]: a little bit from column B.

[00:05:09] [SPEAKER_00]: But I do think it's different.

[00:05:10] [SPEAKER_00]: I think that some of them, Sutskever and those,

[00:05:14] [SPEAKER_00]: were in the doomer AI safety,

[00:05:17] [SPEAKER_00]: we're not doing enough world.

[00:05:19] [SPEAKER_00]: Some were probably in the,

[00:05:21] [SPEAKER_00]: we don't think this should be a for-profit world.

[00:05:24] [SPEAKER_00]: Some don't like Altman.

[00:05:27] [SPEAKER_00]: And for Murati, I saw a story somewhere

[00:05:30] [SPEAKER_00]: I didn't put in the rundown,

[00:05:31] [SPEAKER_00]: but they're beating a path to her door,

[00:05:33] [SPEAKER_00]: venture capitalists are.

[00:05:35] [SPEAKER_00]: You want me to help you start a company?

[00:05:37] [SPEAKER_00]: Let's start a company.

[00:05:39] [SPEAKER_00]: Because this, you know, CTO of OpenAI leaving is a big deal.

[00:05:43] [SPEAKER_00]: She of course was the temporary for about two hours CEO.

[00:05:46] [SPEAKER_00]: Obviously highly respected, obviously brilliant.

[00:05:51] [SPEAKER_00]: And so is this, but it's,

[00:05:53] [SPEAKER_00]: but you had the PayPal mafia after the company had already succeeded.

[00:06:00] [SPEAKER_00]: Here the mafia just kind of comes in, works day by day,

[00:06:02] [SPEAKER_00]: and then leaves and starts new stuff.

[00:06:04] [SPEAKER_00]: So we'll see what happens.

[00:06:05] [SPEAKER_02]: Yeah, yeah, interesting.

[00:06:07] [SPEAKER_02]: It's also timed along with OpenAI seeking to restructure

[00:06:11] [SPEAKER_02]: its core business into a for-profit company

[00:06:16] [SPEAKER_02]: with Sam Altman getting a minority equity position

[00:06:20] [SPEAKER_02]: but an equity position.

[00:06:21] [SPEAKER_00]: Well now he has none.

[00:06:23] [SPEAKER_00]: Right, exactly.

[00:06:24] [SPEAKER_00]: And he brags about that,

[00:06:26] [SPEAKER_00]: that he'd be, you know,

[00:06:26] [SPEAKER_00]: his salary is basically two Cokes and an iced tea

[00:06:29] [SPEAKER_00]: and he doesn't have equity in the company

[00:06:31] [SPEAKER_00]: and he's doing this for the goodness of humankind

[00:06:34] [SPEAKER_00]: and other deals that come his way.

[00:06:36] [SPEAKER_00]: So the reports are that he'd be up for 7% equity,

[00:06:40] [SPEAKER_00]: which, you know, sounds fair actually.

[00:06:46] [SPEAKER_00]: What I'm curious about, for a company this huge,

[00:06:49] [SPEAKER_00]: what I'm curious about is how they vest that.

[00:06:52] [SPEAKER_00]: Do they hold on to him?

[00:06:54] [SPEAKER_00]: Does it become vested immediately?

[00:06:56] [SPEAKER_00]: How liquid is that, and when?

[00:06:57] [SPEAKER_00]: But of course if he sold any equity in the company

[00:07:00] [SPEAKER_00]: at any point, it would make huge news,

[00:07:02] [SPEAKER_00]: at least until he's 30 years older

[00:07:04] [SPEAKER_00]: and doing it for his kids' college or whatever.

[00:07:08] [SPEAKER_00]: That's a joke.

[00:07:12] [SPEAKER_00]: He could buy Stanford,

[00:07:13] [SPEAKER_00]: he doesn't need to borrow a penny for tuition,

[00:07:16] [SPEAKER_00]: but anyway, if he sold any stock in any case

[00:07:20] [SPEAKER_00]: it would be big news and be bad for the company.

[00:07:22] [SPEAKER_00]: So you're kind of stuck with it

[00:07:23] [SPEAKER_02]: if you're at that level anyway.

[00:07:25] [SPEAKER_02]: Yeah, yeah, that's true.

[00:07:28] [SPEAKER_02]: Yeah, just a random thought,

[00:07:32] [SPEAKER_02]: is the kind of, what is the,

[00:07:36] [SPEAKER_02]: how can I not even speak English?

[00:07:38] [SPEAKER_02]: Suddenly I'm having a really hard time speaking.

[00:07:40] [SPEAKER_02]: I'm looking at Mark Zuckerberg as the model

[00:07:43] [SPEAKER_02]: for Sam Altman to a certain degree

[00:07:46] [SPEAKER_02]: of where Sam Altman is at right now.

[00:07:48] [SPEAKER_02]: He's in the go work, work, work, work.

[00:07:50] [SPEAKER_02]: And I think Zuckerberg's probably still there,

[00:07:52] [SPEAKER_02]: but you know some of the Google founders,

[00:07:53] [SPEAKER_02]: they kind of got to a point

[00:07:54] [SPEAKER_02]: to where they took, they were able to kind of step away

[00:07:57] [SPEAKER_02]: and take their foot off the gas.

[00:07:58] [SPEAKER_02]: Of course now even they are kind of coming back in

[00:08:01] [SPEAKER_02]: because of the AI moment that we're in.

[00:08:03] [SPEAKER_02]: Sergey, especially.

[00:08:05] [SPEAKER_02]: Sergey primarily, but yeah,

[00:08:07] [SPEAKER_02]: Altman is definitely nowhere near

[00:08:09] [SPEAKER_02]: the take-the-foot-off-the-gas stage.

[00:08:10] [SPEAKER_02]: He is only, I guess the reason that I say that

[00:08:12] [SPEAKER_02]: is because in a very short amount of time

[00:08:14] [SPEAKER_02]: Sam Altman has become kind of the technology

[00:08:19] [SPEAKER_02]: superstar name that has become basically

[00:08:23] [SPEAKER_02]: a household name much in the same way

[00:08:25] [SPEAKER_02]: that Mark Zuckerberg did when Facebook was founded,

[00:08:28] [SPEAKER_02]: that Sergey and Larry were when Google was founded.

[00:08:31] [SPEAKER_02]: And we're nowhere near the end of that star rising.

[00:08:34] [SPEAKER_02]: And the temptation I have to imagine

[00:08:38] [SPEAKER_02]: the temptation being in Sam Altman's position

[00:08:41] [SPEAKER_02]: is great around kind of just recognizing

[00:08:46] [SPEAKER_02]: where they are with open AI

[00:08:48] [SPEAKER_02]: and the potential of where that's going.

[00:08:50] [SPEAKER_02]: And I mean, yeah, it's kind of hard to fathom exactly

[00:08:54] [SPEAKER_02]: what the road ahead looks like for him.

[00:08:56] [SPEAKER_00]: As you're talking, as you're raising that comparison

[00:08:59] [SPEAKER_00]: that makes me think of something else too,

[00:09:00] [SPEAKER_00]: which is the stock structure of these companies.

[00:09:04] [SPEAKER_00]: Larry and Sergey have considerable control

[00:09:08] [SPEAKER_00]: of a class of shares, voting shares

[00:09:11] [SPEAKER_00]: and thus have still considerable control in the company.

[00:09:16] [SPEAKER_00]: Zuckerberg famously in the very early days said,

[00:09:18] [SPEAKER_00]: oh, stock, I don't know.

[00:09:20] [SPEAKER_00]: Well, you want some stock?

[00:09:21] [SPEAKER_00]: You paint the wall, here's some stock, right?

[00:09:23] [SPEAKER_00]: And, you know, this was somewhat portrayed

[00:09:26] [SPEAKER_00]: in the movie that often lied about him,

[00:09:28] [SPEAKER_00]: The Social Network. He had to come up

[00:09:30] [SPEAKER_00]: with a two-tier structure there too

[00:09:32] [SPEAKER_00]: where he would maintain considerable control

[00:09:34] [SPEAKER_00]: and voting shares.

[00:09:37] [SPEAKER_00]: For OpenAI, there isn't that opportunity of,

[00:09:40] [SPEAKER_00]: oh, we started small and I had the founder's equity

[00:09:42] [SPEAKER_00]: and I'll structure it as we go.

[00:09:44] [SPEAKER_00]: Altman had no equity.

[00:09:46] [SPEAKER_00]: So there's not that opportunity

[00:09:47] [SPEAKER_00]: to give him that kind of vise-like control

[00:09:53] [SPEAKER_00]: on governance of the company, right?

[00:09:56] [SPEAKER_00]: And then we obviously saw the prior governance crisis

[00:09:58] [SPEAKER_00]: when they fired him and got him back.

[00:10:00] [SPEAKER_00]: And it'll be really interesting to see

[00:10:03] [SPEAKER_00]: how they manipulate this

[00:10:05] [SPEAKER_00]: and whether he manages to get some vehicles for control

[00:10:09] [SPEAKER_00]: that even go beyond the 7% actual equity

[00:10:13] [SPEAKER_00]: to greater voting shares or what that is.

[00:10:17] [SPEAKER_00]: I don't know, it's also gonna be very, very hard.

[00:10:19] [SPEAKER_00]: There are stories about this this week.

[00:10:20] [SPEAKER_00]: It's not easy to cross the bloodstream

[00:10:24] [SPEAKER_00]: from for profit to not-for-profit either way.

[00:10:26] [SPEAKER_00]: You'd have to be extremely careful

[00:10:28] [SPEAKER_00]: because even if you're going this way,

[00:10:31] [SPEAKER_00]: the things that you did in the past

[00:10:32] [SPEAKER_00]: have to be seen as truly not-for-profit

[00:10:34] [SPEAKER_00]: for tax purposes.

[00:10:36] [SPEAKER_00]: And now to go to for profit,

[00:10:39] [SPEAKER_00]: newspapers are going the other way

[00:10:41] [SPEAKER_00]: where they're saying, oh, we give up,

[00:10:42] [SPEAKER_00]: we can't make any money,

[00:10:43] [SPEAKER_00]: let's just go not-for-profit.

[00:10:44] [SPEAKER_00]: And that's not easy, but they do it

[00:10:46] [SPEAKER_00]: and it brings kind of complications.

[00:10:49] [SPEAKER_00]: Same going this way: you can't manipulate the tax laws

[00:10:54] [SPEAKER_00]: that benefit not-for-profits for the sake of a for-profit.

[00:10:57] [SPEAKER_02]: Well, they have a limited amount of time to do that too, right?

[00:11:00] [SPEAKER_02]: They have to complete the transition in two years.

[00:11:03] [SPEAKER_02]: Yeah, that's interesting.

[00:11:04] [SPEAKER_02]: I won't begin to pretend like I understand

[00:11:07] [SPEAKER_02]: how that process works,

[00:11:09] [SPEAKER_02]: but as you're talking through it

[00:11:10] [SPEAKER_02]: and based on what I read,

[00:11:11] [SPEAKER_02]: yeah, it does sound like a real fine

[00:11:14] [SPEAKER_02]: kind of tightrope kind of situation.

[00:11:17] [SPEAKER_02]: And especially when you're talking about a company,

[00:11:19] [SPEAKER_02]: this isn't just a little company deciding,

[00:11:21] [SPEAKER_02]: oh, we're going to go not-for-profit for profit.

[00:11:23] [SPEAKER_02]: This is arguably one of the most valuable companies

[00:11:25] [SPEAKER_02]: of our day right now that is deciding

[00:11:29] [SPEAKER_02]: to do this incredibly difficult transition.

[00:11:32] [SPEAKER_00]: And they have, you know, Microsoft

[00:11:33] [SPEAKER_00]: is a very complicated player here

[00:11:35] [SPEAKER_00]: because they gave a lot of it in kind

[00:11:39] [SPEAKER_00]: in terms of compute and such,

[00:11:40] [SPEAKER_00]: but they gave a lot of value to open AI

[00:11:42] [SPEAKER_00]: and value in terms of using open AI

[00:11:45] [SPEAKER_00]: and promoting it and the brand and so on.

[00:11:47] [SPEAKER_00]: So where they stand in negotiations

[00:11:48] [SPEAKER_00]: for the cap table and structure going forward,

[00:11:54] [SPEAKER_00]: what should we call it?

[00:11:55] [SPEAKER_00]: The big Japanese VC SoftBank is now supposedly victorious

[00:11:59] [SPEAKER_00]: and they've been trying to get in.

[00:12:01] [SPEAKER_00]: They're getting $500 million in.

[00:12:04] [SPEAKER_00]: And so they're going to try to worry

[00:12:05] [SPEAKER_00]: about where they stand versus prior money.

[00:12:07] [SPEAKER_00]: And the prior money was not really kind of put

[00:12:09] [SPEAKER_00]: into a for-profit company.

[00:12:10] [SPEAKER_00]: It's going to be a really complex negotiation.

[00:12:14] [SPEAKER_00]: Lot of lawyers or liars

[00:12:16] [SPEAKER_00]: are going to make a lot of money on this.

[00:12:19] [SPEAKER_00]: It's going to be fascinating to watch.

[00:12:21] [SPEAKER_00]: And I really want to say,

[00:12:22] [SPEAKER_00]: and I'm not terribly specific,

[00:12:23] [SPEAKER_00]: I worked on a few things.

[00:12:25] [SPEAKER_00]: I was on boards with little tiny companies,

[00:12:27] [SPEAKER_00]: but the cap table here

[00:12:28] [SPEAKER_00]: and the equity structure is going to be fascinating to watch

[00:12:31] [SPEAKER_00]: because that's going to be about power.

[00:12:34] [SPEAKER_00]: Mm-hmm, 100%.

[00:12:36] [SPEAKER_02]: Also important to note that Apple is no longer involved

[00:12:39] [SPEAKER_02]: with the funding round anymore.

[00:12:41] [SPEAKER_02]: So they're out.

[00:12:42] [SPEAKER_02]: Now that doesn't necessarily impact open AI's presence.

[00:12:45] [SPEAKER_02]: That was one thing that I thought of as like,

[00:12:46] [SPEAKER_02]: oh, is Apple just doing an about-face?

[00:12:48] [SPEAKER_02]: Like they announced this

[00:12:49] [SPEAKER_02]: at their Apple intelligence stream of news

[00:12:52] [SPEAKER_02]: with the last iPhone announcement.

[00:12:54] [SPEAKER_02]: And that's not the case.

[00:12:55] [SPEAKER_02]: My understanding is this doesn't impact

[00:12:57] [SPEAKER_02]: the relationship that Apple has with open AI's presence

[00:13:01] [SPEAKER_02]: in future iPhones and that sort of stuff.

[00:13:04] [SPEAKER_00]: Right, and if you want money in a company,

[00:13:07] [SPEAKER_00]: like you usually want it,

[00:13:08] [SPEAKER_00]: because you want two things.

[00:13:09] [SPEAKER_00]: You want a bargain,

[00:13:10] [SPEAKER_00]: you want future revenue,

[00:13:11] [SPEAKER_00]: but I think this stock was getting very expensive

[00:13:15] [SPEAKER_00]: considering the valuation could be going up and up.

[00:13:17] [SPEAKER_00]: And two, you want power over,

[00:13:20] [SPEAKER_00]: you want a strategic interest.

[00:13:21] [SPEAKER_00]: Apple has plenty of power.

[00:13:23] [SPEAKER_00]: Apple can exert that power.

[00:13:25] [SPEAKER_00]: So I don't think that they just saw any benefit anymore

[00:13:28] [SPEAKER_00]: in investing.

[00:13:29] [SPEAKER_00]: And then they got plenty of cash,

[00:13:31] [SPEAKER_00]: they can put it in lots of places.

[00:13:32] [SPEAKER_00]: And I guess they just said,

[00:13:33] [SPEAKER_00]: this is not the place to put it now.

[00:13:34] [SPEAKER_00]: I don't think it's a vote of confidence

[00:13:36] [SPEAKER_00]: or no confidence or anything.

[00:13:38] [SPEAKER_00]: It's just, oh, never mind.

[00:13:40] [SPEAKER_02]: Yeah.

[00:13:42] [SPEAKER_02]: Well, in other open AI news,

[00:13:45] [SPEAKER_02]: just kind of related to this

[00:13:47] [SPEAKER_02]: and actually just kind of a string of voice interaction news

[00:13:53] [SPEAKER_02]: if they were all lumped together.

[00:13:55] [SPEAKER_02]: I think that's a commonality

[00:13:57] [SPEAKER_02]: between the next three stories that we have ahead of us.

[00:14:00] [SPEAKER_02]: So first of all, open AI's Dev Day 2024,

[00:14:02] [SPEAKER_02]: this took place yesterday with a bunch of API news,

[00:14:06] [SPEAKER_02]: aimed to developers of course

[00:14:08] [SPEAKER_02]: to lower the barrier of entry for Devs with those products.

[00:14:13] [SPEAKER_02]: Real-time API so that developers can build products

[00:14:18] [SPEAKER_02]: using the technology behind advanced voice mode.

[00:14:21] [SPEAKER_02]: So this is gonna allow developers to tap into

[00:14:24] [SPEAKER_02]: that human-like conversational AI approach

[00:14:29] [SPEAKER_02]: with their own products.

[00:14:32] [SPEAKER_02]: Model distillation.

[00:14:33] [SPEAKER_02]: So this is using the output of larger models

[00:14:36] [SPEAKER_02]: to fine-tune smaller models.

[00:14:39] [SPEAKER_02]: That's something that they introduced.
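To make the distillation idea concrete, here is a minimal sketch, assuming the OpenAI Python SDK: collect answers from a larger teacher model, save them as a chat-format JSONL file, and fine-tune a smaller student model on those answers. The model names, prompts, and file name are illustrative assumptions, not details from the announcement.

```python
# Rough sketch of model distillation: generate outputs from a large "teacher"
# model, then fine-tune a smaller "student" model on those outputs.
# Model names, prompts, and file names below are illustrative assumptions.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompts = [
    "Summarize the plot of Moby-Dick in two sentences.",
    "Explain prompt caching to a product manager in one paragraph.",
]

# 1. Generate "gold" completions with the larger teacher model.
examples = []
for prompt in prompts:
    teacher = client.chat.completions.create(
        model="gpt-4o",  # assumed teacher model
        messages=[{"role": "user", "content": prompt}],
    )
    examples.append({
        "messages": [
            {"role": "user", "content": prompt},
            {"role": "assistant", "content": teacher.choices[0].message.content},
        ]
    })

# 2. Write the teacher's outputs as a JSONL fine-tuning file.
with open("distillation.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

# 3. Fine-tune the smaller student model on the teacher's outputs.
training_file = client.files.create(
    file=open("distillation.jsonl", "rb"), purpose="fine-tune"
)
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",  # assumed student model snapshot
)
print(job.id)
```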

[00:14:41] [SPEAKER_02]: Prompt caching, which is reusing input tokens

[00:14:46] [SPEAKER_02]: for prompts, let's say, that are frequent or common.

[00:14:51] [SPEAKER_02]: That reduces cost by up to 50%

[00:14:54] [SPEAKER_02]: and makes it up to 80% faster

[00:14:56] [SPEAKER_02]: when you're reusing some of the prompt cache.

[00:14:59] [SPEAKER_02]: I think that one, I don't know.

[00:15:00] [SPEAKER_02]: I feel like I wanna understand that one a little bit more.

[00:15:02] [SPEAKER_02]: Like if it's prompt caching,

[00:15:04] [SPEAKER_02]: does that mean that it's always giving a similar

[00:15:07] [SPEAKER_02]: or same answer to this?

[00:15:10] [SPEAKER_02]: I mean, that's what it leads me to believe.

[00:15:12] [SPEAKER_02]: Yeah, I mean, it's kind of basic database stuff, isn't it?

[00:15:15] [SPEAKER_00]: It's an efficiency structure for that.

[00:15:19] [SPEAKER_02]: And then vision fine-tuning,

[00:15:21] [SPEAKER_02]: that's training models on both images and text.
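On the prompt caching question raised above: what gets cached is the already-processed input prefix, not the model's answer, so repeated requests still generate fresh completions. Here's a minimal sketch, assuming the OpenAI Python SDK and the announced behavior that long, repeated prompt prefixes are cached automatically; the system prompt, model name, and questions are made-up examples.

```python
# Minimal sketch of benefiting from prompt caching: keep the long, static
# instructions at the front of every request so the provider can reuse the
# already-processed prefix. Only the short user question changes per call.
# (Contents below are illustrative assumptions, not the announced API details.)
from openai import OpenAI

client = OpenAI()

# A long, stable prefix: system instructions plus reference material.
STATIC_PREFIX = (
    "You are a support assistant for the ACME thermostat.\n"
    "Troubleshooting manual:\n"
    "...\n"  # imagine several thousand tokens of documentation here
)

def ask(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system", "content": STATIC_PREFIX},  # cacheable prefix
            {"role": "user", "content": question},         # varying suffix
        ],
    )
    return response.choices[0].message.content

# The second call shares the same prefix, so its input tokens are cheaper and
# faster to process; the generated answer itself is not cached or reused.
print(ask("Why is the display blank?"))
print(ask("How do I set a weekday schedule?"))
```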

[00:15:24] [SPEAKER_02]: But yeah, giving developers a little bit more access

[00:15:27] [SPEAKER_02]: to what OpenAI is doing.

[00:15:29] [SPEAKER_02]: I didn't actually end up seeing a huge amount of news about this.

[00:15:33] [SPEAKER_02]: No, I didn't see much at all.

[00:15:33] [SPEAKER_02]: But I thought it was interesting considering it's OpenAI

[00:15:36] [SPEAKER_02]: and I feel like everything they do gets a lot of attention.

[00:15:39] [SPEAKER_00]: But I think it's...

[00:15:42] [SPEAKER_00]: If you're a dev around OpenAI, that's kind of high level.

[00:15:47] [SPEAKER_00]: It's not like, oh, I have an app store

[00:15:49] [SPEAKER_00]: and I'm gonna put up a cute thing to make music, right?

[00:15:51] [SPEAKER_00]: This is like much heavier, much more difficult.

[00:15:54] [SPEAKER_00]: One kind of side, you mentioned voices there

[00:15:57] [SPEAKER_00]: and they're going big with more voices

[00:15:59] [SPEAKER_00]: and you played with the voices last week.

[00:16:01] [SPEAKER_00]: Just a little sideline here, Benedict Evans

[00:16:02] [SPEAKER_00]: whose email newsletter I greatly value,

[00:16:05] [SPEAKER_00]: former Andreessen Horowitz analyst.

[00:16:09] [SPEAKER_00]: I hadn't thought of this at all.

[00:16:11] [SPEAKER_00]: But AI, and I saw a story about this too,

[00:16:13] [SPEAKER_00]: that OpenAI wants to have all of its voices

[00:16:15] [SPEAKER_00]: really have human emotion and to project that.

[00:16:20] [SPEAKER_00]: We heard that last week somewhat.

[00:16:22] [SPEAKER_00]: By the way, I saw a hilarious TikTok

[00:16:27] [SPEAKER_00]: rendition of someone asking OpenAI,

[00:16:32] [SPEAKER_00]: whether it knew Ebonics,

[00:16:34] [SPEAKER_00]: which is the word for American Black Speech.

[00:16:37] [SPEAKER_00]: And when asked to speak in it,

[00:16:41] [SPEAKER_00]: oh no, I can't do voices, I can't do accents, it said.

[00:16:45] [SPEAKER_00]: But can you speak in it?

[00:16:46] [SPEAKER_00]: Can you speak in it?

[00:16:47] [SPEAKER_00]: And it said yes.

[00:16:48] [SPEAKER_00]: And then proceeded in that kind of cold white voice

[00:16:53] [SPEAKER_00]: to speak in Ebonics.

[00:16:57] [SPEAKER_00]: And it was rather like the great scene from Airplane!

[00:17:01] [SPEAKER_00]: where the flight attendant suddenly speaks in Ebonics, right?

[00:17:05] [SPEAKER_00]: And- Excellent scene, yes.

[00:17:06] [SPEAKER_00]: There's a few who said it was fake,

[00:17:08] [SPEAKER_00]: but I'm not so sure it was fake.

[00:17:09] [SPEAKER_00]: I don't know what the truth is.

[00:17:10] [SPEAKER_00]: In any case, it was hilarious, but that's not my point.

[00:17:13] [SPEAKER_00]: The point is that Ben Evans said that amusingly,

[00:17:18] [SPEAKER_00]: I'll quote from him,

[00:17:19] [SPEAKER_00]: its ability to sound enthusiastic or sad

[00:17:21] [SPEAKER_00]: would make it technically illegal to use at work in the EU

[00:17:25] [SPEAKER_00]: since the AI Act, which is effective next year,

[00:17:28] [SPEAKER_00]: bans any AI product that can infer emotions.

[00:17:33] [SPEAKER_00]: Oh, oh, that's interesting.

[00:17:35] [SPEAKER_00]: Because the EU doesn't want it to act human.

[00:17:37] [SPEAKER_00]: Well, the whole thing they're trying to do is to act human.

[00:17:40] [SPEAKER_00]: Yeah, the march. We hit the first little knee here

[00:17:43] [SPEAKER_00]: around them, which is gonna be-

[00:17:45] [SPEAKER_00]: Interesting. I can't see the EU

[00:17:46] [SPEAKER_00]: necessarily taking them to jail.

[00:17:48] [SPEAKER_00]: You weren't happy, how dare you be happy.

[00:17:50] [SPEAKER_00]: Well yeah.

[00:17:51] [SPEAKER_02]: I mean, also I think what comes up for me around that

[00:17:54] [SPEAKER_02]: is how do you define human enough?

[00:17:59] [SPEAKER_02]: What is that line between not human and human?

[00:18:03] [SPEAKER_02]: The line between not human and human.

[00:18:05] [SPEAKER_02]: What the heck is that?

[00:18:07] [SPEAKER_02]: Maybe that's a good title.

[00:18:09] [SPEAKER_02]: Yeah, I mean, I don't know,

[00:18:11] [SPEAKER_02]: maybe the AI Act goes into further detail,

[00:18:14] [SPEAKER_02]: but I'm guessing it probably doesn't.

[00:18:16] [SPEAKER_02]: Maybe it's the type of thing.

[00:18:18] [SPEAKER_02]: You know it when you hear it,

[00:18:19] [SPEAKER_02]: or you know it when you see it.

[00:18:21] [SPEAKER_02]: Yeah, sort of.

[00:18:22] [SPEAKER_00]: I thought it was amusing.

[00:18:24] [SPEAKER_00]: It's also part of the problem is

[00:18:25] [SPEAKER_00]: they don't, the EU Act doesn't want it to infer our emotions.

[00:18:30] [SPEAKER_00]: Right. And exploit them,

[00:18:32] [SPEAKER_00]: which is probably more to the point.

[00:18:33] [SPEAKER_00]: But still, it's an interesting conflict ahead.

[00:18:37] [SPEAKER_02]: Yeah, yeah indeed.

[00:18:39] [SPEAKER_02]: Well, OpenAI wasn't the only company holding a kind

[00:18:44] [SPEAKER_02]: of a developer conference that had a lot to do with AI.

[00:18:47] [SPEAKER_02]: Meta also had its big conference Connect 2024.

[00:18:53] [SPEAKER_02]: At this point, this is kind of old news.

[00:18:54] [SPEAKER_02]: It was last week, but it came shortly after our show.

[00:18:58] [SPEAKER_02]: And so we got to talk about it

[00:18:59] [SPEAKER_02]: because there was some really interesting stuff

[00:19:01] [SPEAKER_02]: at this event, big headlines around the Orion AR glasses prototype,

[00:19:09] [SPEAKER_02]: which where is it?

[00:19:11] [SPEAKER_02]: I know I have it here somewhere.

[00:19:12] [SPEAKER_02]: I could show it for video viewers.

[00:19:14] [SPEAKER_02]: I just didn't pull up the right article.

[00:19:16] [SPEAKER_02]: Essentially, 10 years in development,

[00:19:19] [SPEAKER_02]: we've heard Meta talk about this, you know,

[00:19:20] [SPEAKER_02]: at different events over the years.

[00:19:23] [SPEAKER_02]: Now kind of getting to the point

[00:19:25] [SPEAKER_02]: to where it looks more like normal glasses,

[00:19:28] [SPEAKER_02]: not necessarily exactly.

[00:19:29] [SPEAKER_02]: A little chunky, a little nerdy.

[00:19:31] [SPEAKER_02]: A little chunky.

[00:19:32] [SPEAKER_02]: You know, there's a lot of technology happening

[00:19:34] [SPEAKER_02]: inside those frames, you can definitely tell.

[00:19:37] [SPEAKER_02]: But getting there, I mean, far cry

[00:19:40] [SPEAKER_02]: from what we've seen in the past.

[00:19:43] [SPEAKER_02]: Light at 98 grams has voice and hand tracking controls,

[00:19:49] [SPEAKER_02]: which is kind of part of what you see here,

[00:19:50] [SPEAKER_02]: which is actually enabled my understanding

[00:19:53] [SPEAKER_02]: by this neural wristband.

[00:19:57] [SPEAKER_02]: Where does it show the copy of the wristband?

[00:20:00] [SPEAKER_02]: I think one of these pictures does.

[00:20:02] [SPEAKER_02]: Yeah, so this fits around your wrist.

[00:20:05] [SPEAKER_02]: They might as well put a watch on it or something.

[00:20:07] [SPEAKER_02]: Yeah, totally, right.

[00:20:09] [SPEAKER_02]: Well, you know, but I guess my understanding

[00:20:11] [SPEAKER_02]: is this is actually the wristband is going to roll out

[00:20:14] [SPEAKER_02]: before we see the glasses that potentially Meta

[00:20:17] [SPEAKER_02]: has another product along these lines

[00:20:20] [SPEAKER_02]: that is going to utilize this next year potentially.

[00:20:24] [SPEAKER_02]: So we'll see, I think that's rumor is still at this point,

[00:20:27] [SPEAKER_02]: but that this is gonna have some applications

[00:20:29] [SPEAKER_02]: outside of just this glasses that they were showing off.

[00:20:32] [SPEAKER_02]: But yeah, and what does that have to do with AI?

[00:20:35] [SPEAKER_02]: Well, it's integrating contextual AI.

[00:20:38] [SPEAKER_00]: AI through and through.

[00:20:38] [SPEAKER_02]: Yeah, yeah, Meta AI on board.

[00:20:41] [SPEAKER_02]: It has some micro LED projectors on each lens.

[00:20:46] [SPEAKER_02]: Really to a certain degree it's an evolution,

[00:20:48] [SPEAKER_02]: at least seemingly on its surface

[00:20:51] [SPEAKER_02]: in evolution of the Ray-Ban Meta smart glasses,

[00:20:54] [SPEAKER_02]: which can just do way more.

[00:20:56] [SPEAKER_02]: I mean that small glasses form factor

[00:20:58] [SPEAKER_02]: allowing you to see that type of content

[00:21:02] [SPEAKER_02]: has a very wide field of view.

[00:21:05] [SPEAKER_02]: I'm super interested in this.

[00:21:07] [SPEAKER_02]: I would love to check these out and see what it looks like.

[00:21:11] [SPEAKER_00]: So Axios got a hands on with it or a heads on, I should say.

[00:21:15] [SPEAKER_02]: I was actually, honestly, I was surprised by how many

[00:21:18] [SPEAKER_02]: how many outlets got heads on whatever you want to call it

[00:21:23] [SPEAKER_02]: with these.

[00:21:24] [SPEAKER_02]: Yeah, which is to say that,

[00:21:27] [SPEAKER_02]: usually when we see something like this,

[00:21:30] [SPEAKER_02]: it's like yeah, and you know,

[00:21:31] [SPEAKER_02]: it's one or two get the feature

[00:21:33] [SPEAKER_02]: and it's somewhere down the line,

[00:21:35] [SPEAKER_02]: but this time around it's like Meta's getting closer

[00:21:38] [SPEAKER_02]: because they let a lot of people check it out.

[00:21:41] [SPEAKER_00]: So Ina Fried wrote about it in the typical short Axios way,

[00:21:45] [SPEAKER_00]: I put it in the, I added it to the rundown,

[00:21:47] [SPEAKER_00]: and she says that they're bulky

[00:21:49] [SPEAKER_00]: but feel surprisingly light and comfortable.

[00:21:52] [SPEAKER_00]: One thing that interests me is obviously

[00:21:53] [SPEAKER_00]: they've put off a lot of the computing power

[00:21:55] [SPEAKER_00]: onto the separate box that you put in your pocket

[00:22:01] [SPEAKER_00]: and on TWiG with Stacey, going away back when

[00:22:01] [SPEAKER_00]: Leo and I were talking a lot about this idea

[00:22:03] [SPEAKER_00]: of a blob computer, just this thing you carry with you

[00:22:07] [SPEAKER_00]: that then interacts with whatever you're near.

[00:22:09] [SPEAKER_00]: And this is kind of the beginning of that.

[00:22:11] [SPEAKER_00]: So you have this box that you have in your pocket

[00:22:13] [SPEAKER_00]: and it could talk to these glasses

[00:22:14] [SPEAKER_00]: but it could also talk to your laptop,

[00:22:16] [SPEAKER_00]: it could also talk to anything else.

[00:22:17] [SPEAKER_00]: It's your computing presence.

[00:22:20] [SPEAKER_00]: I think that's an important bit here.

[00:22:23] [SPEAKER_00]: She says you select objects to interact

[00:22:25] [SPEAKER_00]: simply by looking at them.

[00:22:28] [SPEAKER_00]: The eye tracking I'm quoting here is so fast

[00:22:30] [SPEAKER_00]: that Meta employees told me they have to measure

[00:22:32] [SPEAKER_00]: where your eyes have been looking

[00:22:34] [SPEAKER_00]: because the user may have already moved on to another task.

[00:22:39] [SPEAKER_00]: To click on an object, you pinch your fingers together,

[00:22:42] [SPEAKER_00]: setting up the device was simple.

[00:22:46] [SPEAKER_00]: They showed an example of how a user could use it

[00:22:48] [SPEAKER_00]: looking at a table of contents

[00:22:50] [SPEAKER_00]: and Orion provided a recipe.

[00:22:51] [SPEAKER_00]: So it's the kind of stuff we've seen in demos.

[00:22:55] [SPEAKER_00]: And I think it's important,

[00:22:56] [SPEAKER_00]: I think it's really important and smart that Zuckerberg

[00:23:00] [SPEAKER_00]: said this is a ways off coming.

[00:23:04] [SPEAKER_00]: Right, don't sell in stock on this today.

[00:23:07] [SPEAKER_00]: I'm not trying to figure this out today

[00:23:08] [SPEAKER_00]: but this is how far we are in our labs.

[00:23:11] [SPEAKER_00]: And I think that's smart

[00:23:12] [SPEAKER_00]: because people excited about it

[00:23:13] [SPEAKER_00]: but don't overhype it quite yet.

[00:23:17] [SPEAKER_02]: Yeah, in the way that maybe they overhyped

[00:23:21] [SPEAKER_02]: and oversold on VR, right?

[00:23:23] [SPEAKER_02]: Yeah.

[00:23:25] [SPEAKER_02]: With the quest and really went deep

[00:23:27] [SPEAKER_02]: into the potential of the quest

[00:23:31] [SPEAKER_02]: to change the work environment or any of those things.

[00:23:35] [SPEAKER_02]: And this is where a product like this

[00:23:39] [SPEAKER_02]: is addressing, I think to a certain degree,

[00:23:42] [SPEAKER_02]: some of the concerns that people had

[00:23:44] [SPEAKER_02]: or some of the pushback that people myself included

[00:23:47] [SPEAKER_02]: have had in this idea that the future, the metaverse,

[00:23:51] [SPEAKER_02]: all this stuff that like the future,

[00:23:52] [SPEAKER_02]: the offices of the future will have people sitting

[00:23:55] [SPEAKER_02]: in VR headsets in their own lands,

[00:23:58] [SPEAKER_02]: meeting virtually.

[00:23:59] [SPEAKER_02]: I have a hard time seeing that,

[00:24:02] [SPEAKER_02]: maybe in my lifetime, maybe not, I suppose it could happen.

[00:24:06] [SPEAKER_02]: I have a hard time believing that personally

[00:24:09] [SPEAKER_02]: but something like this seems to bridge the gap to that.

[00:24:13] [SPEAKER_02]: We're starting to see that miniaturization,

[00:24:15] [SPEAKER_02]: the glasses don't look incredibly awkward

[00:24:18] [SPEAKER_02]: and cumbersome.

[00:24:21] [SPEAKER_02]: Everything's going in the right direction

[00:24:23] [SPEAKER_02]: for this kind of smaller form factor approach.

[00:24:26] [SPEAKER_02]: And the technology appears to be good enough

[00:24:29] [SPEAKER_02]: to pull it off to start with, you know, these mini,

[00:24:32] [SPEAKER_02]: sorry, micro LED projectors in each lens

[00:24:35] [SPEAKER_02]: in the smaller form factor doing a lot

[00:24:38] [SPEAKER_02]: of what Zuckerberg has been talking about now

[00:24:40] [SPEAKER_02]: for years in the VR space

[00:24:42] [SPEAKER_02]: but this being augmented reality.

[00:24:45] [SPEAKER_02]: So yeah, I'm really curious and really kind of interested

[00:24:49] [SPEAKER_02]: slash excited about the potential

[00:24:51] [SPEAKER_02]: of devices like this in the future.

[00:24:54] [SPEAKER_00]: So I saw some figure today

[00:24:55] [SPEAKER_00]: about how few people are using the rabbit on a given day.

[00:25:01] [SPEAKER_02]: I don't know what you're talking about yet.

[00:25:02] [SPEAKER_02]: It's all the way over there.

[00:25:04] [SPEAKER_02]: Now I got to get it.

[00:25:05] [SPEAKER_02]: What was the last time you turned it on?

[00:25:07] [SPEAKER_02]: Hold on, hold on.

[00:25:09] [SPEAKER_02]: I'll go ahead and pull it off the shelf.

[00:25:13] [SPEAKER_02]: I mean, when's the last time I turned it on?

[00:25:16] [SPEAKER_02]: It's been it's been a couple of months, but yeah.

[00:25:21] [SPEAKER_00]: I'm not, I'm not, I'm not decrying your prediction.

[00:25:25] [SPEAKER_00]: I think there's a quest here to make the device

[00:25:27] [SPEAKER_00]: that actually does make sense.

[00:25:29] [SPEAKER_02]: Yeah, yeah.

[00:25:31] [SPEAKER_02]: I mean, yeah, exactly.

[00:25:32] [SPEAKER_02]: You know, it's funny that you mentioned the rabbit

[00:25:34] [SPEAKER_02]: because they do have some announcement

[00:25:37] [SPEAKER_02]: or some news this week that I really found

[00:25:40] [SPEAKER_02]: did not get much attention at all

[00:25:42] [SPEAKER_02]: which is essentially they're rolling out wider

[00:25:45] [SPEAKER_02]: their LAM, large action model, playground

[00:25:50] [SPEAKER_02]: which is essentially finally demonstrating

[00:25:52] [SPEAKER_02]: and doing some of the promises that they made initially

[00:25:56] [SPEAKER_02]: that this device was going to,

[00:25:58] [SPEAKER_02]: which is essentially what does that mean?

[00:26:00] [SPEAKER_02]: It's like, in the demonstration that I saw,

[00:26:03] [SPEAKER_02]: I wish I had linked to it, well actually

[00:26:05] [SPEAKER_02]: it wouldn't have helped to link to it

[00:26:07] [SPEAKER_02]: because it took a long time.

[00:26:09] [SPEAKER_02]: That was one thing that I noticed

[00:26:10] [SPEAKER_02]: is that it took a long time for it to act on this

[00:26:12] [SPEAKER_02]: but you know, someone pointed the camera

[00:26:16] [SPEAKER_02]: at like a thing of Tide cleaner

[00:26:19] [SPEAKER_02]: and basically said I wanna order one of these from Amazon

[00:26:23] [SPEAKER_02]: and I mean, the whole process took a couple of minutes

[00:26:25] [SPEAKER_02]: but essentially what's going on behind the scenes

[00:26:28] [SPEAKER_02]: and what they're saying is a big deal about this

[00:26:30] [SPEAKER_02]: is it's not a pre-programmed thing

[00:26:33] [SPEAKER_02]: where rabbit has made a deal with Amazon

[00:26:37] [SPEAKER_02]: so that when you say that it launches into this thing

[00:26:39] [SPEAKER_02]: essentially it's doing what a human would do.

[00:26:43] [SPEAKER_02]: It's opening up an instance of a browser,

[00:26:44] [SPEAKER_02]: it's going, it's searching for the product,

[00:26:46] [SPEAKER_02]: pulling it up.

[00:26:47] [SPEAKER_02]: It's agentive exactly.

[00:26:49] [SPEAKER_02]: And so yes it added at least according to the video proof

[00:26:53] [SPEAKER_02]: that I saw, proof in air quotes,

[00:26:56] [SPEAKER_02]: it added the Tide to the cart on Amazon

[00:27:01] [SPEAKER_02]: and so yeah, so they are finally pushing out

[00:27:04] [SPEAKER_02]: some of the capabilities for people to start playing around

[00:27:07] [SPEAKER_02]: with the large action model thing on the rabbit.
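Rabbit hasn't published how the large action model drives the browser, but the "doing what a human would do" step described here is essentially browser automation. A bare-bones sketch of that half using Playwright is below; the retailer URL and CSS selectors are hypothetical placeholders, and a real agent would have a model plan and verify each step rather than hard-coding them.

```python
# Bare-bones sketch of the "agentive" browsing step: open a real browser, search
# a retailer for a product, and click the first result's add-to-cart button.
# This is NOT Rabbit's actual implementation; the URL and CSS selectors are
# hypothetical placeholders, and a production agent would let a model decide
# and check each step instead of hard-coding them.
from playwright.sync_api import sync_playwright

def add_first_result_to_cart(search_term: str) -> None:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://shop.example.com")        # placeholder retailer
        page.fill("input#search-box", search_term)   # placeholder selector
        page.keyboard.press("Enter")
        page.wait_for_selector(".search-result")     # placeholder selector
        page.locator(".search-result").first.click()
        page.click("button#add-to-cart")              # placeholder selector
        browser.close()

if __name__ == "__main__":
    add_first_result_to_cart("Tide liquid laundry detergent")
```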

[00:27:10] [SPEAKER_02]: I'm curious to know if that's actually

[00:27:12] [SPEAKER_02]: gonna change people's minds

[00:27:13] [SPEAKER_02]: or if this is even the right form factor to begin with

[00:27:16] [SPEAKER_02]: which I think is to your point, right?

[00:27:18] [SPEAKER_02]: Like maybe this just isn't it.

[00:27:20] [SPEAKER_02]: We already have something like this,

[00:27:22] [SPEAKER_02]: time and time again we keep pointing out

[00:27:24] [SPEAKER_02]: we already have something in this form factor.

[00:27:26] [SPEAKER_02]: It's a phone and it does these things

[00:27:28] [SPEAKER_02]: so why replace a phone with a lesser phone

[00:27:32] [SPEAKER_02]: that only does this thing?

[00:27:34] [SPEAKER_02]: Maybe it is the glasses perspective,

[00:27:36] [SPEAKER_02]: maybe that is the change.

[00:27:40] [SPEAKER_00]: If suddenly tomorrow tons of people came in

[00:27:43] [SPEAKER_00]: and then gave money to this show and made you rich,

[00:27:46] [SPEAKER_00]: richer than your wildest dreams, may that happen?

[00:27:50] [SPEAKER_00]: Would you be tempted to buy the Ray-Ban smart glasses?

[00:27:55] [SPEAKER_02]: Yeah, oh for sure.

[00:27:56] [SPEAKER_02]: You want to buy it? Yes.

[00:27:58] [SPEAKER_02]: I actually really do want a pair of the Ray-Ban smart

[00:28:00] [SPEAKER_02]: glasses, I just haven't been able to justify it yet.

[00:28:02] [SPEAKER_02]: Yeah totally, I would love to have those.

[00:28:04] [SPEAKER_02]: I would love to check them out because I mean

[00:28:06] [SPEAKER_02]: I think from the last couple of years

[00:28:09] [SPEAKER_02]: of actual AI driven hardware in one way, shape or form

[00:28:13] [SPEAKER_02]: maybe it's not as wide of a potential use

[00:28:16] [SPEAKER_02]: as something like the Rabbit R1

[00:28:18] [SPEAKER_02]: but it seems to be one of the few

[00:28:21] [SPEAKER_02]: AI driven hardware devices that people have

[00:28:24] [SPEAKER_02]: a good overall experience with.

[00:28:27] [SPEAKER_02]: Like I hear more positive than negative about it

[00:28:29] [SPEAKER_02]: from a user perspective and that has me curious.

[00:28:32] [SPEAKER_00]: Not to pry, do you wear contacts

[00:28:35] [SPEAKER_00]: or are you just super human?

[00:28:37] [SPEAKER_02]: I wear contacts for sure.

[00:28:38] [SPEAKER_00]: You wear contacts, you're blind without them.

[00:28:40] [SPEAKER_00]: Would you switch off of contacts on to glass?

[00:28:44] [SPEAKER_00]: If you like the Ray-Ban so much,

[00:28:46] [SPEAKER_00]: yeah.

[00:28:47] [SPEAKER_00]: Would you switch your lenses to that

[00:28:49] [SPEAKER_00]: or would you actually wear non lenses

[00:28:51] [SPEAKER_00]: in front of your contacts?

[00:28:53] [SPEAKER_00]: I mean how would you manage that?

[00:28:54] [SPEAKER_00]: I wear glasses, I'm always gonna wear glasses,

[00:28:56] [SPEAKER_00]: I'm never gonna wear contacts, I'm squeamish.

[00:28:59] [SPEAKER_00]: So that's easy for me. For you,

[00:29:00] [SPEAKER_00]: I'm curious about it. But if everybody starts

[00:29:02] [SPEAKER_00]: to love glasses but you either don't need them

[00:29:05] [SPEAKER_00]: or you have contacts, what does that do to the market?

[00:29:07] [SPEAKER_00]: I'm just curious, what you're thinking would be.

[00:29:09] [SPEAKER_02]: That's a really interesting point

[00:29:11] [SPEAKER_02]: that I hadn't considered.

[00:29:12] [SPEAKER_02]: I think in my head I'm just thinking,

[00:29:14] [SPEAKER_02]: oh I'd still wear contacts and then when I want

[00:29:16] [SPEAKER_02]: the ease of use or the convenience of having the glasses

[00:29:20] [SPEAKER_02]: I just throw them on top of my already,

[00:29:24] [SPEAKER_02]: my already poor eyes with the contact lenses in them.

[00:29:27] [SPEAKER_02]: But I guess yeah, I guess it could be.

[00:29:29] [SPEAKER_00]: Yeah, because you couldn't just use

[00:29:30] [SPEAKER_00]: the Ray-Bans as sunglasses, because then when you're

[00:29:32] [SPEAKER_00]: inside you wanna use the functionality.

[00:29:35] [SPEAKER_00]: You're not gonna put on sunglasses unless you're

[00:29:37] [SPEAKER_00]: a certain famous tech writer.

[00:29:41] [SPEAKER_02]: Yeah, so I guess to that end that yeah,

[00:29:43] [SPEAKER_02]: I would probably still want to wear contacts.

[00:29:47] [SPEAKER_02]: And have those glasses because those glasses, yeah,

[00:29:50] [SPEAKER_02]: but that does kind of minimize the potential

[00:29:53] [SPEAKER_02]: kind of benefit of something like that.

[00:29:57] [SPEAKER_02]: If you're always wearing them and they're always on you

[00:29:59] [SPEAKER_02]: then you have more potential use cases for them

[00:30:02] [SPEAKER_02]: that just kind of happen organically versus being like,

[00:30:05] [SPEAKER_02]: all right now is the glasses time

[00:30:07] [SPEAKER_02]: for this particular thing.

[00:30:09] [SPEAKER_00]: Right, just to tie this in, because it's all part of this quest

[00:30:13] [SPEAKER_00]: for the right device.

[00:30:15] [SPEAKER_02]: Yeah, yeah, the right device, the right form factor,

[00:30:18] [SPEAKER_02]: the right solution.

[00:30:20] [SPEAKER_02]: I mean, yeah, it could just be though

[00:30:22] [SPEAKER_02]: that once you use the right device, whatever that is

[00:30:25] [SPEAKER_02]: then it changes your perspective on all this too.

[00:30:29] [SPEAKER_02]: Like I could use one of those things

[00:30:31] [SPEAKER_02]: and have an amazing experience that like, okay, I'm sold.

[00:30:35] [SPEAKER_02]: Like it's worth it for me to not wear contacts anymore

[00:30:37] [SPEAKER_02]: and to wear these things.

[00:30:39] [SPEAKER_02]: All right, sorry for that little rabbit hole,

[00:30:40] [SPEAKER_00]: but it's interesting.

[00:30:41] [SPEAKER_00]: No, I love it.

[00:30:42] [SPEAKER_00]: Where this all fits in together,

[00:30:43] [SPEAKER_00]: trying to bring AI into your life.

[00:30:46] [SPEAKER_02]: Yeah, totally.

[00:30:47] [SPEAKER_00]: And it's not easy, it's not easy.

[00:30:49] [SPEAKER_02]: Yeah, yeah, and still very undefined.

[00:30:52] [SPEAKER_02]: That's why there's so much money

[00:30:54] [SPEAKER_02]: being thrown around right now

[00:30:56] [SPEAKER_02]: because I think everybody that has the money

[00:30:58] [SPEAKER_02]: recognizes that there's a lot of opportunity here.

[00:31:03] [SPEAKER_02]: I think the people who have the money

[00:31:05] [SPEAKER_02]: are always looking for the opportunity

[00:31:08] [SPEAKER_02]: that is yet untapped.

[00:31:10] [SPEAKER_02]: And here there is so much energy

[00:31:12] [SPEAKER_02]: around artificial intelligence,

[00:31:14] [SPEAKER_02]: the kind of the underneath the layers technology

[00:31:17] [SPEAKER_02]: of what's going on there.

[00:31:18] [SPEAKER_02]: Now we just have to figure out how we make a hardware

[00:31:22] [SPEAKER_02]: device that is like the perfect kind of delivery mechanism

[00:31:25] [SPEAKER_02]: for it or whatever.

[00:31:27] [SPEAKER_02]: And maybe that will never happen.

[00:31:28] [SPEAKER_02]: Maybe we already have it.

[00:31:29] [SPEAKER_00]: We'll see.

[00:31:30] [SPEAKER_02]: Yeah, we'll see.

[00:31:32] [SPEAKER_02]: Microsoft also announced an upgrade

[00:31:35] [SPEAKER_02]: to its Copilot experience for Windows PCs:

[00:31:39] [SPEAKER_02]: Copilot Voice.

[00:31:41] [SPEAKER_02]: So here we are again, conversational communication,

[00:31:43] [SPEAKER_02]: kind of like open AI voice mode,

[00:31:46] [SPEAKER_02]: Copilot Vision, which is vision

[00:31:49] [SPEAKER_02]: into Microsoft Edge browser.

[00:31:52] [SPEAKER_02]: So context aware assistance inside of the browser.

[00:31:55] [SPEAKER_02]: There's think deeper,

[00:31:57] [SPEAKER_02]: which is powered by OpenAI's o1 model.

[00:32:00] [SPEAKER_02]: So enhanced reasoning that we've talked about.

[00:32:03] [SPEAKER_02]: And then the return of Recall,

[00:32:05] [SPEAKER_02]: which is upgraded,

[00:32:06] [SPEAKER_02]: this time with upgraded security,

[00:32:08] [SPEAKER_02]: upgraded privacy, and now opt-in.

[00:32:11] [SPEAKER_02]: And I think of all the things

[00:32:12] [SPEAKER_02]: that we've talked about almost the last half hour,

[00:32:15] [SPEAKER_02]: there's this kind of point of voice interaction

[00:32:19] [SPEAKER_02]: as being a very constant.

[00:32:21] [SPEAKER_02]: Things are constantly pointing towards

[00:32:24] [SPEAKER_02]: a potential shift in how we interact with computers,

[00:32:28] [SPEAKER_02]: not that this is new information,

[00:32:29] [SPEAKER_02]: but it's just kind of a thing

[00:32:31] [SPEAKER_02]: that I noticed of all these stories.

[00:32:33] [SPEAKER_02]: They're all tapping into this.

[00:32:35] [SPEAKER_02]: And it just makes me think like,

[00:32:37] [SPEAKER_02]: I remember a decade ago or a decade and a half,

[00:32:40] [SPEAKER_02]: however long it's been,

[00:32:41] [SPEAKER_02]: when Siri came out,

[00:32:42] [SPEAKER_02]: when Google now slash assistant came out,

[00:32:44] [SPEAKER_02]: this was the trend then too.

[00:32:47] [SPEAKER_02]: It's just, it didn't quite make it.

[00:32:50] [SPEAKER_02]: And are we seeing enough now,

[00:32:53] [SPEAKER_02]: or are people more comfortable now

[00:32:55] [SPEAKER_02]: talking to their computers?

[00:32:56] [SPEAKER_02]: Is it just like, has it changed?

[00:32:57] [SPEAKER_02]: I don't think the market has said,

[00:32:59] [SPEAKER_00]: we've talked about this before,

[00:33:00] [SPEAKER_00]: but I think all the devices,

[00:33:01] [SPEAKER_00]: Madam A and G sitting by your side on your desk,

[00:33:05] [SPEAKER_00]: I finally put mine away.

[00:33:07] [SPEAKER_00]: I don't have any here anymore.

[00:33:08] [SPEAKER_00]: It was a power sucking clock.

[00:33:10] [SPEAKER_00]: It's gone.

[00:33:13] [SPEAKER_02]: We still got it in all of our rooms.

[00:33:14] [SPEAKER_02]: We still use it.

[00:33:17] [SPEAKER_02]: We use it for music.

[00:33:18] [SPEAKER_02]: We usually use it for weather.

[00:33:23] [SPEAKER_02]: And occasionally ask it,

[00:33:25] [SPEAKER_02]: like for conversions,

[00:33:26] [SPEAKER_02]: if we're cooking or whatever.

[00:33:27] [SPEAKER_02]: Still those,

[00:33:29] [SPEAKER_02]: but I mean, is it worth it?

[00:33:31] [SPEAKER_02]: I don't know that it necessarily is.

[00:33:33] [SPEAKER_00]: Yeah.

[00:33:35] [SPEAKER_00]: The thing about the Microsoft

[00:33:37] [SPEAKER_00]: announcement, what struck me was,

[00:33:39] [SPEAKER_00]: I objected to "think deeper."

[00:33:42] [SPEAKER_00]: Number one, it doesn't think.

[00:33:46] [SPEAKER_00]: It does not think.

[00:33:47] [SPEAKER_00]: Number two, it then does not think deeper.

[00:33:51] [SPEAKER_00]: And I think Microsoft has been in this position

[00:33:53] [SPEAKER_00]: where they're constantly overselling what AI can do.

[00:33:58] [SPEAKER_00]: And I think it gets them in trouble, right?

[00:33:59] [SPEAKER_00]: They put it up with search

[00:34:03] [SPEAKER_00]: and it was giving people bad responses

[00:34:06] [SPEAKER_00]: and that kind of got Microsoft in trouble too.

[00:34:08] [SPEAKER_00]: And I just don't think they're learning that lesson.

[00:34:11] [SPEAKER_00]: I don't know.

[00:34:13] [SPEAKER_02]: Yeah.

[00:34:13] [SPEAKER_02]: It makes me wonder if the EU at some point is gonna be like,

[00:34:18] [SPEAKER_02]: you know what?

[00:34:18] [SPEAKER_02]: We don't want you talking to AI that sounds human.

[00:34:21] [SPEAKER_02]: We also don't want you referring to AI as if it is human.

[00:34:25] [SPEAKER_02]: Right.

[00:34:26] [SPEAKER_02]: I don't know.

[00:34:27] [SPEAKER_00]: That might be a bridge too far, but...

[00:34:29] [SPEAKER_00]: Before you go away from the story,

[00:34:31] [SPEAKER_00]: I'm putting in one more because it's me

[00:34:32] [SPEAKER_00]: and I have to talk about Chromebooks.

[00:34:35] [SPEAKER_00]: So I just put a link in.

[00:34:36] [SPEAKER_00]: So Google had a Chromebook showcase.

[00:34:39] [SPEAKER_00]: It was a week ago and I got all pissed off

[00:34:41] [SPEAKER_00]: and I couldn't see any news about it.

[00:34:42] [SPEAKER_00]: What happened to it?

[00:34:42] [SPEAKER_00]: Plus they don't invite me, which really pisses me off.

[00:34:45] [SPEAKER_00]: I've been loyal to these things for a decade.

[00:34:47] [SPEAKER_00]: Well, it was embargoed for a week.

[00:34:49] [SPEAKER_00]: So now there's a new Samsung thin,

[00:34:53] [SPEAKER_00]: really nicely thin beautiful Chromebook.

[00:34:56] [SPEAKER_00]: I'm gonna find out whether it has a damn fan

[00:34:57] [SPEAKER_00]: because I have a Samsung now and the fan drives me crazy

[00:35:00] [SPEAKER_00]: but that's an older version.

[00:35:01] [SPEAKER_00]: But they're right there in that picture there.

[00:35:03] [SPEAKER_00]: You scrolled up to the new plus button.

[00:35:06] [SPEAKER_00]: And so this is being added to the plus Chromebooks

[00:35:10] [SPEAKER_00]: which is an entry into the AI features.

[00:35:13] [SPEAKER_00]: So you have something on your screen,

[00:35:14] [SPEAKER_00]: you hit that plus button and you can get translation.

[00:35:17] [SPEAKER_00]: You can get recent links, put it in a link.

[00:35:22] [SPEAKER_00]: You can get summaries of the content

[00:35:26] [SPEAKER_00]: and all kinds of stuff.

[00:35:27] [SPEAKER_00]: So to our point earlier about whether glasses are a good UI

[00:35:31] [SPEAKER_00]: to get you to AI or not, when it comes to your laptops

[00:35:34] [SPEAKER_00]: they're trying to figure out how to get

[00:35:36] [SPEAKER_00]: between co-pilot and now Google

[00:35:38] [SPEAKER_00]: how to get you to the AI and say,

[00:35:39] [SPEAKER_00]: hey, I'm here.

[00:35:40] [SPEAKER_00]: I can do stuff for you and we'll see.

[00:35:42] [SPEAKER_02]: Right now we've given you a single button.

[00:35:44] [SPEAKER_02]: Of course it's integrated into the caps lock so.

[00:35:47] [SPEAKER_00]: Which, yeah.

[00:35:48] [SPEAKER_00]: I'm a Chromebook user and supposedly that button

[00:35:51] [SPEAKER_00]: is the Google button, the search button

[00:35:55] [SPEAKER_00]: but you can change the button.

[00:35:56] [SPEAKER_00]: So I turn that immediately into caps lock.

[00:35:58] [SPEAKER_00]: So I don't know how this is gonna work now.

[00:36:00] [SPEAKER_00]: Is it, but we'll see.

[00:36:02] [SPEAKER_02]: Probably hold down function or something when you press it

[00:36:05] [SPEAKER_02]: or I'm not quite sure.

[00:36:06] [SPEAKER_02]: But nonetheless, a single button to take you to your AI.

[00:36:11] [SPEAKER_00]: Insert, like you're inserting AI into your life.

[00:36:14] [SPEAKER_02]: Yes, just a reminder it's there.

[00:36:17] [SPEAKER_02]: Why not use it?

[00:36:19] [SPEAKER_02]: I mean, yeah, that's interesting.

[00:36:20] [SPEAKER_02]: I'll be curious to see if that is a feature

[00:36:24] [SPEAKER_02]: that people opt to use or forget about

[00:36:27] [SPEAKER_02]: or maybe that really impacts

[00:36:31] [SPEAKER_02]: and increases your potential usage of AI

[00:36:35] [SPEAKER_02]: once you know that it's easy to summon it

[00:36:37] [SPEAKER_02]: because it appears in all different places,

[00:36:39] [SPEAKER_02]: all different kinds of ways outside of that button.

[00:36:43] [SPEAKER_02]: So yeah.

[00:36:43] [SPEAKER_00]: If they don't get people to use it

[00:36:44] [SPEAKER_00]: then a lot of investments going into something

[00:36:46] [SPEAKER_00]: that's not that worthwhile on a consumer basis.

[00:36:49] [SPEAKER_02]: Yeah, we talked about this a little bit last night

[00:36:51] [SPEAKER_02]: on Android Faithful and that was one of my questions is

[00:36:54] [SPEAKER_02]: is this like the kind of button

[00:36:57] [SPEAKER_02]: that like two or three years down the line

[00:36:58] [SPEAKER_02]: when everybody's moved on to whatever the next thing is

[00:37:01] [SPEAKER_02]: that this is just one of those legacy like,

[00:37:03] [SPEAKER_02]: oh, I remember that era.

[00:37:06] [SPEAKER_00]: It's a vestige like your tonsils and appendix.

[00:37:08] [SPEAKER_00]: Yeah, why do I have these things?

[00:37:10] Right.

[00:37:12] [SPEAKER_02]: Are you gonna get one of these laptops?

[00:37:13] [SPEAKER_02]: So are you thinking about it?

[00:37:14] [SPEAKER_00]: I'm thinking about it.

[00:37:15] [SPEAKER_00]: I have one now.

[00:37:16] [SPEAKER_00]: My Google one, which I loved

[00:37:18] [SPEAKER_00]: finally clocked out in too many ways on me.

[00:37:21] [SPEAKER_00]: So I went and got this Samsung one,

[00:37:23] [SPEAKER_00]: which is okay, it's the big red one

[00:37:24] [SPEAKER_00]: that was really pretty.

[00:37:25] [SPEAKER_00]: And I knew it was a problem,

[00:37:26] [SPEAKER_00]: but the fan is driving me insane.

[00:37:29] [SPEAKER_00]: It is constantly on, constantly loud.

[00:37:31] [SPEAKER_00]: I can close everything and the fan still comes on.

[00:37:33] [SPEAKER_00]: Because the prior version of this machine overheated a lot.

[00:37:36] [SPEAKER_00]: So they went overboard.

[00:37:37] [SPEAKER_00]: So I wonder whether this machine will have the fan

[00:37:40] [SPEAKER_00]: and how it'll operate.

[00:37:41] [SPEAKER_00]: And the other thing is I think Chrome Unboxed

[00:37:45] [SPEAKER_00]: said that it didn't have a touchscreen.

[00:37:48] [SPEAKER_02]: Oh, interesting.

[00:37:49] [SPEAKER_00]: And it's also 16 by nine

[00:37:51] [SPEAKER_00]: because it has a numeric keyboard

[00:37:53] [SPEAKER_00]: so they can get more keys on

[00:37:54] [SPEAKER_00]: and do more fancy things with the keys.

[00:37:55] [SPEAKER_00]: So I love the form factor of it

[00:37:57] [SPEAKER_00]: in terms of it's thin and light and solid

[00:38:00] [SPEAKER_00]: and a good screen.

[00:38:02] [SPEAKER_00]: And I might, I don't know.

[00:38:04] [SPEAKER_00]: But I still haven't gotten a new Android

[00:38:06] [SPEAKER_00]: and I still haven't gotten a new Chromebook.

[00:38:08] [SPEAKER_00]: We'll see.

[00:38:09] [SPEAKER_02]: What phone do you have right now?

[00:38:10] [SPEAKER_00]: Six.

[00:38:12] [SPEAKER_02]: The six?

[00:38:12] [SPEAKER_02]: I know, grandpa.

[00:38:13] [SPEAKER_02]: You're due, my friend.

[00:38:15] [SPEAKER_02]: I am.

[00:38:17] [SPEAKER_02]: At least, I mean nowadays you buy a new Pixel phone

[00:38:21] [SPEAKER_02]: you get seven years out of it potentially.

[00:38:23] [SPEAKER_02]: So maybe that's an old paradigm.

[00:38:26] [SPEAKER_02]: Oh, it's three years old, you've got to replace that.

[00:38:28] [SPEAKER_02]: Maybe you don't actually need to.

[00:38:29] [SPEAKER_00]: Plus my wife keeps phones for 40 years.

[00:38:31] [SPEAKER_00]: I mean, so she's like, why do you need a new phone?

[00:38:34] [SPEAKER_00]: Mine is

[00:38:34] [SPEAKER_00]: perfectly fine.

[00:38:35] [SPEAKER_00]: She likes the small ones

[00:38:36] [SPEAKER_00]: and she has the last small iPhone

[00:38:40] [SPEAKER_00]: that will ever be made probably.

[00:38:42] [SPEAKER_00]: So she's never let loose of that thing.

[00:38:44] [SPEAKER_00]: So that means I'm not justified to get a car.

[00:38:47] [SPEAKER_00]: Same with the phone, same with the car.

[00:38:49] [SPEAKER_00]: She's got tons more miles on it.

[00:38:51] [SPEAKER_00]: Oh, it's working fine.

[00:38:53] [SPEAKER_00]: I don't need to, you don't need to.

[00:38:55] [SPEAKER_00]: I don't need to, yeah.

[00:38:58] [SPEAKER_02]: Look we do different things.

[00:38:59] [SPEAKER_02]: I work in the world of technology, practice.

[00:39:02] [SPEAKER_00]: Nice excuse, nice try bud.

[00:39:04] [SPEAKER_02]: Yeah, try that one see how that works.

[00:39:06] [SPEAKER_02]: Try it on for size.

[00:39:08] [SPEAKER_02]: All right, we're gonna take a quick break.

[00:39:09] [SPEAKER_02]: When we come back we're gonna talk a little bit

[00:39:11] [SPEAKER_02]: about AI legislation in the state of California

[00:39:15] [SPEAKER_02]: where I live that's coming up in a second.

[00:39:19] [SPEAKER_02]: All right, here in California, Governor Gavin Newsom

[00:39:22] [SPEAKER_02]: vetoed the bill SB 1047,

[00:39:27] [SPEAKER_02]: calling it well intentioned but ultimately flawed.

[00:39:31] [SPEAKER_02]: This is the AI safety bill that we have definitely talked

[00:39:33] [SPEAKER_02]: about in previous episodes that would have seen strict

[00:39:36] [SPEAKER_02]: regulations in place for AI firms

[00:39:39] [SPEAKER_02]: and models within the state.

[00:39:42] [SPEAKER_02]: Of course you had companies like OpenAI

[00:39:44] [SPEAKER_02]: and Google lobbying hard against it saying,

[00:39:48] [SPEAKER_02]: you know, this is gonna stifle technology

[00:39:50] [SPEAKER_02]: and innovation, that sort of stuff.

[00:39:52] [SPEAKER_02]: And then you had on the flip side,

[00:39:53] [SPEAKER_02]: you had AI safety companies like Anthropic,

[00:39:56] [SPEAKER_02]: you had Elon Musk.

[00:39:58] [SPEAKER_02]: The Doomsters.

[00:39:59] [SPEAKER_02]: Yeah, the Doomsters. Much of Hollywood supporting the bill,

[00:40:03] [SPEAKER_02]: afraid you know kind of from their perspective

[00:40:06] [SPEAKER_02]: of being afraid of AI kind of taking their jobs

[00:40:09] [SPEAKER_02]: or taking their likeness and livelihood.

[00:40:12] [SPEAKER_02]: You put in an article from the amazing Mike Masnick

[00:40:16] [SPEAKER_02]: at TechDirt who says the bad bill going away

[00:40:20] [SPEAKER_02]: simply sets the scene for something worse

[00:40:21] [SPEAKER_02]: down the line, though he's happy that it went away.

[00:40:24] [SPEAKER_00]: Right, I agree. I think it was a good veto.

[00:40:26] [SPEAKER_00]: So Newsom was playing clever as he does politically.

[00:40:29] [SPEAKER_00]: He signed I think three other AI bills.

[00:40:32] [SPEAKER_00]: One about disinformation, one about summarizing,

[00:40:35] [SPEAKER_00]: I think it was about disinformation, I could be wrong.

[00:40:37] [SPEAKER_00]: One was about summarizing what you use to train the model,

[00:40:40] [SPEAKER_00]: which I think is good; transparency is a good thing.

[00:40:42] [SPEAKER_00]: But that opened the door for him to say,

[00:40:45] [SPEAKER_00]: ah but this one's a bridge too far.

[00:40:47] [SPEAKER_00]: The Information, to my surprise, in their headline said

[00:40:50] [SPEAKER_00]: it was an odd veto.

[00:40:52] [SPEAKER_00]: I don't think it's an odd veto at all.

[00:40:53] [SPEAKER_00]: I think it's a smart veto.

[00:40:54] [SPEAKER_00]: I agree with what Newsom did in this case.

[00:40:58] [SPEAKER_00]: Masnick said one argument, part of Newsom's argument,

[00:41:01] [SPEAKER_00]: was, well, it only deals with the big models

[00:41:03] [SPEAKER_00]: and it should deal with all of them.

[00:41:04] [SPEAKER_00]: Well, that's kind of worse, because the problem is,

[00:41:07] [SPEAKER_00]: as we've discussed on the show before,

[00:41:08] [SPEAKER_00]: do you put the responsibility up at the model level,

[00:41:11] [SPEAKER_00]: at the intermediary or application level,

[00:41:14] [SPEAKER_00]: or at the user level?

[00:41:15] [SPEAKER_00]: And to only do the model level,

[00:41:19] [SPEAKER_00]: A, I think is impossible and B,

[00:41:22] [SPEAKER_00]: fools people into false comfort that oh, we've made AI safe now

[00:41:28] [SPEAKER_00]: when in fact people at the next two layers

[00:41:30] [SPEAKER_00]: could do awful things with it

[00:41:32] [SPEAKER_00]: and the people at the top layer

[00:41:33] [SPEAKER_00]: can't necessarily stop them.

[00:41:36] [SPEAKER_00]: So that's my issue with the legislation.

[00:41:39] [SPEAKER_00]: And so I think it was good.

[00:41:41] [SPEAKER_00]: That's being seen as, well, tech wins another one,

[00:41:44] [SPEAKER_00]: God, I'm touched,

[00:41:45] [SPEAKER_00]: tech's in charge of Newsom and winning stuff.

[00:41:49] [SPEAKER_00]: And I think that's part of it, they did lobby against it.

[00:41:52] [SPEAKER_00]: They did think it was bad legislation,

[00:41:54] [SPEAKER_00]: but I think they were right.

[00:41:57] [SPEAKER_00]: So you have Masnick, when Newsom went on

[00:42:00] [SPEAKER_00]: about the size of the bills,

[00:42:01] [SPEAKER_00]: Masnick said in his inimitable fashion,

[00:42:04] [SPEAKER_00]: "I, dot dot dot, don't think that was the main problem

[00:42:07] [SPEAKER_00]: of the bill, dude," calling his governor dude.

[00:42:14] [SPEAKER_00]: Elsewhere he said his argument made more sense,

[00:42:16] [SPEAKER_00]: noting that any regulatory regime right now

[00:42:18] [SPEAKER_00]: must be adaptable.

[00:42:19] [SPEAKER_00]: The technology is still quite new.

[00:42:21] [SPEAKER_00]: I think that's absolutely true.

[00:42:23] [SPEAKER_00]: So I think we're gonna see, Mike's right,

[00:42:26] [SPEAKER_00]: we're gonna see other bills come along,

[00:42:28] [SPEAKER_00]: but I think this was a really important moment to say,

[00:42:32] [SPEAKER_00]: no.

[00:42:33] [SPEAKER_02]: Yeah, I thought one thing that Mike pointed out

[00:42:36] [SPEAKER_02]: that I think was a really good paragraph from this

[00:42:41] [SPEAKER_02]: is he says, my key takeaway from watching this debate

[00:42:44] [SPEAKER_02]: and other AI bills play out over the last few months

[00:42:47] [SPEAKER_02]: is that a lot of people feel that one,

[00:42:49] [SPEAKER_02]: social media is bad.

[00:42:51] [SPEAKER_02]: Two, they missed the chance to regulate it

[00:42:54] [SPEAKER_02]: when they should have.

[00:42:56] [SPEAKER_02]: Three, they don't wanna do that with AI

[00:42:59] [SPEAKER_02]: and therefore four, they need to over correct

[00:43:02] [SPEAKER_02]: and aggressively regulate AI.

[00:43:05] [SPEAKER_02]: And I think yeah, he's absolutely spot on with that

[00:43:08] [SPEAKER_02]: as far as the approach to AI compared to

[00:43:11] [SPEAKER_02]: what has happened in the last decade, decade and a half

[00:43:14] [SPEAKER_02]: with social media.

[00:43:15] [SPEAKER_02]: I think that drives a lot of the fear

[00:43:18] [SPEAKER_02]: and the uncertainty and the doubt.

[00:43:21] [SPEAKER_02]: Sure, we'll go ahead and throw the doubt in there as well.

[00:43:24] [SPEAKER_02]: Yeah, yeah.

[00:43:24] [SPEAKER_02]: But yeah.

[00:43:26] [SPEAKER_02]: Mike's usually right.

[00:43:27] [SPEAKER_02]: Interesting.

[00:43:28] [SPEAKER_02]: Yeah, he's great.

[00:43:30] [SPEAKER_02]: He's great.

[00:43:32] [SPEAKER_02]: Google DeepMind unveiled AlphaChip.

[00:43:35] [SPEAKER_02]: This is AI that designs computer chips

[00:43:39] [SPEAKER_02]: and it can design chip layouts in hours, says Google,

[00:43:44] [SPEAKER_02]: not months.

[00:43:45] [SPEAKER_02]: So this is just one of those examples of an AI

[00:43:48] [SPEAKER_02]: being applied to something that, I mean,

[00:43:50] [SPEAKER_02]: when you think of the amount of time saved

[00:43:53] [SPEAKER_02]: when you're cutting something

[00:43:55] [SPEAKER_02]: that would normally take months to hours.

[00:43:57] [SPEAKER_02]: Like when I hear things like that,

[00:43:58] [SPEAKER_02]: like it's kind of hard for me to comprehend

[00:44:01] [SPEAKER_02]: that at the same time,

[00:44:04] [SPEAKER_02]: like I know that the ways that I'm using AI

[00:44:07] [SPEAKER_02]: in my own job right now

[00:44:09] [SPEAKER_02]: and in my own kind of professional world right now,

[00:44:12] [SPEAKER_02]: I am doing certain things with AI

[00:44:16] [SPEAKER_02]: that take me, that cut something

[00:44:18] [SPEAKER_02]: that may have taken me two hours before

[00:44:21] [SPEAKER_02]: down to a matter of like 10 minutes.

[00:44:23] [SPEAKER_02]: Like what?

[00:44:24] [SPEAKER_02]: Summaries or transcripts or what?

[00:44:26] [SPEAKER_02]: Like research.

[00:44:27] [SPEAKER_02]: Research is, you know,

[00:44:29] [SPEAKER_02]: finding disparate places that have shared information

[00:44:33] [SPEAKER_02]: so that I can confirm pieces of information

[00:44:36] [SPEAKER_02]: so that I can compile a list

[00:44:37] [SPEAKER_02]: of something that I'm trying to do.

[00:44:40] [SPEAKER_02]: You know, transcribing, transcription is one of those things

[00:44:43] [SPEAKER_02]: that is, I mean, it was not very long ago

[00:44:47] [SPEAKER_02]: that transcription was a time consuming,

[00:44:50] [SPEAKER_02]: very expensive process if you wanted to do it

[00:44:52] [SPEAKER_02]: with everything.

[00:44:53] [SPEAKER_02]: And now like it is free and almost immediate.

[00:44:57] [SPEAKER_02]: It is like just crazy.
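For a sense of how cheap that step has become, here is a minimal local-transcription sketch using the open-source Whisper model; the file name is just a placeholder, not anything from the show.

```python
# Minimal local transcription sketch using the open-source "openai-whisper" package.
# Install: pip install -U openai-whisper  (ffmpeg must also be installed on the system)
import whisper

# Smaller checkpoints ("tiny", "base") are fast; "medium"/"large" trade speed for accuracy.
model = whisper.load_model("base")

# "episode.mp3" is a placeholder path for whatever recording needs transcribing.
result = model.transcribe("episode.mp3")

print(result["text"])                      # full transcript
for seg in result["segments"]:             # timestamped segments
    print(f'[{seg["start"]:7.1f}s] {seg["text"]}')
```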

[00:44:59] [SPEAKER_02]: And so when I think of like the time savings

[00:45:01] [SPEAKER_02]: from that small example,

[00:45:04] [SPEAKER_02]: and I believe it because I've experienced it,

[00:45:08] [SPEAKER_02]: like what would I be doing if these tools didn't exist

[00:45:10] [SPEAKER_02]: and I'm doing what I'm doing right now?

[00:45:12] [SPEAKER_02]: I'd be spending a lot more time on these other things

[00:45:14] [SPEAKER_02]: which means I wouldn't be spending that time

[00:45:16] [SPEAKER_02]: doing some of the other things

[00:45:17] [SPEAKER_02]: that I'm able to do as a result.

[00:45:19] [SPEAKER_02]: And then I look at a story like this

[00:45:20] [SPEAKER_02]: and I'm just like, man, that is crazy

[00:45:23] [SPEAKER_02]: to cut something down that would take months down to hours.

[00:45:26] [SPEAKER_00]: And chips are mind-bogglingly complex,

[00:45:33] [SPEAKER_00]: as complex as that word.

[00:45:36] [SPEAKER_00]: And you wonder how they QC it,

[00:45:41] [SPEAKER_00]: how do they know that the AI didn't make a mistake

[00:45:43] [SPEAKER_00]: in that process when it gets applied

[00:45:46] [SPEAKER_00]: to many, many things. But we'll see, that's interesting.

[00:45:49] [SPEAKER_00]: If I may, go ahead.

[00:45:52] [SPEAKER_02]: I was just gonna say in both cases,

[00:45:53] [SPEAKER_02]: they did say that human developers do most of the work,

[00:45:57] [SPEAKER_02]: AlphaChip is used for a limited set of the components.

[00:46:01] [SPEAKER_02]: But yeah.

[00:46:02] [SPEAKER_00]: Which I gotta imagine what it was like

[00:46:03] [SPEAKER_00]: when you designed cars before CAD/CAM came along.

[00:46:08] [SPEAKER_00]: Right, the precision you could get

[00:46:09] [SPEAKER_00]: or buildings, for architects.

[00:46:12] [SPEAKER_00]: Yeah.

[00:46:13] [SPEAKER_00]: You know what a difference.

[00:46:14] [SPEAKER_00]: So if I may go down a good comparison,

[00:46:16] [SPEAKER_00]: one of my rabbit holes for just a moment.

[00:46:18] [SPEAKER_00]: So I'm right now writing and still researching a book

[00:46:21] [SPEAKER_00]: I'm writing on the history of the Linotype,

[00:46:23] [SPEAKER_00]: which is very wonky.

[00:46:24] [SPEAKER_00]: It's a technology that changed from setting type one letter

[00:46:27] [SPEAKER_00]: at a time to a line at a time

[00:46:28] [SPEAKER_00]: and it opened the door to mass media.

[00:46:29] [SPEAKER_00]: I won't bore you on that particular line more.

[00:46:32] [SPEAKER_00]: The inspiration for the machine that won in the end

[00:46:35] [SPEAKER_00]: is a guy named James O. Clephane.

[00:46:38] [SPEAKER_00]: And Clephane was a stenographer

[00:46:40] [SPEAKER_00]: working for the secretary of state under Lincoln.

[00:46:44] [SPEAKER_00]: And if you think about it,

[00:46:46] [SPEAKER_00]: the only way that you could record human speech

[00:46:49] [SPEAKER_00]: was by writing it down.

[00:46:51] [SPEAKER_00]: And so stenographers were in a terribly important field

[00:46:54] [SPEAKER_00]: and they invented shorthand to try to make it faster.

[00:46:58] [SPEAKER_00]: And Clephane was crucial in testing the first typewriters

[00:47:01] [SPEAKER_00]: to try to make that work.

[00:47:03] [SPEAKER_00]: And then I found out, I didn't realize this at first

[00:47:06] [SPEAKER_00]: that he was also, so he was an investor as well

[00:47:09] [SPEAKER_00]: in the prototype, but he was also an investor

[00:47:10] [SPEAKER_00]: in the first competitor to Edison's phonograph.

[00:47:14] [SPEAKER_00]: And here's my little rabbit hole payoff.

[00:47:16] [SPEAKER_00]: Probably not worth this trip,

[00:47:18] [SPEAKER_00]: but you're on it already.

[00:47:20] [SPEAKER_00]: We're there, we're in.

[00:47:22] [SPEAKER_00]: Stenographers were also known as phonographers

[00:47:25] [SPEAKER_00]: because they recorded things.

[00:47:28] [SPEAKER_00]: So the phonographer, the person, was the root of the phonograph.

[00:47:34] [SPEAKER_00]: Because if you think about it,

[00:47:35] [SPEAKER_00]: the only way you could record anything

[00:47:36] [SPEAKER_00]: was by somebody writing it down.

[00:47:37] [SPEAKER_00]: And that person was the thing that recorded it.

[00:47:40] [SPEAKER_00]: And that was the verb that was used.

[00:47:41] [SPEAKER_00]: That was how you recorded an event.

[00:47:43] [SPEAKER_00]: It's the person recording it with pen on paper.

[00:47:47] [SPEAKER_00]: And then maybe you get it into print.

[00:47:48] [SPEAKER_00]: And so when you invented something to record mechanically,

[00:47:54] [SPEAKER_00]: it was called a phonograph as a result.

[00:47:56] [SPEAKER_00]: Phonograph.

[00:47:57] [SPEAKER_00]: Oh, that's awesome.

[00:47:58] [SPEAKER_00]: Isn't that fun?

[00:47:59] Ha ha ha ha ha.

[00:48:00] [SPEAKER_00]: I thought it was fun.

[00:48:02] [SPEAKER_00]: But now you go to your transcripts,

[00:48:05] [SPEAKER_00]: you see kind of a straight line here

[00:48:07] [SPEAKER_00]: of this effort to record reality that happens.

[00:48:10] [SPEAKER_02]: Yeah, yeah.

[00:48:12] [SPEAKER_02]: And that whole kind of breadcrumb trail

[00:48:16] [SPEAKER_02]: leading to what you were just talking about.

[00:48:18] [SPEAKER_02]: How long did that take you to research versus,

[00:48:22] [SPEAKER_02]: like I wonder if an AI,

[00:48:24] [SPEAKER_02]: if you knew the right question,

[00:48:26] [SPEAKER_02]: if it was like here you go 10 seconds later,

[00:48:28] [SPEAKER_00]: we found that answer for you.

[00:48:30] [SPEAKER_00]: It's a good question.

[00:48:31] [SPEAKER_00]: I don't know.

[00:48:31] [SPEAKER_00]: I'll now go and try to act as if I don't know that

[00:48:34] [SPEAKER_00]: and see if it tells me that.

[00:48:36] Ha ha ha ha.

[00:48:37] [SPEAKER_00]: Maybe you don't want to know.

[00:48:39] [SPEAKER_00]: Well, but the thing is,

[00:48:39] [SPEAKER_00]: it's such a serendipitous moment.

[00:48:41] [SPEAKER_00]: As I read through,

[00:48:43] [SPEAKER_00]: I see that Clephane also invested in the phonograph.

[00:48:46] [SPEAKER_00]: And I'm like, okay, well why?

[00:48:47] [SPEAKER_00]: If you're Sam Altman,

[00:48:48] [SPEAKER_00]: well because deal flow came to you,

[00:48:50] [SPEAKER_00]: but that's not the way it worked then, right?

[00:48:52] [SPEAKER_00]: But then I had to stop and think back

[00:48:53] [SPEAKER_00]: and say, well, what's the breadcrumbs

[00:48:55] [SPEAKER_00]: that tied his interests together?

[00:48:58] [SPEAKER_00]: The typewriter, the typesetter and the phonograph

[00:49:01] [SPEAKER_00]: were all about trying to record Supreme Court hearings

[00:49:05] [SPEAKER_00]: and congressional hearings and meetings

[00:49:07] [SPEAKER_00]: and end up with them in print for the public, right?

[00:49:13] [SPEAKER_00]: So to see that string, I hope, still required me.

[00:49:18] [SPEAKER_02]: I don't know.

[00:49:20] [SPEAKER_02]: Yes indeed.

[00:49:21] [SPEAKER_02]: I'd like to say indeed.

[00:49:22] [SPEAKER_02]: You're still needed, Jeff.

[00:49:23] [SPEAKER_02]: I hope.

[00:49:25] [SPEAKER_02]: You are.

[00:49:26] [SPEAKER_02]: AI has not replaced you yet.

[00:49:28] [SPEAKER_02]: I mean, unless we're talking NotebookLM

[00:49:31] [SPEAKER_02]: and then AI has replaced both of us.

[00:49:34] [SPEAKER_00]: Boy are people having fun with that.

[00:49:36] [SPEAKER_02]: That continues to, oh.

[00:49:37] [SPEAKER_02]: Oh my goodness.

[00:49:38] [SPEAKER_02]: I keep seeing examples that I'm just like,

[00:49:40] [SPEAKER_02]: oh my goodness, what is going on?

[00:49:41] [SPEAKER_02]: I saw, oh boy, talk about, okay,

[00:49:45] [SPEAKER_02]: we're on all over tangents today and that's okay.

[00:49:48] [SPEAKER_02]: I saw one example where someone had created

[00:49:51] [SPEAKER_02]: a NotebookLM notebook that was nothing but the word poop

[00:49:55] [SPEAKER_02]: and fart repeated over and over and over

[00:49:58] [SPEAKER_02]: and over and over again.

[00:50:01] [SPEAKER_02]: And I don't know, the output from it,

[00:50:03] [SPEAKER_02]: like you expect that to be something

[00:50:04] [SPEAKER_02]: that's going to break the system or whatever

[00:50:06] [SPEAKER_02]: and it just ends up, like, that NotebookLM system

[00:50:10] [SPEAKER_02]: just continues to impress me.

[00:50:12] [SPEAKER_02]: The more I hear about it, the more I'm just like,

[00:50:13] [SPEAKER_02]: oh my goodness, like what, where is this leading?

[00:50:16] [SPEAKER_02]: Maybe podcasters will be replaced.

[00:50:18] [SPEAKER_02]: I don't know, could happen.

[00:50:20] [SPEAKER_00]: Since we're on poop, okay, we'll get off of this

[00:50:22] [SPEAKER_00]: rabbit hole or rabbit litter box in a second.

[00:50:26] [SPEAKER_00]: I don't have it handy right now,

[00:50:27] [SPEAKER_00]: but there was, on n+1,

[00:50:29] [SPEAKER_00]: an amazing essay, a very long essay

[00:50:31] [SPEAKER_00]: that went on about trying to play with AI art.

[00:50:37] [SPEAKER_00]: And it started with a prompt for a cat pooping.

[00:50:42] [SPEAKER_00]: And then what they do is they get the picture,

[00:50:44] [SPEAKER_00]: whatever it is, and then they have a related program

[00:50:48] [SPEAKER_00]: describe it in text.

[00:50:50] [SPEAKER_00]: And then they put that text back into the image generation

[00:50:55] [SPEAKER_00]: and then do that for a few generations

[00:50:57] [SPEAKER_00]: to see where it goes and what does it create?

[00:51:00] [SPEAKER_00]: And so there you have artists now playing with this

[00:51:04] [SPEAKER_00]: and trying to go back and forth between text and image.

[00:51:08] [SPEAKER_00]: It's weird, but interesting.
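A minimal sketch of that image-to-caption-to-image "telephone" loop, assuming off-the-shelf models from the Hugging Face Hub; the model IDs and the five-generation count are illustrative stand-ins, not details from the essay.

```python
# Sketch of the image -> caption -> image loop described above.
# Model choices are illustrative, not necessarily what the essay's authors used.
# Install: pip install diffusers transformers torch pillow accelerate
import torch
from diffusers import StableDiffusionPipeline
from transformers import pipeline

painter = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
).to("cuda")
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

prompt = "a cat pooping"            # the essay's starting prompt
for generation in range(5):         # run the loop for a few generations
    image = painter(prompt).images[0]
    image.save(f"generation_{generation}.png")
    # Describe the image in text, then feed that description back in as the next prompt.
    prompt = captioner(image)[0]["generated_text"]
    print(generation, prompt)
```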

[00:51:11] [SPEAKER_02]: Yeah, it is.

[00:51:11] [SPEAKER_02]: That's super interesting.

[00:51:13] [SPEAKER_02]: It's like the AI equivalent of the game Telephone.

[00:51:17] [SPEAKER_02]: Yeah, that's so interesting.

[00:51:19] [SPEAKER_02]: Yeah, I wonder what you come up with

[00:51:20] [SPEAKER_02]: when you do that 10 times.

[00:51:24] [SPEAKER_02]: Artists are apparently explored.

[00:51:26] [SPEAKER_02]: We're raw artists.

[00:51:26] [SPEAKER_02]: We definitely are.

[00:51:27] [SPEAKER_02]: That's not difficult to do.

[00:51:29] [SPEAKER_02]: Yeah, exactly.

[00:51:31] [SPEAKER_02]: It's the morning, totally shifting gears from poop.

[00:51:35] [SPEAKER_02]: Maybe not actually.

[00:51:37] [SPEAKER_02]: It's the morning after a big vice presidential debate

[00:51:41] [SPEAKER_02]: here in the US.

[00:51:44] [SPEAKER_02]: So maybe we'll talk just very briefly

[00:51:46] [SPEAKER_02]: about the election for a second.

[00:51:48] [SPEAKER_02]: Pollsters are utilizing AI to improve

[00:51:52] [SPEAKER_02]: or to attempt to improve their polling methods

[00:51:55] [SPEAKER_02]: and accuracy.

[00:51:57] [SPEAKER_02]: And of course they're doing this

[00:51:59] [SPEAKER_02]: because they wanna get a better prediction

[00:52:00] [SPEAKER_02]: on outcomes, potential outcomes.

[00:52:03] [SPEAKER_02]: They're using a wider dataset for analysis,

[00:52:05] [SPEAKER_02]: or rather using the systems to do wider data analysis,

[00:52:09] [SPEAKER_02]: attempting to catch trends that human analysts

[00:52:12] [SPEAKER_02]: may have missed.

[00:52:15] [SPEAKER_02]: Yeah, you put this article in here.

[00:52:16] [SPEAKER_00]: I put it in here

[00:52:17] [SPEAKER_00]: because it's a little bit of a rant for me.

[00:52:19] [SPEAKER_00]: So I hate polling.

[00:52:22] [SPEAKER_00]: I think, and I quote the late professor

[00:52:24] [SPEAKER_00]: at Columbia University, James Carey,

[00:52:26] [SPEAKER_00]: who explained that polling preempts

[00:52:28] [SPEAKER_00]: the public discourse it is intended to measure.

[00:52:30] [SPEAKER_00]: It reveals the pollsters' views,

[00:52:32] [SPEAKER_00]: the pollsters' structure of thought

[00:52:34] [SPEAKER_00]: and it takes all nuance away from people,

[00:52:36] [SPEAKER_00]: putting them in binary buckets

[00:52:38] [SPEAKER_00]: that the pollster determines.

[00:52:40] [SPEAKER_00]: And it further kind of says, okay, well, that's done now.

[00:52:43] [SPEAKER_00]: That's all my opinion is

[00:52:44] [SPEAKER_00]: and it doesn't matter anymore.

[00:52:45] [SPEAKER_00]: Why am I gonna vote?

[00:52:46] [SPEAKER_00]: Why am I gonna get involved in discussions and so on?

[00:52:48] [SPEAKER_00]: So I hate polling like focus groups.

[00:52:50] [SPEAKER_00]: I think that they are false

[00:52:55] [SPEAKER_00]: distillations of human discourse.

[00:52:57] [SPEAKER_00]: Yeah.

[00:52:58] [SPEAKER_00]: And so polling though has also gotten worse,

[00:53:01] [SPEAKER_00]: technically because people don't have landline phones

[00:53:03] [SPEAKER_00]: like old Uncle Jeff still has.

[00:53:06] [SPEAKER_00]: And I have it but you never answer it.

[00:53:08] [SPEAKER_00]: Never, never, never.

[00:53:09] [SPEAKER_00]: Because it is always-

[00:53:10] [SPEAKER_00]: Do you still get calls on it?

[00:53:11] [SPEAKER_00]: Oh yeah, there's always spam, tons of spam.

[00:53:13] [SPEAKER_00]: We get some on our mobile phones,

[00:53:14] [SPEAKER_00]: but that's all it is on the landline phone, right?

[00:53:17] [SPEAKER_00]: So they can't use that anymore.

[00:53:19] [SPEAKER_00]: And so,

[00:53:20] [SPEAKER_00]: they can't get as many Republicans.

[00:53:22] [SPEAKER_00]: So they always adjust.

[00:53:23] [SPEAKER_00]: They say we're gonna go R plus four

[00:53:24] [SPEAKER_00]: because we learned from our prior polls

[00:53:26] [SPEAKER_00]: that we under measured the Republicans.

[00:53:28] [SPEAKER_00]: So we're gonna pump this up to more than we have

[00:53:30] [SPEAKER_00]: and they make these adjustments in it.

[00:53:32] [SPEAKER_00]: So it's already faked out

[00:53:34] [SPEAKER_00]: in the sense that they are making some presumption

[00:53:36] [SPEAKER_00]: about the electorate.
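As a toy illustration of the arithmetic behind that kind of "R plus four" adjustment, here is a small reweighting sketch; every number in it is invented purely for illustration, not taken from any real poll.

```python
# Toy example of reweighting a poll sample toward an assumed electorate.
# All numbers are invented for illustration only.
sample = {"R": 380, "D": 450, "I": 170}                  # respondents reached, by party ID
assumed_electorate = {"R": 0.46, "D": 0.42, "I": 0.12}   # the pollster's presumption

total = sum(sample.values())
weights = {
    party: assumed_electorate[party] / (sample[party] / total)
    for party in sample
}

# Each under-sampled group now counts for more than one person per respondent,
# which is the pollster-constructed presumption being criticized above.
for party, w in weights.items():
    print(f"{party}: weight per respondent = {w:.2f}")
```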

[00:53:38] [SPEAKER_00]: What appalled me about this story

[00:53:40] [SPEAKER_00]: is that they use the AI in some cases to ask questions

[00:53:45] [SPEAKER_00]: which saves them time of the human being

[00:53:47] [SPEAKER_00]: on the phone asking things, whatever.

[00:53:49] [SPEAKER_00]: But they also use it sometimes to answer questions.

[00:53:53] [SPEAKER_00]: Just like they come up with fake content,

[00:53:55] [SPEAKER_00]: they come up with fake people,

[00:53:56] [SPEAKER_00]: which is really no different from saying R plus four.

[00:53:59] [SPEAKER_00]: Yeah. In essence, they're saying, well, let's adjust this.

[00:54:02] [SPEAKER_00]: Which is to say that it's the pollster

[00:54:06] [SPEAKER_00]: who's constructing the world, not the electorate.

[00:54:10] [SPEAKER_00]: So the involvement of AI in this,

[00:54:12] [SPEAKER_00]: I think it's something to track going forward.

[00:54:14] [SPEAKER_00]: How much AI gets involved in and ruins

[00:54:17] [SPEAKER_00]: or makes even worse polling

[00:54:18] [SPEAKER_00]: is gonna be something really interesting to track.

[00:54:21] [SPEAKER_00]: So that's why I put this in.

[00:54:22] [SPEAKER_02]: Yeah, well, especially at a time when,

[00:54:25] [SPEAKER_02]: there's already so much doubt about polling

[00:54:28] [SPEAKER_02]: and concerns about transparency,

[00:54:31] [SPEAKER_02]: concerns about bias, public trust, all these things,

[00:54:36] [SPEAKER_02]: it certainly doesn't seem to help that equation,

[00:54:39] [SPEAKER_02]: help that issue at all.

[00:54:42] [SPEAKER_02]: That's interesting.

[00:54:43] [SPEAKER_02]: Yeah, I pretty much try and do my best to ignore

[00:54:49] [SPEAKER_02]: that stuff as best as I can.

[00:54:52] [SPEAKER_02]: And finally, Hugging Face,

[00:54:55] [SPEAKER_02]: the open source AI model hosting platform,

[00:54:58] [SPEAKER_02]: the most popular.

[00:54:59] [SPEAKER_02]: I mean, it's the one that you hear of probably most often

[00:55:02] [SPEAKER_02]: when it comes to people sharing open source AI models.

[00:55:08] [SPEAKER_02]: Hit 1 million models last week.

[00:55:11] [SPEAKER_02]: Often, the service is referred to as the GitHub of AI.

[00:55:17] [SPEAKER_02]: So you're gonna find your Llamas there.

[00:55:19] [SPEAKER_02]: You're gonna find your Stable Diffusions,

[00:55:21] [SPEAKER_02]: your Mistrals, all the kind of the big name

[00:55:24] [SPEAKER_02]: kind of open source models are stored there.

[00:55:28] [SPEAKER_02]: CEO Clément Delangue says the reason it's so successful

[00:55:32] [SPEAKER_02]: is because it proves that quote,

[00:55:33] [SPEAKER_02]: smaller specialized, customized, optimized models

[00:55:37] [SPEAKER_02]: for your use case, your domain, your language,

[00:55:40] [SPEAKER_02]: your hardware and generally your constraints are better.

[00:55:45] [SPEAKER_02]: And we've talked about this

[00:55:46] [SPEAKER_02]: compared to the one model to rule them all fallacy.

[00:55:49] [SPEAKER_02]: Kind of proving that maybe the biggest gains

[00:55:53] [SPEAKER_02]: aren't necessarily from this like ginormous AI model

[00:55:57] [SPEAKER_02]: that knows everything,

[00:55:59] [SPEAKER_02]: which it can't possibly know everything

[00:56:01] [SPEAKER_02]: or know anything really,

[00:56:03] [SPEAKER_02]: versus these smaller models that are very hyper focused

[00:56:08] [SPEAKER_02]: and driven to a specific use case,

[00:56:11] [SPEAKER_02]: but studied well around that or trained well around that.

[00:56:15] [SPEAKER_00]: Are most of those, this is my ignorance speaking,

[00:56:17] [SPEAKER_00]: are some of those, are some good number of those

[00:56:20] [SPEAKER_00]: still adaptations of the models that exist

[00:56:23] [SPEAKER_00]: of Mistral or of Llama? Yes.

[00:56:26] [SPEAKER_02]: Right, so there's- Yeah, and I think that's a big part

[00:56:29] [SPEAKER_02]: of its strength actually is

[00:56:31] [SPEAKER_02]: because it's a community effort to fine tune,

[00:56:35] [SPEAKER_02]: often to fine tune those models that are shared there.

[00:56:39] [SPEAKER_02]: So for example, you're gonna find

[00:56:41] [SPEAKER_02]: a bunch of variations of Llama

[00:56:44] [SPEAKER_02]: that are fine tuned to specific applications.

[00:56:47] [SPEAKER_02]: So that's, obviously that's a big reason

[00:56:49] [SPEAKER_02]: why the numbers are great.

[00:56:50] [SPEAKER_02]: It's, you're not gonna find nearly as many Llama-level models

[00:56:56] [SPEAKER_02]: as much as you're going to find Llama tailored to this

[00:56:59] [SPEAKER_02]: or tailored to that, that sort of thing.
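For context, pulling one of those community fine-tunes down is typically just a few lines with the transformers library; the repo ID below is a hypothetical placeholder, not a specific model mentioned here.

```python
# Sketch of loading a community fine-tune from the Hugging Face Hub.
# "example-org/llama-3-law-finetune" is a made-up placeholder repo ID.
# Install: pip install transformers torch accelerate
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="example-org/llama-3-law-finetune",  # swap in any real fine-tuned repo
    device_map="auto",
)

out = generate("Summarize the holding of this case:", max_new_tokens=80)
print(out[0]["generated_text"])
```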

[00:57:01] [SPEAKER_00]: Yeah. And so pre-trained.

[00:57:03] [SPEAKER_00]: I got it.

[00:57:04] [SPEAKER_00]: I put this in late among the stories

[00:57:07] [SPEAKER_00]: and I'll just mention it here.

[00:57:08] [SPEAKER_00]: So MIT has a spinoff called Liquid

[00:57:11] [SPEAKER_00]: and they put up their model last week

[00:57:15] [SPEAKER_00]: and they say it's already state of the art,

[00:57:17] [SPEAKER_00]: it's already better than Llama or whatever.

[00:57:18] [SPEAKER_00]: The important thing about it is it's not built on transformer.

[00:57:22] [SPEAKER_00]: So what's interesting to me is,

[00:57:23] [SPEAKER_00]: at Hugging Face a lot of those models

[00:57:25] [SPEAKER_00]: are built on the basis of transformer

[00:57:27] [SPEAKER_00]: and come from the roots of Google

[00:57:29] [SPEAKER_00]: as the founding fathers and mothers of that string

[00:57:33] [SPEAKER_00]: or the DNA of it stretches back

[00:57:35] [SPEAKER_00]: to Google and transformer.

[00:57:37] [SPEAKER_00]: So what's interesting here,

[00:57:38] [SPEAKER_00]: and I'm sure there's other things as well,

[00:57:39] [SPEAKER_00]: but that this one is out as a post-transformer,

[00:57:44] [SPEAKER_00]: non-transformer model.

[00:57:46] [SPEAKER_00]: And I'll never be technical enough

[00:57:48] [SPEAKER_00]: to be able to really understand

[00:57:49] [SPEAKER_00]: the pluses and minuses of any of these.

[00:57:52] [SPEAKER_02]: Yeah, that's I think my question.

[00:57:54] [SPEAKER_00]: But it's interesting to see the competition

[00:57:56] [SPEAKER_00]: is now existing even at that fundamental level

[00:57:58] [SPEAKER_00]: going back to square off against transformer

[00:58:02] [SPEAKER_00]: which I think is important in terms of the technology

[00:58:04] [SPEAKER_00]: just to say what if there's a better way than transformer?

[00:58:07] [SPEAKER_00]: Everybody's using transformer

[00:58:08] [SPEAKER_00]: because it was so amazing.

[00:58:09] [SPEAKER_02]: It's not like transformers are the be-all end-all.

[00:58:11] [SPEAKER_02]: No, no, not at all.

[00:58:11] [SPEAKER_02]: For sure, for sure.

[00:58:13] [SPEAKER_02]: It just happens to be the thing

[00:58:14] [SPEAKER_02]: that everybody is really going deep on right now

[00:58:18] [SPEAKER_02]: and getting a lot of traction

[00:58:19] [SPEAKER_02]: but maybe there are other ways to do this.

[00:58:22] [SPEAKER_02]: That's interesting.

[00:58:23] [SPEAKER_02]: Yeah, okay, because I did kind of read through that

[00:58:27] [SPEAKER_02]: initially.

[00:58:27] [SPEAKER_02]: Yeah, I couldn't discuss it,

[00:58:28] [SPEAKER_00]: maybe because I don't understand it.

[00:58:30] [SPEAKER_02]: Quite understand it enough to be like,

[00:58:33] [SPEAKER_02]: all right, yeah, huh.

[00:58:35] [SPEAKER_02]: Well, I'll be curious to see kind of where that leads to.

[00:58:37] [SPEAKER_00]: And if I could plug one more thing that I just,

[00:58:39] [SPEAKER_00]: I haven't read yet,

[00:58:40] [SPEAKER_00]: but the Stanford Cyber Policy Center,

[00:58:43] [SPEAKER_00]: which is really a bunch of really smart people.

[00:58:46] [SPEAKER_00]: I'm gonna be presenting my new book,

[00:58:48] [SPEAKER_00]: The Web We Weave coming out next week

[00:58:49] [SPEAKER_00]: at the Stanford Center.

[00:58:51] [SPEAKER_02]: Next week!

[00:58:52] [SPEAKER_02]: That's great, congrats.

[00:58:54] [SPEAKER_00]: So I'll be going to Stanford in early December

[00:58:56] [SPEAKER_00]: and then to the Commonwealth Club,

[00:58:58] [SPEAKER_00]: I'll push those tickets on you later.

[00:59:00] [SPEAKER_00]: But they have a bunch of really smart people

[00:59:01] [SPEAKER_00]: and they're associated with Hoover Institution

[00:59:03] [SPEAKER_00]: which is very conservative

[00:59:04] [SPEAKER_00]: and the Project Liberty Institute and others.

[00:59:07] [SPEAKER_00]: And they just came out with this project

[00:59:08] [SPEAKER_00]: called the Digitalist Papers.

[00:59:10] [SPEAKER_00]: And their question is,

[00:59:11] [SPEAKER_00]: what would the Federalist Papers say

[00:59:13] [SPEAKER_00]: if they were written in the 21st century?

[00:59:15] [SPEAKER_00]: So this is a lot of kind of philosophical stuff about AI

[00:59:18] [SPEAKER_00]: and we can't go into it now because I haven't read it

[00:59:21] [SPEAKER_00]: and we have no time.

[00:59:23] [SPEAKER_00]: But I think for those who are interested

[00:59:25] [SPEAKER_00]: in digging deeper in some ways,

[00:59:27] [SPEAKER_00]: there are some interesting authors here

[00:59:28] [SPEAKER_00]: from various political stripes,

[00:59:30] [SPEAKER_00]: a very conservative libertarian economist

[00:59:32] [SPEAKER_00]: next to a very liberal believer

[00:59:35] [SPEAKER_00]: in enabling democracy past money and so on,

[00:59:39] [SPEAKER_00]: trying to imagine the impact of all this.

[00:59:41] [SPEAKER_00]: So I just thought I'd give them a little plug if I may.

[00:59:44] [SPEAKER_02]: Yeah, indeed.

[00:59:45] [SPEAKER_02]: Thank you for that.

[00:59:46] [SPEAKER_02]: Is there a place people-

[00:59:47] [SPEAKER_00]: Oh sorry, yes, digitalpapers.com.

[00:59:49] [SPEAKER_00]: Oh sorry, digitalistpapers.com.

[00:59:54] [SPEAKER_02]: Papers.com, Digitalist.

[00:59:57] [SPEAKER_02]: Okay, I think I have it up here.

[01:00:00] [SPEAKER_02]: There we go.

[01:00:01] [SPEAKER_02]: 12 essays, 19 authors,

[01:00:02] [SPEAKER_02]: one overarching goal,

[01:00:04] [SPEAKER_02]: present an array of possible futures

[01:00:06] [SPEAKER_02]: that the AI revolution might produce.

[01:00:08] [SPEAKER_00]: And Nate Persily is from Stanford.

[01:00:11] [SPEAKER_00]: He's one of the organizers,

[01:00:12] [SPEAKER_00]: Erik Brynjolfsson who's very smart about this stuff,

[01:00:15] [SPEAKER_00]: Sandy Pentland and then a bunch of authors you know,

[01:00:19] [SPEAKER_00]: like Reid Hoffman, Lawrence Lessig,

[01:00:23] [SPEAKER_00]: Jennifer Pahlka, Eric Schmidt,

[01:00:25] [SPEAKER_00]: but some you may not know

[01:00:26] [SPEAKER_00]: and Eugene Volokh who's a libertarian.

[01:00:29] [SPEAKER_00]: So I'm gonna dig into this as time allows.

[01:00:33] [SPEAKER_00]: And this really interests me as you can tell on the show

[01:00:35] [SPEAKER_00]: because I'm not good at the technology

[01:00:36] [SPEAKER_00]: but I'm interested in the implications.

[01:00:38] [SPEAKER_00]: So I thought I'd just throw this out

[01:00:39] [SPEAKER_00]: in case other folks have anything

[01:00:41] [SPEAKER_00]: you find interesting in there, let me know.

[01:00:43] [SPEAKER_02]: Excellent, cool.

[01:00:45] [SPEAKER_02]: And I'm so excited that your book

[01:00:47] [SPEAKER_02]: is gonna finally be released next week.

[01:00:50] [SPEAKER_00]: October 8th, The Web We Weave.

[01:00:52] [SPEAKER_00]: I'm trying to get my son to give me the new webpage

[01:00:54] [SPEAKER_00]: so I can link to there.

[01:00:55] [SPEAKER_00]: But if you want to pre-order

[01:00:56] [SPEAKER_00]: before the pre-order is done,

[01:00:57] [SPEAKER_00]: web 20 is the code.

[01:00:59] [SPEAKER_00]: If you search for it and find the Basic Books site

[01:01:01] [SPEAKER_00]: for The Web We Weave by me, thank you very much.

[01:01:05] [SPEAKER_00]: All right, yeah.

[01:01:06] [SPEAKER_02]: Thank you.

[01:01:06] [SPEAKER_02]: And of course the Gutenberg or sorry,

[01:01:09] [SPEAKER_02]: not the Gutenberg.

[01:01:10] [SPEAKER_00]: Yeah, the Gutenberg.com.

[01:01:12] [SPEAKER_00]: Yes, like, oh I said that's the book.

[01:01:14] [SPEAKER_00]: The URL is Gutenberg.com, yes.

[01:01:15] [SPEAKER_02]: Gutenberg parenthesis to go to the website

[01:01:18] [SPEAKER_02]: and then the Gutenberg parenthesis.

[01:01:20] [SPEAKER_02]: Three, count them three books out now.

[01:01:23] [SPEAKER_02]: I know, you're so prolific.

[01:01:25] [SPEAKER_02]: I'm always so impressed by that.

[01:01:26] [SPEAKER_02]: Can't get rid of me.

[01:01:29] [SPEAKER_02]: Don't want to.

[01:01:30] [SPEAKER_02]: Thank you, Jeff.

[01:01:31] [SPEAKER_02]: Thank you, boss.

[01:01:32] [SPEAKER_02]: Always a pleasure learning

[01:01:33] [SPEAKER_02]: about artificial intelligence with you

[01:01:35] [SPEAKER_02]: and with those of you watching and listening,

[01:01:38] [SPEAKER_02]: you too as well.

[01:01:39] [SPEAKER_02]: Everything you need to know about this show

[01:01:40] [SPEAKER_02]: can be found at our site.

[01:01:42] [SPEAKER_02]: Go to aiinside.show.

[01:01:45] [SPEAKER_02]: You can find video versions of the show.

[01:01:46] [SPEAKER_02]: You can find subscribe links

[01:01:48] [SPEAKER_02]: so you can subscribe to the podcast

[01:01:50] [SPEAKER_02]: in whatever podcatcher you like.

[01:01:53] [SPEAKER_02]: It's all listed there.

[01:01:54] [SPEAKER_02]: You can also link out to our Patreon

[01:01:56] [SPEAKER_02]: at patreon.com slash aiinsideshow

[01:02:00] [SPEAKER_02]: where you can share your support of this show

[01:02:04] [SPEAKER_02]: and everything we're doing.

[01:02:05] [SPEAKER_02]: You get ad-free shows, a Discord community,

[01:02:07] [SPEAKER_02]: regular hangouts.

[01:02:08] [SPEAKER_02]: You get an AI inside t-shirt

[01:02:11] [SPEAKER_02]: if you become an executive producer

[01:02:13] [SPEAKER_02]: of which I think we have five right now.

[01:02:16] [SPEAKER_02]: Dr. Du, Jeffrey Maricini, WPVM 103.7

[01:02:19] [SPEAKER_02]: in Asheville, North Carolina,

[01:02:21] [SPEAKER_02]: Paul Lang and Ryan Newell.

[01:02:23] [SPEAKER_02]: Y'all are awesome.

[01:02:24] [SPEAKER_02]: Everyone who supports us on Patreon is awesome.

[01:02:27] [SPEAKER_02]: So thank you for that.

[01:02:29] [SPEAKER_02]: We really do appreciate it.

[01:02:31] [SPEAKER_02]: And yeah, I think that's all there is to it.

[01:02:33] [SPEAKER_02]: I'm gonna go ahead and wrap this up

[01:02:36] [SPEAKER_02]: and hit the road and go camping for the evening

[01:02:38] [SPEAKER_02]: with my daughter and her friend.

[01:02:40] [SPEAKER_02]: So on the other side of camping,

[01:02:42] [SPEAKER_02]: I guess I'll see you next week.

[01:02:43] [SPEAKER_02]: Yeah, thank you for being here.

[01:02:45] [SPEAKER_02]: Always fun.

[01:02:46] [SPEAKER_02]: We'll see you guys next time

[01:02:47] [SPEAKER_02]: on another episode of AI Inside.

[01:02:50] [SPEAKER_02]: Take care y'all.

[01:02:51] [SPEAKER_02]: Bye.