Jason Howell and Jeff Jarvis dig into Anthropic's back-to-back data leaks exposing Claude Code source and a secret frontier model called Mythos, OpenAI killing its adult chatbot and shuttering Sora, a record $122 billion funding round ahead of IPO, Apple letting third-party AI plug into Siri, university students fighting AI with typewriters, quantum researchers warning encryption could crack sooner than expected, and new AI video models from Google and ByteDance.
Note: Time codes subject to change depending on dynamic ad insertion by the distributor.
Chapters:
0:00:00 - Start
0:09:31 - Claude Code's source code appears to have leaked: here's what we know
0:17:59 - Exclusive: Anthropic acknowledges testing new AI model representing ‘step change’ in capabilities, after accidental data leak reveals its existence
0:20:57 - Can we talk for a second about my time with Claude Cowork?
0:38:09 - The Sudden Fall of OpenAI’s Most Hyped Product Since ChatGPT
0:43:53 - OpenAI closes record-breaking $122 billion funding round as anticipation builds for IPO
0:45:13 - Apple Plans to Open Up Siri to Rival AI Assistants Beyond ChatGPT in iOS 27
0:50:26 - College students are writing with AI – but a pilot study finds they’re not simply letting it write for them
0:54:17 - University students fight artificial intelligence with vintage typewriters
1:02:14 - Exclusive: Anthropic acknowledges testing new AI model representing ‘step change’ in capabilities, after accidental data leak reveals its existence
1:05:49 - Google commits to video generation, announces Veo 3.1 Lite
1:06:45 - ByteDance's new AI video generation model, Dreamina Seedance 2.0, comes to CapCut
1:08:03 - Meta launches two new Ray-Ban glasses designed for prescription wearers
1:10:14 - Google Gemini now lets you import your chats and data from other AI apps
1:12:50 - Bluesky leans into AI with Attie, an app for building custom feeds
00:00:00:04 - 00:00:26:00
Unknown
Coming up next, Jeff Jarvis and I dig into Anthropic's back-to-back data leaks, including a leaked frontier model called Claude Mythos. Very mysterious. I dive deep into my time and work with Claude Cowork, OpenAI's record $122 billion funding round as it inches closer to IPO, and new research suggesting quantum computers could crack today's encryption far sooner than expected.
00:00:26:00 - 00:00:41:12
Unknown
That's coming up next on this episode of the AI inside podcast.
00:00:41:14 - 00:01:02:26
Unknown
Hello everybody, and welcome to another episode of the AI Inside podcast, the show where we take a look at the AI that is layered throughout the world of technology and all those recurring themes that come up. I think this week the recurring theme is Anthropic doing awesome but fumbling the ball a little bit, and then a little bit of OpenAI, and so much other stuff to talk about.
00:01:02:26 - 00:01:26:05
Unknown
I'm one of your hosts, Jason Howell, joined as always by Jeff Jarvis in the office. Hey hey, hey, hey, sir. Great to see you, too. Happy to be here. We even have a little quantum in today's show. Yeah, I added it last minute. I know, totally. Because I was watching this recording, on Wednesday, in the Supreme Court on birthright citizenship.
00:01:26:05 - 00:01:46:23
Unknown
This has nothing to do with AI, but I'll get to the point in a second. Trump actually showed up at court, and I envisioned him sitting there. It would be like me sitting in a graduate-level quantum mechanics course at MIT. Wouldn't understand a thing. Right? So we have quantum today; doesn't mean we can explain it.
00:01:46:25 - 00:02:08:21
Unknown
I mean, I will certainly try. I understand it, you know, again, with quantum, I understand it on a higher level, but in the weeds and all the details, that's where I'm still murky. Yeah. And, you know, I have to say, there was a time when I felt like that about AI too. Like, I remember when we were at Twit, because it wasn't that long ago, like two and a half years ago, prototyping this show.
00:02:08:21 - 00:02:29:28
Unknown
And I think you were out one of the days when we were doing it for the Club Twit audience. And so I ended up bringing my friend Mike Wolfson on, and he did a deep dive into what tokens were. And I remember at that time just being so confused. Like, I think I understand, like, I kind of get it, but I don't.
00:02:30:05 - 00:02:47:21
Unknown
And now we have a broader knowledge. Yeah, totally. And so I'm sure it'll be that way with quantum. Maybe it will be Quantum Inside somewhere in the future. You know, I'm working on a project for this with the Alan Alda Center for Communicating Science at Stony Brook. It's one of my gigs.
00:02:47:23 - 00:03:09:10
Unknown
And I think that the ability of technologists and engineers and scientists to communicate needs work. And the Alda Center, which was founded by Alan Alda himself, is mainly about training scientists to communicate by empathizing with the audience's needs and understanding where they come from, and so on and so forth.
00:03:09:13 - 00:03:30:03
Unknown
But I think it goes beyond science, into technology and engineering. I think that a lot of the panic we see around technology... So, for the sake of the show, yesterday, and I didn't tell you this, I went to the AI doc movie. Oh, wait a minute. You said on the last episode you didn't want to go.
00:03:30:03 - 00:03:54:02
Unknown
There was no way you were going to touch that with a ten-foot pole. All right, all right. Truth is, I haven't had movie popcorn in six years. Oh, it's so good. Yeah, but it's so bad for me. You know, I've lost 28 pounds on my broken-back-and-sepsis diet, so I figured, okay, popcorn. I'm still freaked out about theaters, that I won't be able to stay away from everybody, because Covid is still around, but nobody's going to...
00:03:54:02 - 00:04:15:20
Unknown
This thing? Oh, yeah, there were three of us in the audience, which was, in fairness, perplexing. That was a theater, midday on a Tuesday, but it was half-price Tuesday. I got the tickets for half price. So you didn't have to pay full price. Plus the senior deal. You know, one bad day and you're stacking discounts. Yeah. Yeah.
00:04:15:23 - 00:04:35:07
Unknown
So, I won't bore you and the audience with the details of it. This guy comes along, and he doesn't know what the heck AI is, just like we were talking about. And so he goes to everybody, and they go down the full doomer line, and he says, well, but surely there's people out there who like this stuff.
00:04:35:07 - 00:05:01:21
Unknown
So then he goes to the folks on the other side, and then he comes back around, and in the end it's fairly obvious. My point finally being that I think the fact that this movie could be made is an outcropping of the failure of the technologists in our industry, the industry we cover, to explain what things are, and to do so in a way that matters in people's lives, which they understand from their own perspectives and their own lives.
00:05:01:23 - 00:05:19:16
Unknown
And so that they can get an idea of how it works and what it means to them and so on. This is not difficult. It's why we're here, because we want to understand this. And as we learn what tokens are, we can then try to explain them better. If we ever learn what a qubit is, we can get there too.
00:05:19:19 - 00:05:44:20
Unknown
Right. But I think it's also where tokens were for me, like, two and a half years ago. Exactly. Well, I think it's a little harder, right? It's more of a quantum leap from there, if you will, in terms of degree of difficulty. But that's why the show exists, because the people who build this stuff, I think, are bad at communicating it. And yeah, the fact that a documentary like that is going to the theater is just sort of perplexing to me.
00:05:44:20 - 00:06:02:09
Unknown
Like I said last week, it feels like a Netflix documentary. And I guess Netflix makes movies that make it to the theater; they don't typically last very long there. It's kind of like a formality or something. I just don't know who the audience is for an AI documentary in theaters, or why they thought it needed to be in the theater.
00:06:02:10 - 00:06:30:02
Unknown
Like, I think it's rather like Bologna, the movie. Which movie is that? Melania. Oh, Melania. Okay. Yeah, right. I thought you said Bologna, and I was like, I've never really... But next time I hear AGI, that's bologna. Yeah, that's overloading it. So, I think that some people wanted it to be in theaters for the visibility. It's funded in part, I think, by one of the doomster cultural groups.
00:06:30:05 - 00:06:44:22
Unknown
And I think they're underwriting the hell out of it for it to be there. And if it's there, then, you know, it makes it seem as if this is important. And then, yeah, right: I took my family to see this documentary, and oh my, my, I was scared about the future. We must do something.
00:06:44:28 - 00:07:03:03
Unknown
Yeah. Did you find anything redeeming about the documentary? Like, were you like, you know what, that's a good point, or anything like that? No. I think it lacked nuance. But I give it credit: it did the doomsayers and then it did the optimists. But it presented them as kind of extremes.
00:07:03:06 - 00:07:20:09
Unknown
And of course, the thing I was waiting for did come at the end. I'll give it credit for this: what was left out all along the way was our own agency, our own power to use this as we will. Same with the internet, same with any technology. In the end, it's up to us, the public, and how we use it.
00:07:20:11 - 00:07:43:27
Unknown
Yes, with all the limits of the problems of companies and greedy billionaires and incompetent governments and whatever. But we still do have agency, and that's the moral at the end of it all. Finally; it takes a while. Okay. The popcorn was good? It was. It was better than I remembered. It was very good. You know, the last movie I went to, I got popcorn and they didn't have butter to put on it.
00:07:43:27 - 00:08:01:21
Unknown
Like, you couldn't actually add butter. For your own good, Jason. I was like, what? What kind of theater is this? You don't have butter? You don't even have the option to have butter on the popcorn? That's so weird to me. You know how much oil is in popcorn already? Oh, well, let me tell you this, and then we'll actually get to the AI news.
00:08:01:23 - 00:08:19:10
Unknown
I worked for years, around high school and out of high school, for probably like five or six years, in a movie theater right around Boise. And at a certain point I was, like, manager of a movie theater: the Egyptian movie theater in Boise, Idaho. The single-screen Egyptian movie theater, very similar to the LA theater or whatever.
00:08:19:13 - 00:08:40:08
Unknown
And so I saw how these theaters are run for many years, and we always referred to it as buttery-scented oil. Yeah, that's basically what it is. It's oil that smells kind of like butter, but it's just straight oil, you know? And I don't know, did you pop the popcorn yourself, or did you buy the huge gigantic...
00:08:40:13 - 00:08:58:27
Unknown
Oh, we popped the popcorn. Yeah, yeah. I would make what was called a mama's batch. I called it Mama's Batch, which was like taking the... there was like a scoop of this very thick oil that you had to put into the thing with the popcorn, and I would do two of those instead of one.
00:08:58:27 - 00:09:22:00
Unknown
Well, and so the popcorn would come out and it was, like, nuclear to look at. And I was like, yeah, that's a mama's batch right there. Hey, you're only young once. Exactly. It was great then. I'd probably barf now, but it was great then. I don't know. Another week I'll tell you about my experience at Ponderosa Steakhouse, but let's get to the...
00:09:22:00 - 00:09:46:17
Unknown
Yeah. Okay. We'll save the Ponderosa Steakhouse story for next week's intro. We just have to remember to do that, because I'm very curious. Let's talk about Anthropic. I've got Anthropic on the mind. I'll talk about Claude Cowork here in a little bit, just because I've been using it a lot and I've got thoughts. But before we talk about that, a few things happened with Anthropic this week.
00:09:46:17 - 00:10:17:09
Unknown
Kind of a rough week for Anthropic, timed poorly as well, because I feel like there's a lot of positive momentum happening with Anthropic right now. They have a court decision in their favor when it comes to the Pentagon. That's right. Their usage is way up. Their public perception is positive. Like, you know, again, I use my wife as kind of the basis for this, because, I mean, I do an AI podcast, but we don't talk regularly about all the deep details of AI.
00:10:17:10 - 00:10:34:27
Unknown
She asks me questions here and there, but I'm hearing her all the time, like: I spoke to so-and-so today, and he's, you know, all in on Claude, Claude Code, and he's building the blah, blah, blah. And she's starting to spit out all this Anthropic knowledge. I'm like, man, okay. It's out there right now.
00:10:35:00 - 00:10:57:25
Unknown
Yeah. The news is getting around. Well, they had a bad week in other ways, and it all seems to do with leaky data, information that they would rather not be out in the public sphere. But, hey, what do you know? It happens. Speaking of Claude Code, which I just mentioned, Anthropic itself accidentally exposed its full Claude Code
00:10:57:25 - 00:11:28:15
Unknown
TypeScript source code. And this is thanks to an npm package (npm stands for Node Package Manager) that included inside of it a source map pointing out to an unobfuscated zip archive that was hanging out in Anthropic's R2 cloud storage. That zip file, which people could read because it was unobfuscated, contained more than 1,900 files and more than 512,000 lines of code.
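To make the mechanism concrete, here's a hedged sketch of why a stray source map is a leak. This is not Anthropic's code; the file names and contents below are invented. But the pattern, a `sourceMappingURL` comment whose map embeds the original source in a `sourcesContent` field, is how a published bundle can give away the unminified files behind it:

```javascript
// Hypothetical illustration: a minified "published" bundle, as it
// might ship inside an npm package.
const bundle = 'const a=(n)=>n*2;\n//# sourceMappingURL=bundle.js.map';

// The companion .map file the publisher forgot to strip. Real maps are
// emitted by bundlers; this one is hand-written for illustration.
const sourceMap = JSON.stringify({
  version: 3,
  file: 'bundle.js',
  sources: ['src/double.ts'],       // original TypeScript paths...
  sourcesContent: [                 // ...and their full contents
    'export const double = (n: number): number => n * 2;',
  ],
  mappings: 'AAAA',
});

// What a curious reader does: follow the comment, parse the map,
// and read back the original, unminified TypeScript.
const mapRef = bundle.match(/\/\/# sourceMappingURL=(.+)$/m)[1];
const map = JSON.parse(sourceMap); // in reality: fetch(mapRef)
for (let i = 0; i < map.sources.length; i++) {
  console.log(`--- ${map.sources[i]} ---`);
  console.log(map.sourcesContent[i]);
}
```

In the real incident the map reportedly pointed out to an external archive rather than embedding the source inline, but the effect is the same: the pointer in the shipped artifact hands readers the original code.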
00:11:28:17 - 00:11:53:14
Unknown
It, you know, detailed major internals: the instruction sets for Claude's LLM engine, some 40 agentic tools, a bunch of flags referring to unreleased features. So, future roadmap, which is really useful for competitors to know, so they can start asking: do we have something like that in the pipeline? Maybe we should, because they're working on it.
00:11:53:17 - 00:12:15:05
Unknown
So yeah, to detail how I thought about this: what got leaked, or what got out there, is more or less the execution layer that sits on top of the knowledge layer. The knowledge layer is the AI model itself and all the things that it does, that black-box sort of thing.
00:12:15:05 - 00:12:32:17
Unknown
This is kind of like the code that surrounds that and directs it, let's say. Yeah. Well, A, I'm glad that Anthropic came out and said mea culpa. Yeah, they had bad security; it was human error, and that's how it happened. They were open about it. They tend to be that way.
00:12:32:17 - 00:12:54:10
Unknown
That's good. It did not include the weights. There's a lot of stuff that wasn't in it. Oh, okay. So that was the next layer down. Yeah. Yeah, that's very important. As some folks said, you know, a lot of this, the people who know what they're doing had imputed anyway; there weren't, you know, huge revelations in what existed in this.
00:12:54:12 - 00:13:00:08
Unknown
What's interesting to me is that they went after,
00:13:00:10 - 00:13:17:22
Unknown
They did a copyright takedown request to GitHub for more than 8,000 copies and adaptations of the Claude Code instructions, per the Wall Street Journal. And...
00:13:17:24 - 00:13:37:16
Unknown
You know, I wonder about that. I guess it's GitHub's own policy to say, yeah, this shouldn't be there, okay, we'll take it down, out of respect for you, Anthropic. But, you know, from a journalist's perspective, if we got it, we could make the journalism argument that it's fair game.
00:13:37:19 - 00:13:57:11
Unknown
It was out there in public, so it's fair game to have it. And we could in turn report it, report on it, and share it, for that matter. So I think it's more a matter of empathy with Anthropic: oops, somebody screwed up, we'll help you get rid of this stuff. But it's out there. Yeah, yeah. They're filing copyright takedown requests on this.
00:13:57:11 - 00:14:14:03
Unknown
But that's a request. I think that if I let it be known that I had it, and they came to me and said, you should take it down, I'd probably say no. It was in public, I got it, I didn't steal it, I didn't cause it to be stolen.
00:14:14:05 - 00:14:39:03
Unknown
Yeah, I have it. Well, okay, so, to summarize: as we say in Jersey, it fell off the truck. The code fell off the truck. If a load of books stopped in front of my house and books fell off, and I went and got one, I didn't steal it, right?
00:14:39:05 - 00:15:03:29
Unknown
It wasn't stolen; it was lost. I now have it. No, so I think what's interesting about the story is that Claude was having such a good month after, yeah, I know, being at the center of this whole huge fight with the Pentagon, which it turned to its advantage, and people discovered it, and it's growing, and it won a court case, for now.
00:15:04:02 - 00:15:25:24
Unknown
Or an injunction, for now. It's just, well, even those with luck have bad luck. Yeah. But I do also wonder how much this really impacts anything. It's embarrassing, that's it. And, you know, it's embarrassing at a moment when there's a lot of good energy and goodwill and all that stuff kind of flowing Anthropic's way.
00:15:25:24 - 00:15:42:17
Unknown
So it's kind of a poorly timed, embarrassing incident. But, yeah, like you said, it doesn't seem to me that this source code is so much of the secret sauce that, like, oh, now everybody has a Claude Code, you know, that's just as powerful as Claude Code. There's still so much here that we don't have.
00:15:42:21 - 00:16:07:13
Unknown
Yeah, source code is just one layer. You know, I think there are some negative impacts from the fact that there were a lot of feature flags in here. So if Anthropic was holding on to features to, you know, kind of surprise-release somewhere down the line, or beat other models to some of these features, you know, now everybody kind of has a general sense of some of the directions they're working in.
00:16:07:13 - 00:16:28:16
Unknown
So that could be a little damaging, I suppose, you know, some of those things. There was Kairos, which is an always-on, autonomous daemon mode running Claude Code in the background. So it's just kind of always on, running in the background. I haven't used Claude Code, so, you know, that seems like something I would have assumed it already had, but I guess not.
00:16:28:19 - 00:16:43:16
Unknown
They make a big deal about how long this thing can run. So I think that's probably a big deal, in the sense that you can keep it running all the time. Yeah. I mean, it doesn't just wake up; it's going, right? It just keeps going. Yeah. So that's a feature.
00:16:43:18 - 00:17:15:12
Unknown
Yeah. Undercover mode, letting Claude Code make stealth commits to public repositories without mentioning Anthropic or any internal info in commit messages. Okay. Yeah. And I wonder on that one: is that just because, if people use Claude Code to create their own things, it would essentially be useful to remove any mentions of Anthropic or Claude Code from that code, so that they could represent it as their own code?
00:17:15:12 - 00:17:40:29
Unknown
I guess that's what you get out of that, I suppose. Yeah. Or is it that they're trying to cover tracks or something? I don't know that that's the conclusion I would jump to, though. And then another one, buddy, this is a weird one: a Tamagotchi-style terminal pet with stats like chaos and snark for personality.
00:17:41:01 - 00:18:01:17
Unknown
Okay, what is that? Is that like a demo mode? I don't know what that is. So, some of these are probably more impactful than others, and I know there's more; I just pulled out a few for you. Yeah. Maybe they'll have a better one next week. Yeah. Well, I mean, and that isn't even all of it.
00:18:01:17 - 00:18:27:12
Unknown
There's another, separate data leak of unsecured content. Which one? Which story is it? Is this one? Sorry, I pulled it up, and now I'm losing it. There we go. Here it is. This is unfortunate. Anyways: a separate data leak, an unsecured content management system cache, that exposed around 3,000 unpublished assets, much of it internal-looking.
00:18:27:12 - 00:18:51:10
Unknown
So this is more foundational, on the company side of things, how the company is operating. There's a draft blog post in that cache that was an announcement of a new frontier model called Claude Mythos. The draft post refers to it as a step change in capability. And so basically, to put it differently, you've got Opus right now.
00:18:51:10 - 00:19:18:25
Unknown
Opus 4.6 is like their top model. This is the best that Claude can offer, at least to my knowledge: Opus 4.6. Mythos would be yet another on top of that. So you've got your Sonnet, you've got your Haiku, you've got your Opus. This isn't like the next version of Opus is Mythos. There's a new frontier model that is even better, more capable than Opus, on top of it, above it, called Mythos.
00:19:18:27 - 00:19:35:06
Unknown
And so, you know, knowing that that's coming down the line... I mean, we know they're constantly doing this, of course; we know they're constantly working on improving things. I think maybe the news here is that it's not just that they're improving the Opus model. They're actually going one step further, and there's pressure on them to have the next whole model out.
00:19:35:06 - 00:19:55:12
Unknown
It's Vera Rubin versus Blackwell, to put it in Nvidia terms. Right. Yeah. There you go. Yeah. I'd probably use a car model example, but it's entirely new, a new show instead of just a new episode. Yeah, yeah. It's Model Y versus Model X, I don't know.
00:19:55:13 - 00:20:19:09
Unknown
Yeah, maybe we don't put it in Tesla terms, but, you know, a different make and model. Apparently, though, this model, Mythos, is being tested with a limited number of security-minded people, early testers, because Anthropic sees the model as, and I think this is a direct quote, "far ahead of any other AI model in cyber capabilities."
00:20:19:09 - 00:20:39:23
Unknown
Their warning from this blog post was that it could power large-scale software exploit campaigns, and so they want to start the rollout with the security community, so that they can harden code bases before attackers get some of these tools themselves. Oh. So, okay.
00:20:39:25 - 00:21:07:20
Unknown
It's too powerful, Jeff. It's just too powerful to handle. The world can't handle the power of Mythos, but we're going to release it anyway. Whatever. So that's Anthropic, at least a couple of stories impacting the company right now. And I thought I would just take a second to mention that I've spent the last week, week and a half now, getting to know and work with Claude Cowork.
00:21:07:20 - 00:21:25:19
Unknown
Now, have you? I know you've been using Claude. Not as much as I should, because I've been working on some other projects, but I need to dig in. I'm eager to hear your experience. I think maybe you start playing around with Cowork for some of those projects, for some of the writing projects that you have, because I think it would be a really great space for that.
00:21:25:22 - 00:21:45:01
Unknown
It requires the Claude app installed, or at least I installed the app, and I have Cowork running on the computer, because the whole idea is that it can integrate into the computer and the files on your computer, and you can also plug it into, like, your Gmail or your Drive or Notion or any of these things.
00:21:45:07 - 00:22:08:05
Unknown
You choose what it connects to. But I have it in the app, and I have it running on my Mac. I don't know if they have a PC version of it. And I'm guessing, I'm sure, that I don't have a Chromebook version? No, I do not believe they have a Chromebook version. I guess you could run projects in Claude, via the cloud.
00:22:08:08 - 00:22:27:06
Unknown
But you wouldn't get the Cowork aspect, right? The Cowork aspect is, you know, really powerful. We've talked about it on the show the last couple of months. There have been some Wall Street panics related to Cowork, because they'll release a certain, you know, skill or plug-in, and everybody freaks out that, oh my goodness...
00:22:27:08 - 00:22:45:01
Unknown
You know, existing tech software is not going to be needed anymore because of Cowork. I don't know that I've come to that conclusion in my short amount of time with it. But I will say that there's an organizational quality, and I'm not going to show it, because it's tied to projects that I'm working on that I don't feel comfortable just showing.
00:22:45:03 - 00:23:11:28
Unknown
So I'll just kind of explain it to you. It has an organizational approach around projects inside of Cowork that I find works really well for my brain. Because sometimes, when I've been using these different LLMs, like ChatGPT or Gemini or whatever, it's like I'm always opening a new chat, and there may be a way to organize it in there, but it's not integrated in a way that I automatically or instinctually do that.
00:23:11:28 - 00:23:44:11
Unknown
And so things just get really messy, and I've had to, like, start a notepad for different projects and always put my chat links in there so I can get to them fast, and all that stuff. But in Cowork, basically, the way it's structured, and I think it kind of works like this with projects in Claude as well, which is separate from Cowork but kind of similar, is that you start a project. And your project might be, I'll just put it in terms I'm working with right now: if I'm doing podcast consulting for a client, I would start a project. And you can hire Jason to do that.
00:23:44:11 - 00:24:04:07
Unknown
Let's make that clear to the whole world, by the way. But go ahead. Yes, maybe I'm detailing a little too much right now how I do some of the work, but I'm playing around with it just to get a sense and understand how it works and what it can be good for. I would create a project around, like, consulting, and then I would load it with a bunch of files that act as my rules.
00:24:04:07 - 00:24:29:17
Unknown
You know, the goals, the processes, all of these things that I put in there. And if you're running Cowork on your machine, it actually gives you a file folder. That's a local folder. So if I'm not... But you can call upon it. Yes, exactly. If it's within the structure of the Cowork instance, that project instance, it automatically pulls from it.
00:24:29:17 - 00:24:51:19
Unknown
So if I just drag files of context into that folder, they automatically get tied into the context of that project. And if you're really used to working in a file sort of way on your computer, this ties into that in a really unique way. And, you know, I don't have to drag it into the chat window and be like, analyze this for blah, blah, blah.
00:24:51:19 - 00:25:11:03
Unknown
I just kind of move it in there, and it analyzes it, because it's part of the folder, and it becomes part of the understanding of the entire project. And then within that project, I fire off different chat sessions that are tied to the project, that understand the context of the other chat sessions of the project, but also continue to build inside of them.
00:25:11:03 - 00:25:26:10
Unknown
So you have to tell it to remember one session? No, I haven't had to do that. Because it's inside of the project, it just kind of knows: all of this is fair game, it's all tied to this. And your prior chats are recorded, in essence.
00:25:26:10 - 00:25:44:29
Unknown
Yeah. Yes, and they're sorted and they're easy to get to, and yeah, it's just really nice. All right, so I have a taxonomy-of-people question for you. Okay. We're different. You, Jason Howell, are a very organized person. I try to be. You try to be and you succeed. You have a list.
00:25:44:29 - 00:26:03:22
Unknown
In some ways. The rundown. So, to compare this to Twit: the rundown that Jason prepares for the show, he always does a great job of it, and we go through the rundown story by story. Versus at Twit, there's a whole bunch of stories, and you never know what story is going to be next until Leo says, oh, did we do that? Should we do that? I don't know, right?
00:26:03:22 - 00:26:24:17
Unknown
What I don't know, right. Jason, schedules himself. Jason, you are an organized human being. I am not yeah, I am not. So what I wonder. And so the one time in my career when I had this system, I think I've said this on the show, I didn't know what to ask her to do because I am right.
00:26:24:19 - 00:26:44:25
Unknown
Do you think this works for both personality types? I mean, yes, I do. Because, as organized as... you're giving me a lot of credit for my organizational skills versus how I feel about my organizational skills. I feel like, with many things in my life, I know just enough to be dangerous.
00:26:45:02 - 00:27:12:21
Unknown
And so I think I'm organizational, sort of, but that can sometimes break down, and it can let me down. I think this can work for that. Because if you get into the habit of thinking of the work that you're doing in terms of projects, like, for example, if you're writing a book about this particular thing or whatever, that becomes a project, and then any task you ever have to do related to that book is just done from within that project.
00:27:12:24 - 00:27:30:16
Unknown
Then every little task that you do benefits from the work that you've done in all the other tasks that are related to it. And it doesn't get confused by any of the other work that you're doing outside of that project. Is there a limit on how much you can put in a project? There is, but I haven't come anywhere close.
00:27:30:16 - 00:27:45:29
Unknown
Like, I've loaded one project with a bunch of files, and I think I'm at like 3%. You know, you would probably put it through a lot more paces than I would, considering how you approach your research for your books and stuff. But yeah, it's really cool. I've got to say, I'm really impressed.
00:27:46:02 - 00:28:05:14
Unknown
So if I wanted to use this, and I can't on my Chromebook, which is, by the way, a bummer. It's a bummer. It's bad for the Chromebook in the sense that a lot of this just won't work locally on it. On the other hand, from a security perspective, if you're a school or a company or somebody, I think it gives a burnish to the Chromebook: like, we can't run all that crap, don't worry.
00:28:05:14 - 00:28:27:24
Unknown
Don't worry. Anyway, if I wanted to run that, would a Mac mini be sufficient? Is it demanding, or is it really just connecting to things locally, with all the processing happening elsewhere? Yeah. No, you're not running these massive models on your machine locally.
00:28:27:26 - 00:28:50:01
Unknown
It is commanding your Anthropic account at whatever paid tier you're at. Which, by the way, I haven't run into any of my caps yet, and I'm on, I think, the Pro account. If I did start running into those caps, with the benefit that I'm getting out of it at this point, I would seriously be enticed to upgrade.
00:28:50:06 - 00:29:17:25
Unknown
Because, if you're listening, Anthropic: don't charge him more. Don't do it. Yeah, it's valuable. Totally. And there is a part of me that's like, okay, am I leaning too much into this one tool? But then at the same time, I've got a lot of stuff I need to do, and if this one tool is doing it, and I feel more organized, and the quality of the work on the other side is as good as I'm seeing so far, then maybe that concern breaks down.
00:29:17:25 - 00:29:37:20
Unknown
That's one question I have: as I build these projects and they become bigger and bigger, does some of that consistency start to break down? I haven't encountered that yet, but I don't know. I want to genericize this a little bit in terms of what functions are proving to be valuable. Is it organization?
00:29:37:20 - 00:30:06:02
Unknown
Is it expression, that is to say, writing? Is it inspiration, brainstorming? What is it that you're finding valuable? Organization is one big thing: having some sort of organized, kind of windowed area to put the right things in the right places, and being able to tap into plugins and skills, which I haven't even talked about because I haven't really explored them on a deeper level yet.
00:30:06:05 - 00:30:40:19
Unknown
But that brings very specialized knowledge into how it works with these things. And just analysis: a lot of what I'm doing right now, especially with podcast consulting, is taking large amounts of data. I want to understand that data from a wide view, and then I can dive in and understand it on a deeper level, figure out how I want to tackle things and solve problems with it. And being able, in a single chat, to just say: here's all the data that I have for this very specific need, synthesize this, and, you
00:30:40:19 - 00:31:11:29
Unknown
know, Claude is very good at this from a visual sense. So spin up a spreadsheet, or spin up a graph that visualizes what this is, so I can understand it better. Stuff like that is proving to be really helpful so far. You know, this is all somewhat related: I was thinking just the other day, as you were talking about the ability of Claude to make visualizations and such, about a story about, I think, a Fortune writer who's using AI to write a whole bunch of stories, and he's writing more stories than everybody else, and people
00:31:11:29 - 00:31:33:28
Unknown
who are journalists are agog and aghast and peeved. One of my, what am I, trite, glib lines is that a story is just another form of data visualization. Yeah. We don't think anything of that, and so one could argue that it doesn't have to be hand done by the human being, which is where I would lean as well.
00:31:34:01 - 00:31:50:16
Unknown
However, when Excel came out and it could suddenly draw a bar chart or a pie chart for you: didn't you used to have to draw that circle by yourself and color it in by yourself? Was that cheating? No, of course not. It visualized the data, and it did it automatically based on what you gave it.
00:31:50:19 - 00:32:07:27
Unknown
How different is that, really, in simple cases, from it writing an email for you, or writing emails for you? I don't think it's that different. Totally. Yeah, I think that's where we're defining these things, and our own comfort, as we go. Right. Nor is it stealing jobs. Back in the day, and I'm going back to what I was doing in
00:32:08:00 - 00:32:31:13
Unknown
Hot Type, available for preorder right now, a book: my wife over there brought out all her files from when she did the first major Macintosh network to produce a magazine, a weekly. And in there was a slideshow that she gave at Seybold Seminars, which was a big deal in publishing. And it reminded me: there were actual slides to go into a Kodak carousel.
00:32:31:13 - 00:32:52:11
Unknown
Yeah, yeah. Right. Carousel slides, somewhere. Yes, exactly. And somebody in a graphic design department in the company had to make them. They had to draw the circles; they had to color in the lines. Right, right. And so that stuff got eliminated a long time ago. So Claude comes along, Cowork comes along, and does more of that.
00:32:52:11 - 00:33:12:14
Unknown
It doesn't strike me as a quantum leap, pardon me, but a progression. Yeah, it's the next step. And what I don't know, and I feel like we talk about it a lot, is that because it's, air quotes, AI that is being applied to do these things,
00:33:12:14 - 00:33:42:13
Unknown
it has a stink. If, ten years ago, a piece of software that didn't have the stink of AI had done some of these things, it would have been revolutionary. It would have been like, oh my goodness, you mean I don't have to do that anymore? That's smart software, right? And so sometimes I think the capabilities of these things get mixed in with the wag of the finger at AI, which isn't undeserved to a certain degree, but that's a different playing field.
00:33:42:13 - 00:34:00:06
Unknown
But the capability itself is pretty amazing. And yeah, I've certainly run into that, even using this for podcast consulting. Again, it's like: what would I be doing if I didn't have these tools? I'd be doing a lot more manual work. I'd still be doing the same work, but it would be more manual.
00:34:00:08 - 00:34:19:14
Unknown
And so there's been a little bit of guilt that I've been contending with, where I'm like, well, okay, how much of this is me and how much of this is the tool? But then again, it comes down to: yeah, but you bring your knowledge and your experience to the tool, and that collaborative process is the value.
00:34:19:16 - 00:34:38:01
Unknown
Yeah. I can envision you, and I don't know what Jason's doing exactly with this, but I can envision a spreadsheet filled with download numbers, with ad revenue numbers, those kinds of things, where you look at it and say: inspire me, find me a pattern, find me wins. Totally. Yes, exactly that. Exactly that.
00:34:38:07 - 00:34:57:11
Unknown
The other thing that struck me out of watching the AI doc was the way that he tried to explain AI simply. He had a screen of people repeating the same word: it finds patterns. Yes. That's what it does, and it's really good at that. And then it struck me, and this is a new definition for me.
00:34:57:12 - 00:35:23:00
Unknown
I'll try this out on you: creativity is breaking the pattern. Oh yeah. Right. So what AI does, I think, is learn what's been there before, and that's useful, that's valuable. What we do is to say: well, why not try this? Why not do something different? Why not break that, whether it's how you phrase something, or how you plot a story, or how you strategize a company?
00:35:23:02 - 00:35:46:17
Unknown
If you're, you know, Steve Jobs: he broke patterns. That was creative; it was a creative act. Now, I like that. I like that as a piece of a definition of creativity. Because you can be creative within patterns, regurgitating and reusing, and people are successful at it, but is that highly creative work?
00:35:46:20 - 00:36:07:25
Unknown
I don't know, compared to something that truly breaks, as you say, the pattern, and comes up with something where there's truly intelligence doing it. To follow the pattern, learn from the pattern, figure out the pattern: I don't diminish any of that. Yeah, but I think that's what AI is good at. What I don't think AI is going to be good at is creativity, breaking the pattern.
00:36:07:27 - 00:36:22:19
Unknown
That's what we should be doing. Yeah. Sometimes I ask the chatbot, you know, give me novel ideas, as if it knows what novel is. Right. And it comes back with the same stuff, and I'm like, nah. Look, I still have to be creative to come up with something different and unique, computer.
00:36:22:19 - 00:36:37:09
Unknown
Yeah, totally. Cool. So that's cool; you keep on reporting on that stuff, that's interesting. Yeah, I will as I learn more things; we can definitely talk about that. And if you use it a lot, we'd love to know your thoughts as well, anyone watching or listening. What is it?
00:36:37:09 - 00:36:57:19
Unknown
Contact at AI Inside. I might have to buy a new Mac. I think, honestly, a Mac mini would work for this. I mean, there's no reason why it shouldn't, considering everything is happening in the cloud. So I have a Chromebook, portable, for the house. And the Mac: right now I'm talking to you via a Mac, which the TWiT folks gave me, what, eight years ago.
00:36:57:21 - 00:37:11:28
Unknown
It's still rocking. It's steam powered. It's amazing. It needs a little help once in a while to keep it going, but it keeps running. But it's here at my desk, and I want to be able to do that; I want to be able to be on the deck or be somewhere else. Totally. I may break down.
00:37:11:28 - 00:37:30:22
Unknown
I might break down and buy a Mac, I don't know. All right, if you do, we'll talk about it. Yeah. Real quick, I want to let everybody know that we do have a Patreon, if you like our wonderful tangents like that one: patreon.com/aiinsideshow. And we can continue to tell you how we feel about Cowork over the months and years ahead.
00:37:30:25 - 00:37:55:17
Unknown
Big thanks to a few new patrons: Adam Scaramella, Rob Bug's Life, and Clayton Brucker III. Amazing. Thank you. Each of them actually paid for the full year, not just monthly. They just went: here, have a year. So here's an engagement ring for Jason. Yes. Thank you. I do. Thank you for being with us for this show.
00:37:55:17 - 00:38:10:11
Unknown
And on the Patreon: patreon.com/aiinsideshow. We're going to take a quick break, and, you know, we actually have other news to talk about, not just our thoughts on the latest updates to Claude Cowork. So we'll talk about that in a moment.
00:38:10:14 - 00:38:29:14
Unknown
All right, the march to OpenAI's IPO continues. A few related stories to discuss. First of all, OpenAI has dropped plans for yet another of its big priorities, the adult chatbot integration, killing its darlings before they're even born.
00:38:29:14 - 00:38:49:05
Unknown
Basically. What I wonder about this, and we'll never know: was the motive to drop it because people knew it was a really stupid idea for the business? Or did they drop it because, like Sora, it was a distraction from trying to make the business work? I would like to think it's the former, but I think it's probably the latter.
00:38:49:08 - 00:39:14:17
Unknown
It might be a little of both, but somebody said: given our priorities, given everything we've got to do, given the pressure we're under, do we really have to do porn? Yeah. So does bringing adult content into a chatbot mainstream it more, or mainstream it less? Because I think the company really, really wants its product to appeal to everyone.
00:39:14:23 - 00:39:36:12
Unknown
And I wonder if bringing in something like that actually pushes mainstream users away. Well, I know it attracts users; I know that for sure. But yeah, it's not just mainstream users, it's business users. Well, exactly, that too. Totally. Do you want your business products to play in that sandbox? Maybe you don't; maybe that's bad for business.
00:39:36:14 - 00:40:00:27
Unknown
Literally. So it's not so smart. Again, I'm not a prude about this, right? Pornography exists; it's existed since print, and it'll exist forever. But it's not a smart business strategy. Yeah. And it's also not good after all this stuff, which we didn't talk about, and we shouldn't, because it's really,
00:40:00:29 - 00:40:23:28
Unknown
it's like the old, ancient days of social media: the Meta and YouTube jury verdict last week, which I think was wrong in a lot of ways, but it comes out of this emotional response to this stuff. And if the AI world wants to communicate its value and virtue, doing this is just dumb for everybody.
00:40:24:01 - 00:40:42:23
Unknown
Yeah, yeah. So that's gone. We already know that Sora is going away. Is Sora done, or is it just announced as done? I don't know; I haven't even checked in on it lately. I bet if I loaded it up, it would probably still load. I'd still be in there if I opened the Sora app.
00:40:42:26 - 00:41:10:03
Unknown
Anyway, we know it's not long for this world, at the very least. And the Wall Street Journal had a piece where they got a closer look into the why behind that, and some details: $1 million per day to run, and the product's revenue, with that user base, obviously did nothing to cover it; weak engagement after an early spike, which we knew about.
00:41:10:03 - 00:41:34:18
Unknown
It came out of the gate strong when it launched late last year, and then within a month or two it was like nobody was talking about it. I know I was on it for a hot minute and then I completely lost interest. So it's weird when your own use lines up with everybody else's for no reason other than you're all on the same page somehow.
00:41:34:20 - 00:41:53:14
Unknown
OpenAI needed all that compute for higher-priority models, and the article does mention a future model, called Spud, I guess. And so, some of these things people were assuming last week, the Wall Street Journal now has sources that they say are kind of confirming, basically saying the priorities are different.
00:41:53:14 - 00:42:15:03
Unknown
And this was really expensive to run; generating all that video every single day is pricey. And hey, we've got different priorities, and it doesn't involve this, so we're getting out of this. Yeah. It would make more sense for YouTube to offer this in the long run, because you're making videos, making social videos. You know, Meta's thing was stupid.
00:42:15:05 - 00:42:34:26
Unknown
What was it called? I can't even remember, Meta's thing where you could make social videos, but, you know, made-up stuff. The last thing you need on Facebook is more and more spam. And I still have that... why am I blanking on this? Yeah. So, I just finished listening to a book called The Last Kings of Hollywood, which is about Spielberg,
00:42:34:28 - 00:42:52:28
Unknown
Coppola, and Lucas. Okay. And so that's Lucas and Star Wars, right? The really high-end graphics like this came out of a movie studio. I think we've definitely seen that this is going to come down to the point where any creator can use this stuff. I still think it's going to be there.
00:42:52:28 - 00:43:12:26
Unknown
I think it's going to be made, but it needs a market, and the market is more likely to come from, I think, fan fiction and people spending money on this. If it's worth it to somebody to spend money on a story, to make the story that they want to make so they can make money out of it,
00:43:12:29 - 00:43:33:22
Unknown
then it makes some commercial sense. It's going to be smaller, more expensive; that's where it should be. But it didn't make sense for now; it was a show-off product. Yeah. Well, I mean, they were able to carve out a deal with Disney to kind of immediately, at least seemingly, prove its existence.
00:43:33:25 - 00:43:58:27
Unknown
And so much for that. So much for that. Yeah. That deal never actually materialized into cash; even prior to this announcement, no cash had traded hands. So it wasn't like they had to undo anything really expensive. What's a billion bucks to Disney? Yeah, sure. If you're Sam Altman, a billion is literally rounding error. And that taps into today's news.
00:43:58:29 - 00:44:26:21
Unknown
Was it today or last night? I can't remember, but OpenAI closed a record $122 billion funding round ahead of its anticipated IPO. OpenAI says it has more than 900 million weekly active users, more than 50 million paying subscribers, and $2 billion in monthly revenue, at a post-money valuation of $852 billion. So before it's even on the public markets, yeah, a $1 trillion company.
00:44:26:23 - 00:44:46:01
Unknown
Wow. Rounded up. Wow. We'll see. I don't think so. There was a story, I didn't put it in there, that retail investors can get in on this now, but you have to know somebody, and it's a high-end ETF. If you're rich, if you're part of the Epstein class, you have those contacts.
00:44:46:01 - 00:45:08:16
Unknown
But I don't. And I don't know if I would buy OpenAI stock. Yeah, I'm not getting that feeling right now, I guess. I mean, I think if I owned a little before the IPO and I sold the first day, I'd make money. Yeah, but holding on for the long term? Not sure.
00:45:08:18 - 00:45:36:15
Unknown
Yeah, yeah. Still unprofitable. There you go. Yeah. More details on Apple's approach to AI, and I don't know how much of this is completely new. I mean, we have some details on exactly how they're going to approach this, but I feel like we've kind of been hinting at it in recent weeks. Yes, we know that Gemini is going to underpin Siri on those devices.
00:45:36:18 - 00:46:01:07
Unknown
But Apple is going to allow third-party AI assistants to plug directly into Siri through a new extensions system, according to Mark Gurman at Bloomberg. So basically it's an opportunity for Apple to say: sure, you want to install these AI models on your device? If you install them through the App Store, we get to take a cut.
00:46:01:09 - 00:46:22:09
Unknown
Of course, of course you can do that. But at the same time, I also think that's a flexibility that I would love to see on Android, too. Just being able to integrate with your chosen AI, because right now Gemini is very deeply integrated into Android. And that's not a bad thing.
00:46:22:09 - 00:46:43:25
Unknown
I like Gemini on Android and everything, but as I'm playing around with Claude, for example, I'd be curious to see what my Android device would be like if Claude was powering that side of things. So maybe Apple's onto something strategically. I think it's interesting, and we can debate in five years whether Apple was behind or smart.
00:46:43:27 - 00:47:05:20
Unknown
Yeah. Here's the why. The smart argument is that it's all commodity, and they give their users the ultimate choice: the assistants are plug and play, they're interchangeable, and so you can use various other ones. Okay, that's point one. Point two is that there will be, and I certainly think this is true, and it's something that Yann LeCun has talked about at a high level.
00:47:05:20 - 00:47:32:01
Unknown
But even at a consumer level, there are going to be many more specialized AI assistants. So if you want your teach-me-Spanish AI assistant, or your health AI assistant, or your I'm-a-salesperson-give-me-inspiration-before-I-go-close-the-sale AI assistant, whatever it is, the personalities and such, I think there's value in the flexibility to be able to use many.
00:47:32:04 - 00:47:51:11
Unknown
And it creates, as the App Store did, a marketplace, and that can be a very rich marketplace where people can make these things and make money. And you're right, Apple can then make money in turn. It's interesting strategically: is it a strategic fail or strategic brilliance? You never know with Apple.
00:47:51:14 - 00:48:12:12
Unknown
That's the big question that everybody, I think, is wondering. This is all going to be announced at the Worldwide Developers Conference in June, so we are going to have to wait a while longer. But it is a real question: okay, is Apple doing this because it dragged its feet and had a few missteps,
00:48:12:12 - 00:48:41:05
Unknown
and this is just what it has to do now? Or is this a smart strategic choice? Well, I guess it could be both. It could be: dang, we failed, but we landed on something that is unique and doesn't have us risking everything on the potential of this working or not. We can rely on others and put off some of the risk that a company like Apple would take if it were to develop its own.
00:48:41:07 - 00:49:06:26
Unknown
Yep. So, yeah. And I'll be curious to see how Gemini fits in. Like, one thing I'm confused about: if there are extensions which allow you to load in, say, ChatGPT or Claude or whatever your preferred model is, does that supersede the Gemini underpinnings of Siri that we've been hearing about? You know what I mean?
00:49:06:26 - 00:49:26:12
Unknown
It gets a little confusing for me. Are you choosing to replace Gemini with that, or is Gemini-and-Siri kind of one AI, which I think is the case, that calls upon these other things? Yeah. I'm not sure exactly what that's going to look like. Well, what we've certainly seen is this; we've talked about it a lot.
00:49:26:13 - 00:49:50:20
Unknown
We had a few guests on in interviews, and we see agents calling upon agents. It's not as if you have one associate agent that works for you; it's going to call upon others. In company terms: you're a new employee, you come in, here's your onboarding agent. It in turn calls on the HR agent, which in turn calls on the insurance benefits agent, which in turn calls on the other agents.
00:49:50:20 - 00:50:08:16
Unknown
Right. And each is specialized in its work and does its job, and they know how to communicate. That's a model we see happening in the Salesforce world and in the larger world out there. I think we can certainly see the Gemini AI come along and say: oh, you want to do that? I think I found an agent that could do it for you.
00:50:08:16 - 00:50:34:14
Unknown
You want me to? Okay, hook it in. Yes. Okay, here we go. Yeah. That's big time right now. I put in a couple of stories that were interesting to me, and I thought you might have some cool perspectives on some of this stuff. Let's see here, where are we? Okay. First of all, these both have to do with university students and how they tackle their own work in 2026.
00:50:34:14 - 00:50:58:29
Unknown
You know, this year of AI everywhere and all that kind of stuff. How are they doing this? What does this look like? Well, one example: I came across an article about 20 undergrad college students at Kennesaw State University. They underwent a pilot study into how they use AI for writing tasks, which looked at how they're actually integrating with AI.
00:50:58:29 - 00:51:18:17
Unknown
And I think a lot of people would jump to the conclusion that, oh, well, students don't even write anymore; they just use the AI to spin up a thing, throw that in there, and call it, you know, macaroni. Those students, from this report anyway, used AI for the study to spark ideas and to draft theses.
00:51:18:19 - 00:51:38:09
Unknown
Or theses? Theses is the plural of thesis. Okay, there we go. Okay, so I was right; it just kind of sounded wrong when I said it. And moving beyond roadblocks, like writer's block, that sort of stuff. And, you know, I would agree; these are things that I sometimes use AI for.
00:51:38:12 - 00:52:02:24
Unknown
It's a great use for that. But the report found that they composed mostly on their own. So they see AI as a starting point, not a finished product. They outright rejected seemingly generic output, which AI is very good at giving; if you don't know how to work with it on a deeper level, it can be very generic-sounding and very recognizable. And they rarely accepted
00:52:02:26 - 00:52:29:24
Unknown
text written by the AI verbatim for what they were doing, so they would spend the time revising heavily in an effort to claim their own ownership and authorship of the final draft. That sounds about right to me. I don't know how this matches up with other universities, but I think that's interesting. So, there was another story, in the Washington Post, very similar: ChatGPT fed his students easy answers.
00:52:29:24 - 00:52:52:20
Unknown
So he built an app to argue with them. Similarly, he designed an app called Casey, and he wants the students to go back and forth with it. Casey capitalizes on the capacity to slow students down, to actually make them focus, and to make them consider very different ways of thinking about questions.
00:52:52:23 - 00:53:12:15
Unknown
That's the justification for writing, pedagogically. Don't you love it when I use words like that? I do, it's a good word. I fake it, because I'm not really... I don't have a PhD, folks. But the justification, the whining that I hear about this, what I was going to say is: the reason we teach writing is to teach students how to think.
00:53:12:18 - 00:53:32:19
Unknown
Well, all education is to teach students how to think: how to analyze information, how to come to their own conclusions, how to make arguments, and so on and so forth. And AI can do amazing things, and it doesn't just come out in prose; that's one way. And I value it because I'm a writer, and I think it does teach people a lot.
00:53:32:21 - 00:53:53:13
Unknown
You know, there's an argument that the journalism degree is the new English degree, because you have to be able to get information, formulate a thesis, express yourself well, be logical, and deal with conflicting ideas and all that. Right? All that's true about writing, but it's true about other ways, too.
00:53:53:13 - 00:54:13:01
Unknown
And I think that AI can help do these things. Where you're putting students into a Socratic dialogue, an argument, a debate with it, you're forcing them to try to come up with something more. I think it's a brilliant way to do it; I love it. Yeah, I think it's super smart. That's cool. I missed that you put that down there.
00:54:13:01 - 00:54:38:27
Unknown
That totally fits in with this little arc, so it's three stories. We got those two, and then, at Cornell University, a German language instructor is requiring students to compose their essays on vintage manual typewriters. That's right, it's an analog assignment. She sourced dozens of machines and taught the students how to operate them: how to feed paper, return the carriage.
00:54:38:27 - 00:55:03:29
Unknown
One student was like, oh, that's why it's called the return key. She uses it as a way to help students lessen, or reduce rather, AI's influence on assessment and on their work. And the students actually say that the work on the typewriter is, as you can imagine, slow. It's physical, it's socially demanding. There is no delete key, no notifications.
00:55:04:07 - 00:55:32:27
Unknown
There's more talking with their classmates as they're working on these things. And I thought this was interesting, because it's totally true: you need to think through your sentences before committing by hitting the keys, because there's no easy undo for any of this, right? So I thought that was an interesting tactic. And I would imagine it's actually very useful, because I think of the reliance that we kind of grow into, or at least I can speak for myself, that I grow into sometimes with AI on certain tasks.
00:55:33:00 - 00:55:50:23
Unknown
Something like this really forces you out of that. It's like: no, you're using a typewriter, and that's the challenge right now. And you're probably going to learn a lot about how you write, and your gaps, the spaces that you have to work on, when you're forced to slow down and do it manually like that.
00:55:50:25 - 00:56:17:07
Unknown
Okay, Jason, did you ever actually use typewriters? Yes, I did, when I was a kid. There was a typewriter up in the attic at my parents' house, and I saw it and was really just kind of fascinated by it. I pulled it down and had it in my room, and I would type. I went through kind of a period of like, I'm going to type a journal, I'm going to do a journal.
00:56:17:09 - 00:56:38:14
Unknown
I probably did it for maybe two days or something, but I was really into it at the time. And then eventually I got a Commodore 64 and had a word processor on there, and then that was past. So. My handwriting is god-awful; it is terrible. And my parents made me take typing class after sixth grade.
00:56:38:17 - 00:56:58:27
Unknown
They said, no teacher is ever going to understand what you're writing; you have to do this. The class was mainly me and young women. I somehow remember the big Royal typewriter being pink. And I learned to type, and I'm grateful, absolutely grateful, that I did. To the point that I couldn't write anymore with a pen.
00:56:58:27 - 00:57:25:02
Unknown
I had to write on the typewriter; that was the way I thought. And I write about this at length in an afterword in Hot Type, available for preorder right now, and thank you. The afterword is my own typographical autobiography, where I explain how I went from typing, and I've talked on the show before about how I was on rewrite, where you had to write one paragraph at a time and go through the process.
00:57:25:04 - 00:57:42:03
Unknown
And then along came the computer, and it fundamentally changed how I write, and that's how I think. No, it didn't change my brain cells, but it allowed me to just get something down, because I was a fast writer, a newspaper writer, and then I could go back and edit it. So it made me more of an editor.
00:57:42:05 - 00:58:04:08
Unknown
If I had to sit down at a typewriter now, I'd find it very hard. Yeah. There's a commitment: you're committing to everything, or you're cursing, retyping the whole damn thing because you screwed up. Yes. Right, now I've got to spend my time retyping the thing. And yeah, there were the later electronic typewriters that could go back and
00:58:04:09 - 00:58:32:16
Unknown
Yeah, white it out automatically and then allow you to type over it. I remember those. I don't know where the professor found that many typewriters. Yeah, totally — how do you source all of those, and in good working condition and everything? I mean, there is something really satisfying, though, about using a typewriter. The experience of it, the sound, the paper getting hit. The tactile quality — we try to recreate that with our computer keyboards all the time.
00:58:32:16 - 00:58:52:29
Unknown
That's the feedback that's happening. Yeah. If you ever go to the Museum of Printing — if anyone is near Boston, you absolutely must, on a Saturday, go to Haverhill, Mass. — they have a working Linotype. They also have a huge collection of typewriters, and they're just wonderful. Kids love to see typewriters. Absolutely love it.
00:58:53:01 - 00:59:19:27
Unknown
And yeah, I have two here, still. I won't get rid of them because they're artifacts — wonderful artifacts. They are. Oh, hold on a second, I'm digging it out. You've got the keys, right? No, no, no, no — then came the IBM Selectric. Oh. Oh, yeah.
00:59:19:27 - 00:59:37:26
Unknown
Look at that thing. Right? Yeah, with the little rotating ball. Yeah, the ball. One of the early chapters of Hot Type is entirely about the invention of the typewriter, because it was a necessary precursor to the machine. You needed the idea of the keyboard. The earliest typewriters and the earliest typesetting machines were based on harpsichord keyboards.
00:59:37:28 - 00:59:58:03
Unknown
Because those were the keyboards people knew, right? Yeah. And the idea that you would have one letter to one key — it would be done differently, but the inner workings were still based on the harpsichord's hammers hitting from below, or the piano's. And so it was a progression that got us to the typewriter. That is why I love the machines.
00:59:58:06 - 01:00:16:24
Unknown
Yeah, they're pretty awesome. That's a cool artifact right there as well. Before we go any further, I just noticed — oh, So Nightmare, good to have you around. Good to have you here; it's been a little while. Thank you for the super chat — and this is related to the Apple story.
01:00:16:24 - 01:00:40:06
Unknown
It says: this is a strategic necessity for Apple. It's very clear Apple has no idea how to catch up, so this offloads the pressure while they work on their own version. It also provides a safety net in case they never do. Yeah, I think there's some serious truth to that. It kind of puts them in a powerful place — keep working, but still be in the game, sort of thing.
01:00:40:09 - 01:01:07:09
Unknown
Yeah. Interesting. Thank you. Oh, Zap, thank you very much — always appreciated. All right. I'm also going to talk about typewriters. What's that? The electric typewriter? Yeah — I'm old enough, kids, that I had to type without electricity. It took real finger muscles. Yes, I'm sure your fingers had bulging muscles from hitting those keys — for the rest of my —
01:01:07:11 - 01:01:24:12
Unknown
I had strong fingers. Totally. Yeah. My parents had a regular one in the attic, and then at some point we had an electric one. But I do remember how forcefully you needed to hit it, because if you didn't, it wouldn't strike the ribbon hard enough to actually press anything into the paper.
01:01:24:15 - 01:01:39:18
Unknown
Well, and then there was carbon paper — and carbon paper is a whole other story I tell. Oh, I have one of my typewriters here: a little portable that's the size of a laptop, but thicker and heavier, that I would take with me on assignments.
01:01:39:21 - 01:02:03:11
Unknown
And I would sometimes take with me a gigantic fax machine we dubbed the Mojo Wire. You would take a piece of paper and put it on the drum of the scanner, and it would go around. And then you'd take the phone — the old-fashioned phone, yes — and put it in the brassiere. And it would take, I don't know, five, six, seven minutes to send one page as an image.
01:02:03:13 - 01:02:20:01
Unknown
Wow. It all starts somewhere, right? And it started somewhere prior to that, too. That was quite an evolution. Well, speaking of typewriters — want to talk about quantum?
01:02:20:04 - 01:02:48:29
Unknown
Kind of the opposite end of the spectrum — quite literally the opposite end of the spectrum. A couple of research papers argue that breaking 256-bit elliptic curve cryptography — so let's just say strong, current, modern cryptography — will take far less quantum resources and time than was previously believed. They mention Q-Day, a term I had not heard before.
01:02:49:00 - 01:03:12:11
Unknown
This is basically the day that quantum computers can break today's mainstream public-key encryption, and apparently it's closer than we think. The machines to do this do not yet exist, so it's not like they're proving it — this is all, I think, theoretical. But one team showed how a quantum computer could effectively crack 256-bit elliptic curve encryption in about ten days.
01:03:12:13 - 01:03:39:02
Unknown
That's roughly a 100x reduction compared to prior estimates — so, a lot faster. And separate from that, Google researchers said Bitcoin's secp256k1-based encryption — that matters to someone, so I'm just going to read it — could be cracked in under ten minutes on the order of around 500,000 physical qubits.
01:03:39:07 - 01:04:01:27
Unknown
That compares to current platforms that have demonstrated arrays of more than 6,000 qubits. So maybe, if I'm understanding this, we're still a ways away, but they're starting to see how possible it is. Yes. Yep. And Google also put up a piece explaining why they did this: that it's the responsible thing to warn people so they can get ahead of it.
01:04:01:29 - 01:04:30:03
Unknown
Which I think is critical. Yeah. They definitely do a lot of research across different facets from that perspective, and this would be another one. And maybe this just points to the work that needs to be done now on real-world systems to prepare. I don't know how you prepare for a post-quantum world at this stage, right? Because we don't even have the machines that would do the breaking.
01:04:30:03 - 01:04:55:03
Unknown
But yeah, it's beyond my pay grade. Still, businesses — and Bitcoin — should do better at preparing for quantum. So what happens when that happens? Bitcoin is just valueless, right? Yeah — its encryption is broken. The whole system. Yeah, the whole system, I mean, once it's breakable.
01:04:55:03 - 01:05:23:01
Unknown
Yeah, I don't know if you can upgrade its encryption. I mean, it'd kind of be dumb if you couldn't — they had to have thought about that. Maybe they didn't, I don't know. Crypto is a world I don't entirely understand, and I'm okay with that. All right. We have a YouTube channel, by the way, if you didn't know. Many people are watching us on YouTube — actually also on LinkedIn; sometimes we stream to LinkedIn.
01:05:23:02 - 01:05:40:04
Unknown
You stream to your X account, Jeff. So we're going in a number of different directions, but if you want the video version of this podcast, just go to YouTube and search for AI Inside Show. We might also have videos that aren't part of the podcast on that channel as well — we have a few up right now and are working on a few more.
01:05:40:04 - 01:05:50:05
Unknown
So check it out. We're going to take a quick break, come back, and do a little speed round of a few things you may have missed, and get you out of here. That's coming up in a moment.
01:05:50:08 - 01:06:00:22
Unknown
All right. We talked a lot about Sora earlier, and in the last couple of weeks anyway. So we know OpenAI is getting out of the video generation business, but Google announced Veo 3.1 Lite.
01:06:00:22 - 01:06:25:24
Unknown
It is a lower-cost but very fast text-to-video model: clips up to 48 seconds at up to 1080p resolution. It's Google's most cost-effective video model. And Google can probably afford to do this because it has so many other proven bets inside its business that can foot the bill for costly video generation. And again, because it has YouTube, it has the outlet for it.
01:06:25:26 - 01:06:53:01
Unknown
True. Right. Yeah. I await the controversy at the Oscars over the first animated short that's made this way. It's going to happen, right? Yeah, almost certainly. Along the same lines, ByteDance rolled out Dreamina Seedance 2.0, a video model inside of CapCut.
01:06:53:02 - 01:07:13:29
Unknown
So if you use CapCut — which is a video editing app that's on all platforms at this point, I think, but many people just use it on their mobile devices — and you live in Brazil, Indonesia, Malaysia, Mexico, the Philippines, Thailand, or Vietnam, you can use this video generation model. It limits clips to 15 seconds.
01:07:14:01 - 01:07:39:17
Unknown
It has some strict safeguards in place, apparently: it bans real-face generation, it has some IP filters, and it places invisible watermarks on the content. But, you know, video generation inside of an editing app — Adobe has something similar; I believe they have video generation inside of Premiere now using Firefly. So, nice to have, I suppose.
01:07:39:19 - 01:07:59:24
Unknown
I haven't really used or relied upon it, but I'm sure there are plenty of video editors out there who will — especially people using CapCut, because a lot of CapCut is meant for shorts on TikTok and the other vertical platforms. That's what a lot of vertical-video creators use to edit their videos.
01:07:59:24 - 01:08:22:21
Unknown
On mobile. And so there you go. Right? Yeah. Meta announced two prescription-optimized Ray-Ban Meta smart glasses — two new models starting at $499, launching April 14th. They've got the Blazer, which is a more squared version, and the rounded version is the Scriber. And they're also doing this to get more support — I think this is the Blazer.
01:08:22:21 - 01:08:52:25
Unknown
Maybe. Is that what it is? Yeah — more support for different types of prescription lenses. So broader support for Transitions, progressives, the different frame shapes and so on. If you are a prescription lens wearer and you want more options, Meta's got you covered. Apparently they're leaning heavily into that. Yeah. Because the sunglasses — I walk by them even in Macy's.
01:08:52:25 - 01:09:14:20
Unknown
They still have the Ray-Ban smart glasses there, right. And I'm thinking, if it's just sunglasses, it doesn't make sense — it doesn't have to be. Yeah, I've seen plenty of people now who have just regular lenses inside the frames. And here they're talking about the ability to do Transitions and progressives.
01:09:14:21 - 01:09:36:04
Unknown
So by all accounts, these are just glasses frames. Yeah. If you want them to be sunglasses, they can be; if you want straight-up lenses with prescription support, you can do that. This doesn't have anything to do with the display version — whatever they call it. Yeah — which could be interesting, because I think how it refracts on —
01:09:36:06 - 01:09:55:07
Unknown
Yeah, different lens shapes — that will be interesting. I wonder what kind of changes will happen with that technology as we go forward. Of course, I, as an idiot — this is all PTSD for me, because I have Google Glass with prescription lenses. And the progressive lens was like $700 that I had to add on to the $1,500 for the glasses.
01:09:55:07 - 01:10:11:29
Unknown
Oh, but did you wear them a lot? Did you get enough use out of it? No — I'm an idiot. It made sense at the time, Jeff. Believe me, in the last two years I bought many things that made sense at the time. And then I look at them now and I'm like, why did I do that?
01:10:11:29 - 01:10:38:22
Unknown
That was stupid. Yeah. And let's see here: Gemini now allows free and paid users to import chats and data from other AI assistants. I think this sounds more impressive than it actually is, in my opinion. Users can import summarized profiles using a prompt provided by Google that gets fed into the third-party chatbot.
01:10:38:27 - 01:10:57:27
Unknown
I've even done this myself, where I want to take this chat content and somehow bring it into another tool: summarize all the most relevant things from this chat, and then I'll introduce it over there. And it never gets everything. It doesn't carry the same contextual depth as the original source. So I wonder how well this would work.
01:10:57:27 - 01:11:18:27
Unknown
But Google apparently has a system to do it. Well, this is data portability. Yeah. This is what we all asked for with social media: I want to take my social graph elsewhere. In this case, it's more than just my social graph — it's the thoughts that I had, my prompts, my data, the data I fed it.
01:11:18:27 - 01:11:37:24
Unknown
All of this is mine, so I should be able to take it elsewhere. You should. But I guess my argument here is: you already can do that. I can already open up a chat with ChatGPT that is a million pages long, select all, copy, paste it into another thing, and say, learn this.
01:11:37:24 - 01:11:52:24
Unknown
I don't see what's different between these two things. Well, what I really want — and I don't use it the way you do now — but you have your Claude folder, or I don't know what they call it. Your project, right? Yeah, the project. Right. You should be able to point a different AI at that same project.
01:11:52:26 - 01:12:15:06
Unknown
Oh, yeah. I don't think you can do that. I'm sure you can't. But that's what portability would mean. Yeah. Right. And portability is one thing, but I don't think we've heard a call yet for AI interoperability. When it gets to an agent world, though, we're going to need it. Yeah. Google does have a different feature that's based on full chat histories.
01:12:15:06 - 01:12:33:17
Unknown
But again, I don't know what of this is any more than just copying the contents of a chat. In either of these examples, I don't think you get any of the deeper understanding and knowledge. It's still interpreting what you put in there to give you something to load into another model.
01:12:33:17 - 01:12:49:27
Unknown
And in that case, you're going to miss detail unless you tell that other one: here's where we got to, and here's all the source documents I fed into it, so that you're on the same page. But then you're doing more work than just "take my data and put it over there." Yep, yep.
01:12:49:27 - 01:13:19:04
Unknown
Yeah. And then finally, Bluesky is getting into the AI game with Attie. It's a standalone, cloud-powered AI assistant — excuse me — it is not part of the Bluesky app, so you're not going to find it there. It lets AT Protocol users design their own natural-language custom feeds. And they say eventually they're going to enable vibe coding of full social apps there as well.
01:13:19:06 - 01:13:42:03
Unknown
So what's fascinating about this to me: there was a discussion today, after the Meta and YouTube addiction verdicts, where the argument is that it's product design and the algorithms — the evil algorithms. Somebody — I wish I could say who it was and give credit, but I can't remember who — said on Bluesky:
01:13:42:06 - 01:14:00:08
Unknown
I'm a lot more addicted to Bluesky than I am to Facebook — and Bluesky doesn't have an algorithm, and that's what makes me addicted to it. And what I think this really does — we've talked about this for a long time, since the days of social media — is let you make your own algorithm.
01:14:00:10 - 01:14:14:27
Unknown
Right. And that's what this is. So you have no one to blame but yourself when you're addicted to it. What does it even mean when you say, I'm addicted to this? It's ridiculous. It just means that you like using it and you use it a lot.
01:14:15:04 - 01:14:43:18
Unknown
Yeah, right. I wrote about this in The Web We Weave — also kind of on sale, but they dropped it, and I don't like my publisher for that book. I'm trying to see if I can find it right now for you. Early on, as soon as the internet came out, there was an argument that —
01:14:43:22 - 01:15:10:10
Unknown
Oh, here we go. In July 1996, only two years after the introduction of Netscape, Columbia University psychiatry professor Ivan Goldberg posted a notice to an online bulletin board he founded, intended to parody the language of the DSM, the mental-health directory of ailments. Goldberg announced criteria for a new diagnosis: internet addiction.
01:15:10:12 - 01:15:34:26
Unknown
The astute might have noticed the word "humor" in the URL, or the odd symptoms listed, like "voluntary or involuntary typing movements of the fingers." Even so, people appeared on the bulletin board claiming they suffered from the addiction he had just concocted, so he kept the joke going: he created an internet addiction support group, even though he believed support groups were ridiculous.
01:15:34:28 - 01:15:57:07
Unknown
And he regretted the coinage, because it became a whole thing. Then along came the entrepreneurs. Kimberly Young founded the Center for Internet Addiction in 1996 and presented a paper at the American Psychological Association declaring the emergence of a new clinical disorder: internet addiction. Mind you, Google didn't come out until two years later, in 1998.
01:15:57:07 - 01:16:36:27
Unknown
Facebook until 2004, YouTube 2005, Twitter 2006. But she declared that people were addicted to the internet even then. And, not surprisingly, she advertised for volunteers for her first study from Goldberg's parody group as well as a webaholics support group, and then did a supposed survey of them — it was all yucky. And then came along another person, also with the last name Young, who founded the first-ever inpatient hospital clinic for internet addiction in Pittsburgh in 2009.
01:16:37:00 - 01:17:05:01
Unknown
There was one other: a therapist founded reSTART atop Serenity Mountain, 25 miles from Microsoft's headquarters. Addiction is a trope. It's B.S. It's made up. Even in the cases that exist — and this is not our purview; we're about AI, not social media — the most recent victory for the forces against social media involved a young woman who had plenty of other issues in life.
01:17:05:01 - 01:17:27:29
Unknown
And it's simplistic to say that there's one cause. So, yeah, addiction is B.S., but it's a way they attack technology and act like we don't have any agency — like we're just ciphers being pulled into this. So anyway, I like this Bluesky thing, because it gives every user agency and power and responsibility. You can do it.
01:17:27:29 - 01:17:46:21
Unknown
They'd love it. I'll be curious to see — it's only available right now to a small group, but once it opens up, will users be able to share their algorithms? I was just wondering that. I don't know — if I worked hard to create a custom feed, could you share it so that other people could get addicted to your algorithm?
01:17:46:21 - 01:18:06:11
Unknown
Well, I did that with lists. During the Covid pandemic, I made quite a bit of hay — I created a Covid experts list, and you can follow that list. It wasn't algorithmic, but it could have benefited from some algorithmic juice. So yeah. Sure. Interesting. Let's see what comes out of that.
01:18:06:14 - 01:18:26:18
Unknown
You just got a taste of Jeff's work, The Web We Weave — whether you like it or not. Hahaha. If you like Jeff's writing, hey, you've got a lot of catching up to do: The Web We Weave, Magazine, The Gutenberg Parenthesis, and soon Hot Type. I mean, today's show is filled with examples and topics, so our apologies for all the plugs, folks.
01:18:26:18 - 01:18:49:09
Unknown
But no, I think it was perfectly woven in — a natural integration. So go to jeffjarvis.com and you can find all the ways to support Jeff. And then of course there is Intelligence: AI and Humanity, a new book series from Bloomsbury. Thank you, boss. Yeah. Thank you, Jeff — such a good time hanging out with you.
01:18:49:12 - 01:19:18:03
Unknown
So that Jeff doesn't have to bring it up, I'll just mention podtuneup.com. If you want some help and support with your podcasting successes — triumphs and tribulations, let's say — I can help you: podtuneup.com. Reach out and let's talk. And then aiinside.show is the place on the web where you can find all the information about this show: subscribe, reviews, video, socials.
01:19:18:04 - 01:19:50:21
Unknown
It's all there: aiinside.show. And then finally, there is the Patreon: patreon.com/aiinsideshow. We have some amazing executive producers each and every week: Dr. Du, Jeffrey Marikina, Radio Asheville 103.7, Dante St. James, Barnard, Eric, Jason Cipher, Jason Brady — sorry, Jason — Anthony Downs, Mark Archer, Karsten Szymanski. We've got some amazing people supporting us each and every week, and we really appreciate having the opportunity to do this.
01:19:50:28 - 01:20:08:27
Unknown
And then, of course, a quick thank-you to the folks behind the scenes who help me do this show: Victor Bugnot, who edits, and Daniel Kraft, who does our social media. So thank you, Victor and Daniel. And thank you to you for watching and listening each and every week. We will see you next time — because there will be a next time.
01:20:08:27 - 01:20:15:17
Unknown
That'll be it — leave a review. Don't forget to leave a review. Please do, and we'll see you next time on AI Inside. Take care, everybody. Bye.



