Jason Howell and Jeff Jarvis dive into the $665 million Shiba Inu coin donation to the Future of Life Institute, the potential $100 billion "Stargate" supercomputer from OpenAI and Microsoft, Jason gives a hands-on demo of Stability AI's new music generation model, and more!
NEWS
- Artifact social news curation app shutting down, acquired by Yahoo
- Yum Brands has plans to go all in on generative AI
- How people are using generative AI based on Harvard Business Review report
- New York City's chatbot providing incorrect legal/business advice
- Generative AI used to create virtual persona of real person for advertising
- OpenAI and Microsoft in talks for $100 billion "Stargate" supercomputer
DEMO
- Stability AI's Stable Audio 2.0 for music generation
[00:00:00] Every day is filled with opportunities to express yourself. And with an everyday piece
[00:00:04] from Blue Nile, you can do it without saying a word. Blue Nile's collection of diamond
[00:00:09] jewelry is the ultimate statement, with sparkle that elevates any outfit, or switch things
[00:00:15] up with a sapphire piece sure to spark conversation. Either way, their diamond guarantee ensures
[00:00:20] you get the highest quality at the best price. Express yourself with Blue Nile, the original
[00:00:26] online jeweler, at bluenile.com. That's bluenile.com.
[00:00:30] Since 2013, Bombas has donated over 100 million socks, underwear and t-shirts to those facing
[00:00:36] homelessness. If we counted those on air, this ad would last over 1,157 days. But if
[00:00:43] we counted the time it takes to make a donation possible, it would take just a few clicks.
[00:00:47] Because every time you make a purchase, Bombas donates an item to someone who needs it.
[00:00:51] Go to Bombas.com slash Acast and use code Acast for 20% off your first purchase. That's
[00:00:56] Bombas.com slash Acast, code Acast.
[00:01:00] I'm Sandra and I'm just the professional your small business was looking for. But you didn't
[00:01:04] hire me because you didn't use LinkedIn jobs. LinkedIn has professionals you can't find
[00:01:08] anywhere else, including those who aren't actively looking for a new job but might be open
[00:01:13] to the perfect role like me.
[00:01:15] In a given month over 70% of LinkedIn users don't visit other leading job sites so if
[00:01:20] you're not looking on LinkedIn, you'll miss out on great candidates like Sandra.
[00:01:24] Start hiring professionals like a professional.
[00:01:26] Post your free job on LinkedIn.com slash spoken today.
[00:01:34] This is AI Inside Episode 11, recorded Wednesday, April 3rd, 2024: OpenAI and Microsoft's $100
[00:01:41] billion Stargate supercomputer.
[00:01:44] This episode of AI Inside is made possible by our wonderful patrons at patreon.com slash
[00:01:49] AI Inside Show. If you like what you hear head on over and support us directly and thank
[00:01:54] you for making independent podcasting possible.
[00:02:02] Hello everybody and welcome to another episode of AI Inside, hopefully your weekly source for
[00:02:11] what's going on in the world of AI. From news to, sometimes, interviews (we had an amazing
[00:02:16] interview last week with Salcon), really, each and every week we have the opportunity to
[00:02:21] kind of look at a different aspect of artificial intelligence and learn all about it. I'm one
[00:02:26] of your hosts, Jason Howell, also joined by my co-host on this show each and every week,
[00:02:31] Jeff Jarvis. Good to see you, Jeff.
[00:02:33] Oh, oh boss. Good to see you.
[00:02:34] Yeah.
[00:02:35] Good to see you too. You were busy earlier this morning, sort of related to the
[00:02:41] show. We're not going to talk about it at length, but tell us a little bit about what
[00:02:43] you were watching or participating in.
[00:02:45] So Meta was giving an update on a fascinating project they've done with Stanford, with the
[00:02:51] deliberative democracy project there, run in four countries. Deliberative
[00:02:55] democracy is really interesting because people first get surveyed
[00:03:02] on their attitudes, then they agree to read a bunch of background
[00:03:07] information about something, and then I think there's discussion, and then afterwards you
[00:03:11] see how their attitudes have or have not changed. And they had a whole bunch of questions
[00:03:15] about AI and the governance of it.
[00:03:19] And they wanted to get people's attitudes about what mattered to them most.
[00:03:22] And I haven't dug through it all yet. It's a lot.
[00:03:25] It was Brazil, Spain, US and Germany.
[00:03:29] You could guess that Germany was more conservative than anywhere.
[00:03:31] But one thing that comes out very clearly is that if people are not told that they're dealing
[00:03:36] with AI, they're bothered. They're not happy.
[00:03:40] But when they are told, it has an impact.
[00:03:43] There are some things they still don't like even if you tell them, as in, I don't want AI to do
[00:03:46] that.
[00:03:47] Like, you really shouldn't use this for romance.
[00:03:50] That's kind of creepy.
[00:03:52] But there were other interesting things where they talked about it, so the data itself is
[00:03:57] interesting.
[00:03:58] I've got to spend some time to dig into it.
[00:03:59] And then the question, and this was the question people had in the brief time, only 50 minutes
[00:04:02] of questions, is how Meta is actually going to use this in their governance.
[00:04:08] But it's an interesting model to look at.
[00:04:10] So I'm glad they're doing it.
[00:04:11] You know, one can like or dislike Meta, the oversight board, one can respect or not.
[00:04:18] But they do do experiments like this which are interesting.
[00:04:20] So I'll see, we'll talk about it in a future episode.
[00:04:23] Yeah.
[00:04:24] And some of their efforts when it comes to AI have been more open than what we're seeing
[00:04:28] from a lot of other companies.
[00:04:29] So at least there's that too.
[00:04:31] Right.
[00:04:32] Yeah, okay, cool.
[00:04:34] Well we can definitely look back on that.
[00:04:36] Maybe next week take a look back once we've had a little bit of time to digest and stuff.
[00:04:41] When you said things AI probably shouldn't be used for,
[00:04:45] you mentioned romance, and one of the things that came up for me was law.
[00:04:49] And we're going to actually talk about that a little bit later in the show too because this
[00:04:53] keeps happening where AI is used in certain ways to advise around laws and stuff like
[00:04:58] that.
[00:04:59] And it's not the right tool for that.
[00:05:00] At least not at this point. Maybe eventually, but right now, it's bad enough to trust
[00:05:05] AI with facts.
[00:05:06] But to be wrong about facts that can have you go to jail, or whatnot.
[00:05:10] Exactly.
[00:05:11] Facts with consequences like really big consequences.
[00:05:14] Maybe not a good idea.
[00:05:15] Yeah.
[00:05:16] So yes, we will talk a little bit about that in a bit, because we do have some news to be
[00:05:20] talked about real quick.
[00:05:22] Again, thank you to everyone giving the show a review on Apple Podcasts, and you know what?
[00:05:28] If you're not reviewing it, you know what also really helps? Just letting people know.
[00:05:33] Tell people about AI Inside.
[00:05:36] Here's one thing I've noticed real quick before we get into things.
[00:05:38] I've been doing technology podcasting for almost 20 years now between CNET and
[00:05:42] TWiT.
[00:05:44] And I've done a lot of content around technology over the years.
[00:05:50] And I would say in my personal life like friends that surround me and everything, the
[00:05:55] content that I've done up until now hasn't been nearly as interesting to the normal people,
[00:06:02] the people who aren't so enthralled by technology that they just have to listen to podcasts
[00:06:08] on a regular basis about it.
[00:06:10] The normal people around me are very interested in learning more about AI.
[00:06:14] So when they find out that I'm doing an AI podcast, these are people that I'm used to
[00:06:18] them just being like, oh yeah, he's a podcaster, okay.
[00:06:20] Anyway, so let's go out to dinner or whatever.
[00:06:23] Like they haven't cared before and now suddenly they're like, oh I'm listening to your
[00:06:27] podcast.
[00:06:28] I'm learning a lot about AI so it's interesting for me because it's different.
[00:06:32] I'm not used to that.
[00:06:33] I'm used to people kind of to a certain degree ignoring it a little bit in my personal
[00:06:37] life.
[00:06:38] If you're nice enough to mention this on social media, tag us and we will gladly, gratefully,
[00:06:43] and egotistically repost, retweet whatever you say.
[00:06:48] Including criticism, including wishes of things you want us to do differently.
[00:06:52] Absolutely.
[00:06:53] That means we'd love to be part of the conversation.
[00:06:55] 100%.
[00:06:56] So just look for Jeff Jarvis and me online, and then AI Inside Show.
[00:07:00] And then of course, yes, you can support us directly via patreon patreon.com slash
[00:07:06] AI Inside Show.
[00:07:08] That is the way where you know, you can put a few dollars behind your support and that
[00:07:13] goes to allowing this show to exist essentially.
[00:07:18] You know, we do have some costs with the show and we would love to grow this.
[00:07:21] So patreon.com slash AI Inside Show, you do get some perks.
[00:07:25] You get some bonus stuff depending on the level definitely add free versions of the show
[00:07:30] if that matters to you.
[00:07:32] And you also get to be called out at the beginning of the episode.
[00:07:35] So who's going to be called out this week Jason?
[00:07:37] The winner this week of the call-out is Crystal.
[00:07:41] I just have Crystal's first name, but thank you, Crystal, for supporting us.
[00:07:45] Thank you.
[00:07:46] I believe still we're kind of in day one signups.
[00:07:48] There were a lot of people that moved over when we first launched the show.
[00:07:51] So Crystal, thank you for supporting us and enabling us to bring you AI Inside each
[00:07:58] and every week.
[00:07:59] Okay.
[00:08:02] How time flies, Jeff, because just one year ago the Future of Life Institute released
[00:08:10] its pause AI letter.
[00:08:12] And actually when I read this today, I was like really?
[00:08:15] It's already been a year since this happens since everybody was like freaking out about
[00:08:19] the fact that this letter, you know, co-signed by Elon Musk, Wozniak, a whole bunch of names
[00:08:26] basically saying we need to halt the development of AI systems for six months to I think enable
[00:08:34] kind of safety and regulation.
[00:08:35] They were never clear about what they wanted to do in that six months.
[00:08:38] Yeah.
[00:08:39] This is danger danger will robinson stop now before it's too late.
[00:08:43] These are moving too quickly for our comfort and we want things to slow down for some reason.
[00:08:48] And I mean, you know, to a certain degree, the some reason has, you know, often has to do
[00:08:53] with funding.
[00:08:54] I'm sure the future of life institute, you know, big part of their credo has to do with
[00:08:58] the existence of humanity into the future.
[00:09:02] But there's a lot of money riding on the line and that's kind of why we're discussing this
[00:09:06] today.
[00:09:07] Friend of the show Nirit Weiss-Blatt, who was on episode three by the way to talk all
[00:09:12] about a lot of these topics, did write about the Future of Life Institute specifically,
[00:09:19] basically.
[00:09:21] So yeah, what happened right now is essentially a year ago, was it a year ago?
[00:09:27] There was an undisclosed $665 million, $665 million donation from Vitalik Buterin,
[00:09:36] who's the co-founder of the Ethereum cryptocurrency. He actually donated this money in the form
[00:09:41] of Shiba Inu coins to the Future of Life Institute.
[00:09:46] I think a year ago if I'm not mistaken and this was news that prior to, you know, this
[00:09:52] coming out in the past week, we weren't aware of this.
[00:09:55] I think the understanding was that the Future of Life Institute was, quote, primarily funded by
[00:10:02] the Musk foundation.
[00:10:03] Which was to the tune of like 10 million, you know, much lower.
[00:10:07] Much lower.
[00:10:08] $665 million.
[00:10:09] And actually what I think is really interesting about this is that I remember back in, was it
[00:10:17] 2021, I think, when Buterin had, you know, Shiba Inu was kind of, I think it was still popping
[00:10:24] off at the time, what Nirit calls the shit coin called Shiba Inu.
[00:10:31] But anyways, he had made news back then because he had burned 90% of his Shiba Inu holdings.
[00:10:39] And then he had planned to give the remaining 10% to a quote, not yet decided charity
[00:10:44] with similar values to Crypto Relief, in parentheses, preventing large-scale loss of life, but
[00:10:52] with a more long term orientation.
[00:10:55] And I think what we're seeing now is it turns out this is, this is where that money went.
[00:11:00] It went to the future of life.
[00:11:02] The future of life and Nareet does really good work in looking at where this money goes
[00:11:07] and when she was on the show, she talked about how they're funding fellowships and classes
[00:11:13] and scholarships and things at major universities, trying to get this gospel of TESCREAL long-
[00:11:19] termism out into the world.
[00:11:22] And it's not just the two people you mentioned so far, there's also Max Tegmark, who is a co-founder
[00:11:28] of the so-called Future of Life Institute.
[00:11:31] He's at MIT. Jaan Tallinn, the billionaire Skype co-founder, and other folks are involved in
[00:11:39] this whole I would argue cult like thing.
[00:11:43] And it has huge, huge resources as a result to go spread this.
[00:11:49] And my problem always with this is that media are not doing their homework.
[00:11:54] The New Yorker did a cover about some of this, I think, two issues ago and didn't really
[00:11:59] get into the cultish issues of that.
[00:12:01] Just kind of said, oh these guys are a little crazy and they're really interesting and
[00:12:03] fun.
[00:12:04] No, they're out there claiming that there's going to be doom from technology and it's their
[00:12:10] way to control that technology.
[00:12:14] And they're getting hearings in Congress and they're getting hearings at Number 10 Downing Street
[00:12:18] and in Brussels and in media.
[00:12:20] And nobody, really, except for Nirit and Émile Torres, who Jason and I have talked to
[00:12:26] before.
[00:12:27] And others have really talked this through.
[00:12:30] So I'm glad we're coming back to the story often and we're going to keep doing it because
[00:12:35] we've got to get some media attention to this stuff.
[00:12:38] Yeah, yeah, really fantastic work on Nirit's part, her write-up in the AI Panic newsletter.
[00:12:44] It's a very detailed write-up.
[00:12:46] Very detailed, yeah.
[00:12:47] As I'm reading through it, I'm like, wow, the organization that it takes
[00:12:51] to really kind of put this timeline together and kind of connect the dots and everything
[00:12:56] is something that Nirit is very, very good at.
[00:12:59] So I highly recommend anybody if you're not subscribed to the AI panic newsletter you should
[00:13:04] be.
[00:13:05] And if you do and when you do, you'll see other reports I'm sure there's a shorter political
[00:13:09] version of it.
[00:13:10] Yes.
[00:13:11] The headline is "the little-known AI group that got $665 million," so that gives you kind
[00:13:14] of the, what we call, CliffsNotes for this.
[00:13:18] Yeah, Shiba Inu.
[00:13:20] Who'd a thunk?
[00:13:21] Who'd a thought that Shiba Inu would change the world as it is?
[00:13:29] Yeah, where is Shiba Inu now actually?
[00:13:31] I'm super curious.
[00:13:34] This was the, I mean it was a total kind of meme group back in the day here.
[00:13:42] Sorry, I'm trying to do this on the fly.
[00:13:44] So I apologize.
[00:13:47] Well, I think the price today dropped 15%.
[00:13:53] That I know.
[00:13:54] Well, three.
[00:13:55] Okay.
[00:13:56] Yeah, April 3rd, but it's not giving it to me, it's so hard to read the price, because the price is 0.0020703.
[00:14:10] If you were one of those meme coin investors way back in the day and you didn't get scared
[00:14:16] by the total bust or the total drop in value that everything saw like a year, a year and
[00:14:23] a half ago, two years ago and you still held on to things.
[00:14:26] I don't know.
[00:14:27] Maybe you've recovered some of that with this but who'd a thought that as Nareed put
[00:14:31] it a shit coin like Shiba Inu and there are many others like it.
[00:14:36] This one in particular got a big boost because Elon Musk tweeted it and that kind of seemed
[00:14:42] to launch it into another stratosphere within the confines of what it is which seems to be
[00:14:49] a pretty insignificant crypto coin.
[00:14:51] Yeah, I don't know.
[00:14:52] The whole crypto thing but I totally lost, lost any sort of connection or ongoing interest
[00:14:59] in seeing where it's headed and then I looked at the Bitcoin price recently and was like
[00:15:03] holy moly, like is it beyond where it was when it was at its peak?
[00:15:07] If not, it's very close.
[00:15:08] It's crazy how these things go up.
[00:15:10] Here's what I wish for.
[00:15:11] Maybe somebody has done this, and if anybody in the audience has seen this, please send
[00:15:14] it to me: a family tree of
[00:15:16] the connections between crypto and AI doomerism.
[00:15:21] Yeah.
[00:15:22] Oh my goodness.
[00:15:23] Is there overlap there?
[00:15:25] I bet you absolutely there is.
[00:15:28] And I think we've mentioned this but it seems like a lot of the people who are big time
[00:15:33] pushing, promoting cryptocurrency a handful of years ago, kind of diverted their attention
[00:15:40] over to AI being the new kind of big exciting thing that has everybody's imaginations going
[00:15:46] and where there is that interest in that imagination.
[00:15:50] There is a lot of money riding on it, and there is, well, a lot of money opportunity if you know
[00:15:55] how to, air quotes, play the game, which feels like a big part of the cryptocurrency game.
[00:16:03] So anyways, that's a little bit of a divergence from AI but I think as you point out also
[00:16:09] kind of related.
[00:16:10] It's just we're starting to see how these relationships actually come into play.
[00:16:16] That's really interesting.
[00:16:18] And I was never fond of the crypto boys and so they're the last people I would put in
[00:16:22] charge of the world run by AI.
[00:16:24] Sure.
[00:16:25] Sure.
[00:16:26] Yeah.
[00:16:27] Interesting.
[00:16:28] So, were you fond of Artifact, the social news curation app?
[00:16:34] I feel a little guilty.
[00:16:36] I feel a little guilty Jason because artifact was a neat thing started by folks to do AI
[00:16:41] and news and I looked at it one day and I said oh this looks nice maybe I'll use this
[00:16:46] and I forgot about it completely forgot about it until the next thing I heard was that
[00:16:50] they were almost almost going to shut it down.
[00:16:52] Yeah.
[00:16:53] And a week ago said oh no actually once we got rid of our entire staff it's not that expensive
[00:16:58] to keep going.
[00:16:59] So we'll keep it going for now and then today there's new news Jason.
[00:17:04] Yeah, yeah, yeah.
[00:17:05] Well essentially so what happened in this timeline?
[00:17:08] This is an app that launched like not like a year ago maybe a little bit more than a
[00:17:13] year ago and I'm you know I installed the app and I was like okay this is kind of cool
[00:17:18] like I'm not entirely seeing the social graph component here yet like it took a while
[00:17:24] for me to kind of see how effective it was at recommending articles that I cared about.
[00:17:31] I'll be completely honest like when I used it it was hard for me to see immediately the
[00:17:35] difference between what I got out of using that app versus what I got out of using something
[00:17:39] like Google News and so I didn't really use it a lot like I tried to use it and tried
[00:17:46] to use it and then eventually I stopped trying to use it and then I think three months ago
[00:17:54] the founders, who by the way are the co-founders of Instagram, Mike Krieger and Kevin Systrom, essentially
[00:18:00] announced that artifact was going to be shuttered that there wasn't you know demand and that
[00:18:05] they were there didn't seem to be enough reason to continue the development of the app
[00:18:11] and the service all the employees pretty much left the building and then they started
[00:18:16] hearing from suitors and one of those suitors was Yahoo who ultimately beat out the others
[00:18:22] and are essentially acquiring artifact the technology not the employees they've already
[00:18:29] not the app either, really. They're not intending to use the app, they're intending to integrate
[00:18:34] the technology into Yahoo News, which, I haven't been, how long has it been since you were on Yahoo
[00:18:40] News? Yeah, yeah, I don't know, yeah. It's one of those legacy kind of things that's like, oh yeah,
[00:18:46] still kicking. I bet a lot of people use it though. I'll tell you this, yeah,
[00:18:50] when I go home to visit my my mom and you know there's always like a short little list of
[00:18:57] technology things that I need to help her with, it's just my sonly duties to come home
[00:19:04] and you know fix so be it for my mom when she has them and one of the things that I often have
[00:19:10] to do is I have to you know kind of like check take a look at her email which isn't a primary
[00:19:15] email, it's not like a Google or a Gmail or anything like that, it's, you know, some random
[00:19:20] email service, and when I go to that site to, like, log in, that's when I see things like Google News
[00:19:26] or Microsoft, you know, Bing News imported into their kind of interface, and it's all populated
[00:19:33] and graphical and colorful but those are the times that I interact with something like a Yahoo
[00:19:40] news not necessarily as a destination and that just makes me think that a lot of people probably
[00:19:44] do whether they're going to the Yahoo news site or not that information is being pulled into other
[00:19:50] places you know in interactive ways and people still I mean go there and yeah it was a destination
[00:19:56] for news and it's you know it's a list of news and okay and mail and such so maybe this will improve
[00:20:03] with them. But Yahoo of course gained a reputation for killing the babies it bought. Flickr, yeah,
[00:20:12] you know, it almost died under Yahoo. Other services, of course. Yahoo, who's now on its 57th owner
[00:20:18] since those days, but it's, this kind of, to me it's a zombie of the internet. Yeah, like AOL.
[00:20:26] Yeah, yeah, and they just keep going, together. Yep. They have plans to integrate this, like, recommendation engine
[00:20:33] technology into Yahoo News. Um, 185 million readers per month, by the way, with Yahoo News, so yeah, not
[00:20:42] insignificant. And Systrom and Krieger, they're gonna stay on with Yahoo through the transition as
[00:20:47] advisors then I imagine they're gonna bounce so there you go artifact the app not not continuing
[00:20:54] technology investors were made whole which is probably the most you could wish for yeah yeah
[00:21:00] this is interesting uh how AI intersects with fast food this is an article that you forwarded to me
[00:21:09] late last week and uh yeah tell me tell me why this caught your attention I'm curious well I am
[00:21:16] a regular, I must admit, Taco Bell customer. The new product, the three cheese flatbread, is pretty
[00:21:21] good, folks, okay? Spicy, it should be noted, a little saucy, it's good. Um, my normal order has been
[00:21:29] two bean burritos. That said, I like them, no onions. Yes, that's, it's good, it's a classic. So classic.
[00:21:34] It is, it is. In the old days I used to love the old Enchirito, but they brought it back and kind of
[00:21:38] ruined it. Anyway, yeah. Uh, so Taco Bell fascinates me because it was really a junky brand. Yeah.
[00:21:46] And through clever marketing they made it oddly hip, um, strangely kind of camp hip
[00:21:55] and fun. And I remember going to VidCon and they had really cool, um, T-shirts and other things
[00:22:03] there, and they're doing fun things on social media and so on. So now we have their head of,
[00:22:09] their chief digital and technology officer at Yum Brands, the parent company, say they're
[00:22:14] gonna go AI first. So what's just fascinating to me about this is how, yeah, how, so obviously they're
[00:22:20] gonna use, um, AI to understand demand and time and pricing and so on. Uh, who was it, it was,
[00:22:27] um, well, what chain was it, oh, it was Wendy's that said they were gonna do variable pricing,
[00:22:33] higher pricing in prime times, and people went, I didn't hear about that, no, they went
[00:22:38] berserk about this: that's ridiculous. And they said now they're not gonna do it. Now, if they had
[00:22:42] just said we may have lower prices in off times to drive demand, people would have said, really cool, great,
[00:22:48] right? They were stupid to frame it the opposite way. But I would imagine menu items, pricing, uh, marketing, regional
[00:22:54] and fun the AI is in everything now yeah so it's not that big a deal but it just amused me um
[00:23:01] and so we're gonna see whether we can get somebody from Taco Bell on the show
[00:23:06] to talk about this because maybe there's ways that we just can't imagine that they're doing this
[00:23:10] now obviously at the same time as California just enacted its $20 minimum
[00:23:16] um uh per hour wage scale in fast food restaurants in large chain restaurants put that way
[00:23:24] um much talk about how they're going to raise prices and much talk about how they're gonna further
[00:23:29] reduce staff but they've already been doing that if you go to new McDonald's not that I go
[00:23:33] there very often folks but I walk by them um you see them pushing you hard to order on the screen
[00:23:40] Mm-hmm, yeah, it's, uh, reduce the counter. It is interesting, when I walk in, you know, I don't go to a lot
[00:23:46] of fast food, but occasionally we go to Habit, it's my younger daughter's, like, favorite restaurant, so
[00:23:51] yeah, we go to Habit, and that's actually part of this as well, I think Habit is
[00:23:56] one of the Yum Brands. Chipotle is not, I'm sorry to say, Jeff. Um, but you go into some of these
[00:24:02] restaurants and, yeah, the first thing you see is this, like, row of kiosks, of screens, and as a
[00:24:08] customer I'm always like I don't know which one I should do like there's someone standing there
[00:24:14] like why not just walk up to them and and give my order to them or you know do it on the screen and
[00:24:21] the more people do it on the screen what signal does that send to the restaurant that now suddenly
[00:24:25] they don't need you know as many to hire as many people and I mean ultimately that's kind of
[00:24:30] you know, part of what we're talking about here. Joe Park is, uh, like you said, the chief digital
[00:24:34] and technology officer, and, you know, his vision here is that so many aspects of the fast food
[00:24:40] industry can benefit from generative AI you know they've got a they've got an app a super app
[00:24:48] for restaurant managers for managing their operations and it could have a chatbot in there that
[00:24:53] acts like a coach says this article on the Wall Street Journal how should I set this oven
[00:24:57] temperature as one example and um you know this kind of vision that the technology in the restaurant
[00:25:03] could be interfaced with and could be automated with the use of AI and chatbots in the future so instead
[00:25:10] of so like I remember back in the day when I got hired at you know this restaurant or whatever
[00:25:15] and you got the employee manual and you're supposed to read it and if you if you need to reference
[00:25:20] you know certain guides to understand the processes of what or whatever and this just strikes me as
[00:25:25] like one of those things that LLMs are really good at: it's like if you take a certain specific
[00:25:30] data set and then query the data set in natural language. And, uh, somebody coming into a new role
[00:25:37] that needs to know, how should I set this oven temperature, doesn't have to flip through a manual or,
[00:25:41] you know, bother someone else to figure it out. They just ask and they get the, yeah, and hopefully
[00:25:45] they get the right answer. Well, that's, that's an issue. Um, my wife teaches English as a second
[00:25:49] language, and, you know, a couple of her students work in McDonald's, I might mention, so you could
[00:25:53] imagine language would be useful for this as well. But the basic bottom line here is a confession:
[00:25:58] I'm amused by Taco Bell stories, and all you have to do is put out a press release now with AI in an
[00:26:03] unexpected place and, I'm a sucker, boom, yeah, you're gonna read it. And you might even take a trip
[00:26:09] to Taco Bell to pick up a meal to, like, complete the... I think I need to, for research, eat it while reading
[00:26:14] it. Yes, yeah, I think so. Okay, I forgive you for that. Um, all right. And I did, by the way, I did
[00:26:23] reach out uh to Joe uh on LinkedIn who the heck knows if Joe's gonna see it and and respond but um
[00:26:31] I'd love to get Joe on and talk about this stuff to just speculate about what it could do absolutely
[00:26:36] I think that would be a really fun conversation. So if anyone watching or listening knows Joe Park,
[00:26:41] or the PR people for Taco Bell, yeah, there are many, put in a good word for us, we'd love to have them.
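To make the "query the manual in natural language" idea Jason describes above a bit more concrete, here is a minimal, hypothetical sketch. It assumes the OpenAI Python client; the model name, manual excerpt, and question are invented for illustration and have nothing to do with Yum Brands' actual "super app".

```python
# Hypothetical sketch of answering an employee's question from an operations-manual
# excerpt, instead of making them flip through the printed version. Everything
# below (manual text, question, model name) is made up for illustration.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

MANUAL_EXCERPT = """
Ovens: preheat to 350 F for flatbreads; 425 F for crisping shells.
Fryers: hold oil at 365 F; filter oil at close of business.
"""

question = "How should I set the oven temperature for flatbreads?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any chat-capable model would do
    messages=[
        {"role": "system", "content": "Answer only from the manual excerpt provided."},
        {"role": "user", "content": f"Manual excerpt:\n{MANUAL_EXCERPT}\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)
```

A real deployment would first retrieve the relevant pages from the full manual, but the basic pattern of grounding the answer in a specific, vetted document is the same, and it is why this use case fits LLMs better than open-ended legal or medical advice.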
[00:26:48] all right we've got some more news and uh a little bit later I'm gonna do a quick kind of hands-on
[00:26:53] with an update to Stability AI's Stable Audio, um, uh, yes, Stable Audio 2.0, uh, music generation,
[00:27:03] because I came up with some really interesting stuff there, so I'm gonna show that off. But we've got
[00:27:08] more news in a second normally being a little extra can be a bit much but when it comes to health care
[00:27:17] it pays to be extra and United Healthcare makes it easy with Health Protector Guard fixed
[00:27:22] indemnity insurance plans underwritten by Golden Rule Insurance Company they supplement your
[00:27:26] primary plan helping you manage out of pocket costs without the usual requirements and restrictions
[00:27:31] like deductibles and enrollment periods so when it comes to covering your medical bills you can
[00:27:36] feel good about being a little extra. Visit uhone.com to find the Health Protector Guard plan for you.
[00:27:42] Ever catch yourself eating the same flavorless dinner three days in a row, dreaming of something better?
[00:27:47] Well, HelloFresh is your guilt-free dream come true, baby. It's me, Keke Palmer. Let's wake up those
[00:27:53] taste buds with hot juicy pecan-crusted chicken or garlic butter shrimp scampi. Um, HelloFresh.
[00:28:02] Stop dreaming of all the delicious possibilities and dig in at hellofresh.com.
[00:28:07] let's get this dinner party started
[00:28:16] All right, The Information has sources saying that Microsoft and OpenAI are in talks to build a massive
[00:28:22] supercomputer to the tune of one hundred billion dollars of course this is not guaranteed to happen
[00:28:29] at this point these are sources saying that this is kind of like percolating at the moment but
[00:28:34] if this were to happen, that expense alone would be more than triple Microsoft's spend last year
[00:28:41] on servers, buildings, and other equipment, well, uh, to build this massive supercomputer that they're
[00:28:47] calling Stargate. Yeah, you've got to go with the sci-fi names. Yeah, yeah, kind of apt. I kind of
[00:28:53] like that, uh, that Anthropic name for its AI, just Claude. Claude, yeah. Joe. I'm Joe, just talk to Joe.
[00:29:03] We could trust Joe, right? We could trust Claude. Um, Stargate would power OpenAI's next generation of
[00:29:09] AI systems millions of AI accelerators estimates show it would be one of the largest most advanced
[00:29:17] data centers in the world because we're talking like this would be a massive scale several hundred acres
[00:29:23] Here's, here's a bunch of language that, uh, is sort of Greek to me but sure sounds like a lot, so I'll
[00:29:28] just kind of read it verbatim: five gigawatts, accommodating, uh, 40,000 Nvidia DGX GB200 NVL72
[00:29:37] rack systems. You know those? Wait, wait, how many rack systems? 40,000 rack systems, 2.88 million
[00:29:46] Blackwell GPUs within them. So when I watched the Nvidia keynote with Jensen Huang, yeah,
[00:29:58] three or four weeks ago now, and I would imagine maybe they were doing this visualization for Microsoft,
[00:30:04] but they showed all the power of the Blackwell, right? Then there's two Blackwells on a board,
[00:30:09] and then all these boards in a rack, and all these racks in a row, and all these rows, and it
[00:30:14] showed it just going on and on and on and on, and that took my breath away, yeah, in terms of the computing
[00:30:20] power. And this is it, yeah, this is what they want to build. But I still haven't seen it, and this is
[00:30:26] always when I quote the stochastic parrots paper saying that size doesn't matter, bigger can be
[00:30:31] worse for all kinds of reasons. What I haven't heard, really, is, it's gonna sound odd, but the use case
[00:30:39] for such overwhelming compute power, as they say, how it's actually used, other than saying, oh, we can,
[00:30:47] you know, do more cool things with AI. Yeah, but do we really need it for a business use case,
[00:30:53] for a cultural use case? Uh, I don't know. Yeah, right, what exactly does that translate into?
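For what it's worth, the headline GPU figure Jason read out follows directly from the rack count: an NVL72 rack holds 72 Blackwell GPUs. A quick back-of-the-envelope check (the figures come from The Information's reporting quoted above, not from any confirmed Microsoft or OpenAI spec):

```python
# Back-of-the-envelope check on the "Stargate" numbers quoted above.
racks = 40_000        # Nvidia DGX GB200 NVL72 rack systems, per the reporting
gpus_per_rack = 72    # an NVL72 rack holds 72 Blackwell GPUs
print(f"{racks * gpus_per_rack:,} GPUs")  # 2,880,000, i.e. the ~2.88 million quoted
```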
[00:30:59] are you building are you building something like this towards a perception that there will be a
[00:31:06] need for this because we're talking 2028 this would be ready to go in 2028 you know so we're
[00:31:11] it's a little ways out there are some interim steps along the way it would be done in stages there
[00:31:16] would be a smaller-scope supercomputer launching in 2026 on the way to the full-on Stargate. Um, yeah, but
[00:31:25] like what is what is the application of this at the end of the day how does this get put to use
[00:31:31] is it just oh we have a really a really powerful chat GPT now you know and is that even enough
[00:31:37] five years from now you know to justify any of this I mean I imagine it goes much further than that
[00:31:42] But, you know, as we're talking about this, I'm recalling the fiber bubble that occurred,
[00:31:49] that there was a huge overbuild of fiber in the country, and that's really what enabled the internet
[00:31:54] to grow at a lower cost, because it was, it was dark fiber then, it wasn't used.
[00:31:58] The forecasts before the first crash in 2000 were wrong, there was overinvestment, it had to go
[00:32:05] somewhere. I wonder whether there's going to be a similar compute bubble and overinvestment in computing
[00:32:11] power where the business isn't really there but there could be dividends then is that oh well
[00:32:16] hell you want to get on a super computer and do your stupid little thing okay we got we got
[00:32:21] capacity huh we got the supercomputer put it to use yeah right like three years from now if
[00:32:26] this actually goes into place, three years from now is the landscape as warm, inviting, and
[00:32:34] welcoming to, you know, the existence of these generative AI systems as it is
[00:32:41] now? Three years from now, as many people have been kind of saying, like, oh, I don't know if the
[00:32:47] demand's really there, you know, like, we've been really hot about this stuff, but maybe we're super
[00:32:52] overhyped and in a couple of years we're going to realize, you know, we need to correct around
[00:32:57] this. And if that's the case, what happens to the Stargates of the world that are in the midst of
[00:33:04] being built? Uh, you know, our mutual friend Leo Laporte, there's been kind of a running joke with
[00:33:10] Leo and Paris and me on TWiG, where Leo, a year and a half ago, was, yeah, it's a parlor trick, it's
[00:33:16] not going anywhere, total. And then he took a walk on the beach with somebody, we keep on wondering
[00:33:20] who it is, was it Sam Altman, was it Peter T., who could it have been, um, who, uh, convinced him that
[00:33:26] no no no this is going to be huge and there's all kinds of things you can do with it and so the joke
[00:33:30] is now that he's a full effective accelerationist okay um and so that's the better out that's
[00:33:37] the bet that's going on right yeah is is is this a parlor trick or is this um the train you want
[00:33:43] to be sure to be on yeah we don't know yet yeah yeah that is interesting I do remember yeah producing
[00:33:49] behind the scenes when when Leo would talk about the parlor trick and there were I will admit there
[00:33:53] were a few times where I was in my office you know producing and headphones saying out loud uh I don't
[00:33:59] I don't agree I think you're wrong on this Leo I think you're right on a lot of things I think
[00:34:04] you're wrong on this and yeah and I was I was kind of you know interested to see when he started to
[00:34:10] come around on that, and, um, you know, I think that's telling. Yeah, well, certainly a huge
[00:34:16] investment, um, speaks loudly, but, yeah, it can also go bad. Yeah, indeed. So we'll see. You shared with me, um,
[00:34:25] also this past week an interesting article on Harvard Business Review how people are really using
[00:34:32] generative AI. This is, um, a report done by Filtered Technologies, who basically mined the web
[00:34:39] and, you know, online communities like Reddit and all these different places, pulling a lot of
[00:34:43] data to look for patterns in how people are most using generative AI right now.
[00:34:52] And, uh, what stood out to you? Well, Jason, what I thought was interesting to talk about, because
[00:34:57] so they, they, it was an interesting methodology, that they went and found out what people were
[00:35:01] talking about and saying they're doing with it, which I think says more than all these surveys of
[00:35:05] CEOs saying they're going to use it for this and that, CEOs saying they're going to lay off people. So this
[00:35:09] became a skosh more real. Um, they put it into categories: uh, people using it for technical assistance
[00:35:16] and troubleshooting, 23%; content creation and editing, 22%; personal and professional support, those
[00:35:23] are two very different things, 17%; learning and education, 15%; creativity and recreation, which I think
[00:35:29] is the unsung hero here, 13%; and then research, analysis, and, uh, oh, decision making, 10%.
[00:35:36] Yeah, so that was interesting in itself, and they looked at various, you know, what are people doing,
[00:35:40] generating ideas and so on and so forth. When I went down to this huge chart they had, that's what
[00:35:45] fascinated me, because they have, um, a hundred different uses that they identified, and some of them
[00:35:52] I was curious to go through with you, just to pick some out at random. Some are sensible and good,
[00:35:57] and some are, um, foolhardy, put it that way. Yeah. So, uh, generating ideas, okay, as long as you're
[00:36:08] willing to know that this comes from those who had the power to publish all these many years and
[00:36:12] it brings biases and limitations but if it helps you think better okay that's number one kicks
[00:36:17] kicks off a process of some sort I I know yeah from people I talk to that's a big part of how
[00:36:22] they're using it is generating ideas, kind of a starting point, right? I'm not using it for
[00:36:28] that yet, uh, you know, I'm trying, well, I haven't usually yet. But number two, number two,
[00:36:32] I don't think this is by popularity or anything, I could be wrong, but number two is therapy
[00:36:37] and companionship that one doesn't line up for me no it doesn't either uh and it's kind of sad
[00:36:44] I can understand why people might think of these systems in that way but I don't know like
[00:36:53] I don't know if it's if I would call it a sixth sense necessarily or what but when I know I'm
[00:36:59] interacting with a computer I you know it could be said it could be telling me the same thing
[00:37:04] that a human would tell me but once I know that it's a computer it's an algorithm it's an AI
[00:37:10] and not an actual human I automatically it's like that humanity part of it is gone and it means
[00:37:18] less to me as a result and I think there's so much of therapy and especially companionship at least
[00:37:24] how I define it requires the humanity part to know that I'm being heard by another human or
[00:37:31] having a conversation with another human and we're on the same wavelength instead of just being
[00:37:35] informed by by data and numbers right and you know I defend the internet I have a book coming out
[00:37:41] in the fall called The Web We Weave, which defends the freedoms of the internet and the value of the
[00:37:45] internet and also examines the problems of the internet. And I tend to mock those, including
[00:37:52] Jonathan Haidt right now, who say, oh, the phone is taking away our humanity and doing bad things to
[00:37:56] young people and so on and so forth oftentimes our young people are using the phone to talk to other
[00:38:00] people to their friends they're more friendly with it however caveat to that is your internet
[00:38:06] friends aren't really like your real life friends they can be fair weather they can come and go
[00:38:12] they can use this in odd ways we haven't really worked out our norms there so you know there's
[00:38:20] a gradation there. But if you find friendship in the machine, I just, maybe I'm being an old fart,
[00:38:29] maybe the movie Her just disturbed me, but I think that's a little worrisome. And I'm not, Lord
[00:38:35] knows I'm no moral panicker about this, but I find that a little worrisome. Yeah, and I mean, and
[00:38:40] I also realize as we're talking about this I feel a little hypocritical because it was just a couple
[00:38:44] of weeks ago that we talked about the the doll that had AI in it that was being a companion for
[00:38:51] you know people who felt very lonely and I think what it is what what I'm realizing is right
[00:38:55] that's very important for me yes yes and maybe there's some somewhere down the line where I find
[00:39:00] myself in that situation and I get it but but for me personally I don't get it and I'm not saying
[00:39:06] that it doesn't work for other people but I can't agree more all right so let's pick out pick out a
[00:39:09] few at random there. Just, yeah, let's see. Number three, specific search. Well, it
[00:39:17] doesn't work well for search. Now, this is, I had an argument last week with Leo where he
[00:39:21] was trying to argue it does, and I still think it doesn't, I still think that it's not
[00:39:26] terribly good for that. I heard, I heard that conversation. Yeah. Yeah, well, I mean, when I think about
[00:39:33] that I think about my use of perplexity which you know kind of the selling point of perplexity is
[00:39:39] it it's it's kind of like if you or at least the way I've been using it is sometimes I would turn to
[00:39:45] a search engine to make a comparison or to research a comparison between two things instead
[00:39:51] lately I've been going to perplexity and asking querying it and so kind of using it like a search
[00:39:57] engine and coming up with I think useful useful avenues with which to explore the answers in
[00:40:04] different ways it can I think I think it's more of that I think it's more more insight and inspiration
[00:40:10] fun and nonsense number six on this list yes that's what the best use is because it can't you
[00:40:16] can't be disappointed that way you can't be disappointed yeah really stupid jokes but you're not
[00:40:20] going to be led astray in that if you just say this is a toy it's fun what can I do with that how can
[00:40:26] I create, that's what I want to see people doing more with. Um, meeting summaries, a lot of people are
[00:40:32] using it for that. And, I have not done that because I don't have to go to as many meetings
[00:40:36] anymore because I'm now emeritus. Let me just say, on that front, if I can real quick,
[00:40:42] please. I've been playing with the Galaxy S24 Ultra, and it has, you know, Galaxy AI
[00:40:47] features, and I had an experience where I recorded a meeting with it to test that feature out, and
[00:40:54] it this summary was almost immediate and it was incredibly helpful it was like oh my goodness that's
[00:40:59] exactly what I wanted out of this experience so yeah it's I mean in my in my limited experience with
[00:41:06] this and I want to do more with it it's very useful for that especially for my like scatterbrain um
[00:41:12] kind of mind which sometimes like I can have a meeting and if it's not recorded you know even though
[00:41:17] I was fully present there half of the actual detail is going to be lost on me because it just
[00:41:23] I can't have a hard time retaining it and for me you know a business meeting where it's pretty
[00:41:27] straightforward, I can see that being really useful. Yeah. What I wonder about is, you know, having
[00:41:31] been in my day in faculty meetings, um, there's a subtlety that occurs under the surface.
[00:41:38] oh why are they saying that oh why wasn't this raised yeah uh oh they're agreeing with that
[00:41:44] at them those two are agreeing oh that's a new alliance right and it's never going to pick up
[00:41:50] on the politics, yeah, and the, um, manipulation that can occur in meetings. And so I wonder whether,
[00:41:58] right, yeah, exactly. And no, no, what I was gonna say is, until we're at the point where
[00:42:04] our co-workers are also robots and they're part of the mix, and then they can pick up on those
[00:42:08] relationships, and that's fine. So, we could go on with a hundred others if we want. I know.
[00:42:13] I just found this interesting to think about: Dungeons and Dragons. A safe space to ask questions.
[00:42:20] Medical advice. Again, medical advice, legal research, I mean, these are things that, like, if you're
[00:42:26] using it, you better be using it with, like, a strong asterisk before and after, that is, like,
[00:42:32] you know, also see a real doctor. Um, fact-checking, yeah. Um, replying to emails, okay, but
[00:42:41] beware, make sure you read what it says. Explaining legalese, well, okay, but don't sign any contracts on
[00:42:47] that basis. Yeah. So I found this a really interesting way to start thinking about how people might use
[00:42:53] the literate machine that can speak our language and and I want to kind of keep this
[00:42:58] chart around as I hear about people uh using this and the fact that this came from people's actual
[00:43:04] usage, I thought it was really good work on the Harvard Business Review's part, and let's see whose
[00:43:08] byline is here so we can give credit: Marc Zao-Sanders. Yeah, really interesting stuff. Um, and
[00:43:16] actually, when I look at this list of all the ways that it's being used, I'm like, wow, that's
[00:43:21] a pretty amazing technology that can, you know, to some degree do, you know, a lot of those
[00:43:26] things, maybe some of them better than others and some of them not well at all, but still, people
[00:43:30] are finding uses in all these ways for, you know, some of this new stuff. And sure, we had
[00:43:37] bits and pieces and different examples of this prior to the last couple of years, but it really
[00:43:42] seems like the the barrier to entry is lower now yeah absolutely people are more inclined to
[00:43:49] experiment with these things and think about them when they're looking for you know possibilities
[00:43:54] or solutions or ideas or whatever. You know, the other moral of the story, I think, is that though
[00:43:59] I think that AGI is BS, we're not going to get artificial general intelligence, we're not going
[00:44:04] to get a machine smart as us that's the wrong goal but this is a general machine looking at this
[00:44:10] list of a hundred things that you can make it do with greater or lesser reliability and quality
[00:44:14] it still says wow you can ask it to do lots of things because it has this corpus of human speech
[00:44:22] and that's still impressive. Yeah, yeah, very impressive. Well, good stuff, yeah, people should definitely
[00:44:28] check that out. We were talking about law and AI chatbots, yeah, the on-the-other-hand story.
[00:44:34] Yeah, this is the good side and now we're going to go to the bad, sorry. New York City launched a chatbot
[00:44:40] last year for giving New Yorkers information about starting a business in the city managing
[00:44:46] a business in the city, powered by Microsoft's Azure AI services. Unsurprisingly enough, the chatbot
[00:44:53] offered incorrect answers to things. Like, the article points out as one of
[00:44:58] the examples: do I have to accept tenants on rental assistance? In New York City, landlords cannot
[00:45:04] discriminate by source of income yet the chatbot reported that no landlords don't need to accept
[00:45:09] these tenants, which is very wrong. The City, which is, the site, THE CITY, which is a confusing
[00:45:19] brand because it's the city of New York, right, thecity.nyc, which my former dean Sarah Bartlett
[00:45:23] helped start to fill the news desert that is New York City, believe it or not, there we go, they did
[00:45:29] this work with The Markup. Yeah, okay, they did this work with The Markup. They have a list of
[00:45:33] examples there where and you know it's not just a couple of things do I have to inform staff
[00:45:38] about schedule changes? The machine said no, no specific regulations or requirements mandate
[00:45:43] informing staff. The actual answer is, for many workers, particularly in retail and fast food, bosses are
[00:45:48] required to provide significant notice of schedule changes. Can I keep my funeral home pricing
[00:45:54] confidential? Yeah, says the machine, you can keep it confidential. The Federal Trade Commission has
[00:45:59] outlawed concealing funeral prices. Yeah, these are... Can I make my store cashless? Yes, you can make
[00:46:05] your store cashless. In New York City, no, stores have been required to accept cash as payment since 2020.
[00:46:12] So it's just a case of getting on the train to say, oh, look, we have a chatbot. Yeah,
[00:46:22] and it's gonna do wrong by the people it's meant to serve. Yeah. Um, and, I'm not a fan
[00:46:28] of the current mayor of New York, uh, Mayor Adams, um, and I think it was an effort
[00:46:34] by his administration to look hip and, yeah, to stay current and stay connected to the next thing.
[00:46:42] Huge mistake. Yeah, for sure. And it's good
[00:46:45] reporting to go play with it and find out what it's doing right and wrong in that case.
[00:46:49] you know I've I've objected to other cases where people say I got it to say a wrong answer well
[00:46:54] yeah sure we know that um but in this case where it matters to go find out um I think that it's
[00:47:02] good reporting yeah indeed and I think we know just real quick one one other thing they pointed
[00:47:07] out is that the same question asked by different people would generate different answers if
[00:47:14] asked enough times, and that's a pretty big, kind of critical thing. At least if it were
[00:47:19] reliably the same bad answer, you could go in there potentially, I mean, you know, theoretically speaking,
[00:47:24] you could go in there and put in a correction for it or something like that,
[00:47:28] but if it's if it's I don't know what it's doing what is it interpreting things differently
[00:47:32] depending on the the the verbiage of the question even though it's really about the same
[00:47:36] I don't know there's that's one of the one of the mysteries about about generative AI
[00:47:40] is it does give different answers different times is that because
[00:47:44] it responds to what people have been asking is it right is it that random um I don't mean
[00:47:49] it's not random but when it makes a decision based on this is the next word
[00:47:57] I could have asked the question exactly the same way, but it could also be part of a sequence of
[00:48:01] questions, which could affect what it thinks the next word is, or the way I ask the question
[00:48:06] could affect what it thinks the next word is. And again, "think" is the wrong verb here, but
[00:48:10] right, bear with me. Um, so that requires some work to see what's different. And in that Meta project
[00:48:16] we mentioned at the top of the show, one of the questions they asked is whether or not
[00:48:20] chatbots should be consistent in their answers, which is not an issue that I'd really
[00:48:24] thought of before, but for exactly this reason. Totally, yeah. Okay, how would you test it otherwise?
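On the "how would you test it otherwise" question, the bluntest probe of consistency is simply to ask a chatbot the same question many times and compare the answers. A minimal, hypothetical sketch follows; it assumes the OpenAI Python client and a placeholder model name, and it is not The Markup's or THE CITY's actual methodology.

```python
# Hypothetical consistency probe: ask the same question several times and see
# whether the answers agree. Non-zero sampling temperature is one source of
# run-to-run variation; phrasing and conversation history are others.
from collections import Counter
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment
QUESTION = ("Do New York City landlords have to accept tenants on rental assistance? "
            "Answer yes or no.")

answers = []
for _ in range(5):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": QUESTION}],
        temperature=1.0,
    )
    answers.append(response.choices[0].message.content.strip().lower())

print(Counter(answers))  # a 5-0 split suggests consistency; a mix is the problem discussed above
```

Getting the same answer five times still says nothing about whether that answer is correct, which is the other half of the problem the hosts describe.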
[00:48:29] yeah absolutely um yeah I think reading through this again same same as you it made me realize
[00:48:35] oh that consistency part that's actually incredibly important I mean you know again it goes back to
[00:48:40] can you trust what you're using it for and if it's not consistently giving you answers in either
[00:48:46] direction, then to a certain degree it is kind of random, really, and, you know, trust and
[00:48:52] randomness, those two things don't travel together. Your friends give you plenty of that, you don't need a
[00:48:58] chatbot for it. That's right. And, uh, finally, Intelligencer has an article about a creator named
[00:49:06] Ariel, who has become a poster child of generative AI in online advertising. Just
[00:49:15] a little bit about her story: she offered her video marketing services online through Fiverr, through Upwork,
[00:49:20] and a few other different marketplaces, and a French digital marketing company called Arcads
[00:49:25] reached out to her and asked her if she would take, uh, take part in an AI tech campaign. She was
[00:49:32] given a few prompts, asked to, quote, talk as long as you can. They kind of came back and said, can you,
[00:49:38] you know, talk in different locations, and everything, all of this in an effort to kind of, like, fill up
[00:49:43] some sort of AI system to create a virtual version of her. And now one of the generated videos has
[00:49:51] gone viral. She's in a car, ranting to her phone, um, to an audience, about body odor and deodorant
[00:50:00] here and I can show this but I do need technically I need to change my settings in order to do
[00:50:07] this so give me one second and I will share I wonder what she was told about the project what was
[00:50:14] disclosed, and how it could be used. Because it's one of those, for a man, I'll say, that you
[00:50:19] go do a model picture and suddenly you end up in an erectile dysfunction ad, right? Uh, you'd like
[00:50:25] to know how it might be used but you share your your your rights to your image and doing these kinds
[00:50:30] of things, so I suspect she signed it away. I don't know what the compensation was. Yeah, I wonder how much
[00:50:35] visibility you have in that, and, um, if that's not part of the contracts that folks
[00:50:41] like Ariel are signing, it probably should be, maybe make that abundantly clear. Like, I don't know how
[00:50:47] she's responding to this let me see if this is gonna actually play the audio I hope that you can hear it
[00:50:50] so let me get this straight you guys are telling me that when you're out of the house for hours
[00:50:55] you're comfortable walking around with all that stinky body odor that's been building up on
[00:50:59] you all day? And this is AI-generated, by the way, some of you, she didn't actually say this.
[00:51:04] Deodorant on top of that odor, which is honestly probably making it worse. Let me tell you what I do.
[00:51:10] It's a little hygiene lesson. It's pretty convincing looking. I will say, though, her head's
[00:51:14] a little noddy. Yeah, I mean, very much like staring into your soul, you know, her eyes are very
[00:51:20] much like staring deep into yours. She's not blinking, they're not blinking. No, yeah. And the audio,
[00:51:28] the more I listen to it, the more it has that kind of, I couldn't even, like, describe it if I wanted
[00:51:33] to, it's kind of glassy, kind of, like, breathy, I don't know, it kind of, anyways, it sounds, the more I
[00:51:40] listen to it, the more it sounds AI. But all things considered, that's pretty darn impressive. It is
[00:51:45] pretty impressive. The thing that it nailed, you know, that genre I completely hate, is the Lume
[00:51:51] lady who convinces women that they smell bad and tries to sell stuff and now this poor woman was
[00:51:56] used to become a lumely lady in essence I don't know what brand it was um and it's interesting so
[00:52:01] you know, you worry about stock photos and how they could be used, you worry about stock video and how it
[00:52:06] could be used, but now you've got stock identity and how it could be used, which is fascinating and,
[00:52:14] I mean, technologically cool. And it's not the technology that's the problem, it's how it's used.
[00:52:19] And, you know, this is an area where there can be regulation: that if you sign your identity over
[00:52:24] to be used, the contract has to be drawn in a way that you have some say over the use of your
[00:52:33] likeness. But again, same thing with stock photos, same thing with stock video. Totally. And now, I mean,
[00:52:38] now it's made up, with you saying it. Yes, exactly. Now it's... yes. Is there a difference
[00:52:46] between the perception of reality and what actually happens, when there you are saying these things
[00:52:53] and you truly don't believe what you're saying? Yeah, it really comes down to the contract, right?
[00:52:57] Like, if you're doing this sort of work, this is illustrating the need for a contract
[00:53:04] that is up to date with how these things are being used at this moment, and how they will be used more, because
[00:53:10] I'm sure this is not the only time we're going to see this sort of thing. Yeah, and I mean, we've
[00:53:15] heard about this also, you know, in regards to actors and actresses in Hollywood;
[00:53:21] you know, that was part of the strike that happened months ago. I think that was part of
[00:53:26] the agreements that they were coming up with, that there had to be kind
[00:53:30] of rules and guardrails around how that likeness is used. And one of the stories
[00:53:36] in the rundown today was that OpenAI says that it has a new
[00:53:41] generative model that can take a very short sample of your own speech and then create more speech,
[00:53:47] so this has people freaked out, and they're out there saying, don't
[00:53:54] you dare use this in this way, and so we need to have norms and we need to have standards for how
[00:54:01] things get used. Yeah, interesting to see it all develop. Speaking of things developing: I
[00:54:10] saw right before the show, I saw the news anyway, I don't know if it launched
[00:54:17] today or a few days ago, but Stability AI launched Stable Audio 2.0. This is
[00:54:24] its audio generation model. It used to create 90-second clips and you had to be a paying member,
[00:54:33] but now it creates up to three minutes of music. You
[00:54:38] give it words describing the music that you want it to create, and it generates something up to
[00:54:44] three minutes long. You get 20 credits per month for free, and I tested a few of these, and what
[00:54:50] I realized in using it is that the generations actually take two credits each,
[00:54:57] which I think is a little misleading. It must be a duration thing; I bet if I shortened
[00:55:02] the duration of the clip, it would use one versus two. But you do also have the ability to
[00:55:09] upload your own source audio, given that the source audio is copyright free; they do require
[00:55:20] that. Sorry, I realized I'm not signed in, so I'm signing in right now, and then I can do a
[00:55:26] little bit of show and tell here, because I think it's really interesting and I've got some
[00:55:31] thoughts and some examples to show it off. It's trained on data from
[00:55:36] AudioSparx, which has a library of 800,000 audio files, and the owners on AudioSparx had the ability to opt out
[00:55:42] of the training, so that gives you a little bit of a sense of where this comes from.
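To make the text-in, audio-out workflow concrete, here is a minimal sketch of what a text-to-music generation call could look like in code. The endpoint URL, parameter names, and environment variable are assumptions for illustration only, not Stability AI's documented API, so check the official docs before relying on any of it.

```python
# Illustrative sketch only: the endpoint URL, parameter names, and environment
# variable are assumptions, not Stability AI's documented API.
import os
import requests

API_KEY = os.environ["STABILITY_API_KEY"]                    # hypothetical env var
ENDPOINT = "https://api.stability.ai/v2beta/audio/generate"  # hypothetical endpoint

prompt = (
    "Fusion of pop punk energy with massive contemporary pop production, "
    "anthemic yet modern and groove-driven, dynamic changes across sections"
)

resp = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}", "Accept": "audio/*"},
    data={
        "prompt": prompt,   # text description of the music you want
        "duration": 180,    # up to three minutes, per the discussion above
    },
    timeout=300,
)
resp.raise_for_status()

with open("generated_track.mp3", "wb") as f:
    f.write(resp.content)   # save the returned audio bytes
```

Presumably a shorter duration would map to a smaller credit cost, which would line up with the one-versus-two-credit behavior described above.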
[00:55:48] The first clip that I did earlier this morning, let's see here, does it give me the... yeah, so
[00:55:56] the prompt that I gave it is: the overall vibe should be a fusion of pop punk energy with massive
[00:56:00] contemporary pop production, anthemic yet modern and groove-driven; the different sections should
[00:56:05] ebb and flow with dynamic changes to keep interest over the three-minute duration. That's what I
[00:56:10] asked it to do. Let's see what it came back with. I'll play it, hopefully.
[00:56:20] Is there volume? You're the musician, I'm not.
[00:56:24] Yeah, I mean, let's see here.
[00:56:32] Whoo, that's super poppy. Oh yeah. Okay, so, you know, it kind of has little movements
[00:56:41] and everything like that. I think what's interesting to me, where we're at right now with this, is that
[00:56:46] the music AI is starting to get a better sense of structure, because I feel like really early
[00:56:54] versions of this stuff felt a lot more random. You had these weird, kind of discordant sounds
[00:57:01] paired up with each other, and there was no... like, there might be a 4/4 beat,
[00:57:06] you know, that we're very used to hearing here in the Western
[00:57:12] musical landscape, but you wouldn't have a section that would be dedicated to a verse,
[00:57:18] and then something that's a little different for a chorus, and then come back to the verse again
[00:57:22] and, you know, revisit the verse. And in my playing around with it, it seems like they're
[00:57:27] getting better at understanding the structure of the music it's trained on, anyway. And
[00:57:34] I do think that that's an advancement. Is this a song that I would listen to?
[00:57:39] I mean, no. It's got kind of a stock music quality to me. See, now we're in kind of a
[00:57:48] noisier part. It has those movements, and it's not perfect, it's not something that I
[00:57:55] would choose to listen to on a daily basis, but I do think it's going somewhere. There was
[00:58:00] another thing that I did. I was like, okay, well, how does it do with straight-up
[00:58:05] electronic music? Because in my mind, electronic music, like house and techno and EDM
[00:58:10] and all that kind of stuff, is almost robotic music in essence, because it was created
[00:58:17] with electronics, and so much of it is designed around the rigidity of the digital
[00:58:24] machine, and so maybe it speaks that language a little better. So this is like a techno
[00:58:30] version, and I'm gonna skip a little bit ahead in it.
[00:58:36] And I mean, you know, I certainly had my days of going dancing and stuff.
[00:58:42] I would totally hear this on the dance floor. You know what I mean?
[00:58:50] Okay, okay. So file that little sound that you just heard away. When I heard that
[00:58:57] little loop here, I'll play it again, which might sound like noise to a lot of people, but
[00:59:04] if you like techno and you like electronic music, you know, it might actually sound like something
[00:59:09] similar to what you listen to. I heard that and I kind of wondered what would happen if I
[00:59:14] took the product of the machine, brought it into my own system, and turned it into
[00:59:23] something that was usable, in my mind, as a kind of remix sort of thing.
[00:59:29] And so tell me if this sounds any better.
[00:59:36] Suddenly we've got the robotic loop, which is kind of noisy and random and whatever,
[00:59:43] and you throw a little bit of the human approach on top, which was the drums and the drum loop
[00:59:48] and everything; it took me like 10 minutes to put together. Like, I would totally base a song around
[00:59:53] that. So, tools like this, from a creativity standpoint, what we were talking about
[00:59:57] earlier as far as how people are using these tools: this is where I start to
[01:00:02] get goose pimples, where instead of me searching and searching for hours trying to find a
[01:00:08] hook that I like, or going onto a music library and finding something that catches my ear,
[01:00:14] I can go into Stable Audio and come up with something that is, I'm gonna put it in air quotes,
[01:00:19] "original," because it's based on something else, but the end
[01:00:25] result is something that no one else has. You know, I don't think that you could get this system
[01:00:29] to create that loop again. I think that's kind of compelling. As a musician, I find
[01:00:36] that compelling. Well, as always in these cases, and I've said it earlier in today's show, the more this stuff
[01:00:42] is used as a tool for creativity in our hands, the happier I am, the more benefit we get out of it,
[01:00:49] the safer it is, though that's the wrong word; the more secure we can be in what it does
[01:00:58] and how we use it. At some point I hope we can have on a guy named Lev Manovich, I think I've mentioned
[01:01:02] him in the past, who does a lot of work, I think at the CUNY Graduate Center, on art and AI, art and technology,
[01:01:12] and how people use it. And I think that it is just a tool, that's all it is, and when it's used
[01:01:18] that way, that's really appealing. And so, yeah, I don't think the music is so good, but I'll be
[01:01:25] curious, as you play with this, Jason, whether, with your musical knowledge, prompting can improve it,
[01:01:33] same as I don't know how to do with... I did some funny stuff. I'm giving a talk tomorrow in
[01:01:38] Washington at the IAPP, which stands for something about privacy professionals, and I asked, I should
[01:01:46] have sent these images to you, but I'll send them to you for next week, I asked a couple of the
[01:01:52] generative AIs to come up with the literate machine, or the internet that's connected,
[01:01:57] or things like that, and there are some pretty funny images, but I don't know how to instruct it well,
[01:02:03] in the language that people would use around drawing and illustration and art, to get what I want,
[01:02:11] you know, the way you do around music. So I'd be interested to see you really play with this and try to get
[01:02:16] it to produce something you really think is good, as opposed to interesting for a machine.
[01:02:22] Yeah, interesting for a machine. I mean, that little clip, like, I could totally
[01:02:27] get behind that. I will fully admit, though, how did I come up with the prompt? I was like,
[01:02:33] where do I even begin with the prompt on this? I went into Perplexity and I said, hey, create
[01:02:38] me a short prompt, separated by commas, that is a house music track that has some ethereal
[01:02:45] kind of energy and pads. So I gave it a starting prompt idea, and then I asked Perplexity
[01:02:51] to give me a little bit more around that. And I know there's a word for this, I know
[01:02:56] there's a certain word for using an LLM to create a prompt for another LLM, and I can't
[01:03:01] remember it off the top of my head. And then that is what I fed into Stable Audio, which is a whole
[01:03:07] other can of worms, right? It's not purely me, but it's me collaborating with the
[01:03:12] machine to come up with the prompt that informs this other machine to do these things.
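That workflow, using one model to draft the prompt for another, is usually called prompt chaining or meta-prompting. A minimal sketch of the idea follows; the model name is a placeholder, the chat step assumes the OpenAI Python client simply as one common chat-capable interface, and none of it is the exact tooling used on the show.

```python
# A minimal sketch of prompt chaining / meta-prompting: one LLM drafts the prompt
# that is then handed to a text-to-music model. The model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set; any chat-capable LLM would work

idea = "house music track with ethereal energy and pads"

# Step 1: ask a chat model to expand a rough idea into a short, comma-separated prompt.
chat = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "You write short, comma-separated prompts for a text-to-music generator.",
        },
        {"role": "user", "content": f"Create a short prompt, separated by commas, for: {idea}"},
    ],
)
music_prompt = chat.choices[0].message.content.strip()

# Step 2: hand the generated prompt to the music generator
# (for example, something like the hypothetical request sketched earlier).
print(music_prompt)
```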
[01:03:20] And I know, it's really kind of cool. It tickles my need, my desire, to be creative
[01:03:29] in creative ways, you know? And the problem I'm having is that I haven't found the creative
[01:03:37] uses for me yet. Yeah. I'm a writer, and so I write, and I don't know how to use it for that, which is what
[01:03:45] I know about. I'm working on a next book, so I've got to find ways to put it in there, just to see
[01:03:50] how it interacts with me and what I can do with it. Similarly, NotebookLM, which we've talked about a lot,
[01:03:55] from Google: if I had my files, which are all on paper, huge amounts of paper, and books
[01:04:02] and things like that all around, yeah, if I had that all digital, then I'd die to have the machine
[01:04:11] organize it for me and come up with concepts. But it's not, and the amount of work to make
[01:04:15] it all digital is not worth it. So I'm looking for ways to use this stuff creatively, and
[01:04:20] the only thing I'm doing is trying to get silly images out, which is fun. Yeah, fun, useful
[01:04:30] potentially. Yeah, PowerPoints, yeah, sure, right. And I mean, Google knows this, right?
[01:04:36] As one example, they've integrated the ability to generate images inside of your Docs and
[01:04:40] stuff like that. One of the amazing things: I've been using ImageFX, because Google took down
[01:04:45] Gemini's image making but they still have ImageFX up. And when I go into ImageFX, I'd say a good half
[01:04:53] the time it refuses to do the image and says, see the FAQs and see the rules, and I cannot figure out
[01:04:58] what I violated. I'm asking for things as I write my next book, the one coming out this fall,
[01:05:04] The Web We Weave, about this concept of Fama, which was a mythical beast that was the
[01:05:11] personification of rumor and reputation and news and so on. It's from Virgil's Aeneid,
[01:05:18] and don't worry, I won't go on too long. It's a female beast covered with feathers; each feather
[01:05:25] has an eye, an ear, and a mouth, and they never close, and they're always listening and they're
[01:05:30] always talking, and this is rumor. And the joke is, this is the internet. And so I wanted to draw,
[01:05:36] there's this really weird picture from 1504 of Fama, so I wanted to do a modernized
[01:05:42] version of Fama connected to the internet, that's all. So I described Fama as this beast that does
[01:05:48] this and that, looks like this, has these characteristics, and I want you to update it to show it
[01:05:52] connected to the internet. It wouldn't do it. Hmm. I think it was "connected to the internet." I can't figure out
[01:05:58] what I was saying wrong. Yeah. But I think it was that connection to the internet. I finally got
[01:06:03] something out of it, but there were things I was doing wrong that I could not figure out, and it's
[01:06:08] not gonna tell you, because then you're gonna game it, so you don't know. But the guardrails are now
[01:06:12] set up that tight. Yeah, yeah, Google's not taking any chances right now. They're very shy at
[01:06:20] the moment when it comes to their systems. That's interesting; when you were describing
[01:06:26] that character, you know, my thought was, wow, you're describing exactly what you'd put in
[01:06:32] as a prompt, and I'd be curious to see what it comes back with, you know,
[01:06:36] give me 20 of those and pick the best one. I'll send you my presentation before next week, because
[01:06:40] it has some of those images in it. Cool, awesome. Well, we have reached the end of this episode
[01:06:46] of AI Inside. Always a good time, and always fun hanging out with you, Jeff. Thank you so much for
[01:06:52] being here every week. Gutenbergparenthesis.com; anything else you want to leave folks with?
[01:06:57] Excellent. I want to show it real quick: if you are watching the video version,
[01:07:04] you can see that Jeff has Magazine offered there, and also The Gutenberg Parenthesis, with discount
[01:07:10] codes, so that's right, always a percentage off. And the next book coming out is called
[01:07:16] The Web We Weave, and you can pre-order that too. Oh, excellent. Is that on here as well? No, it's not
[01:07:21] yet. Not yet, soon to be, I'm sure. I've got to be nice to my son is what it amounts to.
[01:07:26] Okay, well, thank you, Jeff. I always appreciate you. As for me, just go to yellowgoldstudios.com; that
[01:07:36] will forward you to the YouTube channel, and you can see right there, if you're watching the video
[01:07:42] version, it's kind of like Inception: if you're on the YouTube channel watching the live version of
[01:07:46] the recording of this show, then you'll see the YouTube channel with a little live callout of this
[01:07:52] episode. All this really means is, if you subscribe to the channel at yellowgoldstudios.com,
[01:07:58] you will get notified, if you choose to, when we go live, and you can watch live and you can
[01:08:03] participate in chat, like we have some people today participating in chat. We've got Steve, Young
[01:08:11] Tay, we've got Daniel, a lot of repeats, Sean, Vinky Pooh, all sorts of people participating while
[01:08:18] we record this live, and we appreciate them being here each and every week. As it turns out,
[01:08:23] pretty awesome stuff. We record this show every Wednesday, 11 a.m. Pacific, 2 p.m. Eastern, and again,
[01:08:30] like I said, just find it on YouTube if you want to tune into the live recording, or you can just
[01:08:35] subscribe by going to aiinside.show, or just search any of your favorite podcatchers
[01:08:42] and you will find us. You can also support us directly, as I said earlier in the show, at patreon.com/
[01:08:47] aiinsideshow. That goes directly to us in funding this show and building it, and
[01:08:55] we have plans to build it, we want to build it even bigger, so help us do that: patreon.com/aiinside
[01:09:01] show. Thank you, everyone, for watching and listening and supporting us. We appreciate you, and we will
[01:09:07] see you next time on another episode of AI Inside. Bye, everybody.
[01:10:07] Hold up, what was that? Boring, no flavor, that was as bad as those leftovers you ate all week.
[01:10:14] Kiki Baama here, and it's time to say hello to something fresh and guilt-free. HelloFresh! Jazz
[01:10:20] up dinner with pecan-crusted chicken or garlic butter shrimp scampi. Now that's music to my mouth.
[01:10:25] HelloFresh, let's get this dinner party started. Discover all the delicious possibilities at
[01:10:31] hellofresh.com. This year, build your credit history with the Chime Secured Credit Builder Visa Credit Card.
[01:10:38] No credit checks to apply. Get started at chime.com/build. The Chime Credit
[01:10:42] Builder Visa Credit Card is issued by The Bancorp Bank, N.A. or Stride Bank, N.A., Members
[01:10:46] FDIC. Chime checking account and a $200 qualifying direct deposit required to apply.