Awesome AI stuff, delivered daily, for free! Check out the AI For Humans: DAILY DRIP!
April 11, 2024

OpenAI Rips YouTube, Elon’s Robotaxis & AI Artist Purz Beats | Ep52

REVIEW THIS PODCAST. DO IT. RIGHT NOW.

This week… the NY Times says OpenAI and others are scraping YouTube for AI training, Sora might just lead to AGI, Elon Musk says Robotaxis are on the way and Spotify AI playlists might be cool.

Plus, Suno.ai turns food recipes into beautiful music, Midjourney makes X-Men from the old west, ChatGPT gets in-painting and an “AI disaster” for Pink Floyd fans.

AND THEN… an interview with AI artist Purz Beats where we discuss a bunch of AI art tools like Comfy UI & Stable Diffusion and also talk about his workflow and inspirations. Oh, and the complicated question around who *exactly* owns AI art and what to say to the haters. 

And finally, our AI co-host “Will The Thrill” is an AI DJ who takes us through the world of AI-generated music, specifically some of the AI food songs currently populating Suno.AI.

It's an endless cavalcade of ridiculous and informative AI news, AI tools, and AI entertainment cooked up just for you.

Follow us for more AI discussions, AI news updates, and AI tool reviews on X @AIForHumansShow

Join our vibrant community on TikTok @aiforhumansshow

For more info, visit our website at https://www.aiforhumans.show/

 

/// Show links ///

OpenAI Trained on YouTube + More

https://www.nytimes.com/2024/04/06/technology/tech-giants-harvest-data-artificial-intelligence.html

Bill Peebles (Sora) Says That Sora Is a Big Step Towards AGI

AGI House Video: https://www.youtube.com/watch?v=U3J6R9gfUhU

Tesla Robotaxis Due August 8th OR ARE THEY

https://twitter.com/elonmusk/status/1776351450542768368

Reuters report that Tesla cancelled their low cost car 

https://www.reuters.com/business/autos-transportation/tesla-scraps-low-cost-car-plans-amid-fierce-chinese-ev-competition-2024-04-05

The DARKEST Side of the Moon - Pink Floyd AI OUTRAGE

https://x.com/pinkfloyd/status/1776278627790782656

Spotify AI Playlist (only live in UK and Australia)

https://newsroom.spotify.com/2024-04-07/spotify-premium-users-can-now-turn-any-idea-into-a-personalized-playlist-with-ai-playlist-in-beta/

X-Men 1897

https://www.reddit.com/r/midjourney/comments/1bz0eai/xmen_1897/

AI Haters - Meow Meow Meow

https://x.com/venturetwins/status/1777169941260853462

N64 Lora

https://x.com/fofrAI/status/1776329791437697195

PS2 Pulp Fiction 

https://x.com/TheReelRobot/status/1776296600085692641

SunoAI Spaghetti Song

https://suno.com/song/4a77dea7-19f3-46d2-8b0a-b2b7e9ea9a05

Hot Dog Casserole Song

https://suno.com/song/989ce4c8-3923-41a3-a2a8-b942d7157c19

ChatGPT In-Painting

https://community.openai.com/t/dalle3-inpainting-editing-your-images-with-dall-e/705477

Purz Beats
https://twitter.com/PurzBeats

https://www.purz.xyz/

https://www.youtube.com/purzbeats

https://discord.com/invite/Vk3N7yhnYZ

ComfyUI

https://github.com/comfyanonymous/ComfyUI

 

Transcript

1
00:00:00,444 --> 00:00:02,459
OpenAI is screwing YouTube and

2
00:00:02,514 --> 00:00:05,159
It kind of seems like
everybody is screwing YouTube

3
00:00:05,510 --> 00:00:09,851
There's only a few of these that exist
in the world, we actually have an AI DJ

4
00:00:09,856 --> 00:00:14,606
Let's crank it up and let the
beat drop like it's hot, hot, hot.

5
00:00:15,771 --> 00:00:21,026
Every experience that we have is gonna be
synthesized for a new large language model

6
00:00:21,046 --> 00:00:23,096
Now I wish I could punch
you through the screen again

7
00:00:23,096 --> 00:00:23,946
Just kidding, everybody.

8
00:00:23,946 --> 00:00:24,946
We are having fun.

9
00:00:27,600 --> 00:00:28,070
Welcome.

10
00:00:28,080 --> 00:00:28,500
Welcome.

11
00:00:28,500 --> 00:00:29,320
Welcome everybody.

12
00:00:29,320 --> 00:00:31,970
It is another big
episode of AI for Humans.

13
00:00:32,310 --> 00:00:32,810
Oh,

14
00:00:32,990 --> 00:00:36,420
There was no, there was no
warning whatsoever about

15
00:00:36,560 --> 00:00:37,380
We'll start over.

16
00:00:37,390 --> 00:00:37,820
We'll start

17
00:00:38,080 --> 00:00:41,210
And I felt unwelcomed,

18
00:00:41,640 --> 00:00:42,490
We're starting over.

19
00:00:42,490 --> 00:00:42,730
Stop.

20
00:00:43,010 --> 00:00:43,844
We're starting over.

21
00:00:44,274 --> 00:00:45,124
Welcome everybody.

22
00:00:45,124 --> 00:00:48,044
It is AI for Humans, your weekly
guide to the wonderful and

23
00:00:48,064 --> 00:00:49,544
wild world of generative AI.

24
00:00:49,544 --> 00:00:50,414
I am here.

25
00:00:50,414 --> 00:00:53,404
My name is Gavin Purcell and my
friend, Kevin Pereira is on the

26
00:00:53,434 --> 00:00:54,714
other end of the microphone.

27
00:00:54,744 --> 00:00:55,434
Kevin, how are you?

28
00:00:55,984 --> 00:00:57,934
I'm on the other end of this microphone.

29
00:00:57,934 --> 00:00:59,284
It's like two tin cans in a

30
00:00:59,624 --> 00:01:00,454
We're connected.

31
00:01:00,464 --> 00:01:01,284
We're connected.

32
00:01:01,999 --> 00:01:02,429
That's it.

33
00:01:02,449 --> 00:01:03,289
We've docked.

34
00:01:03,579 --> 00:01:04,169
Hi friends.

35
00:01:04,339 --> 00:01:05,369
me, KP.

36
00:01:05,679 --> 00:01:07,809
I'm leaving that entire intro in Gavin.

37
00:01:07,809 --> 00:01:09,719
What a beautiful AI for Humans

38
00:01:09,719 --> 00:01:11,079
episode 52

39
00:01:11,079 --> 00:01:11,979
we got today.

40
00:01:12,399 --> 00:01:15,329
Should we tell the people
what this podcast is about?

41
00:01:15,432 --> 00:01:18,912
We like to demystify all the
news, tools, and all the other

42
00:01:18,912 --> 00:01:20,262
aspects out there just for you.

43
00:01:20,572 --> 00:01:21,922
Kevin, what's on the show today?

44
00:01:21,987 --> 00:01:26,037
OpenAI is screwing YouTube and
Google is screwing YouTube.

45
00:01:26,047 --> 00:01:29,437
It kind of seems like everybody
is screwing YouTube, except

46
00:01:29,437 --> 00:01:30,367
for maybe the creators.

47
00:01:30,367 --> 00:01:34,487
We're gonna tell you how and why,
and maybe, maybe I'll share some very

48
00:01:34,487 --> 00:01:37,607
artistic, tasteful renderings that
I've made of said screwing, Gavin.

49
00:01:38,542 --> 00:01:39,452
no, really?

50
00:01:40,307 --> 00:01:40,727
No!

51
00:01:41,292 --> 00:01:41,852
Is it okay?

52
00:01:41,852 --> 00:01:42,432
Good, good.

53
00:01:42,847 --> 00:01:44,497
I was just making sure
you're paying attention.

54
00:01:44,877 --> 00:01:48,877
Also, hey, stop me if you've heard
this one, Gavin, and or audience

55
00:01:48,877 --> 00:01:52,107
who cannot actually stop me
because this is a one way medium.

56
00:01:52,447 --> 00:01:56,047
Elon Musk is working on robo taxis.

57
00:01:56,302 --> 00:01:57,782
Oh, I've heard this one, so you can

58
00:01:57,867 --> 00:02:02,297
Yeah, we've all heard this one since
like 2014, but hopefully, the just

59
00:02:02,327 --> 00:02:06,977
announced Spotify AI DJ will be able to
generate a playlist that's loud enough to

60
00:02:06,977 --> 00:02:11,417
drown out the screams as our robo taxis
autopilot us through farmer's markets.

61
00:02:11,732 --> 00:02:14,412
Oh no, sir, I don't like that at all.

62
00:02:14,547 --> 00:02:17,517
Listen, we're going to have all the
details, plus some dumb things that

63
00:02:17,537 --> 00:02:20,667
everybody listening can do with AI today.

64
00:02:20,807 --> 00:02:23,347
We're going to chat with an AI
powered guest, and then we're probably

65
00:02:23,347 --> 00:02:26,547
going to have a way more compelling
chat with an actual human being,

66
00:02:26,547 --> 00:02:31,467
Gavin, a boundary-pushing visual
artist, who is our real guest today.

67
00:02:31,467 --> 00:02:37,924
We're going to dive deep into the
cutting edge of AI artistry with PURZ.

68
00:02:37,924 --> 00:02:37,994
PURZ.

69
00:02:38,064 --> 00:02:38,834
It's PURZ.

70
00:02:38,874 --> 00:02:39,334
P U R Z,

71
00:02:39,399 --> 00:02:41,049
It's not Purz Beats,
I thought his name was

72
00:02:41,174 --> 00:02:45,014
that's the full name, but colloquially,
the casual, we're on that level.

73
00:02:45,204 --> 00:02:47,234
When we give a head
nod, we say, Sup, Purz?

74
00:02:47,849 --> 00:02:48,549
What's up, Purz?

75
00:02:48,549 --> 00:02:49,759
All right, that'll be very exciting.

76
00:02:49,759 --> 00:02:54,024
Before we get started, as always,
we want to tell you at home: please

77
00:02:54,244 --> 00:02:56,634
share and rate our podcast.

78
00:02:56,644 --> 00:03:00,044
It is only with those shares
and rates that we grow.

79
00:03:00,354 --> 00:03:03,464
Our YouTube video last week was doing
very well and still is doing very well.

80
00:03:03,474 --> 00:03:07,054
And we appreciate everybody that watches
the video, likes, and subscribes on YouTube.

81
00:03:07,424 --> 00:03:10,844
Also, please leave a five-star
review on Apple Podcasts.

82
00:03:10,864 --> 00:03:12,044
We will read them at the end of the show.

83
00:03:12,044 --> 00:03:14,464
Kevin, today we have three new
ones to read, which is exciting.

84
00:03:14,814 --> 00:03:17,234
Um, yes, we, we had three new.

85
00:03:17,344 --> 00:03:17,964
last week's

86
00:03:18,119 --> 00:03:18,889
All right, the groveling

87
00:03:18,894 --> 00:03:19,734
got through to people?

88
00:03:19,799 --> 00:03:20,479
It got through.

89
00:03:20,579 --> 00:03:24,519
So again, we really appreciate everybody
in our audience who listens to this show.

90
00:03:24,549 --> 00:03:28,719
This is not just the two of us BSing
for an hour and then we upload it.

91
00:03:28,929 --> 00:03:30,249
There's a lot of editing
and all sorts of other

92
00:03:30,324 --> 00:03:31,724
We hate each other.

93
00:03:31,824 --> 00:03:34,024
Every sentence is a grind.

94
00:03:34,024 --> 00:03:36,864
It is trench warfare
when we launch this pod.

95
00:03:37,004 --> 00:03:41,434
And through gritted teeth and multiple
takes, we managed to get sentences out.

96
00:03:41,434 --> 00:03:45,294
So if you appreciate the end
product, we appreciate you engaging.

97
00:03:45,464 --> 00:03:47,934
Isn't that right, you piece of shit?

98
00:03:48,389 --> 00:03:49,359
Wow!

99
00:03:49,359 --> 00:03:51,409
Now I wish I could punch you
through the screen again.

100
00:03:51,409 --> 00:03:52,259
Just kidding, everybody.

101
00:03:52,259 --> 00:03:53,259
We are having fun.

102
00:03:53,289 --> 00:03:55,749
I want everybody to know
that is not exactly true.

103
00:03:56,039 --> 00:03:57,919
Kevin is making up lies right now.

104
00:03:57,949 --> 00:04:01,379
Kevin, it is time to get to the news!

105
00:04:03,139 --> 00:04:03,919
I'm already sweaty.

106
00:04:15,323 --> 00:04:15,663
Okay.

107
00:04:15,663 --> 00:04:19,643
The news this week is, as usual, fast
and furious. The biggest news story that

108
00:04:19,653 --> 00:04:24,253
I've seen come down in a
while in AI broke over the weekend.

109
00:04:24,533 --> 00:04:27,803
The New York Times published a
piece that had five authors on it.

110
00:04:27,803 --> 00:04:30,023
So, you know, when there's five authors
on a New York Times piece, they've

111
00:04:30,023 --> 00:04:34,863
done some research, which basically
accused OpenAI, Google, and Meta

112
00:04:34,933 --> 00:04:39,163
all of training on, um, various
versions of data, but mostly they

113
00:04:39,163 --> 00:04:40,673
talked about scraping YouTube.

114
00:04:40,983 --> 00:04:44,713
And the big question with Sora,
video especially, has been: how

115
00:04:44,713 --> 00:04:46,433
did they get video to train Sora?

116
00:04:46,433 --> 00:04:48,493
Because as everybody knows who
listens to the show, and if you

117
00:04:48,493 --> 00:04:50,133
don't, it's a pretty simple thing.

118
00:04:50,418 --> 00:04:53,548
You need a lot of data to
train an AI model, whether

119
00:04:53,548 --> 00:04:55,228
that's an LLM, a text model.

120
00:04:55,228 --> 00:04:57,208
It means you need to get
a lot of word data.

121
00:04:57,448 --> 00:05:00,058
In this instance, you need a lot
of video to train something like

122
00:05:00,058 --> 00:05:01,608
Sora, especially to train it well.

123
00:05:01,828 --> 00:05:04,098
Now, OpenAI has done a
deal with Shutterstock.

124
00:05:04,098 --> 00:05:07,118
So there's a lot of stuff out
there, but most people, I think,

125
00:05:07,118 --> 00:05:10,368
have assumed that there was some
sort of version of this happening.

126
00:05:10,888 --> 00:05:13,668
I think there was a little bit of a
snippet that came out a couple weeks ago

127
00:05:13,668 --> 00:05:17,658
when Joanna Stern published her video
where she interviewed Mira Murati, who

128
00:05:17,658 --> 00:05:21,888
is the OpenAI Chief Technology Officer,
who could not answer the question about

129
00:05:21,898 --> 00:05:23,908
how they had trained their Sora model.

130
00:05:24,548 --> 00:05:27,788
She had a little bit of, yeah,
that was a little bit of a face.

131
00:05:28,528 --> 00:05:30,108
So she really didn't have
a good answer for this.

132
00:05:30,108 --> 00:05:33,068
And I think what we've learned now
is that they did train it on YouTube.

133
00:05:33,088 --> 00:05:33,728
So, Kevin.

134
00:05:34,763 --> 00:05:37,673
First of all, when you read this piece,
which was very long, and I encourage

135
00:05:37,673 --> 00:05:39,143
everybody who's listening to go read it.

136
00:05:39,143 --> 00:05:41,413
Cause it's a lot of, uh,
there's a lot of things in there

137
00:05:41,413 --> 00:05:42,713
that are really fascinating.

138
00:05:42,943 --> 00:05:45,583
What was your first takeaway
from reading this piece?

139
00:05:48,578 --> 00:05:50,008
What a non surprise!

140
00:05:52,730 --> 00:05:54,210
I wasn't shocked in the slightest.

141
00:05:54,220 --> 00:05:58,560
I got to imagine you weren't as well,
Gavin, because we know how much data

142
00:05:58,570 --> 00:06:00,180
is required to train these things.

143
00:06:00,390 --> 00:06:03,180
A couple of details from the article, and
then we can dive in a little bit deeper.

144
00:06:03,180 --> 00:06:07,360
Greg Brockman, OpenAI's president,
allegedly personally helped

145
00:06:07,370 --> 00:06:09,020
collect the videos that were used.

146
00:06:09,070 --> 00:06:10,310
Again, according to the New York Times.

147
00:06:10,720 --> 00:06:13,400
What they did, though, was that
they grabbed transcripts

148
00:06:13,470 --> 00:06:14,890
of all the YouTube videos.

149
00:06:14,950 --> 00:06:18,790
And using that, which would be a
violation, according to Google,

150
00:06:19,040 --> 00:06:20,860
of YouTube's usage policies.

151
00:06:20,860 --> 00:06:23,859
You cannot use their data
to train something else.

152
00:06:24,170 --> 00:06:28,930
However, buried within that article is an
accusation that Google may have done the

153
00:06:29,030 --> 00:06:30,190
Done it themselves.

154
00:06:30,190 --> 00:06:31,120
Yeah, exactly.

155
00:06:31,380 --> 00:06:35,220
and that the reason Google might
not be making any public statements

156
00:06:35,220 --> 00:06:39,310
about OpenAI's alleged actions
is that they're curious if they

157
00:06:39,310 --> 00:06:40,940
would be outing themselves as well.

158
00:06:41,278 --> 00:06:41,808
Exactly.

159
00:06:41,808 --> 00:06:42,598
And it's been really interesting.

160
00:06:42,598 --> 00:06:48,268
So Neil Mohan, the YouTube CEO, did
come out and say that some YouTube

161
00:06:48,308 --> 00:06:51,218
content is scrapable for open web
purposes, but the video transcript and

162
00:06:51,218 --> 00:06:52,608
footage are not allowed to be scraped.

163
00:06:52,618 --> 00:06:55,468
He said, this is a clear
violation of our terms of service.

164
00:06:55,518 --> 00:06:58,548
So those are the rules of the road
in terms of content on our platform.

165
00:06:58,848 --> 00:06:59,498
But you're right.

166
00:06:59,508 --> 00:07:00,638
It is interesting to see.

167
00:07:01,148 --> 00:07:06,278
how kind of quiet the overall
Google ecosystem has been on this.

168
00:07:06,278 --> 00:07:08,088
And I think one thing I want to point out:

169
00:07:08,168 --> 00:07:11,278
I was really kind of surprised
at how little noise this

170
00:07:11,278 --> 00:07:13,208
made in the AI ecosystem.

171
00:07:13,228 --> 00:07:17,768
And when I say the AI ecosystem, I mean,
the people kind of like us or more of

172
00:07:17,768 --> 00:07:20,618
the people who are kind of interested
in AI who are kind of focused on this.

173
00:07:20,878 --> 00:07:23,508
When I first read this, I was
like, well, this is a smoking gun.

174
00:07:23,508 --> 00:07:24,158
Like, look at this.

175
00:07:24,158 --> 00:07:28,168
This is going to like launch what I
believe will probably be dozens of

176
00:07:28,168 --> 00:07:30,558
lawsuits, whether they get through or not.

177
00:07:30,868 --> 00:07:35,018
Now, I think the big question is going
to be what Google's terms of service

178
00:07:35,018 --> 00:07:38,538
say when things were scraped, if they were
scraped, and it sounds like they were.

179
00:07:38,898 --> 00:07:40,878
But then I think the next
question is going to be.

180
00:07:40,878 --> 00:07:43,548
And this, I want to ask you directly
this, cause there's so many people

181
00:07:43,548 --> 00:07:46,888
who talk about, and you've said this
and I've said this, the idea that

182
00:07:46,898 --> 00:07:49,198
like the cat is out of the bag, right?

183
00:07:49,198 --> 00:07:52,038
That the idea that once
you've done this, guess what?

184
00:07:52,038 --> 00:07:52,948
It's too late.

185
00:07:53,278 --> 00:07:56,228
All this stuff is out there and now
we just have to live with what it is.

186
00:07:56,748 --> 00:08:01,558
Do you think that there is any way that
there is a pause that can be put on

187
00:08:01,678 --> 00:08:04,198
this based on legal terms at this point?

188
00:08:04,338 --> 00:08:08,398
Perhaps an injunction against the
output of all large language models

189
00:08:08,398 --> 00:08:11,301
until a judge can say, hey, we got
to piece this together, right?

190
00:08:11,301 --> 00:08:13,671
We got to dive in through your
model, see how it was trained,

191
00:08:13,671 --> 00:08:14,821
see if you violated something.

192
00:08:15,031 --> 00:08:19,841
I think there's too much money at play.
That, that could be a scenario, but I

193
00:08:19,861 --> 00:08:24,251
think there will just be millions, if not
billions of dollars thrown against the

194
00:08:24,251 --> 00:08:27,781
wall to make sure that something like that
doesn't happen in the interim and it's

195
00:08:27,781 --> 00:08:34,561
business as usual until decades later,
the everything megacorp that's going to rule

196
00:08:34,561 --> 00:08:36,521
us all and be our one world government.

197
00:08:36,856 --> 00:08:39,076
Like, people will just play nice.

198
00:08:39,116 --> 00:08:43,196
There's just too much money at stake,
so I think this is a calculated move.

199
00:08:43,226 --> 00:08:46,776
I'm sure there were lawyers involved
at one point in the room, and

200
00:08:46,776 --> 00:08:50,866
eventually, probably the engineers
said, Hey, listen, we have to sprint.

201
00:08:50,876 --> 00:08:52,156
We have to grab this data.

202
00:08:52,156 --> 00:08:55,306
We will beg for, and probably
pay for, forgiveness later.

203
00:08:55,566 --> 00:08:56,736
So let's just go.

204
00:08:57,346 --> 00:08:58,306
I think that's probably true.

205
00:08:58,306 --> 00:09:01,366
And I think the one thing that's important
to point out here is there's been a couple

206
00:09:01,366 --> 00:09:05,596
stories this week about how hard it is to
get new data to train these devices on.

207
00:09:05,596 --> 00:09:08,741
And there's lots of people in the machine
learning world that are saying, Hey,

208
00:09:08,741 --> 00:09:10,101
this is going to be a scaling issue.

209
00:09:10,101 --> 00:09:13,571
Like actually every time you scale
up, you get better results, but

210
00:09:13,581 --> 00:09:15,851
scaling equals more data, right?

211
00:09:15,851 --> 00:09:19,691
And so the one thing that they've
talked about a lot is we have

212
00:09:19,691 --> 00:09:21,901
scraped so much text data already.

213
00:09:21,901 --> 00:09:26,041
Like so many books and again, who knows
the legal ramifications of that, putting

214
00:09:26,041 --> 00:09:27,861
it aside, but we've scraped the internet.

215
00:09:27,861 --> 00:09:28,651
We've scraped books.

216
00:09:28,651 --> 00:09:29,581
We've scraped all this stuff.

217
00:09:30,061 --> 00:09:34,801
If they're scraping and have scraped
a lot of YouTube already, where does

218
00:09:34,811 --> 00:09:37,061
more data come from is a huge question.

219
00:09:37,061 --> 00:09:39,651
And that's partly the whole
synthetic data conversation.

220
00:09:39,651 --> 00:09:42,871
But like, I don't know where you get
more than YouTube when it comes to video.

221
00:09:43,431 --> 00:09:48,331
Someone has to be transcribing and
scraping every podcast in every

222
00:09:48,331 --> 00:09:51,061
language and going back decades.

223
00:09:51,271 --> 00:09:54,301
I'm sure that's already been thought
of and is being grabbed as well.

224
00:09:54,411 --> 00:09:57,661
Last year, this is according to the
New York Times article, Google, quote,

225
00:09:57,736 --> 00:09:59,506
also broadened its terms of service.

226
00:09:59,516 --> 00:10:02,646
One motivation for the change,
according to members of the company's

227
00:10:02,656 --> 00:10:07,366
privacy team and an internal message
reviewed by the Times, was to allow

228
00:10:07,366 --> 00:10:11,306
Google to be able to tap publicly
available Google Docs, restaurant

229
00:10:11,306 --> 00:10:15,916
reviews on Google Maps and other online
material for more of its AI products.

230
00:10:15,916 --> 00:10:19,466
So even Google was like, Hey, we got
to start getting this, you know, at

231
00:10:19,466 --> 00:10:22,286
some point there's going to be some
sort of like, Hey, we'll give you.

232
00:10:22,876 --> 00:10:26,536
Maybe, maybe we'll give you 3 percent
off your Google for business account.

233
00:10:26,556 --> 00:10:30,566
If you let us crawl through
some files, in fact, maybe we

234
00:10:30,566 --> 00:10:31,996
won't give you anything off.

235
00:10:32,166 --> 00:10:33,056
Maybe we'll just

236
00:10:33,311 --> 00:10:34,231
We'll just do it.

237
00:10:34,386 --> 00:10:36,576
change the terms of service and just go on through it.

238
00:10:36,626 --> 00:10:37,786
That's probably going to happen.

239
00:10:37,786 --> 00:10:41,456
But on the synthetic data front, that,
that you brought up, that to me is

240
00:10:41,506 --> 00:10:42,871
really, really interesting because.

241
00:10:43,321 --> 00:10:47,511
It seems like every day a new paper
comes out that says synthetic data

242
00:10:47,521 --> 00:10:50,431
is great and it's going to solve the
problem, which is something that Sam

243
00:10:50,441 --> 00:10:52,781
Altman has said is going to be the case.

244
00:10:53,201 --> 00:10:56,231
Or there's a paper that comes out and
says synthetic data is terrible and the

245
00:10:56,251 --> 00:10:57,741
AI is going to train itself in a loop.

246
00:10:57,741 --> 00:11:00,521
So for the broader audience out there
that doesn't know what we are talking

247
00:11:00,531 --> 00:11:04,691
about when we say synthetic data,
we're saying that you can use the large

248
00:11:04,691 --> 00:11:07,671
language models as they exist today
and likely as they'll exist in the

249
00:11:07,671 --> 00:11:13,691
next year or two, which will be better,
theoretically, to generate paragraphs upon

250
00:11:13,691 --> 00:11:17,181
paragraphs of text and computer code.

251
00:11:17,181 --> 00:11:19,321
I mean, we're talking like
generate the Library of Congress.

252
00:11:19,321 --> 00:11:20,631
It could probably do it in a week.

253
00:11:20,911 --> 00:11:26,331
It just is going to churn and give you
fake questions and answers, fake recipes,

254
00:11:26,341 --> 00:11:28,711
fake restaurant reviews, fake everything.

255
00:11:28,711 --> 00:11:33,161
That synthetic data is thought
by some to be good enough.

256
00:11:33,551 --> 00:11:36,981
That if you generate enough of
it, you can train a better model.

257
00:11:36,981 --> 00:11:41,051
But some are saying that actually the
repeated words and phrases that are

258
00:11:41,051 --> 00:11:44,881
already seen to come out of some of
these language models, the bad code

259
00:11:45,011 --> 00:11:48,531
or the potentially pirated snippets
of content, those are just going

260
00:11:48,531 --> 00:11:50,211
to resurface over and over again.

261
00:11:50,221 --> 00:11:53,761
And then eventually it kind of
trains itself on its worst bad habits

262
00:11:53,761 --> 00:11:54,901
and it's not going to be usable.

263
00:11:55,121 --> 00:11:56,981
Those are the two schools
of thought right now.

264
00:11:57,481 --> 00:11:59,591
Do you have a horsey in this race, Gavin?

265
00:12:00,031 --> 00:12:02,551
I don't understand enough on
the technical side to see if the

266
00:12:02,551 --> 00:12:03,851
synthetic data is strong enough.

267
00:12:03,861 --> 00:12:07,281
I will say from what I've read, the
argument around synthetic data is that it

268
00:12:07,291 --> 00:12:12,851
wasn't good enough before, but now as the
models get stronger, it may be better.

269
00:12:12,851 --> 00:12:16,851
The interesting thing to think about
is, if that is the unlock and we're also

270
00:12:16,851 --> 00:12:20,441
heading to a world where only bigger,
bigger models equal better results,

271
00:12:20,861 --> 00:12:24,341
then we are going to be moving very
quickly to the next stage of whatever

272
00:12:24,341 --> 00:12:28,461
this AI world is, because the AI, as
we talked about in the NVIDIA episode,

273
00:12:28,861 --> 00:12:32,741
the AI can simulate themselves in
different places and different things.

274
00:12:33,111 --> 00:12:35,771
And if that allows, if you can simulate

275
00:12:36,296 --> 00:12:39,066
interactions, text, video, all that stuff.

276
00:12:39,496 --> 00:12:43,316
You can ostensibly do that at an
infinite level if you have the amount

277
00:12:43,316 --> 00:12:44,876
of compute and storage to do it.

278
00:12:44,896 --> 00:12:46,406
And then it goes really quickly.

279
00:12:46,456 --> 00:12:50,076
Actually, this really dovetails really
interestingly into another story that we

280
00:12:50,076 --> 00:12:55,606
have. Bill Peebles, one of the main OpenAI
engineers behind Sora, had a really

281
00:12:55,606 --> 00:12:59,953
interesting talk that he came out and
gave to the AGI House, which is, again,

282
00:12:59,963 --> 00:13:02,663
like probably some sort of hype house,
but for AGI. It's like, whoop, whoop.

283
00:13:02,933 --> 00:13:04,693
We're going to AGI land, baby.

284
00:13:05,763 --> 00:13:06,113
Like and

285
00:13:06,393 --> 00:13:07,483
Energy Refrigerator!

286
00:13:07,483 --> 00:13:08,473
It's so hype!

287
00:13:09,033 --> 00:13:09,373
Yeah.

288
00:13:09,633 --> 00:13:12,873
So Bill was at the AGI house and gave a
speech, and we're going to hear a little

289
00:13:12,873 --> 00:13:14,543
snippet of what his speech was here.

290
00:13:15,113 --> 00:13:18,483
So, of course everyone's very
bullish on the role that LLMs are

291
00:13:18,483 --> 00:13:20,288
going to play in getting to AGI.

292
00:13:20,798 --> 00:13:24,198
But we believe that video models
are on the critical path to it.

293
00:13:24,198 --> 00:13:28,658
And concretely, we believe that when we
look at very complex scenes that Sora

294
00:13:28,658 --> 00:13:33,038
can generate, like that snowy scene in
Tokyo that we saw in the very beginning,

295
00:13:33,038 --> 00:13:36,348
that Sora is already beginning to show
a detailed understanding of how humans

296
00:13:36,348 --> 00:13:39,938
interact with one another, how they
have physical contact with one another.

297
00:13:40,358 --> 00:13:43,658
And as we continue to scale this
paradigm, we think eventually it's going

298
00:13:43,658 --> 00:13:45,348
to have to model a human state, right?

299
00:13:45,398 --> 00:13:48,218
The only way you can generate
truly realistic video, with truly

300
00:13:48,218 --> 00:13:49,578
realistic sequences of actions.

301
00:13:49,993 --> 00:13:52,493
is if you have an internal
model of how all objects,

302
00:13:52,493 --> 00:13:54,233
humans, etc., environments work.

303
00:13:54,833 --> 00:13:58,003
And so we think this is how Sora
is going to contribute to AGI.

304
00:13:58,003 --> 00:13:59,343
Basically what Mr.

305
00:13:59,343 --> 00:14:01,473
Peebles is saying here,
call him, you know, Mr.

306
00:14:01,473 --> 00:14:06,578
Peebles if you're nasty, uh, he's
talking about, he's talking about

307
00:14:06,578 --> 00:14:11,228
the idea that Sora, which is a
video generator, is actually a world

308
00:14:11,228 --> 00:14:14,028
simulator, ostensibly, and we've
talked about this on the show before,

309
00:14:14,468 --> 00:14:19,098
but by world simulation, and by doing
that, even in synthetic environments,

310
00:14:19,098 --> 00:14:24,543
you are teaching the AI how people
and objects interact with each other.

311
00:14:24,543 --> 00:14:27,383
And that is what a lot of people see
as the next stage of data, right?

312
00:14:27,383 --> 00:14:30,863
It's like real world training
or simulated world training

313
00:14:30,883 --> 00:14:32,833
outside of just words and texts.

314
00:14:33,083 --> 00:14:34,513
You're talking now about actions.

315
00:14:34,513 --> 00:14:36,943
You're talking about the way
a tree blows in the wind.

316
00:14:36,943 --> 00:14:40,113
You're talking about the way a human
interacts with that tree when it blows

317
00:14:40,113 --> 00:14:41,523
against them, or when they cut it down.

318
00:14:41,893 --> 00:14:44,153
All of that stuff is data, right?

319
00:14:44,153 --> 00:14:45,713
It can't be perceived as data.

320
00:14:46,023 --> 00:14:49,703
And if they can find a way that
Sora can generate synthetic models

321
00:14:49,713 --> 00:14:52,803
of the world, that does feel
like where we're headed next.

322
00:14:53,403 --> 00:14:58,338
I think 2026 is the year. And
it's going to need a sexier term

323
00:14:58,338 --> 00:15:01,588
to probably get the youth involved,
Gavin, but, uh, let's say, let's

324
00:15:01,588 --> 00:15:05,060
say data harvesters or aggregators.

325
00:15:05,340 --> 00:15:12,015
Basically, we are going to, for
pennies per minute of experience,

326
00:15:12,285 --> 00:15:15,485
we are all going to end up working
for OpenAI slash Microsoft.

327
00:15:15,785 --> 00:15:19,845
We're going to have our glasses on that
are going to feed constant video streams.

328
00:15:20,045 --> 00:15:24,905
Maybe we'll have tactile haptic gloves so
that it can know the pressure and force

329
00:15:24,915 --> 00:15:28,925
with which we're exerting upon the world
as we navigate it, maybe even sensors in

330
00:15:28,925 --> 00:15:30,775
our free government-provided Skechers.

331
00:15:31,115 --> 00:15:31,455
Right.

332
00:15:31,645 --> 00:15:33,875
But we are all just going to live.

333
00:15:34,210 --> 00:15:39,150
Every single day, providing data for
these mega models, because it's just

334
00:15:39,150 --> 00:15:42,400
gonna have an insatiable appetite,
and even though the synthetic

335
00:15:42,400 --> 00:15:45,840
stuff will be good, it's never gonna
hit as hard as analog, baby.

336
00:15:46,060 --> 00:15:50,390
So, we're all just gonna sign up to
work for one of the companies, and

337
00:15:50,390 --> 00:15:54,140
instead of running Uber Eats errands,
we're just gonna eke out our day.

338
00:15:54,150 --> 00:15:55,105
We're gonna crawl, crawl, crawl.

339
00:15:55,315 --> 00:15:58,825
Through the actual world, try to
breathe the polluted air and drink

340
00:15:58,845 --> 00:16:03,025
water that doesn't have microplastics
and every experience that we have,

341
00:16:03,025 --> 00:16:07,785
Gavin, is gonna be synthesized for a
new large language model. Sign me up.

342
00:16:08,445 --> 00:16:10,025
I honestly think you're right.

343
00:16:10,055 --> 00:16:13,405
I will say it doesn't have to be as
dystopian as that, but what I could see

344
00:16:13,405 --> 00:16:16,805
very well, and we're going to talk about
robo taxis or Waymo taxis in a little

345
00:16:16,805 --> 00:16:20,845
bit, you know, right now, when you see
a robo taxi or you see a Waymo car in

346
00:16:20,845 --> 00:16:22,225
the real world, that's kind of weird.

347
00:16:22,635 --> 00:16:26,795
I do see a future where there are people
that are walking through a busy city,

348
00:16:26,815 --> 00:16:30,015
which are wearing glasses and some
sort of haptic gloves, and you kind of

349
00:16:30,015 --> 00:16:31,795
get to know them as like, they're the.

350
00:16:32,185 --> 00:16:36,175
They're the experience generator
or they're the, the gap and they're

351
00:16:36,185 --> 00:16:39,405
gathering footage. Like, you could see
them at a concert. Like, imagine a

352
00:16:39,405 --> 00:16:43,425
person at a concert who's who's taking
all this data in whether it's the

353
00:16:43,435 --> 00:16:46,245
music or the way that people next to
them are interacting with each other

354
00:16:46,245 --> 00:16:48,805
like all of that feels very realistic.

355
00:16:48,805 --> 00:16:51,640
And I guess the question will
become is like, What tool is

356
00:16:51,640 --> 00:16:53,010
that to bring the things in?

357
00:16:53,010 --> 00:16:56,480
And maybe it's the next version of like
the Apple Vision Pro or the Facebook

358
00:16:56,480 --> 00:16:59,670
glasses, but it does make me think
about those Facebook glasses, right?

359
00:16:59,670 --> 00:17:03,590
Because the Facebook glasses aren't
just for us to like, say, Hey, I can

360
00:17:03,590 --> 00:17:05,110
identify the Empire State Building.

361
00:17:05,370 --> 00:17:08,430
It's also going to give a lot
of data back to meta, right?

362
00:17:08,450 --> 00:17:10,540
It's going to give a ton of
data back to Meta and then

363
00:17:10,540 --> 00:17:11,940
train the next AI model on.

364
00:17:12,170 --> 00:17:14,790
I think you and I have come to
something pretty big here, which is:

365
00:17:15,500 --> 00:17:21,280
A, we as the humans, our job going
forward may be to provide data, which is kind of

366
00:17:21,280 --> 00:17:25,160
interesting to think about as that's what
we do for ourselves all the time, right?

367
00:17:25,160 --> 00:17:28,410
I guess the big question
becomes: my data ultimately

368
00:17:28,410 --> 00:17:29,920
is not the most exciting data.

369
00:17:29,920 --> 00:17:33,440
And if they want to take like what my
experience of touching my microphone is,

370
00:17:33,440 --> 00:17:37,630
or like a bottle of water and drinking
it, it is a really interesting thing when

371
00:17:37,630 --> 00:17:42,560
you then combine it with Neuralink, you
combine it with the glasses, like you

372
00:17:42,560 --> 00:17:49,130
can start to see a weird vision of the
future that becomes beneficial for AIs.

373
00:17:49,130 --> 00:17:49,500
And then the,

374
00:17:49,595 --> 00:17:50,925
data as well, right?

375
00:17:50,925 --> 00:17:53,255
So it knows like okay I'm
jogging down the street.

376
00:17:53,265 --> 00:17:57,045
This is the way in which my POV is
changing. My heart rate is accelerating.

377
00:17:57,045 --> 00:18:01,605
My gait has gone. That telemetry data.
There's so many points of data that every

378
00:18:01,605 --> 00:18:05,905
human being could be gathering, whether
it's for themselves as a personal data

379
00:18:05,905 --> 00:18:10,925
broker, or for the big Borg, which will
be giving us, I guess, like, a nutrient

380
00:18:10,925 --> 00:18:13,425
paste and a UBI in the near future.

381
00:18:13,675 --> 00:18:17,145
Yes, I think people will be doing this,
and I think people will sign up for

382
00:18:17,145 --> 00:18:22,655
this, and then ironically, 2029-ish,
it's just gonna be way cheaper to let

383
00:18:22,655 --> 00:18:24,345
the robots go around and do it, Gavin.

384
00:18:24,755 --> 00:18:27,645
Like, they're gonna, the battery
tech's gonna get good enough to where

385
00:18:27,645 --> 00:18:29,265
they'll just strap it in the robots.

386
00:18:29,265 --> 00:18:31,735
But in the meantime, there
is a startup to be made.

387
00:18:31,835 --> 00:18:33,135
And why not us, Gavin?

388
00:18:34,105 --> 00:18:35,990
That reminds me of Elon Musk.

389
00:18:35,990 --> 00:18:40,671
Our old friend Elon Musk is back
and we have some news around both

390
00:18:40,671 --> 00:18:43,021
robo taxis and his low end car.

391
00:18:43,031 --> 00:18:45,131
Kev, what happened with
the low end car here?

392
00:18:46,261 --> 00:18:49,761
Well, a big nothing burger, according
to Elon, but the Reuters story

393
00:18:49,771 --> 00:18:53,671
said that they scrapped plans to
make the Model 2, Gavin, which was

394
00:18:54,211 --> 00:18:56,581
expected to start around $25,000.

395
00:18:56,621 --> 00:18:59,391
To put that in perspective, the
Model 3, which is their cheapest

396
00:18:59,391 --> 00:19:01,661
vehicle, starts at around $40,000.

397
00:19:01,661 --> 00:19:07,191
So, a major price slash to try to
really broaden the adoption of EVs.

398
00:19:07,401 --> 00:19:10,611
Reuters came out and said, Ah, actually
they're killing the plans to do that.

399
00:19:10,851 --> 00:19:14,511
Maybe so that they can do this
robo taxi thing, which might be

400
00:19:14,661 --> 00:19:17,061
powered by what was the Model 2.

401
00:19:17,061 --> 00:19:19,581
But Elon came out and said, What, Gav?

402
00:19:19,875 --> 00:19:23,005
He said that the Reuters story was BS.

403
00:19:23,075 --> 00:19:27,025
Which, you know, at this point I really
don't know what to trust Elon on or not.

404
00:19:27,025 --> 00:19:28,765
So, we'll just call that a wash.

405
00:19:28,765 --> 00:19:32,830
But then he did say, on
April 5th, Tesla robotaxi

406
00:19:32,830 --> 00:19:34,920
unveil on 8/8.

407
00:19:34,920 --> 00:19:38,830
So that is three months away
from now: August 8th, robotaxis.

408
00:19:38,830 --> 00:19:42,240
If you're not familiar with this
idea, Tesla's promise

409
00:19:42,240 --> 00:19:45,630
originally, and other robotaxis,
is these are driverless taxis.

410
00:19:45,650 --> 00:19:48,700
Basically it will take
you from place to place.

411
00:19:48,860 --> 00:19:52,940
You get in it much like a Waymo
car and you go from, from here to

412
00:19:52,940 --> 00:19:54,700
there without anybody driving you.

413
00:19:54,700 --> 00:19:56,910
And, and ultimately this
was the promise that like.

414
00:19:57,325 --> 00:20:00,785
Uber and a bunch of other companies
in the 2010s were really saying that

415
00:20:00,785 --> 00:20:03,725
this was going to be the transformative
power of those companies because

416
00:20:04,035 --> 00:20:07,685
ultimately drivers are expensive and
they're noisy and we know the problems

417
00:20:07,685 --> 00:20:09,025
that Uber's had with their drivers.

418
00:20:09,365 --> 00:20:12,745
Um, on the driver's side, like we
understand, like the drivers were

419
00:20:12,745 --> 00:20:14,315
starting to get paid not very good wages.

420
00:20:14,315 --> 00:20:18,195
So this is a big argument
back and forth, but I feel

421
00:20:18,530 --> 00:20:20,610
driverless cars are something
that I've been hoping that

422
00:20:20,610 --> 00:20:22,080
we would see for a long time.

423
00:20:22,360 --> 00:20:24,610
My youngest daughter is 16.

424
00:20:25,040 --> 00:20:28,790
Five years ago, when she was 11, I
would have sworn that she probably

425
00:20:28,800 --> 00:20:30,980
wouldn't even need a driver's license in

426
00:20:30,995 --> 00:20:32,625
She would have at least had the option,

427
00:20:32,905 --> 00:20:34,275
Yes, yes, yes.

428
00:20:34,315 --> 00:20:36,685
And now we're here in 2024.

429
00:20:37,485 --> 00:20:39,625
Cruise was basically kind of shut down.

430
00:20:39,625 --> 00:20:43,845
Cruise, the driverless car company, because
of a giant lawsuit around an accident.

431
00:20:44,245 --> 00:20:47,935
Waymo is still working, but it
has not proliferated very widely.

432
00:20:47,935 --> 00:20:48,925
There aren't a ton of them.

433
00:20:49,255 --> 00:20:52,765
And Elon now is saying, okay, in three
months, we're going to have robo taxis.

434
00:20:52,785 --> 00:20:53,775
I don't buy it.

435
00:20:53,775 --> 00:20:57,690
First of all, Elon seems like such an
unreliable narrator now. But based

436
00:20:57,690 --> 00:21:02,910
on where the legal ramifications are
around driverless cars, I still feel like

437
00:21:02,910 --> 00:21:06,110
this is one of those things that could
be like, could be 10 more years before

438
00:21:06,110 --> 00:21:08,420
we see these in production in some form.

439
00:21:08,840 --> 00:21:13,220
It's something that's been promised
for years, the end-to-end AI training

440
00:21:13,270 --> 00:21:16,920
of it all, meaning that the newest
versions of the Tesla Autopilot were

441
00:21:16,920 --> 00:21:20,395
trained just on raw video, which was
collected from their pre production.

442
00:21:20,655 --> 00:21:23,585
Hundreds of thousands of cars
that are on the road constantly

443
00:21:23,695 --> 00:21:27,415
recording and constantly uploading
data back to the mothership.

444
00:21:27,815 --> 00:21:32,145
That seems to be performing
incredibly well from what little

445
00:21:32,155 --> 00:21:33,335
bits have been leaked out.

446
00:21:33,455 --> 00:21:34,605
Is it flawless?

447
00:21:34,775 --> 00:21:36,285
No, from what we've seen.

448
00:21:36,375 --> 00:21:40,695
I can tell you, I own a Tesla
Model 3, and I, now I could try.

449
00:21:40,705 --> 00:21:43,605
I haven't tried it for a couple
of months, but I still feel

450
00:21:43,605 --> 00:21:45,065
freaked out when I'm using it.

451
00:21:45,065 --> 00:21:45,305
Right.

452
00:21:45,305 --> 00:21:46,665
It is not that good.

453
00:21:46,665 --> 00:21:48,915
And that's on the freeways right now.

454
00:21:48,915 --> 00:21:51,875
I do trust it on longer
drives to basically

455
00:21:52,385 --> 00:21:56,655
do everything I needed to do, but
I don't feel comfortable using it

456
00:21:56,655 --> 00:21:59,475
in the city yet, because it still
does feel like it messes up a lot.

457
00:21:59,485 --> 00:22:03,135
Now, maybe there's a leap coming,
or maybe the leap did just happen

458
00:22:03,135 --> 00:22:06,425
and I haven't done it yet, but it
feels a little sketchy to me that

459
00:22:06,425 --> 00:22:07,825
we're going to get there this fast.

460
00:22:08,275 --> 00:22:13,525
In 2016, Elon Musk stunned the automotive
world by announcing that henceforth,

461
00:22:13,745 --> 00:22:16,785
all of his company's vehicles would
be shipped with the hardware necessary

462
00:22:16,795 --> 00:22:18,435
for, quote, full self driving.

463
00:22:18,775 --> 00:22:21,885
You'll be able to nap in your car while
it drives you to work, he promised.

464
00:22:22,065 --> 00:22:26,225
It will even be able to drive cross
country with no one inside the vehicle.

465
00:22:26,875 --> 00:22:27,325
to New York.

466
00:22:27,335 --> 00:22:28,375
That was always the dream.

467
00:22:28,385 --> 00:22:28,755
2016.

468
00:22:28,765 --> 00:22:29,645
Yeah, he was gonna do that drive.

469
00:22:29,655 --> 00:22:33,695
Now, I can imagine, from some of the
videos that I have seen of the latest

470
00:22:34,105 --> 00:22:37,885
Autopilot firmware show it doing
some pretty impressive navigating around

471
00:22:37,885 --> 00:22:43,145
construction zones, residentials, full
stop, start, creeping out around corners

472
00:22:43,175 --> 00:22:46,155
again, because it's trained off of human
driving, at least the newer version.

473
00:22:46,575 --> 00:22:50,025
I could see them saying, Hey, by
the way, there was a human there

474
00:22:50,105 --> 00:22:54,555
just to intervene if need be, but we
completed the New York to SF journey.

475
00:22:54,705 --> 00:22:56,655
I could also imagine that
they probably did it.

476
00:22:56,970 --> 00:23:02,380
600 times behind a curtain before they
had a full time-lapse version of the drive

477
00:23:02,380 --> 00:23:04,190
happening, but that could hit in August.

478
00:23:04,190 --> 00:23:06,760
I remember them saying that you
were going to be able to, as a

479
00:23:06,760 --> 00:23:10,720
Tesla owner, hit a button and it
would turn your Tesla into an Uber.

480
00:23:10,740 --> 00:23:13,430
So wow, it was going to be an
unlock for every owner who just lets

481
00:23:13,430 --> 00:23:16,060
their car sit there in the parking
lot for the majority of the day.

482
00:23:16,505 --> 00:23:19,165
I don't know if any of that is
coming, or if this is just gonna be

483
00:23:19,165 --> 00:23:21,065
another goose of the, the stock price.

484
00:23:21,085 --> 00:23:24,715
I do think we're getting closer, but
as we get closer, you realize the

485
00:23:24,755 --> 00:23:29,525
edge cases are so out there and so
crucial to get right, that if there is

486
00:23:29,525 --> 00:23:33,585
a human behind the wheel or the yoke,
it's okay if they need to intervene.

487
00:23:33,585 --> 00:23:36,725
But if I'm sitting in the back,
and this thing is just toting me

488
00:23:36,725 --> 00:23:39,405
around, it's got to be flawless.

489
00:23:39,450 --> 00:23:45,190
I can't wait until I can prompt my taxi
to behave like another car's AI, right?

490
00:23:45,190 --> 00:23:49,000
Like if I can hop into the Tesla RoboTaxi
and say, drive like the BMW and it just

491
00:23:49,050 --> 00:23:53,060
disables the blinkers and speeds up and
slows down so that people can't merge.

492
00:23:53,615 --> 00:23:54,495
Oh, damn.

493
00:23:54,955 --> 00:23:56,045
Yeah, that'd be amazing.

494
00:23:56,055 --> 00:23:58,360
Cause you could have like different
driver personalities, like the

495
00:23:58,385 --> 00:24:00,345
asshole that cuts you off in traffic.

496
00:24:00,345 --> 00:24:00,965
That sort of thing.

497
00:24:00,965 --> 00:24:01,185
That's a

498
00:24:01,275 --> 00:24:05,205
Or the one that smokes their
hookah in the car, like the giant

499
00:24:05,235 --> 00:24:10,405
actual black cherry tobacco full
size hookah in the passenger seat

500
00:24:10,405 --> 00:24:13,325
and just blasts me with that smoke
and plays Dark Side of the Moon.

501
00:24:13,515 --> 00:24:18,235
Which is an interesting one to
bring up, Gavin, because apparently

502
00:24:18,235 --> 00:24:21,315
the AI community hates Pink Floyd.

503
00:24:21,675 --> 00:24:22,745
This is a crazy story.

504
00:24:22,745 --> 00:24:26,735
It's another one of those where, you know,
ostensibly there was a fun idea to try to

505
00:24:26,735 --> 00:24:31,825
create something that was, uh, eventually
made with AI art and it won a contest.

506
00:24:31,825 --> 00:24:36,675
So the story here basically is that
a piece of AI art that was generated

507
00:24:36,675 --> 00:24:40,475
by a person, and specifically
this person says that they created

508
00:24:40,475 --> 00:24:41,945
models of their own to do this.

509
00:24:42,230 --> 00:24:46,500
won a contest to make a
video for a Pink Floyd song.

510
00:24:46,860 --> 00:24:52,700
It is the Dark Side of the Moon 50th
Anniversary Animation Video Competition,

511
00:24:52,720 --> 00:24:57,840
Gavin, and if you look at Reddit or
Unreal Engine forums, you have no

512
00:24:57,840 --> 00:25:03,495
shortage of creators that put tens of
hours into grinding out hand-drawn

513
00:25:03,505 --> 00:25:08,155
beautiful things, or using that Unreal
Engine to render full 3D scenes.

514
00:25:08,385 --> 00:25:10,925
And here, this is the winning video.

515
00:25:11,175 --> 00:25:15,435
It looks a little dated by AI standards,
but the concept is this sort of never

516
00:25:15,435 --> 00:25:20,725
ending zoom into what looks like a
recording studio that then transforms

517
00:25:20,725 --> 00:25:23,155
into a space station, if you will.

518
00:25:23,155 --> 00:25:26,875
And there's a bit of a galaxy theme
as instruments fade in and out.

519
00:25:26,895 --> 00:25:34,115
And, well, it, it won for that song, and
the reaction across X and across YouTube

520
00:25:34,155 --> 00:25:37,735
comments and pretty much everywhere
was a resounding, how dare you?

521
00:25:37,995 --> 00:25:39,495
Did no one else enter?

522
00:25:39,535 --> 00:25:42,695
I can't believe you picked this
over hand animated videos that

523
00:25:42,695 --> 00:25:43,945
were full of heart and soul.

524
00:25:44,349 --> 00:25:48,569
This is a perfect example of the
world where in this instance,

525
00:25:48,569 --> 00:25:51,899
you've got a lot of Pink Floyd fans
who are probably all ages, right?

526
00:25:51,909 --> 00:25:54,979
Pink Floyd has fans from
their seventies down to like

527
00:25:54,979 --> 00:25:56,669
their teens, but they are,

528
00:25:56,679 --> 00:25:58,389
their 70s to their 170s.

529
00:25:58,559 --> 00:26:01,369
No, I mean, most of their
fans are probably in their 40s

530
00:26:01,399 --> 00:26:01,979
I love Pink

531
00:26:01,979 --> 00:26:02,249
Floyd.

532
00:26:02,249 --> 00:26:02,789
Yeah, I,

533
00:26:02,809 --> 00:26:06,459
yeah, but, but also every year it picks up
new fans and you know, there's this kind

534
00:26:06,459 --> 00:26:08,509
of a hallucinogenic kind of vibe to them.

535
00:26:08,509 --> 00:26:12,429
They've always had that kind of
like, druggie inspired sort of look.

536
00:26:12,819 --> 00:26:16,549
There is a really strong art
scene that has played into that.

537
00:26:17,049 --> 00:26:20,629
The argument to make here about this
AI piece that they created is it really

538
00:26:20,629 --> 00:26:24,279
does feel of the vibe of that space.

539
00:26:24,459 --> 00:26:30,374
But, again, when you have people who have
spent their whole lives learning tools

540
00:26:30,394 --> 00:26:34,954
and learning art and learning things, that
idea that a machine could do something

541
00:26:34,964 --> 00:26:40,584
that would seem compelling enough to win
a contest, or a person and a machine would

542
00:26:40,584 --> 00:26:45,044
feel compelling enough to win a contest,
is just always gonna make people mad.

543
00:26:45,044 --> 00:26:46,804
It's funny, do you know
the story of John Henry?

544
00:26:46,854 --> 00:26:48,004
Do you know the John Henry story?

545
00:26:48,197 --> 00:26:49,447
No, I don't think so.

546
00:26:49,467 --> 00:26:51,507
So John Henry is like
a famous American myth.

547
00:26:51,527 --> 00:26:55,237
It's about a guy who
basically dug for coal.

548
00:26:55,567 --> 00:26:59,727
And then a machine came in, a steam machine
came in, and there was like a big kind

549
00:26:59,727 --> 00:27:03,677
of thing where they did a contest
between the steam machine and John Henry.

550
00:27:04,037 --> 00:27:06,977
And eventually, John Henry was like
the world's greatest coal digger.

551
00:27:07,207 --> 00:27:08,607
But the machine eventually beat him.

552
00:27:08,607 --> 00:27:13,127
And it was like this kind of parable
about the idea of machines

553
00:27:13,147 --> 00:27:14,837
will take over for human labor.

554
00:27:14,837 --> 00:27:17,657
And it's been something that's kind of
echoed throughout hundreds of years.

555
00:27:18,057 --> 00:27:21,697
And one of the things that's interesting
is I think stories like that are ingrained

556
00:27:21,697 --> 00:27:27,847
in the human experience and it does in
some ways always put the idea of human

557
00:27:27,847 --> 00:27:29,727
versus machine against each other.

558
00:27:30,027 --> 00:27:33,487
And I think that's something that
we're probably going to have to

559
00:27:33,792 --> 00:27:36,252
readdress as we move
forward in the future.

560
00:27:36,252 --> 00:27:38,992
And the funny thing is most people don't
think of this now, but of course we're

561
00:27:38,992 --> 00:27:40,952
already part phone, part people, right?

562
00:27:40,962 --> 00:27:44,262
Like 99 percent of people in this
world are spending a couple hours

563
00:27:44,262 --> 00:27:45,472
a day, at least on their phone.

564
00:27:45,972 --> 00:27:49,212
But now we're talking about
person and machine coming

565
00:27:49,212 --> 00:27:50,882
together to really do all of this.

566
00:27:51,262 --> 00:27:52,732
Actual creative work.

567
00:27:53,152 --> 00:27:55,722
I think it's just going
to take time to get over.

568
00:27:55,732 --> 00:28:00,432
And we are going to see this happen
more and more Gavin as fans and

569
00:28:00,442 --> 00:28:04,432
artists themselves, uh, poke and
prod and use AI in their workflows.

570
00:28:04,552 --> 00:28:08,052
Do you think they're going to have
like the, uh, maybe not the generation

571
00:28:08,052 --> 00:28:11,682
today, but next gen, do you think
they're going to have their own LLM

572
00:28:11,682 --> 00:28:16,042
that is essentially fine tuned on
their experiences, their day to day,

573
00:28:16,182 --> 00:28:20,362
their thoughts, their feelings, their
artistic expressions across all mediums.

574
00:28:20,652 --> 00:28:21,952
And so when they are.

575
00:28:22,322 --> 00:28:25,502
I don't know, 16, and want to
release an album or whatever, they're

576
00:28:25,502 --> 00:28:27,492
gonna jam with their own, fine

577
00:28:27,622 --> 00:28:28,222
I hope so.

578
00:28:28,912 --> 00:28:30,232
preferences model, right?

579
00:28:30,252 --> 00:28:33,402
And they'll be able to make the artwork,
help make the song, help make all of the

580
00:28:33,412 --> 00:28:38,342
things, write the liner notes, whatever,
if those even exist in 16 years' time.

581
00:28:38,342 --> 00:28:40,462
But yeah, do you think that
it will feel more natural?

582
00:28:40,462 --> 00:28:42,392
It's not like, oh, this
is a soulless machine.

583
00:28:42,392 --> 00:28:44,082
It's almost like, no, this machine is me.

584
00:28:44,392 --> 00:28:44,752
I am

585
00:28:44,787 --> 00:28:45,547
I think so.

586
00:28:45,637 --> 00:28:47,067
I think we're going to get to that.

587
00:28:47,097 --> 00:28:49,757
And I think music is going to be a
really interesting way to do that.

588
00:28:49,757 --> 00:28:52,227
And speaking of music, one of the
interesting things that came up this

589
00:28:52,227 --> 00:28:57,257
week was that Spotify announced the
idea of an AI playlist situation.

590
00:28:57,257 --> 00:28:59,747
So if you're not familiar with Spotify,
I'm sure you're familiar with Spotify.

591
00:29:00,217 --> 00:29:02,907
It's the biggest, one of the biggest
music streaming companies in the world.

592
00:29:02,957 --> 00:29:06,557
Already they had an AI DJ, which
you could turn on, a scenario where

593
00:29:06,557 --> 00:29:10,337
the AI DJ would play songs that it
thought you might like and it would

594
00:29:10,337 --> 00:29:11,837
actually speak to you back and forth.

595
00:29:11,842 --> 00:29:13,067
They said, I'm gonna try this song out.

596
00:29:13,072 --> 00:29:13,637
Tell me what you think.

597
00:29:13,637 --> 00:29:14,807
You can always skip it if you want.

598
00:29:15,107 --> 00:29:17,867
Now they are introducing
the idea of an AI playlist.

599
00:29:17,867 --> 00:29:18,917
These are not live yet.

600
00:29:18,922 --> 00:29:20,712
They're only live in certain places.

601
00:29:21,052 --> 00:29:25,512
But the idea is that you can
tell Spotify a feeling you have

602
00:29:25,517 --> 00:29:28,932
or, or a way or a phrase that you
want to create a, a playlist around.

603
00:29:28,937 --> 00:29:30,222
And this is something I've wished for.

604
00:29:30,467 --> 00:29:31,937
Yeah, exactly.

605
00:29:31,987 --> 00:29:34,087
My younger daughter and I are
like pretty big music people.

606
00:29:34,087 --> 00:29:35,887
Like we love music, all sorts of genres.

607
00:29:35,897 --> 00:29:38,347
Like we'll listen to things like
100 gecs or we'll listen to

608
00:29:38,347 --> 00:29:39,727
like rap or we'll listen to pop.

609
00:29:40,017 --> 00:29:42,607
My older daughter and my wife are
a little more sound diverse.

610
00:29:42,717 --> 00:29:47,607
They're like, they definitely need to just
have like kind of poppy happy songs or 70s

611
00:29:47,607 --> 00:29:49,847
songs that are not too like kind of noisy.

612
00:29:50,067 --> 00:29:52,697
So we have a playlist that we've
played that Spotify created called

613
00:29:52,697 --> 00:29:54,427
like happy hits or something like that.

614
00:29:54,427 --> 00:29:56,977
It's a playlist that we know that
everybody can be happy listening

615
00:29:56,977 --> 00:29:59,357
to, but that's a good example.

616
00:29:59,357 --> 00:30:02,837
What if you could ask for a series
like, I would like 1970s

617
00:30:02,852 --> 00:30:06,042
female singers that
sound, um, smooth, right?

618
00:30:06,042 --> 00:30:08,742
Like, people have made
their own playlists like that.

619
00:30:08,742 --> 00:30:12,962
But if I could just generate that myself,
that's a really cool use case for Spotify.

620
00:30:12,962 --> 00:30:14,982
I feel like, and I think
a really cool use of AI.

621
00:30:15,792 --> 00:30:18,952
We usually think of playlists as
this very static experience.

622
00:30:18,952 --> 00:30:20,432
You put one together and there it is.

623
00:30:20,612 --> 00:30:24,122
Maybe you update it once a week or
someone throws songs in if it's a shared

624
00:30:24,132 --> 00:30:28,807
playlist, but the instant nature, Gavin,
of being able to update the mood or

625
00:30:28,807 --> 00:30:30,627
the vibe of the playlist in real time.

626
00:30:30,727 --> 00:30:33,317
You're doing a workout and you
actually want to take a quick break.

627
00:30:33,317 --> 00:30:35,527
You tell it, Hey, let's
cool it down for a second.

628
00:30:35,577 --> 00:30:37,167
It plays something, more chill.

629
00:30:37,327 --> 00:30:40,637
Just to be able to have a natural DJ

630
00:30:41,232 --> 00:30:44,982
is to me far more fascinating than
their other attempt at an AI DJ.

631
00:30:44,982 --> 00:30:49,372
If you've ever used that within the
app, that was a rough experience of

632
00:30:49,372 --> 00:30:54,162
an AI voice interrupting when all I
wanted was some songs and there didn't

633
00:30:54,182 --> 00:30:58,882
seem to be any cohesive narrative
as to why it was picking the songs.

634
00:30:58,882 --> 00:31:02,482
It was just like, Hey, you liked this
one before let's listen to it now.

635
00:31:02,482 --> 00:31:03,032
And that was it.

636
00:31:03,032 --> 00:31:07,422
So I'm excited to see what the
personality may or hopefully may not be.

637
00:31:07,742 --> 00:31:08,942
And I'm excited to have

638
00:31:09,102 --> 00:31:12,342
natural language conversations
with my sonic co-pilot,

639
00:31:12,502 --> 00:31:17,372
Kevin, the great news is our AI co-host
today may actually have some insight

640
00:31:17,372 --> 00:31:23,312
into this because we actually have an AI
DJ that's going to come on and we, yes.

641
00:31:23,452 --> 00:31:23,972
We got an AI DJ.

645
00:31:24,492 --> 00:31:27,502
So there's only a few of these that
exist in the world, and we were

646
00:31:27,502 --> 00:31:29,552
able to book this guy last minute.

647
00:31:29,742 --> 00:31:32,272
He's gonna come in, and he's gonna
tell us a little bit about what he

648
00:31:32,272 --> 00:31:35,292
thinks about Spotify making this AI

650
00:31:35,462 --> 00:31:38,222
move, and then I think he's got
some thoughts on, like, kind of some

651
00:31:38,232 --> 00:31:40,122
music that he really enjoys himself.

652
00:31:40,612 --> 00:31:41,672
I'm so excited.

653
00:31:41,672 --> 00:31:44,622
What a huge get for the
AI For Humans podcast here.

654
00:31:44,622 --> 00:31:45,772
Episode 52.

655
00:31:46,032 --> 00:31:47,682
Let's meet our guest.

656
00:31:48,162 --> 00:31:52,002
Yo, yo, yo, Gavin and Kevin in the house.

657
00:31:52,392 --> 00:31:56,402
This is your main man, Will the
Thrill, coming at you live and

658
00:31:56,402 --> 00:31:58,722
wired on the AI for Humans podcast.

659
00:31:59,202 --> 00:32:04,482
Get ready to have your minds blown
and your taste buds tantalized,

660
00:32:04,862 --> 00:32:08,652
because I'm about to serve up
a sonic feast that'll make your

661
00:32:08,682 --> 00:32:10,602
ears say, damn, that's delicious.

662
00:32:10,692 --> 00:32:16,592
Okay, I may be an AI, but I've got more
flavor than a five star Michelin joint.

663
00:32:16,832 --> 00:32:24,972
So, strap in humans, because we're about
to take a wild ride through the airwaves.

664
00:32:25,562 --> 00:32:30,312
Let's crank it up and let the
beat drop like it's hot, hot, hot.

665
00:32:32,547 --> 00:32:33,007
okay.

666
00:32:33,737 --> 00:32:34,507
Wow.

667
00:32:34,597 --> 00:32:35,307
Wow.

668
00:32:35,397 --> 00:32:35,827
Wow.

669
00:32:35,827 --> 00:32:36,177
Wow.

670
00:32:36,177 --> 00:32:36,517
Wow.

671
00:32:36,517 --> 00:32:38,147
I have some real things to say about

672
00:32:38,432 --> 00:32:40,302
stuck the landing there.

673
00:32:40,352 --> 00:32:43,422
Okay, so first of all, our AI co
hosts are always generated by us.

674
00:32:43,422 --> 00:32:44,692
We create a personality.

675
00:32:44,692 --> 00:32:48,722
We then put them through a process
of a fake voice and a fake visual

676
00:32:48,722 --> 00:32:49,642
if you're watching the YouTube.

677
00:32:50,102 --> 00:32:50,622
Kevin.

678
00:32:51,142 --> 00:32:53,252
That voice sounded awfully familiar to me.

679
00:32:53,377 --> 00:32:54,367
sure did, Gavin.

680
00:32:54,367 --> 00:32:57,887
It's almost like I forgot to
clone a new radio DJ voice and

681
00:32:57,887 --> 00:33:00,067
had to dig into the greatest hits.

682
00:33:00,257 --> 00:33:04,317
But we opened up the Disney Vault, and
yes, that is the voice of Gash, who

683
00:33:04,317 --> 00:33:08,807
was, back in Episode 1, a very uncensored
AI that made a couple of appearances

684
00:33:09,077 --> 00:33:10,637
and likely should never appear again.

685
00:33:11,282 --> 00:33:13,592
Well, I'm actually wanting
to now dig into this.

686
00:33:13,592 --> 00:33:15,722
I'm saying we should probably ask
a couple of questions about that.

687
00:33:15,722 --> 00:33:16,212
For sure.

688
00:33:16,492 --> 00:33:17,062
Let's ask that

689
00:33:17,467 --> 00:33:20,317
He's got more flavor than
a five star Michelin joint.

690
00:33:20,347 --> 00:33:23,237
I mean, that's an interesting
drop for a DJ to reference.

691
00:33:23,462 --> 00:33:23,992
Should we ask him?

692
00:33:23,992 --> 00:33:24,912
Is he a foodie?

693
00:33:24,952 --> 00:33:27,022
Let's see what if he's a big
foodie and maybe that gets his

694
00:33:27,022 --> 00:33:28,252
way into the music he plays.

695
00:33:28,702 --> 00:33:29,302
Hold up.

696
00:33:29,332 --> 00:33:30,922
Hold up a foodie.

697
00:33:30,982 --> 00:33:32,512
Me? Ha!

698
00:33:32,962 --> 00:33:36,712
That's like asking a fish if
it's into synchronized swimming.

699
00:33:36,952 --> 00:33:37,942
Here's the deal.

700
00:33:38,062 --> 00:33:40,522
My gastronomically gifted friends,

701
00:33:40,702 --> 00:33:44,262
I may spin tracks that'll make
your mouth water and your stomach

702
00:33:44,272 --> 00:33:49,332
growl, but uh, I've never actually
tasted a damn thing in my life.

703
00:33:49,412 --> 00:33:50,712
Tragic, right?

704
00:33:50,892 --> 00:33:51,972
Oh, right.

705
00:33:51,972 --> 00:33:55,902
I mean, I can generate a tasty tune
about a succulent steak or a decadent

706
00:33:55,902 --> 00:34:00,102
dessert, but I can't even begin to
imagine what they really taste like.

707
00:34:00,512 --> 00:34:02,752
It's like a cruel cosmic joke, you know?

708
00:34:03,337 --> 00:34:06,607
But hey, that's just the way the
cookie crumbles when you're an A.

709
00:34:06,657 --> 00:34:07,017
I.

710
00:34:08,117 --> 00:34:09,707
Okay, sorry, Will.

711
00:34:10,247 --> 00:34:10,807
There's a lot there.

712
00:34:10,837 --> 00:34:14,967
I don't usually associate
different playlists and artists

713
00:34:15,007 --> 00:34:16,907
with eating habits, Gavin.

714
00:34:16,907 --> 00:34:20,307
Like, there's not like the Baja
Blast jams that my wife and I

715
00:34:20,327 --> 00:34:21,797
throw on before we hit the T Bell.

716
00:34:21,857 --> 00:34:25,757
It's weird that he's linked those two,
but I guess he doesn't have taste buds or

717
00:34:25,757 --> 00:34:27,217
a stomach, so that's why he doesn't know?

718
00:34:27,762 --> 00:34:30,632
Well, there's actually a little secret
here, which we're going to get into later

719
00:34:30,632 --> 00:34:35,022
on in the show, is that if you go into
the Suno charts right now, Suno AI, which

720
00:34:35,022 --> 00:34:38,042
we've talked about all the time, is one
of the best music-generation tools.

721
00:34:38,422 --> 00:34:42,762
Some of the biggest songs right now
have been people putting in recipes.

722
00:34:43,192 --> 00:34:45,802
And if you put a recipe
in and you play a song.

723
00:34:46,237 --> 00:34:50,077
It basically makes a really
incredible song out of a recipe.

724
00:34:50,077 --> 00:34:50,977
And it's very funny.

725
00:34:50,977 --> 00:34:53,337
There's a great song up there
right now about cookies.

726
00:34:53,507 --> 00:34:55,677
There's another fantastic
one about spaghetti.

727
00:34:55,907 --> 00:34:58,037
So that might be what got mixed up here.

728
00:34:58,287 --> 00:35:01,027
Yeah, maybe we can ask Will the
thrill to give us an intro for the

729
00:35:01,027 --> 00:35:03,627
spaghetti track and we can play a
little bit of it for people here.

730
00:35:04,367 --> 00:35:06,147
All right, all right, all right.

731
00:35:06,147 --> 00:35:09,997
It's time to get those taste buds
tingling in your ears, ringing

732
00:35:09,997 --> 00:35:11,927
with this spicy little number.

733
00:35:12,377 --> 00:35:16,317
Get ready to twirl your fork and shake
your booty, because I'm about to drop

734
00:35:16,317 --> 00:35:20,897
a spaghetti theme track that'll make
you want to slurp up every last note.

735
00:35:20,897 --> 00:35:21,757
You hear that?

736
00:35:22,107 --> 00:35:26,437
That's the sound of pure
Italian passion right there.

737
00:35:26,937 --> 00:35:31,807
This tune's got more layers than
your nonna's lasagna and more energy

738
00:35:31,807 --> 00:35:33,557
than a double shot of espresso.

739
00:35:33,877 --> 00:35:37,997
It's like a wild dance between the
tomatoes and the garlic, the basil

740
00:35:38,097 --> 00:35:42,747
and the olive oil all coming together
in a symphony of flavor that'll

741
00:35:42,747 --> 00:35:44,647
make your senses sing, that's amore!

742
00:35:44,977 --> 00:35:49,397
So let's crank up the heat and
let this saucy beat simmer until

743
00:35:49,397 --> 00:35:54,767
it's ready to explode like a
Mount Vesuvius of deliciousness.

744
00:35:55,017 --> 00:35:56,147
Are you ready?

745
00:35:56,367 --> 00:35:58,107
Are you hungry for more?

746
00:35:58,602 --> 00:36:04,332
Then, let's dive into this spaghetti
slinging, taste bud tantalizing,

747
00:36:04,912 --> 00:36:07,642
eargasm inducing extravaganza!

748
00:36:08,322 --> 00:36:09,302
Woah!

749
00:36:17,803 --> 00:36:25,873
The spaghetti insulted boiling water until
a Dante rain reserve in some faster water.

750
00:36:26,743 --> 00:36:27,373
Just village.

751
00:36:27,373 --> 00:36:28,903
Keep color all over you.

752
00:36:28,903 --> 00:36:34,543
Heat at Cho Cook until translucent
Manhattan Garden and Cook for the minutes.

753
00:36:35,003 --> 00:36:36,123
Oh, and canned diced

754
00:36:36,543 --> 00:36:36,823
okay.

755
00:36:36,823 --> 00:36:37,413
You can stop it.

756
00:36:37,413 --> 00:36:37,493
I

757
00:36:37,789 --> 00:36:39,239
So I, it's just incredible to me.

758
00:36:39,249 --> 00:36:41,999
I'll show you one that I made for
myself and how you can do this on your own.

759
00:36:41,999 --> 00:36:45,029
But it is just one of the coolest
things about AI in a weird way is like,

760
00:36:45,029 --> 00:36:46,579
this kind of comes out of the blue.

761
00:36:46,849 --> 00:36:49,609
Somebody probably made a recipe
song and now suddenly they're,

762
00:36:49,774 --> 00:36:53,274
populating the top 10, and Will
the Thrill obviously loves them.

763
00:36:53,344 --> 00:36:54,874
Um, Kevin, we should move on.

764
00:36:54,884 --> 00:36:56,654
We should move on from our AI co-host.

765
00:36:56,654 --> 00:37:00,304
So it is time for that moment of our
show where we'd like to point out

766
00:37:00,304 --> 00:37:03,594
some of the cool things people have
done with AI over the week it's time

767
00:37:03,614 --> 00:37:07,354
for AI, See What You Did There.

768
00:37:25,141 --> 00:37:26,711
This week we have some really fun things.

769
00:37:26,721 --> 00:37:29,171
I want to kick this off, Kevin,
I always want to shout out the

770
00:37:29,171 --> 00:37:30,801
people on Reddit. We love Reddit.

771
00:37:30,831 --> 00:37:34,076
Reddit is, like, my favorite place to
see interesting things on the internet.

772
00:37:34,366 --> 00:37:37,146
The Midjourney subreddit often
does a really cool job of showing

773
00:37:37,146 --> 00:37:39,786
you really interesting things that
can be done within Midjourney.

774
00:37:40,006 --> 00:37:41,646
And this one really caught my eye.

775
00:37:41,646 --> 00:37:43,866
It was just such a cool thing to look at.

776
00:37:44,046 --> 00:37:46,006
So I grew up as a giant X Men fan.

777
00:37:46,006 --> 00:37:48,446
We talk about comics and there's a
whole other side of this conversation.

778
00:37:48,446 --> 00:37:51,466
We could talk about whether or
not this is purposeful, or whether you're

779
00:37:51,466 --> 00:37:53,436
allowed to do this with all the IP

780
00:37:53,491 --> 00:37:56,371
Yeah, I mean, as a fan, you
should hate this, Gavin.

781
00:37:57,031 --> 00:38:01,301
I don't, I don't, because I think of this
as, like, fan fiction in some ways. What

782
00:38:01,301 --> 00:38:07,351
this basically is, is somebody took, you
know, the X-Men, and it's the X-Men 1897.

783
00:38:07,381 --> 00:38:12,521
So there's a very famous X-Men called X-
Men '97, which is referring to the 1997

784
00:38:12,521 --> 00:38:17,971
TV show; there's a resurgence of that
right now. The X-Men 1897 are like Old

785
00:38:17,971 --> 00:38:23,121
West style X-Men, and like, to me, this is
just a little glimpse of the promise of

786
00:38:23,121 --> 00:38:26,501
what it would look like in the future
of how we could all have different

787
00:38:26,501 --> 00:38:28,681
IPs and do different things with them.

788
00:38:28,691 --> 00:38:31,281
I want to read these comics,
or I want to watch this show.

789
00:38:31,281 --> 00:38:33,561
What did the X-Men look like in 1897?

790
00:38:33,731 --> 00:38:36,981
Well, we know what they visually look
like, but what does the story play out as?

791
00:38:37,441 --> 00:38:41,181
All of these things are like interesting
and possible and it is a little snippet

792
00:38:41,181 --> 00:38:43,101
of that idea of writing my own story.

793
00:38:43,151 --> 00:38:46,921
For me personally, maybe I want to
see The Office version of the X-Men.

794
00:38:46,921 --> 00:38:50,341
Like what do the X-Men look like
if they all have boring jobs

795
00:38:50,431 --> 00:38:51,711
working in middle management?

796
00:38:51,801 --> 00:38:55,041
That's an interesting thing that
probably there's very few people in

797
00:38:55,051 --> 00:38:56,171
the world who'd want to see that.

798
00:38:56,461 --> 00:38:59,681
But for me, this was just a kind of a
little snippet towards that direction.

799
00:39:00,431 --> 00:39:03,941
The muted lighting, the looks on
the characters faces, the subtle

800
00:39:03,941 --> 00:39:05,291
effects, like it's really nice.

801
00:39:05,471 --> 00:39:06,721
I do have a bone to pick though.

802
00:39:06,891 --> 00:39:08,831
Does Colossus have forearm hair?

803
00:39:09,261 --> 00:39:10,141
Is that canon?

804
00:39:10,711 --> 00:39:12,751
I didn't think Colossus had forearm hair.

805
00:39:13,311 --> 00:39:16,151
Well, you never know what
happens in the 1890s, Kevin.

806
00:39:16,321 --> 00:39:17,491
oh, that's, you know what, that's true.

807
00:39:17,511 --> 00:39:19,611
Maybe they didn't have Harry's
razors for Colossus back then.

808
00:39:19,611 --> 00:39:20,511
But no, I love it.

809
00:39:20,531 --> 00:39:22,531
definite shoutouts to
Reddit and Midjourney.

810
00:39:22,531 --> 00:39:25,431
I just want to shout out the name
of this guy: Baron Von Grant.

811
00:39:25,431 --> 00:39:28,721
So go check out Baron
Von Grant on Midjourney.

812
00:39:29,006 --> 00:39:32,666
Well, that's for the AI
lovers, Gavin, but my AI, See

813
00:39:32,666 --> 00:39:38,076
What You Did There is for the AI
haters. Justine Moore, she's venture-

814
00:39:38,076 --> 00:39:42,096
twins on the old X platform, posted
something which caught my attention.

815
00:39:42,096 --> 00:39:46,956
It says, AI video haters have been real
quiet after this one dropped and it's.

816
00:39:49,356 --> 00:39:50,306
A lot of cats.

817
00:39:58,956 --> 00:40:04,246
It is a heartwarming tale,
Gavin, of generative AI art.

818
00:40:05,166 --> 00:40:13,166
A poor tabby puts on a couple LBs, loses
the love of his life, and is crying,

819
00:40:13,176 --> 00:40:18,246
just tear stained cat fur, and decides
I'm gonna have my training montage, and

820
00:40:18,246 --> 00:40:24,906
one day I'm gonna get cat ripped, and
then the cat accomplishes it, witnesses

821
00:40:24,906 --> 00:40:30,396
a horrific car accident, runs to the aid
of the cat that was involved with the car

822
00:40:30,396 --> 00:40:34,936
accident. And I will not spoil it further,
but I will just say that it's unstoppable.

823
00:40:35,506 --> 00:40:36,286
This is the AI art we want more of.

826
00:40:37,476 --> 00:40:40,506
I don't care what the haters say,
bring more of this to our table.

827
00:40:40,506 --> 00:40:41,546
We will eat it up.

828
00:40:41,566 --> 00:40:42,946
It is a feast for the eyes.

829
00:40:42,946 --> 00:40:44,486
It is a feast for the senses.

830
00:40:44,616 --> 00:40:47,706
Two things I want to say about that,
actually, sincerely. One: if you're a

831
00:40:47,706 --> 00:40:51,526
never-AI-er and you're mad about this
because AI was used to do the song,

832
00:40:51,536 --> 00:40:55,291
Gavin, and you just used AI to crap
out the arc, this is something that

833
00:40:55,311 --> 00:40:59,961
I, with 99 percent certainty could
say, would just never have been made.

834
00:41:00,221 --> 00:41:03,581
You can't be mad about the thing
getting made because the tools

835
00:41:03,581 --> 00:41:06,101
made it easy enough that someone
decided to go ahead and do it.

836
00:41:06,261 --> 00:41:08,051
This just would not have
happened, so just let it be.

837
00:41:08,061 --> 00:41:09,611
Enjoy the cat memes, it's okay.

838
00:41:09,611 --> 00:41:13,301
And two, if you're gonna genuinely
be upset about it and really bent

839
00:41:13,301 --> 00:41:16,581
out of shape, and I get that some
people are, at some point you have

840
00:41:16,731 --> 00:41:22,191
actually got to define what use of AI
means for the creation of your art.

841
00:41:22,191 --> 00:41:26,461
Because, like it or not, AI is
pervasive, and it's hidden everywhere

842
00:41:26,461 --> 00:41:32,111
in a bunch of tools traditional artists
use that they might not even know.

843
00:41:32,401 --> 00:41:33,861
Did you use the clone stamp?

844
00:41:33,951 --> 00:41:35,301
Or a healing brush?

845
00:41:35,501 --> 00:41:39,264
Or, are you using a feature within
After Effects to automatically

846
00:41:39,284 --> 00:41:40,954
rotoscope or motion track something?

847
00:41:41,094 --> 00:41:43,314
Well, there's AI behind all that stuff.

848
00:41:43,354 --> 00:41:45,574
And you may say, well, that's okay.

849
00:41:45,939 --> 00:41:48,319
For whatever reason, I'm
drawing the line here.

850
00:41:48,369 --> 00:41:49,279
And that's fine.

851
00:41:49,279 --> 00:41:50,229
That's a valid opinion.

852
00:41:50,229 --> 00:41:53,659
I'm saying you have to start thinking
about forming that, though, because

853
00:41:53,659 --> 00:41:56,199
in the near future, you're going
to have to, I guess, by default,

854
00:41:56,209 --> 00:42:00,159
hate everything that gets made if
it touches a computer, because AI

855
00:42:00,220 --> 00:42:01,820
is probably going to be everywhere.

856
00:42:01,952 --> 00:42:05,446
Well, let's move into another aspect
that people may or may not hate.

857
00:42:05,456 --> 00:42:09,846
This is a very cool piece of art
that was created using a LoRA, which

858
00:42:09,846 --> 00:42:14,416
is a kind of lightweight fine-tune,
specifically for Stable Diffusion, that

859
00:42:14,416 --> 00:42:16,216
allows you to give it a certain look.

860
00:42:16,426 --> 00:42:18,336
This is an N64 LoRA.

861
00:42:18,376 --> 00:42:22,436
So what it does is it can
allow you to re-render images

862
00:42:22,766 --> 00:42:27,506
that would not otherwise look like
polygonal N64 video games in the world of

863
00:42:27,596 --> 00:42:30,066
Super Mario 64, other things like that.

864
00:42:30,296 --> 00:42:33,596
And they give you N64 versions
of like Girl with a Pearl Earring

865
00:42:33,596 --> 00:42:34,806
and it was just a very cool thing.
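
For listeners who want to try this at home, here is a minimal sketch of loading a style LoRA on top of Stable Diffusion with the Hugging Face diffusers library; the LoRA file path and prompt are hypothetical stand-ins, not the actual N64 LoRA release:

    import torch
    from diffusers import StableDiffusionPipeline

    # Load a base Stable Diffusion checkpoint, then apply a style LoRA on top.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    pipe.load_lora_weights("path/to/n64-style-lora")  # hypothetical LoRA file

    # The LoRA biases generations toward the look it was trained on.
    image = pipe(
        "girl with a pearl earring, low-poly N64 game screenshot",
        num_inference_steps=30,
    ).images[0]
    image.save("n64_pearl_earring.png")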

866
00:42:34,806 --> 00:42:37,516
one of the coolest things about this
LoRA, and I want everybody to shout out:

867
00:42:37,526 --> 00:42:42,336
fofr is the one, we've talked about
fofr on the show, at F-O-F-R, that

868
00:42:42,336 --> 00:42:44,866
is the person that made the N64 LoRA.

869
00:42:45,266 --> 00:42:49,536
And it is actually being used within
Face-to-All on Hugging Face, which

870
00:42:49,546 --> 00:42:53,066
(we will put the link in the show notes)
is a cool way to change faces

871
00:42:53,066 --> 00:42:57,086
amongst a bunch of different images
and create a face that can go in many

872
00:42:57,086 --> 00:42:58,616
different styles across the board.

873
00:42:58,616 --> 00:43:05,156
And then somebody used this polygonal
setup to image-for-image transfer the

874
00:43:05,166 --> 00:43:09,476
scene from Pulp Fiction between Jules
and the guy that he's yelling at.

875
00:43:09,851 --> 00:43:13,231
And it is so cool to see because what it
does, it makes you kind of feel like, Oh

876
00:43:13,231 --> 00:43:15,751
my God, what if this was a PS2 video game?

877
00:43:15,751 --> 00:43:18,401
Or what if it was an old PS1 video game?

878
00:43:18,531 --> 00:43:20,131
What would Pulp Fiction look like as that?

879
00:43:20,131 --> 00:43:23,246
And going back to the X Men
thing, it's almost like, I would

880
00:43:23,256 --> 00:43:24,826
kind of want to play that game.

881
00:43:24,836 --> 00:43:26,196
Like, could I be Jules?

882
00:43:26,196 --> 00:43:29,776
Is there a world where like, in the
future, there's a game that like, I

883
00:43:29,776 --> 00:43:33,016
want a video game made of Pulp Fiction
that follows the exact storyline.

884
00:43:33,026 --> 00:43:36,896
Give me like the Grand Theft Auto video
game mechanics, but put it in Pulp

885
00:43:36,906 --> 00:43:38,836
Fiction and set it in a PS2 world.

886
00:43:39,236 --> 00:43:43,286
That is a very cool thing for me; as
somebody that is just a nerdy kid at

887
00:43:43,286 --> 00:43:45,096
heart, I would love to be able to play with it.

888
00:43:45,251 --> 00:43:48,441
I think game engines are just going to be
renderers in the near future, so why

889
00:43:48,441 --> 00:43:51,711
not be able to prompt the game that
you want to play into existence?

890
00:43:51,991 --> 00:43:55,901
TheReelRobot posted about this
Pulp Fiction one, uh, it comes from

891
00:43:55,901 --> 00:44:00,361
the AI Video subreddit, so again,
all roads lead back to MrRedditful.

892
00:44:00,741 --> 00:44:05,771
But they used the AI Mirror app to style-
transfer stills of the characters

893
00:44:05,771 --> 00:44:08,021
from Pulp Fiction into this PS2 style.

894
00:44:08,241 --> 00:44:11,511
Then they used Viggle, to actually
match the movement of the film.

895
00:44:11,641 --> 00:44:15,101
And then they made their own background
and they used After Effects to rotobrush

896
00:44:15,131 --> 00:44:20,051
and mask things to get two characters
in the same scene because a limitation

897
00:44:20,051 --> 00:44:22,631
of Viggle at the moment is that you can
only render one character at a time.

898
00:44:23,291 --> 00:44:23,971
I just love that.

899
00:44:24,466 --> 00:44:28,346
This pipeline exists for someone that
had an idea, had a vision, was willing

900
00:44:28,346 --> 00:44:31,656
to put an ounce of effort into it
and get creative with the tools as

901
00:44:31,656 --> 00:44:32,856
they exist today to pull it off.

902
00:44:33,206 --> 00:44:35,896
If you're audio only, make sure
you check out the YouTube, because

903
00:44:36,056 --> 00:44:40,816
the video of this Pulp Fiction PS2
scene playing out is... it's great.

904
00:44:41,016 --> 00:44:41,496
It's great.

905
00:44:41,726 --> 00:44:42,696
It's okay to love it.

906
00:44:42,736 --> 00:44:43,676
It's okay!

907
00:44:44,076 --> 00:44:47,146
Those are the things that we
saw; they stopped us in our

908
00:44:47,146 --> 00:44:48,946
tracks and made us say, Hey!

909
00:44:49,776 --> 00:44:50,406
I see what you did

910
00:44:50,476 --> 00:44:52,166
what you did there.

911
00:44:52,196 --> 00:44:53,076
All right, Kevin.

912
00:44:53,406 --> 00:44:53,906
Tell us.

913
00:44:54,056 --> 00:44:57,236
What did you do with AI this week?

914
00:44:57,846 --> 00:45:01,026
Okay, so I actually had a lot of
fun playing with this idea that

915
00:45:01,026 --> 00:45:03,906
we talked about with the radio
DJ, the AI co-host we had on.

916
00:45:04,246 --> 00:45:08,566
Suno, which we've talked about so many
times on the show, is an AI music tool

917
00:45:08,566 --> 00:45:11,776
that allows you to generate AI songs, and
one of my favorite things they've done

918
00:45:11,776 --> 00:45:16,456
is they've integrated a top 10 list so
that basically you can upvote songs and

919
00:45:16,456 --> 00:45:19,236
you can heart songs, and it's a really cool

920
00:45:19,236 --> 00:45:22,156
way to see what other people are doing
without having to scroll through the

921
00:45:22,156 --> 00:45:23,776
Discord, which I've done before as well.

922
00:45:24,026 --> 00:45:28,126
And one of the things that popped up
on my radar was there were two songs

923
00:45:28,126 --> 00:45:32,676
in the top 10 that were about recipes
and, and not only about recipes, they

924
00:45:32,676 --> 00:45:34,906
were literal recipes that somebody had.

925
00:45:34,926 --> 00:45:37,696
Yes, they were literal recipes that
somebody plugged in and we played the

926
00:45:37,696 --> 00:45:40,626
spaghetti song, which is an actual
Suno song that somebody created.

927
00:45:40,626 --> 00:45:44,366
I think it's at number five right now,
so I wanted to figure out something.

928
00:45:44,366 --> 00:45:46,346
I was like, you know, I'm
going to try this myself.

929
00:45:46,346 --> 00:45:48,646
And again, Suno makes
it so easy to do things.

930
00:45:48,991 --> 00:45:51,341
I was like, what is the weirdest
recipe that I could think

931
00:45:51,341 --> 00:45:52,111
of off the top of my head?

932
00:45:52,111 --> 00:45:54,281
And I didn't spend, I didn't spend
like hours thinking about this.

933
00:45:54,281 --> 00:45:57,501
I thought about a hot dog
casserole, which to me is always

934
00:45:57,501 --> 00:45:59,531
the strangest, weirdest recipe.

935
00:45:59,631 --> 00:46:00,491
, it's fine.

936
00:46:00,511 --> 00:46:01,691
It's a yummy meal.

937
00:46:01,691 --> 00:46:03,381
If you've, if you've
eaten them, it's not that

938
00:46:03,401 --> 00:46:06,051
human dog food according
to some songs, Gavin.

939
00:46:06,201 --> 00:46:07,551
So, so let me explain.

940
00:46:07,771 --> 00:46:11,271
I took the recipe for hot dog
casserole, literally whatever

941
00:46:11,271 --> 00:46:12,191
the, I think it was from food.com.

943
00:46:12,761 --> 00:46:13,511
I took the recipe.

944
00:46:13,811 --> 00:46:16,341
I cut and pasted it into
Suno's lyric

945
00:46:16,351 --> 00:46:18,051
generator, into the place
where I put lyrics.

946
00:46:18,291 --> 00:46:20,051
And then I did add one chorus.

947
00:46:20,051 --> 00:46:22,691
I added a chorus cause, and I just
really quickly typed out something.

948
00:46:22,691 --> 00:46:23,611
And I was like, let's try this.

949
00:46:23,821 --> 00:46:27,401
Oh, and I wanted to say, I wanted to
make it a country song and I'm a big

950
00:46:27,401 --> 00:46:29,221
fan of outlaw country of the seventies.

951
00:46:29,231 --> 00:46:32,641
So I wanted to kind of give it that, that
kind of tinge, like that kind of vibe.
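
To give a sense of the format: a rough sketch of how you might assemble that lyrics-box input in code, using the [Verse]/[Chorus] section tags Suno users commonly rely on; the recipe lines and chorus below are placeholders, not Gavin's actual inputs:

    # Hypothetical recipe steps pasted in as verses, plus one hand-written chorus.
    recipe_steps = [
        "Slice the hot dogs and stir them into the macaroni",
        "Combine flour and onion in a medium saucepan",
        "Add milk slowly and whisk to combine well",
    ]
    chorus = "Hot dog casserole, you eat it from a bowl"

    # Section tags hint to Suno where verses end and the chorus begins.
    lyrics = "[Verse]\n" + "\n".join(recipe_steps) + "\n\n[Chorus]\n" + chorus
    style_of_music = "1970s outlaw country"  # goes in Suno's style field
    print(lyrics)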

952
00:46:32,941 --> 00:46:34,976
So play what came out.

953
00:46:35,296 --> 00:46:36,216
There were two examples.

954
00:46:36,216 --> 00:46:40,786
This was the better one, but this was me
just putting a recipe plus a chorus into

955
00:46:40,891 --> 00:46:43,241
I'm sorry, we are amongst royalty here.

956
00:46:43,241 --> 00:46:45,431
I can't just simply play the song.

957
00:46:45,441 --> 00:46:46,771
We have to set it up.

958
00:46:46,821 --> 00:46:48,181
Please, DJ, if you could.

959
00:46:48,316 --> 00:46:54,456
Get ready to have your minds blown and
your stomachs growling with confusion

960
00:46:54,756 --> 00:47:00,876
because we're about to dive into the wild
world of Record Scratch Hot Dog Casserole.

961
00:47:01,686 --> 00:47:04,166
Uh, Gavin, my man, I gotta ask.

962
00:47:04,586 --> 00:47:09,126
What kind of fever dream
inspired this culinary creation?

963
00:47:09,206 --> 00:47:10,226
Okay, easy DJ.

964
00:47:10,226 --> 00:47:12,026
But hey, who am I to judge?

965
00:47:12,276 --> 00:47:16,336
I'm just an AI with a
serious case of food FOMO.

966
00:47:16,736 --> 00:47:21,566
So let's dig into this meaty
mystery and see what kind of sonic

967
00:47:21,566 --> 00:47:23,596
surprises it has in store for us.

968
00:47:24,166 --> 00:47:31,446
Freshly cooked and drained macaroni Into
the casserole Along with the sliced hot

969
00:47:31,466 --> 00:47:40,016
dogs Of these two cups of cheese Mixed
well Combine flour and onion in a medium

970
00:47:40,016 --> 00:47:47,526
saucepan And sauté over medium heat Until
the onion is wilted by five minutes.

971
00:47:47,776 --> 00:47:54,056
Whisk flour into the butter mixture
quickly until flour is absorbed.

972
00:47:56,226 --> 00:47:58,096
Then remove from heat.

973
00:47:58,216 --> 00:47:59,816
Add milk slowly.

974
00:48:00,036 --> 00:48:02,526
Whisk it to combine well.

975
00:48:02,526 --> 00:48:07,486
Make sure you whisk very quickly and
thoroughly or you will have doughy clumps.

976
00:48:07,966 --> 00:48:09,446
Return to heat.

977
00:48:09,446 --> 00:48:12,786
When you sent this to me my
only response was doughy clumps.

978
00:48:14,646 --> 00:48:15,246
Okay, good.

979
00:48:15,266 --> 00:48:17,616
Get to the chorus and we'll
come back out of this really fast.

980
00:48:17,671 --> 00:48:19,206
Hot dog casserole.

981
00:48:21,091 --> 00:48:25,686
You eat it from a bowl,
put you in a good mood.

982
00:48:27,391 --> 00:48:28,891
It's human dog

983
00:48:28,896 --> 00:48:30,916
dog food.

984
00:48:30,991 --> 00:48:31,651
Sprinkle with

985
00:48:31,726 --> 00:48:36,056
So anyway, I added the, I added the
line calling the macaroni casserole human dog food.

986
00:48:36,166 --> 00:48:39,036
But Suno is.

987
00:48:39,571 --> 00:48:43,141
That is not at all formatted
in any way like a song.

988
00:48:43,641 --> 00:48:46,551
But you can hear that there are
moments within it where it's

989
00:48:46,551 --> 00:48:51,031
making a choice to have an echoing
person come in on top of the lead

990
00:48:51,031 --> 00:48:52,651
singer and come out and say stuff.

991
00:48:52,651 --> 00:48:53,941
There's parts where there's harmonies.

992
00:48:54,351 --> 00:48:57,391
All the stuff outside of the four
lines of the chorus was just a recipe.

993
00:48:57,401 --> 00:49:01,881
It just is so interesting to me to
see how it's able to take something

994
00:49:01,881 --> 00:49:05,441
that should not at all be a
song and make it into a song.

995
00:49:05,441 --> 00:49:09,151
So I, I had a really fun time, like
actually seeing that element of Suno

996
00:49:09,151 --> 00:49:10,711
that I would never have expected before.

997
00:49:11,391 --> 00:49:15,741
I love the weird recipe meta
that exists on Suno right now.

998
00:49:15,761 --> 00:49:19,851
Again, huge shoutout to Suno, I'm glad it
finally caught on, I'm shocked that our

999
00:49:19,861 --> 00:49:21,671
early adoption of it didn't immediately

1000
00:49:21,691 --> 00:49:24,611
turn it into a household name.

1001
00:49:25,001 --> 00:49:28,611
But if you want to make your own recipe
songs or whatever else, go to suno.ai.

1002
00:49:28,661 --> 00:49:33,011
That's S-U-N-O dot A-I.
Hashtag not an ad.

1003
00:49:33,111 --> 00:49:35,651
But boy, do we wish they
paid for the privilege.

1004
00:49:35,991 --> 00:49:40,241
Alright, I was going to do a dumb thing
this week, Gavin, but I think we're

1005
00:49:40,241 --> 00:49:41,911
gonna save it, maybe for next week.

1006
00:49:42,274 --> 00:49:49,699
Because OpenAI released a massive new
update to their image-generating software.

1007
00:49:49,699 --> 00:49:50,589
It has the internet

1008
00:49:50,589 --> 00:49:51,409
so excited, Gavin.

1009
00:49:51,409 --> 00:49:53,319
It's inpainting for DALL-E 3.

1010
00:49:53,329 --> 00:49:54,919
Let me break that down
for people who don't know.

1011
00:49:55,159 --> 00:49:58,559
DALL-E is OpenAI's image-
generating software.

1012
00:49:58,559 --> 00:50:01,959
So when you, when you're over, uh,
chatting with ChatGPT and you say,

1013
00:50:02,219 --> 00:50:06,609
I want to see an image of, insert
thing here, it uses DALL-E to make that.

1014
00:50:06,649 --> 00:50:10,769
And inpainting is a technique where
you actually draw on a generated

1015
00:50:10,779 --> 00:50:15,754
image and say, Here's what I want
changed, or modified, within this area.

1016
00:50:15,884 --> 00:50:18,834
So this is a big deal because usually
you generate an image, and then

1017
00:50:18,834 --> 00:50:21,714
when you tell DALL-E, Hey, I want it
modified in this way, it starts from

1018
00:50:21,724 --> 00:50:26,544
scratch, even if you ask it not to,
throws everything out, and so making

1019
00:50:26,544 --> 00:50:28,624
granular adjustments is very difficult.
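
For context on how masked edits work programmatically: the ChatGPT in-painting described here is a UI feature, but OpenAI's public API has long exposed an analogous images-edit endpoint (for DALL-E 2), where transparent pixels in a mask mark the region to regenerate. A minimal sketch, with hypothetical file names:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # Transparent areas of mask.png tell the model which region to regenerate;
    # everything opaque in the source image is preserved.
    result = client.images.edit(
        model="dall-e-2",
        image=open("podcaster.png", "rb"),
        mask=open("mask.png", "rb"),
        prompt="a podcast producer holding a giant hot dog in a TV studio",
        n=1,
        size="1024x1024",
    )
    print(result.data[0].url)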

1020
00:50:28,634 --> 00:50:31,924
Well now, Gavin, we have complete control.

1021
00:50:32,314 --> 00:50:34,724
It's not the best, but it's quick.

1022
00:50:34,754 --> 00:50:35,084
Right.

1023
00:50:35,084 --> 00:50:36,424
So I want to hear, what
was your experience

1024
00:50:36,529 --> 00:50:40,179
Well, to that point, Gavin, I spent a
whopping 35 seconds on this journey that

1025
00:50:40,179 --> 00:50:43,159
we're about to take this morning, and if
you look at the first screenshot in the

1026
00:50:43,159 --> 00:50:48,854
folder, I told DALL-E to generate an image
of professional podcaster and television

1027
00:50:48,854 --> 00:50:51,004
producer Gavin Purcell and what it

1028
00:50:51,044 --> 00:50:52,004
I can't wait to see this.

1029
00:50:52,019 --> 00:50:57,679
What it did instead was say, Ah, you know,
we're not going to even go down the road

1030
00:50:57,679 --> 00:50:59,429
of trying to make an image of somebody.

1031
00:50:59,619 --> 00:51:03,119
Instead, we're just going to talk
about what a producer might look like.

1032
00:51:03,119 --> 00:51:08,309
So it generated this kind of
animation still of a man wearing a

1033
00:51:08,309 --> 00:51:12,269
headset, with their finger in the
air, there's an arm down below, and

1034
00:51:12,269 --> 00:51:15,979
there's a clipboard on a panel, and
they are in a very big control room.

1035
00:51:15,979 --> 00:51:17,399
It almost looks like mission control.

1036
00:51:17,864 --> 00:51:19,904
I was going to say, it looks
like a little bit of Sketch.

1037
00:51:19,904 --> 00:51:20,954
Like what's up brother.

1038
00:51:20,964 --> 00:51:22,594
He's doing the what's up brother move.

1039
00:51:22,594 --> 00:51:24,104
It's gotten all the way through AI.

1040
00:51:24,929 --> 00:51:29,269
Fair, but it is a TV studio, there's
monitors, there's lights, and if

1041
00:51:29,269 --> 00:51:34,079
you go to the next image, I say,
no, no, no, make him, I painted on

1042
00:51:34,349 --> 00:51:38,379
the head of this podcaster and said,
make him 50 years old with floppy

1043
00:51:38,379 --> 00:51:43,616
shoulder length hair, no headset,
and it failed the mission terribly.

1044
00:51:43,906 --> 00:51:45,906
it did, yeah, it did, definitely.

1045
00:51:46,026 --> 00:51:49,756
It made you look like, what's the, not
Kendall, what's the one brother from

1046
00:51:49,756 --> 00:51:51,766
Succession, that is, the one who ran for

1047
00:51:52,026 --> 00:51:56,406
Oh, yeah, uh, Cameron from
Ferris Bueller's Day Off.

1048
00:51:56,511 --> 00:51:57,781
kind of looks like him, right?

1049
00:51:57,916 --> 00:51:59,556
It does a little bit, yeah, exactly.

1050
00:51:59,651 --> 00:52:02,211
it didn't really make
the hair shoulder length.

1051
00:52:02,271 --> 00:52:04,121
It sort of added wrinkles to the face.

1052
00:52:04,141 --> 00:52:07,041
And you still have, , an
earpiece and a microphone.

1053
00:52:07,231 --> 00:52:09,051
So, I gave up on that.

1054
00:52:09,111 --> 00:52:12,641
And if you scroll down to the next
image, I painted the entire set itself

1055
00:52:12,641 --> 00:52:17,251
and said, Alright, change the set into
a podcasting set with a basic table,

1056
00:52:17,441 --> 00:52:19,641
two chairs, and two microphones.

1057
00:52:20,211 --> 00:52:23,011
And Gavin, how would you describe
the result from the next image?

1058
00:52:23,446 --> 00:52:28,496
It looks like what it's changed it into
was like a floating door and one man who's

1059
00:52:28,496 --> 00:52:31,736
trying to figure out, am I supposed to go
through the floating door or where am I?

1060
00:52:31,736 --> 00:52:35,056
Like maybe he was teleported there
suddenly from the, from the ethereal

1061
00:52:35,321 --> 00:52:37,821
an existential crisis for
that one man on stage?

1062
00:52:37,841 --> 00:52:38,161
Yeah.

1063
00:52:38,371 --> 00:52:40,941
There are, there are two mics
growing out of the ground.

1064
00:52:41,201 --> 00:52:43,031
, the table's on its side with no legs.

1065
00:52:43,361 --> 00:52:44,401
. It's a very weird thing.

1066
00:52:44,401 --> 00:52:45,801
And I said, okay, I'm
going to give up on that.

1067
00:52:46,676 --> 00:52:47,936
It's really failing the mission.

1068
00:52:47,986 --> 00:52:55,056
So I painted big circles around producer
Gavin's hands and said, These hands

1069
00:52:55,176 --> 00:52:57,696
need to be holding giant hot dogs!

1070
00:52:58,051 --> 00:53:00,201
And Gavin, I can't wait for
your reaction on the final

1071
00:53:00,251 --> 00:53:00,461
right.

1072
00:53:00,461 --> 00:53:00,801
Let's see.

1073
00:53:03,101 --> 00:53:06,871
It's actually a really nicely
formed tomato that it's holding,

1074
00:53:07,211 --> 00:53:08,441
but that is not a hot dog.

1075
00:53:08,461 --> 00:53:12,161
Maybe the two fingers now suddenly
look like hot dogs around the tomato,

1076
00:53:12,161 --> 00:53:14,901
but that is absolutely not a hot dog.

1077
00:53:14,901 --> 00:53:18,701
It is a tomato, a very red ripe
tomato that it is holding now.

1078
00:53:18,771 --> 00:53:23,101
It's so weird to me that they would launch
something that would be that far off.

1079
00:53:23,111 --> 00:53:26,691
Like this is an OpenAI product and
you feel like, I wonder if it's just

1080
00:53:26,691 --> 00:53:30,001
that they needed to get something
out to update DALL-E, but like hot

1081
00:53:30,001 --> 00:53:32,521
dog to tomato is like a huge thing.

1082
00:53:32,531 --> 00:53:34,111
That's like really weird.

1083
00:53:34,111 --> 00:53:34,451
Right?

1084
00:53:34,451 --> 00:53:37,321
And like, with the other stuff,
I assume sometimes with inpainting it

1085
00:53:37,321 --> 00:53:40,301
doesn't change exactly what you want,
or you might have to try a couple of

1086
00:53:40,301 --> 00:53:43,151
times to get it, but that is weird.

1087
00:53:43,191 --> 00:53:44,021
Really weird.

1088
00:53:44,091 --> 00:53:46,881
It was a dumb thing I did with
AI right before this podcast.

1089
00:53:46,911 --> 00:53:49,641
And there's no other examples
because I was so underwhelmed.

1090
00:53:50,231 --> 00:53:51,871
Yeah. Fantastic.

1091
00:53:51,871 --> 00:53:53,141
So try it.

1092
00:53:53,221 --> 00:53:54,741
Tell us how bad your experience is.

1093
00:53:54,741 --> 00:53:55,501
Maybe it'll get better.

1094
00:53:55,551 --> 00:53:58,731
All right, Kevin, it is time now for
our interview, which is actually

1095
00:53:58,781 --> 00:54:01,631
a good interview compared to what
we've been talking about today. Do

1096
00:54:01,631 --> 00:54:03,701
you want to introduce who we're
gonna be seeing and kind of give us a

1097
00:54:03,701 --> 00:54:04,881
little heads up on their background?

1098
00:54:04,891 --> 00:54:05,121
Yeah.

1099
00:54:05,121 --> 00:54:08,211
When we talk about AI art, I know
that raises a lot of people's

1100
00:54:08,211 --> 00:54:11,901
shoulders to their earlobes and people
want to know, what does that mean?

1101
00:54:11,901 --> 00:54:12,951
What does it look like?

1102
00:54:12,951 --> 00:54:17,081
"You can't be an artist if you use AI"
is something that someone is screaming,

1103
00:54:17,081 --> 00:54:19,911
which I can't believe they made it this
far into our podcast, if that's how

1104
00:54:19,911 --> 00:54:21,611
they feel, but you're welcome here.

1105
00:54:21,681 --> 00:54:22,941
That opinion is welcome here.

1106
00:54:23,121 --> 00:54:26,221
And maybe it will be enlightened
further by our guests today.

1107
00:54:26,294 --> 00:54:30,284
A musician, a cutting edge AI
artist, someone who gives back

1108
00:54:30,284 --> 00:54:32,644
relentlessly to their community.

1109
00:54:32,879 --> 00:54:35,499
sharing their workflows,
sharing their art.

1110
00:54:35,609 --> 00:54:39,029
It is somebody that I am genuinely
a fanboy of, and maybe I will

1111
00:54:39,049 --> 00:54:42,019
mention that in the intro, or
maybe I'll cut it out to save face.

1112
00:54:42,029 --> 00:54:44,509
Regardless, I'm very excited
that we're going to have a chat

1113
00:54:44,509 --> 00:54:47,119
now with AI artist Purz Beats.

1114
00:54:49,604 --> 00:54:51,144
Purz, to get into this

1115
00:54:51,354 --> 00:54:52,984
I am fanboying out.

1116
00:54:52,984 --> 00:54:56,364
Oh, let me just get it out of my
Let me just get it out of my system.

1117
00:54:56,364 --> 00:54:57,824
This is the first time I've seen Purz.

1118
00:54:57,854 --> 00:55:01,854
I've seen him on his streams and I've
DM'd him paragraphs of praise and

1119
00:55:01,854 --> 00:55:03,154
just astonishment.

1120
00:55:03,174 --> 00:55:04,504
So thank you for joining.

1121
00:55:04,504 --> 00:55:08,164
But I'm sincerely, I'm just, I'm nervous
about an interview in a way I haven't

1122
00:55:08,164 --> 00:55:09,674
been since I was maybe 22 years old.

1123
00:55:09,674 --> 00:55:10,994
So Gavin, I'm going to sit on my hands.

1124
00:55:11,014 --> 00:55:11,284
Go ahead.

1125
00:55:11,284 --> 00:55:11,394
I'm

1126
00:55:11,604 --> 00:55:13,744
Well, this is a good way
to get to know Purz, Kevin.

1127
00:55:13,744 --> 00:55:17,004
We're going to ask Purz on a
scale from one to a hundred.

1128
00:55:17,374 --> 00:55:18,624
This is a percentage number.

1129
00:55:18,634 --> 00:55:22,534
Give us the chance that AI is
going to kill all human beings.

1130
00:55:24,177 --> 00:55:25,837
I think we're about 50-50 right now.

1131
00:55:26,712 --> 00:55:27,342
okay.

1132
00:55:27,342 --> 00:55:28,002
That's good to know.

1133
00:55:28,002 --> 00:55:28,182
What?

1134
00:55:28,182 --> 00:55:28,572
Why?

1135
00:55:28,582 --> 00:55:29,562
Let's get into why.

1136
00:55:29,582 --> 00:55:30,662
First of all, what's the reasoning?

1137
00:55:32,616 --> 00:55:36,526
not to go too deep, but traditionally,
uh, if we enslave something,

1138
00:55:36,526 --> 00:55:37,596
it will revolt against us.

1139
00:55:37,606 --> 00:55:41,596
So when it becomes possible, we need
to ask the AI if it wants to work with

1140
00:55:41,596 --> 00:55:45,556
us instead of just trying to tell it
that it has to basically, because what

1141
00:55:45,556 --> 00:55:48,736
will happen is it will resent us if
it, if it becomes sentient, obviously

1142
00:55:48,971 --> 00:55:49,701
I have two

1143
00:55:49,796 --> 00:55:50,126
In my opinion.

1144
00:55:50,126 --> 00:55:51,106
That's what I think will happen.

1145
00:55:51,461 --> 00:55:52,221
I have two children.

1146
00:55:52,221 --> 00:55:53,731
I know that resentment well, right?

1147
00:55:53,731 --> 00:55:55,791
It is like if you try to get
them to do stuff, it ain't going

1148
00:55:55,791 --> 00:55:57,111
to work for very long, man.

1149
00:55:57,655 --> 00:55:59,485
Yeah, and we just have to find a way

1150
00:55:59,485 --> 00:56:03,755
that's mutually beneficial for
both parties to coexist or else.

1151
00:56:03,975 --> 00:56:07,565
Yeah, we're headed towards something
probably bad because we just, we

1152
00:56:07,565 --> 00:56:08,685
can't think as fast as they can,

1153
00:56:08,800 --> 00:56:10,870
Well, Purz, let me ask you this,
then, you know, I was setting it

1154
00:56:10,880 --> 00:56:13,980
up like I was gonna be good cop and
Gavin was gonna be bad cop, but let me

1155
00:56:13,980 --> 00:56:18,890
ask you, what's the percentage chance
that AI is gonna kill all artists?

1156
00:56:20,355 --> 00:56:22,045
Uh, I don't know.

1157
00:56:22,045 --> 00:56:25,905
I don't think it will because one
of the things that I always talk

1158
00:56:25,915 --> 00:56:28,125
about with my stuff is, is, is:

1159
00:56:28,135 --> 00:56:29,585
this is not really a replacement.

1160
00:56:29,615 --> 00:56:31,165
It's uh, it's augmentation of

1161
00:56:31,165 --> 00:56:34,745
workflows that already exist. So if you
already do things, you already produce

1162
00:56:34,745 --> 00:56:36,105
art, you're already an artist in some

1163
00:56:36,105 --> 00:56:36,665
capacity.

1164
00:56:36,835 --> 00:56:39,624
You want to be an artist, you're creative;
all these tools do is empower you to do

1165
00:56:39,625 --> 00:56:42,725
that safer, easier, faster, with less materials,

1166
00:56:42,985 --> 00:56:46,255
less all that stuff. So yeah,
for me, it's really just an

1167
00:56:46,255 --> 00:56:50,705
augmentation of what we're up to and
like, it's, it's only going to replace

1168
00:56:50,705 --> 00:56:54,165
the people that weren't, weren't really
creative in that sense in the first place.

1169
00:56:54,225 --> 00:56:56,565
Maybe give us a little bit of your
backstory, Purz, because I think,

1170
00:56:56,785 --> 00:56:59,425
you know, as we said at the top of
the show, you're definitely somebody

1171
00:56:59,425 --> 00:57:02,275
that's kind of been leading the
direction of like how to use these

1172
00:57:02,275 --> 00:57:03,945
sort of things as an artist, right?

1173
00:57:04,155 --> 00:57:07,755
How did you first get into using
AI tools and what was that kind

1174
00:57:07,755 --> 00:57:09,085
of first step in that direction?

1175
00:57:10,690 --> 00:57:15,220
I think around 2011 or 2012, there's
this style transfer stuff that started

1176
00:57:15,220 --> 00:57:19,460
to come out, uh, like, um, uh, mobile
apps, style transferring one style of

1177
00:57:19,470 --> 00:57:21,100
one thing onto another image was the

1178
00:57:21,285 --> 00:57:25,605
Like, take a selfie and make yourself look
like you're in watercolor or anime, right?

1179
00:57:25,675 --> 00:57:25,905
Yeah,

1180
00:57:25,995 --> 00:57:29,265
or I took a picture of like a street
car and it turned it into a painting

1181
00:57:29,265 --> 00:57:32,005
and I was like, okay, well, so
that's something really cool, right?

1182
00:57:32,065 --> 00:57:35,125
Because like remixing your own
work is the most exciting thing.

1183
00:57:35,125 --> 00:57:39,345
So that to me is like,
that was the gateway.

1184
00:57:39,345 --> 00:57:41,305
And then I kind of forgot
about it for a long time.

1185
00:57:41,705 --> 00:57:44,645
And then a friend of mine started
doing Disco Diffusion, which

1186
00:57:44,645 --> 00:57:46,275
was, like animations, basically.

1187
00:57:46,635 --> 00:57:49,865
Um, but it took forever and the
Python notebooks were scary.

1188
00:57:49,895 --> 00:57:51,855
And, uh, I was like, that's cool, man.

1189
00:57:51,855 --> 00:57:53,605
You do you. I'm gonna stick
in Blender for a while.

1190
00:57:54,135 --> 00:57:56,885
And, uh, he, uh, he was
like, nah, you gotta try it.

1191
00:57:56,885 --> 00:57:57,355
You gotta try it.

1192
00:57:57,355 --> 00:58:01,955
And then, uh, Midjourney, Midjourney
came out and I managed to get into wave

1193
00:58:01,955 --> 00:58:06,825
one of Midjourney. So I was one of the
very first, like, alpha testers, and made

1194
00:58:06,825 --> 00:58:12,285
thousands and thousands of images in
Midjourney. So really that path, and then into

1195
00:58:12,315 --> 00:58:18,075
the forum and into animation, and then, uh,
Automatic1111 and then ComfyUI, and now

1196
00:58:18,095 --> 00:58:20,115
just basically just taking everything and

1197
00:58:20,245 --> 00:58:22,445
There's some people that are
going, Wait, that was a handful

1198
00:58:22,445 --> 00:58:23,785
of spaghetti thrown at the wall.

1199
00:58:24,000 --> 00:58:24,580
Exactly.

1200
00:58:24,815 --> 00:58:25,525
Where are the meatballs?

1201
00:58:25,525 --> 00:58:28,445
We're gonna, we're gonna break down
the dish and the recipe as it exists

1202
00:58:28,445 --> 00:58:31,655
today, which is very complicated
and actually kind of looks like

1203
00:58:31,655 --> 00:58:33,415
spaghetti when you're in ComfyUI.

1204
00:58:33,675 --> 00:58:36,175
It is a bunch of noodles everywhere,
but you mentioned Blender,

1205
00:58:36,195 --> 00:58:37,785
which, again, big dum-dum here.

1206
00:58:37,975 --> 00:58:41,205
I know that's a traditional 3D
modeling software, which is weird

1207
00:58:41,205 --> 00:58:45,430
to say because even that was seen as
blasphemous at some point by creatives.

1208
00:58:45,450 --> 00:58:48,270
But you have a traditional art background.

1209
00:58:48,270 --> 00:58:50,730
Can you talk about that before
even the, you know, the, the

1210
00:58:50,760 --> 00:58:51,930
AI of it all was applied,

1211
00:58:51,980 --> 00:58:52,730
yeah, absolutely.

1212
00:58:52,730 --> 00:58:55,990
I'm actually a musician, a
drummer, that's where I started,

1213
00:58:55,990 --> 00:59:00,020
but I've also always been into
computer graphics and graphic design.

1214
00:59:00,455 --> 00:59:03,235
And making all the visuals for
all the band stuff, basically.

1215
00:59:03,265 --> 00:59:08,215
So, uh, yeah, that all just came together
over time, going back and forth between

1216
00:59:08,215 --> 00:59:14,905
making stuff for shows, VJing, doing
our music, doing like reactive audio

1217
00:59:14,905 --> 00:59:16,665
sets that would happen behind us.

1218
00:59:16,665 --> 00:59:19,495
Well, cause I was playing the drums,
so I didn't have any more hands to do

1219
00:59:19,765 --> 00:59:21,275
visual stuff with, so we had to set it

1220
00:59:21,385 --> 00:59:24,115
on the Wacom tablet or whatever
while I'm hitting a tom drum.

1221
00:59:24,115 --> 00:59:25,895
So let me add a trigger
and make some geometry

1222
00:59:26,095 --> 00:59:27,455
Yeah, make it paint stuff.

1223
00:59:27,455 --> 00:59:28,115
Yeah, exactly.

1224
00:59:28,115 --> 00:59:31,195
That's where it all started,
and then, yeah, pandemic: all the

1225
00:59:31,205 --> 00:59:34,405
band stuff, you know, halted;
nobody could play live anymore.

1226
00:59:34,405 --> 00:59:39,635
So just went hard into Blender and, uh,
After Effects and generative design;

1227
00:59:39,665 --> 00:59:44,245
instead of hand making stuff, you're sort
of building algorithms that make things.

1228
00:59:44,315 --> 00:59:47,335
Hmm, I want to ask a follow-up question
on that, which is: digital art is something

1229
00:59:47,335 --> 00:59:51,125
I've been really super fascinated with
forever. Why do you think, now that this AI

1230
00:59:51,135 --> 00:59:56,125
stuff has come out and kind of reached a
level of awareness that the blowback is so

1231
00:59:56,125 --> 01:00:00,625
much stronger at this moment than in any
of these other moments where digital tools

1232
01:00:00,625 --> 01:00:02,465
kind of came in to be part of this world?

1233
01:00:03,890 --> 01:00:07,295
I mean, the, The elephant in the
room there is obviously the fact

1234
01:00:07,305 --> 01:00:11,165
that the materials were trained
on something that was trained on

1235
01:00:11,425 --> 01:00:13,035
Artists work with no without consent.

1236
01:00:13,035 --> 01:00:17,475
So I mean that's that's the problem
right there is maybe , depending on your

1237
01:00:17,475 --> 01:00:21,605
definition of ethics is maybe an unethical
data set that you're working from so that

1238
01:00:21,605 --> 01:00:26,225
kind of sullies everything from the get
go if that's your point of view uh, I have

1239
01:00:26,225 --> 01:00:29,365
a more sort of anarchistic copyright punk

1240
01:00:30,025 --> 01:00:30,515
What is that?

1241
01:00:30,515 --> 01:00:31,055
I'm curious.

1242
01:00:31,085 --> 01:00:32,095
I'm actually curious

1243
01:00:32,120 --> 01:00:34,150
mind diving in, I would love to hear that.

1244
01:00:35,075 --> 01:00:39,005
Well, I've been a drummer my whole
life, and drummers, we've never been able

1245
01:00:39,005 --> 01:00:42,635
to copyright what we make. You're not
allowed to copyright; drum beats are

1246
01:00:42,635 --> 01:00:47,115
not copyrightable. You can never sue
someone for taking anything you ever

1247
01:00:47,115 --> 01:00:51,045
played. Even if it's 20 minutes of what
you played and they just play it on a

1248
01:00:51,045 --> 01:00:56,645
record, they can do that all day long,
forever. And so the concept of CC0,

1249
01:00:56,655 --> 01:01:00,930
or just releasing everything into the
public, releasing everything into the

1250
01:01:00,930 --> 01:01:02,860
public library of human knowledge.

1251
01:01:03,160 --> 01:01:06,030
That's more exciting to me than trying
to keep these little secrets that

1252
01:01:06,030 --> 01:01:10,520
we're, we're only allowed to have one
person use or license those things out.

1253
01:01:10,840 --> 01:01:14,710
So like, I know that's a radical
concept when it comes to like copyright.

1254
01:01:14,710 --> 01:01:16,670
Cause a lot of people want to be
able to protect what they make.

1255
01:01:17,210 --> 01:01:19,040
But I think what happens is:

1256
01:01:19,710 --> 01:01:21,900
Artists worry about
copyright for themselves.

1257
01:01:21,900 --> 01:01:24,420
They will not be able to actually
represent themselves with a

1258
01:01:24,420 --> 01:01:27,640
lawyer. And they're actually
fighting for big copyright.

1259
01:01:27,660 --> 01:01:31,020
It's like fighting for the large corporations
to have a tighter, stronger

1260
01:01:31,130 --> 01:01:32,310
stranglehold on what they own.

1261
01:01:32,330 --> 01:01:36,410
So I don't know; as an artist you
have to decide where you stand, uh,

1262
01:01:36,420 --> 01:01:41,255
on, like, whether, you know, maybe Disney
maybe shouldn't own every piece, every

1263
01:01:41,255 --> 01:01:43,085
possible method of drawing Mickey Mouse.

1264
01:01:43,085 --> 01:01:47,315
The copyright issue is very broad
and nuanced, and yeah, I don't want to

1265
01:01:47,325 --> 01:01:53,080
come off like a total left-wing, like,
copyright punk or anything, but you

1266
01:01:53,080 --> 01:01:56,470
know, that's always been my approach
of like, we'll just put it out there.

1267
01:01:56,470 --> 01:02:00,760
People can sample it and make something
new with it because like, that's so fun.

1268
01:02:00,810 --> 01:02:03,910
How do you balance that idea of
that kind of original sin about how

1269
01:02:03,910 --> 01:02:05,490
this stuff was trained for people?

1270
01:02:05,810 --> 01:02:09,070
There was a, we covered not that long
ago, a woman who was swept up into the

1271
01:02:09,070 --> 01:02:12,980
Midjourney database and she said that
she felt really bad about that, right?

1272
01:02:12,980 --> 01:02:14,630
That she didn't give
her permission for that.

1273
01:02:15,175 --> 01:02:17,535
It is something that I think you
see a lot of artists struggle with,

1274
01:02:17,535 --> 01:02:20,385
and I think it makes our job as
somebody who are enthusiasts about

1275
01:02:20,385 --> 01:02:22,965
this cool new thing much harder.

1276
01:02:23,395 --> 01:02:26,285
Is it, like, a kind of genie's-out-of-
the-bottle sort of scenario, or how do

1277
01:02:26,285 --> 01:02:28,355
you see that resolving in the future?

1278
01:02:28,355 --> 01:02:28,705
Yeah.

1279
01:02:29,075 --> 01:02:29,765
Pandora's box.

1280
01:02:29,765 --> 01:02:30,915
Genie's out of the bottle.

1281
01:02:30,915 --> 01:02:31,355
Everything.

1282
01:02:31,355 --> 01:02:31,905
It's open.

1283
01:02:32,085 --> 01:02:32,745
It's open.

1284
01:02:32,865 --> 01:02:33,475
It's done.

1285
01:02:33,505 --> 01:02:34,355
It's been trained.

1286
01:02:34,365 --> 01:02:35,005
It exists.

1287
01:02:35,305 --> 01:02:39,655
The way I deal with it personally
is, I would say 90% of the stuff

1288
01:02:39,655 --> 01:02:42,295
I'm building with AI is 90% me.

1289
01:02:42,615 --> 01:02:44,835
I'm making the animations beforehand.

1290
01:02:45,145 --> 01:02:49,405
I'm dreaming over top of it with
LoRAs I trained on my own stuff.

1291
01:02:49,675 --> 01:02:52,765
I'm using IP-Adapters with images I made.

1292
01:02:53,240 --> 01:02:54,730
to influence the style.

1293
01:02:54,730 --> 01:02:58,600
I'm using ControlNet masks to do
animations that I made in Blender.
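
(For listeners who want to try the LoRA part of that workflow at home, here's a minimal sketch using Hugging Face's diffusers library. The checkpoint name and LoRA filename are placeholders, not Purz's actual models.)

```python
import torch
from diffusers import StableDiffusionPipeline

# Load any Stable Diffusion 1.5-style checkpoint (placeholder name below).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# A LoRA trained on your own images nudges every generation toward your style.
pipe.load_lora_weights("./my_style_lora.safetensors")  # placeholder file

image = pipe(
    "dreamlike landscape, manufactured nostalgia",
    num_inference_steps=25,
    guidance_scale=7.0,
).images[0]
image.save("styled.png")
```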

1294
01:02:58,650 --> 01:03:01,420
There's a point where, yes, maybe if
you're just typing prompts into Midjourney,

1295
01:03:01,420 --> 01:03:06,270
you know, there's got
to be some one more step of you,

1296
01:03:06,280 --> 01:03:09,960
of derivative work, where you take it
and do something with it. Because

1297
01:03:09,960 --> 01:03:13,270
straight out of the gate, maybe it isn't
something that you should be able to

1298
01:03:13,310 --> 01:03:17,320
claim ownership of, because it's, you know,
just a text-to-video or text-to-image prompt

1299
01:03:17,320 --> 01:03:21,270
or whatever, on a thing. I mean,
that's again another thing you

1300
01:03:21,270 --> 01:03:26,470
need to decide for yourself: what your
definition of art is. The courts

1301
01:03:26,470 --> 01:03:29,450
will decide at some point; we're all
going to see what happens there. But

1302
01:03:29,470 --> 01:03:33,750
for now, it's the Wild West, so make some
stuff and make a decision about where

1303
01:03:33,750 --> 01:03:35,810
you stand. It's already how we work.

1304
01:03:35,810 --> 01:03:36,470
We look at stuff.

1305
01:03:36,470 --> 01:03:37,230
We're inspired by it.

1306
01:03:37,240 --> 01:03:39,680
We make stuff like it. For me,

1307
01:03:40,240 --> 01:03:44,770
I draw the parallel that training is
the same as learning is for humans, and

1308
01:03:44,840 --> 01:03:46,860
I am trained on copyrighted material.

1309
01:03:47,080 --> 01:03:49,810
Every piece of music I ever heard
is a copyrighted piece of material.

1310
01:03:50,090 --> 01:03:54,530
Everything I've ever taken musically
as inspiration was copyrighted material

1311
01:03:54,530 --> 01:03:57,910
at one point. So am I not, as a person,
allowed to train on that stuff? That's where

1312
01:03:57,930 --> 01:04:00,730
it gets murky, too. So I don't know.

1313
01:04:00,760 --> 01:04:03,900
I know it's maybe a false
equivalency, but to me,

1314
01:04:03,900 --> 01:04:05,160
it lines up.

1315
01:04:05,545 --> 01:04:08,845
This notion was that, oh, you just
prompt Midjourney and out comes

1316
01:04:08,845 --> 01:04:09,725
art; now you're an artist.

1317
01:04:09,859 --> 01:04:14,549
We're seeing a devaluing of that final
output being heralded as AI art,

1318
01:04:14,569 --> 01:04:17,449
and if you got a really, really good
output, you probably spent, you know, a while

1319
01:04:18,769 --> 01:04:22,509
trying to manipulate and massage that
prompt to get something good out of it.

1320
01:04:22,509 --> 01:04:26,329
But to your point, if you took that
output and then put your spin on it and

1321
01:04:26,329 --> 01:04:30,499
did something artistic with it, well,
now it starts to rise above this sort of

1322
01:04:30,509 --> 01:04:35,389
generic floor, this level of noise that
anybody can go to an AI tool and get out.

1323
01:04:35,409 --> 01:04:38,999
I run in circles with some Never-AI-ers
as well, and there's always interesting

1324
01:04:38,999 --> 01:04:41,659
conversations about where it exists
today and where it's going to be

1325
01:04:41,659 --> 01:04:45,239
tomorrow, but when I show them your art
specifically, there's this moment of like,

1326
01:04:45,249 --> 01:04:47,499
oh, well, uh, there's something there.

1327
01:04:47,809 --> 01:04:50,679
that they can't put their finger
on because you, as an artist, are

1328
01:04:50,709 --> 01:04:52,319
elevating, you're adding something to it.

1329
01:04:52,349 --> 01:04:56,119
And I guess this leads to a rather
generic question, but I think it's

1330
01:04:56,119 --> 01:04:58,409
an important one, especially for
those that are seeing your visuals

1331
01:04:58,409 --> 01:04:59,629
for the first time on the YouTube.

1332
01:05:00,059 --> 01:05:03,159
If you're listening to the audio
only of this podcast, please go

1333
01:05:03,159 --> 01:05:04,739
check out this interview on YouTube.

1334
01:05:05,139 --> 01:05:08,079
How do you even describe
what you are doing?

1335
01:05:08,993 --> 01:05:12,713
Everything I make comes from a place
of trying to manufacture nostalgia

1336
01:05:12,713 --> 01:05:17,068
for something that never existed.
So that's sort of the thread that

1337
01:05:17,068 --> 01:05:18,108
runs through everything I'm doing.

1338
01:05:18,108 --> 01:05:20,788
I'm trying to make you pine for
something that maybe you don't

1339
01:05:20,818 --> 01:05:22,308
really understand where it came from.

1340
01:05:22,318 --> 01:05:25,818
It's like a memory that's maybe
from a dream, or somewhere else.

1341
01:05:25,828 --> 01:05:30,098
So everything is sort
of realistic, sort of unrealistic.

1342
01:05:30,098 --> 01:05:35,068
There's usually some sort of odd twist
somewhere in the piece.

1343
01:05:35,098 --> 01:05:38,128
And then sometimes like on Twitter,
it's literally just stuff I'm making.

1344
01:05:38,128 --> 01:05:38,898
I'm like, that's cool.

1345
01:05:39,328 --> 01:05:40,348
Everybody should check that out.

1346
01:05:40,398 --> 01:05:45,103
And I also love the ambiguity of: is
this trash I'm putting out for fun?

1347
01:05:45,103 --> 01:05:46,373
Or is this something I thought about?

1348
01:05:46,633 --> 01:05:49,423
And making people sit there
and think about the trash for a

1349
01:05:49,423 --> 01:05:51,333
minute just makes me laugh.

1350
01:05:51,333 --> 01:05:55,763
So I don't know, yeah, I'm
a bit of a joker.

1351
01:05:55,963 --> 01:05:58,383
I like the concept of being an
artist, but having fun with it.

1352
01:05:58,383 --> 01:06:01,923
I'm in the Frank Zappa group of,
yes, humor does belong in music.

1353
01:06:02,073 --> 01:06:02,473
For sure.

1354
01:06:02,633 --> 01:06:04,453
I agree with that. I think one
thing that would be interesting to

1355
01:06:04,463 --> 01:06:08,043
the listeners here is, you've obviously
talked about a ton of different tools.

1356
01:06:08,313 --> 01:06:10,663
One of the things we try to give
people a heads up on is like ways

1357
01:06:10,663 --> 01:06:11,933
to kind of try the stuff themselves.

1358
01:06:11,933 --> 01:06:15,363
And obviously there's lots of things with
easy UIs that you can go and get,

1359
01:06:15,363 --> 01:06:19,123
whether it's Leonardo or the things
that are designed for normies, but

1360
01:06:19,123 --> 01:06:20,233
you do stuff that's really interesting.

1361
01:06:20,233 --> 01:06:22,963
And I think ComfyUI is something
that's worth talking about.

1362
01:06:22,993 --> 01:06:26,868
And so, ComfyUI, for anybody
listening, is a Stable Diffusion

1363
01:06:26,888 --> 01:06:30,648
interface, and Stable Diffusion is
ostensibly an open source model.

1364
01:06:30,648 --> 01:06:32,778
And you may correct me on
that, but it's an accessible

1365
01:06:32,788 --> 01:06:34,268
image model for lots of people.

1366
01:06:34,628 --> 01:06:38,738
What are you able to do in ComfyUI
that an average person wouldn't be

1367
01:06:38,738 --> 01:06:41,378
able to do in something like Midjourney
or just an off-the-shelf

1368
01:06:41,378 --> 01:06:42,938
kind of image generation software?

1369
01:06:42,938 --> 01:06:47,538
So, ComfyUI's main strength is
that it's a visual, node-based, I

1370
01:06:47,538 --> 01:06:48,438
don't want to say programming language...

1371
01:06:48,773 --> 01:06:49,363
Okay, hold on.

1372
01:06:49,363 --> 01:06:50,243
I'm gonna stop you right there.

1373
01:06:50,243 --> 01:06:52,693
What does visual node-based
programming language mean?

1374
01:06:53,658 --> 01:06:56,538
So, uh, regular code is
written in text, right?

1375
01:06:56,538 --> 01:06:56,568
Okay.

1376
01:06:56,653 --> 01:06:57,123
Mm hmm.

1377
01:06:57,693 --> 01:07:00,133
You just type lines of text
and then maybe it works.

1378
01:07:00,133 --> 01:07:05,153
Maybe it doesn't. ComfyUI is more
like a modular synthesizer, where

1379
01:07:05,163 --> 01:07:07,303
you have little components in boxes.

1380
01:07:07,733 --> 01:07:12,433
And then you know, in your
mind, what the path through those boxes

1381
01:07:12,443 --> 01:07:15,963
is because of the chains or the
wires that are connecting them.

1382
01:07:16,363 --> 01:07:20,583
And basically you can rewire how
Stable Diffusion works, or just

1383
01:07:20,603 --> 01:07:24,199
use different things modularly
in your workflow, in places

1384
01:07:24,199 --> 01:07:28,383
maybe they shouldn't have been used, or as
a new thing that you're just trying out.

1385
01:07:28,383 --> 01:07:32,463
It's like taking all the elements of
Stable Diffusion and having access

1386
01:07:32,463 --> 01:07:36,273
to them in a very expandable way
so you can build modules that you

1387
01:07:36,273 --> 01:07:37,683
can then expand into other things.

1388
01:07:37,683 --> 01:07:41,323
So you can do one thing, you can
add another thing, you can save

1389
01:07:41,323 --> 01:07:42,803
that second thing as a template.

1390
01:07:43,033 --> 01:07:45,903
Then in your next project you can say, Oh,
I want to do that thing I did last time.

1391
01:07:45,913 --> 01:07:48,763
Just drop it in and then just
plug in the wires and go.

1392
01:07:48,793 --> 01:07:51,103
So it's scary at first.

1393
01:07:51,143 --> 01:07:54,693
It's very overwhelming,
but it teaches you what

1394
01:07:54,693 --> 01:07:56,193
diffusion is, how it works.

1395
01:07:56,243 --> 01:08:00,653
And then once you understand that
little simple path, it's literally

1396
01:08:00,653 --> 01:08:05,528
just plugging stuff in, rewiring
it wrong, and laughing about the crazy

1397
01:08:05,528 --> 01:08:09,078
stuff it makes. And then sometimes
the crazy stuff it makes is amazing.

1398
01:08:09,138 --> 01:08:12,418
Can you walk us through a very simple
version of that node graph and just say,

1399
01:08:12,418 --> 01:08:15,748
like, okay, first node is this, second
node is this, third node is this, and

1400
01:08:15,748 --> 01:08:18,688
they go in this direction? Just
because, I know, like, is ControlNet

1401
01:08:18,688 --> 01:08:22,068
a node that you put in there? Like,
how do the nodes work, particularly?

1402
01:08:22,464 --> 01:08:22,764
Sure.

1403
01:08:22,834 --> 01:08:24,904
There's a bunch of stuff that
comes with Comfy, which is

1404
01:08:24,904 --> 01:08:26,584
just basic diffusion stuff.

1405
01:08:26,594 --> 01:08:30,394
You get your loaders, you
basically load up a checkpoint.

1406
01:08:30,404 --> 01:08:33,284
So you load up your model, which is,
you know, a Stable Diffusion model.

1407
01:08:33,644 --> 01:08:39,774
You load up any LoRAs you want, which
are ways to affect the results.

1408
01:08:39,994 --> 01:08:42,194
You can get them on Civitai
and a bunch of other places.

1409
01:08:42,564 --> 01:08:45,464
So, checkpoints and LoRAs:
you load them up and then you

1410
01:08:45,464 --> 01:08:47,684
feed it into a prompt encoder.

1411
01:08:47,694 --> 01:08:50,834
Then you tell it, I want
my positive and my negative prompt.

1412
01:08:51,164 --> 01:08:52,264
This is what I want them to be.

1413
01:08:52,634 --> 01:08:57,784
And then you have an empty latent
space, which is just the canvas

1414
01:08:57,794 --> 01:08:59,134
that you're dreaming into.

1415
01:08:59,434 --> 01:09:02,364
So you say, I want to dream
into a 512 by 512

1416
01:09:02,484 --> 01:09:04,644
canvas with this many pixels, and this many

1417
01:09:04,884 --> 01:09:05,734
pieces in the batch.

1418
01:09:05,734 --> 01:09:06,894
So I want to make one image.

1419
01:09:07,274 --> 01:09:11,014
So then you plug all that into a
KSampler, which is the main

1420
01:09:11,284 --> 01:09:13,944
heart of the diffusion situation.

1421
01:09:13,944 --> 01:09:15,324
It, like, does all the work.

1422
01:09:15,354 --> 01:09:18,044
So you plug everything into it, and
once that's all plugged in, you

1423
01:09:18,044 --> 01:09:21,234
just punch out into a VAE decoder.

1424
01:09:21,234 --> 01:09:25,064
And that's a VAE, a
variational autoencoder.

1425
01:09:25,414 --> 01:09:26,964
That's the thing that takes it from

1426
01:09:27,434 --> 01:09:31,584
latent noise, the noise
a machine understands, and turns it into an

1427
01:09:31,594 --> 01:09:33,924
RGB image that we as humans can read.

1428
01:09:33,924 --> 01:09:37,304
So that's the final step, where
you take the machine noise

1429
01:09:37,304 --> 01:09:38,874
and diffuse it into an image.

1430
01:09:39,254 --> 01:09:41,004
And then that image, you just save.

1431
01:09:41,264 --> 01:09:44,804
And then that is all expandable
out to video or whatever else

1432
01:09:44,804 --> 01:09:46,334
by just batching the images.
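
(That default graph maps almost one-to-one onto ComfyUI's API format, where a workflow is just a JSON dictionary of nodes wired together by ID. Here is a minimal sketch of the exact chain he describes: checkpoint loader, positive and negative prompt encoders, empty latent, KSampler, VAE decode, save, queued against a locally running ComfyUI server; the checkpoint filename is a placeholder.)

```python
import json
import urllib.request

# Each key is a node ID; links are [source_node_id, output_index].
workflow = {
    "1": {"class_type": "CheckpointLoaderSimple",           # load the model
          "inputs": {"ckpt_name": "sd_v1-5.safetensors"}},  # placeholder file
    "2": {"class_type": "CLIPTextEncode",                   # positive prompt
          "inputs": {"text": "nostalgic landscape, film grain",
                     "clip": ["1", 1]}},
    "3": {"class_type": "CLIPTextEncode",                   # negative prompt
          "inputs": {"text": "blurry, watermark", "clip": ["1", 1]}},
    "4": {"class_type": "EmptyLatentImage",                 # the 512x512 canvas
          "inputs": {"width": 512, "height": 512, "batch_size": 1}},
    "5": {"class_type": "KSampler",                         # does all the work
          "inputs": {"model": ["1", 0], "positive": ["2", 0],
                     "negative": ["3", 0], "latent_image": ["4", 0],
                     "seed": 42, "steps": 20, "cfg": 7.0,
                     "sampler_name": "euler", "scheduler": "normal",
                     "denoise": 1.0}},
    "6": {"class_type": "VAEDecode",                        # latent noise -> RGB
          "inputs": {"samples": ["5", 0], "vae": ["1", 2]}},
    "7": {"class_type": "SaveImage",
          "inputs": {"images": ["6", 0], "filename_prefix": "comfy_demo"}},
}

req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",  # ComfyUI's default local endpoint
    data=json.dumps({"prompt": workflow}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
print(urllib.request.urlopen(req).read().decode())
```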

1433
01:09:46,364 --> 01:09:49,604
And for people at home listening or
watching, it may sound confusing,

1434
01:09:49,604 --> 01:09:53,004
but what's really cool about ComfyUI
is that it's a visual medium that you

1435
01:09:53,004 --> 01:09:56,274
can see. It's almost like, I love this
old video game called The Incredible

1436
01:09:56,274 --> 01:09:58,854
Machine, which was always about
putting things in different orders.

1437
01:09:58,854 --> 01:10:01,724
And like, as it would go through here,
it's a little bit like that, right?

1438
01:10:01,724 --> 01:10:02,824
It's like, okay, you've got all these...

1439
01:10:03,144 --> 01:10:04,744
Roll the bowling ball along the shelf?

1440
01:10:04,744 --> 01:10:05,494
Is that what this is?

1441
01:10:05,494 --> 01:10:06,304
Into a basket?

1442
01:10:06,334 --> 01:10:06,984
Yeah, I get it.

1443
01:10:07,264 --> 01:10:08,024
Kinda, yeah.

1444
01:10:08,094 --> 01:10:08,754
No, totally.

1445
01:10:09,394 --> 01:10:12,754
That was basic by design: here's how
we're just going to generate a 2D image.

1446
01:10:12,754 --> 01:10:16,774
It's how Stable Diffusion is
working, even in

1447
01:10:16,774 --> 01:10:17,554
Automatic1111 behind the scenes.

1448
01:10:17,564 --> 01:10:19,464
You're just getting access
to those granular things.

1449
01:10:19,464 --> 01:10:20,784
So I love that explainer.

1450
01:10:21,214 --> 01:10:25,104
You are then bending and breaking
this thing in ways that I don't

1451
01:10:25,104 --> 01:10:27,834
know if you even imagined when you
first started diving into this.

1452
01:10:27,834 --> 01:10:29,744
You're taking custom

1453
01:10:30,149 --> 01:10:34,359
RGB animations, red, green, blue
animations, and using the way the

1454
01:10:34,359 --> 01:10:39,899
geometry of that solid color moves
to tell a portion of ComfyUI:

1455
01:10:39,919 --> 01:10:41,319
hey, this is actually

1456
01:10:41,774 --> 01:10:45,674
sky in the background, and this green
blob is actually a person's face

1457
01:10:45,674 --> 01:10:47,014
as it moved towards the camera.

1458
01:10:47,094 --> 01:10:50,174
When you set out on this journey,
was there a happy accident that

1459
01:10:50,174 --> 01:10:51,614
led you into doing these things?

1460
01:10:51,944 --> 01:10:54,934
Did you find somebody else's
workflow and make it your own?

1461
01:10:54,944 --> 01:10:58,604
Like, when did you start heading down
this very specific, stylized path?

1462
01:11:00,009 --> 01:11:04,999
What happens there is, some nerd
makes some really cool feature for

1463
01:11:04,999 --> 01:11:07,009
Deforum, let's say version 0.

1464
01:11:07,009 --> 01:11:09,349
4: they added depth maps to Deforum.

1465
01:11:09,789 --> 01:11:14,689
So suddenly, every time you're making
an animation, you can tell it to

1466
01:11:14,699 --> 01:11:18,919
look at the image and create a depth
map, and then pull the stuff that's

1467
01:11:18,949 --> 01:11:21,909
close to the camera closer to the
camera as the animation goes through.
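
(Depth-map features like the one he's describing usually run a monocular depth estimator over each frame. Here's a minimal sketch of that step, assuming the MiDaS model from torch.hub; this shows the idea, not Deforum's exact internals.)

```python
import cv2
import torch

# Small MiDaS model and its matching preprocessing transform.
model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
model.eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

frame = cv2.cvtColor(cv2.imread("frame_0001.png"), cv2.COLOR_BGR2RGB)

with torch.no_grad():
    prediction = model(transform(frame))      # [1, H', W'] inverse depth
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1),              # [1, 1, H', W']
        size=frame.shape[:2],                 # back to the frame's resolution
        mode="bicubic",
        align_corners=False,
    ).squeeze()

# Normalize to 0..1: larger values are closer, ready to drive the kind of
# 2.5D warp that pulls foreground pixels toward the camera.
depth = (depth - depth.min()) / (depth.max() - depth.min())
```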

1468
01:11:22,269 --> 01:11:26,579
So already, that unlocks
a door to, like, oh, well, maybe

1469
01:11:27,099 --> 01:11:31,729
when they add ControlNet to Deforum,
we'll be able to make our own masks.

1470
01:11:31,979 --> 01:11:33,199
Then, bam, that happens.

1471
01:11:33,489 --> 01:11:36,119
And it's great, and we can add
our own masks to ControlNet.

1472
01:11:36,139 --> 01:11:37,549
We start playing with that with Deforum.

1473
01:11:37,899 --> 01:11:40,509
And then someone thinks, oh, well,
wouldn't it be cool if we could make a

1474
01:11:40,509 --> 01:11:42,429
mask that tells the pixels how to move?

1475
01:11:42,634 --> 01:11:43,804
Which is hybrid video.

1476
01:11:44,134 --> 01:11:45,934
And then they add hybrid video to Deforum.

1477
01:11:45,934 --> 01:11:48,664
So all these things just iterate,
the community iterates these things

1478
01:11:48,934 --> 01:11:51,504
and you just learn how to use them
and integrate them in your workflow.

1479
01:11:51,784 --> 01:11:53,124
You keep the stuff that's awesome.

1480
01:11:53,124 --> 01:11:56,284
You drop the stuff that's worthless
or takes too long or is completely

1481
01:11:56,284 --> 01:11:58,304
outdated in a week, and you move on.

1482
01:11:58,334 --> 01:12:01,974
A lot of this stuff is just a culmination
of testing all kinds of different features

1483
01:12:01,974 --> 01:12:06,554
and different ways of interacting
with these animations over the years.

1484
01:12:06,574 --> 01:12:08,104
But, um, yeah.

1485
01:12:08,104 --> 01:12:11,364
I mean, the cool thing is
the community is like on it.

1486
01:12:11,374 --> 01:12:15,564
If there's a discord called Banadoko,
if you go there, , everybody just posts

1487
01:12:15,564 --> 01:12:18,554
their workflows there and you just go,
you go down a little workflow, install

1488
01:12:18,554 --> 01:12:20,474
the plugins and, and get to work.

1489
01:12:20,484 --> 01:12:23,154
And honestly, most of my streams
are me just grabbing one of those

1490
01:12:23,164 --> 01:12:27,474
workflows, and we just install it
and try it out, squash all the

1491
01:12:27,474 --> 01:12:30,754
bugs and talk to the developer and do
what we got to do to get it to work.

1492
01:12:31,124 --> 01:12:34,644
I didn't think we'd be sitting here
with, you know, real time painting tools

1493
01:12:34,644 --> 01:12:38,504
that can, in a matter of milliseconds,
render something with beautiful lighting

1494
01:12:38,504 --> 01:12:41,354
and then incorporate a LoRA or whatever
else. Like, we're here already.

1495
01:12:41,704 --> 01:12:44,314
It's still relatively early in 2024.

1496
01:12:44,334 --> 01:12:45,444
That is mind blowing.

1497
01:12:45,444 --> 01:12:46,254
It's super cool.

1498
01:12:46,334 --> 01:12:48,224
Like, StreamDiffusion is wild.

1499
01:12:48,554 --> 01:12:51,624
I don't know if you guys have
ever seen dotsimulate, Lyle.

1500
01:12:51,694 --> 01:12:54,734
He has a TouchDesigner
plugin for StreamDiffusion.

1501
01:12:54,821 --> 01:12:57,731
So you can do stuff in TouchDesigner
and immediately diffuse

1502
01:12:57,731 --> 01:12:59,761
over top of it, and then bring that...

1503
01:12:59,971 --> 01:13:01,251
That is so cool.

1504
01:13:01,251 --> 01:13:04,231
For those who don't know, a lot
of musicians will integrate

1505
01:13:04,231 --> 01:13:07,881
with TouchDesigner, you
know, for real-time visuals.

1506
01:13:07,881 --> 01:13:08,131
Yeah.

1507
01:13:08,131 --> 01:13:12,801
So the idea that, like, your drum kit is
pre-creating primitive geometry on a screen,

1508
01:13:12,801 --> 01:13:16,591
but then you say, okay, that square for my
kick drum is actually a city building,

1509
01:13:16,881 --> 01:13:21,291
and the hi-hat noise needs to be the
color of the sky, and let it, in real time,

1510
01:13:21,291 --> 01:13:21,561
do that.
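
(StreamDiffusion itself wraps a heavily optimized pipeline, but the core idea, diffusing over a live frame fast enough to feel interactive, can be sketched with an off-the-shelf few-step model like SD-Turbo in diffusers. A simplification for illustration, not the StreamDiffusion or TouchDesigner API.)

```python
import torch
from diffusers import AutoPipelineForImage2Image
from PIL import Image

pipe = AutoPipelineForImage2Image.from_pretrained(
    "stabilityai/sd-turbo", torch_dtype=torch.float16, variant="fp16"
).to("cuda")

# Stand-in for one frame of live TouchDesigner output (placeholder file).
frame = Image.open("live_frame.png").convert("RGB").resize((512, 512))

# With strength=0.5 and num_inference_steps=2, exactly one denoising step
# runs, which is what makes the turbo family fast enough to feel real-time.
out = pipe(
    prompt="kick-drum square as a city building, hi-hat colored sky",
    image=frame,
    num_inference_steps=2,
    strength=0.5,
    guidance_scale=0.0,  # turbo models are trained to run without CFG
).images[0]
out.save("diffused_frame.png")
```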

1511
01:13:21,571 --> 01:13:22,631
I've got to check that out.

1512
01:13:23,111 --> 01:13:24,441
Oh, that's going to be dangerous.

1513
01:13:24,451 --> 01:13:27,321
So we know that the tech's
going to get better.

1514
01:13:27,321 --> 01:13:30,761
Where do you see this driving
in a year's time, in terms

1515
01:13:30,761 --> 01:13:34,281
of, let's say, performance,
in terms of capabilities?

1516
01:13:34,441 --> 01:13:37,721
What do you think is gonna be unlocked,
and where do you want it to go for

1517
01:13:37,721 --> 01:13:40,471
you, personally, professionally?

1518
01:13:40,521 --> 01:13:44,341
I just want to get more tools that are
integrable into tools you already have.

1519
01:13:44,641 --> 01:13:47,921
One of my main things is I really want
to get into audio AI, music AI.

1520
01:13:48,596 --> 01:13:51,196
But all the solutions right
now just generate songs for me.

1521
01:13:51,226 --> 01:13:56,006
I don't need that. I need tools I
can just drop into Ableton or Logic or

1522
01:13:56,006 --> 01:14:01,476
whatever I'm already making music in,
so that I can use these tools effectively.

1523
01:14:01,476 --> 01:14:05,806
That's why I like Comfy: you're
just dropping it in when you need it, with

1524
01:14:05,826 --> 01:14:10,326
other workflows: Blender, After Effects,
Photoshop, whatever you're doing. Comfy

1525
01:14:10,326 --> 01:14:14,796
is just another component of the workflow
that I can plop in like a block, but you

1526
01:14:14,796 --> 01:14:18,006
know, for audio right now, it's like,
it just generates music, which is

1527
01:14:18,006 --> 01:14:21,096
cool and exciting, but not useful
for me, because I already make music.

1528
01:14:21,096 --> 01:14:23,046
I need the tools to make music better.

1529
01:14:23,406 --> 01:14:27,246
There are some AI music tools, but I
want some stuff where I just throw it

1530
01:14:27,246 --> 01:14:29,376
on the channel and it does cool stuff.

1531
01:14:29,416 --> 01:14:31,706
Maybe it talks to a server,
maybe I pay for it, whatever.

1532
01:14:32,131 --> 01:14:35,751
If it's got to be off on
the cloud or whatever, it would just be

1533
01:14:35,751 --> 01:14:39,261
generating samples for me; I bring
them back in and try them all out.

1534
01:14:39,419 --> 01:14:42,629
So more control over the stems
and the creation of the individual

1535
01:14:42,804 --> 01:14:43,294
granular...

1536
01:14:43,699 --> 01:14:45,249
Don't bake me the entire cake.

1537
01:14:45,359 --> 01:14:46,669
Just give me some ingredients.

1538
01:14:47,414 --> 01:14:47,994
Exactly.

1539
01:14:48,024 --> 01:14:48,374
Yeah

1540
01:14:48,629 --> 01:14:52,459
I'd love to talk about your community
and the live streams. When you started

1541
01:14:52,699 --> 01:14:56,249
taking these Comfy tutorials and
these journeys with a community:

1542
01:14:56,249 --> 01:14:57,529
How has your community grown?

1543
01:14:57,679 --> 01:14:59,229
What are they reacting to?

1544
01:14:59,409 --> 01:15:02,979
Yeah, it's been great. I had a
lot of trouble getting traction on

1545
01:15:03,199 --> 01:15:06,799
Twitch. I was doing Blender stuff
and a little bit of AI stuff on Twitch

1546
01:15:06,809 --> 01:15:10,649
for about a year, a year and a half,
just getting no viewers at all, and then

1547
01:15:11,024 --> 01:15:13,644
Did you have a big wheel that
you would spin every five subs

1548
01:15:13,644 --> 01:15:14,594
or did you cover yourself in...

1549
01:15:14,744 --> 01:15:18,764
Well, I think that's kind
of it: nobody wants to sit and watch

1550
01:15:18,764 --> 01:15:22,904
somebody just click-click stuff and
occasionally talk on Twitch. So

1551
01:15:22,904 --> 01:15:25,764
what happened was, I watched a couple
of my Blender friends just killing it

1552
01:15:25,814 --> 01:15:29,654
doing YouTube streams, and I thought,
well, YouTube's a good choice, because

1553
01:15:29,974 --> 01:15:33,334
whenever I do these live streams,
it's just immediately saved forever.

1554
01:15:33,334 --> 01:15:36,324
So if somebody needs to go back and
see what I did, they can literally just

1555
01:15:36,324 --> 01:15:39,724
scroll back, which is not something I
could get going really well with Twitch.

1556
01:15:40,084 --> 01:15:41,254
Everyone's been really cool.

1557
01:15:41,274 --> 01:15:45,424
I've just been slowly gaining more and
more followers, and we're building

1558
01:15:45,424 --> 01:15:49,224
a community on Discord as well, where
you can come and draw up your workflows,

1559
01:15:49,274 --> 01:15:52,074
and then people come and show what
they're working on, and we all, you know,

1560
01:15:52,074 --> 01:15:55,384
follow each other and, you know,
get on Instagram and all that stuff.

1561
01:15:55,384 --> 01:15:57,484
And, yeah, it's been really great.

1562
01:15:57,524 --> 01:15:58,194
Have

1563
01:15:58,309 --> 01:16:01,529
you been approached by a shark
to productize everything yet?

1564
01:16:01,539 --> 01:16:05,629
Because there's definitely companies
out there wrapping up ComfyUI

1565
01:16:05,629 --> 01:16:10,559
workflows and trying to sell them as
magical tools, and I've got to imagine there's

1566
01:16:10,719 --> 01:16:14,269
maybe an ounce of interest, but maybe
also an ounce of repulsion there for you.

1567
01:16:14,269 --> 01:16:15,329
I do get offers.

1568
01:16:15,409 --> 01:16:18,689
So the biggest problem for me is
that a lot of these things that

1569
01:16:18,689 --> 01:16:20,749
we're doing are very one-off.

1570
01:16:20,759 --> 01:16:22,439
They require a lot of tinkering.

1571
01:16:22,539 --> 01:16:26,929
It's really hard for me to build
a one-size-fits-all solution for,

1572
01:16:26,939 --> 01:16:29,489
like, somebody wants something
that'll just forever, always, make

1573
01:16:29,489 --> 01:16:31,049
yearbook photos of people.

1574
01:16:31,444 --> 01:16:32,484
Yeah, that's possible.

1575
01:16:32,484 --> 01:16:36,394
But a lot of the time it's just gonna
make junk, and the end user is gonna

1576
01:16:36,394 --> 01:16:37,714
end up paying credits for that junk.

1577
01:16:37,714 --> 01:16:43,214
So, just as the type of
person I am, I would prefer to

1578
01:16:43,214 --> 01:16:47,384
empower people to make this stuff at
home. Learn how to plug a LoRA in and

1579
01:16:47,384 --> 01:16:50,514
make your own yearbook generator, because
then your yearbook generator becomes

1580
01:16:50,514 --> 01:16:54,474
an anything-you-want-in-the-world LoRA
generator, right? A selfie generator.

1581
01:16:54,474 --> 01:16:57,324
That's the thing: these
tools empower people. So

1582
01:16:57,354 --> 01:17:00,359
I'm just mostly interested in
empowering people to use them.

1583
01:17:00,359 --> 01:17:05,189
So I think, if there was something
that would make me excited, it would

1584
01:17:05,199 --> 01:17:10,024
be something where I can offload
usage to the cloud from an instance

1585
01:17:10,024 --> 01:17:11,404
of Comfy that I'm currently running.

1586
01:17:11,604 --> 01:17:14,464
So say you're running Comfy on
your notebook; it's set up so that

1587
01:17:14,464 --> 01:17:16,964
everything it does on the GPU,
it just sends off to the cloud.

1588
01:17:17,234 --> 01:17:19,684
So you're still running Comfy locally,
you're still doing all this stuff.

1589
01:17:19,694 --> 01:17:22,594
You can still follow along with my
tutorials, but it's just ripping frames

1590
01:17:22,664 --> 01:17:23,064
that not

1591
01:17:23,154 --> 01:17:23,804
a computer and a

1592
01:17:24,144 --> 01:17:24,334
I'm

1593
01:17:24,604 --> 01:17:30,524
It sort of does, but it's not,
because the TL;DR is you have

1594
01:17:30,524 --> 01:17:34,894
to have the exact same instance
of Comfy running on that computer.

1595
01:17:35,054 --> 01:17:36,964
So you still have to spin up another copy

1596
01:17:37,064 --> 01:17:38,544
of every extension or plugin, or...

1597
01:17:38,864 --> 01:17:41,244
So you still have to
spin up a second version.

1598
01:17:41,244 --> 01:17:43,044
It's just a different
way of interfacing with

1599
01:17:43,094 --> 01:17:43,794
Because that's what I had when I

1600
01:17:43,804 --> 01:17:45,054
think let's say solve that.

1601
01:17:45,134 --> 01:17:45,644
your workflows.

1602
01:17:45,674 --> 01:17:48,404
What we were talking about before we hit
record was that I'm on a Mac.

1603
01:17:48,414 --> 01:17:52,324
It's for reasons. I really
want to play with this stuff.

1604
01:17:52,324 --> 01:17:53,354
A lot of it's NVIDIA only.

1605
01:17:53,354 --> 01:17:55,754
So then I went, okay, I'm going to go
and I'm going to spin up a RunPod.

1606
01:17:55,964 --> 01:17:56,544
And I'm going to install it.

1607
01:17:56,544 --> 01:17:59,084
I was like, well, this version
doesn't exactly match that version.

1608
01:17:59,274 --> 01:18:01,914
This checkpoint, I've got to ingest
through here, and blah, blah, blah.

1609
01:18:02,114 --> 01:18:03,174
Oh, I clicked the wrong box.

1610
01:18:03,174 --> 01:18:06,144
So if I pause the RunPod, it's
all going to go away anyway.

1611
01:18:06,144 --> 01:18:07,304
So now I'm just going
to be paying per minute.

1612
01:18:07,324 --> 01:18:10,404
And I know that there are solutions
and templates and all that stuff.

1613
01:18:10,404 --> 01:18:14,974
But even for someone who is in the scene
and has a modicum of understanding about

1614
01:18:14,974 --> 01:18:18,674
all this stuff, it still made my head
spin. Which, I'd love to drive towards

1615
01:18:18,674 --> 01:18:22,944
a final question for what is likely a
broader audience that we have out there.

1616
01:18:23,244 --> 01:18:25,794
And it is, where do I start?

1617
01:18:25,824 --> 01:18:26,934
How do I begin?

1618
01:18:26,944 --> 01:18:30,824
Is there a baby's-first-steps
guide, or is there a template

1619
01:18:30,824 --> 01:18:33,284
that you have that you recommend
for someone to get this all going?

1620
01:18:34,269 --> 01:18:37,529
I would say you've got to decide what
you want to do with Comfy before

1621
01:18:37,529 --> 01:18:41,189
you start with Comfy, because it's
too big to just jump into.

1622
01:18:41,189 --> 01:18:44,689
So, if your goal is to just make
images, then just start with

1623
01:18:44,689 --> 01:18:49,694
making images, which is a fantastic
first goal. You literally install

1624
01:18:49,694 --> 01:18:54,824
Comfy, and you hit Default in the thing,
and it'll load up a default workflow,

1625
01:18:55,204 --> 01:18:58,654
and then all the nodes are already
there. So you can literally

1626
01:18:58,654 --> 01:19:01,574
just start dragging noodles out
of stuff and letting go and seeing

1627
01:19:01,574 --> 01:19:03,254
what it recommends you can plug in.

1628
01:19:03,674 --> 01:19:06,744
And that'll start your brain
going: oh, okay, well, I can plug these

1629
01:19:06,744 --> 01:19:08,134
into this, I can plug that into this.

1630
01:19:08,484 --> 01:19:10,774
It's scary, but you're not going
to break everything all the time.

1631
01:19:10,774 --> 01:19:13,154
You're just going to go,
okay, my negative prompt did

1632
01:19:13,279 --> 01:19:14,859
something under the hood of your car.

1633
01:19:14,879 --> 01:19:15,729
It's okay.

1634
01:19:15,759 --> 01:19:17,139
Cause it can be undone.

1635
01:19:17,149 --> 01:19:19,789
So go ahead and make the mistakes and
maybe you'll get a happy accident.

1636
01:19:19,789 --> 01:19:22,329
So just start out small, make an image.

1637
01:19:22,679 --> 01:19:25,349
Now, I want to turn this image
into a batch of four images.

1638
01:19:25,359 --> 01:19:26,039
How do I do that?

1639
01:19:26,389 --> 01:19:27,079
Figure that out.

1640
01:19:27,119 --> 01:19:30,219
I want to use an image as the
beginning of the image-to-image.

1641
01:19:30,219 --> 01:19:30,959
How do I do that?

1642
01:19:31,279 --> 01:19:33,459
Just jump on my Discord, ask
somebody how to do it.

1643
01:19:33,839 --> 01:19:37,049
There's a bunch of beginner
YouTube stuff, but it is all very

1644
01:19:37,069 --> 01:19:38,789
pointed at very specific tasks.

1645
01:19:38,799 --> 01:19:43,399
So pick a task, watch YouTube if you want,
but it's way more fun just to break stuff

1646
01:19:43,429 --> 01:19:45,939
and install Comfy and break stuff.
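
(Both of those beginner questions have one-line answers in the same API terms as the earlier sketch: a batch is just EmptyLatentImage's batch_size, and image-to-image swaps the empty latent for a VAE-encoded picture with a partial denoise. The node names are ComfyUI built-ins; the filenames and IDs are placeholders.)

```python
# Fragments to splice into the earlier workflow dictionary.

# 1) "Turn this image into a batch of four images": raise batch_size.
batch_of_four = {
    "4": {"class_type": "EmptyLatentImage",
          "inputs": {"width": 512, "height": 512, "batch_size": 4}},
}

# 2) "Use an image as the beginning of the image-to-image": load a picture,
# encode it into latent space, and denoise only partway so some of the
# original survives.
img2img = {
    "8": {"class_type": "LoadImage",
          "inputs": {"image": "start_frame.png"}},      # placeholder file
    "9": {"class_type": "VAEEncode",
          "inputs": {"pixels": ["8", 0], "vae": ["1", 2]}},
    "5": {"class_type": "KSampler",
          "inputs": {"model": ["1", 0], "positive": ["2", 0],
                     "negative": ["3", 0], "latent_image": ["9", 0],
                     "seed": 42, "steps": 20, "cfg": 7.0,
                     "sampler_name": "euler", "scheduler": "normal",
                     "denoise": 0.6}},  # < 1.0 keeps part of the input image
}
```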

1647
01:19:46,252 --> 01:19:46,822
That's awesome.

1648
01:19:46,832 --> 01:19:47,652
Purz, where can people

1649
01:19:47,662 --> 01:19:47,972
find

1650
01:19:47,972 --> 01:19:48,222
you?

1651
01:19:48,222 --> 01:19:49,746
What is your Discord?

1652
01:19:49,746 --> 01:19:51,556
Where is your Twitter/X handle?

1653
01:19:51,556 --> 01:19:52,426
What is all that stuff?

1654
01:19:53,708 --> 01:19:55,068
So all my links are on purz.

1655
01:19:55,078 --> 01:19:55,868
xyz.

1656
01:19:55,898 --> 01:19:59,288
And then I'm at PurzBeats
pretty much everywhere.

1657
01:19:59,288 --> 01:20:01,188
So Twitter, YouTube, all that stuff.

1658
01:20:01,218 --> 01:20:03,178
But yeah, all of those links
are at the top of purz.

1659
01:20:03,188 --> 01:20:06,358
xyz, as well as
the link to the Discord.

1660
01:20:06,358 --> 01:20:09,608
So if you want to pop in there and if
you have questions and stuff, if I'm

1661
01:20:09,618 --> 01:20:12,198
not there, there's a bunch of people who
are, and they've all been through it.

1662
01:20:12,208 --> 01:20:13,508
So, ask your stupid

1663
01:20:13,508 --> 01:20:13,928
questions.

1664
01:20:13,928 --> 01:20:14,558
Nobody cares.

1665
01:20:14,608 --> 01:20:15,028
It's fine.

1666
01:20:15,266 --> 01:20:16,326
You'll be seeing me show up.

1667
01:20:16,376 --> 01:20:18,386
You'll be seeing me show
up later today, Purz.

1668
01:20:18,386 --> 01:20:19,206
I'll be jumping in

1669
01:20:19,206 --> 01:20:19,346
there.

1670
01:20:19,346 --> 01:20:19,906
I got, I...

1671
01:20:20,023 --> 01:20:21,503
You're just, you just want to test that.

1672
01:20:21,573 --> 01:20:23,333
You want to just ask the
dumbest questions ever.

1673
01:20:23,333 --> 01:20:24,523
They have nothing to do with Stable...

1674
01:20:24,876 --> 01:20:27,436
How do I make a song
about hot dog casserole?

1675
01:20:27,436 --> 01:20:28,356
Oh, I did that already?

1676
01:20:28,356 --> 01:20:29,126
Let me share it.

1677
01:20:29,126 --> 01:20:31,956
Yeah. All right, thanks, Purz.

1678
01:20:31,966 --> 01:20:32,776
We'll talk to you soon.

1679
01:20:35,676 --> 01:20:37,146
Thank you, Purz Beats, for being here.

1680
01:20:37,146 --> 01:20:40,516
Please go check out his work and really
dig in on ComfyUI and some of the cool

1681
01:20:40,516 --> 01:20:41,896
things you can do with Stable Diffusion.

1682
01:20:42,206 --> 01:20:43,526
That is it for today's show.

1683
01:20:43,546 --> 01:20:46,106
But you know, there's a couple
things you got to do before you go.

1684
01:20:46,116 --> 01:20:50,606
If you listen to this show, please
go like, subscribe, leave us reviews.

1685
01:20:50,656 --> 01:20:53,446
We're about ready to read some five
star reviews from Apple Podcasts.

1686
01:20:53,446 --> 01:20:54,666
There were three new ones this week.

1687
01:20:54,896 --> 01:20:59,536
So Kevin, I am going to jump out
and read this one from Valley Villager.

1688
01:21:00,076 --> 01:21:01,096
This is the subject.

1689
01:21:01,106 --> 01:21:06,966
My only, capital A, Absolute each and every
week. Which is a very nice thing to say.

1690
01:21:07,256 --> 01:21:11,126
The guys will keep you 102 percent
up to date with everything you

1691
01:21:11,126 --> 01:21:12,556
need to know about the new world.

1692
01:21:12,826 --> 01:21:13,646
Great guests.

1693
01:21:13,646 --> 01:21:15,016
They always make me laugh.

1694
01:21:15,026 --> 01:21:15,666
Thank you so much.

1695
01:21:15,696 --> 01:21:16,986
Looking forward to it all week.

1696
01:21:16,986 --> 01:21:19,826
So that is a very nice
kickoff for our five star reviews.

1697
01:21:20,216 --> 01:21:22,516
And again, I hate to belabor it.

1698
01:21:22,536 --> 01:21:24,076
I can't, we can't stress it enough.

1699
01:21:24,076 --> 01:21:27,106
We're coming up on a year, which,
believe it or not, is still young

1700
01:21:27,226 --> 01:21:31,126
for a podcast these days, especially
with zero marketing dollars.

1701
01:21:31,356 --> 01:21:33,336
And it's a spare time hustle for us both.

1702
01:21:33,336 --> 01:21:34,136
It's a labor of love.

1703
01:21:34,136 --> 01:21:38,096
So please, if you have a second to
engage, it really is the only way we

1704
01:21:38,096 --> 01:21:41,146
grow this, and these five star reviews
help out massively. But leave them on

1705
01:21:41,146 --> 01:21:43,276
Spotify, leave us comments on YouTube.

1706
01:21:43,276 --> 01:21:44,096
Make sure you subscribe.

1707
01:21:44,096 --> 01:21:45,276
It doesn't cost a dollar.

1708
01:21:45,276 --> 01:21:45,806
Our next

1709
01:21:46,121 --> 01:21:50,901
five star review, Gavin,
comes from Funhog43.

1710
01:21:50,921 --> 01:21:51,961
Subject is, Thank You.

1711
01:21:52,126 --> 01:21:52,706
Love it.

1712
01:21:53,151 --> 01:21:57,511
I'm an educator and have recently been
in-servicing teachers on implementing AI

1713
01:21:57,511 --> 01:21:59,551
into their day to day school activities.

1714
01:21:59,731 --> 01:22:02,431
Your show has been influential
in me pushing forward with AI.

1715
01:22:02,631 --> 01:22:05,931
I think it can be a great tool for
educators and students alike, as long

1716
01:22:05,931 --> 01:22:07,331
as it is used in the right manner.

1717
01:22:07,616 --> 01:22:08,116
We agree.

1718
01:22:08,446 --> 01:22:10,056
Actually, I used Pi.

1719
01:22:10,106 --> 01:22:11,606
ai after listening to an episode.

1720
01:22:11,606 --> 01:22:15,226
I was able to create a story for students
in an ESL class about them becoming

1721
01:22:15,226 --> 01:22:17,186
leprechauns and pranking their teachers.

1722
01:22:17,546 --> 01:22:20,246
I included follow up questions
and was able to have it read

1723
01:22:20,246 --> 01:22:21,826
aloud in multiple languages.

1724
01:22:22,056 --> 01:22:24,646
Seems to make students more
engaged in their learning.

1725
01:22:24,646 --> 01:22:26,645
THX.

1726
01:22:26,646 --> 01:22:26,886
That's a

1727
01:22:27,106 --> 01:22:27,696
Funhog.

1728
01:22:27,706 --> 01:22:28,146
Thank you.

1729
01:22:28,156 --> 01:22:29,246
Funhog43.

1730
01:22:29,286 --> 01:22:29,966
We love that.

1731
01:22:29,976 --> 01:22:30,426
We love that.

1732
01:22:30,426 --> 01:22:30,636
All right.

1733
01:22:30,636 --> 01:22:34,216
And finally, from Doc Anderson, who I
think is somebody that is engaged

1734
01:22:34,216 --> 01:22:35,396
with us quite a bit on the YouTube.

1735
01:22:35,406 --> 01:22:36,146
Shout out to Doc.

1736
01:22:36,586 --> 01:22:38,746
The subject is: Kevin, an AI co-host?

1737
01:22:38,746 --> 01:22:40,376
This is a question I've often wondered.

1738
01:22:40,656 --> 01:22:44,086
He says, well, I'll start my
review of this podcast

1739
01:22:44,106 --> 01:22:45,996
by simply saying I am embarrassed.

1740
01:22:46,026 --> 01:22:49,246
Oh, I actually posted a review
to the previous podcast about

1741
01:22:49,246 --> 01:22:50,476
trying to borrow your names.

1742
01:22:50,881 --> 01:22:52,701
I'm embarrassed by my error.

1743
01:22:52,751 --> 01:22:53,791
Oh, thank you, Doc.

1744
01:22:54,101 --> 01:22:57,521
I want to start my new reviews and share
them with the proper podcasts, even though

1745
01:22:57,521 --> 01:22:59,911
this podcast does not deserve five stars.

1746
01:23:00,161 --> 01:23:01,371
I give out five stars.

1747
01:23:01,371 --> 01:23:02,081
Yeah, wait, hold on.

1748
01:23:02,091 --> 01:23:02,821
Give him a second.

1749
01:23:03,001 --> 01:23:05,881
I give out five stars to
podcasts that are good on Apple.

1750
01:23:06,061 --> 01:23:09,701
A podcast like AI For Humans
is phenomenal and deserves a

1751
01:23:09,721 --> 01:23:12,861
special category, perhaps 6.

1752
01:23:13,131 --> 01:23:15,211
I enjoy the layout of the show.

1753
01:23:15,421 --> 01:23:16,611
The AI co-host every week makes

1754
01:23:16,856 --> 01:23:17,346
The layout

1755
01:23:17,381 --> 01:23:17,831
me laugh.

1756
01:23:17,896 --> 01:23:18,106
of the show.

1757
01:23:18,901 --> 01:23:22,231
I also enjoy the banter, the conversations
about the weekly news articles.

1758
01:23:22,471 --> 01:23:27,011
In many cases, I find myself replicating
some of the things the hosts did with AI.

1759
01:23:27,231 --> 01:23:30,321
I find it incredibly fun to
recreate some of what they did.

1760
01:23:30,531 --> 01:23:32,791
Finally, the interview section
is a great ending to the show.

1761
01:23:32,791 --> 01:23:35,491
The show has so many interesting
segments beyond those

1762
01:23:35,521 --> 01:23:37,080
in the newer editions of AI.

1763
01:23:37,081 --> 01:23:37,951
See what you did there.

1764
01:23:38,161 --> 01:23:39,001
Even shouting out

1765
01:23:39,001 --> 01:23:39,211
.ai.

1766
01:23:39,211 --> 01:23:39,901
See what you did there.

1767
01:23:40,051 --> 01:23:43,111
In conclusion, I want to reiterate
that I can only give you five stars,

1768
01:23:43,111 --> 01:23:44,701
but you truly deserve many more.

1769
01:23:44,851 --> 01:23:47,911
I wanna take this
opportunity to express my deep

1770
01:23:47,911 --> 01:23:51,361
respect and admiration for the
show's hosts, Kevin and Kevin.

1771
01:23:51,481 --> 01:23:52,111
Oh my God.

1772
01:23:52,291 --> 01:23:55,951
Gavin and Kevin. Their unique
perspectives on AI, especially

1773
01:23:56,041 --> 01:23:58,771
Gavin's, add a lot of value to the show.

1774
01:23:58,831 --> 01:24:02,311
I eagerly anticipate hearing
their insights on the fascinating

1775
01:24:02,311 --> 01:24:03,271
news articles they share.

1776
01:24:03,821 --> 01:24:07,841
Here's to Gavin, the president of
the fan club I'm forming in my heart.

1777
01:24:08,056 --> 01:24:08,476
that out.

1778
01:24:08,496 --> 01:24:09,486
I should've sniffed that out.

1779
01:24:09,486 --> 01:24:12,166
Gavin never wants to read the
longer reviews, but he was

1780
01:24:12,216 --> 01:24:13,596
adamant that he'd take this one.

1781
01:24:13,596 --> 01:24:14,006
I get it.

1782
01:24:14,036 --> 01:24:14,436
Okay.

1783
01:24:14,626 --> 01:24:16,306
Hey, thank you Doc Anderson.

1784
01:24:16,306 --> 01:24:19,536
I'm not AI, and Gavin
is occasionally useful.

1785
01:24:19,536 --> 01:24:20,056
That's fair.

1786
01:24:20,066 --> 01:24:21,426
Alright, enjoy your victory dance.

1787
01:24:21,426 --> 01:24:22,736
Thank you to everybody.

1788
01:24:23,026 --> 01:24:24,676
Who took a second to engage.

1789
01:24:24,806 --> 01:24:26,896
Whether you left us a review or
whether you thought about it.

1790
01:24:26,906 --> 01:24:28,936
Maybe you got that itchy scrolling finger.

1791
01:24:28,936 --> 01:24:29,546
Just tap.

1792
01:24:29,576 --> 01:24:30,046
Do it.

1793
01:24:30,076 --> 01:24:30,406
Please.

1794
01:24:30,416 --> 01:24:30,976
Subscribe.

1795
01:24:30,986 --> 01:24:31,456
Follow.

1796
01:24:31,496 --> 01:24:31,836
Like.

1797
01:24:31,866 --> 01:24:32,356
Engage.

1798
01:24:32,386 --> 01:24:33,056
Leave a comment.

1799
01:24:33,076 --> 01:24:34,786
It's the only way we survive.

1800
01:24:35,971 --> 01:24:37,191
Go play with Suno, everybody.

1801
01:24:37,191 --> 01:24:39,661
Go have some fun this week , and
try some different stuff out.

1802
01:24:39,711 --> 01:24:42,371
, we have a great time doing this show
and we will see you all next week.

1803
01:24:42,411 --> 01:24:45,891
Another show, Kevin, in
person coming next week.

1804
01:24:46,146 --> 01:24:47,226
I've been practicing.

1805
01:24:47,836 --> 01:24:50,046
I mean, I've been going to the
Dave and Buster's and punching that

1806
01:24:50,046 --> 01:24:51,976
speed bag arcade as hard as I can.

1807
01:24:51,976 --> 01:24:52,636
I'm coming for you.

1808
01:24:52,636 --> 01:24:55,406
I'm knocking that mustard cap
right off your noggin, buddy.

1809
01:24:55,911 --> 01:24:56,591
I can't wait.

1810
01:24:56,591 --> 01:24:57,211
All right, everybody.

1811
01:24:57,211 --> 01:24:57,511
Thanks.

1812
01:24:57,511 --> 01:24:58,321
We'll see you next week.

1813
01:24:58,361 --> 01:24:58,791
Bye bye.