June 5, 2025

GPT-5 Coming Soon, New ChatGPT For Business & Huge AI Video Updates

OpenAI is inching toward GPT-5 this July, and Sam Altman has been out there talking a big game. How good will it be? And what does the future hold? WE DIVE IN.

Between Google & Anthropic, state-of-the-art AI models are pushing OpenAI hard, & the ChatGPT for Business updates are just the start of HOT AI SUMMER LFG (Sam’s words, not ours). Plus, Flux Kontext gives you open-source AI imaging upgrades, Epic’s Tim Sweeney says that AI will make it easier for developers to make games, Palmer Luckey & Mark Zuckerberg make up with each other and YES, VEO 3 GIVES US SYNCHRONIZED SWIMMING CATS.

HOT AI SUMMER IS CALLING AND WE HAVE ON OUR SHORTS. 

 

Join the discord: https://discord.gg/muD2TYgC8f

Join our Patreon: https://www.patreon.com/AIForHumansShow

AI For Humans Newsletter: https://aiforhumans.beehiiv.com/

Follow us for more on X @AIForHumansShow

Join our TikTok @aiforhumansshow

To book us for speaking, please visit our website: https://www.aiforhumans.show/

 

// Show Links //

Hot AI Summer LFG Tweet from Sam Altman

https://x.com/sama/status/1930040146034078061

Sam Altman Interview at Snowflake Summit

https://youtu.be/qhnJDDX2hhU?si=71WRlUFTLQmq82at

GPT-5 In July? Heavy Rumors Suggest Yes

https://x.com/btibor91/status/1929241704873308253

Codex Gets Access To The Internet

https://x.com/sama/status/1930006856019390521

ChatGPT For Business Updates 

https://www.youtube.com/live/9lSRViLugE0?si=we8u0PkFlNSb9Zvk

Claude / Windsurf OAI Drama

https://x.com/_mohansolo/status/1930034960385356174

OpenAI Movie IS Happening 

https://www.hollywoodreporter.com/movies/movie-news/luca-guadagnino-to-direct-openai-movie-1236236357/

Palmer & Mark Make Good, Team Up On AI + Defense Work (Eagle Eye)

https://x.com/PalmerLuckey/status/1928127369841193438

 

https://www.wsj.com/tech/meta-army-vr-headsets-anduril-palmer-luckey-142ab72a

Core Memory Episode with Palmer Luckey

https://youtu.be/gVXPERyRjOA?si=QAQJOESFVCBQoMQE

Major Labels In Talks With Suno & Udio

https://www.bloomberg.com/news/articles/2025-06-01/record-labels-in-talks-to-license-music-to-ai-firms-udio-suno

Tim Sweeney Says 10 People Could Make Breath of the Wild with AI

https://www.ign.com/articles/ai-prompts-will-soon-let-a-10-person-team-build-a-game-like-breath-of-the-wild-where-the-ai-is-doing-all-the-dialogue-and-you-just-write-character-synopsis-tim-sweeney-predicts

Flux Kontext 

https://bfl.ai/announcements/flux-1-kontext

Flux Kontext Photo Restoration

https://x.com/minchoi/status/1929194553384243458

Flux Kontext + Wan in ComfyUI

https://x.com/8bit_e/status/1929551748231757932

HeyGen Avatar 4

https://x.com/HeyGen_Official/status/1929930152659870083

Marc Andreessen Says Robotics Will Be The Biggest Industry In The History of the Planet

https://x.com/simonkalouche/status/1929748425224212737

Unitree Teasing Sub-$1,000 Humanoid

https://x.com/UnitreeRobotics/status/1928406440152269266

ETH Zurich Badminton Robot

https://x.com/adcock_brett/status/1929207262976848328

Is This The Greatest VEO 3 Video Ever? Cat Swimming News Story (also posted in the Bard subreddit!)

https://www.reddit.com/r/Bard/comments/1kxpb1h/i_signed_up_for_gemini_ultraheres_what_i_made/

Netflix Clone with Sora & VEO 3

https://www.reddit.com/r/SoraAi/comments/1l36fqo/made_a_netflix_clone_using_sora_and_veo/

Jack Dorsey Vibecoding

https://x.com/jack/status/1930261602202251446

 

1
00:00:00,150 --> 00:00:03,420
OpenAI and Sam Altman are
gearing up for, wait for it...

2
00:00:03,660 --> 00:00:05,370
Hot AI Summer.

3
00:00:05,700 --> 00:00:06,600
LFG.

4
00:00:06,600 --> 00:00:07,140
Apparently

5
00:00:07,200 --> 00:00:07,680
that's right.

6
00:00:07,680 --> 00:00:09,540
Baby, Hot AI Summer coming in.

7
00:00:09,540 --> 00:00:10,200
Compute go.

8
00:00:11,040 --> 00:00:12,630
GPT-5 coming up hard.

9
00:00:12,630 --> 00:00:12,690
Yeah.

10
00:00:12,690 --> 00:00:13,200
Gavin, buddy.

11
00:00:13,200 --> 00:00:14,070
Hey, what did we say?

12
00:00:14,580 --> 00:00:15,210
What did we say?

13
00:00:15,660 --> 00:00:17,430
Don't go beast mode on intros.

14
00:00:17,460 --> 00:00:18,150
That's right.

15
00:00:18,150 --> 00:00:19,740
No obese mode in intros.

16
00:00:19,740 --> 00:00:20,250
That's right.

17
00:00:20,250 --> 00:00:20,610
We're both

18
00:00:20,610 --> 00:00:21,630
over 40 now.

19
00:00:21,630 --> 00:00:23,520
What did you say about GPT-5?

20
00:00:23,520 --> 00:00:27,840
OpenAI is preparing to release GPT-5 and
they've just released a new note-taking

21
00:00:27,840 --> 00:00:30,300
app and Codex can access the internet.

22
00:00:30,480 --> 00:00:33,510
Kevin, is this gonna be how
the OpenAI movie will end?

23
00:00:33,515 --> 00:00:33,810
I, I, I,

24
00:00:33,810 --> 00:00:34,920
I, Kevin.

25
00:00:35,405 --> 00:00:35,825
Are we going?

26
00:00:35,825 --> 00:00:36,335
Beast mode?

27
00:00:36,395 --> 00:00:39,665
No, Kevin, the OpenAI
movie is actually for real.

28
00:00:39,755 --> 00:00:42,965
You know the movie I wanna see
is the decades-long beef between

29
00:00:42,965 --> 00:00:44,555
Mark Zuckerberg and Palmer Luckey.

30
00:00:44,555 --> 00:00:47,225
They recently squashed
that beef because they're teaming

31
00:00:47,225 --> 00:00:49,055
up to make super soldiers.

32
00:00:49,115 --> 00:00:49,625
That's right.

33
00:00:49,625 --> 00:00:50,135
And

34
00:00:50,135 --> 00:00:55,415
Epic CEO Tim Sweeney says that a team of
10 could make Breath of the Wild with AI.

35
00:00:55,715 --> 00:00:57,275
Oh, I'm sure that upset nobody.

36
00:00:58,775 --> 00:01:02,705
Also, some massive AI video and
image tools being released from Flux,

37
00:01:02,985 --> 00:01:03,794
Luma Labs

38
00:01:03,824 --> 00:01:05,235
and HeyGen.

39
00:01:05,474 --> 00:01:06,495
Hey, Jen.

40
00:01:06,675 --> 00:01:10,005
No, that's what I'm shouting at my bestie
across the bar and it's martini o'clock.

41
00:01:10,005 --> 00:01:13,725
Hey, Jen, put your Birkin on the
hook and let's down these mimosas.

42
00:01:13,725 --> 00:01:17,925
Hey, Jen, also, Gavin, you're
never gonna believe this one.

43
00:01:17,925 --> 00:01:18,225
Oh, what?

44
00:01:18,225 --> 00:01:21,615
Record labels are signing
deals with AI music companies

45
00:01:21,615 --> 00:01:24,134
like Suno and Udio. Who, what?

46
00:01:24,225 --> 00:01:24,854
Who would've guessed?

47
00:01:24,854 --> 00:01:25,755
But more importantly,

48
00:01:25,755 --> 00:01:28,995
Kevin, Veo 3 has brought
us something incredible.

49
00:01:29,345 --> 00:01:31,535
It is teaching cats how to swim.

50
00:01:31,535 --> 00:01:33,365
The training regimen is rigorous.

51
00:01:33,675 --> 00:01:37,065
Six hours daily, no breaks, no excuses.

52
00:01:37,545 --> 00:01:38,385
Uh, okay.

53
00:01:38,475 --> 00:01:39,615
On God, dead ass.

54
00:01:39,705 --> 00:01:42,615
Uh, cat swimming is like
peak GPT-5 core energy.

55
00:01:42,615 --> 00:01:43,305
It gives.

56
00:01:43,310 --> 00:01:43,550
Yeah.

57
00:01:43,634 --> 00:01:43,815
Yeah.

58
00:01:43,815 --> 00:01:44,805
This is AI for humans.

59
00:01:44,805 --> 00:01:45,914
We're bussin' for real.

60
00:01:46,009 --> 00:01:47,384
No, no, no.

61
00:01:47,384 --> 00:01:48,435
That was a test.

62
00:01:48,525 --> 00:01:49,755
No beast mode in intros.

63
00:01:49,755 --> 00:01:50,175
Fine.

64
00:01:50,205 --> 00:01:50,475
Fine.

65
00:01:50,475 --> 00:01:51,765
This is AI for humans, everybody.

66
00:01:52,485 --> 00:01:55,245
What happened in the canon of
this show that I'm the principal?

67
00:01:55,605 --> 00:01:56,655
When did that happen?

68
00:01:56,685 --> 00:01:56,865
Yeah.

69
00:01:56,865 --> 00:01:57,105
I don't

70
00:01:57,105 --> 00:01:57,285
know

71
00:01:57,285 --> 00:01:57,465
man.

72
00:01:57,465 --> 00:02:00,705
You're just the cool kid who wants dank
memes and I'm the one who wet blankets

73
00:02:00,705 --> 00:02:01,150
you, all right?

74
00:02:01,180 --> 00:02:01,750
Start the show.

75
00:02:07,000 --> 00:02:08,980
Welcome to AI for Humans, everybody.

76
00:02:08,980 --> 00:02:09,520
And Kevin.

77
00:02:09,520 --> 00:02:13,985
It is Hot AI Summer LFG,
according to Sam. LFG,

78
00:02:14,135 --> 00:02:14,425
Baby.

79
00:02:15,150 --> 00:02:16,560
Let the farmers grow.

80
00:02:16,560 --> 00:02:17,730
That's what that acronym means, right?

81
00:02:17,730 --> 00:02:18,210
Oh, is that right?

82
00:02:18,210 --> 00:02:18,960
I didn't realize that.

83
00:02:18,960 --> 00:02:19,680
That's interesting.

84
00:02:19,710 --> 00:02:24,299
Let's frigging go, baby. Sammy Altman
trying to just chum the waters on

85
00:02:24,299 --> 00:02:28,019
the heels of Google I/O, stealing
maybe a little bit of thunder in

86
00:02:28,019 --> 00:02:29,579
their pretty little Gemini jar.

87
00:02:29,760 --> 00:02:32,040
What does hot AI summer mean?

88
00:02:32,040 --> 00:02:32,220
Gavin?

89
00:02:32,220 --> 00:02:32,400
Okay,

90
00:02:32,400 --> 00:02:34,859
so there's a couple big
things to get into here.

91
00:02:34,859 --> 00:02:38,370
The biggest thing so far I think that
we have to think about is GPT-5,

92
00:02:38,370 --> 00:02:43,470
which is their huge next new model that
is planning on coming out, as rumored,

93
00:02:43,805 --> 00:02:44,495
in July.

94
00:02:44,525 --> 00:02:47,255
Now these are rumors nobody
really knows for sure, but like

95
00:02:47,255 --> 00:02:48,425
that is a big thing coming up.

96
00:02:48,605 --> 00:02:52,865
Sam has also, you know, taken, uh, to the
world of interviews and has been doing

97
00:02:52,865 --> 00:02:58,205
some interviews, in part, I bet, you're
right, to kind of like re-trigger the, uh,

98
00:02:58,205 --> 00:03:00,755
attention back to the OpenAI mothership.

99
00:03:00,785 --> 00:03:04,325
It's the most exciting thing that's
probably in the pipeline right now.

100
00:03:04,325 --> 00:03:07,175
And I think they thought maybe
the Sears portrait studio photo

101
00:03:07,175 --> 00:03:09,215
shoot between Sam and Jony Ive.

102
00:03:09,635 --> 00:03:13,295
Uh, would be something would, would
capture some more headline news cycles,

103
00:03:13,295 --> 00:03:16,835
but, uh, well, let's, let's see what,
what Sam is, is talking about here.

104
00:03:16,925 --> 00:03:20,495
I, the, the models over the
next year or two years are, are

105
00:03:20,495 --> 00:03:21,995
gonna be quite breathtaking.

106
00:03:22,295 --> 00:03:23,555
Um, really

107
00:03:25,625 --> 00:03:28,475
there's a lot of progress ahead of
us, a lot of improvement to come.

108
00:03:29,144 --> 00:03:33,465
And like we have seen in the previous
big jumps, you know, from GPT-3 to

109
00:03:33,465 --> 00:03:38,174
GPT-4, businesses can just do things
that totally were impossible with the

110
00:03:38,174 --> 00:03:44,325
previous generation of models and, and
so what an enterprise will be able to do,

111
00:03:44,325 --> 00:03:47,415
we talked about this a little bit, but
just like give it your hardest problem.

112
00:03:47,790 --> 00:03:51,060
If you're a chip design company,
say, go design me a better chip than

113
00:03:51,120 --> 00:03:52,920
I could have possibly had before.

114
00:03:53,340 --> 00:03:54,060
Oh, what's it gonna do?

115
00:03:54,060 --> 00:03:55,860
Put Taki powder on it.

116
00:03:56,010 --> 00:03:56,970
Ooh, that sounds interesting.

117
00:03:56,970 --> 00:03:57,570
How much better?

118
00:03:57,690 --> 00:03:59,010
How much better can chips get?

119
00:03:59,280 --> 00:04:01,950
We got salt and vinegar, right?

120
00:04:01,950 --> 00:04:02,790
We got flaming hot.

121
00:04:02,790 --> 00:04:03,150
Not the kind

122
00:04:03,150 --> 00:04:03,510
of chips.

123
00:04:03,510 --> 00:04:04,740
Kevin, that's not the kind of chips.

124
00:04:04,740 --> 00:04:05,550
What else do we need?

125
00:04:05,550 --> 00:04:07,065
You need to Pringles.

126
00:04:07,070 --> 00:04:07,920
Pringles are still a chip.

127
00:04:07,920 --> 00:04:09,330
Just 'cause they have a funky can.

128
00:04:09,630 --> 00:04:10,530
It's still a chip.

129
00:04:10,950 --> 00:04:15,660
Um, if you're a biotech company trying to
cure some disease, say, just go work on

130
00:04:15,660 --> 00:04:18,660
this for me, like that's not so far away.

131
00:04:19,140 --> 00:04:25,410
Uh, and these models' ability to understand
all the context you wanna possibly give

132
00:04:25,410 --> 00:04:29,970
them, connect to every tool, every system,
whatever, and then go think really hard,

133
00:04:30,750 --> 00:04:35,130
like really brilliant reasoning, and come
back with an answer and, and have enough

134
00:04:35,460 --> 00:04:39,000
robustness that you can trust them to go
off and do some work autonomously, like.

135
00:04:40,065 --> 00:04:43,635
That that, I don't know if
I thought that would feel so

136
00:04:43,635 --> 00:04:46,245
close, but it feels really close,

137
00:04:46,455 --> 00:04:46,784
Kevin.

138
00:04:46,784 --> 00:04:48,645
It feels so close to me right now.

139
00:04:48,645 --> 00:04:52,664
As, uh, Calvin Harris said once in a
very famous song, Sam is out there,

140
00:04:52,664 --> 00:04:53,835
you know, pitching his wares again.

141
00:04:54,465 --> 00:04:58,065
I think just to, to extract a little
bit about that, what he's getting at

142
00:04:58,065 --> 00:05:02,055
is this idea that you're gonna send
off your AIs to go think for you.

143
00:05:02,055 --> 00:05:02,355
Right?

144
00:05:02,355 --> 00:05:05,145
And a lot of the stuff that
we've seen coming out of these

145
00:05:05,145 --> 00:05:06,974
thinking models is, is a big deal.

146
00:05:07,034 --> 00:05:08,985
First of all, the leap from GPT-3 to

147
00:05:08,985 --> 00:05:10,065
GPT-4, yeah.

148
00:05:10,395 --> 00:05:12,765
That really paints a picture.

149
00:05:13,065 --> 00:05:16,455
Uh, if you're new to this podcast
and potentially newer to AI.

150
00:05:16,670 --> 00:05:20,930
You don't know the world before,
like, ChatGPT. The difference

151
00:05:20,930 --> 00:05:23,000
from three to four was immense.

152
00:05:23,000 --> 00:05:23,535
Yes, yes.

153
00:05:23,540 --> 00:05:25,850
You know, and we we're now getting
used to like, every few months

154
00:05:25,850 --> 00:05:28,850
there's a new model or a new update
or a new trick, a new something.

155
00:05:29,060 --> 00:05:32,090
But like it was, it's a
massive change in capabilities.

156
00:05:32,360 --> 00:05:37,640
Uh, unlocking a ton of new use cases, uh,
spawning thousands of businesses, crushing

157
00:05:37,640 --> 00:05:39,530
a few hundred in the process as well.

158
00:05:39,530 --> 00:05:39,590
Yeah.

159
00:05:39,770 --> 00:05:41,180
But like, that was massive.

160
00:05:41,180 --> 00:05:41,570
So that.

161
00:05:42,015 --> 00:05:46,905
On the one hand, it gets me very excited
and he's been saying that for a while,

162
00:05:46,905 --> 00:05:51,585
Gavin like, so that is the generational
leap in capabilities that I am expecting.

163
00:05:51,614 --> 00:05:51,885
Yes.

164
00:05:51,885 --> 00:05:54,375
This is not a slightly new graphics card.

165
00:05:54,375 --> 00:05:55,515
This is a new Xbox.

166
00:05:55,515 --> 00:05:56,684
This is a whole new thing.

167
00:05:56,864 --> 00:05:58,844
It's the next generation.

168
00:05:58,844 --> 00:06:02,595
However, on the other side, he mentions
stuff that we are already starting to

169
00:06:02,595 --> 00:06:04,455
see, which is these agentic behaviors.

170
00:06:04,665 --> 00:06:05,265
Send it off.

171
00:06:05,265 --> 00:06:08,715
It has access to tools,
MCP servers, whatever else.

172
00:06:08,835 --> 00:06:12,675
It has tons of context while we're
seeing million-token context windows.

173
00:06:12,675 --> 00:06:16,785
Yeah, so on the one hand it's
supposed to be a complete sea change.

174
00:06:17,025 --> 00:06:18,525
Uh, on the other hand.

175
00:06:18,885 --> 00:06:23,205
It sounds like, uh, it's another iteration
of stuff that we're already seeing.

176
00:06:23,205 --> 00:06:27,195
So are you expecting a three to
four leap in this four to five, or

177
00:06:27,195 --> 00:06:30,405
are you, uh, expecting that it'll
just be, you know, more of the same?

178
00:06:30,405 --> 00:06:31,575
Will you be disappointed if it's not?

179
00:06:31,575 --> 00:06:31,604
I

180
00:06:31,604 --> 00:06:33,585
have two kind of theories on this.

181
00:06:33,585 --> 00:06:37,305
I can't, so three to four is, as
Kevin mentioned, is something that

182
00:06:37,305 --> 00:06:40,185
a lot of people have always said was
like one of the biggest leaps, right?

183
00:06:40,185 --> 00:06:42,810
And we haven't really seen a leap
like that, although I would argue.

184
00:06:43,349 --> 00:06:47,310
If you take the reasoning models and
compare them to GPT-4 that we have

185
00:06:47,310 --> 00:06:48,870
right now, that is a very big leap.

186
00:06:48,870 --> 00:06:50,969
And maybe if you went
right from that to the next

187
00:06:50,969 --> 00:06:52,020
thing, it would feel that way.

188
00:06:52,800 --> 00:06:55,830
There is a conspiracy theory that is
bubbling in my brain, and I don't know

189
00:06:55,830 --> 00:07:01,830
if this is real or not in any sort of
way: that they put out GPT-4.5

190
00:07:01,830 --> 00:07:05,910
to kind of lower expectations so that
they could then surpass them again.

191
00:07:06,240 --> 00:07:08,010
Oh, you think they released a model to

192
00:07:08,010 --> 00:07:09,271
kind of sandbag a little bit.

193
00:07:09,276 --> 00:07:10,995
They wanted to get, get to the top
of a chart, but only because they

194
00:07:10,995 --> 00:07:11,790
didn't make a big enough deal.

195
00:07:11,790 --> 00:07:12,870
They didn't make a huge deal of it.

196
00:07:12,870 --> 00:07:15,630
And, and one of the interesting things,
Andrej Karpathy, who we've talked about on

197
00:07:15,630 --> 00:07:19,740
the show a ton, former Tesla and
OpenAI engineer, very good follow on X and

198
00:07:19,740 --> 00:07:21,450
does a lot of like educational stuff.

199
00:07:21,750 --> 00:07:26,130
Talked about the idea of 4.5, which is
this massive model that works very slowly.

200
00:07:26,160 --> 00:07:29,880
Many of you who might have used it know
that it's a very creative model, but

201
00:07:29,880 --> 00:07:34,050
if that is the training data now for
the next reasoning model, he believes

202
00:07:34,050 --> 00:07:38,070
the leap could be very significant
because it is a much bigger model.

203
00:07:38,250 --> 00:07:43,230
So what's gonna be interesting to
see is, is GPT-5 specifically this

204
00:07:43,230 --> 00:07:46,530
kind of unified model that will be
the next version of the training.

205
00:07:46,530 --> 00:07:48,870
I think they're gonna try to
clean up all their naming stuff.

206
00:07:49,365 --> 00:07:52,784
I'm really curious to see if it's much
better at the reasoning side of it,

207
00:07:52,784 --> 00:07:54,615
because it's based on this larger model.

208
00:07:55,275 --> 00:07:58,724
I do think, Kevin, it's important to also
shout out this other thing that Sam said.

209
00:07:58,724 --> 00:08:01,455
'cause in that clip he seems
very optimistic about how

210
00:08:01,455 --> 00:08:02,534
great this is gonna be.

211
00:08:02,565 --> 00:08:05,625
He also then went into another
interview where he dropped this.

212
00:08:05,625 --> 00:08:07,515
We'll just play this little
snippet of this interview because

213
00:08:07,515 --> 00:08:08,685
it's a very different tone.

214
00:08:08,805 --> 00:08:09,795
I, I think.

215
00:08:10,425 --> 00:08:13,935
There are gonna be scary times ahead and
there's going to be a lot of transition.

216
00:08:14,235 --> 00:08:16,275
I'm very confident this can be managed.

217
00:08:16,275 --> 00:08:19,849
I think the more we can all do it
together, the more likely we'll

218
00:08:19,849 --> 00:08:20,851
get somewhere that works for us. Managed.

219
00:08:20,856 --> 00:08:21,495
Who's managing it?

220
00:08:21,495 --> 00:08:22,005
Kevin, are you

221
00:08:22,005 --> 00:08:23,445
managing the scary times ahead?

222
00:08:23,530 --> 00:08:28,545
I, I, I guess I'm gonna ask the agentic
AI to manage it and I'll go away for a

223
00:08:28,545 --> 00:08:30,315
few weeks and when I come back I hope it's

224
00:08:30,315 --> 00:08:30,675
managed.

225
00:08:30,675 --> 00:08:33,885
Yeah, it, so anyway, this is like kind
of speaking out of two sides of the mouth again.

226
00:08:33,885 --> 00:08:36,075
I think the biggest thing for
everybody at home to know is like.

227
00:08:36,645 --> 00:08:40,065
We won't know until this comes
out, what sort of leap it is.

228
00:08:40,065 --> 00:08:43,785
I do believe if it is a significant
leap, if it is close to three to four,

229
00:08:43,785 --> 00:08:49,995
that Sam said, every conversation you
will have will conceivably be about AI.

230
00:08:50,055 --> 00:08:50,145
Yes.

231
00:08:50,145 --> 00:08:51,075
In the next year, right?

232
00:08:51,075 --> 00:08:52,215
Because it's that big of a deal.

233
00:08:53,040 --> 00:08:55,650
I'm sure my friends love
that every conversation they

234
00:08:55,650 --> 00:08:57,270
have with me is about AI.

235
00:08:57,270 --> 00:08:57,330
Yeah.

236
00:08:57,330 --> 00:08:59,520
It doesn't matter if we're
on the pickleball court or

237
00:08:59,520 --> 00:09:00,630
we're trying to watch a movie.

238
00:09:00,660 --> 00:09:00,780
Yeah.

239
00:09:01,020 --> 00:09:02,130
It's gonna go back to AI.

240
00:09:02,130 --> 00:09:05,130
But I think you're absolutely right
because you know, like the release of

241
00:09:05,130 --> 00:09:07,980
ChatGPT was a major cultural uptick.

242
00:09:07,980 --> 00:09:08,040
Yeah.

243
00:09:08,070 --> 00:09:10,020
In people discovering,
whoa, what is this thing?

244
00:09:10,080 --> 00:09:13,170
And that seems like nothing
in comparison to what these

245
00:09:13,170 --> 00:09:14,370
models and these tools do now.

246
00:09:14,370 --> 00:09:17,010
So if we get what feels like
a generational leap, yeah.

247
00:09:17,470 --> 00:09:19,690
Come July when this is
rumored to be released.

248
00:09:20,170 --> 00:09:20,620
Uh, yeah.

249
00:09:20,620 --> 00:09:21,400
All bets are off.

250
00:09:21,400 --> 00:09:24,010
Everybody is going to be
screaming from the mountain

251
00:09:24,100 --> 00:09:24,220
tops.

252
00:09:24,220 --> 00:09:26,290
Yeah, and you know, there's
already a couple small things

253
00:09:26,290 --> 00:09:27,280
that are dropping soon.

254
00:09:27,280 --> 00:09:30,910
Supposedly OpenAI o3 Pro,
according to Sam Altman, is coming soon.

255
00:09:30,910 --> 00:09:31,930
This has been rumored for a while.

256
00:09:31,930 --> 00:09:34,720
In fact, it's delayed versus where it
was supposed to come out right now.

257
00:09:34,990 --> 00:09:39,040
I'd love to know if, like, that comes out
prior to this. It, it might, but also.

258
00:09:39,325 --> 00:09:42,445
They've now unleashed Codex on the
internet, which is their, uh, coding

259
00:09:42,445 --> 00:09:46,525
platform, which means that your,
your AI coding tool can now directly

260
00:09:46,525 --> 00:09:49,165
access the internet, which, you
know, one of my favorite things about

261
00:09:49,165 --> 00:09:52,045
this tweet from Sam, 'cause Sam,
you know, took some time to tweet.

262
00:09:52,045 --> 00:09:54,145
He put, he took like 10 days
off from tweeting and then

263
00:09:54,145 --> 00:09:55,495
started, uh, tweeting up a storm.

264
00:09:55,645 --> 00:09:58,345
In this tweet, he says, codex
gets access to the internet today.

265
00:09:58,345 --> 00:10:01,045
It is off by default, and
there are complex trade-offs.

266
00:10:01,314 --> 00:10:05,215
People should read about the risks
carefully and use it when it makes sense.

267
00:10:05,425 --> 00:10:08,275
So one thing to know about Codex
getting access to the internet is

268
00:10:08,275 --> 00:10:11,095
like, this is always the question of
like, what does an AI look like when

269
00:10:11,095 --> 00:10:14,454
it has full steam ahead on being able
to read everything and do everything?

270
00:10:15,300 --> 00:10:19,530
Yeah, I mean that's true, but look
as excited as I am, it's finally now

271
00:10:19,530 --> 00:10:23,370
at parity with other models that can
code and had access to the web before.

272
00:10:23,370 --> 00:10:27,780
So it's like other companies are
facing that, you know, that head on.

273
00:10:27,780 --> 00:10:29,640
I don't know if Codex can
do things they can't yet.

274
00:10:29,640 --> 00:10:31,980
I haven't put it through its
paces, but one thing that did

275
00:10:31,980 --> 00:10:33,270
happen, Gavin, we're recording this.

276
00:10:33,510 --> 00:10:37,709
Uh, surprise, surprise, on a
Wednesday, and ChatGPT for business

277
00:10:37,709 --> 00:10:39,209
Yeah, just got a ton of updates.

278
00:10:39,420 --> 00:10:42,180
This might get in the weeds a little
bit for some if you're just like

279
00:10:42,180 --> 00:10:45,240
a personal user and, and this will
have benefits for you down the line.

280
00:10:45,240 --> 00:10:47,490
But for right now, for the
business users, they did a couple

281
00:10:47,490 --> 00:10:48,630
things, which are pretty massive.

282
00:10:48,630 --> 00:10:52,979
One, ChatGPT can now natively
connect to all of your databases.

283
00:10:52,979 --> 00:10:57,150
And don't get me wrong, this is very
quickly going to trickle down to the

284
00:10:57,150 --> 00:11:01,199
individual users, but we're talking,
you know, Google Drive, Dropbox.

285
00:11:01,345 --> 00:11:05,455
Uh, if anybody uses OneDrive, okay,
Kevin, I know what this cloud search

286
00:11:05,455 --> 00:11:05,725
is for.

287
00:11:05,725 --> 00:11:08,665
We already know you've got all these
databases you've been generating

288
00:11:08,665 --> 00:11:13,555
for 20 plus years of, uh, fictional
character feet that you are now very

289
00:11:13,555 --> 00:11:16,675
excited to be able to search through
and surface whenever you need them.

290
00:11:16,695 --> 00:11:16,755
It

291
00:11:17,535 --> 00:11:21,975
and I need it to, if I say I
want feet in jam or jelly, it

292
00:11:21,975 --> 00:11:22,875
needs to know the difference.

293
00:11:22,875 --> 00:11:23,445
Good point.

294
00:11:23,475 --> 00:11:24,285
Okay, good point.

295
00:11:24,285 --> 00:11:28,155
And if it's a, if it's a character
stomping on lasagna versus stomping on

296
00:11:28,155 --> 00:11:30,075
grapes, it should know the difference.

297
00:11:30,075 --> 00:11:30,345
Exactly.

298
00:11:30,345 --> 00:11:30,585
Okay.

299
00:11:30,585 --> 00:11:31,935
And it can do that. And trolls

300
00:11:32,425 --> 00:11:33,775
are different and it can do it, right?

301
00:11:33,775 --> 00:11:36,265
I don't have to label my directories,
you know, I'm meticulous.

302
00:11:36,265 --> 00:11:39,475
Point is now it can connect
directly to all of your files.

303
00:11:39,475 --> 00:11:40,735
Now why does this matter?

304
00:11:40,735 --> 00:11:43,795
Well, this gives, uh, for
individuals, it'll eventually give it

305
00:11:43,825 --> 00:11:48,535
way more context on your life, your
businesses, your creative pursuits,

306
00:11:48,535 --> 00:11:50,785
et cetera, on the enterprise level.

307
00:11:51,195 --> 00:11:54,315
I have been, and I know you've been
asked to consult on this too, Gavin,

308
00:11:54,585 --> 00:11:56,145
no shortage of companies screaming.

309
00:11:56,145 --> 00:11:56,235
Yes.

310
00:11:56,355 --> 00:12:01,965
How do we integrate all of our data
into a system that spins up, you know,

311
00:12:01,965 --> 00:12:04,095
strike teams that work within a division.

312
00:12:04,095 --> 00:12:04,155
Yeah.

313
00:12:04,155 --> 00:12:07,785
And they come up with tools and then you
start building workflows and pipelines

314
00:12:07,785 --> 00:12:11,685
and looking at 16 different software
tools that can bolt this onto that and.

315
00:12:12,100 --> 00:12:16,210
While OpenAI's demonstration today
seems pretty rudimentary, you connect

316
00:12:16,210 --> 00:12:18,070
it to a thing and you can query it.

317
00:12:18,490 --> 00:12:20,830
It does respect the permissions
of the users, which is

318
00:12:20,830 --> 00:12:22,090
massive in an organization.

319
00:12:22,090 --> 00:12:25,540
So an intern doesn't have access
to like high level engineering

320
00:12:25,540 --> 00:12:26,680
docs or something like that.

321
00:12:26,980 --> 00:12:31,840
Um, potentially within a household you can
share documents, but if you are keeping

322
00:12:31,840 --> 00:12:34,945
information a secret, like your feet pics
repository, sure, you wanna have separate,

323
00:12:34,950 --> 00:12:36,310
you don't have to worry about the missus

324
00:12:36,350 --> 00:12:38,660
then coming across it. Maybe you have a
separate account in that case, why not?

325
00:12:38,660 --> 00:12:39,230
Right.

326
00:12:39,380 --> 00:12:43,760
Look, maybe you do all of your work inside
of an aluminum-foil-lined Faraday cage.

327
00:12:43,820 --> 00:12:45,500
Maybe that's where you
store your thumb drives.

328
00:12:45,500 --> 00:12:45,651
Sure, of course.

329
00:12:45,656 --> 00:12:45,795
Okay.

330
00:12:45,795 --> 00:12:45,796
Course.

331
00:12:45,801 --> 00:12:47,780
And she doesn't need to
know what happens in there.

332
00:12:48,410 --> 00:12:52,490
But this is big that this is rolling
out along with MCP capabilities.

333
00:12:52,490 --> 00:12:52,550
Yeah.

334
00:12:52,550 --> 00:12:55,430
Which it, we've gotten into the
weeds about MCPs in the past, but

335
00:12:55,430 --> 00:12:57,290
just know it, it means that it.

336
00:12:57,599 --> 00:13:01,859
It allows OpenAI's AI
to talk to other AIs.

337
00:13:02,010 --> 00:13:02,099
Yes.

338
00:13:02,160 --> 00:13:03,359
Fairly effortlessly.

339
00:13:03,359 --> 00:13:03,420
Yeah.

340
00:13:03,420 --> 00:13:07,140
And so suddenly you can go from a
system that has a basic understanding

341
00:13:07,140 --> 00:13:11,250
of itself to a system that understands
AI systems everywhere, and you

342
00:13:11,250 --> 00:13:12,599
can get really powerful workflows.

343
00:13:13,079 --> 00:13:15,870
Sam has said in the past, and this is
where I'm hoping to connect the dots

344
00:13:15,870 --> 00:13:18,000
again, where I'm talking about grandiose
statements that don't come true.

345
00:13:18,060 --> 00:13:21,540
He has said the mistake that
many make is trying to

346
00:13:21,780 --> 00:13:25,500
Uh, over-engineer and build for the
current shortcomings of the models. Yes.

347
00:13:25,500 --> 00:13:26,339
As they exist.

348
00:13:26,400 --> 00:13:26,579
Yes.

349
00:13:26,610 --> 00:13:30,689
This is one of those things where so
many companies and individual creatives

350
00:13:30,689 --> 00:13:34,050
or contractors have gone out and been
like, I'm gonna engineer a thing that

351
00:13:34,050 --> 00:13:36,600
lets an AI talk to all of your documents.

352
00:13:36,750 --> 00:13:38,100
Well, guess what, friends?

353
00:13:38,145 --> 00:13:41,355
You know, Google has their
version, Amazon has their version.

354
00:13:41,385 --> 00:13:43,454
OpenAI just rolled theirs out now.

355
00:13:43,605 --> 00:13:46,875
And when it trickles down
to the individual user, that

356
00:13:46,875 --> 00:13:48,405
could be the Kleenex moment.

357
00:13:48,465 --> 00:13:48,585
Yeah.

358
00:13:48,855 --> 00:13:50,745
For AIs that, that talk
to all sorts of people.

359
00:13:50,745 --> 00:13:50,835
And

360
00:13:50,835 --> 00:13:53,745
maybe even more importantly
for these AI companies, the pay

361
00:13:53,745 --> 00:13:55,185
for the Kleenex moment, right?

362
00:13:55,185 --> 00:13:58,335
Like where the idea is that suddenly
instead of a 'cause there's a lot

363
00:13:58,335 --> 00:14:01,515
of studies about like how people
are using AI in business and like a

364
00:14:01,515 --> 00:14:04,845
lot of experimentation, but not like
a lot of money that's set aside.

365
00:14:05,155 --> 00:14:08,425
And we know that both Anthropic
and OpenAI's, you know, business

366
00:14:08,425 --> 00:14:11,035
model has gone up and they've
been making a lot more revenue.

367
00:14:11,365 --> 00:14:15,175
This could unlock even huge amounts
of revenue because if you can imagine

368
00:14:15,175 --> 00:14:18,535
all of the businesses in the world,
if you're suddenly locked into an

369
00:14:18,535 --> 00:14:22,075
open AI ecosystem, that's a big
deal when all your data's there.

370
00:14:22,075 --> 00:14:23,935
And I think that is gonna
be the fight going forward.

371
00:14:23,935 --> 00:14:26,635
In fact, there's a story that kind
of tangentially connects to this.

372
00:14:26,635 --> 00:14:29,395
Claude and Windsurf are kind
of fighting a little bit.

373
00:14:29,395 --> 00:14:32,725
This is a vibe coding platform
that we know that OpenAI is buying,

374
00:14:32,725 --> 00:14:34,435
that they're buying for $3 billion.

375
00:14:34,694 --> 00:14:38,745
They basically lost access to,
uh, their specific, uh, fast

376
00:14:38,745 --> 00:14:40,245
pipeline to Claude models.

377
00:14:40,574 --> 00:14:44,115
And the CEO came out and gave kind of
a long description of what this is.

378
00:14:44,115 --> 00:14:48,345
They don't really know why, but Kevin,
this definitely feels like some drama

379
00:14:48,345 --> 00:14:53,025
stuff that has happened around these two
companies because of like lock-in, right?

380
00:14:53,025 --> 00:14:56,655
We talked about with Google's like
very expensive, uh, AI Ultra plan.

381
00:14:56,765 --> 00:15:00,035
Part of the goal here is to try
to get companies to lock into

382
00:15:00,035 --> 00:15:02,375
a specific OpenAI ecosystem.

383
00:15:02,645 --> 00:15:06,095
And you know, if Windsurf has access
to all of Claude, it might mean that

384
00:15:06,095 --> 00:15:09,245
like by cutting Claude off, Anthropic is
seeing this and being like, we have to

385
00:15:09,245 --> 00:15:11,015
get locked in for our stuff as well.

386
00:15:12,035 --> 00:15:14,495
Well, I don't know if you saw the
other rumors that were swirling about

387
00:15:14,495 --> 00:15:17,975
that, uh, there might be an Anthropic
slash Apple collaboration going on.

388
00:15:17,975 --> 00:15:18,605
Yeah, I can see that.

389
00:15:18,795 --> 00:15:20,444
Which could have
something to do with that.

390
00:15:20,444 --> 00:15:23,895
They're, they're talking about Xcode,
which is, uh, Apple's coding platform.

391
00:15:23,895 --> 00:15:25,185
I use it all the time.

392
00:15:25,335 --> 00:15:28,245
I am astonished at the lack of AI in it.

393
00:15:28,245 --> 00:15:31,574
Yeah, it's, uh, even Android Studio, if
you're building Android apps or any,

394
00:15:31,574 --> 00:15:34,905
anything in that environment, at least
it has Gemini built in now and is getting

395
00:15:34,905 --> 00:15:37,215
better, so Apple needs their solution.

396
00:15:37,845 --> 00:15:40,185
Anthropic and Claude makes a lot of sense.

397
00:15:40,185 --> 00:15:40,454
Yes.

398
00:15:40,545 --> 00:15:42,015
And that might be some
of the reason there.

399
00:15:42,015 --> 00:15:43,995
It could be the OpenAI deal as well.

400
00:15:43,995 --> 00:15:47,324
So, uh, you know, it, tough to
say, but really interesting the

401
00:15:47,324 --> 00:15:48,435
way everybody's going for lock-in.

402
00:15:48,435 --> 00:15:53,175
And again, to the point of connecting to
all of your documents, that plus memory

403
00:15:53,444 --> 00:15:57,704
equals a much more difficult switch
for an individual or an enterprise.

404
00:15:57,735 --> 00:15:57,824
Yes.

405
00:15:57,860 --> 00:16:01,280
When they're six months down the road
and this AI knows everything about them.

406
00:16:01,310 --> 00:16:01,520
Yes.

407
00:16:01,550 --> 00:16:04,280
For them to leap to another
AI might be really tough.

408
00:16:04,280 --> 00:16:05,750
So that helps for the lock-in.

409
00:16:05,750 --> 00:16:08,150
And the only other quick thing that
came outta this business thing that

410
00:16:08,150 --> 00:16:11,600
is interesting, but maybe not fully
featured yet, is there's a record mode

411
00:16:11,600 --> 00:16:15,110
that they are introducing into the
enterprise teams and education system

412
00:16:15,110 --> 00:16:19,250
where there's gonna be a button on your
ChatGPT and you can record a meeting

413
00:16:19,250 --> 00:16:20,780
at least locally, it sounds like.

414
00:16:20,780 --> 00:16:22,760
And it's gonna record all
of your meeting notes.

415
00:16:22,760 --> 00:16:25,220
It's gonna be accessible as a
thing that you can get into.

416
00:16:25,574 --> 00:16:28,545
Kevin, it seems like to me this is
a little not fleshed out because the

417
00:16:28,545 --> 00:16:30,765
most useful version of this would
be something that could be a plugin

418
00:16:30,765 --> 00:16:34,005
into say, zoom or to Google Meet,
because how many meetings do you have

419
00:16:34,005 --> 00:16:35,385
where everybody's in the same room?

420
00:16:35,505 --> 00:16:36,525
Not that many anymore.

421
00:16:36,525 --> 00:16:36,614
Right?

422
00:16:36,614 --> 00:16:38,895
But this is a movement
towards the same thing.

423
00:16:38,895 --> 00:16:40,035
And again, it is lock-in.

424
00:16:40,035 --> 00:16:43,665
If you have data from your company
that is in one place, then it

425
00:16:43,665 --> 00:16:44,895
does become very searchable.

426
00:16:44,895 --> 00:16:49,604
On the other hand, it also kills
conceivably a number of AI startups

427
00:16:49,604 --> 00:16:52,395
like we have said again and again,
like there are AI startups out

428
00:16:52,395 --> 00:16:55,035
there that are trying to do this
across many different platforms.

429
00:16:55,380 --> 00:16:57,510
If each of these suddenly has
their own version of this.

430
00:16:58,005 --> 00:16:58,725
Sorry buddies.

431
00:16:58,725 --> 00:16:58,935
That

432
00:16:58,935 --> 00:16:59,835
might be the end of it.

433
00:17:00,315 --> 00:17:02,145
Well, listen, I'm a Nathan Fielder fan.

434
00:17:02,205 --> 00:17:03,345
I promise this will connect.

435
00:17:03,405 --> 00:17:08,474
You know, uh, taking notes on a meeting
is one thing, but rehearsing the meeting

436
00:17:08,655 --> 00:17:14,385
in advance with fake AI personas of
the people you're going to meet, Gavin.

437
00:17:14,385 --> 00:17:16,095
Well, that sounds a little.

438
00:17:16,335 --> 00:17:17,204
Different, right?

439
00:17:17,204 --> 00:17:18,165
Maybe Black Mirror-ish.

440
00:17:18,165 --> 00:17:22,785
Well, Moderna CHRO Tracey Franklin
has something to say along those lines.

441
00:17:22,785 --> 00:17:22,815
Ooh.

442
00:17:23,085 --> 00:17:28,545
I've created profiles in a GPT, um,
of our executive committee, and I have

443
00:17:28,545 --> 00:17:33,495
scenarios of when two people are, are
maybe at conflict or when I have to go in

444
00:17:33,495 --> 00:17:36,075
with an opinion, um, or a recommendation.

445
00:17:36,075 --> 00:17:39,885
And how might the, the group
react to my recommendation.

446
00:17:40,105 --> 00:17:43,014
Or if I'm having a really bad day
and I need to understand myself

447
00:17:43,014 --> 00:17:47,544
and why I am triggering, I actually
have a completely interactive coach,

448
00:17:47,935 --> 00:17:54,205
therapist, you know, and, and teammate,
um, for me that I use all the time.

449
00:17:54,205 --> 00:17:55,764
It's been like my favorite
thing and I've said.

450
00:17:56,340 --> 00:17:58,140
Gavin, circle of trust.

451
00:17:59,400 --> 00:18:00,330
Do you do this too?

452
00:18:00,360 --> 00:18:03,675
Do you, do you have a Kevin GPT
that, that you run the ideas by?

453
00:18:03,675 --> 00:18:05,040
Because you know I'm
gonna blow up, but now

454
00:18:05,040 --> 00:18:06,180
I really do want one.

455
00:18:06,180 --> 00:18:08,580
Now I'm gonna go out
and create a Kevin GPT.

456
00:18:08,670 --> 00:18:08,820
How are you

457
00:18:08,820 --> 00:18:09,450
gonna prompt that,

458
00:18:09,450 --> 00:18:09,870
Gavin?

459
00:18:09,870 --> 00:18:10,560
I'm just curious.

460
00:18:10,560 --> 00:18:13,170
What about don't sentence, don't
you don't worry about that.

461
00:18:13,170 --> 00:18:13,950
I'll keep that secret.

462
00:18:13,950 --> 00:18:15,030
I mean, this is an interesting thing.

463
00:18:15,030 --> 00:18:15,240
Again,

464
00:18:15,270 --> 00:18:18,630
you are a professional has-been,
now a C-tier platinum plus

465
00:18:18,630 --> 00:18:20,220
cable television whodunit?

466
00:18:20,225 --> 00:18:20,940
No, no, no.

467
00:18:20,970 --> 00:18:23,220
A footnote in the history of never.

468
00:18:23,730 --> 00:18:25,770
Hey, you have a Wikipedia
page still, Kevin.

469
00:18:25,770 --> 00:18:28,110
That's something I want you to make
sure you put that in your back pocket.

470
00:18:28,110 --> 00:18:28,260
Oh, nice.

471
00:18:28,290 --> 00:18:29,100
That's still something.

472
00:18:29,130 --> 00:18:30,840
Please don't update it anyone.

473
00:18:30,840 --> 00:18:34,140
It does not need to know about
my RV phase or my convertible

474
00:18:34,140 --> 00:18:35,970
hiking pants on television phase.

475
00:18:35,970 --> 00:18:36,030
So

476
00:18:36,180 --> 00:18:40,080
again, this is another thing that shows
you how this business side of AI is

477
00:18:40,080 --> 00:18:42,870
something that you can't discount,
because we're here talking about like

478
00:18:43,080 --> 00:18:44,070
what's the cutting edge look like?

479
00:18:44,070 --> 00:18:46,500
But the money will be made
on the business level.

480
00:18:46,889 --> 00:18:50,550
Kevin, the other thing that's important
to realize about all this stuff is that

481
00:18:50,550 --> 00:18:56,399
the dramas around this, there will also
be money made on that because there is

482
00:18:56,399 --> 00:19:00,389
going to be a movie made of the OpenAI

483
00:19:00,600 --> 00:19:02,220
Sam, is he or isn't he

484
00:19:02,220 --> 00:19:02,760
CEO?

485
00:19:03,035 --> 00:19:06,780
uh, uh, saga that we covered very
specifically on this show last year.

486
00:19:07,090 --> 00:19:10,060
Andrew Garfield has been
cast as, uh, Sam Altman.

487
00:19:10,120 --> 00:19:13,959
The other thing I really liked about
this is if you saw, um, Anora, the, uh,

488
00:19:13,990 --> 00:19:18,250
Best Picture winner, the, uh, Russian
bodyguard guy, he is gonna play Ilya.

489
00:19:18,250 --> 00:19:19,149
And that guy's amazing.

490
00:19:19,149 --> 00:19:20,260
I love that guy in that movie.

491
00:19:20,260 --> 00:19:24,460
So, if you know, Ilya Sutskever is the guy
who kind of helped lead this ouster

492
00:19:24,460 --> 00:19:28,660
moment and then got kind of pushed back
when, uh, Sam came back. Ilya then

493
00:19:28,660 --> 00:19:31,810
went on to start, uh, and is still in
the middle of a company called Safe

494
00:19:31,810 --> 00:19:35,260
Superintelligence, which we have not
heard a single thing about for a year.

495
00:19:35,320 --> 00:19:35,410
Right.

496
00:19:35,470 --> 00:19:38,710
But he and his co-founders and, uh,
uh, arguably a bunch more people

497
00:19:38,710 --> 00:19:41,140
are up in the mountains making
superintelligence right now.

498
00:19:41,610 --> 00:19:43,920
Kevin, it's just an interesting
thing to see how quickly

499
00:19:43,920 --> 00:19:44,820
these things are happening.

500
00:19:44,820 --> 00:19:48,840
Now, while all that drama was going
down, like that's what you and I were

501
00:19:48,840 --> 00:19:51,870
texting back and forth, and I think
everybody in the bubble as well,

502
00:19:51,870 --> 00:19:53,520
were like, oh, this is the third act.

503
00:19:53,550 --> 00:19:54,060
No, wait, no.

504
00:19:54,060 --> 00:19:55,140
Yeah, this is the third act.

505
00:19:55,140 --> 00:19:55,380
Yeah.

506
00:19:55,530 --> 00:19:57,210
Oh man, this is gonna play out so great.

507
00:19:57,210 --> 00:20:00,900
'cause it was pure drama unfolding
on Twitter and behind closed

508
00:20:00,900 --> 00:20:02,340
doors and in the rumor mills.

509
00:20:02,340 --> 00:20:03,180
So I, I'm psyched for it.

510
00:20:03,210 --> 00:20:04,680
Is there anybody else you'd like to see

511
00:20:04,680 --> 00:20:05,490
play Sam Altman?

512
00:20:06,205 --> 00:20:07,645
You know, it's a really
interesting question.

513
00:20:07,645 --> 00:20:08,935
I've been thinking about this.

514
00:20:08,935 --> 00:20:11,455
I mean, Andrew Garfield had a role
in The Social Network, so there's

515
00:20:11,455 --> 00:20:12,564
an interesting connective tissue.

516
00:20:12,564 --> 00:20:12,655
Yes.

517
00:20:12,655 --> 00:20:17,665
But of course, the person that most
clearly is, is, um, Jesse Eisenberg.

518
00:20:17,665 --> 00:20:18,054
Right?

519
00:20:18,085 --> 00:20:18,385
Sure.

520
00:20:18,385 --> 00:20:24,205
Jesse Eisenberg is so good at playing
nebbishy sort of characters, and did Mark

521
00:20:24,205 --> 00:20:26,514
Zuckerberg so well in The Social Network.

522
00:20:26,680 --> 00:20:27,880
That it's hard to see anyone else.

523
00:20:27,910 --> 00:20:29,140
Oh, you know, who might be really good.

524
00:20:29,140 --> 00:20:32,110
Speaking of Jesse Eisenberg, you know,
Kieran Culkin could be interesting

525
00:20:32,110 --> 00:20:33,700
in that role if he was able to do it.

526
00:20:33,850 --> 00:20:34,090
Oh.

527
00:20:34,090 --> 00:20:37,180
Because he's kind of small and kind of has
an energy, but, eh, maybe his energy is too much.

528
00:20:37,180 --> 00:20:40,240
Anyway, I, it'll be interesting to
see Andrew Garfield feels a little

529
00:20:40,240 --> 00:20:45,490
too suave to me to play Sam Altman,
but we will see, oh, Sam's gonna

530
00:20:45,490 --> 00:20:47,320
take, uh, take offense to that.

531
00:20:47,320 --> 00:20:49,030
I mean, this is not Gavin, this
is not about dissing on Sam.

532
00:20:49,030 --> 00:20:50,260
He is a charismatic, uh, young man.

533
00:20:50,260 --> 00:20:51,070
Sammy is a very.

534
00:20:51,230 --> 00:20:52,460
Suave gentleman.

535
00:20:52,460 --> 00:20:52,520
Yeah.

536
00:20:52,520 --> 00:20:55,640
If you wanna cast yourself in
something, you could cast yourself in

537
00:20:55,640 --> 00:20:58,160
the role of helper to AI for humans.

538
00:20:58,160 --> 00:20:58,520
That's right.

539
00:20:58,520 --> 00:21:00,230
We need you to subscribe.

540
00:21:00,230 --> 00:21:02,210
We need you to like this video.

541
00:21:02,210 --> 00:21:04,730
We need you to be aware that
you are part of our army.

542
00:21:04,730 --> 00:21:06,500
You are part of the people
that spread our message.

543
00:21:06,710 --> 00:21:08,630
Last week's YouTube video was, Kevin,

544
00:21:08,695 --> 00:21:12,985
by far our most-viewed YouTube
episode of all time, and that

545
00:21:12,985 --> 00:21:14,064
is because of people out there.

546
00:21:14,064 --> 00:21:14,845
Oh, I'm really sorry to hear that.

547
00:21:14,845 --> 00:21:15,024
It was

548
00:21:15,024 --> 00:21:16,044
an okay episode.

549
00:21:16,254 --> 00:21:16,825
It was all right.

550
00:21:16,825 --> 00:21:18,024
I promise it gets better.

551
00:21:18,024 --> 00:21:21,385
So if you're a returning champion
to AI for humans, thank you so much.

552
00:21:21,385 --> 00:21:25,075
Literally, you are the only way
Gavin and I are able to spread

553
00:21:25,075 --> 00:21:26,155
the message of this thing.

554
00:21:26,155 --> 00:21:28,435
So please share it with
your friends and family.

555
00:21:28,645 --> 00:21:30,325
Give the thumbs up, click the subscribe.

556
00:21:30,325 --> 00:21:31,014
It costs you nothing.

557
00:21:31,014 --> 00:21:34,615
If you listen to it on podcast, leave a
five star review and also huge thank you.

558
00:21:34,645 --> 00:21:35,155
There's this like.

559
00:21:35,300 --> 00:21:39,889
Contingent of very kind people who
give us $5 a month on Patreon and

560
00:21:39,889 --> 00:21:43,550
we use that to pay for editing and
for a bunch of AI tooling as well.

561
00:21:43,550 --> 00:21:45,050
So that helps immensely.

562
00:21:45,050 --> 00:21:48,440
Thank you all for helping out this,
uh, this little endeavor of ours.

563
00:21:48,440 --> 00:21:48,889
We appreciate

564
00:21:48,889 --> 00:21:48,950
it.

565
00:21:48,950 --> 00:21:49,430
Absolutely.

566
00:21:49,430 --> 00:21:51,860
And thankfully Kevin and I were
able to do this podcast 'cause we

567
00:21:51,860 --> 00:21:55,129
squashed our 10 year beef, just like
two other people in the industry.

568
00:21:55,165 --> 00:21:56,720
Palmer Luckey and Mark Zuckerberg.

569
00:21:56,750 --> 00:21:56,870
Mm-hmm.

570
00:21:57,705 --> 00:22:00,465
Yeah, when you back stabbed me
for that cute little late night

571
00:22:00,465 --> 00:22:04,275
thing and left the great gravy
train we were all chugging along on.

572
00:22:04,515 --> 00:22:05,265
Boy howdy.

573
00:22:05,265 --> 00:22:06,315
Did I write you off?

574
00:22:06,315 --> 00:22:09,585
But we're back together because,
you know, time heals all wounds.

575
00:22:09,915 --> 00:22:14,535
Palmer Luckey, founder of Oculus, uh,
you know, amazing, amazing engineer.

576
00:22:14,685 --> 00:22:17,775
Uh, created, really responsible
for VR as it is today.

577
00:22:17,805 --> 00:22:20,265
Yeah, didn't like create VR, but
really brought it back into the fold.

578
00:22:20,655 --> 00:22:23,085
Um, unceremoniously ousted.

579
00:22:23,150 --> 00:22:24,800
From at the time Facebook.

580
00:22:24,830 --> 00:22:24,980
Yes.

581
00:22:24,980 --> 00:22:25,550
Now Meta.

582
00:22:25,940 --> 00:22:29,660
Um, and was very vocal about it after a
short period of time and I, I actually

583
00:22:29,660 --> 00:22:31,340
appreciate how vocal he was about it.

584
00:22:31,340 --> 00:22:35,210
'cause he talked about how difficult
his, like it was for him emotionally.

585
00:22:35,210 --> 00:22:35,270
Yeah.

586
00:22:35,270 --> 00:22:39,620
And intellectually, um, how, uh,
Anduril, his new defense company, was

587
00:22:39,620 --> 00:22:42,020
the, uh, let's see if he's
got it in him twice kid.

588
00:22:42,020 --> 00:22:42,021
Yeah.

589
00:22:42,050 --> 00:22:42,320
Yeah.

590
00:22:42,350 --> 00:22:43,010
Sort of thing.

591
00:22:43,010 --> 00:22:45,260
And clearly he did have it in him twice.

592
00:22:45,260 --> 00:22:46,190
The company's doing amazing.

593
00:22:46,190 --> 00:22:47,060
Well, you know.

594
00:22:47,940 --> 00:22:52,590
A lot of documents have been leaked since,
uh, Palmer was ousted and he said it

595
00:22:52,590 --> 00:22:57,629
seemed like Mark Zuckerberg wasn't really
at the helm making the decisions about

596
00:22:57,629 --> 00:23:01,560
Palmer's ouster, that it might have been
others that are no longer at Meta anymore.

597
00:23:01,740 --> 00:23:05,370
And it's a very roundabout way to
them getting together, taking a photo

598
00:23:05,370 --> 00:23:08,970
and what could be, I don't know,
the lobby of a Rainforest Cafe or

599
00:23:08,970 --> 00:23:10,110
Meta. What if that's where they were?

600
00:23:10,290 --> 00:23:10,560
I would love

601
00:23:10,560 --> 00:23:14,070
to see the two of them in a Rainforest
Cafe, like underneath the frog looking

602
00:23:14,070 --> 00:23:15,750
up and the frogs just sitting there.

603
00:23:15,775 --> 00:23:19,945
Volcano, a hundred percent
happy to be a fly on that wall.

604
00:23:19,945 --> 00:23:20,005
Yeah.

605
00:23:20,005 --> 00:23:21,055
I would pay for the dinner.

606
00:23:21,265 --> 00:23:24,685
Point is they squashed the beef because
they wanna make super soldiers.

607
00:23:24,925 --> 00:23:27,595
Um, Palmer was touting a
new vision system. Yeah.

608
00:23:27,595 --> 00:23:28,375
Called Eagle Eye.

609
00:23:28,375 --> 00:23:28,435
Yeah.

610
00:23:28,825 --> 00:23:32,125
For soldiers for a long time that was
gonna be an augmented reality display

611
00:23:32,275 --> 00:23:36,505
that in his vision, every soldier
should have on their noggin when

612
00:23:36,505 --> 00:23:37,705
they're deployed in the battlefield.

613
00:23:37,705 --> 00:23:40,585
And the way this works is that
it creates like a mesh network of

614
00:23:40,585 --> 00:23:43,945
communication helmet to helmet, uh, where.

615
00:23:44,100 --> 00:23:48,180
If you, Gavin are on one point in
the battlefield and you have eyes

616
00:23:48,180 --> 00:23:51,660
on an enemy, some sort of vehicle,
an interesting point of reference,

617
00:23:51,810 --> 00:23:53,730
it can beam that visual data.

618
00:23:53,910 --> 00:23:53,970
Yeah.

619
00:23:54,030 --> 00:23:57,420
Those coordinates, an
outline of another soldier.

620
00:23:57,420 --> 00:23:57,900
Whatever.

621
00:23:57,900 --> 00:24:01,440
It can enhance my vision with the data
that you have gathered with the data

622
00:24:01,440 --> 00:24:05,250
that an autonomous drone flying about
has gathered and give us a far more

623
00:24:05,250 --> 00:24:09,450
detailed, uh, picture of what is going
on in the battlefield and, you know.

624
00:24:10,024 --> 00:24:10,715
Makes sense.

625
00:24:10,715 --> 00:24:14,195
I'm sure it will be great fuel for
the, uh, bipedal robots that'll

626
00:24:14,195 --> 00:24:16,685
eventually, exactly, be running
around the battlefield, but that's

627
00:24:16,685 --> 00:24:18,185
their vision for soldiers today.

628
00:24:18,215 --> 00:24:18,455
Yeah,

629
00:24:18,455 --> 00:24:19,985
I mean, there's two things
I wanna mention here.

630
00:24:19,985 --> 00:24:23,165
One, I I, I dunno if Kev, if you heard
this, but there was an incredible

631
00:24:23,165 --> 00:24:27,379
interview with Palmer on the Core
Memory podcast, from a guy named Ashlee

632
00:24:27,379 --> 00:24:30,395
Vance, who broke away from a major
tech publication to make his own stuff.

633
00:24:30,395 --> 00:24:32,314
You should definitely be
following core memory.

634
00:24:32,645 --> 00:24:34,835
I listened to the whole two
hours and I know you've been a

635
00:24:34,835 --> 00:24:36,125
Palmer fanboy for a long time.

636
00:24:36,125 --> 00:24:38,075
And I have to say what was really
interesting listening to this.

637
00:24:38,715 --> 00:24:41,865
I, you know, I always am like, you know,
I'm not super political, but like Palmer

638
00:24:41,865 --> 00:24:45,345
was very, very on the right and like I
was trying to kind of get a sense of like.

639
00:24:45,629 --> 00:24:48,840
Where he landed and what he was. This
interview with him, it feels very

640
00:24:48,840 --> 00:24:52,409
reasonable and I do not think he
is like an extremist in some form.

641
00:24:52,409 --> 00:24:53,159
That was just my take.

642
00:24:53,159 --> 00:24:55,379
I know some people might think
that, but I thought it was

643
00:24:55,379 --> 00:24:56,250
a very reasonable interview.

644
00:24:56,250 --> 00:24:56,639
I do.

645
00:24:56,669 --> 00:24:59,669
I do suggest you go listen to this,
like it's a really interesting thing.

646
00:24:59,969 --> 00:25:02,399
The other side of this, in this
interview, he talked a lot about

647
00:25:02,399 --> 00:25:05,010
the idea of how these VR helmets

648
00:25:05,265 --> 00:25:07,245
need to be different than consumer AI.

649
00:25:07,245 --> 00:25:10,245
And he was pointing out the differences
between what like Apple has to

650
00:25:10,245 --> 00:25:14,295
develop and deliver for consumers
to do like the Apple Vision Pro

651
00:25:14,505 --> 00:25:15,975
versus what these helmets need to be.

652
00:25:15,975 --> 00:25:16,215
Right?

653
00:25:16,215 --> 00:25:20,415
Because in the helmet it's really
about functionality for soldiers on the

654
00:25:20,415 --> 00:25:23,745
ground, and that is a big difference
than like, uh, you know, a nerdy guy

655
00:25:23,745 --> 00:25:26,505
in their couch trying to have the
ultimate entertainment experience.

656
00:25:26,780 --> 00:25:29,990
It's all these little things you don't
think about when you think about like,

657
00:25:30,169 --> 00:25:31,939
defense mechanisms and all that stuff.

658
00:25:31,939 --> 00:25:34,340
And, and you know, the thing that really,
this stood out to me, if you've been

659
00:25:34,340 --> 00:25:38,750
following the news, you clearly saw
the, the weird drone attack that, that

660
00:25:38,750 --> 00:25:43,280
Ukraine, uh, unleashed on Russia in
the last week, which is one of these

661
00:25:43,280 --> 00:25:47,659
really fascinating ways of looking at
how the future of warfare works now.

662
00:25:47,659 --> 00:25:47,750
Mm-hmm.

663
00:25:48,260 --> 00:25:50,780
So when you think about the
future of warfare and you

664
00:25:50,780 --> 00:25:52,070
look at what's going on now.

665
00:25:52,395 --> 00:25:55,635
It is exactly this kind of thing
that is going to lead to whatever

666
00:25:55,635 --> 00:25:56,625
the next stage of that is.

667
00:25:56,625 --> 00:25:59,565
Now, you may not love the fact that
like robots will be fighting for

668
00:25:59,565 --> 00:26:02,235
us, but I do believe — and Palmer
even says this in the interview —

669
00:26:02,565 --> 00:26:03,975
this is how we save lives, right?

670
00:26:03,975 --> 00:26:06,225
You save lives in training
by doing this, but also you

671
00:26:06,225 --> 00:26:07,635
save lives in the battlefield.

672
00:26:07,875 --> 00:26:10,635
And eventually, maybe it is
drones, fighting drones, and then

673
00:26:10,815 --> 00:26:11,925
who knows what that feels like.

674
00:26:11,925 --> 00:26:15,075
But it is a very different way of
looking at battlefields in general.

675
00:26:15,780 --> 00:26:15,989
Yeah.

676
00:26:15,989 --> 00:26:18,570
And again, don't take a second
to think about what happens when

677
00:26:18,570 --> 00:26:20,129
one side runs out of drones.

678
00:26:20,129 --> 00:26:20,219
Right.

679
00:26:20,489 --> 00:26:22,590
We don't, we don't think about that.

680
00:26:22,590 --> 00:26:23,250
That's not a thing.

681
00:26:23,250 --> 00:26:24,209
It's just metal on metal.

682
00:26:24,209 --> 00:26:26,010
It's basically a — what was it?

683
00:26:26,010 --> 00:26:26,610
A BattleBots.

684
00:26:26,760 --> 00:26:26,820
Yeah.

685
00:26:27,060 --> 00:26:30,000
And then everybody goes home happy, and
we all share our resources because we

686
00:26:30,000 --> 00:26:31,229
live in a world of abundance, right?

687
00:26:31,229 --> 00:26:31,800
Also, don't

688
00:26:31,800 --> 00:26:35,489
think about what happens when one
side's machines suddenly figure out how

689
00:26:35,489 --> 00:26:38,520
to control themselves versus
a human being controlling all of

690
00:26:38,520 --> 00:26:39,899
this. That is not worth thinking about.

691
00:26:39,899 --> 00:26:41,909
Now we're not gonna think
about what this means.

692
00:26:42,205 --> 00:26:44,545
We're just gonna talk
more about, uh, AI video.

693
00:26:44,905 --> 00:26:45,264
Kevin,

694
00:26:45,535 --> 00:26:49,135
I, I did you such a disservice as a friend
and co-host too, because like you were

695
00:26:49,135 --> 00:26:52,075
making like really solid points, but all
I could think about was a soldier on the

696
00:26:52,075 --> 00:26:56,485
battlefield trying to, like, pinch-zoom a
stock widget. Because Apple, yeah, sure.

697
00:26:56,485 --> 00:26:56,725
Exactly.

698
00:26:56,725 --> 00:26:59,185
They're saying it's like they're
under fire and it's like, no window.

699
00:26:59,185 --> 00:26:59,754
Get out of the way.

700
00:26:59,754 --> 00:27:01,345
Siri, close the Siri.

701
00:27:01,345 --> 00:27:01,825
Oh my god.

702
00:27:02,665 --> 00:27:02,905
The windows.

703
00:27:02,905 --> 00:27:03,655
Can you imagine having to
be in the battlefield and

704
00:27:03,655 --> 00:27:05,485
using Siri to control it and like.

705
00:27:05,969 --> 00:27:08,939
You set a timer for like yesterday
morning, and I can't turn it

706
00:27:08,939 --> 00:27:09,750
off, Siri.

707
00:27:10,800 --> 00:27:13,409
Here's what I found on the web
for how to stop the bleeding.

708
00:27:14,760 --> 00:27:14,820
No.

709
00:27:14,820 --> 00:27:16,110
And Alexa, it's like, here's

710
00:27:16,110 --> 00:27:18,929
a Blogspot from 2011 about dog fur.

711
00:27:19,649 --> 00:27:20,280
We should move on.

712
00:27:20,280 --> 00:27:24,120
There's a big story brewing in the
AI audio space, which I know you'll

713
00:27:24,120 --> 00:27:26,610
have a lot of thoughts on: Suno and Udio.

714
00:27:26,610 --> 00:27:29,580
The two big players in this space,
and I, I kind of think of Suno as the

715
00:27:29,580 --> 00:27:30,870
big player, but Udio is still up there.

716
00:27:30,870 --> 00:27:32,490
They're both really interesting AI models.

717
00:27:32,865 --> 00:27:38,085
They are in talks with major labels
to not only make a deal, but give them

718
00:27:38,085 --> 00:27:40,035
some percentage of their companies.

719
00:27:40,035 --> 00:27:42,500
So let us talk a little bit about
this and about the history of this.

720
00:27:42,500 --> 00:27:43,125
Oh, you don't say,

721
00:27:43,335 --> 00:27:45,375
you don't say Gavin.

722
00:27:45,525 --> 00:27:49,215
It was — I was patting myself
on the back as hard as I could.

723
00:27:49,575 --> 00:27:49,635
Yeah.

724
00:27:49,665 --> 00:27:53,445
Um, I, I think you shared the sentiment
with me, I don't know, over a year ago.

725
00:27:53,445 --> 00:27:53,505
Yeah.

726
00:27:53,505 --> 00:27:54,075
When it was like.

727
00:27:54,590 --> 00:27:56,930
The labels are gonna sue,
this is gonna be a big deal.

728
00:27:56,930 --> 00:28:00,379
And I said, that genie's outta the
bottle and there's too much money

729
00:28:00,379 --> 00:28:03,170
on the
table, too much money for it to be upset.

730
00:28:03,170 --> 00:28:05,480
And I was like, eventually
it's gonna be a negotiation.

731
00:28:05,480 --> 00:28:09,590
And oopsie, you know, suddenly
the artist rights thing is not gonna

732
00:28:09,590 --> 00:28:13,430
be a, they're gonna get some pennies,
maybe fractions of a fraction of a penny

733
00:28:13,430 --> 00:28:14,990
after the labels get what they get.

734
00:28:15,230 --> 00:28:17,120
But there was just too much at stake.

735
00:28:17,120 --> 00:28:20,960
So I'm not surprised to see that,
uh, deals are being made here.

736
00:28:20,960 --> 00:28:22,730
So, you know, something I
thought about with this.

737
00:28:22,935 --> 00:28:27,794
is that GPT-4o's image gen
moment, I think, can be seen by

738
00:28:27,794 --> 00:28:29,175
different people in different ways.

739
00:28:29,175 --> 00:28:34,155
Obviously a lot of the anti AI people see
it as like, oh, they ripped off these, you

740
00:28:34,155 --> 00:28:37,544
know, the Ghibli, uh, IP, and
they're making all this stuff with Ghibli.

741
00:28:37,784 --> 00:28:40,034
Well, the thing I've thought about
with AI music, and I'm sure you felt

742
00:28:40,034 --> 00:28:41,475
about this before too, is that like.

743
00:28:41,930 --> 00:28:44,840
I kind of would love to be able to
make, whether it's like a parody

744
00:28:44,840 --> 00:28:48,920
version of something or like use
another song as a jumping-off point

745
00:28:48,950 --> 00:28:53,000
and, and like use either that artist's
voice or that artist's melody for

746
00:28:53,000 --> 00:28:55,430
that thing and remix my own stuff.

747
00:28:55,430 --> 00:28:56,810
And it's really hard to do that.

748
00:28:56,810 --> 00:28:58,370
You can do that with open source.

749
00:28:58,370 --> 00:29:00,680
There are ways, but you really
have to kind of go deep on it.

750
00:29:01,040 --> 00:29:04,280
So I'm curious to know, like in
the future of this world where Udio

751
00:29:04,280 --> 00:29:05,960
and Suno have made these deals.

752
00:29:06,330 --> 00:29:09,870
It will be really interesting to
see, like, can I take a song

753
00:29:09,870 --> 00:29:13,260
that's like, again, we mentioned,
uh, Calvin Harris' "Feel So Close"

754
00:29:13,260 --> 00:29:14,160
at the top of the show, right?

755
00:29:14,160 --> 00:29:16,170
Which is a very classic, like, dance track.

756
00:29:16,170 --> 00:29:17,430
I'm a, I'm a fan of that song.

757
00:29:17,610 --> 00:29:19,110
It's super fun to listen to.

758
00:29:19,350 --> 00:29:22,560
What if you could take that song and
like insert your own lyrics and like make

759
00:29:22,560 --> 00:29:25,440
your own version of that, you know, not
to make money off of, but you would be

760
00:29:25,440 --> 00:29:29,700
able to remix it using Calvin's voice
track and maybe the actual sound and the

761
00:29:29,700 --> 00:29:31,140
beats and the way the thing rises up.

762
00:29:31,465 --> 00:29:35,425
That all feels like super compelling to
the mainstream in a way that's different

763
00:29:35,425 --> 00:29:36,955
than just making music from scratch.

764
00:29:36,955 --> 00:29:37,795
Does that make sense to you?

765
00:29:38,095 --> 00:29:39,025
I think it does, yeah.

766
00:29:39,025 --> 00:29:43,585
And, and Mitch Glazier, I wanna say chief
executive officer of the RIAA, said, quote,

767
00:29:43,765 --> 00:29:46,525
the music community has embraced AI.

768
00:29:47,004 --> 00:29:47,065
Yeah.

769
00:29:47,065 --> 00:29:48,355
I dunno if that's a hundred percent true.

770
00:29:48,355 --> 00:29:48,625
Something.

771
00:29:48,625 --> 00:29:48,895
Yeah.

772
00:29:48,925 --> 00:29:49,285
Okay.

773
00:29:49,285 --> 00:29:50,695
That's what he's saying it has.

774
00:29:51,720 --> 00:29:55,500
We are already partnering
and collaborating with responsible

775
00:29:55,500 --> 00:30:00,270
developers to build sustainable AI
tools centered on human creativity that

776
00:30:00,270 --> 00:30:02,700
put artists and songwriters in charge.

777
00:30:02,940 --> 00:30:05,940
So that is the head of the
Recording Industry Association of America.

778
00:30:06,060 --> 00:30:08,040
What do you think is the percentage
these companies had to give

779
00:30:08,040 --> 00:30:08,490
up, Kev?

780
00:30:08,580 --> 00:30:10,980
I mean, it's so, so hard
to say, because...

781
00:30:11,225 --> 00:30:14,945
You know, all of the training data goes
into the big pot and gets stirred about.

782
00:30:14,945 --> 00:30:15,004
Yeah.

783
00:30:15,004 --> 00:30:16,835
And then something comes
out on the other end.

784
00:30:16,985 --> 00:30:20,824
So will they say, Hey, based
off a percentage of a particular

785
00:30:20,824 --> 00:30:24,274
artist that went into your
data set, we're gonna do that?

786
00:30:24,274 --> 00:30:27,514
Or do they have to build in tools that
can reference — Interesting — what percentage

787
00:30:27,514 --> 00:30:29,284
of an artist was used in the output?

788
00:30:29,284 --> 00:30:31,415
Or maybe they'll just
blanket say, Hey, guess what?

789
00:30:31,865 --> 00:30:36,125
You're gonna get a Mr. Wonderful
royalty on every account that signs up.

790
00:30:36,125 --> 00:30:36,725
1%.

791
00:30:37,024 --> 00:30:38,495
1% of everything.

792
00:30:38,879 --> 00:30:42,510
That is fine actually,
as long as the artists in the end

793
00:30:42,510 --> 00:30:45,720
also get their piece because time
and time again, you know, that's the

794
00:30:45,720 --> 00:30:47,970
thing we saw with streaming and then
you see artists coming out saying,

795
00:30:47,970 --> 00:30:49,920
Hey, uh, you streamed 9 million.

796
00:30:50,370 --> 00:30:50,610
Yeah.

797
00:30:50,610 --> 00:30:53,400
You know, sessions of my
song and I got 50 bucks.

798
00:30:53,400 --> 00:30:54,030
Well, that doesn't

799
00:30:54,030 --> 00:30:54,820
feel quite right.

800
00:30:54,820 --> 00:30:55,420
Absolutely.

801
00:30:55,420 --> 00:30:56,980
I think that's an important
thing to think about.

802
00:30:56,980 --> 00:30:59,980
And also like, uh, maybe acquisition
targets for both Apple and

803
00:30:59,980 --> 00:31:01,330
Spotify, these two companies.

804
00:31:01,510 --> 00:31:01,750
Okay.

805
00:31:01,750 --> 00:31:05,560
Kevin, another big story, Tim Sweeney,
CEO of Epic Games is coming out and

806
00:31:05,560 --> 00:31:07,660
saying some positive things about AI.

807
00:31:08,080 --> 00:31:13,065
Well, they pressed the AI button and, um,
it might have been a bad idea, but like

808
00:31:13,065 --> 00:31:14,745
in truth, there's no un-pressing that button.

809
00:31:15,285 --> 00:31:18,525
I wanna read a little bit of his
quote here from an interview with IGN.

810
00:31:18,525 --> 00:31:22,065
He came out and said that AI
characters, giving you the possibility

811
00:31:22,065 --> 00:31:24,945
of infinite dialogue with a
really simple setup for creators,

812
00:31:24,945 --> 00:31:29,265
means small teams will be able to create
games with immense amounts of characters

813
00:31:29,265 --> 00:31:30,795
and immense, interactive worlds.

814
00:31:31,240 --> 00:31:35,080
What would it take for a 10 person team
to build a game like Zelda: Breath of

815
00:31:35,080 --> 00:31:38,800
the Wild, in which the AI
is doing all the dialogue and you're

816
00:31:38,800 --> 00:31:40,660
just writing some character synopsis.

817
00:31:40,660 --> 00:31:43,510
Kevin, what is your
immediate take on this idea?

819
00:31:44,610 --> 00:31:46,679
I am angry, but I don't know why.

820
00:31:46,679 --> 00:31:50,399
And I'm gonna throw out my caps lock and
just point me at a URL I'm ready to flame.

821
00:31:50,429 --> 00:31:51,479
You sound like most of

822
00:31:51,479 --> 00:31:53,639
the, uh, gamer commenters
out there in the world.

823
00:31:53,669 --> 00:31:56,250
I, by the way, so this is something we've
been talking about for a while, that this

824
00:31:56,250 --> 00:31:58,080
idea that gaming teams will get smaller.

825
00:31:58,080 --> 00:32:00,870
I know that a lot of very smart
people in the gaming world are also

826
00:32:00,870 --> 00:32:04,949
tracking this, and I want everybody
to know that, obviously coming

827
00:32:04,949 --> 00:32:06,780
from Tim Sweeney, this is a big thing.

828
00:32:06,780 --> 00:32:10,889
Like he is kind of like planting a flag
in the ground, as John Carmack

829
00:32:10,889 --> 00:32:12,899
has, as have other people in the AI space.

830
00:32:13,425 --> 00:32:17,265
Tim, just to be clear, is like the
leader of really one of the biggest

831
00:32:17,265 --> 00:32:19,035
game companies in the entire world.

832
00:32:19,065 --> 00:32:21,254
Fortnite, Unreal Engine, all this stuff.

833
00:32:21,285 --> 00:32:24,675
So they just had the Unreal Expo, which
is where they debuted a lot of awesome

834
00:32:24,675 --> 00:32:26,895
new things, including the Witcher 4 footage,
which I dunno if you saw, looked

835
00:32:26,895 --> 00:32:28,410
amazing. But this is the deal.

836
00:32:28,410 --> 00:32:28,890
But again.

837
00:32:29,410 --> 00:32:31,990
One of the biggest empowerers of creators.

838
00:32:31,990 --> 00:32:32,080
Yes.

839
00:32:32,080 --> 00:32:33,160
And creatives.

840
00:32:33,190 --> 00:32:33,280
Yes.

841
00:32:33,370 --> 00:32:34,660
That's, that's Epic.

842
00:32:34,870 --> 00:32:37,870
A lot of people use their tools, they
use their engines that are not just

843
00:32:37,870 --> 00:32:39,250
in gaming, but in motion pictures.

844
00:32:39,250 --> 00:32:39,310
Yeah.

845
00:32:39,315 --> 00:32:39,565
Yeah.

846
00:32:39,700 --> 00:32:42,610
So, uh, you look, I, I
fully subscribe to this.

847
00:32:42,610 --> 00:32:45,250
We've talked about the
barbell-ification of all things.

848
00:32:45,250 --> 00:32:45,490
Yes.

849
00:32:45,520 --> 00:32:48,460
Which is, the middle is going to be
eroded and on one side you're gonna

850
00:32:48,460 --> 00:32:51,970
have individual content creators
that are making feature films.

851
00:32:51,970 --> 00:32:52,030
Yeah.

852
00:32:52,270 --> 00:32:56,650
That might look a little AI at first,
but eventually will rival the Hollywood,

853
00:32:56,650 --> 00:32:58,000
AAA gaming,

854
00:32:58,000 --> 00:32:58,090
Yes.

855
00:32:58,210 --> 00:33:03,970
Massive multimillion dollar giant team,
GTA 12 stuff over on the other end.

856
00:33:04,000 --> 00:33:04,150
Yeah.

857
00:33:04,330 --> 00:33:08,710
And I, I know people fear the,
the interim and the job displacement.

858
00:33:08,710 --> 00:33:10,270
I completely understand that.

859
00:33:10,270 --> 00:33:13,690
And you know, we don't need to dive
into that here, but, but just for a

860
00:33:13,690 --> 00:33:15,850
second, compartmentalize that and go.

861
00:33:16,080 --> 00:33:16,590
Okay.

862
00:33:16,650 --> 00:33:20,400
If you were at, let's say, a company
that had a team of 30 and that is

863
00:33:20,400 --> 00:33:24,090
no longer feasible, now you're,
you're, you're wondering what to do.

864
00:33:24,150 --> 00:33:27,660
Maybe you team up with three or four
others and you build a massively

865
00:33:27,660 --> 00:33:32,520
multiplayer game leveraging AI because
you can generate that many art assets.

866
00:33:32,520 --> 00:33:36,120
Once you define the style, you
can voice that many characters.

867
00:33:36,240 --> 00:33:38,490
Once you decide what they should
sound like and, and what their

868
00:33:38,490 --> 00:33:40,440
personality traits could be, you can

869
00:33:40,560 --> 00:33:43,830
vibe-code server architecture
to just get you up and running.

870
00:33:43,830 --> 00:33:43,919
Yeah.

871
00:33:43,919 --> 00:33:46,200
on a playtest and maybe get
some fans and get funding.

872
00:33:47,100 --> 00:33:51,405
I I, I'm not saying you should for, uh,
completely forgive or forget everything

873
00:33:51,405 --> 00:33:54,765
else that's over there that could be bad
or potentially wrong, but open yourself

874
00:33:54,765 --> 00:33:59,385
up to what some of the most celebrated
game designers and creatives, uh, of all

875
00:33:59,385 --> 00:34:02,985
time are saying, which is this is gonna
be revolutionary and it's going to empower

876
00:34:02,985 --> 00:34:04,995
individuals to make amazing products

877
00:34:04,995 --> 00:34:06,195
Oh, 100%.

878
00:34:06,195 --> 00:34:08,565
And the other thing to think
about is if you're a specialist,

879
00:34:08,565 --> 00:34:10,185
really in any industry.

880
00:34:10,639 --> 00:34:14,209
Unless it's, unless it's AI model
training, I would say think of the

881
00:34:14,209 --> 00:34:18,290
generalized version of what you want to
do going forward, because you will be

882
00:34:18,290 --> 00:34:21,230
able to do many more jobs with AI's help.

883
00:34:21,439 --> 00:34:24,290
And what I'm saying, in
the games business is like you may

884
00:34:24,290 --> 00:34:27,529
have specialized in a specific type of
animation, or you may have specialized

885
00:34:27,529 --> 00:34:29,179
in a specific genre of something.

886
00:34:29,419 --> 00:34:32,750
If you and a small team can be more
generalists and think about ways you can

887
00:34:32,800 --> 00:34:34,180
split up your work amongst yourselves.

888
00:34:34,180 --> 00:34:36,880
I mean, Kevin and I have done this in the
kind of secret project we're working on.

889
00:34:37,240 --> 00:34:39,430
There's all sorts of ways that
you can open the door to this.

890
00:34:39,430 --> 00:34:42,370
So to me, it's cool to see Tim
Sweeney coming out and saying this.

891
00:34:42,370 --> 00:34:44,140
I think we're gonna be going
through this a little bit.

892
00:34:44,140 --> 00:34:47,560
I mean, obviously Breath of the Wild is
an amazing work of art,

893
00:34:47,560 --> 00:34:49,150
as are like many of the Nintendo games.

894
00:34:49,575 --> 00:34:53,265
I don't think he means per se that like
it's going to be exactly there yet, but

895
00:34:53,265 --> 00:34:54,915
he can see a world where it gets to it.

896
00:34:55,275 --> 00:34:59,265
If 10 people make Breath of the Wild,
that does not diminish Breath of the Wild.

897
00:34:59,265 --> 00:34:59,685
Yeah, totally.

898
00:34:59,715 --> 00:34:59,955
Right.

899
00:34:59,955 --> 00:35:00,255
Yeah.

900
00:35:00,255 --> 00:35:04,125
And that's an amazing thing for those 10
people to make another game that people

901
00:35:04,245 --> 00:35:07,935
absolutely love and still play to this
day, so, so let's not be upset that

902
00:35:07,965 --> 00:35:09,255
it's 10 people doing it potentially.

903
00:35:09,285 --> 00:35:09,465
Okay.

904
00:35:09,465 --> 00:35:12,435
You mentioned our super secret
project. As much as we love

905
00:35:12,470 --> 00:35:15,109
to pontificate about the
future of all things AI,

906
00:35:15,109 --> 00:35:17,930
As much as I love vibe coding and the
way it's unlocked me being able to

907
00:35:17,930 --> 00:35:21,169
develop things and, and, and others
are claiming that as well, you and I

908
00:35:21,169 --> 00:35:23,959
are hunting for a full stack developer.

909
00:35:23,990 --> 00:35:24,049
Yeah.

910
00:35:24,230 --> 00:35:26,330
Someone with years of
experience and expertise.

911
00:35:26,509 --> 00:35:28,790
You and I know the tools
and what they can do.

912
00:35:28,819 --> 00:35:28,879
Yeah.

913
00:35:28,879 --> 00:35:32,000
And, and how you can breathe apps
and experiences into existence.

914
00:35:32,000 --> 00:35:34,910
But there's still very much,
and I still think going to be

915
00:35:34,910 --> 00:35:37,069
very much a role for expertise.

916
00:35:37,075 --> 00:35:37,234
Yes.

917
00:35:37,520 --> 00:35:39,680
You know, for, for, for, by the way.

918
00:35:40,065 --> 00:35:42,404
If you happen to be a full stack
developer out there, reach out.

919
00:35:42,404 --> 00:35:43,185
Reach out, everybody.

920
00:35:43,185 --> 00:35:43,424
Reach out.

921
00:35:43,424 --> 00:35:45,795
Or someone with audio engine experience.

922
00:35:45,795 --> 00:35:45,855
Yeah.

923
00:35:45,855 --> 00:35:46,605
Or uh, whatever.

924
00:35:46,785 --> 00:35:48,525
Like, reach out to us.

925
00:35:48,529 --> 00:35:48,710
We're hiring.

926
00:35:48,710 --> 00:35:49,305
We're hiring for real.

927
00:35:49,305 --> 00:35:49,485
Yeah.

928
00:35:49,485 --> 00:35:49,665
Figure

929
00:35:49,665 --> 00:35:49,965
it out.

930
00:35:50,235 --> 00:35:52,815
Kevin, Flux Kontext came out last week.

931
00:35:52,815 --> 00:35:56,805
This is Black Forest Labs' update to
their Flux image model, which we love.

932
00:35:56,940 --> 00:35:58,230
We've loved for a while.

933
00:35:58,440 --> 00:36:02,940
Um, they have come out with Kontext, which
allows you to basically swap in different

934
00:36:02,940 --> 00:36:05,010
versions while keeping everything else the same.

935
00:36:05,130 --> 00:36:09,420
If you know, uh, OpenAI's image gen stuff,
this is a lot like ControlNet, right?

936
00:36:09,420 --> 00:36:10,530
It's a lot of that sort of stuff.

937
00:36:10,530 --> 00:36:12,030
So you can put an image of something up.

938
00:36:12,240 --> 00:36:16,620
And hopefully get back the same thing
or, or match a style in a specific way.

939
00:36:16,860 --> 00:36:19,050
And I'll point out it's natural
language editing as well.

940
00:36:19,050 --> 00:36:19,140
Yes.

941
00:36:19,140 --> 00:36:21,390
So if you snap a picture of
yourself and say, Hey, gimme a

942
00:36:21,390 --> 00:36:25,440
mohawk, or put me in a cool leather
jacket like Jensen, it can do it.

943
00:36:25,650 --> 00:36:27,120
Um, I know you played around with it.

944
00:36:27,120 --> 00:36:31,740
I, I very briefly poked at the API
and tried to, um, take a person

945
00:36:31,740 --> 00:36:32,970
and just modify their clothing.

946
00:36:32,970 --> 00:36:33,030
Yeah.

947
00:36:33,030 --> 00:36:35,580
Which is an example that everybody
is like, oh my God, this is amazing.

948
00:36:35,640 --> 00:36:38,790
I was, I gotta be honest, not
too impressed with the results.

949
00:36:38,910 --> 00:36:42,000
It put the clothing on the person,
but in the meantime completely

950
00:36:42,030 --> 00:36:43,140
modified their face.

951
00:36:43,140 --> 00:36:43,200
Yeah.

952
00:36:43,770 --> 00:36:45,330
Uh, and the pose and everything else.

953
00:36:45,330 --> 00:36:48,540
And I've just, like, that to me is one
of those examples that should just work.
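
[Editor's note: a minimal sketch of the kind of API call Kevin describes poking at, assuming Replicate's hosted black-forest-labs/flux-kontext-pro model and its prompt/input_image fields — the slug and field names are assumptions, so verify against the model page before relying on this.]

import replicate  # pip install replicate; needs REPLICATE_API_TOKEN set

# Ask Flux Kontext for one targeted edit while preserving everything else.
output = replicate.run(
    "black-forest-labs/flux-kontext-pro",  # assumed model slug on Replicate
    input={
        # natural-language edit: change one thing, keep the rest intact
        "prompt": "Put the person in a black leather jacket; keep their "
                  "face, pose, and background exactly the same",
        "input_image": open("person.jpg", "rb"),  # the photo being edited
    },
)
print(output)  # URL/file for the edited image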

954
00:36:48,630 --> 00:36:52,890
There was a great, uh, tweet thread
from Min Choi, king of the AI

955
00:36:52,920 --> 00:36:56,670
tweet threaders, where he talked about
a use case where it could restore

956
00:36:56,670 --> 00:36:59,550
and repair old photos; there's
a lot of good examples in there.

957
00:36:59,785 --> 00:37:03,895
There was another guy who went
by, uh, @8bit_e who showed

958
00:37:03,895 --> 00:37:08,665
what Flux Kontext plus Wan video
and ComfyUI altogether could do.

959
00:37:08,665 --> 00:37:08,755
Mm-hmm.

960
00:37:08,755 --> 00:37:12,055
So you can see in a, in a really
interesting open source workflow, if

961
00:37:12,055 --> 00:37:14,875
you're deep in that space, you could do
a lot of interesting stuff, but, okay.

962
00:37:15,055 --> 00:37:17,305
My examples here, if you look
in our, in the drive there.

963
00:37:17,820 --> 00:37:21,840
I wanted to see what I could do quickly,
because in my use case, what I'm

964
00:37:21,840 --> 00:37:24,660
trying to figure out is like, first
of all, is this better than something

965
00:37:24,660 --> 00:37:26,430
like 4o image gen off the shelf?

966
00:37:26,430 --> 00:37:27,840
So I used a couple models first.

967
00:37:27,840 --> 00:37:32,010
I used Flux Kontext Pro, which
is their basic like swap out, uh,

968
00:37:32,070 --> 00:37:33,720
genre look for something else.

969
00:37:33,750 --> 00:37:38,580
So on the Flux Kontext Pro page, uh,
on Replicate, you see Black Forest Labs.

970
00:37:38,580 --> 00:37:40,950
And this is a big thing where
you're trying to see like how it

971
00:37:41,040 --> 00:37:42,600
transforms something into the other.

972
00:37:42,930 --> 00:37:44,820
In this instance it says make a nineties

973
00:37:44,990 --> 00:37:45,979
cartoon.

974
00:37:46,189 --> 00:37:49,850
So I wanted to use that exact same
prompt to see what I could do.

975
00:37:49,850 --> 00:37:51,439
And then I compared it
to four oh image Jen.

976
00:37:51,439 --> 00:37:55,370
So if you see in the drive, I took our
thumbnail from last week and I just

977
00:37:55,370 --> 00:37:59,359
used the exact same prompt that they had
here, and I put it through Flux Kontext

978
00:37:59,359 --> 00:38:00,919
Pro and you know, it's not bad, right?

979
00:38:00,919 --> 00:38:03,350
You can see like, it's a kind
of an interesting output.

980
00:38:03,350 --> 00:38:04,729
Like it gets the text, right?

981
00:38:04,910 --> 00:38:08,569
I don't think it's like amazing, but
it, it does look like nineties cartoon.

982
00:38:08,569 --> 00:38:10,939
But then Kevin, go and
look at the version I did.

983
00:38:11,280 --> 00:38:15,420
With 4o image gen, which again,
we've had our problems with, but to me,

984
00:38:15,690 --> 00:38:20,250
much more nineties cartoon — like,
the text almost looks better to me.

985
00:38:20,250 --> 00:38:21,960
Our faces are definitely better.

986
00:38:22,140 --> 00:38:25,980
So this goes to the point of like, I
think if you're really good at using

987
00:38:25,980 --> 00:38:27,750
open source tools and ComfyUI,

988
00:38:28,390 --> 00:38:31,540
this is gonna be massive for you
because it's, it's a door you can open.

989
00:38:31,750 --> 00:38:36,760
The other thing I used was Flux Kontext
Max, which in their example shows how

990
00:38:36,760 --> 00:38:40,870
you can take a, um, a logo and put it
in a location, and it's a

991
00:38:40,870 --> 00:38:42,970
beautiful sort of 3D shadowed logo.

992
00:38:43,210 --> 00:38:44,050
I did the same thing.

993
00:38:44,050 --> 00:38:47,290
Now granted, our logo maybe doesn't fit
as well, but I did the same thing with

994
00:38:47,530 --> 00:38:50,115
Kontext Pro and 4o image gen.

995
00:38:50,440 --> 00:38:54,400
And again, I have to say like the better
version of it was kind of 4o image

996
00:38:54,400 --> 00:38:57,970
gen. If you look at the two different
versions — like, I used the

997
00:38:57,970 --> 00:39:00,130
same prompt and it put it in the location.

998
00:39:00,130 --> 00:39:03,790
Now, the Flux Kontext Pro came
a little closer to holding onto

999
00:39:03,790 --> 00:39:06,490
stuff, but if you see the examples,
they're not perfect, right?

1000
00:39:06,490 --> 00:39:08,860
They're not — it's not like it's
a major thing; it didn't

1001
00:39:08,860 --> 00:39:10,120
feel like a huge leg up to me.

1002
00:39:11,190 --> 00:39:11,700
But Al

1003
00:39:11,700 --> 00:39:12,660
for humans,

1004
00:39:12,780 --> 00:39:13,260
yeah, I do.

1005
00:39:13,260 --> 00:39:16,680
Is that about Al Borland or
Weird Al? Like which Al is that?

1006
00:39:16,950 --> 00:39:18,420
Yeah, I mean, that's — I'm
watching that — it totally got

1007
00:39:18,420 --> 00:39:19,140
it wrong, right?

1008
00:39:19,140 --> 00:39:21,840
Like, Al for Humans — it
thought that was AI for Humans.

1009
00:39:21,840 --> 00:39:26,130
So in my general experience:
if you are an open-source maxi,

1010
00:39:26,460 --> 00:39:28,860
if you love ComfyUI, this is great.

1011
00:39:29,069 --> 00:39:33,660
If you are more trying to just do
something simple, I still think 4o

1012
00:39:33,660 --> 00:39:37,470
image gen is probably better at a
lot of this stuff than you might need.

1013
00:39:37,470 --> 00:39:39,509
And that, that was my, my
initial, uh, take on it.

1014
00:39:39,735 --> 00:39:40,035
Okay.

1015
00:39:40,035 --> 00:39:43,065
Kev, another thing I tried quickly:
HeyGen Avatar 4 came out,

1016
00:39:43,065 --> 00:39:46,785
which, um, you know, is their most
updated talking avatar feature.

1017
00:39:47,055 --> 00:39:49,605
Um, just to show you like, you know,
this is, they did give us some credit,

1018
00:39:49,605 --> 00:39:50,625
so I wanna be clear about that.

1019
00:39:50,625 --> 00:39:52,275
I got some credits for
free to try this out.

1020
00:39:52,305 --> 00:39:56,085
I think it's good if you, if you play
the first clip, you'll get a sense of

1021
00:39:56,085 --> 00:40:00,255
like, I basically took a single screen
grab from a, from a, a mobile video

1022
00:40:00,255 --> 00:40:01,545
I made to see what I would look like.

1023
00:40:01,545 --> 00:40:02,895
So you can play that for everybody here.

1024
00:40:02,955 --> 00:40:05,535
This is my test of HeyGen's Avatar 4.

1025
00:40:06,015 --> 00:40:08,475
I was given some credits to
check it out and I'm wondering.

1026
00:40:08,830 --> 00:40:10,210
Do I need to do this anymore?

1027
00:40:10,600 --> 00:40:12,490
Well, you got Jordan Belfort mouth.

1028
00:40:12,490 --> 00:40:14,259
I know you got Wolf of Wall Street mouth.

1029
00:40:14,259 --> 00:40:14,980
I really do.

1030
00:40:14,980 --> 00:40:15,640
It's pretty

1031
00:40:15,640 --> 00:40:15,880
crazy.

1032
00:40:15,880 --> 00:40:16,810
Like Big Choppers.

1033
00:40:16,810 --> 00:40:16,960
Right.

1034
00:40:17,020 --> 00:40:17,290
Please,

1035
00:40:17,290 --> 00:40:19,360
if you're getting the
audio only go to this.

1036
00:40:19,360 --> 00:40:19,420
Yeah.

1037
00:40:19,420 --> 00:40:22,120
And watch it and tell me he's
not about to sell you a pen.

1038
00:40:22,299 --> 00:40:22,480
Yeah.

1039
00:40:22,480 --> 00:40:24,580
Like that is, that was full wolf mouth.

1040
00:40:24,580 --> 00:40:24,759
So

1041
00:40:24,759 --> 00:40:24,820
I

1042
00:40:24,820 --> 00:40:25,060
got

1043
00:40:25,060 --> 00:40:25,360
some, you

1044
00:40:25,360 --> 00:40:25,450
know,

1045
00:40:25,450 --> 00:40:27,174
you know what's, what's so
interesting about this to me,

1046
00:40:27,174 --> 00:40:30,100
and I've said this before with AI
avatars is like, look, it will work.

1047
00:40:30,100 --> 00:40:33,250
And, and one of the other big pushes for
them is like virtual avatars and like.

1048
00:40:33,509 --> 00:40:34,020
The difference.

1049
00:40:34,110 --> 00:40:36,630
Really the only main thing I have
problem with is the teeth look

1050
00:40:36,630 --> 00:40:39,150
different than my teeth and there's
a little funkiness to it as well.

1051
00:40:39,180 --> 00:40:41,700
Full disclosure, I use
HeyGen almost daily.

1052
00:40:41,700 --> 00:40:41,819
Yeah.

1053
00:40:41,819 --> 00:40:42,960
On some of my AI products.

1054
00:40:42,960 --> 00:40:45,180
I've known about these updates,
but couldn't say anything.

1055
00:40:45,180 --> 00:40:45,480
Yeah, sure.

1056
00:40:45,480 --> 00:40:46,770
So I'm, I'm glad they're finally here.

1057
00:40:46,770 --> 00:40:49,259
'cause I think the gesture support,
which maybe we'll get to Yep.

1058
00:40:49,259 --> 00:40:52,920
In their new video editor, which
is really like a text file editor

1059
00:40:52,920 --> 00:40:54,180
is super interesting to me.

1060
00:40:54,180 --> 00:40:55,410
But for this.

1061
00:40:55,665 --> 00:40:56,475
Avatar version.

1062
00:40:56,475 --> 00:40:58,515
Gavin, what did you have to
do to train it on yourself?

1063
00:40:58,515 --> 00:40:58,995
Uh, nothing.

1064
00:40:59,055 --> 00:40:59,715
Single picture.

1065
00:40:59,715 --> 00:41:03,465
So that's the thing, like, so this is a
single screen grab, so that is very cool.

1066
00:41:03,465 --> 00:41:03,675
Right?

1067
00:41:03,675 --> 00:41:03,735
Yeah.

1068
00:41:03,765 --> 00:41:08,145
But you know, in some ways, like it's not
that far off in terms of what you can do,

1069
00:41:08,235 --> 00:41:09,855
with all these other lip sync tools.

1070
00:41:09,975 --> 00:41:10,215
Correct.

1071
00:41:10,215 --> 00:41:13,065
Um, and I will say to your point
about, you know, you've been using

1072
00:41:13,065 --> 00:41:15,705
this on the enterprise side, I think
from a value standpoint, if you

1073
00:41:15,705 --> 00:41:17,865
are really doing a lot of work, or

1074
00:41:17,995 --> 00:41:20,845
really high-end work with one
model that you have created.

1075
00:41:20,845 --> 00:41:23,335
Like there's real value probably
in HeyGen's, like, backend.
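
[Editor's note: for context, a rough sketch of what that backend looks like — HeyGen exposes a REST API for generating avatar videos. The endpoint and field names below are recalled from public docs and should be treated as assumptions, not the show's own code; check HeyGen's API reference before using.]

import requests

# Generate a clip where a "talking photo" avatar (built from one still
# image, like Gavin's single screen grab) speaks a line of text.
resp = requests.post(
    "https://api.heygen.com/v2/video/generate",
    headers={"X-Api-Key": "YOUR_HEYGEN_API_KEY"},
    json={
        "video_inputs": [{
            "character": {"type": "talking_photo",
                          "talking_photo_id": "YOUR_TALKING_PHOTO_ID"},
            "voice": {"type": "text",
                      "input_text": "This is my test of Avatar 4.",
                      "voice_id": "YOUR_VOICE_ID"},
        }],
        "dimension": {"width": 1280, "height": 720},
    },
    timeout=60,
)
print(resp.json())  # returns a video_id you poll until the render finishes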

1076
00:41:23,605 --> 00:41:27,055
The other thing, Kevin, I'll show you is
I, I took a, I took a screen grab of a,

1077
00:41:27,055 --> 00:41:30,805
of our, uh — we gotta come up with a name
for our Terminator character that we

1078
00:41:30,805 --> 00:41:34,285
put in our thumbnail — and I tried to
make, uh, this character speak as well.

1079
00:41:34,285 --> 00:41:36,775
Because to me, again, the biggest
thing is like, how do you do

1080
00:41:36,805 --> 00:41:38,575
characters that aren't humanoid?

1081
00:41:38,785 --> 00:41:40,165
How does it recognize faces?

1082
00:41:40,165 --> 00:41:42,745
And it didn't do amazing with
that either in some form.

1083
00:41:42,900 --> 00:41:46,770
Ha ha, Kevin and Gavin, the world
isn't ready for me, Avatar 4 Bender.

1084
00:41:47,190 --> 00:41:47,760
Oh, interesting.

1085
00:41:47,760 --> 00:41:49,290
Yeah, I, I've seen some examples where

1086
00:41:49,965 --> 00:41:51,795
it, it works incredibly well.

1087
00:41:51,795 --> 00:41:51,855
Yeah.

1088
00:41:51,855 --> 00:41:55,005
Of animating like, you know,
weird characters or Pixar like

1089
00:41:55,005 --> 00:41:57,975
animals or whatever. That particular
test was a little off, and it might

1090
00:41:57,975 --> 00:42:00,675
just be, like, how it is, but
like, again, all this is just

1091
00:42:00,675 --> 00:42:01,725
getting better all the time.

1092
00:42:01,725 --> 00:42:03,855
And, and you know, if you really
wanna dive in, there's a, whatever,

1093
00:42:03,855 --> 00:42:05,655
an hour-long HeyGen, uh,

1094
00:42:05,805 --> 00:42:06,225
keynote.

1095
00:42:06,225 --> 00:42:07,785
There's a keynote presentation, a keynote.

1096
00:42:07,855 --> 00:42:10,765
I, I will say so, so the editor,
which is interesting, is because,

1097
00:42:10,795 --> 00:42:14,185
you know, they're, they're trying to
rethink the way video editors work.

1098
00:42:14,185 --> 00:42:17,185
If you've ever done any video work, you
know that it's usually layers and it's

1099
00:42:17,185 --> 00:42:21,025
like, you know, your, your footage with
your graphics on top and maybe some

1100
00:42:21,025 --> 00:42:25,345
sound design below, they're getting rid
of that almost entirely and going for,

1101
00:42:25,405 --> 00:42:29,245
you have your script, and then if you
want a graphic to appear, yeah, you can

1102
00:42:29,245 --> 00:42:31,765
go place it where it needs to appear,
but you can highlight the words and

1103
00:42:31,765 --> 00:42:33,145
say, this is where the graphic comes in.

1104
00:42:33,145 --> 00:42:34,075
This is where it goes away.

1105
00:42:34,225 --> 00:42:34,975
Similarly.

1106
00:42:35,210 --> 00:42:38,060
You could take a script of Gavin
saying, welcome to AI for humans.

1107
00:42:38,060 --> 00:42:40,280
I could highlight, welcome,
and say I want a thumbs up.

1108
00:42:40,400 --> 00:42:42,260
And I could, I want the
voice to be excited.

1109
00:42:42,260 --> 00:42:42,470
That's cool.

1110
00:42:42,470 --> 00:42:42,770
Yeah.

1111
00:42:42,830 --> 00:42:46,490
And on "humans," I want a point. And so,
you know, it's still very early — Yeah.

1112
00:42:46,490 --> 00:42:49,040
For this sort of stuff, but
you can imagine like it auto

1113
00:42:49,040 --> 00:42:51,620
generating a library of gestures.

1114
00:42:51,890 --> 00:42:51,950
Yeah.

1115
00:42:51,950 --> 00:42:54,530
Emotions, not just with the
voice, but in the, the facial

1116
00:42:54,530 --> 00:42:55,550
expressions of the character.

1117
00:42:55,730 --> 00:42:57,320
You sort of QC them and go, yep.

1118
00:42:57,320 --> 00:42:58,010
Okay, that works.

1119
00:42:58,010 --> 00:42:58,520
That's good.

1120
00:42:58,640 --> 00:42:59,660
And now suddenly,

1121
00:43:00,029 --> 00:43:04,290
Your avatar can be contextually
aware of what it's saying because you

1122
00:43:04,290 --> 00:43:06,180
know — the "You should try this

1123
00:43:06,180 --> 00:43:10,259
licorice" is delivered with the exact same
level of excitement and a smile as

1124
00:43:10,529 --> 00:43:12,810
"I'm sad, grandpa went to the farm upstate."

1125
00:43:12,839 --> 00:43:13,049
Yeah.

1126
00:43:13,109 --> 00:43:15,990
Like it's all the same performance,
so we gotta get more granular.
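
[Editor's note: a purely illustrative sketch of the script-as-timeline idea being described above — gesture and emotion cues hung off spans of the script text instead of layers on a video timeline. This is hypothetical and not HeyGen's actual format.]

# Hypothetical data structure: each cue points at a word range in the script.
script = "Welcome to AI for Humans."
annotations = [
    {"span": (0, 7), "gesture": "thumbs_up", "voice": "excited"},  # "Welcome"
    {"span": (11, 24), "gesture": "point_at_camera"},              # "AI for Humans"
]

for cue in annotations:
    start, end = cue["span"]
    print(f"{script[start:end]!r} -> {cue}")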

1127
00:43:15,990 --> 00:43:19,080
And one thing I'll say about that is we're
working on a presentation for next week.

1128
00:43:19,080 --> 00:43:19,410
By the way.

1129
00:43:19,410 --> 00:43:20,430
We are gonna be at the

1130
00:43:20,490 --> 00:43:22,859
Banff, uh, film festival, uh, next week.

1131
00:43:22,859 --> 00:43:24,299
So if you're there, take a look at us.

1132
00:43:24,299 --> 00:43:24,870
Come find us.

1133
00:43:24,870 --> 00:43:26,310
We'll — say hi to
us if you're a listener.

1134
00:43:26,399 --> 00:43:26,580
Yeah.

1135
00:43:26,580 --> 00:43:29,399
All of our listeners that are going
to the G7 summit right after

1136
00:43:29,399 --> 00:43:30,479
that, maybe come a little early.

1137
00:43:30,479 --> 00:43:30,540
Yeah.

1138
00:43:30,540 --> 00:43:31,379
Fly those PJs in.

1139
00:43:31,379 --> 00:43:31,770
Exactly.

1140
00:43:31,770 --> 00:43:32,189
Why not?

1141
00:43:32,220 --> 00:43:35,100
One of the things that's so interesting
to me about making content for that is

1142
00:43:35,100 --> 00:43:38,939
just how hard audio can still sometimes
be to get what you want out of it.

1143
00:43:38,939 --> 00:43:39,180
Right?

1144
00:43:39,180 --> 00:43:41,895
Like you really do have to think
about custom models and doing all

1145
00:43:41,895 --> 00:43:42,810
that sort of work to make it work.

1146
00:43:42,810 --> 00:43:43,350
Okay, Kevin?

1147
00:43:43,620 --> 00:43:44,520
We have to keep going.

1148
00:43:44,640 --> 00:43:48,570
Marc Andreessen has been starting to
talk about robotics in a big way, and

1149
00:43:48,600 --> 00:43:53,340
you have, you know, Marc Andreessen is
the head of a16z. They are one of

1150
00:43:53,340 --> 00:43:54,840
the biggest players in the VC business.

1151
00:43:54,840 --> 00:43:58,680
But in this clip specifically, he said
something that I think is important for

1152
00:43:58,680 --> 00:44:00,570
all of our listeners and viewers to hear.

1153
00:44:00,960 --> 00:44:03,540
You've all probably seen, you know,
Elon has this Optimus robot that he is

1154
00:44:03,540 --> 00:44:07,740
building, um, these humanoid robots — like
the, the, the general purpose robotics

1155
00:44:08,070 --> 00:44:10,890
thing is going to happen and it's
gonna happen in the next decade and

1156
00:44:10,890 --> 00:44:12,150
it's gonna happen at giant scale.

1157
00:44:12,150 --> 00:44:15,240
And I, I, I think there's a plausible
argument which Elon also believes

1158
00:44:15,240 --> 00:44:17,790
that robotics is gonna be the biggest
industry in the history of the planet.

1159
00:44:18,450 --> 00:44:19,080
It should be gigantic.

1160
00:44:19,080 --> 00:44:20,335
A billion, big billion.

1161
00:44:20,365 --> 00:44:20,695
That is

1162
00:44:20,695 --> 00:44:21,060
a big, think about it.

1163
00:44:21,060 --> 00:44:24,270
Now, granted — just to be clear — I'm
sure they have investments in a lot

1164
00:44:24,270 --> 00:44:28,110
of robotics work, but I do think we
talk about robots on the show a ton.

1165
00:44:28,430 --> 00:44:31,430
Humanoid robots are the things that
if you've been watching our show or

1166
00:44:31,430 --> 00:44:34,040
listening to our show for a couple years
and you were early on, all the stuff

1167
00:44:34,040 --> 00:44:37,160
we're talking about that's now come to
the mainstream, this is the next one.

1168
00:44:37,160 --> 00:44:37,430
Right?

1169
00:44:37,430 --> 00:44:42,260
And Unitree has just teased the
fact that they may be releasing a sub

1170
00:44:42,260 --> 00:44:44,630
$10,000 robot, which is a big deal.

1171
00:44:45,080 --> 00:44:47,990
We also know that we've seen the
Unitree battle bots that we

1172
00:44:47,990 --> 00:44:52,070
talked about last week, but having
a sub $10,000 humanoid robot is

1173
00:44:52,070 --> 00:44:53,810
something that's pretty significant.

1174
00:44:55,095 --> 00:44:55,245
Yeah.

1175
00:44:55,305 --> 00:44:57,944
Uh, listen, uh, we already know
that if it can fold laundry,

1176
00:44:57,944 --> 00:44:59,325
Gavin, you're gonna allow it Yep.

1177
00:44:59,325 --> 00:45:00,404
Into your household.

1178
00:45:00,734 --> 00:45:00,825
Yep.

1179
00:45:00,825 --> 00:45:04,125
I would, I would consider one
in mine at sub 10 K depending

1180
00:45:04,125 --> 00:45:05,654
upon what capabilities it had.

1181
00:45:05,654 --> 00:45:09,854
But, you know, again, this is, this
is such a race because as, as cool as

1182
00:45:09,854 --> 00:45:13,185
the simulations are that these things
can learn end to end through AI, by

1183
00:45:13,185 --> 00:45:16,185
simulating, walking around, picking
things up, the data that they're gonna

1184
00:45:16,185 --> 00:45:18,794
get from it, being in the real world.

1185
00:45:18,884 --> 00:45:18,944
Yeah.

1186
00:45:19,095 --> 00:45:21,854
Moving about interacting with
people is going to be massive.

1187
00:45:21,854 --> 00:45:23,654
And as long as it doesn't karate chop,

1188
00:45:23,690 --> 00:45:26,210
flail its arms like the robot
we had a couple weeks ago.

1189
00:45:26,810 --> 00:45:28,850
What if, you know, I could see
these things popping up everywhere.

1190
00:45:28,850 --> 00:45:29,090
What if it

1191
00:45:29,090 --> 00:45:30,170
could play badminton with you?

1192
00:45:30,170 --> 00:45:31,490
Kevin, would you like that?

1193
00:45:33,180 --> 00:45:36,380
You can charge me 20 bills for that baby.

1194
00:45:36,380 --> 00:45:38,660
Are you talking about the
robot dog with a little.

1195
00:45:39,165 --> 00:45:40,755
A little racket on its back.

1196
00:45:40,755 --> 00:45:41,025
Gavin?

1197
00:45:41,025 --> 00:45:41,265
Yeah,

1198
00:45:41,265 --> 00:45:42,645
so this is from ETH Zurich.

1199
00:45:42,645 --> 00:45:44,055
They're a research, uh, lab.

1200
00:45:44,055 --> 00:45:46,935
There's a very fun little video of
it playing badminton with somebody.

1201
00:45:46,935 --> 00:45:49,305
What's interesting about that to me
is that you, you know, you often see

1202
00:45:49,305 --> 00:45:52,634
these robots that are — we featured
one a long time ago on the show where

1203
00:45:52,634 --> 00:45:54,225
it's a tennis launching machine, right?

1204
00:45:54,225 --> 00:45:55,755
And it will launch a ball back and forth.

1205
00:45:55,755 --> 00:46:00,525
But an instance where now it's getting to
be pretty good at tracking where you are.

1206
00:46:00,525 --> 00:46:03,075
So you can imagine a world where
this is for practice, for sports

1207
00:46:03,075 --> 00:46:04,935
stuff, but not just for that.

1208
00:46:04,935 --> 00:46:08,475
Like, think of all the other use cases for
something like this, whether it's, like,

1209
00:46:08,685 --> 00:46:12,585
You know, a training model or it's
something even if you, you know, wanted

1210
00:46:12,585 --> 00:46:15,555
to have, I'm trying to think of some,
some other use case besides training.

1211
00:46:15,555 --> 00:46:16,484
Like I literally

1212
00:46:16,484 --> 00:46:18,165
can't think of another use case.

1213
00:46:18,674 --> 00:46:20,535
Badminton or bust.

1214
00:46:20,565 --> 00:46:21,464
That's all this thing.

1215
00:46:21,464 --> 00:46:22,004
Now, how about pickleball?

1216
00:46:22,004 --> 00:46:22,484
What's crazy?

1217
00:46:22,484 --> 00:46:22,964
Is it, how about pickleball?

1218
00:46:24,095 --> 00:46:25,505
No, it could never be good.

1219
00:46:25,505 --> 00:46:28,715
The robots will never be good
enough to play pickleball.

1220
00:46:28,715 --> 00:46:28,925
Fair enough.

1221
00:46:28,985 --> 00:46:30,035
You can clip that.

1222
00:46:30,185 --> 00:46:32,015
'cause it's never gonna
come back to haunt me.

1223
00:46:32,015 --> 00:46:33,155
Get outta my kitchen,

1224
00:46:33,215 --> 00:46:36,815
AI. It was trained in a simulated
environment, which is interesting.

1225
00:46:36,875 --> 00:46:36,995
Yep.

1226
00:46:37,025 --> 00:46:37,355
Right.

1227
00:46:37,535 --> 00:46:41,285
And when you watch the fluidity, the
speed with which this thing is

1228
00:46:41,285 --> 00:46:45,395
tracking the object, adjusting its little
robot dog position, whipping the thing,

1229
00:46:45,395 --> 00:46:48,605
it clearly has the knowledge and
understanding of where the racket head is

1230
00:46:48,755 --> 00:46:50,435
and it's thwacking it back to the player.

1231
00:46:51,419 --> 00:46:53,040
That is impressive.

1232
00:46:53,040 --> 00:46:54,750
And this, we say it every week.

1233
00:46:54,750 --> 00:46:56,189
This is as bad as it's gonna get.

1234
00:46:56,189 --> 00:46:58,830
So yeah, there are, there
are implications here.

1235
00:46:58,950 --> 00:47:01,740
I don't know exactly what they are
yet, but there are implications.

1236
00:47:01,740 --> 00:47:01,980
Speaking

1237
00:47:01,980 --> 00:47:04,740
of implications, it's time to talk
about the things you talked about and

1238
00:47:04,740 --> 00:47:06,120
showed off on the internet this week.

1239
00:47:06,120 --> 00:47:06,720
It's AI,

1240
00:47:06,720 --> 00:47:07,770
See What You Did There.

1241
00:47:07,980 --> 00:47:09,330
Sometimes, yes.

1242
00:47:10,589 --> 00:47:16,315
Then suddenly
you stop and shout

1243
00:47:24,090 --> 00:47:24,940
what you did

1244
00:47:25,030 --> 00:47:25,380
there.

1245
00:47:25,675 --> 00:47:29,275
Okay, Kevin — implications. We need
to talk about the implications of Veo

1246
00:47:29,275 --> 00:47:32,035
3, because this may be — I know
we've been through a lot of things.

1247
00:47:32,035 --> 00:47:34,855
This may be my all time favorite video.

1248
00:47:35,125 --> 00:47:39,295
I discovered this video on the Bard
subreddit, which is a hilarious thing

1249
00:47:39,295 --> 00:47:40,555
to me, but it has since blown up.

1250
00:47:40,555 --> 00:47:41,875
Let's play a little bit of this video.

1251
00:47:42,990 --> 00:47:47,009
A revolutionary athletic program
is challenging everything we

1252
00:47:47,009 --> 00:47:48,839
thought we knew about cats.

1253
00:47:49,049 --> 00:47:50,009
Here's Brooke Landry.

1254
00:47:51,089 --> 00:47:55,170
You might think a synchronized
swimming team for cats sounds insane.

1255
00:47:55,500 --> 00:47:58,645
But a brave team of trainers is
trying to prove—

1256
00:47:58,645 --> 00:48:00,245
Okay, so if you're not watching
what's on the screen.

1257
00:48:00,245 --> 00:48:00,884
Seattle Yeah.

1258
00:48:00,990 --> 00:48:01,980
what you see is very

1259
00:48:01,980 --> 00:48:02,819
believable.

1260
00:48:02,819 --> 00:48:04,950
local news, uh, coverage.

1261
00:48:04,950 --> 00:48:06,240
It's like they did a great job.

1262
00:48:06,240 --> 00:48:10,890
This guy, his name is notice underscore
analytics, the original poster, and he

1263
00:48:11,009 --> 00:48:15,404
actually created a news story about cat
synchronized swimming and like again.

1264
00:48:16,009 --> 00:48:18,319
These are the moments of joy you live for.

1265
00:48:18,319 --> 00:48:19,645
If you're following this, it's so good.

1266
00:48:19,645 --> 00:48:19,924
Yeah.

1267
00:48:19,924 --> 00:48:22,100
If you're following this world,
there's a big, there's some

1268
00:48:22,100 --> 00:48:23,359
big surprises towards the end.

1269
00:48:23,359 --> 00:48:25,669
I don't wanna ruin it, but
it's a very long, fun video.

1270
00:48:25,669 --> 00:48:25,910
You

1271
00:48:25,940 --> 00:48:29,629
have to, you have to go grab the show
notes if you don't watch the video of this.

1272
00:48:29,629 --> 00:48:33,169
If you do, then you already know the
sequined outfits that the cats are

1273
00:48:33,169 --> 00:48:36,919
wearing, the little swim goggles,
and then when it cuts to the trainer.

1274
00:48:37,655 --> 00:48:40,505
That's like it is a laugh out loud moment.

1275
00:48:40,505 --> 00:48:40,715
Yes.

1276
00:48:40,745 --> 00:48:44,045
In a, in, in a way that not
a lot of AI videos get me.

1277
00:48:44,285 --> 00:48:45,755
This was masterful.

1278
00:48:45,755 --> 00:48:46,805
I do wanna give a hint.

1279
00:48:46,805 --> 00:48:49,685
There's a little bit of a scandal
towards the end where the cats

1280
00:48:49,685 --> 00:48:53,585
might be, uh, ingesting substances
and there's a whole cover-up thing.

1281
00:48:53,585 --> 00:48:56,915
Anyway, really big shout out to
notice analytics for doing this.

1282
00:48:56,915 --> 00:48:57,185
It's great.

1283
00:48:57,305 --> 00:49:00,845
He was basically like, kind of wanted
to try something and this is the kind

1284
00:49:00,845 --> 00:49:04,595
of creative, fascinating stuff that
can come out when somebody just decides to

1285
00:49:04,655 --> 00:49:06,605
like dump their brain into AI video.

1286
00:49:07,245 --> 00:49:11,145
Yeah, I, I have seen, you know, this is
anecdotal, but on LinkedIn, Gavin, a lot

1287
00:49:11,145 --> 00:49:13,155
of, even folks that were never AI people.

1288
00:49:13,335 --> 00:49:13,395
Yeah.

1289
00:49:13,785 --> 00:49:15,225
Coming around to the, "Alright,"

1290
00:49:15,225 --> 00:49:17,925
So I decided to see what this
was all about and look at

1291
00:49:17,925 --> 00:49:19,125
this thing that I just made.

1292
00:49:19,125 --> 00:49:19,280
Yes, yes.

1293
00:49:19,285 --> 00:49:21,375
And it's usually a little
snippet of something or a spec

1294
00:49:21,375 --> 00:49:22,395
commercial or whatever else.

1295
00:49:22,515 --> 00:49:25,215
Veo 3 is inspiring a lot of people.

1296
00:49:25,245 --> 00:49:25,455
Yeah.

1297
00:49:25,515 --> 00:49:29,265
Again, it's anecdotal, but in my little
bubble, I'm seeing a lot of people

1298
00:49:29,390 --> 00:49:31,970
dipping a toe in the Veo 3
waters and getting inspired.

1299
00:49:31,970 --> 00:49:32,299
Totally.

1300
00:49:32,299 --> 00:49:34,250
And speaking of that, Kev,
there's a really cool thing that

1301
00:49:34,250 --> 00:49:38,810
somebody — Veel Hub on their
YouTube channel — made VX Flix.

1302
00:49:38,810 --> 00:49:43,640
And what this is is Sora and Veo 3,
but kind of integrated into a Netflix UI.

1303
00:49:44,330 --> 00:49:46,879
And so that it really does look
like there's all these kind of fake,

1304
00:49:46,910 --> 00:49:48,290
uh, movies that are playing.

1305
00:49:48,475 --> 00:49:50,964
You know what's cool about it is
you see one of the, what is the

1306
00:49:50,964 --> 00:49:54,745
main video is like Heavy Lies the
Cream, and it's like a story of like

1307
00:49:54,745 --> 00:49:56,634
putting cream into a cup of coffee.

1308
00:49:56,694 --> 00:50:01,435
There's a, uh, a fake reality show called
Gene Pool, and it's cool because each

1309
00:50:01,435 --> 00:50:04,735
of these is, is seen as like a popup
preview that you would see on Netflix.

1310
00:50:04,735 --> 00:50:04,855
The,

1311
00:50:04,860 --> 00:50:07,645
the Lex, the Lex Fridman standup special.

1312
00:50:07,645 --> 00:50:07,705
Yeah.

1313
00:50:08,174 --> 00:50:09,075
Yeah, exactly.

1314
00:50:09,134 --> 00:50:12,254
All this stuff is stuff that you can
do, but what's cool about seeing another

1315
00:50:12,254 --> 00:50:17,595
creative cool person putting this into a
format we know, it elevates it to something

1316
00:50:17,595 --> 00:50:18,794
else, which I think is very cool.

1317
00:50:18,884 --> 00:50:21,974
Um, I really quickly wanna shout out,
it's not a visual something, but Jack

1318
00:50:21,974 --> 00:50:25,875
Dorsey, the, uh, original Twitter
founder — and, if you will, I think he's

1319
00:50:26,465 --> 00:50:29,705
like the Rick — I know Rick Rubin
is now into vibe coding apparently,

1320
00:50:29,705 --> 00:50:32,015
but I feel like if there was ever
a Vibe coder, it would be Jack.

1321
00:50:32,645 --> 00:50:36,635
He said quote, I now spend two to three
hours per day reading research papers

1322
00:50:36,635 --> 00:50:40,385
and building something with Goose that
I didn't think it was capable of doing.

1323
00:50:40,385 --> 00:50:44,945
I never see a line of code and am never,
uh, never trapped in an IDE.

1324
00:50:44,975 --> 00:50:50,015
So Goose is, like, a skunkworks project —
this agentic coding thing.

1325
00:50:50,375 --> 00:50:50,765
Um.

1326
00:50:51,120 --> 00:50:52,740
What he's alluding to here,

1327
00:50:52,740 --> 00:50:55,620
what Jack is saying, is the promise
of this future that you and I have

1328
00:50:55,620 --> 00:51:00,509
talked about so many times, which is
like, it's great that AI can code, but

1329
00:51:00,509 --> 00:51:05,250
there's a huge barrier to entry anytime
you have to get into an environment

1330
00:51:05,310 --> 00:51:06,930
where coding actually happens.

1331
00:51:06,930 --> 00:51:06,990
Yeah.

1332
00:51:07,020 --> 00:51:08,040
In a meaningful way.

1333
00:51:08,160 --> 00:51:11,670
Like not little web, web apps, not like
tiny little snippets, but like real

1334
00:51:11,920 --> 00:51:13,990
actual, uh, structural things.

1335
00:51:14,110 --> 00:51:17,140
And what Jack is saying is that
he's spending hours a day writing

1336
00:51:17,140 --> 00:51:18,460
code without writing code.

1337
00:51:18,460 --> 00:51:22,240
He's just talking to Goose and
it is building things and despite

1338
00:51:22,720 --> 00:51:27,700
requiring some nudging every now and
then, it works nearly every time.

1339
00:51:27,790 --> 00:51:30,310
So again, hyping his own bag, so to speak.

1340
00:51:30,310 --> 00:51:33,160
But there's a lot of people saying
that is the future of video editing.

1341
00:51:33,340 --> 00:51:34,180
Of coding, yes.

1342
00:51:34,300 --> 00:51:36,460
Of painting, of visual effects.

1343
00:51:36,460 --> 00:51:39,850
It's not having to directly
manipulate the tools, it's

1344
00:51:39,850 --> 00:51:42,845
interfacing with the machine so
that it can do the work for you.

1345
00:51:42,875 --> 00:51:43,084
That's

1346
00:51:43,084 --> 00:51:43,294
right.

1347
00:51:43,294 --> 00:51:46,145
And it's a great way to kind of end
this on the whole idea that, like

1348
00:51:46,145 --> 00:51:49,205
for today's kind of thesis around the
show is really like these are gonna keep

1349
00:51:49,205 --> 00:51:50,794
getting better, as we've said all along.

1350
00:51:50,794 --> 00:51:55,174
And this is like one of the biggest like,
you know, on paper developers that has

1351
00:51:55,174 --> 00:51:58,325
existed in the last 10 to 15 years who
is diving into these tools full time.

1352
00:51:58,415 --> 00:51:59,734
So that is it today, everybody.

1353
00:51:59,734 --> 00:52:00,995
We will see you all next time.

1354
00:52:01,245 --> 00:52:03,765
Stick around, share your
stuff with us, and, uh, hopefully

1355
00:52:03,765 --> 00:52:04,545
we'll see you on the internet.

1356
00:52:04,754 --> 00:52:06,254
I thought this was "stick
around," as you were teasing.

1357
00:52:06,254 --> 00:52:07,424
What's coming up next, Gavin?

1358
00:52:07,605 --> 00:52:08,714
What, what's coming up next?

1359
00:52:08,714 --> 00:52:09,435
Our next meeting.

1360
00:52:09,524 --> 00:52:11,504
I don't know who, what, what are we, the,

1361
00:52:12,015 --> 00:52:12,884
what are we leading into?

1362
00:52:12,884 --> 00:52:13,424
That's a good question.

1363
00:52:13,424 --> 00:52:16,319
You came for AI for humans, but stick
around because it's what is, I guess

1364
00:52:16,319 --> 00:52:19,095
maybe it's one of the other AI
video people that we love and know.

1365
00:52:19,125 --> 00:52:19,305
Uh,

1366
00:52:19,305 --> 00:52:20,265
I was gonna be like bowling.

1367
00:52:20,325 --> 00:52:22,424
Bowling with bowling
for robots or something.

1368
00:52:22,424 --> 00:52:25,214
It'd be something similar like
a Veo "badminton with the boys."

1369
00:52:25,214 --> 00:52:25,515
Yeah.

1370
00:52:25,515 --> 00:52:27,879
Like, what is the dumb Veo
3 thing that's kinda bad?

1371
00:52:27,879 --> 00:52:28,720
Oh, "badminton with the boys" is

1372
00:52:28,720 --> 00:52:29,255
pretty good.

1373
00:52:29,255 --> 00:52:32,915
Or maybe it's a bunch of robots that
drink in an Irish pub and they talk about

1374
00:52:32,915 --> 00:52:35,855
their old lives as badminton
players, like it's that sort of thing.

1375
00:52:35,855 --> 00:52:36,515
Stick around.

1376
00:52:36,995 --> 00:52:37,565
Bye.

1377
00:52:37,595 --> 00:52:38,255
Bye everybody.