1
00:00:00,160 –> 00:00:04,359
I first met our next guest Sam Altman
2
00:00:02,159 –> 00:00:07,399
almost 20 years ago when he was working
3
00:00:04,359 –> 00:00:09,920
on a local mobile app called Loopt we
4
00:00:07,399 –> 00:00:11,960
were both backed by Sequoia Capital and in
5
00:00:09,920 –> 00:00:14,160
fact we were both in the first class of
6
00:00:11,960 –> 00:00:16,080
Sequoia Scouts he did an investment in a
7
00:00:14,160 –> 00:00:18,600
little unknown fintech company called
8
00:00:16,080 –> 00:00:20,039
Stripe I did Uber and in that tiny
9
00:00:18,600 –> 00:00:22,320
experiment did Uber I’ve never heard
10
00:00:20,039 –> 00:00:24,960
that before yeah I think so it’s
11
00:00:22,320 –> 00:00:28,199
possible starting already you should
12
00:00:24,960 –> 00:00:30,100
write a book Jacob
13
00:00:28,199 –> 00:00:31,880
maybe let your winners
14
00:00:30,100 –> 00:00:34,480
[Music]
15
00:00:31,880 –> 00:00:34,480
Rain Man
16
00:00:35,160 –> 00:00:40,400
David Sacks and it said we open sourced it to
17
00:00:37,760 –> 00:00:43,900
the fans and they’ve just gone crazy
18
00:00:40,400 –> 00:00:43,900
[Music]
19
00:00:43,920 –> 00:00:48,559
with that tiny experimental fund that
20
00:00:46,760 –> 00:00:50,559
Sam and I were part of, Scouts, is
21
00:00:48,559 –> 00:00:52,440
sequoia’s highest multiple returning
22
00:00:50,559 –> 00:00:54,680
fund a couple of low digit millions
23
00:00:52,440 –> 00:00:56,199
turned into over 200 million I’m told
24
00:00:54,680 –> 00:00:58,640
and then he did yeah that’s what I was
25
00:00:56,199 –> 00:01:00,440
told by Roelof yeah and he did a stint at
26
00:00:58,640 –> 00:01:03,480
Y Combinator where he was president from
27
00:01:00,440 –> 00:01:06,159
2014 to 2019 in 2016 he co-founded Open
28
00:01:03,480 –> 00:01:08,200
AI with the goal of ensuring that
29
00:01:06,159 –> 00:01:11,280
artificial general intelligence benefits
30
00:01:08,200 –> 00:01:14,000
all of humanity in 2019 he left YC to
31
00:01:11,280 –> 00:01:16,479
join OpenAI full-time as CEO things got
32
00:01:14,000 –> 00:01:19,400
really interesting on November 30th of
33
00:01:16,479 –> 00:01:22,560
2022 that’s the day open AI launched
34
00:01:19,400 –> 00:01:25,799
ChatGPT in January 2023 Microsoft
35
00:01:22,560 –> 00:01:28,799
invested 10 billion in November 2023
36
00:01:25,799 –> 00:01:30,880
over a crazy five-day span Sam was fired from
37
00:01:28,799 –> 00:01:33,200
OpenAI everybody was going to go work
38
00:01:30,880 –> 00:01:36,079
at Microsoft a bunch of heart emojis went
39
00:01:33,200 –> 00:01:38,200
viral on X/Twitter and people started
40
00:01:36,079 –> 00:01:40,320
speculating that the team had reached
41
00:01:38,200 –> 00:01:42,799
artificial general intelligence the
42
00:01:40,320 –> 00:01:44,640
world was going to end and suddenly a
43
00:01:42,799 –> 00:01:48,200
couple days later he was back to being
44
00:01:44,640 –> 00:01:49,960
the CEO of OpenAI in February Sam was
45
00:01:48,200 –> 00:01:53,000
reportedly looking to raise $7 trillion
46
00:01:49,960 –> 00:01:54,799
for an AI chip project this after it was
47
00:01:53,000 –> 00:01:57,159
reported that Sam was looking to raise a
48
00:01:54,799 –> 00:01:58,799
billion from Masayoshi Son to create an
49
00:01:57,159 –> 00:02:01,079
iPhone killer with Jony Ive the
50
00:01:58,799 –> 00:02:03,320
co-creator of the iPhone all of this
51
00:02:01,079 –> 00:02:05,920
while ChatGPT has become better and
52
00:02:03,320 –> 00:02:08,000
better and a household name is having a
53
00:02:05,920 –> 00:02:10,679
massive impact on how we work and how
54
00:02:08,000 –> 00:02:12,520
work is getting done and it’s reportedly
55
00:02:10,679 –> 00:02:15,920
the fastest product to hit 100 million
56
00:02:12,520 –> 00:02:18,000
users in history in just two months and
57
00:02:15,920 –> 00:02:19,800
check out OpenAI's insane revenue
58
00:02:18,000 –> 00:02:23,360
ramp up they reportedly hit two billion
59
00:02:19,800 –> 00:02:26,200
in ARR last year welcome to the All-In
60
00:02:23,360 –> 00:02:28,239
podcast Sam Altman thank you thank you
61
00:02:26,200 –> 00:02:30,040
guys Sacks you want to lead us off here okay
62
00:02:28,239 –> 00:02:32,040
sure I mean I I think the whole industry
63
00:02:30,040 –> 00:02:35,080
is waiting with bated breath for the
64
00:02:32,040 –> 00:02:36,599
release of GPT-5 I guess it's been
65
00:02:35,080 –> 00:02:38,200
reported that it’s launching sometime
66
00:02:36,599 –> 00:02:41,000
this summer but that’s a pretty big
67
00:02:38,200 –> 00:02:42,640
window can you narrow that down I guess
68
00:02:41,000 –> 00:02:46,760
where where are you in the release of
69
00:02:42,640 –> 00:02:51,040
GPT-5 uh we we take our time on releases
70
00:02:46,760 –> 00:02:54,239
of major new models and I don’t think we
71
00:02:51,040 –> 00:02:56,400
uh I think it will be great uh when we
72
00:02:54,239 –> 00:02:58,080
do it and I think we’ll be thoughtful
73
00:02:56,400 –> 00:02:59,360
about how we do it uh like we may
74
00:02:58,080 –> 00:03:01,840
release it in a different way than we’ve
75
00:02:59,360 –> 00:03:03,920
released previous models um also I
76
00:03:01,840 –> 00:03:06,879
don’t even know if we’ll call it gbt 5
77
00:03:03,920 –> 00:03:08,000
um what I what I will say is you know a
78
00:03:06,879 –> 00:03:11,239
lot of people have noticed how much
79
00:03:08,000 –> 00:03:12,680
better GPT-4 has gotten um since we've
80
00:03:11,239 –> 00:03:14,920
released it and particularly over the
81
00:03:12,680 –> 00:03:18,360
last few months I
82
00:03:14,920 –> 00:03:20,519
think I think that’s like a better hint
83
00:03:18,360 –> 00:03:24,200
of what the world looks like where it’s
84
00:03:20,519 –> 00:03:27,239
not the like 1 2 3 4 5 6 7 but
85
00:03:24,200 –> 00:03:28,599
you you just you use an AI system and
86
00:03:27,239 –> 00:03:32,560
the whole system just gets better and
87
00:03:28,599 –> 00:03:34,799
better fairly continuously um I think
88
00:03:32,560 –> 00:03:36,159
that’s like both a better technological
89
00:03:34,799 –> 00:03:39,200
direction I think that's like easier for
90
00:03:36,159 –> 00:03:41,519
society to adapt to um
91
00:03:39,200 –> 00:03:42,760
but but I assume that’s where we’ll head
92
00:03:41,519 –> 00:03:44,959
does that mean that there’s not going to
93
00:03:42,760 –> 00:03:47,920
be long training cycles and it’s
94
00:03:44,959 –> 00:03:49,680
continuously retraining or training
95
00:03:47,920 –> 00:03:51,000
submodels Sam and maybe you could just
96
00:03:49,680 –> 00:03:53,480
speak to us about what might change
97
00:03:51,000 –> 00:03:54,920
architecturally going forward with
98
00:03:53,480 –> 00:03:58,480
respect to large
99
00:03:54,920 –> 00:04:00,000
models well I mean one one one thing
100
00:03:58,480 –> 00:04:02,799
that you could imagine is this just that
101
00:04:00,000 –> 00:04:04,120
you keep training right a model uh that
102
00:04:02,799 –> 00:04:05,159
that would seem like a reasonable thing
103
00:04:04,120 –> 00:04:07,280
to
104
00:04:05,159 –> 00:04:09,519
me do you think you talked about
105
00:04:07,280 –> 00:04:11,280
releasing it differently this time are
106
00:04:09,519 –> 00:04:14,439
you thinking maybe releasing it to the
107
00:04:11,280 –> 00:04:17,280
paid users first or you know a slower
108
00:04:14,439 –> 00:04:19,320
roll out to get the red teams tight
109
00:04:17,280 –> 00:04:20,919
since now there’s so much at stake you
110
00:04:19,320 –> 00:04:22,840
have so many customers actually paying
111
00:04:20,919 –> 00:04:26,680
and you’ve got everybody watching
112
00:04:22,840 –> 00:04:29,160
everything you do you know is it is it
113
00:04:26,680 –> 00:04:30,520
more careful now yeah still only available to
114
00:04:29,160 –> 00:04:32,479
the paid users but one of the
115
00:04:30,520 –> 00:04:34,919
things that we really want to do is
116
00:04:32,479 –> 00:04:36,960
figure out how to make more advanced
117
00:04:34,919 –> 00:04:39,400
technology available to free users too I
118
00:04:36,960 –> 00:04:42,199
think that’s a super important part of
119
00:04:39,400 –> 00:04:44,520
our mission uh and and this idea that we
120
00:04:42,199 –> 00:04:45,440
build AI tools and make them super
121
00:04:44,520 –> 00:04:47,680
widely
122
00:04:45,440 –> 00:04:49,720
available free or you know not that
123
00:04:47,680 –> 00:04:51,400
expensive whatever it is so that people
124
00:04:49,720 –> 00:04:53,800
can use them to go kind of invent the
125
00:04:51,400 –> 00:04:55,880
future rather than the magic AGI in the
126
00:04:53,800 –> 00:04:58,600
sky inventing the future and showering it
127
00:04:55,880 –> 00:05:00,039
down upon us uh that seems like a much
128
00:04:58,600 –> 00:05:01,160
better path it seems like a more inspiring
129
00:05:00,039 –> 00:05:04,560
path I also think it’s where things are
130
00:05:01,160 –> 00:05:06,840
actually heading so it makes me sad that
131
00:05:04,560 –> 00:05:08,720
we have not figured out how to make GPT-4
132
00:05:06,840 –> 00:05:10,600
level technology available to free users
133
00:05:08,720 –> 00:05:12,240
it’s something we really want to do it’s
134
00:05:10,600 –> 00:05:15,479
just very expensive I take it it's very
135
00:05:12,240 –> 00:05:17,960
expensive yeah Chamath your thoughts I
136
00:05:15,479 –> 00:05:20,759
think maybe the the two big vectors Sam
137
00:05:17,960 –> 00:05:23,600
that people always talk about are the
138
00:05:20,759 –> 00:05:26,400
underlying cost and sort of the latency
139
00:05:23,600 –> 00:05:27,440
that’s kind of rate limited a killer app
140
00:05:26,400 –> 00:05:30,280
and
141
00:05:27,440 –> 00:05:33,120
then I think the second is sort of the
142
00:05:30,280 –> 00:05:34,919
long-term ability for people to build in
143
00:05:33,120 –> 00:05:37,240
an open source world versus a closed
144
00:05:34,919 –> 00:05:40,919
source world and I think the crazy thing
145
00:05:37,240 –> 00:05:43,080
about this space is that the open source
146
00:05:40,919 –> 00:05:44,840
community is rabid so one example that I
147
00:05:43,080 –> 00:05:47,880
think is incredible is you know we had
148
00:05:44,840 –> 00:05:50,160
these guys do a pretty crazy demo for
149
00:05:47,880 –> 00:05:52,720
Devin remember like even like five or
150
00:05:50,160 –> 00:05:54,840
six weeks ago that looked incredible and
151
00:05:52,720 –> 00:05:58,120
then some kid just published it under an
152
00:05:54,840 –> 00:06:00,520
open MIT license like OpenDevin and
153
00:05:58,120 –> 00:06:02,720
it’s incredibly
154
00:06:00,520 –> 00:06:05,120
good and almost as good as that other
155
00:06:02,720 –> 00:06:07,639
thing that was closed source so maybe we
156
00:06:05,120 –> 00:06:09,759
can just start with that which is tell
157
00:06:07,639 –> 00:06:12,000
me about the business decision to keep
158
00:06:09,759 –> 00:06:14,680
these models closed source and where do
159
00:06:12,000 –> 00:06:16,960
you see things going in the next couple
160
00:06:14,680 –> 00:06:19,840
years so on the first part of your
161
00:06:16,960 –> 00:06:23,319
question um speed and cost those are
162
00:06:19,840 –> 00:06:26,120
hugely important to us and I don’t want
163
00:06:23,319 –> 00:06:27,840
to like give a timeline on when we can
164
00:06:26,120 –> 00:06:30,639
bring them down a lot cuz research is
165
00:06:27,840 –> 00:06:32,720
hard but I am confident we’ll be able to
166
00:06:30,639 –> 00:06:34,639
um we want to like cut the latency super
167
00:06:32,720 –> 00:06:37,319
dramatically we want to cut the cost
168
00:06:34,639 –> 00:06:39,199
really really dramatically um and I
169
00:06:37,319 –> 00:06:42,039
believe that will happen we’re still so
170
00:06:39,199 –> 00:06:43,960
early in the development of the science
171
00:06:42,039 –> 00:06:47,800
and understanding how this works plus we
172
00:06:43,960 –> 00:06:50,280
have all the engineering tailwinds so
173
00:06:47,800 –> 00:06:52,120
I I don’t know like when we get to
174
00:06:50,280 –> 00:06:53,800
intelligence too cheap to meter and so
175
00:06:52,120 –> 00:06:56,560
fast that it feels instantaneous to us
176
00:06:53,800 –> 00:07:00,240
and everything else but I do believe we
177
00:06:56,560 –> 00:07:04,120
can get there for you know a pretty high
178
00:07:00,240 –> 00:07:06,039
level of intelligence and um I it’s
179
00:07:04,120 –> 00:07:08,479
important to us it’s clearly important
180
00:07:06,039 –> 00:07:10,440
to users and it’ll unlock a lot of stuff
181
00:07:08,479 –> 00:07:12,360
on the sort of open source closed source
182
00:07:10,440 –> 00:07:14,599
thing I think there’s great roles for
183
00:07:12,360 –> 00:07:16,560
both I think
184
00:07:14,599 –> 00:07:17,919
um you know we’ve open sourced some
185
00:07:16,560 –> 00:07:20,360
stuff we’ll open source more stuff in
186
00:07:17,919 –> 00:07:22,919
the future but really like our mission
187
00:07:20,360 –> 00:07:24,400
is to build towards AGI and to figure
188
00:07:22,919 –> 00:07:26,919
out how to broadly distribute its
189
00:07:24,400 –> 00:07:28,520
benefits we have a strategy for that
190
00:07:26,919 –> 00:07:30,319
seems to be resonating with a lot of
191
00:07:28,520 –> 00:07:32,479
people it obviously isn’t for everyone
192
00:07:30,319 –> 00:07:34,039
and there’s like a big ecosystem and
193
00:07:32,479 –> 00:07:36,560
there will also be open source models
194
00:07:34,039 –> 00:07:38,319
and people who build that way um one
195
00:07:36,560 –> 00:07:41,080
area that I’m particularly interested
196
00:07:38,319 –> 00:07:43,440
personally in open source for is I want
197
00:07:41,080 –> 00:07:47,440
an open source model that is as good as
198
00:07:43,440 –> 00:07:49,440
it can be that runs on my phone and that
199
00:07:47,440 –> 00:07:51,000
I think is going to you know the world
200
00:07:49,440 –> 00:07:52,960
doesn’t quite have the technology for
201
00:07:51,000 –> 00:07:54,759
a good version of that yet but that
202
00:07:52,960 –> 00:07:56,400
seems like a really important thing to
203
00:07:54,759 –> 00:07:57,599
go do at some point will you do will you
204
00:07:56,400 –> 00:07:59,680
do that will you release I don’t know if
205
00:07:57,599 –> 00:08:03,199
we will or someone will but someone
206
00:07:59,680 –> 00:08:05,159
Llama 3 Llama 3 running on a phone well
207
00:08:03,199 –> 00:08:08,199
I guess maybe there’s like a seven
208
00:08:05,159 –> 00:08:09,840
billion parameter version yeah yeah uh I don't
209
00:08:08,199 –> 00:08:13,039
know if that’s if that will fit on a
210
00:08:09,840 –> 00:08:14,720
phone or not but that should be fittable
211
00:08:13,039 –> 00:08:16,319
on a phone but I don’t I I’m not I’m not
212
00:08:14,720 –> 00:08:17,479
sure if that one is like I haven’t
213
00:08:16,319 –> 00:08:18,960
played with it I don’t know if it’s like
214
00:08:17,479 –> 00:08:21,199
good enough to kind of do the thing I’m
215
00:08:18,960 –> 00:08:23,240
thinking about here so when when Llama 3
216
00:08:21,199 –> 00:08:24,879
got released I think the big takeaway
217
00:08:23,240 –> 00:08:26,840
for a lot of people was oh wow they’ve
218
00:08:24,879 –> 00:08:29,560
like caught up to GPT-4 I don't think
219
00:08:26,840 –> 00:08:31,319
it’s equal in all Dimensions but it’s
220
00:08:29,560 –> 00:08:33,880
like pretty pretty close or pretty in
221
00:08:31,319 –> 00:08:36,440
the ballpark I guess the question is you
222
00:08:33,880 –> 00:08:38,919
know you guys released four a while ago
223
00:08:36,440 –> 00:08:41,159
you’re working on five or you know more
224
00:08:38,919 –> 00:08:43,519
upgrades to four I mean I think to
225
00:08:41,159 –> 00:08:45,640
Chamath's point about Devin how do you stay
226
00:08:43,519 –> 00:08:47,480
ahead of open source I mean it's just
227
00:08:45,640 –> 00:08:49,320
that’s like a very hard thing to do in
228
00:08:47,480 –> 00:08:50,440
general right I mean how do you think
229
00:08:49,320 –> 00:08:54,399
about
230
00:08:50,440 –> 00:08:57,160
that what we’re trying to do is not
231
00:08:54,399 –> 00:08:59,760
make the sort of
232
00:08:57,160 –> 00:09:01,600
smartest set of weights that we can
233
00:08:59,760 –> 00:09:04,079
but what we’re trying to make is like
234
00:09:01,600 –> 00:09:06,839
this useful intelligence layer for
235
00:09:04,079 –> 00:09:09,399
people to use and a model is part of
236
00:09:06,839 –> 00:09:11,519
that I think we will stay pretty far
237
00:09:09,399 –> 00:09:13,680
ahead of I hope we’ll stay pretty far
238
00:09:11,519 –> 00:09:16,800
ahead of the rest of the world on that
239
00:09:13,680 –> 00:09:19,920
um but there’s a lot of other work
240
00:09:16,800 –> 00:09:22,880
around the whole system that’s not just
241
00:09:19,920 –> 00:09:24,720
that you know the the model weights and
242
00:09:22,880 –> 00:09:26,120
we’ll have to build up enduring value
243
00:09:24,720 –> 00:09:28,120
the old-fashioned way like any other
244
00:09:26,120 –> 00:09:29,680
business does we'll have to figure out a
245
00:09:28,120 –> 00:09:31,720
great product and reasons to stick with
246
00:09:29,680 –> 00:09:34,959
it and uh you know deliver it at a great
247
00:09:31,720 –> 00:09:37,920
price when you founded the organization
248
00:09:34,959 –> 00:09:39,440
you the stated goal or part of what you
249
00:09:37,920 –> 00:09:41,560
discussed was hey this is too important
250
00:09:39,440 –> 00:09:43,240
for any one company to own it so
251
00:09:41,560 –> 00:09:45,680
therefore it needs to be open then there
252
00:09:43,240 –> 00:09:48,200
was the switch hey it’s too dangerous
253
00:09:45,680 –> 00:09:49,720
for anybody to be able to see it and we
254
00:09:48,200 –> 00:09:52,720
need to lock this down because you you
255
00:09:49,720 –> 00:09:54,399
had some fear about that I think is that
256
00:09:52,720 –> 00:09:56,640
accurate because the cynical side is
257
00:09:54,399 –> 00:09:59,880
like well this is a capitalistic move
258
00:09:56,640 –> 00:10:01,920
and then the I think
259
00:09:59,880 –> 00:10:03,920
you know I’m I’m curious what the
260
00:10:01,920 –> 00:10:06,480
decision was here in terms of going from
261
00:10:03,920 –> 00:10:08,360
open we the world needs to see this it’s
262
00:10:06,480 –> 00:10:11,120
really important to closed only we can
263
00:10:08,360 –> 00:10:12,480
see it well how did you come to that
264
00:10:11,120 –> 00:10:14,320
that conclusion what were the disc part
265
00:10:12,480 –> 00:10:15,800
of the reason that we released ChatGPT
266
00:10:14,320 –> 00:10:17,480
was we want the world to see this and
267
00:10:15,800 –> 00:10:19,240
we’ve been trying to tell people that AI
268
00:10:17,480 –> 00:10:22,399
is really important and if you go back
269
00:10:19,240 –> 00:10:23,560
to like uh October of 2022 not that many
270
00:10:22,399 –> 00:10:24,600
people thought AI was going to be that
271
00:10:23,560 –> 00:10:28,279
important or that it was really
272
00:10:24,600 –> 00:10:30,120
happening um and a huge part of what we
273
00:10:28,279 –> 00:10:33,040
we try to do
274
00:10:30,120 –> 00:10:34,600
is put the technology in the hands of
275
00:10:33,040 –> 00:10:35,920
people uh now again there’s different
276
00:10:34,600 –> 00:10:37,880
ways to do that and I think there really
277
00:10:35,920 –> 00:10:40,240
is an important role to just say like
278
00:10:37,880 –> 00:10:42,600
here’s the weights have at it but the
279
00:10:40,240 –> 00:10:44,760
fact that we have so many people using a
280
00:10:42,600 –> 00:10:46,200
free version of ChatGPT that we don't
281
00:10:44,760 –> 00:10:47,560
you know we don’t run ads on we don’t
282
00:10:46,200 –> 00:10:48,920
try to like make money on we just put it
283
00:10:47,560 –> 00:10:51,680
out there because we want people to have
284
00:10:48,920 –> 00:10:54,360
these tools um I think it’s done a lot
285
00:10:51,680 –> 00:10:56,440
to provide a lot of value and you know
286
00:10:54,360 –> 00:10:59,399
teach people how to fish but also to get
287
00:10:56,440 –> 00:11:01,200
the world um really thoughtful about
288
00:10:59,399 –> 00:11:03,040
what’s happening here now we still don’t
289
00:11:01,200 –> 00:11:04,560
have all the answers and uh we’re
290
00:11:03,040 –> 00:11:06,000
fumbling our way through this like
291
00:11:04,560 –> 00:11:08,560
everybody else and I assume we’ll change
292
00:11:06,000 –> 00:11:11,079
strategy many more times as we learn new
293
00:11:08,560 –> 00:11:13,399
things you know when we started OpenAI
294
00:11:11,079 –> 00:11:15,920
we had really no idea about how things
295
00:11:13,399 –> 00:11:17,160
were going to go um that we’d make a
296
00:11:15,920 –> 00:11:20,160
language model that we’d ever make a
297
00:11:17,160 –> 00:11:21,360
product we started off just I remember
298
00:11:20,160 –> 00:11:23,880
very clearly that first day where we’re
299
00:11:21,360 –> 00:11:24,959
like well now we’re all here that was
300
00:11:23,880 –> 00:11:26,839
you know it was difficult to get this
301
00:11:24,959 –> 00:11:28,040
set up but what happens now maybe we
302
00:11:26,839 –> 00:11:30,360
should write some papers maybe we should
303
00:11:28,040 –> 00:11:32,279
stand around a whiteboard and we’ve just
304
00:11:30,360 –> 00:11:33,680
been trying to like put one foot in
305
00:11:32,279 –> 00:11:36,519
front of the other and figure out what’s
306
00:11:33,680 –> 00:11:39,079
next and what’s next and what’s next
307
00:11:36,519 –> 00:11:40,480
and I think we’ll keep doing that can I
308
00:11:39,079 –> 00:11:42,360
just replay something and just make sure
309
00:11:40,480 –> 00:11:45,000
I heard it right I think what you were
310
00:11:42,360 –> 00:11:48,519
saying on the open source closed source
311
00:11:45,000 –> 00:11:50,440
thing is if I heard it right all these
312
00:11:48,519 –> 00:11:52,399
models independent of the business
313
00:11:50,440 –> 00:11:55,720
decision you make are going to become
314
00:11:52,399 –> 00:11:57,639
asymptotically accurate towards some
315
00:11:55,720 –> 00:11:58,920
amount of accuracy like not all but like
316
00:11:57,639 –> 00:12:00,760
let’s just say there’s four or five that
317
00:11:58,920 –> 00:12:03,200
are
318
00:12:00,760 –> 00:12:05,680
well capitalized enough you guys Meta
319
00:12:03,200 –> 00:12:08,440
Google Microsoft whomever right so let’s
320
00:12:05,680 –> 00:12:12,320
just say four or five maybe one
321
00:12:08,440 –> 00:12:15,000
startup and on the open web and then
322
00:12:12,320 –> 00:12:16,880
quickly the accuracy or the value of
323
00:12:15,000 –> 00:12:18,560
these models will probably shift to
324
00:12:16,880 –> 00:12:20,639
these proprietary sources of training
325
00:12:18,560 –> 00:12:22,480
data that you could get that others
326
00:12:20,639 –> 00:12:25,320
can’t or others can get that you can’t
327
00:12:22,480 –> 00:12:27,720
is that how you see this thing evolving
328
00:12:25,320 –> 00:12:29,720
where the open web gets everybody to a
329
00:12:27,720 –> 00:12:33,240
certain threshold and then it’s just an
330
00:12:29,720 –> 00:12:34,560
arms race for data beyond that doesn’t
331
00:12:33,240 –> 00:12:36,160
so I definitely don’t think it’ll be an
332
00:12:34,560 –> 00:12:37,880
arms race for data because when the
333
00:12:36,160 –> 00:12:39,880
models get smart enough at some point it
334
00:12:37,880 –> 00:12:41,880
shouldn’t be about more data at least
335
00:12:39,880 –> 00:12:45,440
not for training it may matter data to
336
00:12:41,880 –> 00:12:48,279
make it useful um look the the one thing
337
00:12:45,440 –> 00:12:50,320
that I have learned most throughout all
338
00:12:48,279 –> 00:12:51,760
this is that uh it’s hard to make
339
00:12:50,320 –> 00:12:53,000
confident statements a couple of years
340
00:12:51,760 –> 00:12:54,880
in the future about where this is all
341
00:12:53,000 –> 00:12:58,399
going to go and so I don’t want to try
342
00:12:54,880 –> 00:13:01,760
now I I will say that I I expect lots of
343
00:12:58,399 –> 00:13:04,760
very capable models in the world and you
344
00:13:01,760 –> 00:13:07,279
know like it feels to me like we just
345
00:13:04,760 –> 00:13:08,519
like stumbled on a new fact of nature or
346
00:13:07,279 –> 00:13:13,920
science or whatever you want to call it
347
00:13:08,519 –> 00:13:15,560
which is like we can create you can like
348
00:13:13,920 –> 00:13:18,000
I mean I don’t believe this literally
349
00:13:15,560 –> 00:13:19,519
but it’s like a spiritual point you know
350
00:13:18,000 –> 00:13:21,440
intelligence is just this emergent
351
00:13:19,519 –> 00:13:22,760
property of matter and that’s like a
352
00:13:21,440 –> 00:13:24,959
that’s like a rule of physics or
353
00:13:22,760 –> 00:13:26,360
something um so people are going to
354
00:13:24,959 –> 00:13:27,560
figure that out but there will be all
355
00:13:26,360 –> 00:13:29,120
these different ways to design the
356
00:13:27,560 –> 00:13:32,839
systems people will make different
357
00:13:29,120 –> 00:13:36,199
choices figure out new ideas and I’m
358
00:13:32,839 –> 00:13:39,079
sure like you
359
00:13:36,199 –> 00:13:41,399
know like any other industry I would
360
00:13:39,079 –> 00:13:43,079
expect there to be multiple approaches
361
00:13:41,399 –> 00:13:44,279
and different people like different ones
362
00:13:43,079 –> 00:13:46,480
you know some people like iPhones some
363
00:13:44,279 –> 00:13:48,160
people like an Android phone I think
364
00:13:46,480 –> 00:13:50,680
there’ll be some effect like that let’s
365
00:13:48,160 –> 00:13:53,240
go back to that first section of just
366
00:13:50,680 –> 00:13:55,519
the the cost and the
367
00:13:53,240 –> 00:13:58,120
speed all of you guys are sort of a
368
00:13:55,519 –> 00:14:00,120
little bit rate limited on literally
369
00:13:58,120 –> 00:14:02,480
nvidia’s throughput right and I think
370
00:14:00,120 –> 00:14:04,480
that you and most everybody else have
371
00:14:02,480 –> 00:14:06,000
sort of effectively announced how much
372
00:14:04,480 –> 00:14:08,639
capacity you can get just because it’s
373
00:14:06,000 –> 00:14:10,480
as much as they can spin out what needs
374
00:14:08,639 –> 00:14:14,279
to happen at the substrate so that you
375
00:14:10,480 –> 00:14:16,720
can actually compute cheaper compute
376
00:14:14,279 –> 00:14:19,920
faster get access to more energy how are
377
00:14:16,720 –> 00:14:22,040
you helping to frame out the industry
378
00:14:19,920 –> 00:14:24,040
solving those problems well we we’ll
379
00:14:22,040 –> 00:14:26,000
make huge algorithmic gains for sure and
380
00:14:24,040 –> 00:14:28,079
I don’t want to Discount that I you know
381
00:14:26,000 –> 00:14:30,680
I'm very interested in chips and energy
382
00:14:28,079 –> 00:14:32,639
but if we can make our if we can make a
383
00:14:30,680 –> 00:14:34,880
same quality model twice as efficient
384
00:14:32,639 –> 00:14:37,800
that’s like we had twice as much compute
385
00:14:34,880 –> 00:14:42,480
right and I think there’s a gigantic
386
00:14:37,800 –> 00:14:44,040
amount of work to be done there uh and I
387
00:14:42,480 –> 00:14:47,040
hope we’ll start really seeing those
388
00:14:44,040 –> 00:14:48,639
results um other than that the whole
389
00:14:47,040 –> 00:14:51,600
supply chain is like very complicated
390
00:14:48,639 –> 00:14:53,240
you know there’s there’s logic Fab
391
00:14:51,600 –> 00:14:55,120
capacity there’s how much hbm the world
392
00:14:53,240 –> 00:14:56,639
can make there’s how quickly you can
393
00:14:55,120 –> 00:14:57,839
like get permits and pour the concrete
394
00:14:56,639 –> 00:14:59,399
make the data centers and then have
395
00:14:57,839 –> 00:15:00,839
people in there wiring them all up
396
00:14:59,399 –> 00:15:03,399
there’s finding the energy which is a
397
00:15:00,839 –> 00:15:06,079
huge bottleneck but
398
00:15:03,399 –> 00:15:08,800
uh I think when there’s this much value
399
00:15:06,079 –> 00:15:11,920
to people uh the world will do its thing
400
00:15:08,800 –> 00:15:14,320
we’ll try to help it happen faster um
401
00:15:11,920 –> 00:15:15,600
and there’s probably like I don’t know
402
00:15:14,320 –> 00:15:18,800
how to give it a number but there’s like
403
00:15:15,600 –> 00:15:20,240
some percentage chance where there is as
404
00:15:18,800 –> 00:15:22,000
you were saying like a huge substrate
405
00:15:20,240 –> 00:15:23,519
breakthrough and we have like a
406
00:15:22,000 –> 00:15:26,040
massively more efficient way to do
407
00:15:23,519 –> 00:15:27,440
computing but I don't I don't like bank
408
00:15:26,040 –> 00:15:30,920
on that or spend too much time thinking
409
00:15:27,440 –> 00:15:33,920
about it what about the device side and
410
00:15:30,920 –> 00:15:35,759
sort of you mentioned sort of the models
411
00:15:33,920 –> 00:15:37,959
that can fit on a phone so obviously
412
00:15:35,759 –> 00:15:39,279
whether that’s an llm or some slm or
413
00:15:37,959 –> 00:15:41,040
something I’m sure you’re thinking about
414
00:15:39,279 –> 00:15:42,839
that but then does the device itself
415
00:15:41,040 –> 00:15:46,759
change I mean is it does it need to be
416
00:15:42,839 –> 00:15:46,759
as expensive as an iPhone
417
00:15:47,120 –> 00:15:52,959
uh I’m super interested in this uh I I
418
00:15:50,440 –> 00:15:55,319
love like great new form factors of
419
00:15:52,959 –> 00:15:57,399
computing and it feels like with every
420
00:15:55,319 –> 00:16:00,199
major technological advance a new thing
421
00:15:57,399 –> 00:16:03,079
becomes possible
422
00:16:00,199 –> 00:16:04,839
uh phones are unbelievably good so I
423
00:16:03,079 –> 00:16:07,680
think the threshold is like very high
424
00:16:04,839 –> 00:16:10,560
here like what like I think I think like
425
00:16:07,680 –> 00:16:13,040
I personally think an iPhone is like the
426
00:16:10,560 –> 00:16:14,720
greatest piece of technology humanity
427
00:16:13,040 –> 00:16:17,120
has ever made it’s really a wonderful
428
00:16:14,720 –> 00:16:18,600
product what comes after it like I don’t
429
00:16:17,120 –> 00:16:20,079
know I mean I was gonna that was what I
430
00:16:18,600 –> 00:16:23,199
was saying it’s so good that to get
431
00:16:20,079 –> 00:16:24,759
beyond it I think the bar is like quite
432
00:16:23,199 –> 00:16:26,480
High well you’ve been you’ve been
433
00:16:24,759 –> 00:16:29,279
working with Jony Ive on on something
434
00:16:26,480 –> 00:16:32,000
right we’ve been discussing ideas but uh
435
00:16:29,279 –> 00:16:33,639
I don’t like if I knew is it that that
436
00:16:32,000 –> 00:16:35,519
it has to be more complicated or
437
00:16:33,639 –> 00:16:37,759
actually just much much cheaper and
438
00:16:35,519 –> 00:16:39,880
simpler well almost everyone's
439
00:16:37,759 –> 00:16:41,959
willing to pay for a phone anyway so if
440
00:16:39,880 –> 00:16:43,959
you could like make a way cheaper device
441
00:16:41,959 –> 00:16:46,560
I think the barrier to carry a second
442
00:16:43,959 –> 00:16:48,680
thing or use a second thing is pretty
443
00:16:46,560 –> 00:16:50,480
high so I don’t think C given that we’re
444
00:16:48,680 –> 00:16:53,399
all willing to pay for phones or most of
445
00:16:50,480 –> 00:16:56,639
us are I don’t think cheaper is the
446
00:16:53,399 –> 00:16:58,279
answer different is the answer then
447
00:16:56,639 –> 00:17:00,040
would there be like a specialized chip
448
00:16:58,279 –> 00:17:02,839
that would run the phone that was really
449
00:17:00,040 –> 00:17:04,640
good at powering a you know a phone-sized
450
00:17:02,839 –> 00:17:06,079
AI model probably but the phone
451
00:17:04,640 –> 00:17:07,360
manufacturers are going to do that for
452
00:17:06,079 –> 00:17:09,160
sure that doesn’t that doesn’t
453
00:17:07,360 –> 00:17:11,480
necessitate a new device I think you’d
454
00:17:09,160 –> 00:17:14,160
have to like find some really different
455
00:17:11,480 –> 00:17:17,760
interaction paradigm that the technology
456
00:17:14,160 –> 00:17:19,799
enables uh and if I knew what it was I
457
00:17:17,760 –> 00:17:21,360
would be excited to be working on it
458
00:17:19,799 –> 00:17:23,039
right now but well you have you have
459
00:17:21,360 –> 00:17:24,760
voice working right now in the app in
460
00:17:23,039 –> 00:17:27,919
fact I set my action button on my phone
461
00:17:24,760 –> 00:17:29,559
to go directly to ChatGPT's voice app
462
00:17:27,919 –> 00:17:31,400
and I use it with my kids and they love
463
00:17:29,559 –> 00:17:32,919
it talking to it it's got latency issues
464
00:17:31,400 –> 00:17:34,320
but it’s really we’ll get we’ll we’ll
465
00:17:32,919 –> 00:17:37,400
get that we’ll get that better and I
466
00:17:34,320 –> 00:17:40,039
think voice is a hint to whatever the
467
00:17:37,400 –> 00:17:42,840
next thing is like if you can get voice
468
00:17:40,039 –> 00:17:44,760
interaction to be really good it
469
00:17:42,840 –> 00:17:46,400
feels I think that feels like a
470
00:17:44,760 –> 00:17:49,280
different way to use a computer but
471
00:17:46,400 –> 00:17:52,080
again with that by the way like what why
472
00:17:49,280 –> 00:17:55,480
is it not responsive and you know it
473
00:17:52,080 –> 00:17:58,240
feels like a CB you know like over over
474
00:17:55,480 –> 00:18:00,039
it’s really annoying to use you know uh
475
00:17:58,240 –> 00:18:01,679
in that way but it’s also brilliant when
476
00:18:00,039 –> 00:18:04,640
it gives you the right answer we are
477
00:18:01,679 –> 00:18:07,039
working on that uh it’s it’s so clunky
478
00:18:04,640 –> 00:18:09,960
right now it’s slow it’s like kind of
479
00:18:07,039 –> 00:18:12,159
doesn’t feel very smooth or authentic or
480
00:18:09,960 –> 00:18:13,000
organic like we’ll get all that to be
481
00:18:12,159 –> 00:18:16,440
much
482
00:18:13,000 –> 00:18:18,159
better what about computer vision I mean
483
00:18:16,440 –> 00:18:19,880
they have glasses or maybe you could
484
00:18:18,159 –> 00:18:21,440
wear a pendant I mean you take the
485
00:18:19,880 –> 00:18:24,960
combination
486
00:18:21,440 –> 00:18:27,600
of visual or video data combine it with
487
00:18:24,960 –> 00:18:29,400
voice and now AI knows everything that's
488
00:18:27,600 –> 00:18:32,280
happening around you super powerful to
489
00:18:29,400 –> 00:18:35,000
be able to like the multimodality of
490
00:18:32,280 –> 00:18:37,000
saying like hey ChatGPT what am I
491
00:18:35,000 –> 00:18:39,400
looking at or like what kind of plant is
492
00:18:37,000 –> 00:18:41,799
this I can’t quite tell
493
00:18:39,400 –> 00:18:44,039
um that’s obvious that that’s like a
494
00:18:41,799 –> 00:18:45,840
that’s another I think like hint but
495
00:18:44,039 –> 00:18:47,360
whether people want to wear glasses or
496
00:18:45,840 –> 00:18:50,679
like hold up something when they want
497
00:18:47,360 –> 00:18:53,720
that like I there’s a bunch of just like
498
00:18:50,679 –> 00:18:56,159
like the the sort of like societal
499
00:18:53,720 –> 00:18:57,720
interpersonal issues here are all very
500
00:18:56,159 –> 00:19:00,360
complicated about wearing a computer on
501
00:18:57,720 –> 00:19:02,159
your face um we we saw that with Google
502
00:19:00,360 –> 00:19:04,080
Glass people got punched in the face in
503
00:19:02,159 –> 00:19:06,000
the Mission started a lot of I forgot
504
00:19:04,080 –> 00:19:07,799
about that I forgot about that so so I I
505
00:19:06,000 –> 00:19:10,320
think it’s
506
00:19:07,799 –> 00:19:13,360
like what are the apps that could be
507
00:19:10,320 –> 00:19:14,760
unlocked if AI was sort of ubiquitous on
508
00:19:13,360 –> 00:19:17,760
people’s
509
00:19:14,760 –> 00:19:21,600
phones do you have a sense of that or
510
00:19:17,760 –> 00:19:24,480
what would you want to see built
511
00:19:21,600 –> 00:19:29,240
uh I I think what I want is just this
512
00:19:24,480 –> 00:19:31,480
always on like super low friction
513
00:19:29,240 –> 00:19:33,919
thing where I
514
00:19:31,480 –> 00:19:36,039
can either by voice or by text or
515
00:19:33,919 –> 00:19:38,080
ideally like some other it just kind of
516
00:19:36,039 –> 00:19:39,640
knows what I want have this like
517
00:19:38,080 –> 00:19:41,559
constant thing helping me throughout my
518
00:19:39,640 –> 00:19:43,440
day that’s got like as much context on
519
00:19:41,559 –> 00:19:45,840
as possible it’s like the world’s
520
00:19:43,440 –> 00:19:48,360
greatest assistant um and it’s just this
521
00:19:45,840 –> 00:19:51,280
like thing working to make me better and
522
00:19:48,360 –> 00:19:52,400
better uh there’s there there’s like a I
523
00:19:51,280 –> 00:19:54,039
know when you hear people like talk
524
00:19:52,400 –> 00:19:56,720
about the AI future they
525
00:19:54,039 –> 00:19:59,200
imagine there’s sort of
526
00:19:56,720 –> 00:20:00,320
two different approaches and they don't sound
527
00:19:59,200 –> 00:20:01,840
that different but I think they’re like
528
00:20:00,320 –> 00:20:04,840
very different for how we’ll design the
529
00:20:01,840 –> 00:20:09,159
system in practice there’s the I want an
530
00:20:04,840 –> 00:20:11,480
extension of myself um I want like a
531
00:20:09,159 –> 00:20:14,919
ghost or an alter ego or this thing that
532
00:20:11,480 –> 00:20:17,640
really like is me is acting on my behalf
533
00:20:14,919 –> 00:20:20,120
is um responding to emails not even
534
00:20:17,640 –> 00:20:24,480
telling me about it is is is sort of
535
00:20:20,120 –> 00:20:25,760
like it it becomes more me and is me and
536
00:20:24,480 –> 00:20:29,200
then there’s this other thing which is
537
00:20:25,760 –> 00:20:31,280
like I want a great senior employee
538
00:20:29,200 –> 00:20:32,919
it may get to know me very well I may
539
00:20:31,280 –> 00:20:34,559
delegate it you know you can like have
540
00:20:32,919 –> 00:20:36,480
access to my email and I’ll tell you the
541
00:20:34,559 –> 00:20:40,720
constraints but but I think of it as
542
00:20:36,480 –> 00:20:42,880
this like separate entity and I
543
00:20:40,720 –> 00:20:44,320
personally like the separate entity
544
00:20:42,880 –> 00:20:48,440
approach better and think that’s where
545
00:20:44,320 –> 00:20:51,280
we’re going to head um and so in that
546
00:20:48,440 –> 00:20:54,440
sense the thing is not you but it’s it’s
547
00:20:51,280 –> 00:20:57,799
like an always available always great
548
00:20:54,440 –> 00:20:59,760
super capable assistant executive agent
549
00:20:57,799 –> 00:21:01,960
in a way like gets out there working on
550
00:20:59,760 –> 00:21:04,120
your behalf and understands what you
551
00:21:01,960 –> 00:21:05,320
want and anticipates what you want is
552
00:21:04,120 –> 00:21:08,120
what I’m reading into what you’re saying
553
00:21:05,320 –> 00:21:10,559
I think there’d be agent like Behavior
554
00:21:08,120 –> 00:21:13,279
but there’s like a difference
555
00:21:10,559 –> 00:21:16,159
between a senior employee and an agent
556
00:21:13,279 –> 00:21:19,840
yeah and like I want it you know I think
557
00:21:16,159 –> 00:21:22,799
of like my I think like of it like one
558
00:21:19,840 –> 00:21:25,799
of the things that I like about a senior
559
00:21:22,799 –> 00:21:25,799
employee is
560
00:21:26,159 –> 00:21:31,279
they’ll they’ll push back on me they
561
00:21:28,960 –> 00:21:32,919
will sometimes not do something I ask or
562
00:21:31,279 –> 00:21:34,720
they sometimes will say like I can do
563
00:21:32,919 –> 00:21:35,880
that thing if you want but if I do it
564
00:21:34,720 –> 00:21:37,159
here’s what I think would happen and
565
00:21:35,880 –> 00:21:38,880
then this and then that and are you
566
00:21:37,159 –> 00:21:41,279
really
567
00:21:38,880 –> 00:21:42,960
sure I definitely want that kind of vibe
568
00:21:41,279 –> 00:21:46,080
which not not just like this thing that
569
00:21:42,960 –> 00:21:47,640
I ask it blindly does it can reason
570
00:21:46,080 –> 00:21:49,279
yeah yeah and push back it can reason it
571
00:21:47,640 –> 00:21:51,880
has like the kind of relationship with
572
00:21:49,279 –> 00:21:54,080
me that I would expect out of a really
573
00:21:51,880 –> 00:21:56,200
competent person that I worked with
574
00:21:54,080 –> 00:21:58,400
which is different from like a sycophant
575
00:21:56,200 –> 00:22:01,799
yeah the thing in that world where if
576
00:21:58,400 –> 00:22:06,080
you had this like Jarvis-like thing that
577
00:22:01,799 –> 00:22:08,240
can reason what do you think it does to
578
00:22:06,080 –> 00:22:10,520
products that you use today where the
579
00:22:08,240 –> 00:22:12,880
interface is very valuable so for
580
00:22:10,520 –> 00:22:14,600
example if you look at an Instacart or
581
00:22:12,880 –> 00:22:16,840
if you look at an Uber or if you look at
582
00:22:14,600 –> 00:22:19,240
a DoorDash these are not services that
583
00:22:16,840 –> 00:22:22,360
are meant to be pipes that are just
584
00:22:19,240 –> 00:22:24,240
providing a set of APIs to a smart set
585
00:22:22,360 –> 00:22:25,880
of agents that ubiquitously work on
586
00:22:24,240 –> 00:22:28,080
behalf of 8 billion
587
00:22:25,880 –> 00:22:29,840
people what do you think has to change
588
00:22:28,080 –> 00:22:31,919
in how we think about how apps need to
589
00:22:29,840 –> 00:22:33,640
work of how this entire infrastructure
590
00:22:31,919 –> 00:22:35,880
of experiences need to work in a world
591
00:22:33,640 –> 00:22:37,279
where you’re agentically interfacing to
592
00:22:35,880 –> 00:22:40,760
the world you I’m actually very
593
00:22:37,279 –> 00:22:44,320
interested in designing a world that is
594
00:22:40,760 –> 00:22:46,559
equally usable by humans and by AIs so I
595
00:22:44,320 –> 00:22:49,480
I
596
00:22:46,559 –> 00:22:51,279
I I I like the interpretability of that
597
00:22:49,480 –> 00:22:52,600
I like the smoothness of the handoffs I
598
00:22:51,279 –> 00:22:55,400
like the ability that we can provide
599
00:22:52,600 –> 00:22:59,480
feedback or whatever so you know Door
600
00:22:55,400 –> 00:23:02,159
Dash could just expose some API to my
601
00:22:59,480 –> 00:23:04,440
future AI assistant and they could go
602
00:23:02,159 –> 00:23:06,120
put the order and whatever or I could
603
00:23:04,440 –> 00:23:08,840
say like I could be holding my phone and
604
00:23:06,120 –> 00:23:11,000
I could say okay AI assistant like you
605
00:23:08,840 –> 00:23:12,799
put in this order on DoorDash please
606
00:23:11,000 –> 00:23:14,240
and I could like watch the app open and
607
00:23:12,799 –> 00:23:17,720
see the thing clicking around and I
608
00:23:14,240 –> 00:23:19,679
could say hey no not this or like um
609
00:23:17,720 –> 00:23:23,039
there there’s something about designing
610
00:23:19,679 –> 00:23:25,679
a world that is
611
00:23:23,039 –> 00:23:28,960
usable equally well by humans and AIs
612
00:23:25,679 –> 00:23:31,919
that I think is an interesting concept
613
00:23:28,960 –> 00:23:33,279
it's why I'm more excited about humanoid robots than sort of
614
00:23:31,919 –> 00:23:34,880
robots of like very other shapes the
615
00:23:33,279 –> 00:23:36,120
world is very much designed for humans
616
00:23:34,880 –> 00:23:39,279
and I think we should absolutely keep it
617
00:23:36,120 –> 00:23:42,679
that way and a shared interface is nice
618
00:23:39,279 –> 00:23:44,480
so you see voice chat that modality kind
619
00:23:42,679 –> 00:23:46,200
of gets rid of apps you just ask it for
620
00:23:44,480 –> 00:23:48,039
sushi it knows sushi you like before it
621
00:23:46,200 –> 00:23:50,840
knows what you don’t like and does its
622
00:23:48,039 –> 00:23:53,200
best shot at doing it I it’s hard for me
623
00:23:50,840 –> 00:23:55,799
to imagine that we just go to a world
624
00:23:53,200 –> 00:23:57,880
totally where you say like hey ChatGPT
625
00:23:55,799 –> 00:23:59,320
order me sushi and it says okay do you
626
00:23:57,880 –> 00:24:02,880
want it from this restaurant what kind
627
00:23:59,320 –> 00:24:05,080
what time whatever I think user I think
628
00:24:02,880 –> 00:24:07,960
visual user interfaces are super good
629
00:24:05,080 –> 00:24:10,000
for a lot of things um and it's hard for
630
00:24:07,960 –> 00:24:15,400
me to imagine like a world
631
00:24:10,000 –> 00:24:18,080
where you'd never look at a screen and
632
00:24:15,400 –> 00:24:20,279
just use voice mode only but I I can’t
633
00:24:18,080 –> 00:24:22,200
imagine that for a lot of things yeah I
634
00:24:20,279 –> 00:24:23,840
mean Apple tried with Siri like you
635
00:24:22,200 –> 00:24:25,240
supposedly you can order an Uber
636
00:24:23,840 –> 00:24:27,919
automatically with Siri I don’t think
637
00:24:25,240 –> 00:24:29,559
anybody’s ever done it because it’s why
638
00:24:27,919 –> 00:24:31,320
would you take the risk of not the
639
00:24:29,559 –> 00:24:33,840
quality to your point the quality is not
640
00:24:31,320 –> 00:24:35,559
good but when the quality is good enough
641
00:24:33,840 –> 00:24:36,799
you’re you’ll actually prefer it just
642
00:24:35,559 –> 00:24:37,960
because it’s just lighter weight you
643
00:24:36,799 –> 00:24:40,039
don’t have to take your phone out you
644
00:24:37,960 –> 00:24:42,159
don’t have to search for your app and
645
00:24:40,039 –> 00:24:44,880
press it and oh it automatically logged
646
00:24:42,159 –> 00:24:46,120
you out oh hold on log back in oh 2FA
647
00:24:44,880 –> 00:24:48,440
it’s a whole pain in the ass you know
648
00:24:46,120 –> 00:24:50,760
it’s like setting a timer with Siri I do
649
00:24:48,440 –> 00:24:52,279
every time because it works really well
650
00:24:50,760 –> 00:24:55,399
and it’s great and I don’t need more
651
00:24:52,279 –> 00:24:57,200
information but ordering an Uber like I
652
00:24:55,399 –> 00:24:58,760
want to see the prices for a few
653
00:24:57,200 –> 00:25:01,240
different options I want to see how far
654
00:24:58,760 –> 00:25:02,399
away it is I want to see like maybe even
655
00:25:01,240 –> 00:25:04,520
where they are on the map because I
656
00:25:02,399 –> 00:25:07,399
might walk somewhere I get a lot more
657
00:25:04,520 –> 00:25:09,360
information I think in less time by
658
00:25:07,399 –> 00:25:10,559
looking at that order the Uber screen
659
00:25:09,360 –> 00:25:12,679
than I would if I had to do that all
660
00:25:10,559 –> 00:25:14,240
through the audio channel I like your
661
00:25:12,679 –> 00:25:16,840
idea of watching it happen that’s kind
662
00:25:14,240 –> 00:25:18,080
of cool I think there will just be like
663
00:25:16,840 –> 00:25:20,120
yeah
664
00:25:18,080 –> 00:25:21,720
different there are different interfaces
665
00:25:20,120 –> 00:25:23,559
we use for different tasks and I think
666
00:25:21,720 –> 00:25:26,760
that’ll keep going of all the developers
667
00:25:23,559 –> 00:25:29,080
that are building apps and experiences
668
00:25:26,760 –> 00:25:30,480
on OpenAI are there a few that stand out
669
00:25:29,080 –> 00:25:32,840
for you where you’re like okay this is
670
00:25:30,480 –> 00:25:34,640
directionally going in a super
671
00:25:32,840 –> 00:25:36,480
interesting area even if it’s like a toy
672
00:25:34,640 –> 00:25:39,720
app but are there things that you guys
673
00:25:36,480 –> 00:25:44,000
point to and say this is really
674
00:25:39,720 –> 00:25:45,640
important um I met with a new company
675
00:25:44,000 –> 00:25:46,880
this morning or barely even a company
676
00:25:45,640 –> 00:25:48,880
it’s like two people that are going to
677
00:25:46,880 –> 00:25:52,000
work on a summer project trying to
678
00:25:48,880 –> 00:25:53,399
actually finally make the AI tutor like
679
00:25:52,000 –> 00:25:54,960
and I’ve always been interested in this
680
00:25:53,399 –> 00:25:57,240
space a lot of people have done great
681
00:25:54,960 –> 00:25:58,760
stuff on our platform but if if someone
682
00:25:57,240 –> 00:26:03,480
can deliver
683
00:25:58,760 –> 00:26:05,679
like the way that you actually
684
00:26:03,480 –> 00:26:07,159
like they used a phrase I love which
685
00:26:05,679 –> 00:26:08,880
is this is going to be like a Montessori
686
00:26:07,159 –> 00:26:11,320
level reinvention for how people how
687
00:26:08,880 –> 00:26:12,640
people learn things yeah um but if you
688
00:26:11,320 –> 00:26:14,559
can like find this new way to like let
689
00:26:12,640 –> 00:26:17,360
people explore and learn in new ways on
690
00:26:14,559 –> 00:26:20,440
their own I’m personally super excited
691
00:26:17,360 –> 00:26:21,640
about that um a lot of the coding
692
00:26:20,440 –> 00:26:23,720
related stuff you mentioned Devin
693
00:26:21,640 –> 00:26:26,159
earlier I think that’s like a super cool
694
00:26:23,720 –> 00:26:28,960
vision of the future the thing that I am
695
00:26:26,159 –> 00:26:32,440
healthcare I I believe
696
00:26:28,960 –> 00:26:34,000
should be pretty transformed by this but
697
00:26:32,440 –> 00:26:38,000
the thing I’m personally most excited
698
00:26:34,000 –> 00:26:40,919
about is the sort of doing faster and
699
00:26:38,000 –> 00:26:42,640
better scientific discovery GPT-4 clearly
700
00:26:40,919 –> 00:26:44,000
not there in a big way although maybe it
701
00:26:42,640 –> 00:26:45,720
accelerates things a little bit by
702
00:26:44,000 –> 00:26:50,600
making scientists more
703
00:26:45,720 –> 00:26:54,480
productive but AlphaFold 3 yeah that's like
704
00:26:50,600 –> 00:26:58,520
but Sam that will be a triumph those are
705
00:26:54,480 –> 00:27:01,640
not like these these models are
706
00:26:58,520 –> 00:27:04,320
trained and built differently than the
707
00:27:01,640 –> 00:27:06,320
language models I mean to some obviously
708
00:27:04,320 –> 00:27:08,360
there’s a lot that’s similar but there’s
709
00:27:06,320 –> 00:27:09,840
a lot um there’s kind of a groundup
710
00:27:08,360 –> 00:27:12,039
architecture to a lot of these models
711
00:27:09,840 –> 00:27:14,840
that are being applied to these specific
712
00:27:12,039 –> 00:27:17,960
problem sets these specific applications
713
00:27:14,840 –> 00:27:20,320
like chemistry interaction modeling for
714
00:27:17,960 –> 00:27:21,960
example does you you’ll need some of
715
00:27:20,320 –> 00:27:24,440
that for sure but the the thing that I
716
00:27:21,960 –> 00:27:25,640
think we’re missing across the board for
717
00:27:24,440 –> 00:27:28,960
many of these things we’ve been talking
718
00:27:25,640 –> 00:27:30,240
about is models that can do reasoning
719
00:27:28,960 –> 00:27:32,880
and once you have reasoning you can
720
00:27:30,240 –> 00:27:34,520
connect it to chemistry simulators or
721
00:27:32,880 –> 00:27:37,279
guess yeah that’s the important question
722
00:27:34,520 –> 00:27:38,880
I wanted to kind of talk about today was
723
00:27:37,279 –> 00:27:41,640
this idea
724
00:27:38,880 –> 00:27:44,320
of networks of models people talk a lot
725
00:27:41,640 –> 00:27:47,519
about agents as if there’s kind of this
726
00:27:44,320 –> 00:27:49,960
linear set of call functions that happen
727
00:27:47,519 –> 00:27:53,880
but one of the things that
728
00:27:49,960 –> 00:27:56,320
arises in biology is networks of systems
729
00:27:53,880 –> 00:27:58,640
that have cross interactions that the
730
00:27:56,320 –> 00:28:00,279
aggregation of the system the
731
00:27:58,640 –> 00:28:02,480
aggregation of the network produces an
732
00:28:00,279 –> 00:28:04,640
output rather than one thing calling
733
00:28:02,480 –> 00:28:06,200
another that thing calling another do we
734
00:28:04,640 –> 00:28:07,720
see like an emergence in this
735
00:28:06,200 –> 00:28:10,000
architecture of either specialized
736
00:28:07,720 –> 00:28:12,559
models or network models that work
737
00:28:10,000 –> 00:28:14,440
together to address bigger problem sets
738
00:28:12,559 –> 00:28:16,279
use reasoning there’s computational
739
00:28:14,440 –> 00:28:17,880
models that do things like chemistry or
740
00:28:16,279 –> 00:28:20,320
arithmetic and there’s other models that
741
00:28:17,880 –> 00:28:23,200
do rather than one model to rule them
742
00:28:20,320 –> 00:28:23,200
all that’s purely
743
00:28:23,919 –> 00:28:29,080
generalized I don’t know um
744
00:28:29,519 –> 00:28:33,919
I don’t know how much
745
00:28:31,799 –> 00:28:36,080
reasoning is going to turn out to be a
746
00:28:33,919 –> 00:28:38,360
super generalizable thing I suspect it
747
00:28:36,080 –> 00:28:40,159
will but that’s more just like an
748
00:28:38,360 –> 00:28:42,360
intuition and a hope and it would be
749
00:28:40,159 –> 00:28:44,880
nice if it worked out that way I I don’t
750
00:28:42,360 –> 00:28:44,880
know if that’s
751
00:28:45,360 –> 00:28:49,799
like but let’s walk through the the
752
00:28:47,960 –> 00:28:52,720
protein modeling
753
00:28:49,799 –> 00:28:55,080
example there’s a bunch of training data
754
00:28:52,720 –> 00:28:57,120
images of proteins and then sequence
755
00:28:55,080 –> 00:28:59,039
data and they build a model predictive
756
00:28:57,120 –> 00:29:01,279
model and they have a set of processes
757
00:28:59,039 –> 00:29:04,080
and steps for doing that do you envision
758
00:29:01,279 –> 00:29:05,559
that there’s this artificial general
759
00:29:04,080 –> 00:29:07,399
intelligence or this great reasoning
760
00:29:05,559 –> 00:29:08,840
model that then figures out how to build
761
00:29:07,399 –> 00:29:10,919
that submodel that figures out how to
762
00:29:08,840 –> 00:29:13,840
solve that problem by acquiring the
763
00:29:10,919 –> 00:29:15,679
necessary data and then resolving it there are
764
00:29:13,840 –> 00:29:17,720
so many ways where that could go like
765
00:29:15,679 –> 00:29:20,519
maybe it is it trains a literal model
766
00:29:17,720 –> 00:29:22,880
for it or maybe it just like knows the
767
00:29:20,519 –> 00:29:24,360
one big model what it can like go pick
768
00:29:22,880 –> 00:29:27,039
what other training data it needs and
769
00:29:24,360 –> 00:29:29,360
ask a question and then update on that
770
00:29:27,039 –> 00:29:31,080
um I guess the real question is are all
771
00:29:29,360 –> 00:29:32,480
these startups going to die because so
772
00:29:31,080 –> 00:29:34,720
many startups are working in that
773
00:29:32,480 –> 00:29:36,279
modality which is go get special data
774
00:29:34,720 –> 00:29:38,279
and then train a new model on that
775
00:29:36,279 –> 00:29:40,600
special data from the ground up and then
776
00:29:38,279 –> 00:29:42,080
it only does that one sort of thing and
777
00:29:40,600 –> 00:29:43,240
it works really well at that one thing
778
00:29:42,080 –> 00:29:44,320
and it works better than anything else
779
00:29:43,240 –> 00:29:49,000
at that one thing you know there there’s
780
00:29:44,320 –> 00:29:51,840
like a version of this I think you can
781
00:29:49,000 –> 00:29:53,600
like already see when you were when you
782
00:29:51,840 –> 00:29:54,840
were talking about like biology and
783
00:29:53,600 –> 00:29:56,720
these complicated networks of systems
784
00:29:54,840 –> 00:30:00,039
the reason I was smiling I I got super
785
00:29:56,720 –> 00:30:02,039
sick recently and I’m mostly better now
786
00:30:00,039 –> 00:30:04,000
but it was just like body like got beat
787
00:30:02,039 –> 00:30:05,880
up like one system at a time like you
788
00:30:04,000 –> 00:30:10,039
can really tell like okay it’s this
789
00:30:05,880 –> 00:30:11,080
cascading thing and uh and that reminded me
790
00:30:10,039 –> 00:30:12,960
of you like talking about the like
791
00:30:11,080 –> 00:30:14,399
biology is just these like you have no
792
00:30:12,960 –> 00:30:15,760
idea how much these systems interact
793
00:30:14,399 –> 00:30:17,360
with each other until things start going
794
00:30:15,760 –> 00:30:21,159
wrong and that was sort of like
795
00:30:17,360 –> 00:30:24,200
interesting to see but I was using
796
00:30:21,159 –> 00:30:26,399
um I was like using ChatGPT uh to try
797
00:30:24,200 –> 00:30:28,440
to like figure out like what was
798
00:30:26,399 –> 00:30:29,960
happening whatever and and it would say
799
00:30:28,440 –> 00:30:31,880
well I’m you know unsure of this one
800
00:30:29,960 –> 00:30:33,960
thing and then I just like pasted a
801
00:30:31,880 –> 00:30:36,240
paper on it without even reading the
802
00:30:33,960 –> 00:30:37,679
paper um like in the context and it says
803
00:30:36,240 –> 00:30:40,320
oh that was the thing I was unsure of
804
00:30:37,679 –> 00:30:41,840
like now I think this instead so there’s
805
00:30:40,320 –> 00:30:44,559
like a that was like a small version of
806
00:30:41,840 –> 00:30:46,760
what you’re talking about where you can
807
00:30:44,559 –> 00:30:47,919
like can say this I don’t I don’t know
808
00:30:46,760 –> 00:30:48,919
this thing you can put more information
809
00:30:47,919 –> 00:30:51,240
you don’t retrain the model you’re just
810
00:30:48,919 –> 00:30:53,080
adding it to the context here and now
811
00:30:51,240 –> 00:30:55,720
you’re getting a so these models that
812
00:30:53,080 –> 00:30:57,559
are predicting protein structure like
813
00:30:55,720 –> 00:31:01,360
let’s say right this is the whole basis
814
00:30:57,559 –> 00:31:05,080
and now other molecules with AlphaFold
815
00:31:01,360 –> 00:31:07,000
3 they can yeah I mean is it
816
00:31:05,080 –> 00:31:09,399
basically a world where the best
817
00:31:07,000 –> 00:31:11,240
generalized model goes in and gets that
818
00:31:09,399 –> 00:31:13,799
training data and then figures out on
819
00:31:11,240 –> 00:31:15,360
its own and maybe you could maybe you
820
00:31:13,799 –> 00:31:17,440
could use an example for us can you tell
821
00:31:15,360 –> 00:31:20,480
us about Sora your video model that
822
00:31:17,440 –> 00:31:22,919
generates amazing moving images moving
823
00:31:20,480 –> 00:31:24,480
video and and what’s different about the
824
00:31:22,919 –> 00:31:27,120
architecture there whatever you’re
825
00:31:24,480 –> 00:31:28,480
willing to share on how how that is
826
00:31:27,120 –> 00:31:33,760
different
827
00:31:28,480 –> 00:31:33,760
yeah so my on the general thing first
828
00:31:34,279 –> 00:31:42,240
my you clearly will need
829
00:31:38,840 –> 00:31:45,720
specialized simulators connectors pieces
830
00:31:42,240 –> 00:31:47,559
of data whatever but my
831
00:31:45,720 –> 00:31:49,000
intuition and again I don’t have this
832
00:31:47,559 –> 00:31:51,120
like backed up with science my intuition
833
00:31:49,000 –> 00:31:53,760
would be if we can figure out the core
834
00:31:51,120 –> 00:31:56,480
of generalized reasoning connecting that
835
00:31:53,760 –> 00:31:59,000
to new problem domains in the same way
836
00:31:56,480 –> 00:32:02,360
that humans are generalized reasoners
837
00:31:59,000 –> 00:32:05,360
would I think be doable it's like
838
00:32:02,360 –> 00:32:09,440
a fast unlock faster unlock than I think
839
00:32:05,360 –> 00:32:09,440
I I think so
840
00:32:10,000 –> 00:32:15,559
um but yeah you Sora like does not start
841
00:32:13,200 –> 00:32:17,360
with a language model um it’s that
842
00:32:15,559 –> 00:32:21,880
that’s a model that is like customized
843
00:32:17,360 –> 00:32:24,600
to do video uh and and so like we’re
844
00:32:21,880 –> 00:32:26,559
clearly not at that world yet right so
845
00:32:24,600 –> 00:32:28,399
you guys so just as an example for you
846
00:32:26,559 –> 00:32:30,480
guys to build a good
847
00:32:28,399 –> 00:32:33,159
video model you built it from scratch
848
00:32:30,480 –> 00:32:36,120
using I’m assuming some different
849
00:32:33,159 –> 00:32:39,200
architecture and different data but in
850
00:32:36,120 –> 00:32:42,279
the future the generalized reasoning
851
00:32:39,200 –> 00:32:44,159
system the the AGI whatever system
852
00:32:42,279 –> 00:32:46,760
theoretically could render that by
853
00:32:44,159 –> 00:32:48,880
figuring out how to do it yeah I mean
854
00:32:46,760 –> 00:32:51,080
one example of this is like okay you
855
00:32:48,880 –> 00:32:52,360
know as far as I know all the best text
856
00:32:51,080 –> 00:32:54,760
models in the world are still
857
00:32:52,360 –> 00:32:56,440
autoregressive models and the best image and
858
00:32:54,760 –> 00:32:59,919
video models are diffusion models and
859
00:32:56,440 –> 00:33:04,000
that’s like sort strange in some sense
860
00:32:59,919 –> 00:33:06,240
yeah yeah so there’s a big debate about
861
00:33:04,000 –> 00:33:08,320
uh training data you guys have been I
862
00:33:06,240 –> 00:33:11,919
think the most thoughtful of any company
863
00:33:08,320 –> 00:33:14,519
you’ve got licensing deals now ft Etc
864
00:33:11,919 –> 00:33:16,039
and we've got to be gentle here because
865
00:33:14,519 –> 00:33:17,720
you’re involved in a New York Times
866
00:33:16,039 –> 00:33:19,639
lawsuit you weren’t able to settle I
867
00:33:17,720 –> 00:33:23,000
guess an arrangement with them for
868
00:33:19,639 –> 00:33:24,919
training data how do you think about
869
00:33:23,000 –> 00:33:27,399
fairness in fair
870
00:33:24,919 –> 00:33:28,559
use we’ve had big debates here on the
871
00:33:27,399 –> 00:33:30,880
pod
872
00:33:28,559 –> 00:33:32,639
obviously your actions are you know
873
00:33:30,880 –> 00:33:35,039
speak volumes that you’re trying to be
874
00:33:32,639 –> 00:33:37,919
fair by doing licensing deals so what
875
00:33:35,039 –> 00:33:40,880
what’s your personal position on the
876
00:33:37,919 –> 00:33:45,039
rights of artists who create beautiful
877
00:33:40,880 –> 00:33:46,799
music lyrics books and you taking that
878
00:33:45,039 –> 00:33:49,600
and then making a derivative product out
879
00:33:46,799 –> 00:33:52,000
of it and and then monetizing it and and
880
00:33:49,600 –> 00:33:55,200
what’s fair here and how do we get to a
881
00:33:52,000 –> 00:33:57,240
world where you know artists can make
882
00:33:55,200 –> 00:33:59,000
content in the world and then decide
883
00:33:57,240 –> 00:34:01,480
what they want other people to do with
884
00:33:59,000 –> 00:34:02,600
it yeah and and I’m just curious your
885
00:34:01,480 –> 00:34:04,399
personal belief because I know you to be
886
00:34:02,600 –> 00:34:07,080
a thoughtful person on this and I know a
887
00:34:04,399 –> 00:34:08,879
lot of other people in our industry are
888
00:34:07,080 –> 00:34:10,079
not very thoughtful about how they think
889
00:34:08,879 –> 00:34:12,639
about content
890
00:34:10,079 –> 00:34:14,399
creators so I think it’s very different
891
00:34:12,639 –> 00:34:16,359
for different kinds of I mean look on
892
00:34:14,399 –> 00:34:18,760
fair use I think we have a very
893
00:34:16,359 –> 00:34:22,040
reasonable position under the current
894
00:34:18,760 –> 00:34:23,520
law but I think AI is so different that
895
00:34:22,040 –> 00:34:25,760
for things like art we’ll need to think
896
00:34:23,520 –> 00:34:29,800
about them in different ways but I would
897
00:34:25,760 –> 00:34:34,040
say if you go read a bunch of math on
898
00:34:29,800 –> 00:34:37,399
the internet and learn how to do math
899
00:34:34,040 –> 00:34:38,960
that I think seems unobjectionable to
900
00:34:37,399 –> 00:34:40,240
most people and then there’s like you
901
00:34:38,960 –> 00:34:41,639
know another set of people who might
902
00:34:40,240 –> 00:34:43,839
have a different opinion well what if
903
00:34:41,639 –> 00:34:43,839
you
904
00:34:45,679 –> 00:34:48,760
like actually let me not get into that
905
00:34:47,639 –> 00:34:50,040
just in the interest of not making this
906
00:34:48,760 –> 00:34:51,560
answer too long so I think there’s like
907
00:34:50,040 –> 00:34:54,359
one category people are like okay
908
00:34:51,560 –> 00:34:56,679
there’s like generalized human knowledge
909
00:34:54,359 –> 00:34:58,160
you can kind of like go if you learn
910
00:34:56,679 –> 00:34:59,760
that like that’s
911
00:34:58,160 –> 00:35:01,240
that that’s like open domain or
912
00:34:59,760 –> 00:35:04,720
something if you kind of go learn about
913
00:35:01,240 –> 00:35:06,000
the Pythagorean theorem um that’s one
914
00:35:04,720 –> 00:35:10,160
end of the spectrum and then I think the
915
00:35:06,000 –> 00:35:14,200
other extreme end of the spectrum is um
916
00:35:10,160 –> 00:35:16,359
is Art and maybe even like more than
917
00:35:14,200 –> 00:35:19,280
more specifically I would say it’s like
918
00:35:16,359 –> 00:35:21,720
doing it’s a system generating art in
919
00:35:19,280 –> 00:35:24,480
the style or the likeness of another
920
00:35:21,720 –> 00:35:25,520
artist um would be kind of the furthest
921
00:35:24,480 –> 00:35:28,280
end of
922
00:35:25,520 –> 00:35:31,000
that and then there’s many many cases on
923
00:35:28,280 –> 00:35:33,760
the Spectrum in between
924
00:35:31,000 –> 00:35:36,320
uh I think the conversation has been
925
00:35:33,760 –> 00:35:38,520
historically very caught up on training
926
00:35:36,320 –> 00:35:40,079
data but it will increasingly become
927
00:35:38,520 –> 00:35:43,640
more about what happens at inference
928
00:35:40,079 –> 00:35:48,800
time as training data
929
00:35:43,640 –> 00:35:52,240
becomes less valuable and what the
930
00:35:48,800 –> 00:35:55,880
system does accessing you know
931
00:35:52,240 –> 00:35:58,960
information in context in real time
932
00:35:55,880 –> 00:36:00,400
or uh you know taking like
933
00:35:58,960 –> 00:36:02,560
something like that what happens at
934
00:36:00,400 –> 00:36:04,599
inference time will become more debated
935
00:36:02,560 –> 00:36:08,200
and and how the what the new economic
936
00:36:04,599 –> 00:36:11,119
model is there so if you say like
937
00:36:08,200 –> 00:36:13,920
uh if you say like create me a song in
938
00:36:11,119 –> 00:36:16,319
this in the style of Taylor
939
00:36:13,920 –> 00:36:19,119
Swift even if the model were never
940
00:36:16,319 –> 00:36:21,119
trained on any Taylor Swift songs at all
941
00:36:19,119 –> 00:36:22,520
you can still have a problem which is it
942
00:36:21,119 –> 00:36:24,319
may have read about Taylor Swift it may
943
00:36:22,520 –> 00:36:26,800
know about her themes Taylor Swift means
944
00:36:24,319 –> 00:36:28,960
something and then and then the question
945
00:36:26,800 –> 00:36:30,440
is like that model even if it were never
946
00:36:28,960 –> 00:36:33,520
trained on any Taylor Swift song
947
00:36:30,440 –> 00:36:38,119
whatsoever be allowed to do that and if
948
00:36:33,520 –> 00:36:40,040
so um how should Taylor get paid right
949
00:36:38,119 –> 00:36:41,440
so I think there’s an optin opt out in
950
00:36:40,040 –> 00:36:43,800
that case first of all and then there’s
951
00:36:41,440 –> 00:36:45,800
an economic model um staying on the
952
00:36:43,800 –> 00:36:48,520
music example there is something
953
00:36:45,800 –> 00:36:50,640
interesting to look at from the
954
00:36:48,520 –> 00:36:52,119
historical perspective here which is uh
955
00:36:50,640 –> 00:36:54,240
sampling and how the economics around
956
00:36:52,119 –> 00:36:55,640
that work this is not quite the same
957
00:36:54,240 –> 00:36:56,800
thing but it’s like an interesting place
958
00:36:55,640 –> 00:36:59,240
to start looking Sam let me just
959
00:36:56,800 –> 00:37:01,240
challenge that what’s the difference in
960
00:36:59,240 –> 00:37:03,839
the example you’re giving of the model
961
00:37:01,240 –> 00:37:06,839
learning about things like song
962
00:37:03,839 –> 00:37:09,400
structure, tempo, melody, harmony,
963
00:37:06,839 –> 00:37:10,880
relationships all the discovering all
964
00:37:09,400 –> 00:37:13,319
the underlying structure that makes
965
00:37:10,880 –> 00:37:16,440
music successful and then building new
966
00:37:13,319 –> 00:37:19,359
music using training data and what a
967
00:37:16,440 –> 00:37:21,400
human does that listens to lots of music
968
00:37:19,359 –> 00:37:23,160
learns about and and their brain is
969
00:37:21,400 –> 00:37:25,440
processing and building all those same
970
00:37:23,160 –> 00:37:28,560
sort of predictive models or those same
971
00:37:25,440 –> 00:37:30,839
sort of uh discoveries understandings
972
00:37:28,560 –> 00:37:33,200
what’s the difference here and why why
973
00:37:30,839 –> 00:37:36,280
are you making the case that perhaps
974
00:37:33,200 –> 00:37:37,960
artists should be uniquely paid this is
975
00:37:36,280 –> 00:37:39,640
not a sampling situation you’re not the
976
00:37:37,960 –> 00:37:41,760
AI is not outputting and it’s not
977
00:37:39,640 –> 00:37:44,520
storing in the model the actual original
978
00:37:41,760 –> 00:37:46,400
song it’s learning structure right I
979
00:37:44,520 –> 00:37:47,760
wasn’t trying to make that that point
980
00:37:46,400 –> 00:37:50,079
because I agree like in the same way
981
00:37:47,760 –> 00:37:51,960
that humans are inspired by other humans
982
00:37:50,079 –> 00:37:54,280
I was saying if you if you say generate
983
00:37:51,960 –> 00:37:56,760
me a song in the style of Taylor Swift I
984
00:37:54,280 –> 00:37:59,359
see right okay where the prompt
985
00:37:56,760 –> 00:38:01,119
leverages some artist I I think
986
00:37:59,359 –> 00:38:03,680
personally that’s a different case would
987
00:38:01,119 –> 00:38:05,960
you be comfortable asking or would you
988
00:38:03,680 –> 00:38:08,119
be comfortable letting the model train
989
00:38:05,960 –> 00:38:09,839
itself well a music model being trained
990
00:38:08,119 –> 00:38:12,960
on the whole Corpus of music that humans
991
00:38:09,839 –> 00:38:15,599
have created without royalties being
992
00:38:12,960 –> 00:38:17,359
paid to the artists that um that music
993
00:38:15,599 –> 00:38:19,280
is being fed in and then you’re not
994
00:38:17,359 –> 00:38:20,760
allowed to ask you know artists specific
995
00:38:19,280 –> 00:38:23,040
prompts you could just say hey
996
00:38:20,760 –> 00:38:25,359
play me a really cool pop song that's
997
00:38:23,040 –> 00:38:27,760
fairly modern about heartbreak you know
998
00:38:25,359 –> 00:38:30,240
with a female voice you know we have
999
00:38:27,760 –> 00:38:32,760
currently made the decision not to do
1000
00:38:30,240 –> 00:38:34,200
music and partly because exactly these
1001
00:38:32,760 –> 00:38:37,079
questions of where you draw the lines
1002
00:38:34,200 –> 00:38:39,720
and you know what like
1003
00:38:37,079 –> 00:38:41,200
even I was meeting with several
1004
00:38:39,720 –> 00:38:42,319
musicians I really admire recently I was
1005
00:38:41,200 –> 00:38:46,960
just trying to like talk about some of
1006
00:38:42,319 –> 00:38:48,760
these edge cases but even the world in
1007
00:38:46,960 –> 00:38:53,480
which if
1008
00:38:48,760 –> 00:38:55,240
we went and let’s say we paid 10,000
1009
00:38:53,480 –> 00:38:57,040
musicians to create a bunch of music
1010
00:38:55,240 –> 00:38:58,240
just to make a great training set where
1011
00:38:57,040 –> 00:39:03,079
the music model could learn
1012
00:38:58,240 –> 00:39:06,640
everything about song structure um
1013
00:39:03,079 –> 00:39:09,040
and what makes a good catchy beat and
1014
00:39:06,640 –> 00:39:10,040
everything else um and only trained on
1015
00:39:09,040 –> 00:39:11,720
that let’s say we could still make a
1016
00:39:10,040 –> 00:39:14,160
great music model which maybe maybe we
1017
00:39:11,720 –> 00:39:15,520
could um you know I was kind of like
1018
00:39:14,160 –> 00:39:17,160
posing that as a thought experiment to
1019
00:39:15,520 –> 00:39:18,920
musicians and they’re like well I can’t
1020
00:39:17,160 –> 00:39:21,160
object to that on any principled basis at
1021
00:39:18,920 –> 00:39:22,400
that point um and yet there’s still
1022
00:39:21,160 –> 00:39:25,240
something I don’t like about it now
1023
00:39:22,400 –> 00:39:28,800
that’s not a reason not to do it um
1024
00:39:25,240 –> 00:39:30,920
necessarily but it is
1025
00:39:28,800 –> 00:39:32,240
did you see that ad that Apple put out
1026
00:39:30,920 –> 00:39:34,280
maybe it was yesterday or something of
1027
00:39:32,240 –> 00:39:35,960
like squishing all of human creativity
1028
00:39:34,280 –> 00:39:37,319
down into one really thin iPad what was
1029
00:39:35,960 –> 00:39:40,480
your take on
1030
00:39:37,319 –> 00:39:42,880
it uh people got really emotional about
1031
00:39:40,480 –> 00:39:46,800
it yeah it’s stronger reaction than you
1032
00:39:42,880 –> 00:39:46,800
would think yeah there’s something
1033
00:39:47,800 –> 00:39:53,079
about I’m obviously hugely positive on
1034
00:39:50,359 –> 00:39:55,319
AI but there is something that I think
1035
00:39:53,079 –> 00:39:57,760
is beautiful about human creativity and
1036
00:39:55,319 –> 00:39:59,240
human artistic expression and and you
1037
00:39:57,760 –> 00:40:01,400
know for an AI that just does better
1038
00:39:59,240 –> 00:40:03,319
science like great bring that on but an
1039
00:40:01,400 –> 00:40:05,920
AI that is going to do this like deeply
1040
00:40:03,319 –> 00:40:08,839
beautiful human creative expression I
1041
00:40:05,920 –> 00:40:10,760
think we should like figure out it’s
1042
00:40:08,839 –> 00:40:12,200
going to happen it’s going to be a tool
1043
00:40:10,760 –> 00:40:13,560
that will lead us to Greater creative
1044
00:40:12,200 –> 00:40:15,319
Heights but I think we should figure out
1045
00:40:13,560 –> 00:40:17,240
how to do it in a way that like
1046
00:40:15,319 –> 00:40:20,760
preserves the spirit of what we all care
1047
00:40:17,240 –> 00:40:25,079
about here and I I think your actions
1048
00:40:20,760 –> 00:40:27,839
speak loudly we were trying to do Star
1049
00:40:25,079 –> 00:40:30,040
Wars characters in DALL-E
1050
00:40:27,839 –> 00:40:31,920
and if you ask for Darth Vader it says
1051
00:40:30,040 –> 00:40:34,200
hey we can’t do that so you’ve I guess
1052
00:40:31,920 –> 00:40:36,200
red-teamed or whatever you call it
1053
00:40:34,200 –> 00:40:38,560
internally yeah you’re not allowing
1054
00:40:36,200 –> 00:40:40,000
people to use other people’s IP so
1055
00:40:38,560 –> 00:40:42,480
you’ve taken that decision now if you
1056
00:40:40,000 –> 00:40:45,040
asked it to make a Jedi Bulldog or a
1057
00:40:42,480 –> 00:40:47,520
Sith Lord Bulldog which I did it made my
1058
00:40:45,040 –> 00:40:49,359
Bulldogs a Sith bulldogs so there’s an
1059
00:40:47,520 –> 00:40:50,760
interesting question about your spectrum
1060
00:40:49,359 –> 00:40:53,079
right yeah you know we put out this
1061
00:40:50,760 –> 00:40:56,000
thing yesterday called the spec um where
1062
00:40:53,079 –> 00:40:58,240
we’re trying to say here are here’s
1063
00:40:56,000 –> 00:41:01,400
here’s how our model is supposed to
1064
00:40:58,240 –> 00:41:03,200
behave and it’s very hard it’s a long
1065
00:41:01,400 –> 00:41:05,079
document it’s very hard to like specify
1066
00:41:03,200 –> 00:41:07,280
exactly in each case where the limit
1067
00:41:05,079 –> 00:41:08,839
should be and I view this as like a
1068
00:41:07,280 –> 00:41:12,720
discussion that’s going to need a lot
1069
00:41:08,839 –> 00:41:14,280
more input um but but these sorts of
1070
00:41:12,720 –> 00:41:16,599
questions
1071
00:41:14,280 –> 00:41:18,680
about okay maybe it shouldn’t generate
1072
00:41:16,599 –> 00:41:20,880
Darth Vader but the idea of a Sith Lord
1073
00:41:18,680 –> 00:41:22,920
or a Sith style thing or Jedi at this
1074
00:41:20,880 –> 00:41:25,040
point is like part of the culture like
1075
00:41:22,920 –> 00:41:27,079
like these are these are all hard
1076
00:41:25,040 –> 00:41:29,880
decisions yeah and and I think you’re
1077
00:41:27,079 –> 00:41:31,920
right the music industry is going to
1078
00:41:29,880 –> 00:41:33,680
consider this opportunity to make Taylor
1079
00:41:31,920 –> 00:41:36,200
Swift songs their opportunity it’s part
1080
00:41:33,680 –> 00:41:39,240
of the four-part fair use test is you
1081
00:41:36,200 –> 00:41:41,119
know these who gets to capitalize on new
1082
00:41:39,240 –> 00:41:44,240
Innovations for existing art and and
1083
00:41:41,119 –> 00:41:47,040
Disney has an argument that hey you know
1084
00:41:44,240 –> 00:41:49,680
if if you’re GNA make Sora versions of a
1085
00:41:47,040 –> 00:41:51,599
Ahsoka or whatever Obi-Wan Kenobi that's
1086
00:41:49,680 –> 00:41:53,319
Disney’s opportunity and that’s a great
1087
00:41:51,599 –> 00:41:56,599
partnership for
1088
00:41:53,319 –> 00:41:58,599
you you know to pursue so we’re I think
1089
00:41:56,599 –> 00:42:01,880
this section I would label as AI in the
1090
00:41:58,599 –> 00:42:04,880
law so let me ask maybe a higher level
1091
00:42:01,880 –> 00:42:07,920
question what does it mean when people
1092
00:42:04,880 –> 00:42:09,920
say regulate AI totally Sam what does it
1093
00:42:07,920 –> 00:42:12,920
what does that even mean and comment on
1094
00:42:09,920 –> 00:42:15,720
California’s new proposed regulations as
1095
00:42:12,920 –> 00:42:17,160
well a few if you’re up for it uh I’m
1096
00:42:15,720 –> 00:42:18,640
concerned I mean there’s so many
1097
00:42:17,160 –> 00:42:20,200
proposed regulations but most of the
1098
00:42:18,640 –> 00:42:22,400
ones I’ve seen on the California state
1099
00:42:20,200 –> 00:42:24,440
things I’m concerned about I also have a
1100
00:42:22,400 –> 00:42:27,240
general fear of the states all doing
1101
00:42:24,440 –> 00:42:30,000
this them themselves um when people say
1102
00:42:27,240 –> 00:42:31,400
regulate AI I don’t think they mean one
1103
00:42:30,000 –> 00:42:33,640
thing I think there’s like some people
1104
00:42:31,400 –> 00:42:35,040
are like ban the whole thing some people
1105
00:42:33,640 –> 00:42:37,559
like don’t allow it to be open source
1106
00:42:35,040 –> 00:42:40,000
or require it to be open source um the thing
1107
00:42:37,559 –> 00:42:43,040
that I am personally most interested in
1108
00:42:40,000 –> 00:42:45,359
is I think there will
1109
00:42:43,040 –> 00:42:46,720
come look I may be wrong about this I
1110
00:42:45,359 –> 00:42:48,040
will acknowledge that this is a
1111
00:42:46,720 –> 00:42:49,880
forward-looking statement and those are
1112
00:42:48,040 –> 00:42:52,040
always dangerous to make but I think
1113
00:42:49,880 –> 00:42:53,680
there will come a time in the not super
1114
00:42:52,040 –> 00:42:55,040
distant future like you know we’re not
1115
00:42:53,680 –> 00:42:58,880
talking like decades and decades from
1116
00:42:55,040 –> 00:43:02,040
now where AI say the frontier AI systems
1117
00:42:58,880 –> 00:43:06,440
are capable of causing
1118
00:43:02,040 –> 00:43:08,480
significant Global harm and for those
1119
00:43:06,440 –> 00:43:11,760
kinds of systems in the same way we have
1120
00:43:08,480 –> 00:43:13,480
like Global oversight of nuclear weapons
1121
00:43:11,760 –> 00:43:15,880
or synthetic bio or things that can
1122
00:43:13,480 –> 00:43:18,400
really like have a very negative impact
1123
00:43:15,880 –> 00:43:20,280
Way Beyond the realm of one country uh I
1124
00:43:18,400 –> 00:43:22,359
would like to see some sort of
1125
00:43:20,280 –> 00:43:24,760
international agency that is looking at
1126
00:43:22,359 –> 00:43:26,839
the most powerful systems and ensuring
1127
00:43:24,760 –> 00:43:28,680
like reasonable safety testing you know
1128
00:43:26,839 –> 00:43:30,400
these things are not going to
1129
00:43:28,680 –> 00:43:32,119
escape and recursively self-improve or
1130
00:43:30,400 –> 00:43:35,160
whatever the
1131
00:43:32,119 –> 00:43:39,359
criticism of this is that you’re you
1132
00:43:35,160 –> 00:43:40,640
have the resources to Cozy up to Lobby
1133
00:43:39,359 –> 00:43:42,200
to be involved and you’ve been very
1134
00:43:40,640 –> 00:43:44,000
involved with politicians and then
1135
00:43:42,200 –> 00:43:46,119
startups which are also passionate about
1136
00:43:44,000 –> 00:43:49,400
and invest in um are not going to have
1137
00:43:46,119 –> 00:43:51,599
the ability to Resource uh and deal with
1138
00:43:49,400 –> 00:43:53,480
this and that this regulatory capture as
1139
00:43:51,599 –> 00:43:55,280
per our friend you know Bill Gurley did
1140
00:43:53,480 –> 00:43:57,480
a great talk last year about it so maybe
1141
00:43:55,280 –> 00:43:59,240
you could address that headon do do you
1142
00:43:57,480 –> 00:44:01,000
if the line were we’re only going to
1143
00:43:59,240 –> 00:44:03,200
look at models that are trained on
1144
00:44:01,000 –> 00:44:04,960
computers that cost more than 10 billion
1145
00:44:03,200 –> 00:44:07,319
or more than 100 billion or whatever
1146
00:44:04,960 –> 00:44:10,119
dollars I’d be fine with that there’d be
1147
00:44:07,319 –> 00:44:11,839
some line that’d be fine and uh I don’t
1148
00:44:10,119 –> 00:44:14,480
think that puts any regulatory burden on
1149
00:44:11,839 –> 00:44:16,319
startups so if you have like the the
1150
00:44:14,480 –> 00:44:18,319
nuclear raw material to make a nuclear
1151
00:44:16,319 –> 00:44:19,800
bomb like there’s a small subset set of
1152
00:44:18,319 –> 00:44:21,599
people who have that therefore you use
1153
00:44:19,800 –> 00:44:23,599
the analogy of like a nuclear
1154
00:44:21,599 –> 00:44:26,079
inspectors kind of situation yeah I
1155
00:44:23,599 –> 00:44:28,720
think that that’s interesting sax you
1156
00:44:26,079 –> 00:44:30,760
have a question well go ahead you had to
1157
00:44:28,720 –> 00:44:33,079
follow can I say one more thing about
1158
00:44:30,760 –> 00:44:34,920
that of course I’d be super nervous
1159
00:44:33,079 –> 00:44:36,920
about regulatory overreach here I think
1160
00:44:34,920 –> 00:44:38,559
we can get this wrong by doing way too
1161
00:44:36,920 –> 00:44:39,760
much I or even a little too much I think
1162
00:44:38,559 –> 00:44:41,280
we can get this wrong by doing not
1163
00:44:39,760 –> 00:44:45,520
enough
1164
00:44:41,280 –> 00:44:49,599
but but I do think part of and
1165
00:44:45,520 –> 00:44:52,119
I and now I mean you know we have seen
1166
00:44:49,599 –> 00:44:56,440
regulatory overstepping or capture just
1167
00:44:52,119 –> 00:44:58,839
get super bad in other areas um and you
1168
00:44:56,440 –> 00:45:00,960
know like also maybe nothing will happen
1169
00:44:58,839 –> 00:45:03,559
but but I think it is part of our duty
1170
00:45:00,960 –> 00:45:06,520
and our mission to like talk about what
1171
00:45:03,559 –> 00:45:08,960
we believe is likely to happen and what
1172
00:45:06,520 –> 00:45:11,760
it takes to get that right the challenge
1173
00:45:08,960 –> 00:45:14,119
Sam is that we have statute that is
1174
00:45:11,760 –> 00:45:17,240
meant to protect people, protect society
1175
00:45:14,119 –> 00:45:20,839
at large what we’re creating however a
1176
00:45:17,240 –> 00:45:24,440
statute that gives the government rights
1177
00:45:20,839 –> 00:45:29,119
to go in and audit code to audit
1178
00:45:24,440 –> 00:45:31,400
business um trade Secrets uh we’ve never
1179
00:45:29,119 –> 00:45:33,079
seen that to this degree before
1180
00:45:31,400 –> 00:45:34,440
basically the California legislation
1181
00:45:33,079 –> 00:45:36,680
that’s proposed and some of the federal
1182
00:45:34,440 –> 00:45:39,079
legislation that’s been proposed
1183
00:45:36,680 –> 00:45:41,720
basically requires the fed the
1184
00:45:39,079 –> 00:45:43,359
government to audit a model to audit
1185
00:45:41,720 –> 00:45:44,880
software to audit and review the
1186
00:45:43,359 –> 00:45:47,640
parameters and the weightings of the
1187
00:45:44,880 –> 00:45:51,440
model and then you need their check mark
1188
00:45:47,640 –> 00:45:55,400
in order to deploy it for commercial or
1189
00:45:51,440 –> 00:45:57,480
public use and for me it just feels like
1190
00:45:55,400 –> 00:46:00,200
we’re trying to
1191
00:45:57,480 –> 00:46:03,160
rein this in through the government agencies
1192
00:46:00,200 –> 00:46:04,559
for fear and because folks have a
1193
00:46:03,160 –> 00:46:06,440
hard time understanding this and are
1194
00:46:04,559 –> 00:46:08,800
scared about the implications of it they
1195
00:46:06,440 –> 00:46:10,119
want to control it and because they want
1196
00:46:08,800 –> 00:46:11,319
and the only way to control it is to say
1197
00:46:10,119 –> 00:46:14,240
give me a right to audit before you can
1198
00:46:11,319 –> 00:46:15,319
release it. Are these people clueless? I
1199
00:46:14,240 –> 00:46:16,680
mean the way that the stuff is
1200
00:46:15,319 –> 00:46:18,200
written you read it you’re like G to
1201
00:46:16,680 –> 00:46:20,160
pull your hair out because as you know
1202
00:46:18,200 –> 00:46:21,520
better than anyone in 12 months none of
1203
00:46:20,160 –> 00:46:23,559
this stuff’s going to make sense anyway
1204
00:46:21,520 –> 00:46:26,760
totally right look the reason I have
1205
00:46:23,559 –> 00:46:28,359
pushed for an agency based approach for
1206
00:46:26,760 –> 00:46:30,760
for for kind of like the big picture
1207
00:46:28,359 –> 00:46:32,800
stuff and not a like write it in laws I
1208
00:46:30,760 –> 00:46:35,520
don’t in 12 months it will all be
1209
00:46:32,800 –> 00:46:38,119
written wrong and I don’t think even if
1210
00:46:35,520 –> 00:46:39,440
these people were like True World
1211
00:46:38,119 –> 00:46:43,160
experts I don’t think they could get it
1212
00:46:39,440 –> 00:46:45,960
right looking at 12 or 24 months um and
1213
00:46:43,160 –> 00:46:47,240
I don’t these policies which is like
1214
00:46:45,960 –> 00:46:48,640
we’re going to look at you know we’re
1215
00:46:47,240 –> 00:46:50,680
going to audit all of your source code
1216
00:46:48,640 –> 00:46:53,119
and like look at all of your weights one
1217
00:46:50,680 –> 00:46:55,640
by one like yeah I think there’s a lot
1218
00:46:53,119 –> 00:46:57,000
of crazy proposals out there um by the
1219
00:46:55,640 –> 00:46:58,480
way especially if the models are always
1220
00:46:57,000 –> 00:47:00,200
retrained all the time if they become
1221
00:46:58,480 –> 00:47:03,599
more Dynamic again this is why I think
1222
00:47:00,200 –> 00:47:05,319
it’s yeah but but like when before an
1223
00:47:03,599 –> 00:47:07,200
airplane gets certified there’s like a
1224
00:47:05,319 –> 00:47:10,760
set of safety tests we put the airplane
1225
00:47:07,200 –> 00:47:12,240
through it um and totally it’s different
1226
00:47:10,760 –> 00:47:14,000
than reading all of your code that’s
1227
00:47:12,240 –> 00:47:17,240
reviewing the output of the model not
1228
00:47:14,000 –> 00:47:18,640
reviewing the insides of the model and
1229
00:47:17,240 –> 00:47:21,880
and so what I was goingon to say is I
1230
00:47:18,640 –> 00:47:24,839
that is the kind of thing that I think
1231
00:47:21,880 –> 00:47:26,680
as safety testing makes sense how are we
1232
00:47:24,839 –> 00:47:28,119
going to get that to happen Sam and I’m
1233
00:47:26,680 –> 00:47:29,960
not just speaking for open AI I speak
1234
00:47:28,119 –> 00:47:32,480
for the industry for for Humanity
1235
00:47:29,960 –> 00:47:34,800
because I am concerned that we draw
1236
00:47:32,480 –> 00:47:37,880
ourselves into almost like a dark ages
1237
00:47:34,800 –> 00:47:39,720
type of era by restricting the growth of
1238
00:47:37,880 –> 00:47:41,280
these incredible technologies that can
1239
00:47:39,720 –> 00:47:43,640
benefit humanity, that humanity can
1240
00:47:41,280 –> 00:47:45,440
prosper from so significantly how do we
1241
00:47:43,640 –> 00:47:46,720
change the the sentiment and get that to
1242
00:47:45,440 –> 00:47:48,640
happen because this is all moving so
1243
00:47:46,720 –> 00:47:50,280
quickly at the government levels and
1244
00:47:48,640 –> 00:47:53,400
folks seem to be getting it wrong and
1245
00:47:50,280 –> 00:47:55,280
I’m I’m just to build on that Sam the
1246
00:47:53,400 –> 00:47:58,400
architectural decision for example that
1247
00:47:55,280 –> 00:48:01,160
llama took is pretty interesting in that
1248
00:47:58,400 –> 00:48:02,960
it’s like we’re going to let llama grow
1249
00:48:01,160 –> 00:48:04,920
and be as unfettered as possible and we
1250
00:48:02,960 –> 00:48:06,520
have this other kind of thing that we
1251
00:48:04,920 –> 00:48:08,839
call Llama guard that’s meant to be
1252
00:48:06,520 –> 00:48:10,359
these protective guard rails is that how
1253
00:48:08,839 –> 00:48:12,680
you see the problem being solved
1254
00:48:10,359 –> 00:48:15,520
correctly or do you see at the current
1255
00:48:12,680 –> 00:48:16,880
at the current strength of models
1256
00:48:15,520 –> 00:48:18,559
definitely some things are going to go
1257
00:48:16,880 –> 00:48:20,440
wrong and I don’t want to like make
1258
00:48:18,559 –> 00:48:22,800
light of those or not take those
1259
00:48:20,440 –> 00:48:25,319
seriously but I’m not like I don’t have
1260
00:48:22,800 –> 00:48:29,599
any like catastrophic risk worries with
1261
00:48:25,319 –> 00:48:34,480
a gp4 level model um and I think there’s
1262
00:48:29,599 –> 00:48:34,480
many safe ways to choose to deploy this
1263
00:48:34,640 –> 00:48:40,720
uh may maybe we’d find more common
1264
00:48:37,079 –> 00:48:44,000
ground if we said that uh and I like you
1265
00:48:40,720 –> 00:48:46,839
know the specific example of models that
1266
00:48:44,000 –> 00:48:48,079
are capable that are technically capable
1267
00:48:46,839 –> 00:48:51,200
not even if they’re not going to be used
1268
00:48:48,079 –> 00:48:54,119
this way of recursive
1269
00:48:51,200 –> 00:48:57,720
self-improvement um or
1270
00:48:54,119 –> 00:49:00,680
of you know autonomously
1271
00:48:57,720 –> 00:49:03,079
designing and deploying a bioweapon or
1272
00:49:00,680 –> 00:49:05,119
something like that or a new model that
1273
00:49:03,079 –> 00:49:08,440
was the recursive self-improvement Point
1274
00:49:05,119 –> 00:49:10,119
um you know we should have safety
1275
00:49:08,440 –> 00:49:12,359
testing on the outputs at an
1276
00:49:10,119 –> 00:49:14,960
international level for models that you
1277
00:49:12,359 –> 00:49:17,960
know have a reasonable chance of of
1278
00:49:14,960 –> 00:49:22,440
posing a threat there uh I don’t think
1279
00:49:17,960 –> 00:49:26,240
like GPT-4 of course does
1280
00:49:22,440 –> 00:49:29,440
not pose any sort of well I don't want to say
1281
00:49:26,240 –> 00:49:31,680
any sort cuz we don’t yeah I don’t think
1282
00:49:29,440 –> 00:49:33,400
GPT-4 poses a material threat on those
1283
00:49:31,680 –> 00:49:35,640
kinds of things and I think there’s many
1284
00:49:33,400 –> 00:49:39,359
safe ways to release a model like this
1285
00:49:35,640 –> 00:49:43,079
um but you know when like significant
1286
00:49:39,359 –> 00:49:45,760
loss of human life is a serious
1287
00:49:43,079 –> 00:49:48,319
possibility like airplanes
1288
00:49:45,760 –> 00:49:49,760
or any number of other examples where I
1289
00:49:48,319 –> 00:49:51,079
think we’re happy to have some sort of
1290
00:49:49,760 –> 00:49:52,440
testing framework like I don’t think
1291
00:49:51,079 –> 00:49:55,160
about an airplane when I get on it I
1292
00:49:52,440 –> 00:49:56,280
just assume it’s going to be safe right
1293
00:49:55,160 –> 00:49:58,119
right there’s a lot of hand ringing
1294
00:49:56,280 –> 00:50:00,799
right now Sam about
1295
00:49:58,119 –> 00:50:02,119
jobs and you had a lot of I think you
1296
00:50:00,799 –> 00:50:04,760
did like some sort of a test when you
1297
00:50:02,119 –> 00:50:06,799
were at YC about UBI and has the
1298
00:50:04,760 –> 00:50:09,000
result of that come out? very soon, I just,
1299
00:50:06,799 –> 00:50:12,240
it was a five-year study that wrapped up
1300
00:50:09,000 –> 00:50:13,559
um or started five years ago well there
1301
00:50:12,240 –> 00:50:16,359
was like a beta study first and then it
1302
00:50:13,559 –> 00:50:17,839
was like a long one that ran but OK can
1303
00:50:16,359 –> 00:50:19,440
you explain yeah why did you start why’
1304
00:50:17,839 –> 00:50:22,440
you start it maybe just explain Ubi and
1305
00:50:19,440 –> 00:50:25,480
why you started it um so we started
1306
00:50:22,440 –> 00:50:27,640
thinking about this in 2016 uh kind of
1307
00:50:25,480 –> 00:50:31,359
about the same time started taking AI
1308
00:50:27,640 –> 00:50:34,040
really seriously and the theory was that
1309
00:50:31,359 –> 00:50:37,000
the magnitude of the change that may
1310
00:50:34,040 –> 00:50:40,200
come to society
1311
00:50:37,000 –> 00:50:41,799
and jobs in the economy and and sort of
1312
00:50:40,200 –> 00:50:45,319
in some deeper sense than that like what
1313
00:50:41,799 –> 00:50:47,799
the social contract looks like
1314
00:50:45,319 –> 00:50:50,720
um meant that we should have many
1315
00:50:47,799 –> 00:50:52,079
studies to study many ideas about new
1316
00:50:50,720 –> 00:50:56,079
new ways to
1317
00:50:52,079 –> 00:50:58,200
arrange that um I also think that you
1318
00:50:56,079 –> 00:51:01,200
know I’m not like a super fan of how the
1319
00:50:58,200 –> 00:51:03,880
government has handled most policies
1320
00:51:01,200 –> 00:51:06,000
designed to help poor people and I kind
1321
00:51:03,880 –> 00:51:08,079
of believe that if you could just give
1322
00:51:06,000 –> 00:51:09,559
people money they would make good
1323
00:51:08,079 –> 00:51:12,079
decisions and the market would do its
1324
00:51:09,559 –> 00:51:15,160
thing and you know I’m very much in
1325
00:51:12,079 –> 00:51:18,160
favor of lifting up the floor and
1326
00:51:15,160 –> 00:51:19,760
reducing eliminating poverty um but I’m
1327
00:51:18,160 –> 00:51:22,599
interested in better ways to do that
1328
00:51:19,760 –> 00:51:24,400
than what we have tried for the existing
1329
00:51:22,599 –> 00:51:26,760
social safety net and and kind of the
1330
00:51:24,400 –> 00:51:29,200
way things have been handled and I think
1331
00:51:26,760 –> 00:51:30,400
giving people money is not going to solve
1332
00:51:29,200 –> 00:51:33,119
all problems it’s certainly not going to
1333
00:51:30,400 –> 00:51:36,119
make people happy but it
1334
00:51:33,119 –> 00:51:40,319
might it might solve some problems and
1335
00:51:36,119 –> 00:51:42,200
it might give people a better Horizon
1336
00:51:40,319 –> 00:51:44,799
with which to help themselves and I’m
1337
00:51:42,200 –> 00:51:48,200
interested in that I I think
1338
00:51:44,799 –> 00:51:50,920
that now that we see some of the ways so
1339
00:51:48,200 –> 00:51:52,319
2016 was a very long time ago uh you
1340
00:51:50,920 –> 00:51:54,559
know now that we see some of the ways
1341
00:51:52,319 –> 00:51:57,050
that AI is developing I wonder if
1342
00:51:54,559 –> 00:51:58,240
there’s better things to do than the
1343
00:51:57,050 –> 00:52:02,240
[Music]
1344
00:51:58,240 –> 00:52:03,319
traditional um conceptualization of Ubi
1345
00:52:02,240 –> 00:52:05,520
uh like I
1346
00:52:03,319 –> 00:52:07,559
wonder I wonder if the future looks
1347
00:52:05,520 –> 00:52:09,920
something like more like Universal basic
1348
00:52:07,559 –> 00:52:12,280
compute than Universal basic income and
1349
00:52:09,920 –> 00:52:14,280
everybody gets like a slice of GPT-7
1350
00:52:12,280 –> 00:52:16,480
compute and they can use it they can
1351
00:52:14,280 –> 00:52:18,520
resell it they can donate it to somebody
1352
00:52:16,480 –> 00:52:21,000
to use for cancer research but but what
1353
00:52:18,520 –> 00:52:22,920
you get is not dollars but this like
1354
00:52:21,000 –> 00:52:24,960
productivity slice yeah you own like
1355
00:52:22,920 –> 00:52:27,960
part of the productivity right I would
1356
00:52:24,960 –> 00:52:30,760
like to shift to the gossip part of this
1357
00:52:27,960 –> 00:52:35,799
gossip what gossip let’s go back let’s
1358
00:52:30,760 –> 00:52:39,880
go back to November what the flying
1359
00:52:35,799 –> 00:52:41,920
happened um you know I I if you have
1360
00:52:39,880 –> 00:52:43,040
specific questions I’m happy to maybe
1361
00:52:41,920 –> 00:52:44,839
I’ll answer maybe you said you were
1362
00:52:43,040 –> 00:52:47,280
going to talk about it at some point so
1363
00:52:44,839 –> 00:52:49,559
here’s the point what the hell happened
1364
00:52:47,280 –> 00:52:52,160
you were fired you came back it was a
1365
00:52:49,559 –> 00:52:54,280
palace Intrigue did somebody stab you in
1366
00:52:52,160 –> 00:52:57,400
the back did you find AGI what’s going
1367
00:52:54,280 –> 00:53:02,720
on tell us this is a safe space
1368
00:52:57,400 –> 00:53:05,480
Sam um I was fired I
1369
00:53:02,720 –> 00:53:07,440
was I talked about coming back I kind of
1370
00:53:05,480 –> 00:53:08,559
was a little bit unsure at the moment
1371
00:53:07,440 –> 00:53:14,319
about what I wanted to do because I was
1372
00:53:08,559 –> 00:53:17,680
very upset um and I realized that I
1373
00:53:14,319 –> 00:53:20,200
really loved open Ai and the people and
1374
00:53:17,680 –> 00:53:22,480
that I would come back and I kind of I
1375
00:53:20,200 –> 00:53:25,079
knew it was going to be hard it was even
1376
00:53:22,480 –> 00:53:27,559
harder than I thought but I I kind of
1377
00:53:25,079 –> 00:53:30,720
was like all right fine um I agreed to
1378
00:53:27,559 –> 00:53:33,119
come back um the board like took a while
1379
00:53:30,720 –> 00:53:35,240
to figure things out and then uh you
1380
00:53:33,119 –> 00:53:38,079
know we were kind of like trying to keep
1381
00:53:35,240 –> 00:53:40,000
the team together and keep doing things
1382
00:53:38,079 –> 00:53:41,319
for our customers and uh you know sort
1383
00:53:40,000 –> 00:53:43,040
of started making other plans then the
1384
00:53:41,319 –> 00:53:46,640
board decided to hire a different
1385
00:53:43,040 –> 00:53:49,359
interim CEO um and then
1386
00:53:46,640 –> 00:53:50,680
everybody there many people oh my gosh
1387
00:53:49,359 –> 00:53:53,559
what was what was that guy’s name he was
1388
00:53:50,680 –> 00:53:56,280
there for like a scaramucci right like
1389
00:53:53,559 –> 00:53:58,440
uh Emmett great and I have nothing but
1390
00:53:56,280 –> 00:53:58,440
good
1391
00:53:58,839 –> 00:54:04,559
things to say about Emmett um and then where were you when
1392
00:54:02,599 –> 00:54:07,079
they um when you found the news that
1393
00:54:04,559 –> 00:54:10,040
you’d been fired like taking I was in a
1394
00:54:07,079 –> 00:54:13,559
hotel room in Vegas for F1 weekend I
1395
00:54:10,040 –> 00:54:14,480
think text and they’re like fire pick up
1396
00:54:13,559 –> 00:54:17,040
said I think that’s happened to you
1397
00:54:14,480 –> 00:54:18,480
before J I’m trying to think if I ever
1398
00:54:17,040 –> 00:54:21,119
got fired I don’t think I’ve gotten
1399
00:54:18,480 –> 00:54:23,000
fired um yeah I got no it’s just a weird
1400
00:54:21,119 –> 00:54:24,960
thing like it’s a text from who actually
1401
00:54:23,000 –> 00:54:27,079
no I got a text the night before and
1402
00:54:24,960 –> 00:54:28,720
then I got in a phone call with the
1403
00:54:27,079 –> 00:54:31,319
uh and then that was that and then I
1404
00:54:28,720 –> 00:54:35,200
kind of like I mean then everything went
1405
00:54:31,319 –> 00:54:37,760
crazy I was like uh it was
1406
00:54:35,200 –> 00:54:39,359
like I mean I have my phone was like
1407
00:54:37,760 –> 00:54:41,079
unusable it was just a Non-Stop
1408
00:54:39,359 –> 00:54:44,119
vibrating thing of like text messages
1409
00:54:41,079 –> 00:54:45,520
call basically you got fired by tweet
1410
00:54:44,119 –> 00:54:49,920
that happened a few times during the
1411
00:54:45,520 –> 00:54:52,319
Trump Administration a few uh
1412
00:54:49,920 –> 00:54:54,040
I got a call before the tweet which was nice of them um and
1413
00:54:52,319 –> 00:54:56,720
then like you know I kind of did like a
1414
00:54:54,040 –> 00:55:01,160
few hours of just this like absolute
1415
00:54:56,720 –> 00:55:03,119
State um in the hotel room trying to
1416
00:55:01,160 –> 00:55:05,559
like I was just confused beyond belief
1417
00:55:03,119 –> 00:55:07,760
trying to figure out what to do and uh
1418
00:55:05,559 –> 00:55:12,119
so weird and then
1419
00:55:07,760 –> 00:55:13,559
like flew home it maybe like got on a
1420
00:55:12,119 –> 00:55:16,240
plane like I don’t know 3 p.m. or
1421
00:55:13,559 –> 00:55:19,040
something like that um still just like
1422
00:55:16,240 –> 00:55:21,520
you know crazy non-stop phone blowing up
1423
00:55:19,040 –> 00:55:24,799
uh met up with some people in person by
1424
00:55:21,520 –> 00:55:27,480
that evening I was like okay you know
1425
00:55:24,799 –> 00:55:29,760
I’ll just like go do AGI research and
1426
00:55:27,480 –> 00:55:32,599
was feeling pretty happy about the
1427
00:55:29,760 –> 00:55:35,039
future and yeah you have options and
1428
00:55:32,599 –> 00:55:36,799
then and then the next morning uh had
1429
00:55:35,039 –> 00:55:40,319
this call with a couple of board members
1430
00:55:36,799 –> 00:55:44,520
about coming back and that led to a few
1431
00:55:40,319 –> 00:55:48,039
more days of craziness and then
1432
00:55:44,520 –> 00:55:50,640
uh and then it kind of I think it got
1433
00:55:48,039 –> 00:55:53,160
resolved well it was like a lot of
1434
00:55:50,640 –> 00:55:54,400
insanity in between what percent what
1435
00:55:53,160 –> 00:55:57,440
percent of it was because of these
1436
00:55:54,400 –> 00:55:59,960
nonprofit board members
1437
00:55:57,440 –> 00:56:01,559
um well we only have a nonprofit board
1438
00:55:59,960 –> 00:56:03,400
so it was all the nonprofit board
1439
00:56:01,559 –> 00:56:07,200
members uh there the board had gotten
1440
00:56:03,400 –> 00:56:10,400
down to six people um
1441
00:56:07,200 –> 00:56:14,280
they and then they removed Greg from the
1442
00:56:10,400 –> 00:56:16,039
board and then fired me um so but it was
1443
00:56:14,280 –> 00:56:17,520
like you know but I mean like was there
1444
00:56:16,039 –> 00:56:19,880
a culture clash between the people on
1445
00:56:17,520 –> 00:56:21,599
the board who had only nonprofit
1446
00:56:19,880 –> 00:56:23,359
experience versus the people who had
1447
00:56:21,599 –> 00:56:24,839
startup experience and maybe you can
1448
00:56:23,359 –> 00:56:26,799
share a little bit about if you’re
1449
00:56:24,839 –> 00:56:28,440
willing to the motivation behind the
1450
00:56:26,799 –> 00:56:31,400
action anything you
1451
00:56:28,440 –> 00:56:34,039
can I think there’s always been culture
1452
00:56:31,400 –> 00:56:36,720
clashes
1453
00:56:34,039 –> 00:56:39,319
at look
1454
00:56:36,720 –> 00:56:41,480
obviously not all of those board members
1455
00:56:39,319 –> 00:56:43,720
are my favorite people in the world but
1456
00:56:41,480 –> 00:56:43,720
I
1457
00:56:43,920 –> 00:56:50,079
have serious respect
1458
00:56:46,839 –> 00:56:53,200
for the gravity with which they treat
1459
00:56:50,079 –> 00:56:56,880
AGI and the importance of getting AI
1460
00:56:53,200 –> 00:56:59,960
safety right and even if I
1461
00:56:56,880 –> 00:57:02,839
stringently disagree with their
1462
00:56:59,960 –> 00:57:05,640
decision- making and actions which I do
1463
00:57:02,839 –> 00:57:11,119
um I have never once doubted
1464
00:57:05,640 –> 00:57:12,920
their integrity or commitment to um the
1465
00:57:11,119 –> 00:57:13,920
sort of shared mission of safe and
1466
00:57:12,920 –> 00:57:17,280
beneficial
1467
00:57:13,920 –> 00:57:19,599
AGI um you know do I think they like
1468
00:57:17,280 –> 00:57:21,599
made good decisions in the process of
1469
00:57:19,599 –> 00:57:24,039
that or kind of know how to balance all
1470
00:57:21,599 –> 00:57:27,559
of things opening I has to get right no
1471
00:57:24,039 –> 00:57:32,880
but but I think that like the intent the
1472
00:57:27,559 –> 00:57:36,760
intent of the magnitude of yeah
1473
00:57:32,880 –> 00:57:38,039
AGI and getting that right I actually
1474
00:57:36,760 –> 00:57:41,839
let me ask you about that so the mission
1475
00:57:38,039 –> 00:57:43,200
of open AI is explicitly to create AGI
1476
00:57:41,839 –> 00:57:46,119
which I think is really
1477
00:57:43,200 –> 00:57:49,720
interesting a lot of people would say
1478
00:57:46,119 –> 00:57:51,839
that if we create AGI that would be like
1479
00:57:49,720 –> 00:57:53,920
an unintended consequence of something
1480
00:57:51,839 –> 00:57:56,720
gone horribly wrong and they’re very
1481
00:57:53,920 –> 00:58:00,039
afraid of that outcome but open a makes
1482
00:57:56,720 –> 00:58:02,599
that the actual Mission yeah does that
1483
00:58:00,039 –> 00:58:03,960
create like more fear about what you’re
1484
00:58:02,599 –> 00:58:06,559
doing I mean I understand it can create
1485
00:58:03,960 –> 00:58:08,160
motivation too but how do you reconcile
1486
00:58:06,559 –> 00:58:10,119
that I guess why is I think a lot of I
1487
00:58:08,160 –> 00:58:12,839
think a lot of the
1488
00:58:10,119 –> 00:58:14,039
well I mean first I’ll say I’ll answer
1489
00:58:12,839 –> 00:58:15,559
the first question and the second one I
1490
00:58:14,039 –> 00:58:19,000
think it does create a great deal of
1491
00:58:15,559 –> 00:58:21,319
fear uh I think a lot of the world is
1492
00:58:19,000 –> 00:58:23,599
understandably very afraid of AGI or
1493
00:58:21,319 –> 00:58:26,599
very afraid of even current Ai and and
1494
00:58:23,599 –> 00:58:28,200
very excited about it and even more
1495
00:58:26,599 –> 00:58:31,760
afraid and even more excited about where
1496
00:58:28,200 –> 00:58:35,960
it’s going um and
1497
00:58:31,760 –> 00:58:37,880
we we wrestle with that but like I think
1498
00:58:35,960 –> 00:58:39,039
it is unavoidable that this is going to
1499
00:58:37,880 –> 00:58:41,240
happen I also think it’s going to be
1500
00:58:39,039 –> 00:58:42,880
tremendously beneficial but we do have
1501
00:58:41,240 –> 00:58:45,440
to navigate how to get there in a
1502
00:58:42,880 –> 00:58:47,359
reasonable way and like a lot of stuff
1503
00:58:45,440 –> 00:58:49,760
is going to change and change is you
1504
00:58:47,359 –> 00:58:52,520
know pretty pretty uncomfortable for
1505
00:58:49,760 –> 00:58:56,839
people so there’s a lot of
1506
00:58:52,520 –> 00:58:58,960
pieces that we got to get right and ask
1507
00:58:56,839 –> 00:59:00,200
can I ask a different question you you
1508
00:58:58,960 –> 00:59:01,960
have
1509
00:59:00,200 –> 00:59:04,880
created I
1510
00:59:01,960 –> 00:59:06,440
mean it’s the hottest company and you
1511
00:59:04,880 –> 00:59:07,640
are literally at the center of the
1512
00:59:06,440 –> 00:59:12,520
center of the
1513
00:59:07,640 –> 00:59:15,559
center but then it’s so unique in the
1514
00:59:12,520 –> 00:59:17,680
sense that all of this value you eschewed
1515
00:59:15,559 –> 00:59:19,480
economically can you just like walk us
1516
00:59:17,680 –> 00:59:21,000
through like yeah I wish I had taken I
1517
00:59:19,480 –> 00:59:23,200
wish I had taken Equity so I never had
1518
00:59:21,000 –> 00:59:24,760
to answer this question if I could go
1519
00:59:23,200 –> 00:59:27,920
back in why don’t they give you a grand
1520
00:59:24,760 –> 00:59:29,280
now or just give you a big option Grant
1521
00:59:27,920 –> 00:59:31,240
like you deserve yeah give you five
1522
00:59:29,280 –> 00:59:33,000
points what was the decision back then
1523
00:59:31,240 –> 00:59:34,680
like why was that so important the
1524
00:59:33,000 –> 00:59:36,640
decision back then the re the original
1525
00:59:34,680 –> 00:59:39,920
reason was just like the structure of
1526
00:59:36,640 –> 00:59:43,000
our nonprofit it was uh like there was
1527
00:59:39,920 –> 00:59:45,599
something about yeah okay this is like
1528
00:59:43,000 –> 00:59:47,480
nice from a motivations perspective but
1529
00:59:45,599 –> 00:59:49,440
mostly it was that our board needed to
1530
00:59:47,480 –> 00:59:51,440
be a majority of disinterested
1531
00:59:49,440 –> 00:59:54,839
directors and I was like that’s fine I
1532
00:59:51,440 –> 00:59:57,960
don’t need Equity right now I kind
1533
00:59:54,839 –> 00:59:59,559
of but like but in this weird way now
1534
00:59:57,960 –> 01:00:01,079
that you’re running a company yeah it it
1535
00:59:59,559 –> 01:00:03,200
creates these weird questions of like
1536
01:00:01,079 –> 01:00:07,599
well what’s your real motivation versus
1537
01:00:03,200 –> 01:00:10,200
to that’s that it is so deeply un I one
1538
01:00:07,599 –> 01:00:12,720
thing I have noticed it is is so deeply
1539
01:00:10,200 –> 01:00:13,880
unimaginable to people to say I don’t
1540
01:00:12,720 –> 01:00:16,720
really need more
1541
01:00:13,880 –> 01:00:18,200
money like and I well people think I
1542
01:00:16,720 –> 01:00:20,319
think I think people think it’s a little
1543
01:00:18,200 –> 01:00:22,640
bit of an ulterior motive I think yeah
1544
01:00:20,319 –> 01:00:24,200
yeah yeah no so it assumes what else is
1545
01:00:22,640 –> 01:00:26,160
he doing on the side to make money
1546
01:00:24,200 –> 01:00:27,400
something if I were just trying to say
1547
01:00:26,160 –> 01:00:29,119
like I’m going to try to make a trillion
1548
01:00:27,400 –> 01:00:30,960
dollarss with open AI I think everybody
1549
01:00:29,119 –> 01:00:32,280
would have an easier time and it would
1550
01:00:30,960 –> 01:00:34,640
save me it would save a lot of
1551
01:00:32,280 –> 01:00:37,720
conspiracy theories totally this is
1552
01:00:34,640 –> 01:00:39,799
totally the back Channel you are a great
1553
01:00:37,720 –> 01:00:42,000
dealmaker I I’ve watched your whole
1554
01:00:39,799 –> 01:00:43,680
career I mean you’re just great at it
1555
01:00:42,000 –> 01:00:47,160
you got all these connections you’re
1556
01:00:43,680 –> 01:00:49,720
really good at raising money uh you’re
1557
01:00:47,160 –> 01:00:51,920
fantastic at it and you got this Jony
1558
01:00:49,720 –> 01:00:54,160
I thing going you’re inhumane you’re
1559
01:00:51,920 –> 01:00:56,920
investing in companies you got the Orb
1560
01:00:54,160 –> 01:01:00,000
raising $7 trillion to build
1561
01:00:56,920 –> 01:01:03,039
Fabs all this stuff all of that put
1562
01:01:00,000 –> 01:01:04,280
together J loves fake news I’m kind of
1563
01:01:03,039 –> 01:01:05,520
being a little facetious here you know
1564
01:01:04,280 –> 01:01:06,920
obviously it’s not you’re not raising 7
1565
01:01:05,520 –> 01:01:08,799
trillion dollars but maybe that's the
1566
01:01:06,920 –> 01:01:12,119
market cap of something putting all that
1567
01:01:08,799 –> 01:01:14,160
aside the T was you’re doing all these
1568
01:01:12,119 –> 01:01:16,640
deals they don’t trust you because
1569
01:01:14,160 –> 01:01:18,760
what’s your motivation you you your end
1570
01:01:16,640 –> 01:01:21,440
running and and what opportunities
1571
01:01:18,760 –> 01:01:23,359
belong inside of open AI what opportuni
1572
01:01:21,440 –> 01:01:25,160
should be Sam’s and this group of
1573
01:01:23,359 –> 01:01:27,720
nonprofit people didn’t trust you is
1574
01:01:25,160 –> 01:01:29,640
that what happened so the things like
1575
01:01:27,720 –> 01:01:31,280
you know device companies or if we were
1576
01:01:29,640 –> 01:01:33,280
doing some chip Fab Company it’s like
1577
01:01:31,280 –> 01:01:36,280
those are not Sam project those would be
1578
01:01:33,280 –> 01:01:38,119
like opening ey would get that Equity um
1579
01:01:36,280 –> 01:01:40,720
they would okay that’s not the Public’s
1580
01:01:38,119 –> 01:01:42,200
perception well that’s not like kind of
1581
01:01:40,720 –> 01:01:43,440
the people like you who have to like
1582
01:01:42,200 –> 01:01:44,680
commentate on the stuff all day
1583
01:01:43,440 –> 01:01:45,880
perception which is fair because we
1584
01:01:44,680 –> 01:01:47,680
haven’t announced the stuff because it’s
1585
01:01:45,880 –> 01:01:49,880
not done I don’t think most people in
1586
01:01:47,680 –> 01:01:53,640
the world like are thinking about this
1587
01:01:49,880 –> 01:01:56,319
but I I I agree it spins up a lot of
1588
01:01:53,640 –> 01:01:59,799
conspiracies conspiracy theories in like
1589
01:01:56,319 –> 01:02:01,480
Tech commentators yeah and if I could go
1590
01:01:59,799 –> 01:02:03,520
back yeah I would just say like let me
1591
01:02:01,480 –> 01:02:05,480
take equity and make that super clear
1592
01:02:03,520 –> 01:02:06,839
and then everybody would be like all right like
1593
01:02:05,480 –> 01:02:08,359
I’d still be doing it because I really
1594
01:02:06,839 –> 01:02:10,079
care about AGI and think this is like
1595
01:02:08,359 –> 01:02:11,359
the most interesting work in the world
1596
01:02:10,079 –> 01:02:14,480
but it would at least type check to
1597
01:02:11,359 –> 01:02:16,119
everybody what’s the chip project that
1598
01:02:14,480 –> 01:02:17,480
that is seven trillion dollars and where did the seven
1599
01:02:16,119 –> 01:02:18,640
trillion number come from makes no sense
1600
01:02:17,480 –> 01:02:21,960
I don’t know where that came from
1601
01:02:18,640 –> 01:02:23,960
actually I genuinely don’t uh I think I
1602
01:02:21,960 –> 01:02:26,240
think the world needs a lot more AI
1603
01:02:23,960 –> 01:02:28,279
infrastructure a lot more than it’s
1604
01:02:26,240 –> 01:02:31,640
currently planning to build and with a
1605
01:02:28,279 –> 01:02:35,039
different cost structure um the exact
1606
01:02:31,640 –> 01:02:36,760
way for us to play there is we’re still
1607
01:02:35,039 –> 01:02:38,880
trying to figure that out got it what’s
1608
01:02:36,760 –> 01:02:40,119
your preferred model of organizing open
1609
01:02:38,880 –> 01:02:43,799
AI is
1610
01:02:40,119 –> 01:02:46,520
it sort of like the move fast break
1611
01:02:43,799 –> 01:02:48,279
things highly distributed small teams or
1612
01:02:46,520 –> 01:02:49,760
is it more of this organized effort
1613
01:02:48,279 –> 01:02:53,200
where you need to plan because you want
1614
01:02:49,760 –> 01:02:55,160
to prevent some of these edge cases um
1615
01:02:53,200 –> 01:02:57,839
oh I have to go in a minute uh it’s not
1616
01:02:55,160 –> 01:02:57,839
because
1617
01:02:58,680 –> 01:03:01,839
it’s not to prevent the edge cases that
1618
01:03:00,160 –> 01:03:04,359
we need to be more organized but it is
1619
01:03:01,839 –> 01:03:07,400
that these systems are so complicated
1620
01:03:04,359 –> 01:03:09,000
and concentrating bets are so important
1621
01:03:07,400 –> 01:03:11,599
like
1622
01:03:09,000 –> 01:03:13,760
one you know at the time before it was
1623
01:03:11,599 –> 01:03:15,480
like obvious to do this you have like
1624
01:03:13,760 –> 01:03:16,839
DeepMind or whatever has all these
1625
01:03:15,480 –> 01:03:17,880
different teams doing all these
1626
01:03:16,839 –> 01:03:19,680
different things and they’re spreading
1627
01:03:17,880 –> 01:03:20,880
their bets out and you had OpenAI
1628
01:03:19,680 –> 01:03:22,119
say we’re going to like basically put
1629
01:03:20,880 –> 01:03:25,440
the whole company and work together to
1630
01:03:22,119 –> 01:03:28,559
make GPT-4 and that was like unimaginable
1631
01:03:25,440 –> 01:03:30,920
for how to run an AI research lab but it
1632
01:03:28,559 –> 01:03:32,960
is I think what works at the minimum
1633
01:03:30,920 –> 01:03:34,400
it’s what works for us so not because
1634
01:03:32,960 –> 01:03:35,839
we’re trying to prevent edge cases but
1635
01:03:34,400 –> 01:03:38,319
because we want to concentrate resources
1636
01:03:35,839 –> 01:03:40,400
and do these like big hard complicated
1637
01:03:38,319 –> 01:03:42,680
things we do have a lot of coordination
1638
01:03:40,400 –> 01:03:44,520
on what we work on all right Sam I know
1639
01:03:42,680 –> 01:03:46,960
you got to go you’ve been great on the
1640
01:03:44,520 –> 01:03:49,440
hour come back any time great talking to
1641
01:03:46,960 –> 01:03:50,680
you guys yeah fun thanks for being
1642
01:03:49,440 –> 01:03:53,000
so open about it we’ve been talking
1643
01:03:50,680 –> 01:03:54,240
about it for like a year plus I’m really
1644
01:03:53,000 –> 01:03:56,400
happy it finally happened yeah it’s
1645
01:03:54,240 –> 01:03:57,960
awesome I really appreciate it come back on after
1646
01:03:56,400 –> 01:04:00,440
our next like major launch and I’ll be
1647
01:03:57,960 –> 01:04:01,960
able to talk more directly about some
1648
01:04:00,440 –> 01:04:03,880
you got the zoom link same Zoom link
1649
01:04:01,960 –> 01:04:06,799
every week just same time same Zoom link
1650
01:04:03,880 –> 01:04:08,559
drop time just drop in just put on your
1651
01:04:06,799 –> 01:04:11,960
calendar come back to the game come back
1652
01:04:08,559 –> 01:04:13,480
to the game in a while I you know I
1653
01:04:11,960 –> 01:04:16,720
would love to play poker it has been
1654
01:04:13,480 –> 01:04:18,760
forever that would be a lot of fun send
1655
01:04:16,720 –> 01:04:21,920
Chamath when you and I were heads up
1656
01:04:18,760 –> 01:04:25,480
and you you had remind me you and I were
1657
01:04:21,920 –> 01:04:27,039
heads up and you went all in I had a set
1658
01:04:25,480 –> 01:04:29,279
but there was a straight and a flush on
1659
01:04:27,039 –> 01:04:31,240
the board and I’m in the tank trying to
1660
01:04:29,279 –> 01:04:32,359
figure out if I want to lose this back
1661
01:04:31,240 –> 01:04:35,319
when we were playing small stakes it might
1662
01:04:32,359 –> 01:04:37,599
have been like 5K pot or something and
1663
01:04:35,319 –> 01:04:39,559
then chath can’t stay out of the pot and
1664
01:04:37,599 –> 01:04:41,039
he starts taunting the two of us you
1665
01:04:39,559 –> 01:04:44,079
should call you shouldn’t call he’s
1666
01:04:41,039 –> 01:04:45,359
bluffing and I’m like I’m going I’m
1667
01:04:44,079 –> 01:04:48,640
trying to figure out if I make the call
1668
01:04:45,359 –> 01:04:50,520
here I make the call and uh it was like
1669
01:04:48,640 –> 01:04:51,520
uh you had a really good hand and but I
1670
01:04:50,520 –> 01:04:52,960
just happened to have a set I think you
1671
01:04:51,520 –> 01:04:54,880
had like top pair top kicker or
1672
01:04:52,960 –> 01:04:56,559
something but you you made a great move
1673
01:04:54,880 –> 01:04:58,400
because the board was so almost like a
1674
01:04:56,559 –> 01:05:00,400
bottom set Sam has a great style of
1675
01:04:58,400 –> 01:05:02,480
playing which I would call RAM and jam
1676
01:05:00,400 –> 01:05:03,680
totally you got to get I don’t really
1677
01:05:02,480 –> 01:05:05,079
know if you I don’t I don’t know if you
1678
01:05:03,680 –> 01:05:07,559
could say about anybody
1679
01:05:05,079 –> 01:05:09,079
else I don’t I don’t I’m not gonna you
1680
01:05:07,559 –> 01:05:10,880
haven’t seen Jam play in the last 18
1681
01:05:09,079 –> 01:05:14,799
months it’s a lot
1682
01:05:10,880 –> 01:05:18,160
different much more so much fun now find
1683
01:05:14,799 –> 01:05:19,599
find hard to have you played bomb pots I
1684
01:05:18,160 –> 01:05:22,720
don’t know what that is okay you’ll love
1685
01:05:19,599 –> 01:05:25,839
it we’ll see you is nuts
1686
01:05:22,720 –> 01:05:27,359
it’s do two boards and congrats
1687
01:05:25,839 –> 01:05:29,319
everything honestly thanks for
1688
01:05:27,359 –> 01:05:31,000
coming on and love to have you back when
1689
01:05:29,319 –> 01:05:34,559
the next after the big launch sounds
1690
01:05:31,000 –> 01:05:37,559
please do cool bye gentlemen some
1691
01:05:34,559 –> 01:05:39,160
breaking news here all those projects He
1692
01:05:37,559 –> 01:05:40,839
said are part of OpenAI that's
1693
01:05:39,160 –> 01:05:42,359
something people didn’t know before this
1694
01:05:40,839 –> 01:05:44,440
and a lot of confusion
1695
01:05:42,359 –> 01:05:47,200
there Chamath what was your major
1696
01:05:44,440 –> 01:05:50,039
takeaway from our hour with Sam I think
1697
01:05:47,200 –> 01:05:52,880
that these guys are going to be one of
1698
01:05:50,039 –> 01:05:54,680
the four major companies okay that
1699
01:05:52,880 –> 01:05:56,520
matter in this whole space I think that
1700
01:05:54,680 –> 01:05:58,599
that’s clear
1701
01:05:56,520 –> 01:06:00,279
I think what’s still unclear is where is
1702
01:05:58,599 –> 01:06:01,799
the economics are going to be he said
1703
01:06:00,279 –> 01:06:04,000
something very discreet but I thought
1704
01:06:01,799 –> 01:06:06,279
was important which is I think he
1705
01:06:04,000 –> 01:06:08,960
basically my interpretation is these
1706
01:06:06,279 –> 01:06:11,799
models will roughly all be the same but
1707
01:06:08,960 –> 01:06:14,279
there’s going to be a lot of scaffolding
1708
01:06:11,799 –> 01:06:15,960
around these models that actually allow
1709
01:06:14,279 –> 01:06:17,960
you to build these apps so in many ways
1710
01:06:15,960 –> 01:06:20,640
that is like the open source movement so
1711
01:06:17,960 –> 01:06:23,240
even if the model itself is never open
1712
01:06:20,640 –> 01:06:24,760
source it doesn’t much matter because
1713
01:06:23,240 –> 01:06:25,880
you have to pay for the infrastructure
1714
01:06:24,760 –> 01:06:27,799
right there’s a lot of open Source
1715
01:06:25,880 –> 01:06:30,440
software that runs on Amazon you still
1716
01:06:27,799 –> 01:06:34,240
pay AWS something so I think the right
1717
01:06:30,440 –> 01:06:36,760
way to think about this now
1718
01:06:34,240 –> 01:06:38,960
is the models will basically be all
1719
01:06:36,760 –> 01:06:40,440
really good and then it’s all this other
1720
01:06:38,960 –> 01:06:43,920
stuff that you’ll have to pay for
1721
01:06:40,440 –> 01:06:46,480
interface whoever builds all this other
1722
01:06:43,920 –> 01:06:48,559
stuff is going to be in a position to
1723
01:06:46,480 –> 01:06:50,279
build a really good business Friedberg he
1724
01:06:48,559 –> 01:06:52,039
talked a lot about reasoning it seemed
1725
01:06:50,279 –> 01:06:53,640
like that he kept going to reasoning and
1726
01:06:52,039 –> 01:06:55,240
away from the language model did you not
1727
01:06:53,640 –> 01:06:57,440
did you note that and anything else that
1728
01:06:55,240 –> 01:06:58,599
you noted in our hour with yeah I mean
1729
01:06:57,440 –> 01:07:00,680
that’s a longer conversation because
1730
01:06:58,599 –> 01:07:03,359
there is a lot of talk about language
1731
01:07:00,680 –> 01:07:07,000
models eventually evolving to be so
1732
01:07:03,359 –> 01:07:09,520
generalizable that they can
1733
01:07:07,000 –> 01:07:11,680
resolve pretty much like all intelligent
1734
01:07:09,520 –> 01:07:13,839
function and so the language model is
1735
01:07:11,680 –> 01:07:16,960
the foundational model that that yields
1736
01:07:13,839 –> 01:07:18,079
AGI but that’s a I think there’s a lot
1737
01:07:16,960 –> 01:07:20,839
of people at different schools of
1738
01:07:18,079 –> 01:07:24,000
thought on this and how much my my other
1739
01:07:20,839 –> 01:07:27,760
takeaway I think is that the I think
1740
01:07:24,000 –> 01:07:30,799
what he also seemed to indicate is there's
1741
01:07:27,760 –> 01:07:33,599
like so many like we’re also enraptured
1742
01:07:30,799 –> 01:07:35,520
by LLMs but there's so many things other
1743
01:07:33,599 –> 01:07:37,880
than LLMs that are being baked and
1744
01:07:35,520 –> 01:07:39,559
rolled by him and by other groups and I
1745
01:07:37,880 –> 01:07:41,720
think we have to pay some amount of
1746
01:07:39,559 –> 01:07:43,359
attention to all those because that’s
1747
01:07:41,720 –> 01:07:44,440
probably where and I think Friedberg you
1748
01:07:43,359 –> 01:07:45,720
tried to go there in your question
1749
01:07:44,440 –> 01:07:48,559
that’s where reasoning will really come
1750
01:07:45,720 –> 01:07:49,880
from is this mixture of experts approach
1751
01:07:48,559 –> 01:07:52,039
and so you’re going to have to think
1752
01:07:49,880 –> 01:07:54,039
multi-dimensionally to reason right we
1753
01:07:52,039 –> 01:07:56,559
do that right do I cross the street or
1754
01:07:54,039 –> 01:07:58,640
not at this point in time you reason
1755
01:07:56,559 –> 01:08:00,319
based on all these multi-inputs and so
1756
01:07:58,640 –> 01:08:01,640
there’s there’s all these little systems
1757
01:08:00,319 –> 01:08:03,520
that go into making that decision in
1758
01:08:01,640 –> 01:08:05,799
your brain and if you if you use that as
1759
01:08:03,520 –> 01:08:07,640
a a simple example there’s all this
1760
01:08:05,799 –> 01:08:10,079
stuff that has to go into
1761
01:08:07,640 –> 01:08:12,839
making some experience being able to
1762
01:08:10,079 –> 01:08:15,920
reason intelligently Sacks you went right
1763
01:08:12,839 –> 01:08:19,279
there with the corporate structure the
1764
01:08:15,920 –> 01:08:21,359
board and uh he he he gave us a lot more
1765
01:08:19,279 –> 01:08:24,679
information here what are your thoughts
1766
01:08:21,359 –> 01:08:26,120
on the hey you know the chip stuff and
1767
01:08:24,679 –> 01:08:27,839
the other stuff I’m working that’s all
1768
01:08:26,120 –> 01:08:30,239
part of OpenAI people just don't
1769
01:08:27,839 –> 01:08:32,480
realize it and that moment and then you
1770
01:08:30,239 –> 01:08:35,080
know your questions to him about Equity
1771
01:08:32,480 –> 01:08:36,600
your thoughts on um I’m not sure I was
1772
01:08:35,080 –> 01:08:39,480
like the main guy who asked that
1773
01:08:36,600 –> 01:08:41,279
question J-Cal but um well no you did
1774
01:08:39,480 –> 01:08:44,640
talk about the the nonprofit that the
1775
01:08:41,279 –> 01:08:46,520
difference between question about the
1776
01:08:44,640 –> 01:08:49,199
clearly was some sort of culture Clash
1777
01:08:46,520 –> 01:08:50,920
on the board between the the people who
1778
01:08:49,199 –> 01:08:52,600
originated from the nonprofit world and
1779
01:08:50,920 –> 01:08:54,000
people who came from the startup world
1780
01:08:52,600 –> 01:08:55,199
we don’t really know more than that but
1781
01:08:54,000 –> 01:08:56,279
there clearly was some sort of culture
1782
01:08:55,199 –> 01:08:58,120
clash
1783
01:08:56,279 –> 01:08:59,799
I thought one of the a couple of the
1784
01:08:58,120 –> 01:09:00,839
other areas that he drew attention to
1785
01:08:59,799 –> 01:09:02,679
that were kind of interesting is he
1786
01:09:00,839 –> 01:09:05,880
clearly thinks there’s a big opportunity
1787
01:09:02,679 –> 01:09:08,080
on mobile that goes beyond just like
1788
01:09:05,880 –> 01:09:10,719
having you know a ChatGPT app on your
1789
01:09:08,080 –> 01:09:12,799
phone or maybe even having like a Siri
1790
01:09:10,719 –> 01:09:14,080
on your phone there’s clearly something
1791
01:09:12,799 –> 01:09:16,400
bigger there he doesn’t know exactly
1792
01:09:14,080 –> 01:09:18,799
what it is but it’s going to require
1793
01:09:16,400 –> 01:09:20,880
more inputs it’s that you know personal
1794
01:09:18,799 –> 01:09:22,719
assistant that’s seeing everything
1795
01:09:20,880 –> 01:09:24,480
around you help really I think that’s a
1796
01:09:22,719 –> 01:09:27,080
great Insight David because he was
1797
01:09:24,480 –> 01:09:29,199
talking about hey I’m looking for a
1798
01:09:27,080 –> 01:09:30,839
senior team member who can push back on
1799
01:09:29,199 –> 01:09:32,839
me and understands all context I thought
1800
01:09:30,839 –> 01:09:35,560
that was like a very interesting way to
1801
01:09:32,839 –> 01:09:38,480
think about an executive assistant or an
1802
01:09:35,560 –> 01:09:40,679
assistant that’s has executive function
1803
01:09:38,480 –> 01:09:42,679
as opposed to being like just an alter
1804
01:09:40,679 –> 01:09:44,520
ego for you or what he called a
1805
01:09:42,679 –> 01:09:45,960
sycophant that’s kind of interesting I
1806
01:09:44,520 –> 01:09:47,120
thought that was interesting yeah yeah
1807
01:09:45,960 –> 01:09:49,640
and clearly he thinks there’s a big
1808
01:09:47,120 –> 01:09:51,040
opportunity in biology and scientific
1809
01:09:49,640 –> 01:09:52,719
discovery after the break I think we
1810
01:09:51,040 –> 01:09:54,000
should talk about AlphaFold 3 it was
1811
01:09:52,719 –> 01:09:56,199
just announ let’s do that and we can
1812
01:09:54,000 –> 01:09:57,880
talk about the the Apple ad and
1813
01:09:56,199 –> 01:09:59,199
I just want to also make sure people
1814
01:09:57,880 –> 01:10:00,600
understand when people come on the Pod
1815
01:09:59,199 –> 01:10:03,560
we don’t show them questions they don’t
1816
01:10:00,600 –> 01:10:05,360
edit the transcript nothing is out of
1817
01:10:03,560 –> 01:10:06,880
bounds if you were wondering why I
1818
01:10:05,360 –> 01:10:08,640
didn’t ask or we didn’t ask about the
1819
01:10:06,880 –> 01:10:10,120
Elon lawsuit he’s just not going to be
1820
01:10:08,640 –> 01:10:12,800
able to comment on that so it’ be no
1821
01:10:10,120 –> 01:10:14,199
comment so you know and we’re not like
1822
01:10:12,800 –> 01:10:15,760
our time was limited and there’s a lot
1823
01:10:14,199 –> 01:10:16,880
of questions that we could have asked
1824
01:10:15,760 –> 01:10:19,679
him that would have just been a waste of
1825
01:10:16,880 –> 01:10:20,719
time and com so I just want to make sure
1826
01:10:19,679 –> 01:10:22,960
people understand of course he’s going
1827
01:10:20,719 –> 01:10:25,159
to no comment in any lawsuit and he’s
1828
01:10:22,960 –> 01:10:26,440
already been asked about that 500 times
1829
01:10:25,159 –> 01:10:27,719
yes should we take a quick break before
1830
01:10:26,440 –> 01:10:28,920
the next before we come back yeah I’ll
1831
01:10:27,719 –> 01:10:30,320
take a bio break and then we’ll come
1832
01:10:28,920 –> 01:10:33,400
back with some news for you and some
1833
01:10:30,320 –> 01:10:36,400
more banter with your
1834
01:10:33,400 –> 01:10:38,440
favorite besties on the number one
1835
01:10:36,400 –> 01:10:40,199
podcast in the world the only podcast
1836
01:10:38,440 –> 01:10:41,920
all right welcome back everybody second
1837
01:10:40,199 –> 01:10:44,000
half of the show great guest Sam Altman
1838
01:10:41,920 –> 01:10:46,679
thanks for coming on the Pod we’ve got a
1839
01:10:44,000 –> 01:10:49,400
bunch of news on the docket so let’s get
1840
01:10:46,679 –> 01:10:51,840
started Friedberg you told me I could
1841
01:10:49,400 –> 01:10:54,000
give some names of uh the guests that
1842
01:10:51,840 –> 01:10:56,800
we’ve booked for the all in Summit I did
1843
01:10:54,000 –> 01:10:59,719
not you did you’ve said each week every
1844
01:10:56,800 –> 01:11:02,000
week that I get to say I did not I
1845
01:10:59,719 –> 01:11:05,000
appreciate your interest in the All-In
1846
01:11:02,000 –> 01:11:08,440
Summit's lineup but we do not yet
1847
01:11:05,000 –> 01:11:11,280
have uh enough critical mass uh to feel
1848
01:11:08,440 –> 01:11:13,360
like we should go out there well uh I am
1849
01:11:11,280 –> 01:11:16,600
a loose cannon so I will announce my
1850
01:11:13,360 –> 01:11:18,840
two guests and I created The Summit and
1851
01:11:16,600 –> 01:11:20,440
you took it from me so and done a great
1852
01:11:18,840 –> 01:11:22,920
job I will announce my guests I don’t
1853
01:11:20,440 –> 01:11:24,880
care what your opinion is I have booked
1854
01:11:22,920 –> 01:11:27,120
two guests for the summit and it’s going
1855
01:11:24,880 –> 01:11:29,480
to be out look at these two guests I
1856
01:11:27,120 –> 01:11:31,560
booked for the third time coming back to
1857
01:11:29,480 –> 01:11:33,640
the summit our guy Elon Musk will be
1858
01:11:31,560 –> 01:11:35,360
there hopefully in person if not you
1859
01:11:33,640 –> 01:11:36,920
know from 40,000 feet on a Starlink
1860
01:11:35,360 –> 01:11:40,800
connection wherever he is in the world
1861
01:11:36,920 –> 01:11:43,440
and for the first time our friend Mark
1862
01:11:40,800 –> 01:11:45,080
Cuban will be coming and so two great
1863
01:11:43,440 –> 01:11:46,760
guests for you to look forward to but
1864
01:11:45,080 –> 01:11:49,719
free BG’s got like a thousand guests
1865
01:11:46,760 –> 01:11:51,199
coming he’ll tell you when it’s like 48
1866
01:11:49,719 –> 01:11:52,719
hours before the conference but yeah two
1867
01:11:51,199 –> 01:11:55,639
great speaking of billionaires who are
1868
01:11:52,719 –> 01:11:57,199
coming isn’t coming too yes coming yes
1869
01:11:55,639 –> 01:12:00,000
he’s booked so we have three
1870
01:11:57,199 –> 01:12:01,520
billionaires three billionaires yes okay
1871
01:12:00,000 –> 01:12:03,400
hasn’t fully confirmed so don’t okay
1872
01:12:01,520 –> 01:12:06,639
well we’re going to say it anyway has
1873
01:12:03,400 –> 01:12:08,120
penciled in back we say penciled yeah
1874
01:12:06,639 –> 01:12:10,320
don’t back out this is going to be
1875
01:12:08,120 –> 01:12:11,719
catnip for all these protest organizers
1876
01:12:10,320 –> 01:12:14,120
like if
1877
01:12:11,719 –> 01:12:17,040
you poke the bear well by the way speaking
1878
01:12:14,120 –> 01:12:19,719
of updates what do you guys think of the
1879
01:12:17,040 –> 01:12:21,840
bottle for the all-in tequila oh
1880
01:12:19,719 –> 01:12:24,320
beautiful honestly honestly I will just
1881
01:12:21,840 –> 01:12:25,840
say I think you are doing a marvelous
1882
01:12:24,320 –> 01:12:27,000
job that
1883
01:12:25,840 –> 01:12:30,040
I was
1884
01:12:27,000 –> 01:12:33,840
shocked at the design shocked meaning it
1885
01:12:30,040 –> 01:12:37,120
is so unique and high quality I think
1886
01:12:33,840 –> 01:12:39,120
it’s amazing it would make me drink
1887
01:12:37,120 –> 01:12:41,920
tequila you’re going to You’re Gonna
1888
01:12:39,120 –> 01:12:44,800
want to it is uh stunning
1889
01:12:41,920 –> 01:12:47,199
just congratulations and um yeah it was
1890
01:12:44,800 –> 01:12:49,480
just when we went through the deck at
1891
01:12:47,199 –> 01:12:51,159
the uh at the monthly meeting it was
1892
01:12:49,480 –> 01:12:52,760
like oh that’s nice oh that’s nice we’re
1893
01:12:51,159 –> 01:12:55,000
going to do the concept bottles and then
1894
01:12:52,760 –> 01:12:56,920
that bottle came up and everybody went
1895
01:12:55,000 –> 01:12:58,440
like crazy it was like somebody hitting
1896
01:12:56,920 –> 01:13:01,040
like a Steph Curry hitting a half court
1897
01:12:58,440 –> 01:13:03,679
shot it was like oh my God it was just
1898
01:13:01,040 –> 01:13:07,520
so clear that you’ve made an iconic
1899
01:13:03,679 –> 01:13:11,719
bottle that if we can produce it oh Lord
1900
01:13:07,520 –> 01:13:13,360
it is going to be looks like we can the
1901
01:13:11,719 –> 01:13:14,920
make it yeah it’s gonna be amazing I’m
1902
01:13:13,360 –> 01:13:16,679
excited I’m excited for it you know it’s
1903
01:13:14,920 –> 01:13:18,639
like the design is so complicated that we have
1904
01:13:16,679 –> 01:13:21,320
to do a feasibility analysis on whether
1905
01:13:18,639 –> 01:13:23,080
it was actually manufacturable but it is
1906
01:13:21,320 –> 01:13:25,360
so or at least the early reports are
1907
01:13:23,080 –> 01:13:27,239
good so we’re going to H hopefully we’ll
1908
01:13:25,360 –> 01:13:30,520
have some in time for the All-In
1909
01:13:27,239 –> 01:13:32,159
Summit I mean why not sounds I mean it’s
1910
01:13:30,520 –> 01:13:34,400
great when we get barricaded in by all
1911
01:13:32,159 –> 01:13:36,760
these protesters we can drink the
1912
01:13:34,400 –> 01:13:39,040
tequila did you guys see did you see
1913
01:13:36,760 –> 01:13:41,440
Peter Thiel got barricaded by
1914
01:13:39,040 –> 01:13:42,880
these ding-dongs at Cambridge my god
1915
01:13:41,440 –> 01:13:44,320
listen people have the right to protest
1916
01:13:42,880 –> 01:13:46,440
I think it’s great people are protesting
1917
01:13:44,320 –> 01:13:49,120
but surrounding people and threatening
1918
01:13:46,440 –> 01:13:51,199
them is a little bit over the top and D
1919
01:13:49,120 –> 01:13:52,400
think you’re exaggerating what happened
1920
01:13:51,199 –> 01:13:54,280
well I don’t know exactly what happened
1921
01:13:52,400 –> 01:13:56,199
because all we see is these videos look
1922
01:13:54,280 –> 01:13:57,639
they’re not threatening anybody and I
1923
01:13:56,199 –> 01:13:59,719
don’t even think they tried to barricade
1924
01:13:57,639 –> 01:14:01,679
him in they were just outside the
1925
01:13:59,719 –> 01:14:04,880
building and because they were blocking
1926
01:14:01,679 –> 01:14:05,760
the driveway his car couldn’t leave but
1927
01:14:04,880 –> 01:14:08,199
he
1928
01:14:05,760 –> 01:14:10,639
wasn’t physically like locked in the
1929
01:14:08,199 –> 01:14:12,199
building or something yeah that’s that’s
1930
01:14:10,639 –> 01:14:14,120
what the headlines say but that could be
1931
01:14:12,199 –> 01:14:17,920
fake news fake Social yeah this was not
1932
01:14:14,120 –> 01:14:19,800
on my bingo card this pro-protester
1933
01:14:17,920 –> 01:14:22,920
support by Sacks was not on the bingo
1934
01:14:19,800 –> 01:14:25,239
card I got to say the
1935
01:14:22,920 –> 01:14:27,239
Constitution of the United States in the
1936
01:14:25,239 –> 01:14:29,320
First Amendment provides for the right of
1937
01:14:27,239 –> 01:14:31,760
assembly which includes protest and sit
1938
01:14:29,320 –> 01:14:34,440
in as long as they’re as long as they’re
1939
01:14:31,760 –> 01:14:36,639
Peaceable now obviously if they go too
1940
01:14:34,440 –> 01:14:38,400
far and they vandalize or break into
1941
01:14:36,639 –> 01:14:40,760
buildings or use violence then that’s
1942
01:14:38,400 –> 01:14:43,320
not Peaceable however expressing
1943
01:14:40,760 –> 01:14:46,440
sentiments with which you disagree does
1944
01:14:43,320 –> 01:14:47,760
not make it violent and there’s all
1945
01:14:46,440 –> 01:14:50,360
these people out there now making the
1946
01:14:47,760 –> 01:14:54,239
argument that if you hear something from
1947
01:14:50,360 –> 01:14:56,800
a protester that you don’t like and you
1948
01:14:54,239 –> 01:14:59,320
subjectively experience that as a as a
1949
01:14:56,800 –> 01:15:02,120
threat to your safety then that somehow
1950
01:14:59,320 –> 01:15:04,040
should be you know treated as valid like
1951
01:15:02,120 –> 01:15:07,080
that’s basically violent well that’s
1952
01:15:04,040 –> 01:15:09,920
that’s not what the Constitution says
1953
01:15:07,080 –> 01:15:11,440
and these people understood well just a
1954
01:15:09,920 –> 01:15:14,159
few months ago that that was basically
1955
01:15:11,440 –> 01:15:16,600
snowflakery that you know just because
1956
01:15:14,159 –> 01:15:18,760
somebody you know what I’m
1957
01:15:16,600 –> 01:15:21,320
saying we have the rise of the woke
1958
01:15:18,760 –> 01:15:22,800
right now where they’re buying yeah the
1959
01:15:21,320 –> 01:15:25,080
woke right they’re buying into this idea
1960
01:15:22,800 –> 01:15:26,639
of safetyism which is being exposed to
1961
01:15:25,080 –> 01:15:28,400
ideas you don’t like to protest you
1962
01:15:26,639 –> 01:15:30,480
don’t like is a threat to your safety no
1963
01:15:28,400 –> 01:15:33,400
it’s
1964
01:15:30,480 –> 01:15:35,760
not every we absolutely have snowflakery
1965
01:15:33,400 –> 01:15:38,520
on both sides now it’s ridiculous the
1966
01:15:35,760 –> 01:15:41,800
only thing I will say that I’ve seen and
1967
01:15:38,520 –> 01:15:44,280
is this this this uh surrounding
1968
01:15:41,800 –> 01:15:46,080
individuals who you don’t want there and
1969
01:15:44,280 –> 01:15:48,320
locking them in a circle and then moving
1970
01:15:46,080 –> 01:15:49,920
them out of like a protest that's not
1971
01:15:48,320 –> 01:15:52,280
cool yeah obviously you can’t do that
1972
01:15:49,920 –> 01:15:53,840
but look I think that most of the
1973
01:15:52,280 –> 01:15:55,080
protests on most of the campuses have
1974
01:15:53,840 –> 01:15:57,480
not crossed the line they’ve just
1975
01:15:55,080 –> 01:16:01,280
occupied The Lawns of these campuses and
1976
01:15:57,480 –> 01:16:04,520
look I’ve seen some troublemakers try to
1977
01:16:01,280 –> 01:16:06,440
barge through the the encampments and
1978
01:16:04,520 –> 01:16:08,159
claim that because they can’t go through
1979
01:16:06,440 –> 01:16:10,199
there that somehow they’re being
1980
01:16:08,159 –> 01:16:12,480
prevented from going to class look you
1981
01:16:10,199 –> 01:16:15,239
just walk around the lawn and you can
1982
01:16:12,480 –> 01:16:18,480
get to class okay and you know some of
1983
01:16:15,239 –> 01:16:20,520
these videos are showing that these are
1984
01:16:18,480 –> 01:16:24,679
effectively right-wing provocateurs who
1985
01:16:20,520 –> 01:16:26,960
are engaging in leftwing tactics and I
1986
01:16:24,679 –> 01:16:28,159
don’t support it either way by the way
1987
01:16:26,960 –> 01:16:29,639
some of these camps are some of the
1988
01:16:28,159 –> 01:16:32,600
funniest things you’ve ever seen it’s
1989
01:16:29,639 –> 01:16:34,920
like there are like a one tent that’s
1990
01:16:32,600 –> 01:16:37,159
dedicated to like a reading room and you
1991
01:16:34,920 –> 01:16:39,639
go in there and there’s like these like
1992
01:16:37,159 –> 01:16:41,320
Center oh my God it’s unbelievably
1993
01:16:39,639 –> 01:16:42,639
hilarious look there there’s no question
1994
01:16:41,320 –> 01:16:44,639
that because the protests are
1995
01:16:42,639 –> 01:16:46,679
originating on the left that there’s
1996
01:16:44,639 –> 01:16:48,880
some goofy views like you know you’re
1997
01:16:46,679 –> 01:16:52,000
dealing with like a leftwing idea
1998
01:16:48,880 –> 01:16:53,639
complex right but and you know it’s easy
1999
01:16:52,000 –> 01:16:56,159
to make fun of them doing different
2000
01:16:53,639 –> 01:16:57,520
things but the fact of the matter is
2001
01:16:56,159 –> 01:16:59,880
that most of the protests on most of
2002
01:16:57,520 –> 01:17:01,480
these campuses are even though they can
2003
01:16:59,880 –> 01:17:04,040
be annoying because they’re occupying
2004
01:17:01,480 –> 01:17:05,840
part of the lawn they’re not violent
2005
01:17:04,040 –> 01:17:06,880
yeah and you know the way they’re being
2006
01:17:05,840 –> 01:17:09,560
cracked down on they’re sending the
2007
01:17:06,880 –> 01:17:12,320
police in at 5:00 a.m. to crack down on
2008
01:17:09,560 –> 01:17:14,840
these encampments with batons and riot
2009
01:17:12,320 –> 01:17:17,239
gear and I find that part to be
2010
01:17:14,840 –> 01:17:19,239
completely excessive well it’s also
2011
01:17:17,239 –> 01:17:21,040
dangerous because you know things can
2012
01:17:19,239 –> 01:17:22,360
escalate when you have mobs of people
2013
01:17:21,040 –> 01:17:24,239
and large groups of people so I just
2014
01:17:22,360 –> 01:17:26,120
want to make sure people understand that
2015
01:17:24,239 –> 01:17:28,280
with large groups of people you have a
2016
01:17:26,120 –> 01:17:29,719
diffusion of responsibility that occurs
2017
01:17:28,280 –> 01:17:31,159
when there’s large groups of people who
2018
01:17:29,719 –> 01:17:33,080
are passionate about things and and
2019
01:17:31,159 –> 01:17:35,480
people could get hurt people have gotten
2020
01:17:33,080 –> 01:17:37,120
killed at these things so just you know
2021
01:17:35,480 –> 01:17:39,080
keep it calm everybody I agree with you
2022
01:17:37,120 –> 01:17:40,960
like what’s the harm of these folks
2023
01:17:39,080 –> 01:17:42,440
protesting on a lawn it’s not a big deal
2024
01:17:40,960 –> 01:17:44,000
when they break into buildings of course
2025
01:17:42,440 –> 01:17:45,360
yeah that crosses the line obviously
2026
01:17:44,000 –> 01:17:48,120
yeah but I mean let them sit out there
2027
01:17:45,360 –> 01:17:50,239
and then they’ll run out their food cars
2028
01:17:48,120 –> 01:17:51,960
their food carts and they run out of
2029
01:17:50,239 –> 01:17:53,280
waffles did you guys see the clip I
2030
01:17:51,960 –> 01:17:56,159
think it was on the University of
2031
01:17:53,280 –> 01:17:58,719
Washington campus where one kid
2032
01:17:56,159 –> 01:18:00,880
challenged this antifa guy to a push-up
2033
01:17:58,719 –> 01:18:02,960
contest oh
2034
01:18:00,880 –> 01:18:05,000
fantastic I mean it’s it is some of the
2035
01:18:02,960 –> 01:18:07,600
funniest stuff some of some content is
2036
01:18:05,000 –> 01:18:09,400
coming out that’s just my favorite was
2037
01:18:07,600 –> 01:18:11,280
the woman who came out and said that the
2038
01:18:09,400 –> 01:18:14,320
Columbia students needed humanitarian
2039
01:18:11,280 –> 01:18:16,159
Aid and oh my God the overdubs on her
2040
01:18:14,320 –> 01:18:19,159
were hilarious I was like uh
2041
01:18:16,159 –> 01:18:21,120
humanitarian aid they were like we need
2042
01:18:19,159 –> 01:18:23,040
our DoorDash right now we need we
2043
01:18:21,120 –> 01:18:26,040
Double Dash some boba and we can’t get
2044
01:18:23,040 –> 01:18:28,159
it through the the police need our Boba
2045
01:18:26,040 –> 01:18:31,120
low sugar Boba with the with the popping
2046
01:18:28,159 –> 01:18:32,400
Boba bubbles wasn’t getting in but you
2047
01:18:31,120 –> 01:18:34,719
know people have the right to protest
2048
01:18:32,400 –> 01:18:36,880
and uh Peaceable by the way there’s a
2049
01:18:34,719 –> 01:18:39,360
word I’ve never heard very good sax
2050
01:18:36,880 –> 01:18:41,560
Peaceable inclined to avoid argument or
2051
01:18:39,360 –> 01:18:42,760
violent conflict very nice well it’s in
2052
01:18:41,560 –> 01:18:44,679
the Constitution it’s in the first
2053
01:18:42,760 –> 01:18:46,080
amendment is it really I’ve never I
2054
01:18:44,679 –> 01:18:47,760
haven’t heard the word Peaceable before
2055
01:18:46,080 –> 01:18:51,280
I mean you and I are sympatico on this
2056
01:18:47,760 –> 01:18:55,040
like I I don’t we used to have the
2057
01:18:51,280 –> 01:18:57,920
ACLU like backing up the KKK going down
2058
01:18:55,040 –> 01:19:00,360
Main Street and really fighting decision
2059
01:18:57,920 –> 01:19:02,199
yeah they were really fighting for I I’m
2060
01:19:00,360 –> 01:19:04,199
and I have to say the Overton window is
2061
01:19:02,199 –> 01:19:05,520
opened back up and I think it’s great
2062
01:19:04,199 –> 01:19:07,159
all right we got some things on the
2063
01:19:05,520 –> 01:19:09,440
docket here I don’t know if you guys saw
2064
01:19:07,159 –> 01:19:12,360
the new Apple iPad ad it's getting a
2065
01:19:09,440 –> 01:19:15,480
bunch of criticism they use like some
2066
01:19:12,360 –> 01:19:19,000
giant hydraulic press to crush a bunch
2067
01:19:15,480 –> 01:19:21,920
of creative tools a DJ turntable trumpet
2068
01:19:19,000 –> 01:19:24,360
piano people really care about Apple’s
2069
01:19:21,920 –> 01:19:26,880
ads and what they represent we talked
2070
01:19:24,360 –> 01:19:29,040
about that that uh Mother Earth little
2071
01:19:26,880 –> 01:19:30,239
vignette they created here what do you
2072
01:19:29,040 –> 01:19:32,679
think Friedberg did you see the ad what
2073
01:19:30,239 –> 01:19:34,360
was your reaction to it made me sad it
2074
01:19:32,679 –> 01:19:38,120
it did not make me want to buy an iPad
2075
01:19:34,360 –> 01:19:40,280
so huh did not seem like it made you sad
2076
01:19:38,120 –> 01:19:42,400
it actually elicited an emotion meaning
2077
01:19:40,280 –> 01:19:43,600
like commercials it’s very rare that
2078
01:19:42,400 –> 01:19:45,280
commercials can actually do that most
2079
01:19:43,600 –> 01:19:47,120
people just zone out yeah they took all
2080
01:19:45,280 –> 01:19:49,080
this beautiful stuff and hurt it it didn't
2081
01:19:47,120 –> 01:19:50,320
it didn’t feel good I don’t know it just
2082
01:19:49,080 –> 01:19:52,639
didn’t seem like a good ad I don’t know
2083
01:19:50,320 –> 01:19:54,080
why they did that I don’t get it I I’ve
2084
01:19:52,639 –> 01:19:55,800
I don’t I don’t know I think I think
2085
01:19:54,080 –> 01:19:57,440
maybe what they’re trying to do is the
2086
01:19:55,800 –> 01:19:59,600
the selling point of this new iPad is
2087
01:19:57,440 –> 01:20:00,920
that it’s the thinnest one I mean
2088
01:19:59,600 –> 01:20:02,760
there’s no innovation left so they’re
2089
01:20:00,920 –> 01:20:06,239
just making the devices yeah you know
2090
01:20:02,760 –> 01:20:07,040
thinner yeah so I think the idea was
2091
01:20:06,239 –> 01:20:09,080
that they were going to take this
2092
01:20:07,040 –> 01:20:12,440
hydraulic press to represent how
2093
01:20:09,080 –> 01:20:14,719
ridiculously thin the new iPad is now I
2094
01:20:12,440 –> 01:20:17,120
don’t know if the point there was to
2095
01:20:14,719 –> 01:20:18,199
smush all of that good stuff into the
2096
01:20:17,120 –> 01:20:21,560
iPad I don’t know if that’s what they
2097
01:20:18,199 –> 01:20:24,840
were trying to convey but yeah I I think
2098
01:20:21,560 –> 01:20:26,840
that by destroying all those creative
2099
01:20:24,840 –> 01:20:28,760
tools that Apple is supposed to
2100
01:20:26,840 –> 01:20:30,520
represent it definitely seemed very
2101
01:20:28,760 –> 01:20:34,080
offbrand for them and I think people
2102
01:20:30,520 –> 01:20:35,320
were reacting to the fact that it was so
2103
01:20:34,080 –> 01:20:37,199
um different than what they would have
2104
01:20:35,320 –> 01:20:38,480
done in the past and of course everyone
2105
01:20:37,199 –> 01:20:41,560
was saying well Steve would never have
2106
01:20:38,480 –> 01:20:45,239
done this I do think it did land wrong I
2107
01:20:41,560 –> 01:20:46,639
mean I I didn’t care that much but but I
2108
01:20:45,239 –> 01:20:48,679
I was kind of asking the question like
2109
01:20:46,639 –> 01:20:52,360
why are they destroying all these
2110
01:20:48,679 –> 01:20:55,360
Creator tools that they’re renowned for
2111
01:20:52,360 –> 01:20:58,600
creating or for turning into the digital
2112
01:20:55,360 –> 01:21:01,360
version yeah it just didn’t land I mean
2113
01:20:58,600 –> 01:21:06,120
Chamath how are you doing emotionally
2114
01:21:01,360 –> 01:21:09,120
after seeing me are you okay buddy yeah
2115
01:21:06,120 –> 01:21:10,159
I think this is uh you guys see that in
2116
01:21:09,120 –> 01:21:12,800
the
2117
01:21:10,159 –> 01:21:17,080
Berkshire annual meeting last
2118
01:21:12,800 –> 01:21:19,719
weekend Tim Cook was in the audience and
2119
01:21:17,080 –> 01:21:22,480
Buffett was very laudatory this is an
2120
01:21:19,719 –> 01:21:24,560
incredible company but he’s so clever
2121
01:21:22,480 –> 01:21:28,159
with words he’s like you know this is an
2122
01:21:24,560 –> 01:21:31,400
incredible business that we will hold
2123
01:21:28,159 –> 01:21:33,400
forever most likely then it turns out
2124
01:21:31,400 –> 01:21:34,679
that he sold $20 billion worth of Apple
2125
01:21:33,400 –> 01:21:37,199
shares in the
2126
01:21:34,679 –> 01:21:39,080
cat we’re gonna hold it forever which
2127
01:21:37,199 –> 01:21:40,960
which by the way sell if you guys
2128
01:21:39,080 –> 01:21:42,520
remember we we put that little chart up
2129
01:21:40,960 –> 01:21:44,920
which shows when he doesn’t mention it
2130
01:21:42,520 –> 01:21:46,840
in the in the annual letter it’s
2131
01:21:44,920 –> 01:21:49,239
basically like it’s foreshadowing the
2132
01:21:46,840 –> 01:21:52,120
fact that he is just pounding the sell
2133
01:21:49,239 –> 01:21:54,880
and he sold $20 billion well also
2134
01:21:52,120 –> 01:21:56,400
holding it forever could mean one share
2135
01:21:54,880 –> 01:21:59,199
yeah exactly we kind of need to know
2136
01:21:56,400 –> 01:22:01,000
like how much are we talking about I
2137
01:21:59,199 –> 01:22:03,080
mean it’s an incredible business that
2138
01:22:01,000 –> 01:22:04,320
has so much money with nothing to do
2139
01:22:03,080 –> 01:22:06,679
they’re probably just going to buy back
2140
01:22:04,320 –> 01:22:08,400
the stock just a total waste they were
2141
01:22:06,679 –> 01:22:10,520
floating this rumor of buying Rivian you
2142
01:22:08,400 –> 01:22:11,960
know after they shut down Titan project
2143
01:22:10,520 –> 01:22:13,560
their internal project to make a car
2144
01:22:11,960 –> 01:22:16,000
it seems like a car is the only thing
2145
01:22:13,560 –> 01:22:17,760
people can think of that would move the
2146
01:22:16,000 –> 01:22:19,719
needle in terms of earnings I think the
2147
01:22:17,760 –> 01:22:21,679
problem is J-Cal like you kind of become
2148
01:22:19,719 –> 01:22:23,520
afraid of your own shadow meaning the
2149
01:22:21,679 –> 01:22:25,120
the folks that are really good at M&A
2150
01:22:23,520 –> 01:22:27,880
like you look at Benioff
2151
01:22:25,120 –> 01:22:31,560
the thing with Benioff's M&A strategy is that
2152
01:22:27,880 –> 01:22:34,440
he’s been doing it for 20 years and so
2153
01:22:31,560 –> 01:22:37,280
he’s cut his teeth on small Acquisitions
2154
01:22:34,440 –> 01:22:39,199
and the market learns to give him trust
2155
01:22:37,280 –> 01:22:41,840
so that when he proposes like the $27
2156
01:22:39,199 –> 01:22:44,800
billion Slack acquisition he's allowed
2157
01:22:41,840 –> 01:22:46,960
to do that another guy you know Arora
2158
01:22:44,800 –> 01:22:48,480
at PANW these last five years people
2159
01:22:46,960 –> 01:22:49,960
were very skeptical that he could
2160
01:22:48,480 –> 01:22:52,360
actually roll up security because it was
2161
01:22:49,960 –> 01:22:54,080
a super fragmented Market he’s gotten
2162
01:22:52,360 –> 01:22:56,480
permission then there are companies like
2163
01:22:54,080 –> 01:22:58,159
Danaher that buy hundreds of companies so
2164
01:22:56,480 –> 01:23:00,280
all these folks are examples of you
2165
01:22:58,159 –> 01:23:02,840
start small and you you earn the right
2166
01:23:00,280 –> 01:23:04,760
to do more Apple hasn’t bought anything
2167
01:23:02,840 –> 01:23:06,760
more than 50 or a hundred million dollars and
2168
01:23:04,760 –> 01:23:08,880
so the idea that all of a sudden they
2169
01:23:06,760 –> 01:23:10,800
come out of the blue and buy a 10 or 20
2170
01:23:08,880 –> 01:23:13,639
billion dollar company I think is just
2171
01:23:10,800 –> 01:23:15,040
totally doesn’t stand logic it’s just
2172
01:23:13,639 –> 01:23:16,800
not possible for them because they’ll be
2173
01:23:15,040 –> 01:23:19,760
so afraid of their own shadow that’s the
2174
01:23:16,800 –> 01:23:21,040
big problem it’s themselves well if
2175
01:23:19,760 –> 01:23:23,920
you’re running out of In-House
2176
01:23:21,040 –> 01:23:25,679
Innovation and you can’t do m&a then
2177
01:23:23,920 –> 01:23:27,840
your options are kind of limited I mean
2178
01:23:25,679 –> 01:23:30,159
I do think that the fact that the big
2179
01:23:27,840 –> 01:23:32,520
news out of Apple is the iPad's getting
2180
01:23:30,159 –> 01:23:34,360
Thinner does represent kind of the end
2181
01:23:32,520 –> 01:23:36,760
of the road in terms of innovation it’s
2182
01:23:34,360 –> 01:23:39,440
kind of like when they added the third
2183
01:23:36,760 –> 01:23:41,159
camera to the iPhone yeah it reminds me
2184
01:23:39,440 –> 01:23:43,480
of those um remember like when the
2185
01:23:41,159 –> 01:23:44,719
Gillette yeah they came out and then
2186
01:23:43,480 –> 01:23:47,120
they did the five it was the best Onion
2187
01:23:44,719 –> 01:23:49,040
thing was like we’re doing five f
2188
01:23:47,120 –> 01:23:51,400
it but then Gillette actually came out
2189
01:23:49,040 –> 01:23:52,360
with the Mach 5 so yeah like the parody
2190
01:23:51,400 –> 01:23:54,480
became the reality what are they going
2191
01:23:52,360 –> 01:23:57,000
to do add two more cameras to the iPhone
2192
01:23:54,480 –> 01:23:58,639
you have five cameras on it no makes no
2193
01:23:57,000 –> 01:24:01,000
sense and then I don’t know anybody
2194
01:23:58,639 –> 01:24:02,960
wants to remember the Apple Vision was
2195
01:24:01,000 –> 01:24:05,120
like gonna plus why are they body
2196
01:24:02,960 –> 01:24:07,800
shaming the the fat
2197
01:24:05,120 –> 01:24:10,199
iPads that’s a fair point Fair Point
2198
01:24:07,800 –> 01:24:11,880
actually you know what it’s actually
2199
01:24:10,199 –> 01:24:14,520
this didn’t come out yet but it turns
2200
01:24:11,880 –> 01:24:15,880
out the iPad is on Ozempic it's actually
2201
01:24:14,520 –> 01:24:20,719
dropped a lot that would have been a
2202
01:24:15,880 –> 01:24:22,880
funnier ad yeah yeah exactly oh oh
2203
01:24:20,719 –> 01:24:24,159
Ozempic we can just workshop that right
2204
01:24:22,880 –> 01:24:26,000
here but there was another funny one
2205
01:24:24,159 –> 01:24:27,600
which was making the iPhone smaller and
2206
01:24:26,000 –> 01:24:28,719
smaller and smaller and the iPod smaller
2207
01:24:27,600 –> 01:24:30,679
and smaller and smaller to the point it
2208
01:24:28,719 –> 01:24:33,040
was like you know like a thumb-sized
2209
01:24:30,679 –> 01:24:36,639
iPhone like the Ben Stiller phone in
2210
01:24:33,040 –> 01:24:39,719
Zoolander or
2211
01:24:36,639 –> 01:24:42,440
correct yeah that was a great scene is
2212
01:24:39,719 –> 01:24:45,159
there a category that you can think of
2213
01:24:42,440 –> 01:24:47,920
that you would love an Apple product for
2214
01:24:45,159 –> 01:24:50,760
there’s a product in your life that you
2215
01:24:47,920 –> 01:24:53,199
would love to have Apple’s version of it
2216
01:24:50,760 –> 01:24:55,040
they they killed it I think a lot of
2217
01:24:53,199 –> 01:24:57,960
people would be very open minded to an
2218
01:24:55,040 –> 01:25:00,159
Apple car okay they they just would it’s
2219
01:24:57,960 –> 01:25:02,600
it’s a connected internet device
2220
01:25:00,159 –> 01:25:03,880
increasingly so yeah and they they
2221
01:25:02,600 –> 01:25:06,760
managed to flub
2222
01:25:03,880 –> 01:25:09,360
it they had a chance to buy Tesla they
2223
01:25:06,760 –> 01:25:11,000
managed to flub it yeah right there are
2224
01:25:09,360 –> 01:25:12,960
just too many examples here where these
2225
01:25:11,000 –> 01:25:14,400
guys have so much money and not enough
2226
01:25:12,960 –> 01:25:17,119
ideas that’s a
2227
01:25:14,400 –> 01:25:19,880
shame it’s a bummer yeah the one I
2228
01:25:17,119 –> 01:25:21,000
always wanted to see them do Sacks was TV
2229
01:25:19,880 –> 01:25:22,400
the one I always wanted to see them do
2230
01:25:21,000 –> 01:25:24,880
was the TV and they were supposedly
2231
01:25:22,400 –> 01:25:26,840
working on it like the actual TV not the
2232
01:25:24,880 –> 01:25:28,400
little Apple TV box in the back and like
2233
01:25:26,840 –> 01:25:31,600
that would have been extraordinary to
2234
01:25:28,400 –> 01:25:33,280
actually have a gorgeous you know big
2235
01:25:31,600 –> 01:25:34,760
television what about a gaming console
2236
01:25:33,280 –> 01:25:36,800
they could have done that you know
2237
01:25:34,760 –> 01:25:39,440
there’s just all these things that they
2238
01:25:36,800 –> 01:25:42,119
could have done it’s not a lack of
2239
01:25:39,440 –> 01:25:44,560
imagination because these aren’t exactly
2240
01:25:42,119 –> 01:25:46,159
incredibly world-beating ideas they're
2241
01:25:44,560 –> 01:25:50,119
sitting right in front of your face it’s
2242
01:25:46,159 –> 01:25:52,280
just the will to do it yeah all in one
2243
01:25:50,119 –> 01:25:55,360
TV would have been good if you think
2244
01:25:52,280 –> 01:25:57,760
back on Apple product lineup over the
2245
01:25:55,360 –> 01:26:00,320
years where they’ve really created value
2246
01:25:57,760 –> 01:26:02,639
is on how unique the products are they
2247
01:26:00,320 –> 01:26:04,560
almost create new categories sure there
2248
01:26:02,639 –> 01:26:06,239
may have been a quote tablet computer
2249
01:26:04,560 –> 01:26:08,679
prior to the iPad but the iPad really
2250
01:26:06,239 –> 01:26:10,400
defined the tablet computer era sure
2251
01:26:08,679 –> 01:26:12,000
there was a smartphone or two before the
2252
01:26:10,400 –> 01:26:14,119
iPhone came along but it really defined
2253
01:26:12,000 –> 01:26:15,800
the smartphone and sure there was a
2254
01:26:14,119 –> 01:26:17,119
computer before the Apple II and then it
2255
01:26:15,800 –> 01:26:19,960
came along and it defined the personal
2256
01:26:17,119 –> 01:26:22,520
computer in all these cases I think
2257
01:26:19,960 –> 01:26:24,480
Apple strives to define the category so
2258
01:26:22,520 –> 01:26:25,880
it’s very hard to define a television if
2259
01:26:24,480 –> 01:26:28,239
you think about it or gaming console in
2260
01:26:25,880 –> 01:26:29,800
a way that you take a step up and you
2261
01:26:28,239 –> 01:26:32,159
say this is the new thing this is the
2262
01:26:29,800 –> 01:26:33,719
new platform so I don’t know that’s the
2263
01:26:32,159 –> 01:26:36,480
lens I would look at if I’m Apple in
2264
01:26:33,719 –> 01:26:38,080
terms of like can I redefine a car can I
2265
01:26:36,480 –> 01:26:40,000
make you know we’re all trying to fit
2266
01:26:38,080 –> 01:26:41,199
them into an existing product bucket but
2267
01:26:40,000 –> 01:26:43,600
I think what they’ve always been so good
2268
01:26:41,199 –> 01:26:45,119
at is identifying consumer needs and
2269
01:26:43,600 –> 01:26:46,600
then creating an entirely new way of
2270
01:26:45,119 –> 01:26:49,360
addressing that need in a real step
2271
01:26:46,600 –> 01:26:51,520
change function from the like the the
2272
01:26:49,360 –> 01:26:53,800
the um iPod it was so different from any
2273
01:26:51,520 –> 01:26:55,239
MP3 player ever I think the reason why
2274
01:26:53,800 –> 01:26:57,280
the car could have been completely
2275
01:26:55,239 –> 01:26:59,920
reimagined by Apple is that they have a
2276
01:26:57,280 –> 01:27:02,679
level of credibility and trust that I
2277
01:26:59,920 –> 01:27:05,639
think probably no other company has and
2278
01:27:02,679 –> 01:27:08,199
absolutely no other tech company has and
2279
01:27:05,639 –> 01:27:11,080
we talked about this but I think this
2280
01:27:08,199 –> 01:27:13,760
was the third Steve Jobs story that that
2281
01:27:11,080 –> 01:27:15,790
I left out but in
2282
01:27:13,760 –> 01:27:17,480
200 I don’t know was it
2283
01:27:15,790 –> 01:27:21,280
[Music]
2284
01:27:17,480 –> 01:27:22,840
one I launched a 99 cent download store
2285
01:27:21,280 –> 01:27:26,119
right I think I’ve told you the story in
2286
01:27:22,840 –> 01:27:26,119
Winamp and
2287
01:27:26,159 –> 01:27:30,040
Steve Jobs just ran total circles around
2288
01:27:28,239 –> 01:27:31,960
us but the reason he was able to is he
2289
01:27:30,040 –> 01:27:34,360
had all the credibility to go to the
2290
01:27:31,960 –> 01:27:36,760
labels and get deals done for licensing
2291
01:27:34,360 –> 01:27:38,000
music that nobody could get done before
2292
01:27:36,760 –> 01:27:39,639
I think that’s an example of what
2293
01:27:38,000 –> 01:27:42,920
Apple’s able to do which is to use their
2294
01:27:39,639 –> 01:27:44,960
political Capital to change the rules so
2295
01:27:42,920 –> 01:27:47,639
if the thing that we would all want is
2296
01:27:44,960 –> 01:27:49,840
safer roads and autonomous vehicles
2297
01:27:47,639 –> 01:27:52,360
there are regions in every town and city
2298
01:27:49,840 –> 01:27:55,760
that could be completely converted to
2299
01:27:52,360 –> 01:27:57,000
level five autonomous zones if I had to
2300
01:27:55,760 –> 01:27:58,760
pick one company that had the
2301
01:27:57,000 –> 01:28:01,440
credibility to go and change those rules
2302
01:27:58,760 –> 01:28:03,840
it’s them because they could demonstrate
2303
01:28:01,440 –> 01:28:05,920
that there was a methodical safe
2304
01:28:03,840 –> 01:28:08,239
approach to doing something and so the
2305
01:28:05,920 –> 01:28:09,880
point is that even in these categories
2306
01:28:08,239 –> 01:28:11,560
that could be totally reimagined it’s
2307
01:28:09,880 –> 01:28:13,040
not for a lack of imagination again it
2308
01:28:11,560 –> 01:28:15,400
just goes back to a complete lack of
2309
01:28:13,040 –> 01:28:18,800
Will and I understand because if they
2310
01:28:15,400 –> 01:28:21,199
had if you had $200 billion of
2311
01:28:18,800 –> 01:28:23,000
capital on your balance sheet I think
2312
01:28:21,199 –> 01:28:25,880
it’s probably pretty easy to get fat and
2313
01:28:23,000 –> 01:28:27,520
lazy yeah it is and and they want to
2314
01:28:25,880 –> 01:28:29,000
have everything built there people don’t
2315
01:28:27,520 –> 01:28:30,679
remember but they actually built one of
2316
01:28:29,000 –> 01:28:32,480
the first digital cameras you must have
2317
01:28:30,679 –> 01:28:34,480
owned this right Friedberg you're I
2318
01:28:32,480 –> 01:28:36,199
remember this yeah totally it beautiful
2319
01:28:34,480 –> 01:28:39,040
what did they call it was it the ey
2320
01:28:36,199 –> 01:28:41,280
camera or something QuickTake Quick
2321
01:28:39,040 –> 01:28:42,800
Take yeah um the thing I would like to
2322
01:28:41,280 –> 01:28:46,080
see Apple build and I'm surprised they
2323
01:28:42,800 –> 01:28:50,920
didn’t was a smart home system the way
2324
01:28:46,080 –> 01:28:54,840
Apple has Nest a Dropcam a door lock
2325
01:28:50,920 –> 01:28:56,320
you know an AV system a Crestron or
2326
01:28:54,840 –> 01:28:58,760
whatever and just have your whole home
2327
01:28:56,320 –> 01:29:00,400
automated thermostat Nest all of that
2328
01:28:58,760 –> 01:29:04,800
would be brilliant by Apple and right
2329
01:29:00,400 –> 01:29:06,520
now I’m an apple family that has our all
2330
01:29:04,800 –> 01:29:08,639
of our home automation through Google so
2331
01:29:06,520 –> 01:29:10,000
it’s just kind of sucks I would I would
2332
01:29:08,639 –> 01:29:11,840
like that all to that would be pretty
2333
01:29:10,000 –> 01:29:13,880
amazing like if they did a Crestron or
2334
01:29:11,840 –> 01:29:15,800
Savant because then when you just go to your
2335
01:29:13,880 –> 01:29:17,239
Apple TV all your cameras just work you
2336
01:29:15,800 –> 01:29:19,719
don’t need to
2337
01:29:17,239 –> 01:29:21,920
yes that’s the that I mean and everybody
2338
01:29:19,719 –> 01:29:23,840
has a home and everybody automates their
2339
01:29:21,920 –> 01:29:26,239
home so well everyone has Apple TV at
2340
01:29:23,840 –> 01:29:29,400
this point so you just make Apple TV the
2341
01:29:26,239 –> 01:29:31,080
brain for the home system right that
2342
01:29:29,400 –> 01:29:33,480
would be your Hub and you can connect
2343
01:29:31,080 –> 01:29:36,600
your phone to it then yes that would be
2344
01:29:33,480 –> 01:29:38,480
very nice yeah like can you imagine like
2345
01:29:36,600 –> 01:29:39,600
the Ring cameras all that stuff being
2346
01:29:38,480 –> 01:29:41,040
integrated I don’t know why they didn’t
2347
01:29:39,600 –> 01:29:43,199
go after that that seems like the easy
2348
01:29:41,040 –> 01:29:47,119
layup hey you know everybody’s been
2349
01:29:43,199 –> 01:29:49,960
talking Friedberg about this uh AlphaFold
2350
01:29:47,119 –> 01:29:52,159
this folding of
2351
01:29:49,960 –> 01:29:55,320
proteins and there’s some new version
2352
01:29:52,159 –> 01:29:56,880
out from Google and uh also Google
2353
01:29:55,320 –> 01:29:58,600
reportedly we talked about this before
2354
01:29:56,880 –> 01:30:01,119
is also advancing talks to acquire
2355
01:29:58,600 –> 01:30:03,159
HubSpot so that rumor for the $30
2356
01:30:01,119 –> 01:30:06,639
billion market cap HubSpot is out there
2357
01:30:03,159 –> 01:30:09,600
as well Friedberg as our resident
2358
01:30:06,639 –> 01:30:12,119
science Sultan uh our resident Sultan of
2359
01:30:09,600 –> 01:30:14,280
Science and as a Google
2360
01:30:12,119 –> 01:30:15,639
alumni pick either story and let’s go
2361
01:30:14,280 –> 01:30:16,880
for it yeah I mean I’m not sure there’s
2362
01:30:15,639 –> 01:30:18,400
much more to add on the HubSpot
2363
01:30:16,880 –> 01:30:20,000
acquisition rumors they are still just
2364
01:30:18,400 –> 01:30:22,119
rumors and I think we covered the topic
2365
01:30:20,000 –> 01:30:24,000
a couple weeks ago but I will say that
2366
01:30:22,119 –> 01:30:27,520
AlphaFold 3 that was just announced
2367
01:30:24,000 –> 01:30:31,119
today and demonstrated by Google um is a
2368
01:30:27,520 –> 01:30:34,080
real uh I would say breathtaking moment
2369
01:30:31,119 –> 01:30:36,920
um for biology for bioengineering for
2370
01:30:34,080 –> 01:30:38,840
human health for medicine and maybe I’ll
2371
01:30:36,920 –> 01:30:40,719
just take 30 seconds to kind of explain
2372
01:30:38,840 –> 01:30:43,639
it um you remember when they introduced
2373
01:30:40,719 –> 01:30:46,199
AlphaFold and AlphaFold 2 we talked
2374
01:30:43,639 –> 01:30:49,000
about how DNA codes for proteins so every
2375
01:30:46,199 –> 01:30:52,639
three letters of DNA codes for an amino
2376
01:30:49,000 –> 01:30:54,199
acid so a string of DNA codes for a
2377
01:30:52,639 –> 01:30:55,800
string of amino acids and that’s called
2378
01:30:54,199 –> 01:30:59,080
a gene that produces a
2379
01:30:55,800 –> 01:31:00,480
protein and that protein is basically a
2380
01:30:59,080 –> 01:31:01,679
like think about beads there’s 20
2381
01:31:00,480 –> 01:31:04,679
different types of beads 20 different
2382
01:31:01,679 –> 01:31:06,480
amino acids that can be strung together
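
A minimal illustration of the mapping Friedberg is describing here, assuming the standard genetic code; the codon table below is a small excerpt (not all 64 codons) and the DNA string is made up for the example:

```python
# Sketch: "every three letters of DNA codes for an amino acid" -- read a DNA
# string one codon (3 letters) at a time and map it to its amino acid.
# Partial codon table for illustration only; the real genetic code has 64 entries.
CODON_TABLE = {
    "ATG": "Met",  # start codon
    "TTT": "Phe", "TTC": "Phe",
    "GGT": "Gly", "GGC": "Gly",
    "GCT": "Ala", "GCC": "Ala",
    "TGG": "Trp",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna: str) -> list[str]:
    """Return the chain of amino acids ("beads") encoded by a DNA string,
    stopping at the first stop codon."""
    chain = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE.get(dna[i:i + 3], "???")
        if residue == "STOP":
            break
        chain.append(residue)
    return chain

print(translate("ATGGGTGCTTGGTAA"))  # ['Met', 'Gly', 'Ala', 'Trp']
```

The folded three-dimensional shape that this chain collapses into is the part we can't model deterministically, which is what AlphaFold predicts; the translation step itself is the easy, rule-based part.
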
2383
01:31:04,679 –> 01:31:08,239
and what happens is that necklace that
2384
01:31:06,480 –> 01:31:10,119
bead necklace basically collapses on
2385
01:31:08,239 –> 01:31:11,119
itself and all those little beads stick
2386
01:31:10,119 –> 01:31:12,840
together with each other in some
2387
01:31:11,119 –> 01:31:15,080
complicated way that we can’t
2388
01:31:12,840 –> 01:31:16,719
deterministically model and that creates
2389
01:31:15,080 –> 01:31:19,400
a three-dimensional structure which is
2390
01:31:16,719 –> 01:31:20,960
called A protein that molecule and that
2391
01:31:19,400 –> 01:31:23,199
molecule does something interesting it
2392
01:31:20,960 –> 01:31:24,800
can break apart other molecules it can
2393
01:31:23,199 –> 01:31:26,760
bind molecules it can move molecules
2394
01:31:24,800 –> 01:31:29,440
around so it’s basically the Machinery
2395
01:31:26,760 –> 01:31:32,600
of chemistry of biochemistry and so
2396
01:31:29,440 –> 01:31:34,239
proteins are what is encoded in our DNA
2397
01:31:32,600 –> 01:31:36,880
and then the proteins do all the work of
2398
01:31:34,239 –> 01:31:38,400
making living organisms so Google’s
2399
01:31:36,880 –> 01:31:41,520
AlphaFold project took three-dimensional
2400
01:31:38,400 –> 01:31:43,080
images of proteins and the DNA sequence
2401
01:31:41,520 –> 01:31:45,040
that codes for those proteins and then
2402
01:31:43,080 –> 01:31:46,280
they built a predictive model that
2403
01:31:45,040 –> 01:31:48,520
predicted the three-dimensional
2404
01:31:46,280 –> 01:31:50,320
structure of a protein from the DNA that
2405
01:31:48,520 –> 01:31:52,239
codes for it and that was a huge
2406
01:31:50,320 –> 01:31:54,639
breakthrough years ago what they just
2407
01:31:52,239 –> 01:31:56,920
announced with AlphaFold 3 today
2408
01:31:54,639 –> 01:31:58,480
is that they’re now including all small
2409
01:31:56,920 –> 01:32:00,840
molecules so all the other little
2410
01:31:58,480 –> 01:32:02,920
molecules that go into chemistry and
2411
01:32:00,840 –> 01:32:05,280
biology that drive the function of
2412
01:32:02,920 –> 01:32:07,320
everything we see around us and the way
2413
01:32:05,280 –> 01:32:09,239
that all those molecules actually bind
2414
01:32:07,320 –> 01:32:11,719
and fit together is part of the
2415
01:32:09,239 –> 01:32:12,920
predictive model why is that important
2416
01:32:11,719 –> 01:32:15,000
well let’s say that you’re designing a
2417
01:32:12,920 –> 01:32:16,400
new drug and it’s a protein based drug
2418
01:32:15,000 –> 01:32:18,360
which is a biologic drug which most drugs
2419
01:32:16,400 –> 01:32:20,600
are today you could find a biologic drug
2420
01:32:18,360 –> 01:32:22,199
that binds to a cancer cell and then
2421
01:32:20,600 –> 01:32:24,920
you’ll spend 10 years going to clinical
2422
01:32:22,199 –> 01:32:27,440
trials and billions of dollars later you find
2423
01:32:24,920 –> 01:32:28,960
out that that protein accidentally binds
2424
01:32:27,440 –> 01:32:30,560
to other stuff and hurts other stuff in
2425
01:32:28,960 –> 01:32:32,560
the body and that's an off-target effect
2426
01:32:30,560 –> 01:32:33,920
or a side effect and that drug is pulled
2427
01:32:32,560 –> 01:32:36,199
from the clinical trials and it never
2428
01:32:33,920 –> 01:32:39,080
goes to market most drugs go through
2429
01:32:36,199 –> 01:32:41,360
that process they are actually tested in
2430
01:32:39,080 –> 01:32:42,880
animals and then in humans and we
2431
01:32:41,360 –> 01:32:45,000
find all these side effects that arise
2432
01:32:42,880 –> 01:32:47,040
from those drugs because we don’t know
2433
01:32:45,000 –> 01:32:48,840
how those drugs are going to bind or
2434
01:32:47,040 –> 01:32:50,800
interact with other things in our
2435
01:32:48,840 –> 01:32:52,800
biochemistry and we only discover it
2436
01:32:50,800 –> 01:32:54,960
after we put it in the body but now we can
2437
01:32:52,800 –> 01:32:56,199
actually model that with software we can
2438
01:32:54,960 –> 01:32:57,760
take that drug we can create a
2439
01:32:56,199 –> 01:33:00,000
three-dimensional representation of it
2440
01:32:57,760 –> 01:33:01,520
using the software and we can model how
2441
01:33:00,000 –> 01:33:03,719
that drug might interact with all the
2442
01:33:01,520 –> 01:33:06,400
other cells all the other proteins all
2443
01:33:03,719 –> 01:33:08,040
the other small molecules in the body to
2444
01:33:06,400 –> 01:33:10,199
find all the off-target effects that may
2445
01:33:08,040 –> 01:33:12,639
arise and decide whether or not that
2446
01:33:10,199 –> 01:33:15,440
presents a good drug candidate that is
2447
01:33:12,639 –> 01:33:17,560
one example of how this capability can
2448
01:33:15,440 –> 01:33:20,119
be used and there are many many others
2449
01:33:17,560 –> 01:33:22,159
including creating new proteins that
2450
01:33:20,119 –> 01:33:24,159
could be used to bind molecules or stick
2451
01:33:22,159 –> 01:33:26,480
molecules together or new proteins that
2452
01:33:24,159 –> 01:33:29,159
could be designed to rip molecules apart
2453
01:33:26,480 –> 01:33:31,000
we can now predict the function of
2454
01:33:29,159 –> 01:33:33,320
three-dimensional molecules using this
2455
01:33:31,000 –> 01:33:35,760
capability which opens up all of
2456
01:33:33,320 –> 01:33:38,560
the software-based design of chemistry
2457
01:33:35,760 –> 01:33:41,199
of biology of drugs and it really is an
2458
01:33:38,560 –> 01:33:42,480
incredible breakthrough moment.
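As a rough sketch of the in-silico triage step described above, the snippet below keeps a drug candidate only if its predicted binding to the intended target is strong and its predicted binding to everything else in a panel is weak; the protein names, scores, and thresholds are all invented stand-ins for what a structure- or interaction-prediction model would output, not anyone's actual pipeline.

from typing import Dict

def passes_off_target_screen(
    predicted_affinity: Dict[str, float],  # protein name -> predicted binding score (higher = tighter)
    target: str,
    min_target_score: float = 0.8,
    max_off_target_score: float = 0.3,
) -> bool:
    """Keep a candidate only if it hits the intended target and nothing else in the panel."""
    if predicted_affinity.get(target, 0.0) < min_target_score:
        return False  # does not bind the thing we actually want to bind
    off_targets = (s for p, s in predicted_affinity.items() if p != target)
    return all(score <= max_off_target_score for score in off_targets)

# Hypothetical candidate: it binds the tumor antigen well, but it also binds an
# off-target protein strongly, so it gets filtered out before years of trials are spent on it.
candidate = {"tumor_antigen": 0.92, "off_target_ion_channel": 0.55, "serum_albumin": 0.10}
print(passes_off_target_screen(candidate, target="tumor_antigen"))  # -> False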
2459
01:33:41,199 –> 01:33:45,119
The interesting thing that happened though
2460
01:33:42,480 –> 01:33:47,679
is that Google Alphabet has a subsidiary
2461
01:33:45,119 –> 01:33:49,920
called Isomorphic Labs it is a drug
2462
01:33:47,679 –> 01:33:52,159
development subsidiary of Alphabet and
2463
01:33:49,920 –> 01:33:54,480
they’ve basically kept all the IP for
2464
01:33:52,159 –> 01:33:56,880
AlphaFold 3 in Isomorphic
2465
01:33:54,480 –> 01:33:58,639
so Google is going to monetize the heck
2466
01:33:56,880 –> 01:34:00,719
out of this capability and what they
2467
01:33:58,639 –> 01:34:03,080
made available was not open source code
2468
01:34:00,719 –> 01:34:04,920
but a web-based viewer that scientists
2469
01:34:03,080 –> 01:34:06,840
for quote non-commercial purposes can
2470
01:34:04,920 –> 01:34:08,360
use to do some fundamental research in a
2471
01:34:06,840 –> 01:34:09,760
web-based viewer and make some
2472
01:34:08,360 –> 01:34:12,040
experiments and try stuff out and see how
2473
01:34:09,760 –> 01:34:14,080
interactions might occur but no one can
2474
01:34:12,040 –> 01:34:16,520
use it for commercial use only Google’s
2475
01:34:14,080 –> 01:34:18,840
Isomorphic Labs can so number one it's
2476
01:34:16,520 –> 01:34:21,239
an incredible demonstration of what AI
2477
01:34:18,840 –> 01:34:23,760
outside of LLMs which we just talked
2478
01:34:21,239 –> 01:34:25,360
about with Sam today and obviously
2479
01:34:23,760 –> 01:34:27,840
talked about other models but LLMs being
2480
01:34:25,360 –> 01:34:29,800
kind of this consumer text predictive
2481
01:34:27,840 –> 01:34:32,080
model capability but outside of that
2482
01:34:29,800 –> 01:34:34,520
there’s this capability in things like
2483
01:34:32,080 –> 01:34:36,920
chemistry with these new AI models that
2484
01:34:34,520 –> 01:34:39,080
can be trained and built to predict
2485
01:34:36,920 –> 01:34:41,080
things like three-dimensional chemical
2486
01:34:39,080 –> 01:34:43,520
interactions that is going to open up an
2487
01:34:41,080 –> 01:34:45,199
entirely New Era for you know human
2488
01:34:43,520 –> 01:34:46,440
progress and I think that’s what’s so
2489
01:34:45,199 –> 01:34:48,440
exciting I think the other side of this
2490
01:34:46,440 –> 01:34:50,159
is Google is hugely advantaged and they
2491
01:34:48,440 –> 01:34:51,280
just showed the world a little bit about
2492
01:34:50,159 –> 01:34:52,400
some of these jewels that they have in
2493
01:34:51,280 –> 01:34:54,080
the treasure chest and they’re like look
2494
01:34:52,400 –> 01:34:55,480
at what we got we're going to make all these
2495
01:34:54,080 –> 01:34:57,679
drugs and they’ve got Partnerships with
2496
01:34:55,480 –> 01:34:59,719
all these pharma companies at Isomorphic Labs
2497
01:34:57,679 –> 01:35:01,320
that they’ve talked about and it’s going
2498
01:34:59,719 –> 01:35:03,719
to usher in a new era of drug
2499
01:35:01,320 –> 01:35:05,159
development design for human health so
2500
01:35:03,719 –> 01:35:06,639
all in all I’d say it’s a pretty like
2501
01:35:05,159 –> 01:35:08,199
astounding day a lot of people are going
2502
01:35:06,639 –> 01:35:10,119
crazy over the capability that they just
2503
01:35:08,199 –> 01:35:12,000
demonstrated and then it begs all these
2504
01:35:10,119 –> 01:35:13,320
really interesting questions around like
2505
01:35:12,000 –> 01:35:14,199
you know what's Google going to do with it
2506
01:35:13,320 –> 01:35:15,920
and how much value is going to be
2507
01:35:14,199 –> 01:35:17,679
created here so anyway I thought it was
2508
01:35:15,920 –> 01:35:19,159
a great story and I just rambled on for
2509
01:35:17,679 –> 01:35:20,960
a couple minutes but I don’t know it’s
2510
01:35:19,159 –> 01:35:22,920
pretty cool super interesting is this
2511
01:35:20,960 –> 01:35:25,320
AI capable of making a Science Corner
2512
01:35:22,920 –> 01:35:28,800
that David Sacks pays attention
2513
01:35:25,320 –> 01:35:30,600
to well it will predict the cure
2514
01:35:28,800 –> 01:35:33,199
I think for the common cold and for
2515
01:35:30,600 –> 01:35:36,320
herpes so he should pay attention
2516
01:35:33,199 –> 01:35:38,679
absolutely Folding Cells is the casual
2517
01:35:36,320 –> 01:35:40,360
game that Sachs just downloaded and is
2518
01:35:38,679 –> 01:35:42,119
playing how many uh how many chess moves
2519
01:35:40,360 –> 01:35:43,159
did you make during that segment Sacks
2520
01:35:42,119 –> 01:35:44,119
sorry let me just say one more thing do
2521
01:35:43,159 –> 01:35:46,440
you guys remember we talked about
2522
01:35:44,119 –> 01:35:48,360
Yamanaka factors and how challenging it
2523
01:35:46,440 –> 01:35:51,320
is to basically we can reverse aging if
2524
01:35:48,360 –> 01:35:53,960
we can get the right proteins into cells
2525
01:35:51,320 –> 01:35:56,760
to tune the expression of certain genes to
2526
01:35:53,960 –> 01:35:58,800
make those cells youthful right now it’s
2527
01:35:56,760 –> 01:36:00,679
a shotgun approach to trying millions of
2528
01:35:58,800 –> 01:36:01,800
compounds and combinations of compounds
2529
01:36:00,679 –> 01:36:03,560
to do that there's a lot of companies
2530
01:36:01,800 –> 01:36:05,480
actually trying to do this right now to
2531
01:36:03,560 –> 01:36:08,480
come up with a fountain of youth type
2532
01:36:05,480 –> 01:36:10,119
product we can now simulate that so with
2533
01:36:08,480 –> 01:36:13,360
this system one of the things that this
2534
01:36:10,119 –> 01:36:15,560
AlphaFold 3 can do is predict what molecules
2535
01:36:13,360 –> 01:36:17,639
will bind and promote certain sequences
2536
01:36:15,560 –> 01:36:19,000
of DNA which is exactly what we try and
2537
01:36:17,639 –> 01:36:21,080
do with the Yamanaka factor-based
2538
01:36:19,000 –> 01:36:23,600
expression systems and find ones that
2539
01:36:21,080 –> 01:36:25,320
won't trigger off-target expression so
2540
01:36:23,600 –> 01:36:27,679
meaning we can now go through the search
2541
01:36:25,320 –> 01:36:29,480
space in software of creating a
2542
01:36:27,679 –> 01:36:31,159
combination of molecules that
2543
01:36:29,480 –> 01:36:33,400
theoretically could unlock this Fountain
2544
01:36:31,159 –> 01:36:35,719
of youth to de-age all the cells in the
2545
01:36:33,400 –> 01:36:37,480
body and introduce an extraordinary kind
2546
01:36:35,719 –> 01:36:39,080
of health benefit and that’s just again
2547
01:36:37,480 –> 01:36:40,280
one example of the many things that are
2548
01:36:39,080 –> 01:36:42,000
possible, incredible, with this sort of
2549
01:36:40,280 –> 01:36:43,280
platform and I gotta be
2550
01:36:42,000 –> 01:36:46,639
honest I’m really just sort of skimming
2551
01:36:43,280 –> 01:36:48,600
the surface here of what this can do the
2552
01:36:46,639 –> 01:36:50,520
capabilities and the impact are going to
2553
01:36:48,600 –> 01:36:51,480
be like I don’t know I know I say this
2554
01:36:50,520 –> 01:36:53,800
sort of stuff a lot but it’s going to be
2555
01:36:51,480 –> 01:36:55,639
pretty profound there's um on the blog
2556
01:36:53,800 –> 01:36:57,400
post they have this incredible video
2557
01:36:55,639 –> 01:37:00,040
that they show of
2558
01:36:57,400 –> 01:37:03,480
the coronavirus that causes a common
2559
01:37:00,040 –> 01:37:06,920
cold I think 7PNM the spike protein and not
2560
01:37:03,480 –> 01:37:09,199
only did they literally like predict it
2561
01:37:06,920 –> 01:37:12,960
accurately they also predicted how it
2562
01:37:09,199 –> 01:37:15,440
interacts with an antibody with a sugar
2563
01:37:12,960 –> 01:37:17,600
it’s nuts so you could see a world where
2564
01:37:15,440 –> 01:37:19,080
like I don’t know you just get a vaccine
2565
01:37:17,600 –> 01:37:21,679
for the cold and it’s kind of like you
2566
01:37:19,080 –> 01:37:23,760
never have colds again amazing I mean
2567
01:37:21,679 –> 01:37:25,239
simple stuff but so powerful you can
2568
01:37:23,760 –> 01:37:27,639
filter out stuff that has off-target
2569
01:37:25,239 –> 01:37:29,080
effects so so much of drug discovery and
2570
01:37:27,639 –> 01:37:31,199
all the side effect stuff can start to
2571
01:37:29,080 –> 01:37:33,360
be solved for in silico and you could
2572
01:37:31,199 –> 01:37:34,960
think about using a model like this to
2573
01:37:33,360 –> 01:37:37,360
run
2574
01:37:34,960 –> 01:37:39,520
extraordinarily large simulations in a
2575
01:37:37,360 –> 01:37:42,199
search space of chemistry to find stuff
2576
01:37:39,520 –> 01:37:44,000
that does things in the body that can
2577
01:37:42,199 –> 01:37:45,560
unlock you know all these benefits can
2578
01:37:44,000 –> 01:37:48,719
do all sorts of amazing things to
2579
01:37:45,560 –> 01:37:51,480
destroy cancer to destroy viruses to
2580
01:37:48,719 –> 01:37:53,080
repair cells to de-age cells and this is
2581
01:37:51,480 –> 01:37:57,280
a hundred billion dollar business they
2582
01:37:53,080 –> 01:37:58,480
say, alone.
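A hedged sketch of what "searching the space in software" could look like: every number below is invented, standing in for a model's predicted on-target versus off-target activity for each compound, and the code simply enumerates small combinations and ranks them, which is the shape of the search being described rather than a real screening pipeline.

from itertools import combinations

# compound -> (predicted on-target effect, predicted off-target effect); all numbers made up
COMPOUNDS = {
    "cmpd_A": (0.40, 0.05),
    "cmpd_B": (0.35, 0.30),
    "cmpd_C": (0.25, 0.02),
    "cmpd_D": (0.10, 0.01),
}

def cocktail_score(cocktail):
    """Naively assume effects add, and penalize off-target activity twice as heavily."""
    on = sum(COMPOUNDS[c][0] for c in cocktail)
    off = sum(COMPOUNDS[c][1] for c in cocktail)
    return on - 2.0 * off

# Enumerate every 2- and 3-compound cocktail and pick the best-scoring one.
cocktails = [c for k in (2, 3) for c in combinations(COMPOUNDS, k)]
best = max(cocktails, key=cocktail_score)
print(best, round(cocktail_score(best), 3))  # -> ('cmpd_A', 'cmpd_C', 'cmpd_D') 0.59

In a real setting the two scores per compound would come from a predictive model rather than a hand-written table, and the candidate space would be enormously larger, which is exactly why doing the first pass in software matters.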
2583
01:37:57,280 –> 01:38:01,280
I feel like this is where, I've said this before, I think Google's
2584
01:37:58,480 –> 01:38:03,280
got this like portfolio of like
2585
01:38:01,280 –> 01:38:05,480
quiet
2586
01:38:03,280 –> 01:38:06,880
extraordinary, what if they hit,
2587
01:38:05,480 –> 01:38:08,880
and I think the fact that they didn’t
2588
01:38:06,880 –> 01:38:10,480
open source everything in this says a
2589
01:38:08,880 –> 01:38:12,320
lot about their intentions yeah yeah
2590
01:38:10,480 –> 01:38:14,960
open source when you’re behind closed
2591
01:38:12,320 –> 01:38:16,920
source lock it up when you're ahead but
2592
01:38:14,960 –> 01:38:19,119
show Yamanaka actually interestingly
2593
01:38:16,920 –> 01:38:21,000
Yamanaka is the Japanese whiskey that
2594
01:38:19,119 –> 01:38:24,119
Sacks serves on his plane as well it's
2595
01:38:21,000 –> 01:38:26,119
delicious I love that Hokkaido yak
2596
01:38:24,119 –> 01:38:27,719
Jason I feel like if you didn’t find
2597
01:38:26,119 –> 01:38:29,920
your way to Silicon Valley you could be
2598
01:38:27,719 –> 01:38:32,119
like a Vegas Lounge comedy guy
2599
01:38:29,920 –> 01:38:33,719
absolutely for sure yeah I was actually
2600
01:38:32,119 –> 01:38:35,920
yeah somebody said I should do like
2601
01:38:33,719 –> 01:38:37,679
those 1950s talk shows where
2602
01:38:35,920 –> 01:38:40,239
the guys would do like the stage
2603
01:38:37,679 –> 01:38:42,719
show somebody told me I should do um
2604
01:38:40,239 –> 01:38:44,040
like Spalding Gray Eric Bogosian style
2605
01:38:42,719 –> 01:38:45,719
stuff I don’t know if you guys remember
2606
01:38:44,040 –> 01:38:47,480
like the uh the monologuists from the
2607
01:38:45,719 –> 01:38:48,840
80s in New York I was like oh that’s
2608
01:38:47,480 –> 01:38:50,679
interesting maybe all right everybody
2609
01:38:48,840 –> 01:38:53,679
thanks for tuning in to the world’s
2610
01:38:50,679 –> 01:38:56,040
number one podcast can you believe we
2611
01:38:53,679 –> 01:38:59,360
did it Chamath uh the number one podcast
2612
01:38:56,040 –> 01:39:02,520
in the world and the all-in summit the
2613
01:38:59,360 –> 01:39:04,679
Ted killer if you are going to Ted
2614
01:39:02,520 –> 01:39:06,199
congratulations for genuflecting if you
2615
01:39:04,679 –> 01:39:08,679
want to talk about real issues come to
2616
01:39:06,199 –> 01:39:11,239
the all-in summit and if you are
2617
01:39:08,679 –> 01:39:13,800
protesting at the all-in summit let us
2618
01:39:11,239 –> 01:39:15,880
know uh what mock meat you would like to
2619
01:39:13,800 –> 01:39:18,320
have Friedberg is setting up mock meat
2620
01:39:15,880 –> 01:39:20,239
stations for all of our protesters and
2621
01:39:18,320 –> 01:39:23,639
what milk you would like yeah all vegan
2622
01:39:20,239 –> 01:39:26,119
if you if you're oat milk nut milk
2623
01:39:23,639 –> 01:39:28,400
come five different kinds of xanthan gum
2624
01:39:26,119 –> 01:39:30,440
you can choose from all of the nut milks you
2625
01:39:28,400 –> 01:39:32,760
could want and then they’ll be
2626
01:39:30,440 –> 01:39:34,719
mindful can we have some soy lecithin
2627
01:39:32,760 –> 01:39:37,199
please yes on the south lawn we’ll have
2628
01:39:34,719 –> 01:39:38,960
the goat yoga going on so just please
2629
01:39:37,199 –> 01:39:41,560
note that the goat yoga will be going on
2630
01:39:38,960 –> 01:39:43,880
for all of you it’s very thoughtful for
2631
01:39:41,560 –> 01:39:46,199
you to make sure that our protesters are
2632
01:39:43,880 –> 01:39:48,560
going to be well well fed well taken
2633
01:39:46,199 –> 01:39:50,760
care of yes we're actually Friedberg is
2634
01:39:48,560 –> 01:39:54,000
working on the protester gift bags the
2635
01:39:50,760 –> 01:39:57,800
protester gift bags they're made of
2636
01:39:54,000 –> 01:39:59,440
Yak folding proteins so you’re good
2637
01:39:57,800 –> 01:40:02,119
folding proteins I think I saw them open
2638
01:39:59,440 –> 01:40:05,599
for the Smashing Pumpkins in
2639
01:40:02,119 –> 01:40:07,639
2003 on fire on fire enough I’ll be here
2640
01:40:05,599 –> 01:40:09,639
for three more nights love you boys
2641
01:40:07,639 –> 01:40:11,280
byebye love you besties is this the All-In
2642
01:40:09,639 –> 01:40:15,040
Pod open mic night what's going on
2643
01:40:11,280 –> 01:40:15,040
it’s basically I’m just
2644
01:40:15,119 –> 01:40:22,440
bored let your winners
2645
01:40:17,800 –> 01:40:22,440
ride Rainman David
2646
01:40:23,440 –> 01:40:28,000
said we open source it to the fans and
2647
01:40:25,480 –> 01:40:30,360
they’ve just gone crazy with it love
2648
01:40:28,000 –> 01:40:30,360
queen
2649
01:40:32,300 –> 01:40:38,040
[Music]
2650
01:40:35,040 –> 01:40:42,320
of Besties
2651
01:40:38,040 –> 01:40:42,320
are my dog taking
2652
01:40:42,800 –> 01:40:48,080
driveway oh man myit will meet me we
2653
01:40:46,520 –> 01:40:49,719
should all just get a room and just have
2654
01:40:48,080 –> 01:40:51,400
one big huge orgy cuz they're all just
2655
01:40:49,719 –> 01:40:52,760
useless it’s like this like sexual
2656
01:40:51,400 –> 01:40:53,320
tension that they just need to release
2657
01:40:52,760 –> 01:40:56,430
somehow
2658
01:40:53,320 –> 01:40:56,430
[Music]
2659
01:40:58,800 –> 01:41:02,360
your we need to get
2660
01:41:04,350 –> 01:41:08,419
[Music]
2661
01:41:09,000 –> 01:41:15,159
merch all right that’s episode 178 and
2662
01:41:12,520 –> 01:41:17,119
now the plugs the all-in Summit is
2663
01:41:15,159 –> 01:41:19,520
taking place in Los Angeles on September
2664
01:41:17,119 –> 01:41:23,400
8th through the 10th you can apply for a
2665
01:41:19,520 –> 01:41:25,760
ticket at summit.allinpodcast.co
2666
01:41:23,400 –> 01:41:27,920
scholarships will be coming soon if you
2667
01:41:25,760 –> 01:41:30,199
want to see the four of us interview Sam
2668
01:41:27,920 –> 01:41:33,080
Altman you can actually see the video of
2669
01:41:30,199 –> 01:41:35,560
this podcast on YouTube
2670
01:41:33,080 –> 01:41:38,080
youtube.com/@allin or just search all-in
2671
01:41:35,560 –> 01:41:39,960
podcast and hit the alert Bell and
2672
01:41:38,080 –> 01:41:43,080
you’ll get updates when we post we’re
2673
01:41:39,960 –> 01:41:45,719
doing a Q&A episode live when the
2674
01:41:43,080 –> 01:41:48,480
YouTube channel hits 500,000 and we’re
2675
01:41:45,719 –> 01:41:49,920
going to do a party in Vegas my
2676
01:41:48,480 –> 01:41:51,840
understanding when we hit a million
2677
01:41:49,920 –> 01:41:53,840
subscribers so look for that as well you
2678
01:41:51,840 –> 01:41:58,800
can follow us on X at
2679
01:41:53,840 –> 01:42:01,599
x.com/theallinpod TikTok is all_in_
2680
01:41:58,800 –> 01:42:03,480
tok Instagram theallinpod and on
2681
01:42:01,599 –> 01:42:06,920
LinkedIn just search for the all-in
2682
01:42:03,480 –> 01:42:08,840
podcast you can follow Chamath at x.com/chamath
2683
01:42:06,920 –> 01:42:11,560
and you can sign up for a substack at
2684
01:42:08,840 –> 01:42:14,320
chamath.substack.com Friedberg can be
2685
01:42:11,560 –> 01:42:16,880
followed at x.com/friedberg and Ohalo is
2686
01:42:14,320 –> 01:42:19,719
hiring click on the careers page at
2687
01:42:16,880 –> 01:42:22,639
ohalogenetics.com and you can follow Sacks at
2688
01:42:19,719 –> 01:42:24,440
x.com/DavidSacks Sacks recently spoke at
2689
01:42:22,639 –> 01:42:26,080
the American moment conference and
2690
01:42:24,440 –> 01:42:28,360
people are going crazy for it it’s
2691
01:42:26,080 –> 01:42:32,080
pinned to his tweet on his X profile I’m
2692
01:42:28,360 –> 01:42:33,800
Jason Calacanis I am x.com/Jason and if you
2693
01:42:32,080 –> 01:42:36,560
want to see pictures of my Bulldogs and
2694
01:42:33,800 –> 01:42:39,000
the food I’m eating go to instagram.com
2695
01:42:36,560 –> 01:42:40,960
jason in the first-name club you can listen
2696
01:42:39,000 –> 01:42:42,480
to my other podcast this week in
2697
01:42:40,960 –> 01:42:44,239
startups just search for it on YouTube
2698
01:42:42,480 –> 01:42:47,040
or your favorite podcast player we are
2699
01:42:44,239 –> 01:42:48,639
hiring a researcher apply to be a
2700
01:42:47,040 –> 01:42:50,400
researcher you'd be doing primary research
2701
01:42:48,639 –> 01:42:52,400
and working with me and producer Nick
2702
01:42:50,400 –> 01:42:54,360
working in data and Science and being
2703
01:42:52,400 –> 01:42:57,760
able to do great great research Finance
2704
01:42:54,360 –> 01:42:59,119
etc. allinpodcast.co/research it's a
2705
01:42:57,760 –> 01:43:01,280
full-time job working with us the
2706
01:42:59,119 –> 01:43:04,320
besties we'll see you all next time on
2707
01:43:01,280 –> 01:43:04,320
the All-In podcast