6 big ethical questions about the future of AI | Genevieve Bell

99,047 views · 2021-01-14

TED


์•„๋ž˜ ์˜๋ฌธ์ž๋ง‰์„ ๋”๋ธ”ํด๋ฆญํ•˜์‹œ๋ฉด ์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค.

Translated by: erin choi · Reviewed by: Jihyeon J. Kim
00:13
Let me tell you a story about artificial intelligence.
00:16
There's a building in Sydney at 1 Bligh Street.
00:19
It houses lots of government apartments
00:21
and busy people.
00:23
From the outside, it looks like something out of American science fiction:
00:26
all gleaming glass and curved lines,
00:29
and a piece of orange sculpture.
00:31
On the inside, it has excellent coffee on the ground floor
00:34
and my favorite lifts in Sydney.
00:36
They're beautiful;
00:37
they look almost alive.
00:40
And it turns out I'm fascinated with lifts.
00:42
For lots of reasons.
00:43
But because lifts are one of the places you can see the future.
00:46
In the 21st century, lifts are interesting
00:49
because they're one of the first places that AI will touch you
00:52
without you even knowing it happened.
00:54
In many buildings all around the world,
00:57
the lifts are running a set of algorithms.
01:00
A form of protoartificial intelligence.
01:03
That means before you even walk up to the lift to press the button,
01:07
it's anticipated you being there.
01:09
It's already rearranging all the carriages.
01:12
Always going down, to save energy,
01:14
and to know where the traffic is going to be.
01:16
By the time you've actually pressed the button,
01:18
you're already part of an entire system
01:20
that's making sense of people and the environment
01:22
and the building and the built world.
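To make that idea concrete, here is a minimal sketch, in Python, of the kind of anticipatory dispatch the talk describes. It is illustrative only: the class name, the frequency-based prediction rule, and the nearest-car assignment are invented for this example, not the actual software running at 1 Bligh Street or in any real lift.

```python
# Toy sketch of anticipatory lift dispatch (illustrative only; invented names,
# not the real controller at 1 Bligh Street or any vendor's product).
from collections import Counter

class AnticipatoryLiftController:
    def __init__(self, num_cars: int):
        self.car_positions = [0] * num_cars  # all cars start at the ground floor
        self.demand_history = Counter()      # floor -> how often calls originate there

    def record_call(self, floor: int) -> None:
        # Every observed hall call becomes data for predicting future demand.
        self.demand_history[floor] += 1

    def reposition_idle_cars(self) -> None:
        # Before anyone presses a button, park cars at the historically
        # busiest floors -- the "it's anticipated you being there" step.
        busiest = [floor for floor, _ in
                   self.demand_history.most_common(len(self.car_positions))]
        for car, floor in enumerate(busiest):
            self.car_positions[car] = floor

    def assign_car(self, floor: int) -> int:
        # By the time you press the button, the nearest pre-positioned
        # car is chosen -- you are already part of the system.
        car = min(range(len(self.car_positions)),
                  key=lambda c: abs(self.car_positions[c] - floor))
        self.car_positions[car] = floor
        return car
```

Even this toy version shows the point: the decision about where the carriages wait has already been made from past behavior before a rider ever touches the button.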
01:25
I know when we talk about AI, we often talk about a world of robots.
01:29
It's easy for our imaginations to be occupied with science fiction,
01:33
well, over the last 100 years.
01:35
I say AI and you think "The Terminator."
01:38
Somewhere, for us, making the connection between AI and the built world,
01:43
that's a harder story to tell.
01:45
But the reality is AI is already everywhere around us.
01:49
And in many places.
01:50
It's in buildings and in systems.
01:53
More than 200 years of industrialization
01:55
suggest that AI will find its way to systems-level scale relatively easily.
02:00
After all, one telling of that history
02:02
suggests that all you have to do is find a technology,
02:04
achieve scale and revolution will follow.
02:07
The story of mechanization, automation and digitization
02:12
all point to the role of technology and its importance.
02:16
Those stories of technological transformation
02:18
make scale seem, well, normal.
02:21
Or expected.
02:23
And stable.
02:24
And sometimes even predictable.
02:27
But it also puts the focus squarely on technology and technology change.
02:31
But I believe that scaling a technology and building a system
02:35
requires something more.
02:38
We founded the 3Ai Institute at the Australian National University
02:42
in September 2017.
02:44
It has one deceptively simple mission:
02:46
to establish a new branch of engineering
02:48
to take AI safely, sustainably and responsibly to scale.
02:53
But how do you build a new branch of engineering in the 21st century?
02:56
Well, we're teaching it into existence
02:58
through an experimental education program.
03:01
We're researching it into existence
03:03
with locations as diverse as Shakespeare's birthplace,
03:06
the Great Barrier Reef,
03:08
not to mention one of Australia's largest autonomous mines.
03:11
And we're theorizing it into existence,
03:14
paying attention to the complexities of cybernetic systems.
03:18
We're working to build something new and something useful.
03:21
Something to create the next generation of critical thinkers and critical doers.
03:25
And we're doing all of that
03:27
through a richer understanding of AI's many pasts and many stories.
03:31
And by working collaboratively and collectively
03:35
through teaching and research and engagement,
03:38
and by focusing as much on the framing of the questions
03:41
as the solving of the problems.
03:43
We're not making a single AI,
03:45
we're making the possibilities for many.
03:48
And we're actively working to decolonize our imaginations
03:51
and to build a curriculum and a pedagogy
03:53
that leaves room for a range of different conversations and possibilities.
03:58
We are making and remaking.
04:00
And I know we're always a work in progress.
04:04
But here's a little glimpse
04:05
into how we're approaching that problem of scaling a future.
04:09
We start by making sure we're grounded in our own history.
04:13
In December of 2018,
04:14
I took myself up to the town of Brewarrina
04:17
on the New South Wales-Queensland border.
04:19
This place was a meeting place for Aboriginal people,
04:22
for different groups,
04:23
to gather, have ceremonies, meet, to be together.
04:26
There, on the Barwon River, there's a set of fish weirs
04:29
that are one of the oldest and largest systems
04:31
of Aboriginal fish traps in Australia.
04:34
This system is comprised of 1.8 kilometers of stone walls
04:37
shaped like a series of fishnets
04:39
with the "U"s pointing down the river,
04:41
allowing fish to be trapped at different heights of the water.
04:44
They're also fish holding pens with different-height walls for storage,
04:47
designed to change the way the water moves
04:50
and to be able to store big fish and little fish
04:52
and to keep those fish in cool, clear running water.
04:56
This fish-trap system was a way to ensure that you could feed people
04:59
as they gathered there in a place that was both a meeting of rivers
05:02
and a meeting of cultures.
05:04
It isn't about the rocks or even the traps per se.
05:08
It is about the system that those traps created.
05:11
One that involves technical knowledge,
05:13
cultural knowledge
05:14
and ecological knowledge.
05:16
This system is old.
05:18
Some archaeologists think it's as old as 40,000 years.
05:21
The last time we have its recorded uses is in the nineteen-teens.
05:26
It's had remarkable longevity and incredible scale.
05:29
And it's an inspiration to me.
05:32
And a photo of the weir is on our walls here at the Institute,
05:35
to remind us of the promise and the challenge
05:37
of building something meaningful.
05:39
And to remind us that we're building systems
05:41
in a place where people have built systems
05:43
and sustained those same systems for generations.
05:46
It isn't just our history,
05:48
it's our legacy as we seek to establish a new branch of engineering.
05:52
To build on that legacy and our sense of purpose,
05:55
I think we need a clear framework for asking questions about the future.
05:59
Questions for which there aren't ready or easy answers.
06:03
Here, the point is the asking of the questions.
06:06
We believe you need to go beyond the traditional approach
06:09
of problem-solving,
06:10
to the more complicated one of question asking
06:13
and question framing.
06:15
Because in so doing, you open up all kinds of new possibilities
06:18
and new challenges.
06:20
For me, right now,
06:22
there are six big questions that frame our approach
06:25
for taking AI safely, sustainably and responsibly to scale.
06:28
Questions about autonomy,
06:30
agency, assurance,
06:32
indicators, interfaces and intentionality.
06:36
The first question we ask is a simple one.
06:39
Is the system autonomous?
06:41
Think back to that lift on Bligh Street.
06:43
The reality is, one day, that lift may be autonomous.
06:46
Which is to say it will be able to act without being told to act.
06:50
But it isn't fully autonomous, right?
06:52
It can't leave that Bligh Street building
06:54
and wander down to Circular Quay for a beer.
06:58
It goes up and down, that's all.
07:00
But it does it by itself.
07:02
It's autonomous in that sense.
07:05
The second question we ask:
07:08
does this system have agency?
07:10
Does this system have controls and limits that live somewhere
07:14
that prevent it from doing certain kinds of things under certain conditions?
07:19
The reality with lifts, that's absolutely the case.
07:22
Think of any lift you've been in.
07:24
There's a red keyslot in the elevator carriage
07:26
that an emergency services person can stick a key into
07:29
and override the whole system.
07:31
But what happens when that system is AI-driven?
07:34
Where does the key live?
07:35
Is it a physical key, is it a digital key?
07:37
Who gets to use it?
07:39
Is that the emergency services people?
07:40
And how would you know if that was happening?
07:43
How would all of that be manifested to you in the lift?
07:47
The third question we ask is how do we think about assurance.
07:51
How do we think about all of its pieces:
07:53
safety, security, trust, risk, liability, manageability,
07:57
explicability, ethics, public policy, law, regulation?
08:01
And how would we tell you that the system was safe and functioning?
08:06
The fourth question we ask
08:08
is what would be our interfaces with these AI-driven systems.
08:11
Will we talk to them?
08:12
Will they talk to us, will they talk to each other?
08:14
And what will it mean to have a series of technologies we've known,
08:17
for some of us, all our lives,
08:19
now suddenly behave in entirely different ways?
08:21
Lifts, cars, the electrical grid, traffic lights, things in your home.
08:27
The fifth question for these AI-driven systems:
08:30
What will the indicators be to show that they're working well?
08:33
Two hundred years of the industrial revolution
08:35
tells us that the two most important ways to think about a good system
08:38
are productivity and efficiency.
08:41
In the 21st century,
08:42
you might want to expand that just a little bit.
08:45
Is the system sustainable,
08:46
is it safe, is it responsible?
08:48
Who gets to judge those things for us?
08:51
Users of the systems would want to understand
08:53
how these things are regulated, managed and built.
08:57
And then there's the final, perhaps most critical question
09:00
that you need to ask of these new AI systems.
09:03
What's its intent?
09:05
What's the system designed to do
09:07
and who said that was a good idea?
09:09
Or put another way,
09:10
what is the world that this system is building,
09:13
how is that world imagined,
09:15
and what is its relationship to the world we live in today?
09:18
Who gets to be part of that conversation?
09:20
Who gets to articulate it?
09:22
How does it get framed and imagined?
09:26
There are no simple answers to these questions.
09:29
Instead, they frame what's possible
09:31
and what we need to imagine,
09:33
design, build, regulate and even decommission.
09:37
They point us in the right directions
09:39
and help us on a path to establish a new branch of engineering.
09:42
But critical questions aren't enough.
09:46
You also need a way of holding all those questions together.
09:50
For us at the Institute,
09:51
we're also really interested in how to think about AI as a system,
09:56
and where and how to draw the boundaries of that system.
09:59
And those feel like especially important things right now.
10:03
Here, we're influenced by the work that was started way back in the 1940s.
10:07
In 1944, along with anthropologists Gregory Bateson and Margaret Mead,
10:11
mathematician Norbert Wiener convened a series of conversations
10:15
that would become known as the Macy Conferences on Cybernetics.
10:18
Ultimately, between 1946 and 1953,
10:21
ten conferences were held under the banner of cybernetics.
10:25
As defined by Norbert Wiener,
10:27
cybernetics sought to "develop a language and techniques
10:30
that will enable us to indeed attack the problem of control and communication
10:35
in advanced computing technologies."
10:38
Cybernetics argued persuasively
10:40
that one had to think about the relationship
10:42
between humans, computers
10:44
and the broader ecological world.
10:46
You had to think about them as a holistic system.
10:49
Participants in the Macy Conferences were concerned with how the mind worked,
10:53
with ideas about intelligence and learning,
10:55
and about the role of technology in our future.
10:57
Sadly, the conversations that started with the Macy Conference
11:00
are often forgotten when the talk is about AI.
11:03
But for me, there's something really important to reclaim here
11:07
about the idea of a system that has to accommodate culture,
11:10
technology and the environment.
11:13
At the Institute, that sort of systems thinking is core to our work.
11:17
Over the last three years,
11:19
a whole collection of amazing people have joined me here
11:21
on this crazy journey to do this work.
11:24
Our staff includes anthropologists,
11:27
systems and environmental engineers, and computer scientists
11:30
as well as a nuclear physicist,
11:32
an award-winning photo journalist,
11:34
and at least one policy and standards expert.
11:37
It's a heady mix.
11:39
And the range of experience and expertise is powerful,
11:42
as are the conflicts and the challenges.
11:45
Being diverse requires a constant willingness
11:47
to find ways to hold people in conversation.
11:50
And to dwell just a little bit with the conflict.
11:53
We also worked out early
11:55
that the way to build a new way of doing things
11:58
would require a commitment to bringing others along on that same journey with us.
12:02
So we opened our doors to an education program very quickly,
12:05
and we launched our first master's program in 2018.
12:08
Since then, we've had two cohorts of master's students
12:11
and one cohort of PhD students.
12:13
Our students come from all over the world
12:15
and all over life.
12:16
Australia, New Zealand, Nigeria, Nepal,
12:19
Mexico, India, the United States.
12:22
And they range in age from 23 to 60.
12:24
They variously had backgrounds in maths and music,
12:28
policy and performance,
12:29
systems and standards,
12:31
architecture and arts.
12:33
Before they joined us at the Institute,
12:35
they ran companies, they worked for government,
12:37
served in the army, taught high school,
12:39
and managed arts organizations.
12:42
They were adventurers
12:43
and committed to each other,
12:45
and to building something new.
12:47
And really, what more could you ask for?
12:50
Because although I've spent 20 years in Silicon Valley
12:53
and I know the stories about the lone inventor
12:55
and the hero's journey,
12:57
I also know the reality.
12:58
That it's never just a hero's journey.
13:00
It's always a collection of people who have a shared sense of purpose
13:04
who can change the world.
13:06
So where do you start?
13:09
Well, I think you start where you stand.
13:12
And for me, that means I want to acknowledge
13:14
the traditional owners of the land upon which I'm standing.
13:16
The Ngunnawal and Ngambri people,
13:18
this is their land,
13:19
never ceded, always sacred.
13:21
And I pay my respects to the elders, past and present, of this place.
13:25
I also acknowledge that we're gathering today
13:27
in many other places,
13:28
and I pay my respects to the traditional owners and elders
13:31
of all those places too.
13:33
It means a lot to me to get to say those words
13:35
and to dwell on what they mean and signal.
13:38
And to remember that we live in a country
13:40
that has been continuously occupied for at least 60,000 years.
13:44
Aboriginal people built worlds here,
13:46
they built social systems, they built technologies.
13:49
They built ways to manage this place
13:51
and to manage it remarkably over a protracted period of time.
13:55
And every moment any one of us stands on a stage as Australians,
13:58
here or abroad,
14:00
we carry with us a privilege and a responsibility
14:02
because of that history.
14:04
And it's not just a history.
14:05
It's also an incredibly rich set of resources,
14:08
worldviews and knowledge.
14:10
And it should run through all of our bones
14:12
and it should be the story we always tell.
14:15
Ultimately, it's about thinking differently,
14:17
asking different kinds of questions,
14:20
looking holistically at the world and the systems,
14:23
and finding other people who want to be on that journey with you.
14:26
Because for me,
14:27
the only way to actually think about the future and scale
14:31
is to always be doing it collectively.
14:33
And because for me,
14:34
the notion of humans in it together
14:37
is one of the ways we get to think about things
14:39
that are responsible, safe
14:42
and ultimately, sustainable.
14:45
Thank you.
์ด ์›น์‚ฌ์ดํŠธ ์ •๋ณด

์ด ์‚ฌ์ดํŠธ๋Š” ์˜์–ด ํ•™์Šต์— ์œ ์šฉํ•œ YouTube ๋™์˜์ƒ์„ ์†Œ๊ฐœํ•ฉ๋‹ˆ๋‹ค. ์ „ ์„ธ๊ณ„ ์ตœ๊ณ ์˜ ์„ ์ƒ๋‹˜๋“ค์ด ๊ฐ€๋ฅด์น˜๋Š” ์˜์–ด ์ˆ˜์—…์„ ๋ณด๊ฒŒ ๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ฐ ๋™์˜์ƒ ํŽ˜์ด์ง€์— ํ‘œ์‹œ๋˜๋Š” ์˜์–ด ์ž๋ง‰์„ ๋”๋ธ” ํด๋ฆญํ•˜๋ฉด ๊ทธ๊ณณ์—์„œ ๋™์˜์ƒ์ด ์žฌ์ƒ๋ฉ๋‹ˆ๋‹ค. ๋น„๋””์˜ค ์žฌ์ƒ์— ๋งž์ถฐ ์ž๋ง‰์ด ์Šคํฌ๋กค๋ฉ๋‹ˆ๋‹ค. ์˜๊ฒฌ์ด๋‚˜ ์š”์ฒญ์ด ์žˆ๋Š” ๊ฒฝ์šฐ ์ด ๋ฌธ์˜ ์–‘์‹์„ ์‚ฌ์šฉํ•˜์—ฌ ๋ฌธ์˜ํ•˜์‹ญ์‹œ์˜ค.

https://forms.gle/WvT1wiN1qDtmnspy7