How CRISPR lets us edit our DNA | Jennifer Doudna

1,798,020 views ・ 2015-11-12

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: DONG IN SHIN, Reviewer: JY Kang

00:13
A few years ago, with my colleague, Emmanuelle Charpentier, I invented a new technology for editing genomes. It's called CRISPR-Cas9. The CRISPR technology allows scientists to make changes to the DNA in cells that could allow us to cure genetic disease.

00:33
You might be interested to know that the CRISPR technology came about through a basic research project that was aimed at discovering how bacteria fight viral infections. Bacteria have to deal with viruses in their environment, and we can think about a viral infection like a ticking time bomb -- a bacterium has only a few minutes to defuse the bomb before it gets destroyed. So, many bacteria have in their cells an adaptive immune system called CRISPR, that allows them to detect viral DNA and destroy it.

01:05
Part of the CRISPR system is a protein called Cas9, that's able to seek out, cut and eventually degrade viral DNA in a specific way. And it was through our research to understand the activity of this protein, Cas9, that we realized that we could harness its function as a genetic engineering technology -- a way for scientists to delete or insert specific bits of DNA into cells with incredible precision -- that would offer opportunities to do things that really haven't been possible in the past.

01:43
The CRISPR technology has already been used to change the DNA in the cells of mice and monkeys, other organisms as well. Chinese scientists showed recently that they could even use the CRISPR technology to change genes in human embryos. And scientists in Philadelphia showed they could use CRISPR to remove the DNA of an integrated HIV virus from infected human cells.

02:10
The opportunity to do this kind of genome editing also raises various ethical issues that we have to consider, because this technology can be employed not only in adult cells, but also in the embryos of organisms, including our own species. And so, together with my colleagues, I've called for a global conversation about the technology that I co-invented, so that we can consider all of the ethical and societal implications of a technology like this.

02:40
What I want to do now is tell you what the CRISPR technology is, what it can do, where we are today and why I think we need to take a prudent path forward in the way that we employ this technology.

02:55
When viruses infect a cell, they inject their DNA. And in a bacterium, the CRISPR system allows that DNA to be plucked out of the virus, and inserted in little bits into the chromosome -- the DNA of the bacterium. And these integrated bits of viral DNA get inserted at a site called CRISPR.

03:18
CRISPR stands for clustered regularly interspaced short palindromic repeats.

03:24
(Laughter)

03:25
A big mouthful -- you can see why we use the acronym CRISPR. It's a mechanism that allows cells to record, over time, the viruses they have been exposed to. And importantly, those bits of DNA are passed on to the cells' progeny, so cells are protected from viruses not only in one generation, but over many generations of cells. This allows the cells to keep a record of infection, and as my colleague, Blake Wiedenheft, likes to say, the CRISPR locus is effectively a genetic vaccination card in cells.
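
(As a purely illustrative aside, not part of the talk: the "genetic vaccination card" idea can be mimicked in a few lines of Python. The class name ToyBacterium, the fragment length, and the sequences are all invented for illustration; the sketch only mirrors the record-keeping and inheritance described above, not the underlying biochemistry.)

```python
# Toy illustration only: the CRISPR locus treated as a heritable list of stored
# viral fragments ("spacers"). Real spacer acquisition involves dedicated
# proteins and DNA chemistry that are not modeled here.

class ToyBacterium:
    def __init__(self, spacers=None):
        # the "vaccination card": fragments of viruses this lineage has seen
        self.spacers = list(spacers or [])

    def acquire(self, viral_dna: str, length: int = 8) -> None:
        """Store a short fragment of the invading virus's DNA in the locus."""
        self.spacers.append(viral_dna[:length])

    def divide(self) -> "ToyBacterium":
        """Daughter cells inherit the same record of past infections."""
        return ToyBacterium(self.spacers)

cell = ToyBacterium()
cell.acquire("ATGCCGTAGGCTTAA")   # survives an infection, records part of the virus
daughter = cell.divide()
print(daughter.spacers)           # ['ATGCCGTA'] -- the record is passed on
```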

04:00
Once those bits of DNA have been inserted into the bacterial chromosome, the cell then makes a little copy of a molecule called RNA, which is orange in this picture, that is an exact replicate of the viral DNA. RNA is a chemical cousin of DNA, and it allows interaction with DNA molecules that have a matching sequence.

04:25
So those little bits of RNA from the CRISPR locus associate -- they bind -- to a protein called Cas9, which is white in the picture, and form a complex that functions like a sentinel in the cell. It searches through all of the DNA in the cell, to find sites that match the sequences in the bound RNAs. And when those sites are found -- as you can see here, the blue molecule is DNA -- this complex associates with that DNA and allows the Cas9 cleaver to cut up the viral DNA. It makes a very precise break.

05:04
So we can think of the Cas9 RNA sentinel complex like a pair of scissors that can cut DNA -- it makes a double-stranded break in the DNA helix. And importantly, this complex is programmable, so it can be programmed to recognize particular DNA sequences, and make a break in the DNA at that site.

05:27
As I'm going to tell you now, we recognized that that activity could be harnessed for genome engineering, to allow cells to make a very precise change to the DNA at the site where this break was introduced. That's sort of analogous to the way that we use a word-processing program to fix a typo in a document.

05:49
The reason we envisioned using the CRISPR system for genome engineering is because cells have the ability to detect broken DNA and repair it. So when a plant or an animal cell detects a double-stranded break in its DNA, it can fix that break, either by pasting together the ends of the broken DNA with a little, tiny change in the sequence of that position, or it can repair the break by integrating a new piece of DNA at the site of the cut.

06:21
So if we have a way to introduce double-stranded breaks into DNA at precise places, we can trigger cells to repair those breaks, by either the disruption or incorporation of new genetic information. So if we were able to program the CRISPR technology to make a break in DNA at the position at or near a mutation causing cystic fibrosis, for example, we could trigger cells to repair that mutation.
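
(One more purely illustrative aside: the two repair outcomes just described can be mimicked on strings as well. The function names end_join and template_repair, and the sequences, are invented for illustration; real end joining and homology-directed repair are far more involved than this toy suggests.)

```python
# Toy illustration only: two string-level stand-ins for how a cell might resolve
# a double-stranded break, mirroring the two outcomes described in the talk.

def end_join(left: str, right: str) -> str:
    """Rejoin the broken ends with a tiny change at the junction (disruption)."""
    return left + "A" + right          # an arbitrary one-letter insertion at the cut site

def template_repair(left: str, right: str, new_piece: str) -> str:
    """Repair the break by incorporating a new piece of DNA at the cut site."""
    return left + new_piece + right

left, right = "CCGGGATTACAGATTACAGAT", "TACTTTAAACCGGTT"
print(end_join(left, right))
print(template_repair(left, right, "GGGCCC"))
```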

06:52
Genome engineering is actually not new, it's been in development since the 1970s. We've had technologies for sequencing DNA, for copying DNA, and even for manipulating DNA. And these technologies were very promising, but the problem was that they were either inefficient, or they were difficult enough to use that most scientists had not adopted them for use in their own laboratories, or certainly for many clinical applications.

07:24
So, the opportunity to take a technology like CRISPR and utilize it has appeal, because of its relative simplicity. We can think of older genome engineering technologies as similar to having to rewire your computer each time you want to run a new piece of software, whereas the CRISPR technology is like software for the genome, we can program it easily, using these little bits of RNA.

07:54
So once a double-stranded break is made in DNA, we can induce repair, and thereby potentially achieve astounding things, like being able to correct mutations that cause sickle cell anemia or cause Huntington's Disease. I actually think that the first applications of the CRISPR technology are going to happen in the blood, where it's relatively easier to deliver this tool into cells, compared to solid tissues.

08:23
Right now, a lot of the work that's going on applies to animal models of human disease, such as mice. The technology is being used to make very precise changes that allow us to study the way that these changes in the cell's DNA affect either a tissue or, in this case, an entire organism.

08:43
Now in this example, the CRISPR technology was used to disrupt a gene by making a tiny change in the DNA in a gene that is responsible for the black coat color of these mice. Imagine that these white mice differ from their pigmented litter-mates by just a tiny change at one gene in the entire genome, and they're otherwise completely normal. And when we sequence the DNA from these animals, we find that the change in the DNA has occurred at exactly the place where we induced it, using the CRISPR technology.

09:19
Additional experiments are going on in other animals that are useful for creating models for human disease, such as monkeys. And here we find that we can use these systems to test the application of this technology in particular tissues, for example, figuring out how to deliver the CRISPR tool into cells. We also want to understand better how to control the way that DNA is repaired after it's cut, and also to figure out how to control and limit any kind of off-target, or unintended effects of using the technology.

09:56
I think that we will see clinical application of this technology, certainly in adults, within the next 10 years. I think that it's likely that we will see clinical trials and possibly even approved therapies within that time, which is a very exciting thing to think about. And because of the excitement around this technology, there's a lot of interest in start-up companies that have been founded to commercialize the CRISPR technology, and lots of venture capitalists that have been investing in these companies.

10:31
But we have to also consider that the CRISPR technology can be used for things like enhancement. Imagine that we could try to engineer humans that have enhanced properties, such as stronger bones, or less susceptibility to cardiovascular disease or even to have properties that we would consider maybe to be desirable, like a different eye color or to be taller, things like that. "Designer humans," if you will.

11:00
Right now, the genetic information to understand what types of genes would give rise to these traits is mostly not known. But it's important to know that the CRISPR technology gives us a tool to make such changes, once that knowledge becomes available.

11:18
This raises a number of ethical questions that we have to carefully consider, and this is why I and my colleagues have called for a global pause in any clinical application of the CRISPR technology in human embryos, to give us time to really consider all of the various implications of doing so. And actually, there is an important precedent for such a pause from the 1970s, when scientists got together to call for a moratorium on the use of molecular cloning, until the safety of that technology could be tested carefully and validated.

11:55
So, genome-engineered humans are not with us yet, but this is no longer science fiction. Genome-engineered animals and plants are happening right now. And this puts in front of all of us a huge responsibility, to consider carefully both the unintended consequences as well as the intended impacts of a scientific breakthrough. Thank you.

12:23
(Applause)

12:31
(Applause ends)

12:33
Bruno Giussani: Jennifer, this is a technology with huge consequences, as you pointed out. Your attitude about asking for a pause or a moratorium or a quarantine is incredibly responsible. There are, of course, the therapeutic results of this, but then there are the un-therapeutic ones and they seem to be the ones gaining traction, particularly in the media. This is one of the latest issues of The Economist -- "Editing humanity." It's all about genetic enhancement, it's not about therapeutics. What kind of reactions did you get back in March from your colleagues in the science world, when you asked or suggested that we should actually pause this for a moment and think about it?

13:13
Jennifer Doudna: My colleagues were actually, I think, delighted to have the opportunity to discuss this openly. It's interesting that as I talk to people, my scientific colleagues as well as others, there's a wide variety of viewpoints about this. So clearly it's a topic that needs careful consideration and discussion.

13:29
BG: There's a big meeting happening in December that you and your colleagues are calling, together with the National Academy of Sciences and others, what do you hope will come out of the meeting, practically?

13:39
JD: Well, I hope that we can air the views of many different individuals and stakeholders who want to think about how to use this technology responsibly. It may not be possible to come up with a consensus point of view, but I think we should at least understand what all the issues are as we go forward.

13:57
BG: Now, colleagues of yours, like George Church, for example, at Harvard, they say, "Yeah, ethical issues basically are just a question of safety. We test and test and test again, in animals and in labs, and then once we feel it's safe enough, we move on to humans." So that's kind of the other school of thought, that we should actually use this opportunity and really go for it. Is there a possible split happening in the science community about this? I mean, are we going to see some people holding back because they have ethical concerns, and some others just going forward because some countries under-regulate or don't regulate at all?

14:29
JD: Well, I think with any new technology, especially something like this, there are going to be a variety of viewpoints, and I think that's perfectly understandable. I think that in the end, this technology will be used for human genome engineering, but I think to do that without careful consideration and discussion of the risks and potential complications would not be responsible.

14:54
BG: There are a lot of technologies and other fields of science that are developing exponentially, pretty much like yours. I'm thinking about artificial intelligence, autonomous robots and so on. No one seems -- aside from autonomous warfare robots -- nobody seems to have launched a similar discussion in those fields, in calling for a moratorium. Do you think that your discussion may serve as a blueprint for other fields?

15:19
JD: Well, I think it's hard for scientists to get out of the laboratory. Speaking for myself, it's a little bit uncomfortable to do that. But I do think that being involved in the genesis of this really puts me and my colleagues in a position of responsibility. And I would say that I certainly hope that other technologies will be considered in the same way, just as we would want to consider something that could have implications in other fields besides biology.

15:45
BG: Jennifer, thanks for coming to TED.

JD: Thank you.

15:49
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7