How to practice safe sexting | Amy Adele Hasinoff

153,176 views · 2017-03-16

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

λ²ˆμ—­: KI young Jang κ²€ν† : Ju Hye Lim
00:12
People have been using media to talk about sex for a long time.
00:17
Love letters, phone sex, racy Polaroids.
00:21
There's even a story of a girl who eloped with a man that she met over the telegraph
00:27
in 1886.
00:30
Today we have sexting, and I am a sexting expert.
00:35
Not an expert sexter.
00:38
Though, I do know what this means -- I think you do too.
00:42
[It's a penis]
00:44
(Laughter)
00:48
I have been studying sexting since the media attention to it began in 2008.
00:54
I wrote a book on the moral panic about sexting.
00:57
And here's what I found:
00:59
most people are worrying about the wrong thing.
01:02
They're trying to just prevent sexting from happening entirely.
01:06
But let me ask you this:
01:08
As long as it's completely consensual, what's the problem with sexting?
01:13
People are into all sorts of things that you may not be into,
01:17
like blue cheese or cilantro.
01:19
(Laughter)
01:22
Sexting is certainly risky, like anything that's fun,
01:26
but as long as you're not sending an image to someone who doesn't want to receive it,
01:33
there's no harm.
01:35
What I do think is a serious problem
01:37
is when people share private images of others
01:41
without their permission.
01:43
And instead of worrying about sexting,
01:45
what I think we need to do is think a lot more about digital privacy.
01:50
The key is consent.
01:53
Right now most people are thinking about sexting
01:56
without really thinking about consent at all.
02:00
Did you know that we currently criminalize teen sexting?
02:05
It can be a crime because it counts as child pornography,
02:08
if there's an image of someone under 18,
02:11
and it doesn't even matter
02:12
if they took that image of themselves and shared it willingly.
02:17
So we end up with this bizarre legal situation
02:20
where two 17-year-olds can legally have sex in most US states
02:25
but they can't photograph it.
02:28
Some states have also tried passing sexting misdemeanor laws
02:32
but these laws repeat the same problem
02:35
because they still make consensual sexting illegal.
02:40
It doesn't make sense
02:41
to try to ban all sexting to try to address privacy violations.
02:46
This is kind of like saying,
02:47
let's solve the problem of date rape by just making dating completely illegal.
02:54
Most teens don't get arrested for sexting, but can you guess who does?
03:00
It's often teens who are disliked by their partner's parents.
03:05
And this can be because of class bias, racism or homophobia.
03:10
Most prosecutors are, of course, smart enough
03:13
not to use child pornography charges against teenagers, but some do.
03:19
According to researchers at the University of New Hampshire
03:23
seven percent of all child pornography possession arrests are teens,
03:28
sexting consensually with other teens.
03:33
Child pornography is a serious crime,
03:35
but it's just not the same thing as teen sexting.
03:40
Parents and educators are also responding to sexting
03:44
without really thinking too much about consent.
03:47
Their message to teens is often: just don't do it.
03:52
And I totally get it -- there are serious legal risks
03:55
and of course, that potential for privacy violations.
03:59
And when you were a teen,
04:00
I'm sure you did exactly as you were told, right?
04:05
You're probably thinking, my kid would never sext.
04:08
And that's true, your little angel may not be sexting
04:12
because only 33 percent
04:15
of 16- and 17-year-olds are sexting.
04:19
But, sorry, by the time they're older, odds are they will be sexting.
04:23
Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds.
04:30
And most of the time, nothing goes wrong.
04:33
People ask me all the time things like, isn't sexting just so dangerous, though?
04:39
It's like you wouldn't leave your wallet on a park bench
04:42
and you expect it's going to get stolen if you do that, right?
04:46
Here's how I think about it:
04:48
sexting is like leaving your wallet at your boyfriend's house.
04:52
If you come back the next day
04:53
and all the money is just gone,
04:56
you really need to dump that guy.
04:59
(Laughter)
05:03
So instead of criminalizing sexting
05:05
to try to prevent these privacy violations,
05:08
instead we need to make consent central
05:11
to how we think about the circulation of our private information.
05:16
Every new media technology raises privacy concerns.
05:20
In fact, in the US the very first major debates about privacy
05:25
were in response to technologies that were relatively new at the time.
05:29
In the late 1800s, people were worried about cameras,
05:33
which were just suddenly more portable than ever before,
05:37
and newspaper gossip columns.
05:39
They were worried that the camera would capture information about them,
05:43
take it out of context and widely disseminate it.
05:47
Does this sound familiar?
05:48
It's exactly what we're worrying about now with social media and drone cameras,
05:53
and, of course, sexting.
05:55
And these fears about technology,
05:57
they make sense
05:59
because technologies can amplify and bring out
06:02
our worst qualities and behaviors.
06:05
But there are solutions.
06:08
And we've been here before with a dangerous new technology.
06:12
In 1908, Ford introduced the Model T car.
06:16
Traffic fatality rates were rising.
06:18
It was a serious problem -- it looks so safe, right?
06:23
Our first response was to try to change drivers' behavior,
06:27
so we developed speed limits and enforced them through fines.
06:32
But over the following decades,
06:33
we started to realize the technology of the car itself is not just neutral.
06:39
We could design the car to make it safer.
06:42
So in the 1920s, we got shatter-resistant windshields.
06:46
In the 1950s, seat belts.
06:48
And in the 1990s, airbags.
06:52
All three of these areas:
06:54
laws, individuals and industry came together over time
06:59
to help solve the problem that a new technology causes.
07:03
And we can do the same thing with digital privacy.
07:06
Of course, it comes back to consent.
07:10
Here's the idea.
07:11
Before anyone can distribute your private information,
07:15
they should have to get your permission.
07:18
This idea of affirmative consent comes from anti-rape activists
07:22
who tell us that we need consent for every sexual act.
07:26
And we have really high standards for consent in a lot of other areas.
07:31
Think about having surgery.
07:33
Your doctor has to make sure
07:34
that you are meaningfully and knowingly consenting to that medical procedure.
07:39
This is not the type of consent like with an iTunes Terms of Service
07:43
where you just scroll to the bottom and you're like, agree, agree, whatever.
07:46
(Laughter)
07:48
If we think more about consent, we can have better privacy laws.
07:54
Right now, we just don't have that many protections.
07:57
If your ex-husband or your ex-wife is a terrible person,
08:01
they can take your nude photos and upload them to a porn site.
08:05
It can be really hard to get those images taken down.
08:08
And in a lot of states,
08:10
you're actually better off if you took the images of yourself
08:13
because then you can file a copyright claim.
08:17
(Laughter)
08:19
Right now, if someone violates your privacy,
08:22
whether that's an individual or a company or the NSA,
08:27
you can try filing a lawsuit,
08:29
though you may not be successful
08:32
because many courts assume that digital privacy is just impossible.
08:36
So they're not willing to punish anyone for violating it.
08:41
I still hear people asking me all the time,
08:43
isn't a digital image somehow blurring the line between public and private
08:49
because it's digital, right?
08:51
No! No!
08:52
Everything digital is not just automatically public.
08:56
That doesn't make any sense.
08:58
As NYU legal scholar Helen Nissenbaum tells us,
09:01
we have laws and policies and norms
09:04
that protect all kinds of information that's private,
09:07
and it doesn't make a difference if it's digital or not.
09:10
All of your health records are digitized
09:13
but your doctor can't just share them with anyone.
09:16
All of your financial information is held in digital databases,
09:21
but your credit card company can't just post your purchase history online.
09:26
Better laws could help address privacy violations after they happen,
09:32
but one of the easiest things we can all do is make personal changes
09:36
to help protect each other's privacy.
09:40
We're always told that privacy
09:42
is our own, sole, individual responsibility.
09:45
We're told, constantly monitor and update your privacy settings.
09:49
We're told, never share anything you wouldn't want the entire world to see.
09:55
This doesn't make sense.
09:56
Digital media are social environments
09:59
and we share things with people we trust all day, every day.
10:04
As Princeton researcher Janet Vertesi argues,
10:07
our data and our privacy, they're not just personal,
10:11
they're actually interpersonal.
10:14
And so one thing you can do that's really easy
10:17
is just start asking for permission before you share anyone else's information.
10:22
If you want to post a photo of someone online, ask for permission.
10:27
If you want to forward an email thread,
10:29
ask for permission.
10:31
And if you want to share someone's nude selfie,
10:33
obviously, ask for permission.
10:37
These individual changes can really help us protect each other's privacy,
10:41
but we need technology companies on board as well.
10:46
These companies have very little incentive to help protect our privacy
10:50
because their business models depend on us sharing everything
10:54
with as many people as possible.
10:56
Right now, if I send you an image,
10:58
you can forward that to anyone that you want.
11:01
But what if I got to decide if that image was forwardable or not?
11:06
This would tell you, you don't have my permission to send this image out.
11:10
We do this kind of thing all the time to protect copyright.
11:14
If you buy an e-book, you can't just send it out to as many people as you want.
11:19
So why not try this with mobile phones?
11:22
We can demand that tech companies add these protections
11:27
to our devices and our platforms as the default.
11:31
After all, you can choose the color of your car,
11:34
but the airbags are always standard.
11:39
If we don't think more about digital privacy and consent,
11:43
there can be serious consequences.
11:47
There was a teenager from Ohio --
11:49
let's call her Jennifer, for the sake of her privacy.
11:52
She shared nude photos of herself with her high school boyfriend,
11:56
thinking she could trust him.
11:59
Unfortunately, he betrayed her
12:01
and sent her photos around the entire school.
12:04
Jennifer was embarrassed and humiliated,
12:08
but instead of being compassionate, her classmates harassed her.
12:12
They called her a slut and a whore
12:14
and they made her life miserable.
12:17
Jennifer started missing school and her grades dropped.
12:21
Ultimately, Jennifer decided to end her own life.
12:26
Jennifer did nothing wrong.
12:29
All she did was share a nude photo
12:31
with someone she thought that she could trust.
12:34
And yet our laws tell her
12:37
that she committed a horrible crime equivalent to child pornography.
12:41
Our gender norms tell her
12:43
that by producing this nude image of herself,
12:46
she somehow did the most horrible, shameful thing.
12:50
And when we assume that privacy is impossible in digital media,
12:54
we completely write off and excuse her boyfriend's bad, bad behavior.
13:01
People are still saying all the time to victims of privacy violations,
13:06
"What were you thinking?
13:08
You should have never sent that image."
13:11
If you're trying to figure out what to say instead, try this.
13:15
Imagine you've run into your friend who broke their leg skiing.
13:20
They took a risk to do something fun, and it didn't end well.
13:24
But you're probably not going to be the jerk who says,
13:27
"Well, I guess you shouldn't have gone skiing then."
13:31
If we think more about consent,
13:34
we can see that victims of privacy violations
13:37
deserve our compassion,
13:39
not criminalization, shaming, harassment or punishment.
13:44
We can support victims, and we can prevent some privacy violations
13:48
by making these legal, individual and technological changes.
13:53
Because the problem is not sexting, the issue is digital privacy.
13:59
And one solution is consent.
14:02
So the next time a victim of a privacy violation comes up to you,
14:07
instead of blaming them, let's do this instead:
14:09
let's shift our ideas about digital privacy,
14:13
and let's respond with compassion.
14:16
Thank you.
14:17
(Applause)