
PBS NewsHour: Will Facebook Be More Careful About Social Issues?


JUDY WOODRUFF: Now questions about the ever-growing scope of Facebook's empire and social network, and whether the company is embracing enough responsibility for its reach.

Today, Facebook CEO Mark Zuckerberg announced that they will add 3,000 more people to monitor live video, after problems with violence and hate speech.

Hari Sreenivasan takes it from there.

HARI SREENIVASAN: The decision comes after a series of cases where people shared live video of murder and suicide. Recent examples include a murder in Cleveland last month that was posted live on Facebook, and a man in Thailand who posted video of himself murdering his 11-month-old daughter; that video wasn't removed for 24 hours.

Once Facebook makes these announced hires, there will be 7,500 employees to monitor thousands of hours of videos uploaded constantly.

Farhad Manjoo is a tech columnist for The New York Times who has been closely covering Facebook. He joins me now to talk about this issue and other questions facing the company.

Farhad, so let's first — today's news, how significant is this?

FARHAD MANJOO, The New York Times: I think it's significant.

I mean, it's a significant sort of step up in their ability to monitor these videos, and it should help. The way it works is, there's lots of videos going on, on Facebook all the time. If somebody sees something that looks bad, that looks like it may be criminal or some other, you know, terrible thing, they flag it, and the flagged videos go to these reviewers.

And just having more of these reviewers should make the whole process faster. So, it should help. I mean, I think the question is why it took them a year to do this.

HARI SREENIVASAN: So, put the scale in perspective here. If they have 1.2 billion active users a month or whatever it is that they talk about, even at one-half of 1 percent, if they wanted to harm themselves and put this on Facebook, that's six million people.

How do these 7,500 stop that?

FARHAD MANJOO: Yes.

I mean, the way that tech companies generally work is, they manage scale by, you know, leveraging computers, basically. There's a lot of kind of algorithmic stuff that goes into making sure — they try to, you know, cut down the pool that the human reviewers have to look at.

And there is some experience in this in the Valley. I mean, YouTube has had to deal with this sort of thing for years. And the way they have really come around to doing it is a similar process. Like, they have thousands and thousands of hours of videos uploaded essentially every minute, and they count on kind of the viewers to flag anything that's terrible, and then it goes to these human reviewers.

So, it's a process that can work. The difficulty in Facebook's case is, it's live video, so they have to get it down much more quickly. And so, you know, it's possible that they may need more people or some other, you know, algorithmic solution, but I think this is a — you know, it should be an improvement over what they have now.
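Manjoo is describing a flag-and-review pipeline: viewers report videos, an automated pre-filter narrows and prioritizes the flagged pool, and human reviewers work through what remains, with live video handled most urgently. Below is a minimal sketch of that idea in Python; it is not Facebook's actual system, and every name, score, and threshold in it is an illustrative assumption.

```python
# A minimal sketch (not Facebook's actual system) of the flag-and-review
# pipeline described above: viewers flag videos, an automated scoring step
# narrows and orders the pool, and human reviewers pull from a priority
# queue in which live video comes first. All names and numbers are
# illustrative assumptions.
import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class FlaggedVideo:
    priority: float                       # lower value = reviewed sooner
    video_id: str = field(compare=False)
    is_live: bool = field(compare=False)
    flag_count: int = field(compare=False)


def algorithmic_score(is_live: bool, flag_count: int) -> float:
    """Rough pre-filter: more flags and live status push a video up the queue."""
    score = 1.0 / (1 + flag_count)        # heavily flagged -> smaller number
    if is_live:
        score *= 0.1                      # live video must come down much faster
    return score


review_queue: list[FlaggedVideo] = []


def flag(video_id: str, is_live: bool, flag_count: int) -> None:
    """Called when viewers report a video; only flagged content reaches reviewers."""
    heapq.heappush(review_queue, FlaggedVideo(
        priority=algorithmic_score(is_live, flag_count),
        video_id=video_id, is_live=is_live, flag_count=flag_count))


def next_for_human_review() -> FlaggedVideo | None:
    """Each human reviewer pulls the highest-priority flagged item."""
    return heapq.heappop(review_queue) if review_queue else None


if __name__ == "__main__":
    flag("recorded-clip-42", is_live=False, flag_count=3)
    flag("live-stream-7", is_live=True, flag_count=12)
    print(next_for_human_review())        # the live stream is reviewed first
```

The priority heap here is only meant to show why, under the urgency Manjoo mentions, the live stream in the usage example reaches a reviewer before the recorded clip.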


HARI SREENIVASAN: You mentioned it took them so long to get to this point. Why?

FARHAD MANJOO: I think this is a real sort of cultural blind spot for Facebook in general.

Oftentimes, they go into these projects — you know, Facebook Live is an example, but many of the other things they have done — with, you know, tremendous optimism.

As a company, and Mark Zuckerberg as a technologist, he has tremendous optimism in technology. And they often fail to see or appreciate the possible kind of downsides of their technology and the ways that it could be misused.

I mean, I think that what we have seen with live — with the live video is a small example. The way that Facebook has sort of affected elections, the way that — you know, the fake news problem we saw in the U.S. election, the way it's been used as a tool for propaganda in various other parts of the world, you know, those are huge examples of, you know, what looked like a fairly simple solution technologically, like we're going to get everyone connected and have them share the news.

You know, it brings some real deep, like, social questions that they are only lately beginning to confront in a serious way.

HARI SREENIVASAN: So, this combination of, I guess, an optimism in the technology and design and a faith that users are ultimately good and will make the right choice, I mean, is that the sort of core cultural concern or problem that keeps the company making these sorts of decisions?

FARHAD MANJOO: That's part of it. And the other thing to remember is, you know, they're a technology company, and speed is of the utmost concern for them.

One of the things that was happening in the tech industry last year is that a whole bunch of other companies were rolling out live video systems, and Facebook didn't want to be left behind. And so they created their live video system.

And it became, you know, the biggest, because they're the biggest social network. But with that sort of size comes, you know, an increased opportunity for misuse and more power, right. Like, a video on Facebook that can be seen by, you know, potentially many more people has a lot more potential for being misused.

And I think they — it's not right to say that they don't consider those things, but it seems like it's on a back burner for them. And I think what's happening at Facebook is a shift toward thinking about these issues at an earlier stage.

And we have really seen this more recently in their work with the news industry. I mean, after what happened in the election and the kind of controversy about fake news, they have rolled out a bunch of initiatives to improve how news is seen on Facebook. They have added fact-checkers and other things.

So, I think their attitude is changing, but it may be changing too slowly, compared to how quickly the technology they're rolling out is changing.

HARI SREENIVASAN: All right, Farhad Manjoo of The New York Times, thanks so much.

FARHAD MANJOO: All right, great. Thanks so much.

Key Vocabulary

scope [skəup]  n. ability, range, outlook, opportunity, room
election [i'lekʃən]  n. election
confront [kən'frʌnt]  vt. to face, to confront, to encounter
combination [.kɔmbi'neiʃən]  n. combination, union, coalition
improvement [im'pru:vmənt]  n. improvement
optimism ['ɔptimizəm]  n. optimism
potentially [pə'tenʃəli]  adv. potentially
opportunity [.ɔpə'tju:niti]  n. opportunity, occasion
network ['netwə:k]  n. network, mesh, network system
misuse [mis'ju:z]  vt. & n. misuse, abuse