
PBS NewsHour Interview: Racial Bias Found in a Medical Algorithm


Hari Sreenivasan: A recent study published in Science magazine found significant racial bias in an algorithm used by hospitals across the nation to determine who needs follow-up care and who does not. Megan Thompson recently spoke with STAT's Shraddha Chakradhar, who explained what the researchers found.


Megan Thompson: Where exactly was this bias coming from?


Shraddha Chakradhar: There are two ways that we can identify how sick a person is. One, is how many dollars are spent on that person. You know, the assumption being the more health care they come in for, the more treatment that they get, the more dollars they spend and presumably the sicker they are if they're getting all that treatment. And the other way is that, you know, we can measure actual biophysical things, you know, from lab tests, what kind of conditions or diseases they might have. So it seems like this algorithm was relying on the cost prediction definition. In other words, the more dollars a patient was projected to spend on the part of an insurance company or a hospital, then that was a sign of how sick they were going to be. And that seems to be where the bias emerged.
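To make the distinction concrete, here is a minimal, purely illustrative Python sketch of the two "labels" Chakradhar describes: flagging patients by projected spending versus flagging them by measured illness. It is not the actual Optum algorithm; the field names and cutoffs are invented for this example.

from dataclasses import dataclass

@dataclass
class Patient:
    chronic_conditions: int        # biophysical signal: how sick the patient actually is
    predicted_annual_cost: float   # dollars the insurer/hospital expects to spend on them

def flag_by_cost(p: Patient, cost_cutoff: float = 10_000.0) -> bool:
    """Cost-as-proxy rule: flag for follow-up care if projected spending is high.
    Sick patients who have historically had fewer dollars spent on them fall
    below the cutoff and are missed -- the source of the bias in the study."""
    return p.predicted_annual_cost >= cost_cutoff

def flag_by_health(p: Patient, condition_cutoff: int = 3) -> bool:
    """Biophysical rule: flag based on measured chronic conditions instead."""
    return p.chronic_conditions >= condition_cutoff

# A patient with many conditions but low projected spending is missed by the
# cost-based rule and caught by the health-based one.
sick_but_underspent = Patient(chronic_conditions=5, predicted_annual_cost=4_000.0)
print(flag_by_cost(sick_but_underspent))    # False -- not flagged for follow-up care
print(flag_by_health(sick_but_underspent))  # True  -- flagged for follow-up care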


Megan Thompson: I understand that the researchers then re-ran the algorithm using a different type of data. Can you just tell us a little bit more about that? What did they use?


Shraddha Chakradhar: Yeah. So instead of relying on just costs to predict which patients are going to need follow-up care, they actually used biometric data, physical biophysical data, physiological data, and they saw a dramatic difference. You know, in the previous model, the algorithm missed some 48,000 extra chronic conditions that African-American patients had. But when they rejiggered the algorithm to look more at actual biological data, they brought that down to about 7,700. So it was about an 84 percent reduction in bias.
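For reference, the 84 percent figure follows directly from the two rounded counts quoted in the interview; a quick check:

missed_before = 48_000   # chronic conditions the cost-based model failed to flag
missed_after = 7_700     # missed after re-tuning the algorithm on biophysical data
reduction = (missed_before - missed_after) / missed_before
print(f"{reduction:.0%}")   # -> 84%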


Megan Thompson: Do we know anything about how the use of this biased algorithm actually affected patient care?


Shraddha Chakradhar: We don't actually know that. But as I mentioned, the algorithm is used by hospitals to help them flag patients who might need extra care in the coming year, whether it's, you know, an at-home nurse or making sure that they come in for regularly scheduled doctor's appointments. So we can only presume that if black patients, sicker black patients weren't being flagged accurately, that they also missed out on this follow-up care.


Megan Thompson: Are there any consequences for the company, Optum, that was behind this algorithm?


Shraddha Chakradhar: Yes. So the day after the study came out, actually, New York regulators, the Department of Financial Services and the Department of Health sent a letter to the company saying they were investigating this algorithm and that the company had to show that the way the algorithm worked wasn't in violation of anti-discrimination laws in New York. So that investigation is pending. One encouraging thing is that when the researchers did the study, they actually reached back to Optum and let them know about the discrepancy in the data. And the company was glad to be told about it. And I'm told that they're working on a fix. And the other encouraging thing is that the researchers have actually now launched an initiative to help other companies who may be behind similar algorithms to help them fix any biases in their programs. So they've launched a program based out of the University of Chicago's Booth School to do this work on a pro bono basis so that they can sort of catch these things in other algorithms that might be used across the country.


Megan Thompson: All right, Shraddha Chakradhar of STAT, thank you so much for being with us.


Shraddha Chakradhar: Thank you for having me.


Key vocabulary

reduction [ri'dʌkʃən]  n. decrease, shrinking; (chemistry) reduction; (mathematics) reduction of a fraction
deception [di'sepʃən]  n. trick, scheme, fraud
biased ['baiəst]  adj. prejudiced; (of results) skewed, biased
bias ['baiəs]  n. prejudice; bias (diagonal of fabric)  vt. to prejudice, to sway
initiative [i'niʃətiv]  adj. initiating, preliminary, spontaneous  n. first step, initiative
determine [di'tə:min]  v. to decide, resolve, ascertain, measure
previous ['pri:vjəs]  adj. prior, earlier, preceding
affected [ə'fektid]  adj. influenced, moved, stricken by illness; affected, put-on
projected [prə'dʒektid]  adj. projected  v. past form of project
definition [.defi'niʃən]  n. definition, explanation, clarity

Keywords: race, algorithm, treatment, patients, PBS NewsHour interviews
