Samsung Phone Back Cover Injection Mold Design [with CAD drawings, thesis, opening report, and defense script]
ID: 208522826
Type: shared resource
Size: 10.83 MB
Format: ZIP
Uploaded: 2022-04-19
Uploader: 机****料 (personal real-name verification, 高**)
IP location: Henan
Points: 50
Resource description:
Samsung phone back cover injection mold design [with CAD drawings, thesis, opening report, and defense script]. Everything shown in the resource directory is included; what you see is what you get, and it is all there after download, so download with confidence. The original files can be edited and modified. For questions, contact QQ: 197216396 or 11970985.
Content summary:
English-language literature:

Probability and Punishment

John Anderton is the chief of a special police unit in Washington, D.C. This particular morning, he bursts into a suburban house moments before Howard Marks, in a state of frenzied rage, is about to plunge a pair of scissors into the torso of his wife, whom he found in bed with another man. For Anderton, it is just another day preventing capital crimes. "By mandate of the District of Columbia Precrime Division," he recites, "I'm placing you under arrest for the future murder of Sarah Marks, that was to take place today." Other cops start restraining Marks, who screams, "I didn't do anything!"

The opening scene of the film Minority Report depicts a society in which predictions seem so accurate that the police arrest individuals for crimes before they are committed. People are imprisoned not for what they did, but for what they are foreseen to do, even though they never actually commit the crime. The movie attributes this prescient and preemptive law enforcement to the visions of three clairvoyants, not to data analysis. But the unsettling future Minority Report portrays is one that unchecked big-data analysis threatens to bring about, in which judgments of culpability are based on individualized predictions of future behavior.

Already we see the seedlings of this. Parole boards in more than half of all U.S. states use predictions founded on data analysis as a factor in deciding whether to release somebody from prison or to keep him incarcerated. A growing number of places in the United States, from precincts in Los Angeles to cities like Richmond, Virginia, employ "predictive policing": using big-data analysis to select what streets, groups, and individuals to subject to extra scrutiny, simply because an algorithm pointed to them as more likely to commit crime.

In the city of Memphis, Tennessee, a program called Blue CRUSH (for Crime Reduction Utilizing Statistical History) provides police officers with relatively precise areas of interest in terms of locality (a few blocks) and time (a few hours during a particular day of the week). The system ostensibly helps law enforcement better target its scarce resources. Since its inception in 2006, major property crimes and violent offenses have fallen by a quarter, according to one measure (though of course, this says nothing about causality; there is nothing to indicate that the decrease is due to Blue CRUSH).
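The Blue CRUSH description, a locality of a few blocks and a time window of a few hours on a particular day of the week, amounts to bucketing past incidents and ranking the buckets. Below is a minimal sketch of that idea in Python; the incident records, the hotspot_key helper, and the four-hour window are illustrative assumptions, not details of the actual system.

    from collections import Counter
    from datetime import datetime

    # Hypothetical incident records: (block id, timestamp of the incident).
    incidents = [
        ("block_12", datetime(2006, 3, 6, 22, 15)),
        ("block_12", datetime(2006, 3, 13, 23, 40)),
        ("block_12", datetime(2006, 3, 20, 21, 5)),
        ("block_07", datetime(2006, 3, 8, 14, 30)),
    ]

    def hotspot_key(block, ts):
        # Bucket by block, day of week, and a four-hour time-of-day window,
        # mirroring the "few blocks, few hours" granularity the passage describes.
        return (block, ts.strftime("%A"), ts.hour // 4)

    # Count historical incidents per bucket; the top buckets are the
    # "areas of interest" an officer might be pointed to.
    counts = Counter(hotspot_key(b, t) for b, t in incidents)
    for (block, day, bucket), n in counts.most_common(3):
        start = bucket * 4
        print(f"{block}, {day}s, {start:02d}:00-{start + 4:02d}:00: {n} incidents")

As the passage itself stresses, a ranking like this reflects correlation in past data only; a drop in crime after deploying it says nothing about causality.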
In many contexts, data analysis is already employed in the name of prevention. It is used to lump us into cohorts of people like us, and we are often characterized accordingly. Actuarial tables note that men over 50 are prone to prostate cancer, so members of that group may pay more for health insurance even if they never get prostate cancer. High-school students with good grades, as a group, are less likely to get into car accidents, so some of their less-learned peers have to pay higher insurance premiums. Individuals with certain characteristics are subjected to extra screening when they pass through airport security.

That is the idea behind "profiling" in today's small-data world: find a common association in the data, define a group of people to whom it applies, and then place those people under additional scrutiny. It is a generalizable rule that applies to everyone in the group. Profiling, if misused, can lead not only to discrimination against certain groups but also to "guilt by association."

In contrast, big-data predictions about people are different. Where today's forecasts of likely behavior, found in things like insurance premiums or credit scores, usually rely on a handful of factors that are based on a mental model of the issue at hand (that is, previous health problems or loan repayment history), with big data's non-causal analysis we often simply identify the most suitable predictors from the sea of information. Most important, using big data we hope to identify specific individuals rather than groups; this liberates us from profiling's shortcoming of making every predicted suspect a case of guilt by association. In a big-data world, somebody with an Arabic name, who has paid in cash for a one-way ticket in first class, may no longer be subjected to secondary screening at an airport if other data specific to him make it very unlikely that he is a terrorist. With big data we can escape the straitjacket of group identities, and replace them with much more granular predictions for each individual.

The promise of big data is that we do what we have been doing all along, profiling, but make it better, less discriminatory, and more individualized. That sounds acceptable if the aim is simply to prevent unwanted actions. But it becomes very dangerous if we use big-data predictions to decide whether somebody is culpable and ought to be punished for behavior that has not yet happened.

The very idea of penalizing based on propensities is nauseating. To accuse a person of some possible future behavior is to negate the very foundation of justice: that one must have done something before we can hold him accountable for it. After all, thinking bad things is not illegal; doing them is. It is a fundamental tenet of our society that individual responsibility is tied to individual choice of action. If one is forced at gunpoint to open the company's safe, one has no choice and thus is not held responsible.

If big-data predictions were perfect, if algorithms could foresee our future with flawless clarity, we would no longer have a choice to act in the future. We would behave exactly as predicted. Were perfect predictions possible, they would deny human volition, our ability to live our lives freely. Also, ironically, by depriving us of choice they would exculpate us from any responsibility.

Of course, perfect prediction is impossible. Rather, big-data analysis will predict that for a specific individual, a particular future behavior has a certain probability. Consider, for example, research conducted by Richard Berk, a professor of statistics and criminology at the University of Pennsylvania. He claims his method can predict whether a person released on parole will be involved in a homicide (either kill or be killed). As inputs he uses numerous case-specific variables, including reason for incarceration and date of first offense, but also demographic data like age and gender. Berk suggests that he can forecast a future murder among those on parole with at least a 75 percent probability. That is not bad. However, it also means that should parole boards rely on Berk's analysis, they would be wrong as often as one out of four times.
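As a rough illustration of the kind of model this passage describes, the sketch below trains a classifier on synthetic parolee records and reads off a per-individual probability. The feature encodings, data, and model choice are invented for the example; they are not Berk's actual variables, data, or method. The closing comment restates the passage's arithmetic: a tool that is right 75 percent of the time is wrong for one parolee in four.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic records whose columns echo the inputs the passage lists:
    # reason for incarceration, age at first offense, current age, gender.
    rng = np.random.default_rng(0)
    n = 1000
    X = np.column_stack([
        rng.integers(0, 5, n),    # encoded reason for incarceration
        rng.integers(14, 40, n),  # age at first offense
        rng.integers(18, 70, n),  # current age
        rng.integers(0, 2, n),    # encoded gender
    ])
    y = rng.integers(0, 2, n)     # 1 = later involved in a homicide (synthetic label)

    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

    # The output is a probability for a specific individual, not a verdict;
    # at 75 percent accuracy, acting on it is wrong one time in four.
    p = model.predict_proba(X[:1])[0, 1]
    print(f"predicted probability of homicide involvement: {p:.2f}")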
But the core problem with relying on such predictions is not that they expose society to risk. The fundamental trouble is that with such a system we essentially punish people before they do something bad. And by intervening before they act (for instance, by denying them parole if predictions show there is a high probability that they will murder), we never know whether or not they would have actually committed the predicted crime. We do not let fate play out, and yet we hold individuals responsible for what our prediction tells us they would have done. Such predictions can never be disproven.

This negates the very idea of the presumption of innocence, the principle upon which our legal system, as well as our sense of fairness, is based. And if we hold people responsible for predicted future acts, ones they may never commit, we also deny that humans have a capacity for moral choice.

The important point here is not simply one of policing. The danger is much broader than criminal justice; it covers all areas of society, all instances of human judgment in which big-data predictions are used to decide whether people are culpable for future acts or not. Those include everything from a company's decision to dismiss an employee, to a doctor denying a patient surgery, to a spouse filing for divorce.

Perhaps with such a system society would be safer or more efficient, but an essential part of what makes us human, our ability to choose the actions we take and be held accountable for them, would be destroyed. Big data would have become a tool to collectivize human choice and abandon free will in our society.

Of course, big data offers numerous benefits. What turns it into a weapon of dehumanization is a shortcoming, not of big data itself, but of the ways we use its predictions. The crux is that holding people culpable for predicted acts before they can commit them uses big-data predictions based on correlations to make causal decisions about individual responsibility.

Big data is useful to understand present and future risk, and to adjust our actions accordingly. Its predictions help patients and insurers, lenders and consumers. But big data does not tell us anything about causality. In contrast, assigning guilt, individual culpability, requires that the people we judge have chosen a particular action. Their decision must have been causal for the action that followed. Precisely because big data is based on correlations, it is an utterly unsuitable tool to help us judge causality and thus assign individual culpability.

The trouble is that humans are primed to see the world through the lens of cause and effect. Thus big data is under constant threat of being abused for causal purposes, of being tied to rosy visions of how much more effective our judgment, our human decision-making of assigning culpability, could be if we only were armed with big-data predictions. It is the quintessential slippery slope, leading straight to the society portrayed in Minority Report, a world in which individual choice and free will have been eliminated, in which our individual moral compass has been replaced by predictive algorithms and individuals are exposed to the unencumbered brunt of collective fiat. If so employed, big data threatens to imprison us, perhaps literally, in probabilities.