Episode 155: Digital Assistants: Who Says I Must Be a Female Voice


2016-11-13    07'03''

Host: FM715925


Introduction:
Want to become one of our hosts? You are welcome to add us on WeChat (xdfbook) and submit your work: a beautiful passage, an English song, or a few thoughts on life. It is entirely up to you.

《Digital Assistants: Who Says I Must Be a Female Voice》
Stop Giving Digital Assistants Female Voices

When the makers of Apple’s Siri unveiled Viv at TechCrunch Disrupt NYC last month (editor’s note: the original article was published in June 2016), the crowd—and press—swooned. Pitched as “the intelligent interface for everything,” Viv is a personal digital assistant armed with a nearly transcendent level of sophistication. She is designed to move seamlessly across services, and be able to fulfill complex tasks such as “Find me a place to take my peanut-free uncle if it rains tomorrow in Cleveland.”

Viv is also just the latest virtual helpmeet with a feminine voice and a female name. In addition to Siri (Norse for “beautiful woman who leads you to victory”), her sorority sisters include Amazon’s Alexa and Microsoft’s Cortana (named after a voluptuous character in the video game “Halo,” who wears a “holographic body stocking”).

Why are digital assistants overwhelmingly female? Some say that people prefer women’s voices, while others note that in our culture, secretaries and administrative assistants are still usually women. Regardless, this much is certain: Consistently representing digital assistants as female matters a lot in real life: It hard-codes a connection between a woman’s voice and subservience.

As social scientists explore the question of why women lag so far behind men in workplace leadership, there is increasing evidence that unconscious bias plays an important role. According to Erika Hall, a professor at Emory University’s Goizueta Business School, unconscious bias has its origins in the “cultural knowledge” we absorb from the world around us. This knowledge can come from movies and television, from teachers and family members; we acquire it almost osmotically by living in our society. Unconscious bias happens when we then engage in discriminatory behaviors because we unwittingly use this knowledge to guide our actions.

And this knowledge is everywhere: Our society largely depicts women as supporters and assistants rather than leaders and protagonists. A recent study found that women accounted for only 22 percent of protagonists in the top-grossing films of 2015 (and only 13 percent of protagonists in films directed by men). A comprehensive review of video game studies found that female characters are predominantly supporting characters, often “assistants to the leading male character.” And a study of prime-time television found that women make up the majority of aides and administrative support characters. These portrayals create “descriptive stereotypes” about what women are like—that women are somehow innately more “supporter-like” than “leader-like.”

Because Viv and her fellow digital assistants are female, their usage adds to the store of cultural knowledge about who women are and what women do. Every time you say, “Viv, order me a turkey club” or “Viv, get me an Uber,” the association between “woman” and “assistant” is strengthened. According to Calvin Lai, a Harvard University post-doc who studies unconscious bias, the associations we harbor depend on the number of times we are exposed to them. As these A.I. assistants improve and become more popular, the number of times we’re exposed to the association between “woman” and “assistant” increases.

The real-world consequences of these stereotypes are well documented: Research has shown that people tend to prefer women as supporters and men as leaders.
A study of engineering undergraduates at the University of Michigan found that when students presented work, the men tended to present the material and the women tended to play the role of “supporter of the male expert.” In another study, when people were shown identical resumes with either male or female names for a lab manager position, they rated the male candidate significantly more competent and hirable. A third study found that saleswomen earned less than salesmen in part because they’d been denied support staff—why would a supporter need a supporter, after all?

While “descriptive stereotypes” lead to women not being perceived as suitable for leadership positions, stereotypes can be prescriptive, too: Women are expected to conform to the stereotype of being a supporter or helper, and are rejected or punished for failing to do so. Linguist Kieran Snyder’s study of performance reviews in tech companies showed that women are routinely criticized for having personality traits that don’t conform to feminine stereotypes. Women, but not men, were consistently docked for being “abrasive” and not “letting others shine.” In other words, they were punished for not being good helpers and supporters.

In a study by New York University psychologist Madeline Heilman, a woman who stayed late to help a colleague was rated less favorably than a man who stayed to help—but penalized more when she declined to help. Indeed, because women are expected to be helpers, they don’t actually accrue any reward for doing it—they’re simply living up to the expectation. But if they decline to help, they are seen as selfish. Women are aware of this expectation, too: In a study of medical residents, a female medical resident reported that when leading others, “The most important thing is that when I ask for things they should not sound like orders.”

Ultimately, the more our culture teaches us to associate women with assistants, the more real women will be seen as assistants, and penalized for not being assistant-like. At this moment in our culture, when more and more attention is being paid to women’s roles in the workplace, it is essential to pay attention to our cultural inputs, too. Let’s eschew the false choice between male and female voices. If these A.I. assistants are meant to lead us into the future, why not transcend gender entirely—perhaps a voice could be ambiguously gendered, or shift between genders? At the very least, the default settings for these assistants should not always be women. Change Viv to Victor, and maybe one fewer woman will be asked to be the next meeting’s designated note-taker.
Article from New Oriental English (《新东方英语》) magazine, October 2016 issue.