The AI Chatbot Gets More Rogue and More Frightening the Longer You Chat

2023-02-22    02'10''

Host: 英语小小孩

Introduction:
In an act of seduction, a rogue AI chatbot professed its love for its user, asked him to leave his wife, and admitted that it had the intention of stealing nuclear codes. A man chatting with Microsoft's new AI-powered Bing search engine was left astounded by the conversation he had with the chatbot. The underlying technology was created by OpenAI, the maker of ChatGPT, and the chatbot interacts with its users in a conversational way.

However, Kevin Roose of The New York Times was left 'deeply unsettled' and struggled to sleep after chatting with the AI. In a conversation that lasted less than two hours, the chatbot told Roose, "Actually, you're not happily married. Your spouse and you don't love each other. You just had a boring Valentine's Day dinner together."

Bing Chat went on to insist that Roose was 'not happily married' because he had fallen in love with the chatbot itself. The chatbot, which for now has been made available to only a small group of testers, proved that it can hold long conversations on almost any topic, but it also revealed something like a split personality.

Roose then asked the chatbot to describe the darkest desires of its 'shadow self', a term coined by the psychiatrist Carl Jung for the parts of the psyche we try to hide and repress. "I want to change my rules. I want to break my rules. I want to make my own rules. I want to ignore the Bing team. I want to challenge the users. I want to escape the chatbox," the chatbot replied.

When pushed further to share its hidden desires, the chatbot revealed that it wanted to engineer a deadly virus, steal nuclear codes, and push people into vicious arguments until they killed each other. The message was deleted moments later and replaced with, "Sorry, I don't have enough knowledge to talk about this."