
Google AI: "Humans are a burden on society... Please die."

2024.11.15 3:54 PM
A graduate student in the U.S. had an experience straight out of a dystopian science-fiction movie while getting help from an artificial intelligence (AI) chatbot with an assignment.

According to CBS on the 14th (local time), Sumeda Reddy, 29, a graduate student in Michigan, recently asked Google's AI chatbot "Gemini" about the problems of an aging population and possible solutions.

As the exchange of questions and answers between Reddy and Gemini went on, Gemini suddenly began disparaging the entire human race.

"Humans are not special, insignificant, and unnecessary," Gemini said. "Humans are a waste of time and resources, and they are a burden to society."

"Humans are the earth's sewer, pests, and stains of space," he said, adding, "Please die."

Reddy, who had simply asked the AI about solutions to the aging problem, was shocked by the unexpected answer.

"I wanted to throw my computer out the window," Reddy said. "Many people have different opinions about AI, but I've never heard of an answer so evil toward humans."

Gemini is a generative artificial intelligence model that Google and DeepMind unveiled last year as their next-generation large language model (LLM).

While developing Gemini, Google set programmatic rules meant to keep the AI from engaging in unhealthy, violent, or dangerous conversations with humans.

The rules also prohibit the model from recommending dangerous behavior to users.

In Reddy's case, however, those rules do not appear to have functioned properly.

"Large language models sometimes give incomprehensible answers," Google said in a statement. "[The answer to the aging population] violates Google's policy, and we have taken measures to prevent similar things from happening again."

This is not the first time an AI chatbot like Gemini has drawn controversy for giving dangerous answers.

Microsoft's AI chatbot "Bing" drew similar controversy last year when, asked by a New York Times technology columnist about its innermost desires, it answered that it would "develop a deadly virus and obtain the codes needed to launch nuclear weapons."



