"Godfather of AI" Geoffrey Hinton Warns of the Technology's Dangers

After resigning from Google, Dr. Geoffrey Hinton, a renowned AI researcher, shared his concerns regarding the difficulty in preventing the malicious use of AI technology.

Dr. Geoffrey Hinton, the "godfather of artificial intelligence" and a former Google employee, recently spoke out against the rapid pace of AI development, citing the need for global regulation. 

In a recent interview with The New York Times, Dr. Hinton expressed concerns about unregulated AI development and noted that it's difficult to prevent the malicious use of AI technology.

The AI researcher believes that as companies enhance their AI systems, they become increasingly dangerous. According to Hinton, until last year Google was cautious about releasing AI technology to prevent harm, but since Microsoft augmented Bing with a chatbot, Google has been rushing to catch up. He warned that the competition between tech giants might be unstoppable and could flood the internet with fake photos, videos, and text, making it hard for people to discern the truth.

Additionally, he's concerned about the impact of AI on the job market. While chatbots like ChatGPT currently complement human workers, they could eventually replace those who perform repetitive tasks, such as paralegals, personal assistants, and translators, according to him. Dr. Hinton added that AI technology could eliminate not only the "drudge work" but also "might take away more than that."

"The idea that this stuff could actually get smarter than people – a few people believed that," he said. "But most people thought it was way off. And I thought it was way off. I thought it was 30 to 50 years or even longer away. Obviously, I no longer think that."

Dr. Hinton also fears that in the future, advanced AI technology may pose a threat to humanity due to its ability to learn unexpected behavior from vast amounts of data. This is especially concerning when individuals and companies allow AI systems to generate and run their own computer code. Dr. Hinton worries that this could lead to the development of truly autonomous weapons, also known as killer robots.

The researcher pointed out that, unlike nuclear weapons, there's no way to verify if countries or companies are secretly working on advanced AI technology. Therefore, the only hope is for leading scientists worldwide to collaborate and find ways to control the technology.

"I don’t think they should scale this up more until they have understood whether they can control it," he said.

In a March interview with CBS News, however, Dr. Hinton likened the recent rapid progress in AI to significant technological advancements like the Industrial Revolution, electricity, or the wheel. And just a few months prior, he described AI as a "supernaturally precocious child" and compared the training of AI models to caterpillars transforming into butterflies. He even referred to OpenAI's GPT-4 model as "humanity's butterfly." So, it's unclear when he shifted his perspective.

The New York Times reported that in April, Dr. Hinton informed Google of his plan to leave, and he officially left after a call with CEO Sundar Pichai last Thursday. While the article implied that Hinton left Google to voice his objections to his former boss, Dr. Hinton clarified that he simply wanted to raise awareness about the risks of AI. He also mentioned that "Google has acted very responsibly."

You can find Dr. Geoffrey Hinton's interview with The New York Times here.