‘Father of AI’ says tech fears misplaced: ‘You cannot stop it’


A German computer scientist known as the “father of AI” said fears about the technology are misplaced and there is no stopping the progress of artificial intelligence.

“You can’t stop it,” Jürgen Schmidhuber said of the current international race to build artificial intelligence and more powerful systems, according to The Guardian. “Certainly not internationally, because one country may have really different goals than another country. So, of course, they’re not going to participate in any kind of deterrence.”

In the 1990s, Schmidhuber worked on artificial neural networks; his research later produced language-processing models used in technologies such as Google Translate, The Guardian reported.

He currently serves as the director of the AI initiative at the King Abdullah University of Science and Technology in Saudi Arabia, and his bio states that he has been working for about 15 years on building “a self-improving artificial intelligence (AI) smarter than himself.”

AI could go ‘Terminator’, gain upper hand over humans in Darwinian rules of evolution, report warns

Jürgen Schmidhuber (Getty Images)

Schmidhuber said that he does not believe anyone should try to stop the development of powerful artificial intelligence systems, arguing that “in 95% of all cases, AI research is really about our original goal, which is making human lives longer and healthier and easier.”

Schmidhuber also said that concerns about AI are misplaced and that developing AI-powered tools for good purposes will counter bad actors using the technology.

Future of AI: New technology will create ‘digital humans’, could use more energy than all working people by 2025

“The same tools that are now being used to make life better can be used by bad actors, but they can also be used against bad actors,” he said, according to The Guardian.

OpenAI logo

Schmidhuber said that concerns about AI are misplaced and that developing AI-powered tools for good purposes will counter bad actors using the technology. (Bloomberg via Getty Images)

“And I would be much more concerned about the old dangers of nuclear bombs than the new little dangers of AI that we see now.”

Tech CEOs warn of AI risks ‘human destruction’ as experts rally behind six-month hiatus

His comments come as other tech leaders and experts have sounded the alarm that the powerful technology poses threats to humanity. Tesla founder Elon Musk and Apple co-founder Steve Wozniak joined thousands of other tech experts in signing a letter in March asking AI labs to pause their research until safeguards are put in place.

Geoffrey Hinton

Artificial intelligence pioneer Geoffrey Hinton speaks at the Thomson Reuters Financial and Risk Summit on December 4, 2017, in Toronto. (Reuters/Mark Blinch/File)

Artificial intelligence ‘godfather’ on AI possibly wiping out humanity: ‘It’s incomprehensible’

Geoffrey Hinton, known as the “Godfather of AI,” announced this month that he quit his job at Google to speak out about his fears regarding the technology. On Friday, Hinton said AI could pose a “more pressing” risk to humanity than climate change — but even though he shares similar concerns with tech leaders like Musk, he said it would be “absolutely unrealistic” to halt AI research.

“I’m in the camp that thinks this is an existential risk, and it’s close enough that we should be working very hard right now and putting a lot of resources into figuring out what we can do about it,” he told Reuters.

Schmidhuber, who has openly criticized Hinton for allegedly failing to cite fellow researchers in his studies, told The Guardian that AI will surpass human intelligence and ultimately benefit people, echoing comments he has made in the past.


“I have been working on [AI] for several decades, since the ’80s basically, and I still believe that it will be possible to witness AIs that are going to be much smarter than me, soon enough that I can retire,” Schmidhuber said in 2018.

