AI Cannot Be Limited and Could Trigger Major Disaster
The threat of existential disaster arises when humans are unable to limit the development of AI. Roman Yampolskiy, Associate Professor of Computer Engineering and Computer Science at the University of Louisville’s Speed School of Engineering, reports that his review of the scientific literature found no evidence that AI can be controlled.
“We are facing an event that almost certainly has the potential to cause an existential catastrophe. It is not surprising that many consider this the most important problem humanity has ever faced. The outcome could be prosperity or extinction, and the fate of the universe hangs in the balance,” said Yampolskiy.
He argued that AI should not be deployed without a demonstrated need, even though many organizations are already implementing and developing it, because the technology remains poorly understood, poorly defined, and under-researched.
In his forthcoming book, ‘AI: Unexplainable, Unpredictable, Uncontrollable’, he explains that AI has the potential to reshape society dramatically.
Researchers Assume AI Can Be Controlled
“Why do so many researchers assume that the AI control problem is solvable? To the best of our knowledge, there is no evidence for that. Before embarking on a quest to build controlled AI, it is important to show that the problem is solvable,” stressed Yampolskiy.
Dozens of prominent figures in politics and business are urging world leaders to confront the threats posed by artificial intelligence (AI) and the climate crisis. Among them are Virgin Group founder Richard Branson and Charles Oppenheimer, grandson of J. Robert Oppenheimer, the physicist known as the father of the atomic bomb. Together with former UN Secretary-General Ban Ki-moon, they signed an open letter urging action to address the growing threats of the climate crisis, pandemics, nuclear weapons, and uncontrolled AI.
“Our world is in grave danger. We face a series of threats that endanger all of humanity. Our leaders are not responding with the wisdom and urgency required,” the letter reads. “The impact of these threats is already visible: a rapidly changing climate, a pandemic that has killed millions of people and cost trillions of dollars, and wars in which the use of nuclear weapons is openly discussed,” it continues.
The open letter calls for multilateral action, including financing the transition away from fossil fuels, signing an equitable pandemic treaty, restarting nuclear arms talks, and building the global governance needed to make AI a force for good. The letter was published by The Elders, a non-profit organization launched by former South African President Nelson Mandela and Branson to work on human rights issues and advocate for world peace.
The letter’s message is also backed by the Future of Life Institute, a non-profit organization founded by Massachusetts Institute of Technology cosmologist Max Tegmark and Skype co-founder Jaan Tallinn. The organization aims to steer transformative technologies such as AI toward benefiting humanity and away from large-scale risks.