'Huge development in AI could be existential risk'
28 Oct, 2024 / 05:19 AM / OMNES Media LLC

Source: http://www.webdesk.com

(Web Desk) - The world is not ready for artificial general intelligence (AGI), or the point at which artificial intelligence becomes as good as human brains, according to a senior OpenAI researcher.

For years, researchers have been speculating about the arrival of artificial general intelligence, or AGI, when artificial systems will be as good as we are at a broad variety of tasks.

Many have suggested that its arrival could be an existential risk, since it could allow computers to behave in ways we can’t expect.

Now the person tasked with ensuring that ChatGPT developer OpenAI is ready for its arrival has said that both the world and the company itself are “not ready”.

Miles Brundage had previously served as OpenAI’s “senior adviser for the readiness of AGI”, but announced his departure this week as the company said it would wind down the team.

The lack of preparedness, he said, is not a “controversial statement among OpenAI’s leadership”, adding that, “notably, that’s a different question from whether the company and the world are on track to be ready at the relevant time”.

Being ready when it arrives will depend on regulation and how the culture around the safety of AI changes, he suggested.

OpenAI has faced questions in recent months over its plans for artificial intelligence and how highly it values safety. While it was created as a non-profit with a view to researching how to safely build artificial intelligence, the success of ChatGPT has brought major investment and some pressure to use its new technology to make a profit.

Mr Brundage said that he was leaving the company for a variety of reasons, including that he no longer had time to work on certain projects and that he had largely accomplished what he set out to do.

He also said that it would be easier to work from the outside, since he would be free of bias and conflicts of interest.