An open letter signed by Elon Musk and AI experts warned of an 'out-of-control' AI race with potential risks to humanity. Here are the 4 key points.
- AI experts and company leaders have signed an open letter calling for a pause on AI development.
- The Future of Life Institute's letter warned of an 'out-of-control' race to deploy the new tech.
Artificial intelligence heavyweights are calling for a pause on advanced AI development.
Elon Musk, Steve Wozniak, Pinterest cofounder Evan Sharp, and Stability AI CEO Emad Mostaque have all added their signatures to an open letter issued by the Future of Life Institute, a non-profit that works to reduce existential risk from powerful technologies.
The letter warns that AI systems such as OpenAI's GPT-4 are becoming "human-competitive at general tasks" and pose a potential risk to humanity and society. It calls on AI labs to pause training any tech more powerful than GPT-4 for six months while the dangers of the new technology are properly assessed.
Industry experts also threw their weight behind the letter, including Yoshua Bengio, sometimes referred to as one of the "godfathers of AI," and influential computer scientist Stuart Russell. At the time of publication, no representatives from OpenAI appeared to have signed the letter.
The letter cites concerns about the spread of misinformation, the automation of jobs, and the possibility of humanity losing control of civilization. Here are the four key points:
Loss of control
The non-profit floats the possibility of developers losing control of powerful new AI systems and of the effects those systems could have on civilization. It suggests companies are racing to develop AI technology so advanced that not even its creators can "understand, predict, or reliably control" it.
The letter stated: "Should we risk loss of control of our civilization? Such decisions must not be delegated to unelected tech leaders."
A "dangerous race"
The letter warned that AI companies are locked in an "out-of-control race to develop and deploy" new advanced systems. In recent months, the viral popularity of OpenAI's ChatGPT has appeared to push other companies to release their own AI products.
The letter urged companies to reap the rewards of an "AI summer" while society has a chance to adapt to the new technology, rather than rushing "unprepared into a fall."
AI automation and misinformation
The letter highlighted several risks of the new tech, including the possibility that nonhuman minds will eventually "outnumber, outsmart, obsolete and replace us."
It said that AI systems are becoming "human-competitive" at some tasks and cited concerns around misinformation and labor automation, stating: "Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones?"
A six-month pause
The open letter asks for a six-month break from training any AI systems more powerful than GPT-4, the most advanced model on the market at the time of publication.
It asks developers to work with policymakers to create AI governance systems, highlighting the need for regulatory authorities as well as AI "watermarking systems" to help people differentiate between human and AI-made content. The letter also suggests the need for "well-resourced institutions" to cope with economic and political disruptions caused by AI.
The open letter stated the pause should be a step back from a "dangerous race" around advanced technology rather than a complete stop on general AI development.