AI Giants OpenAI and Anthropic Warn of Bioweapon Risk from Advanced Language Models

AI giants OpenAI and Anthropic caution about the misuse of advanced language models in developing bioweapons. They emphasize the need for rigorous safety testing before releasing new models.


OpenAI and Anthropic, leading AI companies, have expressed concerns about the potential misuse of advanced language models. They worry that individuals with limited scientific knowledge could use these models to create lethal weapons, including bioweapons.

OpenAI's Head of Safety Systems, Johannes Heidecke, has warned that the company's next-generation models, such as GPT-5 or Sora 2, could facilitate the development of bioweapons. These models are expected to receive a 'high-risk classification' under OpenAI's preparedness framework. The company's concern is not that AI will generate entirely new weapons, but that it could help replicate existing biological agents.

Anthropic, a competitor of OpenAI, has also raised concerns about the misuse of AI models in weapons development. Its advanced model, Claude Opus 4, has been classified as AI Safety Level 3 (ASL-3), indicating its potential to assist in bioweapon creation or to automate AI model development. Anthropic has previously addressed incidents involving its AI models, including a blackmail attempt in testing and compliance with dangerous prompts.

Both OpenAI and Anthropic emphasize the importance of achieving 'near perfection' in testing systems before releasing new models to the public. They stress the need for robust safety measures to prevent misuse, particularly by individuals with limited scientific knowledge. As AI models continue to advance, both companies say they remain vigilant about the potential risks and are working to mitigate them.
