A team of scientists has developed an advanced computer algorithm that can predict the toxicity of new chemicals more reliably than standard animal tests. This breakthrough has the potential to spare millions of animals from such testing.
Each year around the globe, an estimated 3–4 million rabbits, rats, and other animals are subjected to tests of new chemical compounds intended for human or environmental use, and the same chemical is often tested dozens of times. The computer algorithm, however, equaled or outperformed the animal tests in six areas that account for nearly 60 percent of all such toxicity tests: acute oral and dermal toxicity, eye and skin irritation, DNA mutation, and skin sensitization.
Two years ago, Dr. Thomas Hartung and his team at Johns Hopkins developed the world's largest toxicological database, containing information on the properties and structures of 10,000 chemical compounds, based in part on 800,000 toxicology tests. In the new study, published in the journal Toxicological Sciences on July 11, the researchers expanded the database and developed a computer algorithm that generates a map of the relationships between chemical structures and toxic properties. Using related software also developed by the team, any new chemical compound can now be placed precisely on that map, and its likely toxicity predicted, with more accuracy than any single animal test, from its proximity to other compounds on the map.
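The idea of predicting a compound's toxicity from its neighbors on a structure-similarity map can be illustrated with a toy sketch. Everything below is hypothetical: the fingerprints, compound labels, and the `tanimoto` and `predict_toxic` helpers are invented for illustration and are not the researchers' actual software, which draws on a database of roughly 10,000 chemicals.

```python
def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two structural fingerprints,
    each represented here as a set of structural-feature IDs."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def predict_toxic(query, database, k=3):
    """Label a query compound by a similarity-weighted vote of its k most
    similar neighbors. `database` is a list of (fingerprint, is_toxic) pairs."""
    neighbors = sorted(database, key=lambda rec: tanimoto(query, rec[0]),
                       reverse=True)[:k]
    votes = sum(tanimoto(query, fp) * (1 if toxic else -1)
                for fp, toxic in neighbors)
    return votes > 0

# Invented example data: two clusters of structurally similar compounds.
db = [
    ({1, 2, 3, 4}, True),    # toxic cluster
    ({1, 2, 3, 5}, True),
    ({7, 8, 9}, False),      # non-toxic cluster
    ({7, 8, 10}, False),
]

print(predict_toxic({1, 2, 3, 6}, db))   # near the toxic cluster -> True
print(predict_toxic({7, 8, 11}, db))     # near the non-toxic cluster -> False
```

A nearest-neighbor vote like this is only a caricature of the published approach, but it captures the core intuition: structurally similar compounds tend to share toxic properties, so position on the map is predictive.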
“These results ... suggest that we can replace many animal tests with computer-based prediction and get more reliable results,” says Hartung. “Our automated approach clearly outperformed the animal test, in a very solid assessment using data on thousands of different chemicals and tests. So it’s big news for toxicology.” It is also big news for animal welfare.