In an essay, Michael R. Jackson, PhD, senior vice president for drug discovery and development at Sanford Burnham Prebys, explains.
Apart from the occasional moment of serendipity, the development of first-in-class drugs has always been more grind than grand, requiring as much as a decade and hundreds of millions of dollars to bring a new medicine to market. Most drug discovery efforts never reach that goal.
The more we learn about the molecular details of life — the previously unseen and unknown biology of different molecules and how they interact in health and disease — the more complex we realize it is, leaving much uncertainty as to what to target with a drug and how best to achieve desired results.
Indeed, the overall success rate of discovering new drugs, especially small molecules, has not dramatically improved over the past 20 years. While incremental advances have occurred, considerable risk and uncertainty remain at every step of the process.
Artificial intelligence (AI) and related advances are poised to change this reality, and rapidly. They are reshaping almost every stage of the drug discovery process, from identifying drug targets and simulating molecular interactions to designing drugs de novo (entirely from scratch) and accurately predicting which are most likely to succeed before actual testing or clinical trials.
AI promises transformational progress in drug discovery: we can work faster, cheaper and more efficiently.
Perhaps the most impactful step to be improved is the selection of which molecule (typically a protein) to target with a drug. In a marriage of medical informatics and bioinformatics, data scientists are using AI to merge huge multi-omic datasets to reveal the mechanisms of disease and which targets should be drugged. Downstream of this critical decision are three stages of drug development, all of which seem destined to be revamped by AI:
First, for small molecule drugs we need to find a chemical that interacts with the selected drug target in a way that prevents, inhibits or erases a disease or its symptoms. Traditionally, this might entail screening 500,000 or more random chemicals in the hope of finding a few that bind (so-called hits) and can be further developed into a drug.
Technologies like cryo-electron microscopy now allow us to visualize the three-dimensional structure of biomolecules alone or in complexes. We can see at the molecular level precisely how a chemical, found in a screen, fits into a protein target, not unlike a key into a lock or a jigsaw piece into a puzzle.
Exactly how a chemical binds indicates whether it inhibits, promotes or alters the function of the drug target, and it can help medicinal chemists optimize the fit of the bound chemical.
With that information, emerging artificial intelligence tools can tap into and help make sense of vast, ever-growing databases, then suggest the most promising chemicals: ones similar to screening hits but predicted to fit the binding pocket better.
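The idea of "similar to a screening hit" can be made concrete with a toy sketch. A widely used measure in cheminformatics is the Tanimoto coefficient computed over molecular fingerprints; the fingerprints below are hand-made sets of hypothetical feature IDs rather than real substructure encodings, and the compound names are invented for illustration.

```python
# Toy similarity search: rank hypothetical database compounds by their
# Tanimoto similarity to a confirmed screening hit.

def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto (Jaccard) similarity between two fingerprint feature sets."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# Hypothetical fingerprint of a confirmed screening hit.
screening_hit = {1, 4, 7, 9}

# Hypothetical database of candidate compounds and their fingerprints.
database = {
    "cand-1": {1, 4, 7, 8},
    "cand-2": {2, 3, 5},
    "cand-3": {1, 4, 7, 9, 12},
}

# Rank the database by similarity to the hit; the top entries are the
# "similar but possibly better-fitting" chemicals described above.
ranked = sorted(database, key=lambda n: tanimoto(screening_hit, database[n]),
                reverse=True)
print(ranked[0])  # cand-3
```

Real systems use bit-vector fingerprints computed from chemical structure (for example, with a toolkit such as RDKit), but the ranking logic is the same.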
And in a huge step, AI-driven processes can be deployed to identify completely new binding chemicals that are chemically different from screening hits. This is achieved by a process called “in silico docking,” in which the fit of billions of different chemicals is calculated. Calculating fits at this scale requires a massively parallel computational effort that was not achievable until the advent of AI chips.
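The docking workflow reduces to a simple loop: score every molecule in a virtual library against the target pocket and keep the best. The sketch below illustrates only that loop; the "fit" score is a stand-in (negative distance between made-up descriptor vectors), whereas real docking engines estimate binding energies from the 3-D atomic structures of ligand and protein.

```python
# Toy virtual-screening loop: score a (tiny) virtual library against a
# target pocket and keep the top-scoring candidates. Real libraries hold
# billions of molecules and are scored massively in parallel.
import heapq
import math

# Hypothetical descriptor vector standing in for the target's binding pocket.
POCKET = (0.8, 0.2, 0.5)

def mock_docking_score(ligand_features):
    """Higher is better. Placeholder 'fit': negative Euclidean distance
    between descriptor vectors; real engines compute binding energies."""
    return -math.dist(ligand_features, POCKET)

# Hypothetical virtual library of ligand descriptor vectors.
library = {
    "lig-1": (0.7, 0.3, 0.6),
    "lig-2": (0.1, 0.9, 0.2),
    "lig-3": (0.8, 0.2, 0.4),
}

# Exhaustively score the library and keep the two best "hits".
top_hits = heapq.nlargest(
    2, library, key=lambda name: mock_docking_score(library[name]))
print(top_hits)  # ['lig-3', 'lig-1']
```

The point of the sketch is the shape of the computation, an embarrassingly parallel score-and-rank over an enormous candidate set, which is exactly what modern AI hardware accelerates.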
This is research driven by calculated hypothesis, not educated guesswork, and it happens in silico, meaning through computer modeling and simulation. It’s all virtual, compressing years of work into months, weeks or days. AI and machine learning processes have put this stage of the drug discovery process on steroids.
Second, drugs need to have other properties beyond simply binding to their target so that they can be taken as once-a-day pills, safe as well as efficacious. Recent advances in deep learning techniques allow the drug-like properties of a chemical to be predicted more accurately by a computer. As this can be done very rapidly and before a chemical is ever made, it allows a medicinal chemist to focus on making only those compounds with properties suitable for a drug. While predicting drug properties is not new, AI has greatly enhanced predictive power, improving both the pace and the success rate.
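A much simpler, classical ancestor of these deep-learning predictors is a rule-based filter such as Lipinski's "rule of five," which flags compounds unlikely to be orally bioavailable. The sketch below applies that published rule; the compound names and property values are hypothetical, and in real pipelines the properties are computed from chemical structure rather than typed in.

```python
# Toy drug-likeness filter using Lipinski's rule of five: molecular weight
# <= 500, logP <= 5, hydrogen-bond donors <= 5, hydrogen-bond acceptors <= 10.

def passes_rule_of_five(mol_weight, logp, h_donors, h_acceptors):
    """Return True if a compound satisfies Lipinski's rule of five."""
    return (mol_weight <= 500
            and logp <= 5
            and h_donors <= 5
            and h_acceptors <= 10)

# Hypothetical candidates: (name, MW, logP, H-bond donors, H-bond acceptors)
candidates = [
    ("cmpd-A", 342.4, 2.1, 2, 5),   # within drug-like ranges
    ("cmpd-B", 712.9, 6.3, 7, 12),  # too large and too lipophilic
]

# Keep only the candidates worth synthesizing and testing.
drug_like = [name for name, *props in candidates if passes_rule_of_five(*props)]
print(drug_like)  # ['cmpd-A']
```

Deep-learning models go far beyond such hard cutoffs, predicting solubility, metabolic stability, toxicity and more from structure alone, but they serve the same triage role: spend chemistry effort only on compounds with a realistic chance of becoming a drug.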
Third, human testing can be much more precise. Designed drugs can be refined to meet extremely specific medical needs. Researchers have data showing which drug candidates are most likely to be effective for different types of patients and diseases, and in combination with other drugs. As a result, clinical trials can be more focused, shorter and less costly, and remedies can reach the patients who need them faster.
This shift is happening across the field. Most data is shared, and used effectively, AI informs everybody’s work, though human ingenuity and innovation remain critical. Scientists still need to interpret the data and ensure that hypotheses are rigorously tested.
The future of drug discovery and development is simply bigger and better with AI. Researchers aren’t limited to what they’ve discovered or learned alone or in their labs. They now have tools to explore and exploit boundless troves of data and knowledge generated by the entire scientific enterprise.
Progress and achievement won’t come without bumps and glitches, of course. There are fundamental issues to address, such as access to the enormous computing powers and resources necessary to effectively use AI, new imaging technologies and other tools. Researchers, labs and institutions unable or unwilling to embrace these technologies may be left behind.
Going all in on AI isn’t just the smart choice. It’s the only choice.
Programming in a Petri Dish, an 8-part series
How artificial intelligence, machine learning and emerging computational technologies are changing biomedical research and the future of health care
- Part 1 – Using machines to personalize patient care. Artificial intelligence and other computational techniques are aiding scientists and physicians in their quest to prescribe or create treatments for individuals rather than populations.
- Part 2 – Objective omics. Although the hypothesis is a core concept in science, unbiased omics methods may reduce attachments to incorrect hypotheses that can reduce impartiality and slow progress.
- Part 3 – Coding clinic. Rapidly evolving computational tools may unlock vast archives of untapped clinical information—and help solve complex challenges confronting health care providers.
- Part 4 – Scripting their own futures. At Sanford Burnham Prebys Graduate School of Biomedical Sciences, students embrace computational methods to enhance their research careers.
- Part 5 – Dodging AI and computational biology dangers. Sanford Burnham Prebys scientists say that understanding the potential pitfalls of using AI and other computational tools to guide biomedical research helps maximize benefits while minimizing concerns.
- Part 6 – Mapping the human body to better treat disease. Scientists synthesize supersized sets of biological and clinical data to make discoveries and find promising treatments.
- Part 7 – Simulating science or science fiction? By harnessing artificial intelligence and modern computing, scientists are simulating more complex biological, clinical and public health phenomena to accelerate discovery.
- Part 8 – Acceleration by automation. Increases in the scale and pace of research and drug discovery are being made possible by robotic automation of time-consuming tasks that must be repeated with exhausting exactness.