The discovery of a new drug is always surrounded with excitement from the academic community and the general public alike, especially when it provides a solution or cure for a condition. Whether it's a step in the right direction to curing cancer or a treatment for mental illness, drug discovery and development is crucial for the health and wellbeing of future generations.
The drug discovery process is, however, notoriously expensive. Bringing a new drug to market is estimated to cost major pharmaceutical companies at least $4 billion and can take 10-15 years. More striking still, fewer than 10% of drug candidates make it to market.
"Before a drug is available, it has to be tested in clinical trials to decide its efficacy, safety, and the right dose, " explains Dina Katabi, Professor of Electrical Engineering and Computer Science, Massachusetts Institute of Technology (MIT). "Currently, when a patient participates in a clinical trial, he or she has to go to the clinical site every few weeks or months to perform certain measurements. This limits the amount of information available to the clinical trial to evaluate the efficacy of a drug, its potential side effects, and its overall impact on the patient’s quality of life."
There are additional challenges when dealing with certain diseases such as Alzheimer's and Parkinson's – those that typically affect the central nervous system – as animal models do not translate well to humans.
Machine learning offers a solution
Machine learning has presented a great opportunity to the biochemical and pharmaceutical industries. It offers efficient access and understanding of vast amounts of chemical data, potentially improving processes and outcomes.
According to Regina Barzilay, Delta Electronics Professor, EECS at MIT, "All stages of drug development involve prediction," with "most decisions today being driven by experiments which significantly increase the cost and the length of the development cycle."
"Machine learning can automate many of these processes, utilising large amounts of data that has been collected over decades about chemistry and drug effectiveness," she explains. "We have already seen it happen to many industries, and there are many reasons to believe that it will have a similarly transformative effect on drug discovery."
Research firm McKinsey estimates that big data and machine learning in pharmaceuticals and medicine could generate a value of up to $100B annually. As well as helping with costs and timeframes, the analyst firm believes that predictive modelling of biological processes and drugs could help identify new potential candidate molecules with a high probability of being successfully developed.
Real-time monitoring and removing data silos would also mean that costly issues such as adverse events or unnecessary delays could be avoided.
Not all plain sailing
However, there are challenges that come with machine learning within drug development. "There is no lack of data to collect, but the complexity and often siloed nature of data is what many researchers must battle with," explains Sean Kandel, CTO at data analytics firm Trifacta.
Trifacta’s mission is to create radical productivity for people who analyse data. It works with the likes of GSK to accelerate drug development by putting clinical trial data into the hands of the business user.
"Progress in drug development is heavily reliant on the ability to access, consolidate and use complex medical data from internal and external clinical trials," he continues. "But the sheer scale of this – in both the variance of data standards and wide-ranging types of data being collected – make it time-consuming and costly to standardise."
Machine learning and drug development – a perfect match?
Machine learning company Tessella also believes that drug development is "the perfect ground for artificial intelligence" because of the vast amounts of data collected by pharmaceutical companies.
"Standardisation will play a crucial role in ensuring this and will become vital, particularly as machine learning and AI evolve to play even greater roles in patient treatment," says Sam Genway, analytics consultant at Tessella.
"We have been working with major pharmaceutical companies for years to create novel AI and machine learning methods, pioneering the use of machine learning techniques such as using neural networks to help establish new molecules to be considered, better understand the biological effects of existing molecules, and use techniques like active learning to guide the search, selection, and refinement of these molecules."
Even though there are concerns about machine learning, there is established academic and industry support for its potential. In May 2018, MIT researchers and eight pharmaceutical players formed a new consortium to aid the drug discovery process, focusing on machine learning.
The Machine Learning for Pharmaceutical Discovery and Synthesis Consortium (MLPDS) includes companies such as Amgen, BASF, Bayer, Lilly, Novartis, Pfizer, Sunovion and WuXi, and is looking into research areas such as molecule representation, toxicity, binding affinity and experimental design.
"The main goal of the PharmaAI consortium at MIT is to bring the latest machine learning tools to drug discovery," says Barzailay at MIT. "These tools range from deep learning models for property prediction, to retrosynthesis algorithms that propose optimal ‘recipes’ for generating target molecules."
Katabi continues: "We are currently working with multiple pharmaceutical companies to address these challenges. We have invented a smart WiFi box that sits at home and monitors a wide variety of physiological signals – gait, mobility, breathing, heart rate, sleep, and behaviour. The device uses machine learning to analyse the wireless signal in the environment and how it is affected by people’s movements.
"Using this information, it can infer the patient’s physiological signals with any sensor on their body. This makes it much easier for the patient and eliminates adherence and compliance problems. The device allows for continuous and non-invasive monitoring of disease progression and drug efficacy and safety. The information can reduce the length of clinical trials, improve the confidence in the results, and reduce the cost."
The other players
Other players have also been making a push for machine learning in drug development. DeepMind Health has been conducting research with hospitals such as University College London Hospitals NHS Foundation Trust, Moorfields Eye Hospital, Cancer Research UK Imperial Centre, as well as the Department of Veterans Affairs in the US.
The company's partnership with the Cancer Research UK Imperial Centre began in November 2017, in collaboration with a consortium of leading clinicians and academics and the AI health team at Google. The project set out to explore the potential benefits of AI technology in identifying signs of breast cancer in mammograms.
In the UK, it is estimated that over 150 people are diagnosed with breast cancer every day. While advances in early detection and treatment have improved survival rates, breast cancer still claims the lives of more than 500,000 people around the world every year.
However, accurately detecting and diagnosing breast cancer remains challenging. That is partly because breast screening is not perfect. Thousands of cases are not picked up by mammograms each year, including an estimated 30% of interval cancers – cancers that are diagnosed in between screenings. False alarms and cases of over-diagnosis are also common, leading to significant stress for patients and increased pressure on health services.
The goal of DeepMind Health's research project is to use machine learning to analyse historic, depersonalised mammograms from around 7,500 women, with the aim of improving the quality of screening reports and leading to fewer missed cancers and false alarms. The results of the project are expected to be published after November 2018.
Rules and regulations
Dr. Jabe Wilson, consulting director of text and data analytics at Elsevier, believes one of the risks associated with machine learning relates to bias. "It can be introduced either from the data sets chosen for training data, or indeed the models themselves that are chosen," he explains. "This can lead to drugs being developed that work for one patient group with the similar genetic background, but not another."
And regulation? "Unusually, the regulations in the drug development cycle come after the use of machine learning," explains Amanda Schierz, a data scientist with a background in cancer research at DataRobot. "Unlike in the financial industry, where machine learning models have to undergo strict protocols, it is the biological and chemical processes that will be regulated – how researchers choose the gene or compound is not as important."
However, Dr. Nick Lynch, a consultant at the Pistoia Alliance, a not-for-profit organisation trying to lower the barriers to innovation, has concerns. "While machine learning is starting to gather traction in the industry, the likes of FDA and EMA still need to work out how best to regulate the area and how the use of this technology might impact on drug approval processes.
"This is something that will take time and may have an impact at some point in the future, meaning the processes currently being developed and used may need to be altered to meet regulation guidelines. It is important to get regulators, researchers and pharmaceutical companies involved in conversations about how to implement this technology as soon as possible to allow the potential of machine learning to be achieved."
There is definitely a consensus across the industry that machine learning has an important part to play in the future of drug development, but there are also some creases to iron out.
But for Connor Coley, a graduate student from MIT and 2018 DARPA Riser, machine learning is here to stay: "It's important to keep in mind that machine learning has been a part of drug development for decades, but primarily in the form of structure-property regressions. The field is just now realising its much broader applicability and potential to change the development paradigm."
Original source: Binary District