A Guide to Drug Discovery and Development in the UK
Mar 07 2021
Discovering and developing effective drug treatments for any illness or disease is a long and complicated process. Indeed, for every 10,000 compounds tested in the search for new drugs, only one or two will actually become effective, licensed treatments. Even if a compound shows initially encouraging prospects, it can fail, be rejected and subsequently removed from the arena of scientific interest at any time on grounds of safety, efficacy or quality.
Understanding how the drug discovery process works in the UK is a demanding task, especially given how much the industry has changed over time. To better comprehend the challenges in drug discovery and development and how they have evolved, it’s important to learn about the history of the subject. This article aims to give an introductory guide to British methods of discovering, developing and distributing drugs.
What is the drug discovery process?
Historically, drug discovery relied on the analysis and testing of natural materials such as plants, herbs, roots, fungi and vines, using a process called random screening. What is random screening in drug discovery? Put simply, it involves sifting through the large set of compounds in those materials to identify and isolate the ones that might prove effective against a particular malady or disease. These would then be tested in animals and humans to determine whether they had the desired effect on a consistent basis.
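The filtering step at the heart of screening can be sketched in a few lines of Python. This is purely illustrative: the compound names, assay scores and hit threshold below are mock values, and in a real campaign the scores would come from wet-lab measurements rather than a random number generator:

```python
import random

def screen_library(library, assay, threshold):
    """Keep only the compounds whose assay score clears the hit threshold."""
    return [compound for compound in library if assay(compound) >= threshold]

# Mock library of 10,000 compound identifiers with mock assay scores.
rng = random.Random(42)  # seeded for reproducibility of this illustration
library = [f"CMPD-{i:05d}" for i in range(10_000)]
scores = {compound: rng.random() for compound in library}

hits = screen_library(library, scores.get, threshold=0.999)
print(f"{len(hits)} hits out of {len(library)} compounds screened")
```

The point is simply that screening is a brute-force filter over a very large library: almost everything is discarded, and only the handful of "hits" proceed to further testing.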
However, as our scientific knowledge has advanced, so too have the drug discovery approaches we employ. With the advent of DNA sequencing in the 1970s, researchers became able to identify sick, malfunctioning or diseased cells in the human body and to analyse the biological molecules involved in their makeup at the atomic level. This has opened up new avenues of research into drug discovery and development, such as genomics, metabolomics, metagenomics, proteomics and transcriptomics (collectively known as -omics), and allowed scientists to work backwards from the disease itself to find a cure or vaccine. The article Rethinking Oncological Drug Discovery with Advances in Analytical Proteomics Technology discusses some of the latest developments in this area for those interested.
Modern advances in informatics, technology and science have all combined to revolutionise how drugs are discovered and developed today. For example, the sheer scale of compounds which must be tested in order to arrive at a successful treatment is so great that it can be overwhelming for humans to handle alone. Thankfully, with the advent of artificial intelligence, machine learning and Big Data, researchers can now delegate much of the heavy lifting and legwork to a computer, freeing up humans for more challenging and more productive tasks.
When was the first drug discovered?
It wasn’t until the mid-19th century that chemical compounds began to be developed for use in pain relief or as disease cures. Until this point, the only substances available to help in such situations were herbal, fungal or other naturally derived remedies. However, in 1869, the first modern drug, chloral hydrate, was introduced as a hypnotic sedative. It is still available in some parts of the world today, though modern alternatives have outstripped it in terms of efficacy and safety.
What is the role of the pharmacist in drug discovery?
Pharmacists are just one of the many types of scientist instrumental in the discovery of new drugs in the UK. Working alongside other industry professionals such as bioscientists, medicinal chemists, pharmacokineticists and toxicologists, pharmacists can aid in identifying compounds which they believe may yield positive results in drug development.
They will first strive to understand the biological properties of the selected compound, then assess which form and size of dosage is most appropriate to the situation and how the exact process of administering the drug may affect the subject’s receptiveness to it. They are also key in formulating the drug and refining its manufacturing process to ensure that it can be produced, distributed and administered on a large scale without compromising on its quality, effectiveness or affordability.
They can also play a major role in the clinical research phase of the drug’s development. This can involve anything from planning the research and formulating the prototype for the initial stages of the trial to refining, producing, packaging, labelling and distributing the product to the researchers in charge of the trial. They can also aid in interpreting and writing up the results of sophisticated and complex studies.
How is bioinformatics used in drug discovery?
As our abilities in medicine have advanced over the years, so too has our knowledge of other sectors, such as applied mathematics, computer science and statistical modelling. These areas of research – traditionally associated with informatics – can be incorporated into biological research to create a field called bioinformatics.
By and large, bioinformatics is used to aid the interpretation and analysis of three major areas of molecular biology: genomic sequencing, the conclusions drawn from functional genome experiments, and macromolecular structures. All three of these facets of research yield enormous datasets which are difficult to process without informatics techniques, and bioinformatics can significantly facilitate and expedite that work. To learn more about this specific facet of drug discovery and development, check out the informative article Automating for Multi-omics Workflows for Drug Discovery and Toxicology.
Among the computational strategies which fall under the umbrella of bioinformatics, scientists use database design and data mining, expression data clustering, gene finding, macromolecular geometry, prediction of protein structures, phylogenetic tree construction and the alignment of sequences with structures. All of these methods greatly aid in the process of drug discovery and development.
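To give a concrete taste of one of these methods, the following minimal Python sketch scores a global alignment of two sequences by dynamic programming (the Needleman-Wunsch algorithm, a classic bioinformatics technique). The example sequences and the +1/-1/-1 scoring scheme are illustrative choices, not values taken from this article:

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    """Return the optimal global alignment score of sequences a and b."""
    n, m = len(a), len(b)
    # score[i][j] = best score aligning the first i letters of a
    # against the first j letters of b
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap          # align a's prefix against nothing
    for j in range(1, m + 1):
        score[0][j] = j * gap          # align b's prefix against nothing
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag,                    # match or mismatch
                              score[i - 1][j] + gap,   # gap in b
                              score[i][j - 1] + gap)   # gap in a
    return score[n][m]

print(needleman_wunsch("GATTACA", "GCATGCU"))  # prints 0
```

Real alignment tools add traceback to recover the alignment itself and use biologically motivated scoring matrices, but the dynamic-programming core is exactly this table fill.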
What are the four stages of drug development?
After a new compound has been identified as being potentially viable for use as a drug treatment, it must go through a rigorous screening process which tests its safety, efficacy and quality. This process is broken down into four stages, which are as follows:
- Stage 1. The first phase tests whether a drug is, above all, safe for use in humans. It involves a small number of healthy volunteers, who are administered single doses of the drug. They are then monitored and their feedback collected to analyse how the drug affects them and to identify any unwanted side-effects, as well as giving the researchers an idea of the appropriate dosage.
- Stage 2. Once the safety of a drug has been established, it’s then necessary to ascertain how effective it is in curing or preventing the ailment in question. For this, a larger sample of around 100 to 200 volunteers is normally used in a controlled, randomised and often double-blind study, which may run over a number of months or years. A placebo is administered alongside the drug to help contextualise its efficacy and effects.
- Stage 3. A drug which has shown promising results in Stage 2 is promoted to a third phase, in which the same types of clinical trials are repeated on a larger and wider-ranging basis. This time, they could involve several hundred or even thousands of people across multiple trial centres, potentially in more than one country. Stage 3 can take several years to complete and offers further information about how the drug affects different people over time.
- Stage 4. At this stage, evidence from all three prior stages is collated and presented to the relevant licensing body. In the UK, this is the Medicines and Healthcare products Regulatory Agency (MHRA). If the MHRA believes there is enough evidence to prove that the drug is both effective and safe for human consumption, and that it meets the relevant quality standards in the UK, it will be authorised and licensed for distribution in the country.
Even after the licence has been issued, there may still be a final step in the drug’s journey. UK rules are designed to prevent “postcode lottery” prescribing: if the relevant appraisal body (the National Institute for Health and Care Excellence, or NICE, in England and Wales, and the Scottish Medicines Consortium, or SMC, in Scotland) recommends the drug, the NHS must provide funding for it.
What happens after a drug has been approved and licensed?
Even after a drug has become commercially available and is in widespread circulation, monitoring of its progress continues. This is because certain side effects may only affect a small number of people (such as 1 in 10,000) and so do not become apparent until the drug is in general use. The MHRA runs a “Yellow Card” scheme whereby GPs and other medical professionals can report suspected side-effects of a drug. If enough evidence is collated, the patient information leaflet (PIL) which accompanies the drug may be amended, or the drug may be withdrawn altogether.
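A back-of-the-envelope calculation shows why such rare effects tend to surface only after licensing. If a side effect strikes 1 in 10,000 patients, the probability of seeing at least one case among n independent patients is 1 - (1 - p)^n. The trial sizes below are illustrative, not figures from this article:

```python
# Chance of observing at least one case of a side effect that
# affects 1 in 10,000 patients, for different population sizes.
p = 1 / 10_000
for n in (1_000, 3_000, 100_000):
    prob = 1 - (1 - p) ** n
    print(f"{n:>7} patients: {prob:.1%} chance of seeing at least one case")
```

Even a large late-stage trial of 3,000 volunteers has only about a one-in-four chance of observing a single case, whereas once 100,000 people are taking the drug the side effect is all but certain to appear, which is exactly why post-licensing surveillance matters.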
Meanwhile, clinical trials of the drug continue after licensing as well. These test it for uses other than the one it was originally developed for, compare its efficacy against existing and new treatments, assess its effectiveness in a larger sample of patients and build understanding of the long-term ramifications of taking the drug.