ICON Explores Quantum Computing to Improve Trial Designs for Heterogeneous Populations


Accurately projecting outcomes for diverse patient populations – from the wealth of genomic, phenotypic and outcomes data available through genome sequencing and electronic health records – holds the potential to transform the effectiveness and efficiency of drug and medical device development.

Yet the computations required for statisticians to explore, understand and interpret these enormous, multivariate and often poorly structured data sets take a prohibitively long time to complete. In many cases it would take months, and often years, to transform the raw data into informative, scientifically sound and statistically significant results using even the most advanced conventional computers.

Quantum computing may change that. In theory, quantum computers are capable of computational power many orders of magnitude beyond today’s conventional machines, and they promise to be faster and more reliable at correctly identifying patterns and dynamic trends in massive, noisy data sets – the kind that would be useful for mapping dose response and other complex therapeutic relationships.

Quantum computing could prove useful in designing clinical trials. For example, statistical methods of trial design that fuse combinatorial and optimal experimental design techniques have been developed that could potentially halve the number of sites and patients needed to select the best combination treatment for targeting multiple cancer types and various biomarkers.
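To make the optimal experimental design idea concrete, here is a minimal classical sketch in Python: choosing a small number of dose levels from a candidate set to maximise the D-optimality criterion det(X'X) for a quadratic dose-response model. The model, candidate set and budget are illustrative assumptions, not the specific method presented at the workshop.

```python
# A minimal D-optimal design sketch (illustrative assumptions throughout):
# pick k design points from a candidate set to maximise det(X'X) for a
# quadratic dose-response model y = b0 + b1*x + b2*x^2.
from itertools import combinations
import numpy as np

candidates = np.linspace(0.0, 1.0, 11)   # hypothetical normalised dose levels
k = 4                                    # number of doses the budget allows

def design_matrix(points):
    x = np.asarray(points)
    return np.column_stack([np.ones_like(x), x, x**2])

best_det, best_design = -np.inf, None
for subset in combinations(candidates, k):
    X = design_matrix(subset)
    d = np.linalg.det(X.T @ X)           # D-optimality criterion
    if d > best_det:
        best_det, best_design = d, subset

print("D-optimal dose levels:", best_design)
```

The exhaustive search above is exactly the kind of combinatorial step that explodes as the number of candidates, arms and biomarker strata grows – which is where quantum optimisation could help.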

That emerging approach was among the advances presented at a quantum computing workshop this spring, co-sponsored by the ICON Innovation Centre with the George Washington University (GWU) Department of Statistics and Lockheed Martin Corporation. Speakers representing GWU, Lockheed, ICON, the National Institute of Standards and Technology, the University of Wisconsin-Madison, and Nokia Bell Labs addressed more than 50 participants from academia, government, and industry.

The workshop featured quantum computing overviews, talks on quantum algorithms and their links with statistics, as well as case studies and a round table discussion. It exemplifies the cross-industry collaborative approach that is moving this potentially revolutionary computing technology from theory to practical reality.

We asked Sergei Leonov, VP of Clinical Trial Methodology at ICON, to reflect on discussions at the conference about how quantum computing is evolving for clinical applications.

Q: Before you explain how quantum computing works, how can clinical trial design and execution potentially benefit from it?

A: Quantum computing has the potential to substantially reduce the cost and increase the precision of clinical trials. This is because many statistical problems in clinical trial design may be well suited to quantum characterisation and manipulation.

Drug development in oncology presents a challenging example. Increasingly, progress requires efficiently designing and interpreting the results of trials comparing the effects of multiple treatments on patients with varying biomarkers, characteristics and conditions. We anticipate that various problems arising in the design of complex oncology trials can be reduced to a particular class of optimisation problems that can be solved on quantum systems using algorithms of optimal model-based experimental design.
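The interview does not name the class of optimisation problems, but quantum annealers such as the D-Wave machines mentioned below natively solve QUBO (quadratic unconstrained binary optimisation) problems, so one plausible reduction looks like the toy sketch here: encode each candidate site or arm as a binary variable, reward its information gain, penalise population overlap, and minimise the quadratic objective. The weights and problem size are invented for the example, and a classical brute-force search stands in for the annealer.

```python
# A hedged QUBO sketch for a trial-design choice (all numbers illustrative).
# x_i = 1 if candidate site/arm i is included in the design.
import itertools
import numpy as np

gain = np.array([3.0, 2.5, 2.0, 1.5])        # toy information gain per choice
overlap = np.array([[0, 2, 0, 0],            # toy pairwise redundancy penalty
                    [2, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)

# QUBO matrix: minimise x'Qx, where the diagonal rewards inclusion and the
# off-diagonal terms penalise overlapping populations.
Q = overlap.copy()
np.fill_diagonal(Q, -gain)

# Enumerate the 2^n binary vectors classically; a quantum annealer would
# instead sample low-energy solutions of this same objective on hardware.
best = min((np.array(x) for x in itertools.product([0, 1], repeat=4)),
           key=lambda x: x @ Q @ x)
print("selected choices:", best, "energy:", best @ Q @ best)
```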

In fact, quantum computing may be well suited to analysing and finding significant connections among vast amounts of genomic, proteomic and population data, enabling quicker identification of patients who might respond better, and of risk factors that cluster around variables that might not otherwise be suspected. Not only could this optimise trial populations, dosing and other parameters to lower the cost of trials, it might also accelerate development by revealing unexpected benefits of new and existing compounds and by eliminating confounding factors sooner.

Quantum computing may also help in selecting virtual comparators and quasi-identical subjects, and in detecting rare events. It may be particularly useful in the simulation of complex trials based on mechanistic models, such as population pharmacokinetic and pharmacodynamic studies, with the goal of finding optimal operating characteristics within specific ethical, resource and regulatory constraints.
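For a flavour of such mechanistic simulation, below is a minimal, purely classical population pharmacokinetic sketch: a one-compartment model with first-order oral absorption and log-normal between-subject variability. All parameter values are illustrative, not drawn from any ICON study.

```python
# Minimal population PK simulation sketch (illustrative parameters only):
# one-compartment model with first-order absorption and elimination.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, dose = 200, 100.0                        # subjects; dose in mg
t = np.linspace(0, 24, 49)                           # sampling times, hours

# Log-normal between-subject variability around typical values.
ka = 1.0  * np.exp(rng.normal(0, 0.30, n_subjects))  # absorption rate, 1/h
cl = 5.0  * np.exp(rng.normal(0, 0.25, n_subjects))  # clearance, L/h
v  = 50.0 * np.exp(rng.normal(0, 0.20, n_subjects))  # volume, L
ke = cl / v                                          # elimination rate, 1/h

# Analytic concentration-time profile for each simulated subject.
conc = (dose * ka[:, None] / (v * (ka - ke))[:, None]
        * (np.exp(-ke[:, None] * t) - np.exp(-ka[:, None] * t)))

print("median Cmax (mg/L):", round(float(np.median(conc.max(axis=1))), 2))
```

Running thousands of such virtual trials under different designs, and scoring each against ethical, resource and regulatory constraints, is the optimisation workload that quantum hardware might one day accelerate.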

Q: When will quantum computing be available for use in clinical trials?

A: Quantum computing is only beginning to move from the realm of demonstrations to practical application. Medical product development may be one of the earliest likely applications.

However, harnessing the computational power of quantum computing will require significant additional advances in both mathematics and technology. The machine learning algorithms and statistical methods needed are in development, but far from mature.

Significant hardware and software challenges also exist. One is providing a super-cool (no pun intended) environment. Because even minor interactions with the external world can change the state of a quantum system, a stable quantum system can exist only at temperatures close to absolute zero (0 K = -273.15°C = -459.67°F) and only when isolated from its surroundings. For instance, the latest-generation D-Wave system operates at 15 millikelvin [1]. Given the physical limits involved, developing software and interfaces capable of reliably working with such transient quantum states is difficult.

While it will take time and effort to apply quantum computing to clinical development, it may be the only practical way to process the masses of data now available. Even partial success would represent a “quantum leap” in computing power, and with it in development efficiency, so the effort is well worth it. We expect to make major progress in the next couple of years.

Q: Finally, what is quantum computing?

A: Quantum computing is an alternate approach to solving complex mathematical problems, often involving vast amounts of data.

The theoretical power advantage quantum computing holds over conventional computing primarily relates to two quantum mechanical properties – superposition and entanglement. Together these greatly increase both the quantity of data a computer can process and the ways in which those data can be combined [2].

For conventional computing, the basic unit, the bit, exists in one state at a time, and that state is deterministic: either 0 or 1. Superposition means that the basic unit of quantum computing, known as the qubit, can exist in both states at once, each with a probability, and those probabilities add up to 1. Therefore, whereas classical computing is limited to manipulating binary bits in a linear, deterministic stream, quantum computing can manipulate vast data sets simultaneously and probabilistically. The magnitude of this difference is hinted at by the number of states a quantum computer can represent. Today, the most powerful commercially available quantum computer operates on quantum systems of up to 2,000 qubits, which could exist in a superposition of as many as 2^2000, or about 10^600, quantum states. For context, it is estimated that there are about 10^80 atoms in the known, observable universe.
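These claims can be illustrated numerically with plain Python (no quantum hardware involved): a single qubit is a two-amplitude vector whose probabilities sum to 1, the state space doubles with each added qubit, and 2000 × log10(2) ≈ 602 confirms the 2^2000 ≈ 10^600 figure.

```python
# Superposition illustrated with a state vector (plain numpy, no hardware).
import numpy as np

# One qubit: amplitudes for |0> and |1>; squared magnitudes sum to 1.
qubit = np.array([1, 1]) / np.sqrt(2)         # equal superposition
print("probabilities:", np.abs(qubit) ** 2)   # [0.5, 0.5]

# An n-qubit state vector has 2^n amplitudes.
for n in (1, 2, 10, 20):
    print(f"{n} qubits -> {2**n} amplitudes")

# Check the exponent quoted above: log10(2^2000) = 2000*log10(2) ≈ 602.
print("log10(2^2000) ≈", round(2000 * np.log10(2)))
```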

The second advantage of quantum computing is entanglement, which means qubits can be linked so that, even at great distances, their states remain correlated and change together without any intermediate causal connection. By comparison, conventional bits interact only in a linear sequence, changing each other’s states one at a time in an extended chain of binary operations. So a quantum computer can potentially use computational “shortcuts” not available to conventional computers.
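Entanglement can likewise be sketched with a tiny state-vector simulation (again plain Python, an illustration rather than real quantum hardware): the Bell state (|00⟩ + |11⟩)/√2 yields measurement outcomes that are perfectly correlated across the two qubits.

```python
# Entanglement illustrated with the Bell state (plain numpy, no hardware).
import numpy as np

rng = np.random.default_rng(1)

# Amplitudes over the basis states |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2                     # [0.5, 0, 0, 0.5]

# Simulate 10 joint measurements: only '00' and '11' ever occur, so
# each qubit's outcome is perfectly correlated with the other's.
samples = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(samples)
```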

References

  1. Latest Generation D-Wave System  
  2. Nielsen, M.A., Chuang, I.L. (2010). Quantum Computation and Quantum Information: 10th Anniversary Edition. Cambridge University Press.