Supercomputers allow researchers to carry out experiments that would otherwise be impossible because they are too small or too large, too fast or too slow, or simply too expensive. Combining supercomputers with Big Data lets researchers solve problems through large-scale analysis and opens up entirely new areas of exploration.
Supercomputers have become very important in medicine and public health. Researchers are using a combination of experiments and molecular simulations to understand, at a molecular level, how diseases work.
Simulating these systems in realistic biological environments for the long timescales required to understand viruses like COVID-19 has not previously been possible.
The additional speed and capacity of supercomputers allows the researchers to gain a more detailed understanding through realistic simulations, allowing them to shorten the time between research and real impacts for everyone.
Computer Simulations Speed Up Time-Intensive Lab Processes
Computer simulations can examine how different compounds interact with different viruses. Each of these individual variables can comprise billions of unique data points. When these data points are compounded across multiple simulations, the analysis becomes very time-intensive on a conventional computing system.
Viruses infect cells by binding to them and using a ‘spike’ to inject their genetic material into the host cell. To understand new pathogens, such as viruses, researchers in wet labs grow the micro-organism and observe how it reacts in real life to the introduction of new compounds. This is a slow process without powerful computers that can perform digital simulations to narrow down the range of potential variables.
IBM’s Summit is one of the world’s most powerful high-performance computing systems. The Summit supercomputer has tens of thousands of processors covering an area as large as two tennis courts at Oak Ridge National Laboratory (ORNL). The machine has more computational power than one million top-of-the-line laptops.
Scientists are using supercomputers to run digital simulations of 8,000 molecules interacting with the virus to find candidate molecules that might work. They have found 77 promising candidates, which are currently being tested in labs.
“It took us a day or two, whereas it has traditionally taken months on a normal computer,” said Jeremy Smith, director of the University of Tennessee/ORNL Centre for Molecular Biophysics and principal researcher in the study.
While simulations alone cannot find a treatment that will work, this project narrowed the field to 77 candidate molecules that can now be tested in trials.
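The screening step described above can be sketched in miniature. The code below is a toy illustration, not the actual Summit pipeline: the molecule names, the `binding_score` function, and the 0.96 cut-off are all invented stand-ins for a real physics-based docking simulation, and a thread pool stands in for the thousands of GPU-accelerated nodes a real run would use.

```python
# Minimal sketch of high-throughput virtual screening: score every candidate
# molecule, then keep only those above a binding threshold for lab testing.
# All names, scores, and thresholds here are hypothetical placeholders.
from concurrent.futures import ThreadPoolExecutor

def binding_score(molecule: str) -> float:
    """Hypothetical stand-in for a molecular docking simulation.

    A real pipeline would run a physics-based docking engine per molecule;
    this toy maps the name deterministically to a score between 0 and 1."""
    return (sum(ord(c) for c in molecule) % 1000) / 1000.0

def screen(molecules, threshold=0.96):
    """Score every candidate in parallel and keep only the strongest binders."""
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(binding_score, molecules))
    return [m for m, s in zip(molecules, scores) if s >= threshold]

if __name__ == "__main__":
    candidates = [f"compound-{i}" for i in range(8000)]
    hits = screen(candidates)
    print(f"{len(hits)} of {len(candidates)} compounds kept for lab testing")
```

The shape of the computation is the point: thousands of independent scoring runs fan out in parallel, and only a small shortlist comes back for slow, expensive wet-lab validation.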
Accelerate Understanding of Diseases
Using a mix of AI techniques, researchers will be able to identify patterns in the function, co-operation, and evolution of human proteins and cellular systems. Greater understanding of how these patterns work will help the drug discovery process.
The results obtained from the Summit supercomputer do not mean that a cure for the new coronavirus has been found, but it is hoped that the findings will assist future studies by giving scientists a focused framework for investigating the identified compounds. Further investigation will reveal whether any of them have the characteristics required to attack and kill the virus.
“We are very hopeful, though, that our computational findings will both inform future studies and provide a framework that experimentalists will use to further investigate these compounds. Only then will we know whether any of them exhibit the characteristics needed to mitigate this virus,” said Smith, director of the lab’s Centre for Molecular Biophysics.