Initiatives are underway to predict where the virus will spread and analyze how effective preventative measures are, according to the National Center for Supercomputing Applications.
In the frenzied rush to find a drug to treat COVID-19 patients, researchers have a powerful weapon aiding them: supercomputers. The world's fastest computers are being used to vastly speed up research in the fight against the coronavirus, said John Towns, executive director for science and technology at the National Center for Supercomputing Applications (NCSA).
Supercomputers are being used by researchers and institutions to track the spread of COVID-19 in real time. Researchers are using supercomputers to predict where the virus will spread by identifying patterns and analyzing how preventative measures like social distancing are helping, the NCSA said.
They are also playing a role in the discovery of effective treatments. Supercomputers are allowing experts to explore the structure of the virus to develop a vaccine and find drugs to combat COVID-19. And as concerns grow over equipment shortages, experts are using supercomputers to position resources, like ventilators and masks, in hospitals all across the country.
The NCSA is a member of the COVID-19 High Performance Computing Consortium, which consists of federal agencies, laboratories, academia, and industry all contributing compute time and resources that researchers can use to address the pandemic, Towns said.
SEE: How supercomputers are helping speed up coronavirus research (TechRepublic)
Towns is chairing a review committee that takes proposals from the research community, assesses them, and makes recommendations on whether they should be approved and if so, the resources they should be matched with.
One such promising project, he said, is out of the University of Utah, where a team of researchers is using the Blue Waters supercomputer and other resources "to design novel peptidic inhibitors of the COVID-19 main protease–a specific protein among the class of enzymes that chops up other proteins, critical for many biological functions," according to a blog post on the consortium site.
Chemist Thomas Cheatham said the team is trying to modify that peptide inhibitor, which "blocks activity of this protease, critical for COVID-19 function."
The hope is that they find a new peptide inhibitor that "will ultimately aid our understanding of one aspect of blocking COVID-19 through better targeting of its main protease," Cheatham said in an interview with the consortium.
This is "one of the bigger projects we're supporting," said Towns, who estimates that so far, there are also about 25 projects being supported with resources from the National Science Foundation, separate from the consortium.
Supercomputers are helping with three problems involved in addressing the pandemic, Towns said:
1. Understanding protein structures and how the virus gets into human cells. "That requires a considerable amount of detailed protein simulation and understanding of how the structure of the virus attaches to cells and finds its way into the membrane of the cell," he said. These simulations help researchers understand how the virus attacks human cells and the pathways it uses to get inside a person's body, Towns said.
2. Therapeutics and how to develop and apply antiviral drugs. "Once you understand how the virus works you have to design something that is effective against this," he said.
3. Treating and managing patients. This starts with understanding the evolution of the coronavirus, how it spreads, and how social distancing can be used as a strategy to slow a virus transmitted through droplets.
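The protein-simulation workload in the first item above is compute-hungry for a simple structural reason: even the most basic classical model sums an interaction energy over every pair of atoms, so the work grows roughly with the square of the atom count. The sketch below is illustrative only; the coordinates and parameters are made up, and real molecular dynamics engines (such as NAMD or GROMACS) model far more physics at vastly larger scale.

```python
import itertools
import math

def lennard_jones_total(atoms, epsilon=0.2, sigma=3.4):
    """Sum the Lennard-Jones energy over every atom pair -- O(N^2) pairs."""
    total = 0.0
    for a, b in itertools.combinations(atoms, 2):
        r = math.dist(a, b)  # distance between the two atoms
        total += 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return total

# A tiny 3x3x3 "crystal" of 27 atoms spaced 4 angstroms apart --
# a real protein system has tens of thousands to millions of atoms.
atoms = [(4.0 * i, 4.0 * j, 4.0 * k)
         for i in range(3) for j in range(3) for k in range(3)]
n_pairs = len(atoms) * (len(atoms) - 1) // 2  # 351 pairs for 27 atoms
energy = lennard_jones_total(atoms)
print(f"{len(atoms)} atoms, {n_pairs} pairs, total energy {energy:.2f}")
```

Scaling that pairwise sum to millions of atoms over millions of timesteps is what pushes these simulations onto machines like Blue Waters.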
Towns said he also anticipates research in other areas, such as how society behaves during a pandemic, since that behavior affects how the virus spreads.
"If people don't pay attention to social distancing we'll spread the disease," he said. "The point is to slow the spread so we can manage it more effectively without overwhelming healthcare systems."
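The dynamic Towns describes, slowing spread so healthcare systems are not overwhelmed, is commonly captured with compartmental epidemic models. As a minimal sketch, with illustrative parameter values rather than figures from any consortium project, an SIR model in which social distancing halves the contact rate:

```python
# Minimal SIR epidemic sketch: social distancing lowers the contact rate
# (beta), which flattens and delays the peak of concurrent infections.
def sir_peak(beta, gamma=0.1, days=300, s0=0.999, i0=0.001):
    """Euler-step SIR model; returns the peak fraction infected at once."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(days):
        new_inf = beta * s * i  # new infections this day
        new_rec = gamma * i     # recoveries this day
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak = max(peak, i)
    return peak

no_distancing = sir_peak(beta=0.3)   # basic reproduction number R0 = 3.0
distancing = sir_peak(beta=0.15)     # contact rate halved, R0 = 1.5
print(f"peak infected without distancing: {no_distancing:.1%}")
print(f"peak infected with distancing:    {distancing:.1%}")
```

Halving the contact rate sharply lowers the peak load on hospitals, which is the "flatten the curve" effect Towns refers to; production forecasting models are far richer, but the mechanism is the same.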
Researchers are also looking at supply chain issues—getting critical supplies where they are needed and when, he said. Supercomputing makes all of this possible, Towns said.
Supercomputers are "being used in different ways in these different areas, but it requires computing systems of this scale [with] capabilities to effectively simulate the virus," as well as using the data that has been gathered around the transmission of this disease, he said.
Results so far
There have been "very significant simulations of the virus and cell surface interaction that have helped us understand how the virus enters cells," Towns said. There has also been significant work done both on screening existing drugs for their efficacy against the coronavirus and in the design of small molecules that could become part of developing a drug.
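The drug-screening work Towns mentions typically amounts to ranking a large library of compounds by a predicted binding score against a viral protein, then passing the best hits to experimentalists. The sketch below is purely illustrative: the compound names and scores are made up, and real pipelines compute scores with docking simulations run at scale on HPC systems.

```python
# Toy virtual-screening sketch: rank candidate compounds by a predicted
# binding score against a target protein. Lower (more negative) scores
# mean stronger predicted binding; these values are placeholders.
candidates = {
    "compound_A": -6.2,
    "compound_B": -8.9,
    "compound_C": -5.1,
    "compound_D": -7.4,
}

# Keep only compounds beating a score cutoff, strongest binder first.
CUTOFF = -7.0
hits = sorted((name for name, score in candidates.items() if score <= CUTOFF),
              key=candidates.get)
print("top hits for follow-up:", hits)
```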
Progress has also been made in analyzing patient data, which lets researchers track the rate at which the disease is spreading across the country at county-level granularity, Towns said.
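County-level transmission tracking of this kind starts from estimating a growth rate out of a time series of reported cases. A minimal sketch, using made-up counts for a hypothetical county rather than real surveillance data:

```python
import math

# Estimate an exponential growth rate and doubling time from daily
# cumulative case counts for one (hypothetical) county.
cases = [10, 13, 17, 22, 29, 38, 50]  # illustrative counts, not real data

# Fit log(cases) ~ r * day by least squares on the log-transformed series.
days = range(len(cases))
n = len(cases)
mean_t = sum(days) / n
mean_y = sum(math.log(c) for c in cases) / n
r = (sum((t - mean_t) * (math.log(c) - mean_y) for t, c in zip(days, cases))
     / sum((t - mean_t) ** 2 for t in days))
doubling_time = math.log(2) / r

print(f"estimated daily growth rate: {r:.2f}")
print(f"doubling time: {doubling_time:.1f} days")
```

Repeating a fit like this across thousands of counties, and feeding the results into forecasting models, is the kind of data-intensive work the consortium's systems support.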
"I'm still very hopeful for the first two categories but they're hardest to see results in," Towns said. "What we've seen to date is these projections of the spread of disease and how it spreads [which] are directly informing the CDC in their development of the guidelines and recommendations it is making to political leaders and the public," as well as to healthcare organizations.