
IBM launches High Performance Computing Consortium to give coronavirus researchers access to supercomputers

Following the launch of its 2020 Call for Code Global Challenge, IBM today announced that it will coordinate an effort to provide over 200 petaflops of computing power to scientists researching COVID-19, the disease caused by the novel coronavirus, which has sickened over 300,000 people. The company anticipates that the capacity will be used to develop algorithms that assess how COVID-19 is progressing and to model potential therapies in pursuit of a possible vaccine.

As part of the newly launched High Performance Computing Consortium, which includes the White House Office of Science and Technology Policy, the U.S. Department of Energy, MIT, Rensselaer Polytechnic Institute, Lawrence Livermore National Laboratory, Argonne National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratories, Los Alamos National Laboratory, NASA, the National Science Foundation, Amazon, Google, and Microsoft, IBM says it will help evaluate proposals from institutions and provide compute access to projects that can “make the most immediate impact.” Researchers will have at their disposal 16 systems with a combined 775,000 processor cores and 34,000 GPUs, together capable of roughly 265 quadrillion floating-point operations per second (265 petaflops).
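To put those totals in perspective, here is a quick back-of-the-envelope calculation using the figures quoted above; the even per-device split is a simplifying assumption for illustration, not a published breakdown of the consortium’s hardware.

```python
# Rough arithmetic on the consortium's quoted totals. The figures come
# from the announcement above; the per-device averaging is a hypothetical
# simplification (in practice, GPUs supply most of the throughput).

total_flops = 265e15          # 265 petaflops, as quoted
cpu_cores = 775_000
gpus = 34_000

per_device = total_flops / (cpu_cores + gpus)
print(f"naive average: {per_device / 1e9:.0f} gigaflops per device")

# For scale: a workload needing one exaflop-day of arithmetic
# (10^18 FLOP/s sustained for 86,400 seconds) would occupy the
# entire pool for:
work = 1e18 * 86_400
print(f"{work / total_flops / 86_400:.1f} days at full utilization")
```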

“These high-performance computing systems allow researchers to run very large numbers of calculations in epidemiology, bioinformatics, and molecular modeling. These experiments would take years to complete if worked by hand, or months if handled on slower, traditional computing platforms,” wrote IBM Research director Dario Gil in a blog post. “Since the start of the COVID-19 pandemic we have been working closely with governments in the U.S. and worldwide to find all available options to put our technology and expertise to work to help organizations be resilient and adapt to the consequences of the pandemic.”
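As a concrete illustration of the first category Gil mentions, below is a minimal SIR (susceptible-infected-recovered) epidemic model in Python. The structure is standard textbook epidemiology; the parameter values are purely illustrative and not drawn from the consortium’s work, since real COVID-19 models fit such parameters to case data and run at scales that require the hardware described above.

```python
# Minimal SIR model integrated with forward Euler steps. All parameter
# values below are illustrative placeholders, not fitted COVID-19 estimates.

def sir(population, infected0, beta, gamma, days, dt=0.1):
    """Integrate dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I,
    dR/dt = gamma*I."""
    s, i, r = population - infected0, infected0, 0.0
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / population * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

# Illustrative run: 1M people, 100 initial cases, R0 = beta/gamma = 2.5.
s, i, r = sir(population=1e6, infected0=100, beta=0.5, gamma=0.2, days=120)
print(f"after 120 days: {i:,.0f} infected, {r:,.0f} recovered")
```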

The announcement follows news that scientists tapped IBM’s Summit at Oak Ridge National Laboratory, the world’s fastest supercomputer, to simulate how 8,000 different molecules would interact with SARS-CoV-2, the virus that causes COVID-19, resulting in the identification of 77 compounds likely to render the virus unable to infect host cells. Elsewhere, the Tianhe-1 supercomputer at the National Supercomputer Center in Tianjin was recently used to process hundreds of images generated by computed tomography and give diagnoses in seconds. And the Gauss Centre for Supercomputing, an alliance of Germany’s three national supercomputing centers, said it would help those working on COVID-19 gain expedited access to computing resources.
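The Summit study is an instance of large-scale virtual screening. The sketch below shows the shape of that workflow; the `docking_score` function is a placeholder invented for illustration, since the real binding-energy calculation is done by specialized docking and molecular-dynamics codes, and it is precisely that per-molecule step a supercomputer parallelizes across thousands of candidates.

```python
# Sketch of a virtual-screening loop: score a library of candidate
# molecules against a viral protein and keep the strongest predicted
# binders for follow-up. The library names and `docking_score` are
# hypothetical stand-ins for a physics-based calculation.

import heapq
import random

def docking_score(molecule: str) -> float:
    """Placeholder for a binding-energy estimate (lower = tighter
    predicted binding); illustrative values in a kcal/mol-like range."""
    return random.uniform(-12.0, 0.0)

library = [f"compound_{n}" for n in range(8_000)]   # 8,000 candidates
scored = ((docking_score(m), m) for m in library)   # embarrassingly parallel
top_hits = heapq.nsmallest(77, scored)              # keep the 77 best scores

for score, molecule in top_hits[:5]:
    print(f"{molecule}: {score:.2f}")
```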

More recently, Folding@home, one of the largest crowdsourced supercomputing programs in the world, kicked off an initiative to uncover the mysteries behind the coronavirus’s spike protein, which the virus uses to infect cells. Since the project announced its coronavirus focus in late February, some 400,000 new volunteers have joined the effort, according to project organizer Greg Bowman, an associate professor of biochemistry and molecular biophysics at the Washington University School of Medicine.


Supercomputers have long been used to identify and test potential treatments for complex and chronic diseases. Researchers have tapped the Texas Advanced Computing Center’s Lonestar5 cluster to screen over 1,400 FDA-approved drugs for potential use against cancer. Last June, eight supercomputing centers across the E.U. were selected to host applications in personalized medicine and drug design. And pharmaceutical company TwoXAR recently teamed up with the Asian Liver Center at Stanford to screen 25,000 drug candidates for adult liver cancer.

The hope is that supercomputers can reduce the time it takes to bring novel drugs to market. Fewer than 12% of all drugs entering clinical trials end up in pharmacies, and it takes at least 10 years for a medicine to complete the journey from discovery to marketplace. Clinical trials alone take six to seven years on average, inflating the cost of R&D to roughly $2.6 billion per approved drug, according to the Pharmaceutical Research and Manufacturers of America.

The White House previously partnered with Verily, Google’s sister company under Alphabet, to build a triage tool that helps people find COVID-19 testing sites in the U.S.; it is currently live for select locations in the San Francisco Bay Area. (Google is also working with the U.S. government to create self-screening tools for people wondering whether they should seek medical attention.) And last week, at the request of the White House Office of Science and Technology Policy, researchers and leaders from the Allen Institute for AI, the Chan Zuckerberg Initiative, Microsoft, the National Library of Medicine at the National Institutes of Health, and others released a data set of over 29,000 scholarly articles about COVID-19, SARS-CoV-2, and related coronaviruses.

This afternoon, U.S. President Trump gave Ford, GM, and Tesla the “go ahead” to make ventilators to help alleviate a shortage amid the pandemic, only days after issuing an executive order invoking the Defense Production Act. COVID-19 is a respiratory disease, and ventilators are a critical piece of medical equipment used to treat hospitalized patients. The Society of Critical Care Medicine projects that 960,000 coronavirus patients in the U.S. may need ventilators, but the nation has only about 200,000 of the machines, and around half are older models that might not be ideal for critically ill patients.
