Not only are researchers using supercomputers to analyze large data sets and conduct experiments, but they are more frequently applying them to help secure grant funding.
Higher education institutions are facing more intense competition for research grants as federal funding continues to wane. One exception is the National Science Foundation (NSF), which awarded roughly $7.8 billion in grants in fiscal 2018, making it the major source of federal backing in some fields. This makes the NSF an important source of capital for university research -- but only for institutions that can win it. In its grant review process, the NSF has specified it seeks work that involves "radically different approaches, applies new expertise, or engages novel disciplinary or interdisciplinary perspectives," making competition fierce.
According to a 2018 report by Educause, a non-profit higher education and IT association, "more than ever, research in all disciplines is data-driven and compute intensive." Between 2015 and 2017, the percentage of institutions with institution-wide or targeted deployment of high-performance computing and high-throughput computing grew from 67% to 81%, Educause said.
In 2017, Educause data showed that 13% of doctoral institutions primarily funded their high performance computing initiatives with chargeback to research grants.
"According to our data, we've seen a growing trend amongst US doctoral institutions in investing in high-powered computing solutions," said Leah Lang, director of analytics services at Educause. "This trend is, in part, fueled by research grant funding."
Supercomputing on campus
With a growing research program and a 90% increase in artificial intelligence (AI) and supercomputing needs across campus, the University of Miami recently invested in a $3.7 million supercomputer called Triton, which researchers see as a key tool to help them grow their program and compete more effectively for grants.
"The grant funding is getting harder and harder from everywhere — federal agencies and other agencies that provide grant funding," said Dr. Nick Tsinoremas, founding director for the University of Miami Center for Computational Science. "A lot of investigators in the scientific field are competing for the same resources."
Triton is the third-generation supercomputer at the university, which started working with IBM in 2010. Officials chose Triton because "in the new era of artificial intelligence, big data, data science, and so forth, the workflow [was] a little anachronistic, and we wanted a little bit more real time or near real-time interactivity in batch processing and processing a lot of data,'' Tsinoremas explained.
With so much data coming at investigators today, a system needs fast processors to ingest and process it quickly.
There are three Vs in big data: Volume, variety and velocity, Tsinoremas explained. There are also two Ds: Disruption in the way research is done and the democratization of data, "so that the right data reaches the right people to make the right decisions at the right time, with the appropriate security safeguards," he said.
The competitive advantage
Every research university needs grants, and with Triton, the university now has a competitive edge, Tsinoremas maintained.
For example, one of their investigators has put together a consortium that does climate predictions for the entire globe. With Triton, the process of doing global climate predictions is now about 40% faster, according to Tsinoremas.
Thanks to an additional grant from the National Oceanic and Atmospheric Administration (NOAA), "we're able to do these kinds of climate predictions every week," Tsinoremas said. This enables the center to predict rainfall amounts two to three months in advance. "Before, we did climate predictions every spring.
"If [we didn't] have a supercomputer we couldn't apply for a grant, and now we can do the predictions,'' he said.
Since 2014, with a supercomputer in place, the University of Miami has secured $120 million in grants to support its research.