
Scientists use artificial intelligence to detect gravitational waves

When gravitational waves were first detected in 2015 by the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO), they sent a ripple through the scientific community: the detection confirmed a key prediction of Einstein’s general theory of relativity and marked the birth of gravitational wave astronomy. In the five years since, numerous gravitational wave sources have been detected, including the first observation of two colliding neutron stars in both gravitational and electromagnetic waves.

As LIGO and its international partners continue to upgrade their detectors’ sensitivity to gravitational waves, they will be able to probe a larger volume of the universe, making the detection of gravitational wave sources a daily occurrence. This deluge of discoveries will launch the era of precision multi-messenger astronomy, which draws on extrasolar messenger phenomena including electromagnetic radiation, gravitational waves, neutrinos and cosmic rays. Realizing this goal, however, will require a radical rethinking of the methods used to search for and detect gravitational waves.

Recently, Eliu Huerta, computational scientist and lead for translational artificial intelligence (AI) at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, working with collaborators from Argonne, the University of Chicago, the University of Illinois at Urbana-Champaign, NVIDIA and IBM, developed a new production-scale AI framework that allows for accelerated, scalable and reproducible detection of gravitational waves.

The new framework shows that AI models can be as sensitive as traditional template-matching algorithms while running orders of magnitude faster. Furthermore, these AI algorithms require only an inexpensive graphics processing unit (GPU), like those found in video gaming systems, to process Advanced LIGO data faster than real time.
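To make the idea concrete, below is a minimal sketch, assuming a PyTorch-style workflow, of the kind of one-dimensional convolutional classifier commonly used in deep-learning searches over detector strain data. The architecture, layer sizes, class labels and the 4096-sample segment length are illustrative assumptions, not the model published by the team:

```python
# Illustrative sketch only: a small 1D CNN that scores short segments
# of whitened strain data as noise vs. candidate signal.
# All hyperparameters here are assumptions for demonstration.
import torch
import torch.nn as nn

class StrainClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=16, stride=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=8, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # collapse the time axis
        )
        self.head = nn.Linear(64, 2)          # two classes: noise, signal

    def forward(self, x):                     # x: (batch, 1, 4096)
        return self.head(self.features(x).squeeze(-1))

# Inference runs comfortably on a single consumer-grade GPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = StrainClassifier().to(device).eval()
with torch.no_grad():
    batch = torch.randn(256, 1, 4096, device=device)   # stand-in strain
    p_signal = model(batch).softmax(dim=-1)[:, 1]      # P(signal) per segment
```

Once trained, such a network classifies an entire batch of segments in a single forward pass, which is why inference can outpace the data rate even on modest hardware.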

The AI ensemble used in this study processed an entire month (August 2017) of Advanced LIGO data in less than seven minutes by distributing the dataset over 64 NVIDIA V100 GPUs. It identified all four binary black hole mergers previously reported in that dataset and produced no misclassifications.
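The data-parallel pattern behind that throughput, partitioning a long stretch of strain data into segments, sharding them across GPUs and averaging the scores of an ensemble of models, can be sketched as follows. This is a hedged illustration building on the hypothetical StrainClassifier above; it dispatches shards sequentially for simplicity, whereas a production pipeline would launch them concurrently:

```python
# Illustrative sketch only: shard strain segments across available GPUs
# and average the scores of an ensemble of models for each segment.
import torch

def ensemble_scan(strain, models, segment_len=4096):
    """Return one averaged P(signal) per segment of a 1D strain tensor."""
    n_gpus = max(torch.cuda.device_count(), 1)
    n_seg = strain.numel() // segment_len
    segments = strain[: n_seg * segment_len].view(n_seg, 1, segment_len)
    shards = segments.chunk(n_gpus)            # one shard per GPU

    scores = []
    for gpu, shard in enumerate(shards):
        device = f"cuda:{gpu}" if torch.cuda.is_available() else "cpu"
        shard = shard.to(device)
        with torch.no_grad():
            per_model = [m.to(device).eval()(shard).softmax(-1)[:, 1]
                         for m in models]      # each model scores the shard
            scores.append(torch.stack(per_model).mean(0).cpu())
    return torch.cat(scores)

# Usage sketch: flag candidate events where the ensemble is confident.
# candidates = (ensemble_scan(month_of_strain, [model_a, model_b]) > 0.99).nonzero()
```

Requiring agreement among several independently trained models is one way an ensemble can suppress noise-induced false alarms, consistent with the zero misclassifications reported above.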

“As a computer scientist, what’s exciting to me about this project,” said Ian Foster, director of Argonne’s Data Science and Learning (DSL) division, “is that it shows how, with the right tools, AI methods can be integrated naturally into the workflows of scientists — allowing them to do their work faster and better — augmenting, not replacing, human intelligence.”

Bringing disparate resources to bear, this interdisciplinary, multi-institutional team has published a paper in Nature Astronomy showcasing a data-driven approach that pools the collaborators’ supercomputing resources to enable reproducible, accelerated, AI-driven gravitational wave detection.

“In this study, we’ve used the combined power of AI and supercomputing to help solve timely and relevant big-data experiments. We are now making AI studies fully reproducible, not merely ascertaining whether AI may provide a novel solution to grand challenges,” Huerta said.

Building upon the interdisciplinary nature of this project, the team looks forward to new applications of this data-driven framework beyond big-data challenges in physics.

“This work highlights the significant value of data infrastructure to the scientific community,” said Ben Blaiszik, a research scientist at Argonne and the University of Chicago. “The long-term investments that have been made by DOE, the National Science Foundation (NSF), the National Institute of Standards and Technology and others have created a set of building blocks. It is possible for us to bring these building blocks together in new and exciting ways to scale this analysis and to help deliver these capabilities to others in the future.”

Huerta and his research team developed their new framework through the support of the NSF, Argonne’s Laboratory Directed Research and Development (LDRD) program and DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program.

“These NSF investments contain original, innovative ideas that hold significant promise of transforming the way scientific data arriving in fast streams are processed. The planned activities are bringing accelerated and heterogeneous computing technology to many scientific communities of practice,” said Manish Parashar, director of the Office of Advanced Cyberinfrastructure at NSF.


Source: www.sciencedaily.com
