More stories

  •

    Breakthrough machine learning approach quickly produces higher-resolution climate data

    Researchers at the U.S. Department of Energy’s (DOE’s) National Renewable Energy Laboratory (NREL) have developed a novel machine learning approach to quickly enhance the resolution of wind velocity data by 50 times and solar irradiance data by 25 times — an enhancement that has never been achieved before with climate data.
    The researchers took a different approach, using adversarial training, in which the model produces physically realistic details by observing entire fields at a time and delivers high-resolution climate data at a much faster rate. This approach will enable scientists to complete renewable energy studies in future climate scenarios faster and more accurately.
    “To be able to enhance the spatial and temporal resolution of climate forecasts hugely impacts not only energy planning, but agriculture, transportation, and so much more,” said Ryan King, a senior computational scientist at NREL who specializes in physics-informed deep learning.
    King and NREL colleagues Karen Stengel, Andrew Glaws, and Dylan Hettinger authored a new article detailing their approach, titled “Adversarial super-resolution of climatological wind and solar data,” which appears in the journal Proceedings of the National Academy of Sciences of the United States of America.
    Accurate, high-resolution climate forecasts are important for predicting variations in wind, clouds, rain, and sea currents that fuel renewable energies. Short-term forecasts drive operational decision-making; medium-term weather forecasts guide scheduling and resource allocations; and long-term climate forecasts inform infrastructure planning and policymaking.
    However, it is very difficult to preserve temporal and spatial quality in climate forecasts, according to King. The lack of high-resolution data for different scenarios has been a major challenge in energy resilience planning. Various machine learning techniques have emerged to enhance the coarse data through super-resolution — the classic imaging process of sharpening a fuzzy image by adding pixels. But until now, no one had used adversarial training to super-resolve climate data.
    “Adversarial training is the key to this breakthrough,” said Glaws, an NREL postdoc who specializes in machine learning.
    Adversarial training is a way of improving the performance of neural networks by having them compete with one another to generate new, more realistic data. The NREL researchers trained two types of neural networks in the model — one to recognize physical characteristics of high-resolution solar irradiance and wind velocity data and another to insert those characteristics into the coarse data. Over time, one network gets better at producing realistic data while the other gets better at distinguishing real inputs from fake ones, and both improve as a result. The NREL researchers were able to add 2,500 pixels for every original pixel.
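    The "2,500 pixels" figure follows from the 50x factor applying along each spatial dimension: every coarse pixel becomes a 50-by-50 patch of fine pixels. A quick sketch of the arithmetic (not the authors' code; the function name is illustrative):

```python
def pixels_per_coarse_pixel(factor: int) -> int:
    """Each coarse grid cell becomes a factor-by-factor patch of fine cells."""
    return factor * factor

# Wind velocity: 50x per-dimension enhancement
print(pixels_per_coarse_pixel(50))   # 2500
# Solar irradiance: 25x per-dimension enhancement
print(pixels_per_coarse_pixel(25))   # 625
```

    The same per-dimension logic means a 100 x 100 coarse wind field super-resolved at 50x becomes a 5,000 x 5,000 grid.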
    “By using adversarial training — as opposed to the traditional numerical approach to climate forecasts, which can involve solving many physics equations — it saves computing time, data storage costs, and makes high-resolution climate data more accessible,” said Stengel, an NREL graduate intern who specializes in machine learning.
    This approach can be applied to a wide range of climate scenarios from regional to global scales, changing the paradigm for climate model forecasting.
    NREL is the U.S. Department of Energy’s primary national laboratory for renewable energy and energy efficiency research and development. NREL is operated for the Energy Department by the Alliance for Sustainable Energy, LLC.

  •

    Scientists create new device to light up the way for quantum technologies

    Researchers at CRANN and the School of Physics at Trinity College Dublin have created an innovative new device that will emit single particles of light, or photons, from quantum dots that are the key to practical quantum computers, quantum communications, and other quantum devices. The team has made a significant improvement on previous designs in […]

  •

    Context reduces racial bias in hate speech detection algorithms

    Understanding what makes something harmful or offensive can be hard enough for humans, never mind artificial intelligence systems. So, perhaps it’s no surprise that social media hate speech detection algorithms, designed to stop the spread of hateful speech, can actually amplify racial bias by blocking inoffensive tweets by black people or other minority group members. […]

  •

    Portable system boosts laser precision, at room temperature

    Physicists at MIT have designed a quantum “light squeezer” that reduces quantum noise in an incoming laser beam by 15 percent. It is the first system of its kind to work at room temperature, making it amenable to a compact, portable setup that may be added to high-precision experiments to improve laser measurements where quantum noise is a limiting factor.
    The heart of the new squeezer is a marble-sized optical cavity, housed in a vacuum chamber and containing two mirrors, one of which is smaller than the diameter of a human hair. The larger mirror stands stationary while the other is movable, suspended by a spring-like cantilever.
    The shape and makeup of this second “nanomechanical” mirror is the key to the system’s ability to work at room temperature. When a laser beam enters the cavity, it bounces between the two mirrors. The force imparted by the light makes the nanomechanical mirror swing back and forth in a way that allows the researchers to engineer the light exiting the cavity to have special quantum properties.
    The laser light can exit the system in a squeezed state, which can be used to make more precise measurements, for instance, in quantum computation and cryptology, and in the detection of gravitational waves.
    “The importance of the result is that you can engineer these mechanical systems so that at room temperature, they still can have quantum mechanical properties,” says Nergis Mavalvala, the Marble Professor and associate head of physics at MIT. “That changes the game completely in terms of being able to use these systems, not just in our own labs, housed in large cryogenic refrigerators, but out in the world.”
    The team has published its results in the journal Nature Physics. The paper’s lead author is Nancy Aggarwal, a former physics graduate student in the MIT LIGO Laboratory, now a postdoc at Northwestern University. Other co-authors on the paper along with Mavalvala are Robert Lanza and Adam Libson at MIT; Torrey Cullen, Jonathan Cripe, and Thomas Corbitt of Louisiana State University; and Garrett Cole, David Follman, and Paula Heu of Crystalline Mirror Solutions in Santa Barbara, California.

    A cold “showstopper”
    A laser contains multitudes of photons that stream out in synchronized waves to produce a bright, focused beam of light. Within this ordered configuration, however, there is a bit of randomness among a laser’s individual photons, in the form of quantum fluctuations, also known in physics as “shot noise.”
    For instance, the number of photons in a laser that arrive at a detector at any given time can fluctuate around an average number, in a quantum way that is difficult to predict. Likewise, the time at which a photon arrives at a detector, related to its phase, can also fluctuate around an average value.
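    These count fluctuations are commonly modeled with Poisson statistics, in which the spread in photon number grows as the square root of the mean, so the relative noise shrinks for brighter beams. A minimal simulation of that textbook model (not taken from the paper), assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

mean_photons = 10_000  # average photon count per detection window
counts = rng.poisson(lam=mean_photons, size=100_000)

# For Poisson statistics the standard deviation is ~sqrt(mean),
# so relative fluctuations fall off as 1/sqrt(N).
print(counts.mean())  # close to 10_000
print(counts.std())   # close to sqrt(10_000) = 100
```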
    Both of these values — the number and timing of a laser’s photons — determine how precisely researchers can interpret laser measurements. But according to the Heisenberg uncertainty principle, one of the foundational tenets of quantum mechanics, it is impossible to simultaneously measure both the position (or timing) and the momentum (or number) of particles with absolute certainty.
    Scientists work around this physical constraint through quantum squeezing — the idea that the uncertainty in a laser’s quantum properties, in this case the number and timing of photons, can be represented as a theoretical circle. A perfectly round circle symbolizes equal uncertainty in both properties. An ellipse — a squeezed circle — represents a smaller uncertainty for one property and a larger uncertainty for the other, depending on how the circle, and the ratio of uncertainty in a laser’s quantum properties, is manipulated.
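    In standard quantum-optics notation (which the article does not spell out), the circle-and-ellipse picture corresponds to a number-phase uncertainty relation whose product is bounded from below; squeezing with parameter $r$ rescales the two uncertainties in opposite directions while respecting the bound:

```latex
\Delta n \,\Delta\phi \ge \tfrac{1}{2},
\qquad
\Delta n \to e^{r}\,\Delta n,
\quad
\Delta\phi \to e^{-r}\,\Delta\phi .
```

    One property is then known more precisely at the expense of the other, which is exactly the squeezed ellipse described above.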

    One way researchers have carried out quantum squeezing is through optomechanical systems, designed with parts, such as mirrors, that can be moved to a tiny degree by incoming laser light. A mirror can move due to the force applied on it by photons that make up the light, and that force is proportional to the number of photons that hit the mirror at a given time. The distance the mirror moved at that time is connected to the timing of photons arriving at the mirror.
    Of course, scientists cannot know the precise values for both the number and timing of photons at a given time, but through this kind of system they can establish a correlation between the two quantum properties, and thereby squeeze down the uncertainty and the laser’s overall quantum noise.
    Until now, optomechanical squeezing has been realized in large setups that need to be housed in cryogenic freezers. That’s because, even at room temperature, the surrounding thermal energy is enough to have an effect on the system’s movable parts, causing a “jitter” that overwhelms any contribution from quantum noise. To shield against thermal noise, researchers have had to cool systems down to about 10 Kelvin, or -442 degrees Fahrenheit.
    “The minute you need cryogenic cooling, you can’t have a portable, compact squeezer,” Mavalvala says. “That can be a showstopper, because you can’t have a squeezer that lives in a big refrigerator, and then use it in an experiment or some device that operates in the field.”
    Giving light a squeeze
    The team, led by Aggarwal, looked to design an optomechanical system with a movable mirror made from materials that intrinsically absorb very little thermal energy, so that they would not need to cool the system externally. They ultimately designed a very small, 70-micron-wide mirror from alternating layers of gallium arsenide and aluminum gallium arsenide. Both materials are crystals with a very ordered atomic structure that gives them few internal channels for dissipating energy as heat.
    “Very disordered materials can easily lose energy because there are lots of places electrons can bang and collide and generate thermal motion,” Aggarwal says. “The more ordered and pure a material, the less places it has to lose or dissipate energy.”
    The team suspended this multilayer mirror with a small, 55-micron-long cantilever. The cantilever and multilayer mirror have also been shaped to absorb minimal thermal energy. Both the movable mirror and the cantilever were fabricated by Cole and his colleagues at Crystalline Mirror Solutions, and placed in a cavity with a stationary mirror.
    The system was then installed in a laser experiment built by Corbitt’s group at Louisiana State University, where the researchers made the measurements. With the new squeezer, the researchers were able to characterize the quantum fluctuations in the number of photons versus their timing, as the laser bounced and reflected off both mirrors. This characterization allowed the team to identify and thereby reduce the quantum noise from the laser by 15 percent, producing a more precise “squeezed” light.
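    Squeezing results are often quoted in decibels below shot noise; assuming the 15 percent figure refers to noise power, the conversion is a one-liner (a unit-conversion sketch, not a number reported in the article):

```python
import math

def squeezing_db(noise_power_remaining: float) -> float:
    """dB of noise suppression relative to the unsqueezed (shot-noise) level."""
    return -10 * math.log10(noise_power_remaining)

# A 15% reduction leaves 85% of the original noise power
print(round(squeezing_db(0.85), 2))  # 0.71
```

    By this convention, a 15 percent reduction corresponds to roughly 0.7 dB of squeezing.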
    Aggarwal has drawn up a blueprint for researchers to adapt the system to any wavelength of incoming laser light.
    “As optomechanical squeezers become more practical, this is the work that started it,” Mavalvala says. “It shows that we know how to make these room temperature, wavelength-agnostic squeezers. As we improve the experiment and materials, we’ll make better squeezers.”
    This research was funded, in part, by the U.S. National Science Foundation.

  •

    Quantum classifiers with tailored quantum kernel?

    Quantum information scientists have introduced a new method for machine learning classifications in quantum computing. The non-linear quantum kernels in a quantum binary classifier provide new insights for improving the accuracy of quantum machine learning, deemed able to outperform the current AI technology. The research team led by Professor June-Koo Kevin Rhee from the School […]

  •

    New evidence helps form digital reconstruction of most important medieval shrine

    The shrine of Saint Thomas Becket, the most important pilgrimage destination in medieval England — visited for hundreds of years by pilgrims seeking miraculous healing — has been digitally reconstructed for the public, according to how experts believe it appeared before its destruction. In the 1530s, the Reformation in England saw the ornaments and riches […]

  •

    Atomic 'Swiss army knife' precisely measures materials for quantum computers

    It images single atoms. It maps atomic-scale hills and valleys on metal and insulating surfaces. And it records the flow of current across atom-thin materials subject to giant magnetic fields. Scientists at the National Institute of Standards and Technology (NIST) have developed a novel instrument that can make three kinds of atom-scale measurements simultaneously. Together, […]