More stories

  • Intense drought or flash floods can shock the global economy

    Extremes in rainfall — whether intense drought or flash floods — can catastrophically slow the global economy, researchers report in the Jan. 13 Nature. And those impacts are most felt by wealthy, industrialized nations, the researchers found.

    A global analysis showed that episodes of intense drought led to the biggest shocks to economic productivity. But days with intense deluges — such as those that occurred in July 2021 in Europe — also produced strong shocks to the economic system (SN: 8/23/21). Most surprising, though, was that agricultural economies appeared to be relatively resilient against these types of shocks, says Maximilian Kotz, an environmental economist at the Potsdam Institute for Climate Impact Research in Germany. Instead, two other business sectors — manufacturing and services — were hit hardest.

    As a result, the nations most affected by rainfall extremes weren’t those that tended to be poorer, with agriculture-dependent societies, but the wealthiest nations, whose economies are tied more heavily to manufacturing and services, such as banking, health care and entertainment.

    It’s well established that rising temperatures can take a toll on economic productivity, for example by contributing to days lost at work or doctors’ visits (SN: 11/28/18). Extreme heat also has clear impacts on human behavior (SN: 8/18/21). But what effect climate change–caused shifts in rainfall might have on the global economy hasn’t been so straightforward.

    That’s in part because previous studies looking at a possible connection between rainfall and productivity have focused on changes in yearly precipitation, a timeframe that “is just too coarse to really describe what’s actually happening [in] the economy,” Kotz says. Such studies showed that more rain in a given year was basically beneficial, which makes sense in that having more water available is good for agriculture and other human activities, he adds. “But these findings were mainly focused on agriculturally dependent economies and poorer economies.”

    In the new study, Kotz and his colleagues looked at three timescales — annual, monthly and daily rainfall — and examined what happened to economic output for time periods in which the rainfall deviated from average historical values. In particular, Kotz says, they introduced two new measures not considered in previous studies: the number of rainy days that a region gets in a year and extreme daily rainfall. The team then examined these factors across 1,554 regions around the world — which included many subregions within 77 countries — from 1979 to 2019.
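
    The daily-scale measures are easiest to picture with a toy calculation. The sketch below is illustrative only, not the study's code: the wet-day threshold, the 99.9th-percentile cutoff for "extreme" days and the synthetic rainfall record are assumptions chosen for demonstration. It counts wet days per year and sums rainfall on extreme days, then expresses both as deviations from the long-run mean, the kind of anomaly that can then be related to regional economic output.

      # Toy illustration (not the authors' code): annual wet-day counts and
      # extreme daily rainfall, expressed as deviations from a historical mean.
      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic daily rainfall (mm) for a 41-year record, 365 days per year
      daily_rain = rng.gamma(shape=0.4, scale=8.0, size=(41, 365))

      WET_DAY_MM = 1.0                               # assumed wet-day threshold
      extreme_cut = np.percentile(daily_rain, 99.9)  # assumed extreme-day cutoff

      wet_days_per_year = (daily_rain > WET_DAY_MM).sum(axis=1)
      extreme_rain_per_year = np.where(daily_rain > extreme_cut, daily_rain, 0.0).sum(axis=1)

      # Express each year as a deviation from the long-run historical mean
      wet_day_anomaly = wet_days_per_year - wet_days_per_year.mean()
      extreme_anomaly = extreme_rain_per_year - extreme_rain_per_year.mean()

      print(wet_day_anomaly[-5:])
      print(extreme_anomaly[-5:])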

    The disparity over which regions are hit hardest is “at odds with the conventional wisdom” — and with some previous studies — that agriculture is vulnerable to extreme rainfall, writes Xin-Zhong Liang, an atmospheric scientist at the University of Maryland in College Park, in a commentary in the same issue of Nature. Researchers may need to incorporate other factors in future assessments, such as growth stages of crops, land drainage or irrigation, in order to really understand how these extremes affect agriculture, Liang writes.

    “That was definitely surprising for us as well,” Kotz says. Although the study doesn’t specifically try to answer why manufacturing and services were so affected, it makes intuitive sense, he says. Flooding, for example, can damage infrastructure and disrupt transportation, effects that can then propagate along supply chains. “It’s feasible that these things might be most important in manufacturing, where infrastructure is very important, or in the services sectors, where the human experience is very much dictated by these daily aspects of weather and rainfall.”

    Including daily and monthly rainfall extremes in this type of analysis was “an important innovation” because it revealed new economic vulnerabilities, says Tamma Carleton, an environmental economist at the University of California, Santa Barbara, who was not involved in the new work. However, Carleton says, “the findings in the paper are not yet conclusive on who is most vulnerable and why, and instead raise many important questions for future research to unpack.”

    Extreme rainfall events, including both drought and deluge, will occur more frequently as global temperatures rise, the United Nations’ Intergovernmental Panel on Climate Change noted in August (SN: 8/9/21). The study’s findings, Kotz says, offer yet another stark warning to the industrialized, wealthy world: Human-caused climate change will have “large economic consequences.”

  • Quantum dots boost perovskite solar cell efficiency and scalability

    Perovskites are hybrid compounds made from metal halides and organic constituents. They show great potential in a range of applications, such as LEDs, lasers and photodetectors, but their biggest impact is expected in solar cells, where they are poised to take over the market from their silicon counterparts.
    One of the obstacles facing the commercialization of perovskite solar cells is that their power-conversion efficiency and operational stability drop as they scale up, making it a challenge to maintain high performance in a complete solar cell.
    The problem lies partly with the cell’s electron-transport layer, which ensures that the electrons produced when the cell absorbs light are transferred efficiently to the device’s electrode. In perovskite solar cells, the electron-transport layer is made of mesoporous titanium dioxide, which has low electron mobility and is also susceptible to adverse photocatalytic events under ultraviolet light.
    In a new publication in Science, scientists led by Professor Michael Grätzel at EPFL and Dr Dong Suk Kim at the Korea Institute of Energy Research have found an innovative way to increase the performance and maintain it at a high level in perovskite solar cells even at large scales. The innovative idea was to replace the electron-transport layer with a thin layer of quantum dots.
    Quantum dots are nanometer-sized particles that act as semiconductors and emit light of specific wavelengths (colors) when they are illuminated. Their unique optical properties make quantum dots ideal for use in a variety of optical applications, including photovoltaic devices.
    The scientists replaced the titanium dioxide electron-transport layer of their perovskite cells with a thin layer of polyacrylic acid-stabilized tin(IV) oxide quantum dots, and found that it enhanced the devices’ light-capturing capacity, while also suppressing nonradiative recombination, an efficiency-sapping phenomenon that sometimes takes place at the interface between the electron-transport layer and the perovskite layer itself.
    By using the quantum dot layer, the researchers found that perovskite solar cells of 0.08 square centimeters attained a record power-conversion efficiency of 25.7% (certified 25.4%) and high operational stability, while also facilitating scale-up. When the surface area of the solar cells was increased to 1, 20, and 64 square centimeters, the power-conversion efficiency measured 23.3%, 21.7% and 20.6%, respectively.
    Other contributors: Ulsan National Institute of Science and Technology, University of Ulsan, Zurich University of Applied Sciences, and Uppsala University.
    Story Source:
    Materials provided by Ecole Polytechnique Fédérale de Lausanne. Original written by Nik Papageorgiou.

  • Advancing materials science with the help of biology and a dash of dish soap

    Compounds that form tiny crystals hold secrets that could advance renewable energy generation and semiconductor development. Revealing the arrangement of their atoms has already allowed for breakthroughs in materials science and solar cells. However, existing techniques for determining these structures can damage sensitive microcrystals.
    Now scientists have a new tool in their tool belts: a system for investigating microcrystals by the thousands with ultrafast pulses from an X-ray free-electron laser (XFEL), which can collect structural information before damage sets in. This approach, developed over the past decade to study proteins and other large biological molecules at the Department of Energy’s SLAC National Accelerator Laboratory, has now been applied for the first time to small molecules that are of interest to chemistry and materials science.
    Researchers from the University of Connecticut, SLAC, DOE’s Lawrence Berkeley National Laboratory and other institutions developed the new process, called small molecule serial femtosecond X-ray crystallography or smSFX, to determine the structures of three compounds that form microcrystal powders, including two that were previously unknown. The experiments took place at SLAC’s Linac Coherent Light Source (LCLS) XFEL and the SACLA XFEL in Japan.
    The new approach is likely to have a big impact since it should be “broadly applicable across XFEL and synchrotron radiation facilities equipped for serial crystallography,” the research team wrote in a paper published today in Nature.
    Disentangling metal compounds
    Researchers used the method to determine the structures of two metal-organic materials, thiorene and tethrene, for the first time. Both are potential candidates for use in next-generation field effect transistors, energy storage devices, and solar cells and panels. Mapping thiorene and tethrene allowed researchers to better understand why some other metal-organic materials glow bright blue under ultraviolet light, which the scientists compared to Frodo’s magical sword, Sting, in The Lord of the Rings.

  • Researchers simulate behavior of living 'minimal cell' in three dimensions

    Scientists report that they have built a living “minimal cell” with a genome stripped down to its barest essentials — and a computer model of the cell that mirrors its behavior. By refining and testing their model, the scientists say they are developing a system for predicting how changes to the genomes, living conditions or physical characteristics of live cells will alter how they function.
    They report their findings in the journal Cell.
    Minimal cells have pared-down genomes that carry the genes necessary to replicate their DNA, grow, divide and perform most of the other functions that define life, said Zaida (Zan) Luthey-Schulten, a chemistry professor at the University of Illinois Urbana-Champaign who led the work with graduate student Zane Thornburg. “What’s new here is that we developed a three-dimensional, fully dynamic kinetic model of a living minimal cell that mimics what goes on in the actual cell,” Luthey-Schulten said.
    The simulation maps out the precise location and chemical characteristics of thousands of cellular components in 3D space at an atomic scale. It tracks how long it takes for these molecules to diffuse through the cell and encounter one another, what kinds of chemical reactions occur when they do, and how much energy is required for each step.
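    As a much-simplified illustration of what a kinetic model tracks, the sketch below runs Gillespie's stochastic simulation algorithm for a single hypothetical reaction pair (transcription and mRNA decay) in a well-mixed volume. The species, rate constants and two-hour window are invented placeholders; the model described above is spatially resolved and vastly larger.

      # Minimal, well-mixed sketch of a stochastic kinetic simulation
      # (Gillespie's algorithm). Species and rates are hypothetical placeholders.
      import numpy as np

      rng = np.random.default_rng(1)

      k_transcribe = 0.5   # mRNA produced per second (assumed)
      k_degrade = 0.005    # per-mRNA degradation rate per second (assumed)

      t, t_end = 0.0, 7200.0   # simulate a two-hour cell cycle
      mrna = 0
      while t < t_end:
          rates = np.array([k_transcribe, k_degrade * mrna])
          total = rates.sum()
          t += rng.exponential(1.0 / total)      # time to the next reaction event
          if rng.random() < rates[0] / total:    # choose which reaction fired
              mrna += 1                          # transcription
          else:
              mrna -= 1                          # degradation

      print(f"mRNA copies after {t_end / 3600:.0f} h: {mrna}")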
    To build the minimal cell, scientists at the J. Craig Venter Institute in La Jolla, California, turned to the simplest living cells — the mycoplasmas, a genus of bacteria that parasitize other organisms. In previous studies, the JCVI team built a synthetic genome missing as many nonessential genes as possible and grew the cell in an environment enriched with all the nutrients and factors needed to sustain it. For the new study, the team added back a few genes to improve the cell’s viability. This cell is simpler than any naturally occurring cell, making it easier to model on a computer.
    Simulating something as enormous and complex as a living cell relies on data from decades of research, Luthey-Schulten said. To build the computer model, she and her colleagues at Illinois had to account for the physical and chemical characteristics of the cell’s DNA; lipids; amino acids; and gene-transcription, translation and protein-building machinery. They also had to model how each component diffused through the cell, keeping track of the energy required for each step in the cell’s life cycle. NVIDIA graphics processing units were used to perform the simulations.
    “We built a computer model based on what we knew about the minimal cell, and then we ran simulations,” Thornburg said. “And we checked to see if our simulated cell was behaving like the real thing.”
    The simulations gave the researchers insight into how the actual cell “balances the demands of its metabolism, genetic processes and growth,” Luthey-Schulten said. For example, the model revealed that the cell used the bulk of its energy to import essential ions and molecules across its cell membrane. This makes sense, Luthey-Schulten said, because mycoplasmas get most of what they need to survive from other organisms.
    The simulations also allowed Thornburg to calculate the natural lifespan of messenger RNAs, the genetic blueprints for building proteins. They also revealed a relationship between the rate at which lipids and membrane proteins were synthesized and changes in membrane surface area and cell volume.
    “We simulated all of the chemical reactions inside a minimal cell — from its birth until the time it divides two hours later,” Thornburg said. “From this, we get a model that tells us about how the cell behaves and how we can complexify it to change its behavior.”
    “We developed a three-dimensional, fully dynamic kinetic model of a living minimal cell,” Luthey-Schulten said. “Our model opens a window on the inner workings of the cell, showing us how all of the components interact and change in response to internal and external cues. This model — and other, more sophisticated models to come — will help us better understand the fundamental principles of life.”

  • Towards quantum simulation of false vacuum decay

    Phase transitions are everywhere, ranging from water boiling to snowflakes melting, and from magnetic transitions in solids to cosmological phase transitions in the early universe. Particularly intriguing are quantum phase transitions that occur at temperatures close to absolute zero and are driven by quantum rather than thermal fluctuations.
    Researchers at the University of Cambridge studied properties of quantum phases and their transitions using ultracold atoms in an optical lattice potential (formed by a set of standing wave lasers). Typically, the transition from a Mott insulator (MI) to a superfluid (SF), which is governed by the interplay of the atom-atom interactions and the hopping of atoms, is a continuous transition, where the system undergoes a smooth, continuous change as it crosses the phase transition point.
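    For orientation, the interplay of atom-atom interactions and hopping is conventionally captured by the single-band Bose-Hubbard Hamiltonian written below; this is the textbook form, not the full shaken, two-band Hamiltonian realised in the experiment.

      H = -J \sum_{\langle i,j \rangle} \hat{b}_i^{\dagger} \hat{b}_j
          + \frac{U}{2} \sum_i \hat{n}_i \left( \hat{n}_i - 1 \right)
          - \mu \sum_i \hat{n}_i

    Here J is the hopping amplitude between neighbouring lattice sites, U the on-site interaction energy and μ the chemical potential; the Mott insulator is favoured when U dominates J, and the superfluid when J dominates U.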
    However, many phase transitions are discontinuous, such as water freezing to ice, or the transition thought to have triggered the inflation period in the early universe. These are called ‘first-order transitions’ and for instance allow both phases to coexist — just like ice blocks in a glass of water — and can lead to hysteresis and metastability, where a system remains stuck in its original phase (the false vacuum) even though the ground state has changed.
    By resonantly shaking the position of the lattice potential, the researchers could couple or “mix” the first two bands of the lattice. For the right parameters, this can excite the atoms from the lowest band into the first excited band, where they would form a new superfluid in which the atoms appear at the edge of the band. Crucially, the transition from the original Mott insulator in the lowest band to the resulting staggered superfluid in the excited band can be first-order (discontinuous), because the non-staggered order in the Mott insulator is incompatible with the staggered order of this superfluid — so the system has to choose one. The researchers could directly observe the metastability and hysteresis associated with this first-order transition by monitoring how quickly, if at all, one phase changed into the other. The findings are published in the journal Nature Physics.
    “We realised a very flexible platform where phase transitions could be tuned from continuous to discontinuous by changing the shaking strength. This demonstration opens up new opportunities for exploring the role of quantum fluctuations in first-order phase transitions, for instance, the false vacuum decay in the early universe,” said first author Dr Bo Song from Cambridge’s Cavendish Laboratory. “It is really fascinating that we are on the road to cracking the mystery of the hot and dense early universe using such a cold and tiny atomic ensemble.”
    “We are excited to enhance the scope of quantum simulators from condensed matter settings towards potential simulations of the early universe. While there clearly is a long way still to go, this work is an important first step,” added Professor Ulrich Schneider, who led the research at the Cavendish Laboratory. “This work also provides a testbed for exploring the spontaneous formation of spatial structures when a strongly interacting quantum system undergoes a discontinuous transition.”
    “The underlying physics involves ideas that have a long history at the Cavendish, from Nevill Mott (on correlations) to Pyotr Kapitsa (on superfluidity), and even using shaking to effect dynamical control in a manner explained by Kapitsa but put to use in a way he would never have envisaged,” explained Professor Nigel Cooper, also from the Cavendish.
    The research is funded in part by the European Research Council (ERC), and the UK Engineering and Physical Sciences Research Council (EPSRC) as well as the Simons Foundation.
    Story Source:
    Materials provided by University of Cambridge.

  • Towards compact quantum computers thanks to topology

    Researchers at PSI have compared the electron distribution below the oxide layer of two semiconductors. The investigation is part of an effort to develop particularly stable quantum bits, and thus, in turn, particularly efficient quantum computers. They have now published their latest research, which is supported in part by Microsoft, in the scientific journal Advanced Quantum Technologies.
    By now, the future of computing is inconceivable without quantum computers. For the most part, these are still in the research phase. They hold the promise of speeding up certain calculations and simulations by orders of magnitude compared to classical computers.
    Quantum bits, or qubits for short, form the basis of quantum computers. So-called topological quantum bits are a novel type that might prove to be superior. To find out how these could be created, an international team of researchers has carried out measurements at the Swiss Light Source SLS at PSI.
    More stable quantum bits
    “Computer bits that follow the laws of quantum mechanics can be achieved in different ways,” explains Niels Schröter, one of the study’s authors. He was a researcher at PSI until April 2021, when he moved to the Max Planck Institute of Microstructure Physics in Halle, Germany. “Most types of qubits unfortunately lose their information quickly; you could say they are forgetful qubits.” There is a technical solution to this: Each qubit is backed up with a system of additional qubits that correct any errors that occur. But this means that the total number of qubits needed for an operational quantum computer quickly rises into the millions.
    “Microsoft’s approach, which we are now collaborating on, is quite different,” Schröter continues. “We want to help create a new kind of qubit that is immune to leakage of information. This would allow us to use just a few qubits to achieve a slim, functioning quantum computer.”
    The researchers hope to obtain such immunity with so-called topological quantum bits. These would be something completely new that no research group has yet been able to create.

  • A new approach to a $1 million mathematical enigma

    Numbers like π, e and φ often turn up in unexpected places in science and mathematics. Pascal’s triangle and the Fibonacci sequence also seem inexplicably widespread in nature. Then there’s the Riemann zeta function, a deceptively straightforward function that has perplexed mathematicians since the 19th century. The most famous quandary, the Riemann hypothesis, is perhaps the greatest unsolved question in mathematics, with the Clay Mathematics Institute offering a $1 million prize for a correct proof.
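    For readers who have not met it, the function and the hypothesis can be stated compactly; the definitions below are standard and are included for orientation rather than taken from the paper.

      \zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^{s}}
               = \prod_{p \,\mathrm{prime}} \frac{1}{1 - p^{-s}}, \qquad \mathrm{Re}(s) > 1

    The function is extended to the rest of the complex plane by analytic continuation, and the Riemann hypothesis asserts that every nontrivial zero of the zeta function has real part 1/2.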
    UC Santa Barbara physicist Grant Remmen believes he has a new approach for exploring the quirks of the zeta function. He has found an analogue that translates many of the function’s important properties into quantum field theory. This means that researchers can now leverage the tools from this field of physics to investigate the enigmatic and oddly ubiquitous zeta function. His work could even lead to a proof of the Riemann hypothesis. Remmen lays out his approach in the journal Physical Review Letters.
    “The Riemann zeta function is this famous and mysterious mathematical function that comes up in number theory all over the place,” said Remmen, a postdoctoral scholar at UCSB’s Kavli Institute for Theoretical Physics. “It’s been studied for over 150 years.”
    An outside perspective
    Remmen generally doesn’t work on cracking the biggest questions in mathematics. He’s usually preoccupied with chipping away at the biggest questions in physics. As the fundamental physics fellow at UC Santa Barbara, he normally devotes his attention to topics like particle physics, quantum gravity, string theory and black holes. “In modern high-energy theory, the physics of the largest scales and smallest scales both hold the deepest mysteries,” he remarked.
    One of his specialties is quantum field theory, which he describes as a “triumph of 20th century physics.” Most people have heard of quantum mechanics (subatomic particles, uncertainty, etc.) and special relativity (time dilation, E=mc², and so forth). “But with quantum field theory, physicists figured out how to combine special relativity and quantum mechanics into a description of how particles moving at or near the speed of light behave,” he explained.

  • New simulations can improve avalanche forecasting

    Computer simulations of snow cover can accurately forecast avalanche hazard, according to a new international study involving researchers from Simon Fraser University.
    Currently, avalanche forecasts in Canada are made by experienced professionals who rely on data from local weather stations and on-the-ground observations from ski and backcountry ski operators, avalanche control workers for transportation and industry, and volunteers who manually test the snowpack.
    But simulated snow cover models developed by a team of researchers are able to detect and track weak layers of snow and identify avalanche hazard in a completely different way — and can provide forecasters with another reliable tool when local data is insufficient or not available, according to a new study published in the journal Cold Regions Science and Technology.
    “As far as natural hazards go, avalanches are still one of the leading causes of fatalities in Canada,” says Simon Horton, a post-doctoral fellow with the SFU Centre for Natural Hazards Research and a forecaster with Avalanche Canada. “We’ve had these complex models that simulate the layers in the snowpack for a few decades now and they’re getting more and more accurate, but it’s been difficult to find out how to apply that to actual decision-making and improving safety.”
    Researchers took 16 years’ worth of daily meteorological, snow cover and avalanche data from two sites in Canada (Whistler and Rogers Pass, both in British Columbia) and from Weissfluhjoch in Davos, Switzerland, and ran computer simulations that could classify different avalanche situations.
    The simulations could determine avalanche risk, for either natural or artificial release, for problem types such as new snow, wind slab, persistent weak layers and wet snow conditions.
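    To make the idea concrete, the toy sketch below maps a handful of simulated snowpack features onto those problem types with hand-written rules. The feature names and thresholds are invented for illustration; the study's actual classification scheme is considerably more sophisticated.

      # Toy illustration only: a hypothetical rule-based classifier mapping
      # simulated snow-cover features to common avalanche problem types.
      def classify_avalanche_problems(new_snow_cm, wind_speed_ms,
                                      weak_layer_present, liquid_water_pct):
          problems = []
          if new_snow_cm >= 30:
              problems.append("new snow")
          if new_snow_cm >= 10 and wind_speed_ms >= 10:
              problems.append("wind slab")
          if weak_layer_present:
              problems.append("persistent weak layer")
          if liquid_water_pct >= 3:
              problems.append("wet snow")
          return problems or ["no distinct problem"]

      # Example: a simulated day with heavy snowfall, strong wind and a buried weak layer
      print(classify_avalanche_problems(35, 14, weak_layer_present=True, liquid_water_pct=0))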
    “In the avalanche forecasting world, describing avalanche problems — the common scenarios that you might expect to find — is a good way for forecasters to describe avalanche hazard and communicate it to the public, so they know what kind of conditions to expect when they head out,” says Horton. “So that information is already available, except those are all done through expert assessment based on what they know from available field observations. In a lot of situations, there’s a fair bit of uncertainty about the human assessment of what these types of avalanche problems will be.
    “That’s where having more automated tools that can help predict potential hazards can help forecasters better prepare an accurate, precise forecast.”
    The results of the study showed the modelling was consistent with the real observed frequencies of avalanches over those 16 years and that the approach has potential to support avalanche forecasting in the future.
    Researchers also believe the modelling might be useful to study the future impacts of climate change on snow instability.
    Story Source:
    Materials provided by Simon Fraser University.