More stories

  • When water is coming from all sides

    When Hurricanes Harvey (2017) and Florence (2018) hit, it was not solely the storm surge from the Gulf of Mexico and Atlantic Ocean that led to flooding. Inland sources, like rain-swollen rivers, lakes, and suburban culverts, also contributed significantly. These factors were missed by many computer models at the time, which underestimated the flood risk.
    “People don’t care as much as to whether flooding is coming from the river or the ocean, especially when both contribute to water levels, as they want to know, ‘Is my house going to be flooded?'” said Edward Myers, branch chief of the Coastal Marine Modeling Branch, located in the Coast Survey Development Laboratory at the National Oceanic and Atmospheric Administration (NOAA).
    Myers and his colleagues at NOAA are collaborating with Y. Joseph Zhang from the Virginia Institute of Marine Science (VIMS) at William & Mary to develop and test the world’s first three-dimensional operational storm surge model.
    “We started with the right attitude and the right core algorithm,” joked Zhang, research professor at the Center for Coastal Resources Management. “Over the years, we’ve re-engineered the dynamic core multiple times and that led to the current modeling system.”
    Now in its third incarnation, the Semi-implicit Cross-scale Hydroscience Integrated System Model (SCHISM) forecasts coastal flooding in Taiwan, at agencies across the European Union, and elsewhere. It is being considered for operational use by NOAA. (The researchers described the system in the Nov. 2021 issue of EOS, the science news magazine of the American Geophysical Union.)
    SCHISM is designed to serve the needs of a wide range of potential users. “Compound surge and flooding is a world-wide hazard,” Zhang said. “It’s notoriously challenging, especially in the transition zone where the river meets the sea. Lots of factors come into play and interact non-linearly.”
    Surrounding the hydrodynamic core of SCHISM are numerous modules that simulate other phenomena important to flooding. These include air-sea exchange, vegetation, and sediment. Other modules adapt the system for specific events, like oil spills, or to predict conditions, like water quality.

  • Machine learning for morphable materials

    Flat materials that can morph into three-dimensional shapes have potential applications in architecture, medicine, robotics, space travel, and much more. But programming these shape changes requires complex and time-consuming computations.
    Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a platform that uses machine learning to program the transformation of 2D stretchable surfaces into specific 3D shapes.
    “While machine learning methods have been classically employed for image recognition and language processing, they have also recently emerged as powerful tools to solve mechanics problems,” said Katia Bertoldi, the William and Ami Kuan Danoff Professor of Applied Mechanics at SEAS and senior author of the study. “In this work we demonstrate that these tools can be extended to study the mechanics of transformable, inflatable systems.”
    The research is published in Advanced Functional Materials.
    The research team began by dividing an inflatable membrane into a 10×10 grid of 100 square pixels that can each be either soft or stiff. The soft and stiff pixels can be combined in a vast number of configurations (2^100, or roughly 10^30), making manual programming extremely difficult. That’s where machine learning comes in.
    The researchers used what’s known as finite element simulations to sample this vast design space. Then neural networks used that sample to learn how the location of soft and stiff pixels controls the deformation of the membrane when it is pressurized.
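    The workflow lends itself to a compact surrogate-modeling sketch. The code below is a hypothetical illustration of that idea, not the SEAS team's actual pipeline: the finite element solver is replaced by a placeholder function, and a small scikit-learn network learns to map a 10×10 soft/stiff pattern to a single deformation summary.

    ```python
    # Hypothetical sketch of the surrogate-learning idea (not the authors' code):
    # a placeholder stands in for the finite element solver, and a small neural
    # network learns the mapping from pixel pattern to deformation.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    def placeholder_deflection(pattern):
        """Stand-in for a finite element simulation: returns a mock deflection
        summary for a 10x10 binary stiffness pattern (0 = soft, 1 = stiff)."""
        softness = 1.0 - pattern
        return float(softness.mean()) + 0.05 * rng.normal()

    # 1) Sample the 2^100 design space (here, 2,000 random pixel patterns).
    patterns = rng.integers(0, 2, size=(2000, 10, 10))
    targets = np.array([placeholder_deflection(p) for p in patterns])

    # 2) Fit a small neural-network surrogate on the sampled designs.
    X = patterns.reshape(len(patterns), -1).astype(float)
    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
    surrogate.fit(X, targets)

    # 3) The surrogate scores unseen pixel layouts almost instantly, which is
    #    what makes searching for a layout that yields a target 3D shape tractable.
    candidate = rng.integers(0, 2, size=(1, 100)).astype(float)
    print(surrogate.predict(candidate))
    ```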

  • New cloud-based platform opens genomics data to all

    Harnessing the power of genomics to find risk factors for major diseases or search for relatives relies on the costly and time-consuming ability to analyze huge numbers of genomes. A team co-led by a Johns Hopkins University computer scientist has leveled the playing field by creating a cloud-based platform that grants genomics researchers easy access to one of the world’s largest genomics databases.
    Known as AnVIL (Genomic Data Science Analysis, Visualization, and Informatics Lab-space), the new platform gives any researcher with an Internet connection access to thousands of analysis tools, patient records, and more than 300,000 genomes. The work, a project of the National Human Genome Research Institute (NHGRI), appears today in Cell Genomics.
    “AnVIL is inverting the model of genomics data sharing, offering unprecedented new opportunities for science by connecting researchers and datasets in new ways and promising to enable exciting new discoveries,” said project co-leader Michael Schatz, Bloomberg Distinguished Professor of Computer Science and Biology at Johns Hopkins.
    Typically, genomic analysis starts with researchers downloading massive amounts of data from centralized warehouses to their own data centers, a process that is time-consuming, inefficient, and expensive, and that makes collaborating with researchers at other institutions difficult.
    “AnVIL will be transformative for institutions of all sizes, especially smaller institutions that don’t have the resources to build their own data centers. It is our hope that AnVIL levels the playing field, so that everyone has equal access to make discoveries,” Schatz said.
    Genetic risk factors for ailments such as cancer or cardiovascular disease are often very subtle, requiring researchers to analyze thousands of patients’ genomes to discover new associations. The raw data for a single human genome comprises about 40GB, so downloading thousands of genomes can take several days to several weeks: A single genome requires about 10 DVDs’ worth of data, so transferring thousands means moving “tens of thousands of DVDs worth of data,” Schatz said.
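    A quick back-of-envelope calculation bears out the "days to weeks" estimate; the cohort size, link speed, and DVD capacity below are illustrative assumptions, not figures from the paper.

    ```python
    # Back-of-envelope transfer estimate (illustrative assumptions only).
    GENOME_GB = 40        # raw data per genome, as stated in the article
    N_GENOMES = 3000      # assumed cohort size
    LINK_GBPS = 1.0       # assumed sustained download speed, gigabits per second
    DVD_GB = 4.7          # capacity of a single-layer DVD

    total_gb = GENOME_GB * N_GENOMES        # 120,000 GB
    seconds = total_gb * 8 / LINK_GBPS      # gigabytes -> gigabits
    days = seconds / 86_400
    dvds = total_gb / DVD_GB

    print(f"{total_gb:,} GB: about {days:.0f} days at 1 Gb/s, or {dvds:,.0f} DVDs")
    # -> 120,000 GB: about 11 days of continuous transfer, or 25,532 DVDs
    ```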

  • Photon pairs are more sensitive to rotations than single photons

    In the field of quantum metrology, scientists are developing novel measurement schemes that benefit from quantum features and are more precise and sensitive than conventional classical methods. A team of researchers from Tampere University, Finland, and the National Research Council of Canada has now shown how a simple but powerful technique based on two-photon N00N states can be used to create spatially structured quantum states of light that go beyond the classical limit in rotation estimation. The results are published in the journal Physical Review Letters.
    “Our experimental results demonstrate a simple but powerful way of custom-tailoring two-photon quantum states and holds promise for applications that can achieve high measurement precisions. The simplicity of our method opens a path to creating a measurement system that beats the classical estimation limit with current technologies,” explains Doctoral Researcher and lead author Markus Hiekkamäki.
    Measurement precisions at the absolute quantum limit
    The method utilizes a fundamental quantum feature, i.e., the interference between two photons, which is often termed photon bunching. In contrast to the more common photon bunching into the same physical path, the novel scheme leads to a bunching into the same spatial structure.
    “In our case, the quantum interference results in an entangled state of two photons. Because of the quantum nature of the realized state, the entangled photon pair gives a better measurement precision when compared to the same spatial shape imprinted on a similar amount of single photons or laser light. Using a counter-intuitive quantum response, we were able to show that it will be possible to achieve measurement precisions at the absolute quantum limit,” says Associate Professor Robert Fickler, leader of the Experimental Quantum Optics group at Tampere University.
    Besides rotational measurements, the method allows the generation of a large variety of different quantum states for transverse-spatial modes. Hence, it could also be utilized in measurements of many different types of systems as well as in fundamental tests of multi-photon quantum states of light.
    After demonstrating the advantage in rotational estimation, the researchers are now planning on using the method to shed new light on another fundamental property of waves called the Gouy phase. In addition, they study how it could be extended into quantum-enhanced measurement schemes in multiple degrees of freedom.
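    For reference, the textbook two-photon N00N state and its phase sensitivity show why photon pairs can outperform single photons in this kind of estimation; these are standard quantum-metrology expressions, not equations quoted from the paper.

    ```latex
    % Two-photon N00N state across two modes (standard form)
    |\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|2,0\rangle + |0,2\rangle\bigr)

    % Its interference fringes oscillate as cos(N*phi), so the achievable phase
    % (rotation) uncertainty reaches the Heisenberg limit rather than the
    % shot-noise limit of N independent photons:
    \Delta\varphi_{\mathrm{N00N}} = \frac{1}{N}
    \qquad \text{vs.} \qquad
    \Delta\varphi_{\mathrm{classical}} = \frac{1}{\sqrt{N}}
    ```

    For N = 2, the rotation-induced phase accumulates twice as fast as it would for a single photon, which is the origin of the improved precision.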
    Story Source:
    Materials provided by Tampere University.

  • A century of quantum mechanics questions the fundamental nature of reality

    Scientists are like prospectors, excavating the natural world seeking gems of knowledge about physical reality. And in the century just past, scientists have dug deep enough to discover that reality’s foundations do not mirror the world of everyday appearances. At its roots, reality is described by the mysterious set of mathematical rules known as quantum mechanics.

    Conceived at the turn of the 20th century and then emerging in its full form in the mid-1920s, quantum mechanics is the math that explains matter. It’s the theory for describing the physics of the microworld, where atoms and molecules interact to generate the world of human experience. And it’s at the heart of everything that made the century just past so dramatically unlike the century preceding it. From cell phones to supercomputers, DVDs to pdfs, quantum physics fueled the present-day electronics-based economy, transforming commerce, communication and entertainment.

    But quantum theory taught scientists much more than how to make computer chips. It taught that reality isn’t what it seems.

    “The fundamental nature of reality could be radically different from our familiar world of objects moving around in space and interacting with each other,” physicist Sean Carroll suggested in a recent tweet. “We shouldn’t fool ourselves into mistaking the world as we experience it for the world as it really is.”

    In a technical paper backing up his tweet, Carroll notes that quantum theory consists of equations that describe mathematical entities roaming through an abstract realm of possible natural events. It’s plausible, Carroll argues, that this quantum realm of mathematical possibilities represents the true, fundamental nature of reality. If so, all the physical phenomena we perceive are just a “higher-level emergent description” of what’s really going on.

    “Emergent” events in ordinary space are real in their own way, just not fundamental, Carroll allows. Belief that the “spatial arena” is fundamental “is more a matter of convenience and convention than one of principle,” he says.

    Carroll’s perspective is not the only way of viewing the meaning of quantum math, he acknowledges, and it is not fully shared by most physicists. But everybody does agree that quantum physics has drastically remodeled humankind’s understanding of nature. In fact, a fair reading of history suggests that quantum theory is the most dramatic shift in science’s conception of reality since the ancient Greeks deposed mythological explanations of natural phenomena in favor of logic and reason. After all, quantum physics itself seems to defy logic and reason.

    It doesn’t, of course. Quantum theory represents the ultimate outcome of superior logical reasoning, arriving at truths that could never be discovered merely by observing the visible world.

    It turns out that in the microworld — beyond the reach of the senses — phenomena play a game with fantastical rules. Matter’s basic particles are not tiny rocks, but more like ghostly waves that maintain multiple possible futures until forced to assume the subatomic equivalent of substance. As a result, quantum math does not describe a relentless cause-and-effect sequence of events as Newtonian science had insisted. Instead science morphs from dictator to oddsmaker; quantum math tells only probabilities for different possible outcomes. Some uncertainty always remains.

    Quantum mechanics says that whether an electron behaves as particle or wave depends on how it is observed. (Image: Max Löffler)

    The quantum revolution

    The discovery of quantum uncertainty was what first impressed the world with the depth of the quantum revolution. German physicist Werner Heisenberg, in 1927, astounded the scientific community with the revelation that deterministic cause-and-effect physics failed when applied to atoms. It was impossible, Heisenberg deduced, to measure both the location and velocity of a subatomic particle at the same time. If you measured one precisely, some uncertainty remained for the other.

    “A particle may have an exact place or an exact speed, but it can not have both,” as Science News Letter, the predecessor of Science News, reported in 1929. “Crudely stated, the new theory holds that chance rules the physical world.” Heisenberg’s uncertainty principle “is destined to revolutionize the ideas of the universe held by scientists and laymen to an even greater extent than Einstein’s relativity.”
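
    In modern notation, the principle described in that report is the standard inequality (added here for reference):

    ```latex
    % Heisenberg uncertainty relation
    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
    ```

    The more precisely the position x is pinned down, the larger the unavoidable spread in the momentum p, and vice versa; here \hbar is the reduced Planck constant.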

    Werner Heisenberg, shown in 1936, declared with his uncertainty principle that a particle’s position and velocity couldn’t both be precisely measured at the same time. (Image: AIP Emilio Segrè Visual Archives)

    Heisenberg’s breakthrough was the culmination of a series of quantum surprises. First came German physicist Max Planck’s discovery, in 1900, that light and other forms of radiation could be absorbed or emitted only in discrete packets, which Planck called quanta. A few years later Albert Einstein argued that light also traveled through space as packets, or particles, later called photons. Many physicists dismissed such early quantum clues as inconsequential. But in 1913, the Danish physicist Niels Bohr used quantum theory to explain the structure of the atom. Soon the world realized that reality needed reexamining.
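
    The size of Planck's packets, and of Einstein's photons, is set by the radiation's frequency through the standard relation (included here for reference, not quoted from the article):

    ```latex
    % Energy of one quantum of radiation at frequency nu
    E = h\nu
    ```

    Here h is Planck's constant; Bohr's atom then confined electrons to orbits whose energy differences correspond to exactly such quanta.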

    By 1921, awareness of the quantum revolution had begun to expand beyond the confines of physics conferences. In that year, Science News Bulletin, the first iteration of Science News, distributed what was “believed to be the first popular explanation” of the quantum theory of radiation, provided by American physical chemist William D. Harkins. He proclaimed that the quantum theory “is of much more practical importance” than the theory of relativity.

    “Since it concerns itself with the relations between matter and radiation,” Harkins wrote, quantum theory “is of fundamental significance in connection with almost all processes which we know.” Electricity, chemical reactions and how matter responds to heat all require quantum-theoretic explanations.

    As for atoms, traditional physics asserts that atoms and their parts can move about “in a large number of different ways,” Harkins stated. But quantum theory maintains that “of all the states of motion (or ways of moving) prescribed by the older theory, only a certain number actually do occur.” Therefore, events previously believed “to occur as continuous processes, actually do occur in steps.”

    But in 1921 quantum physics remained embryonic. Some of its implications had been discerned, but its full form remained undeveloped in detail. It was Heisenberg, in 1925, who first transformed the puzzling jumble of clues into a coherent mathematical picture. His decisive advance was developing a way to represent the energies of electrons in atoms using matrix algebra. With aid from German physicists Max Born and Pascual Jordan, Heisenberg’s math became known as matrix mechanics. Shortly thereafter, Austrian physicist Erwin Schrödinger developed a competing equation for electron energies, viewing the supposed particles as waves described by a mathematical wave function. Schrödinger’s “wave mechanics” turned out to be mathematically equivalent to Heisenberg’s particle-based approach, and “quantum mechanics” became the general term for the math describing all subatomic systems.
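
    The "competing equation" mentioned here is Schrödinger's; in its time-independent form for electron energies it is the standard eigenvalue problem (textbook form, added for reference):

    ```latex
    % Time-independent Schroedinger equation for a particle in a potential V
    \left[-\frac{\hbar^{2}}{2m}\nabla^{2} + V(\mathbf{r})\right]\psi(\mathbf{r}) = E\,\psi(\mathbf{r})
    ```

    The allowed electron energies E appear as the eigenvalues of this equation, with the wave function \psi describing the electron's spatial structure.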

    Still, some confusion remained. It wasn’t clear how an approach picturing electrons as particles could be equivalent to one supposing electrons to be waves. Bohr, by then regarded as the foremost of the world’s atomic physicists, pondered the question deeply and by 1927 arrived at a novel viewpoint he called complementarity.

    Bohr argued that the particle and wave views were complementary; both were necessary for a full description of subatomic phenomena. Whether a “particle” — say, an electron — exhibited its wave or particle nature depended on the experimental setup observing it. An apparatus designed to find a particle would find a particle; an apparatus geared to detect wave behavior would find a wave.

    At about the same time, Heisenberg derived his uncertainty principle. Just as wave and particle could not be observed in the same experiment, position and velocity could not both be precisely measured at the same time. As physicist Wolfgang Pauli commented, “Now it becomes day in quantum theory.”

    But the quantum adventure was really just beginning.

    In the many worlds interpretation of quantum mechanics, all possible realities exist, but humans perceive just one. (Image: Max Löffler)

    A great debate

    Many physicists, Einstein among them, deplored the implications of Heisenberg’s uncertainty principle. Its introduction in 1927 eliminated the possibility of precisely predicting the outcomes of atomic observations. As Born had shown, you could merely predict the probabilities for the various possible outcomes, using calculations informed by the wave function that Schrödinger had introduced. Einstein famously retorted that he could not believe that God would play dice with the universe. Even worse, in Einstein’s view, the wave-particle duality described by Bohr implied that a physicist could affect reality by deciding what kind of measurement to make. Surely, Einstein believed, reality existed independently of human observations.

    On that point, Bohr engaged Einstein in a series of discussions that came to be known as the Bohr-Einstein debate, a continuing dialog that came to a head in 1935. In that year, Einstein, with collaborators Nathan Rosen and Boris Podolsky, described a thought experiment supposedly showing that quantum mechanics could not be a complete theory of reality.

    In a brief summary in Science News Letter in May 1935, Podolsky explained that a complete theory must include a mathematical “counterpart for every element of the physical world.” In other words, there should be a quantum wave function for the properties of every physical system. Yet if two physical systems, each described by a wave function, interact and then fly apart, “quantum mechanics … does not enable us to calculate the wave function of each physical system after the separation.” (In technical terms, the two systems become “entangled,” a term coined by Schrödinger.) So quantum math cannot describe all elements of reality and is therefore incomplete.
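
    A minimal concrete example of such an entangled pair, written in the later spin-based form popularized by Bohm rather than the original 1935 variables, is the two-particle state

    ```latex
    % An entangled (singlet) state of particles A and B
    |\Psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|\uparrow\rangle_{A}|\downarrow\rangle_{B} - |\downarrow\rangle_{A}|\uparrow\rangle_{B}\bigr)
    ```

    Because this state cannot be factored into a product |\psi\rangle_A \otimes |\psi\rangle_B, neither particle has a wave function of its own after the interaction, which is exactly the gap Podolsky's summary points to.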

    Bohr soon responded, as reported in Science News Letter in August 1935. He declared that Einstein and colleagues’ criterion for physical reality was ambiguous in quantum systems. Einstein, Podolsky and Rosen assumed that a system (say an electron) possessed definite values for certain properties (such as its momentum) before those values were measured. Quantum mechanics, Bohr explained, preserved different possible values for a particle’s properties until one of them was measured. You could not assume the existence of an “element of reality” without specifying an experiment to measure it.

    Niels Bohr and Albert Einstein disagreed over the nature of reality. (Photograph by Paul Ehrenfest, courtesy of AIP Emilio Segrè Visual Archives, Gamow Collection)

    Einstein did not relent. He acknowledged that the uncertainty principle was correct with respect to what was observable in nature, but insisted that some invisible aspect of reality nevertheless determined the course of physical events. In the early 1950s physicist David Bohm developed such a theory of “hidden variables” that restored determinism to quantum physics, but made no predictions that differed from the standard quantum mechanics math. Einstein was not impressed with Bohm’s effort. “That way seems too cheap to me,” Einstein wrote to Born, a lifelong friend.

    Einstein died in 1955, Bohr in 1962, neither conceding to the other. In any case it seemed like an irresolvable dispute, since experiments would give the same results either way. But in 1964, physicist John Stewart Bell deduced a clever theorem about entangled particles, enabling experiments to probe the possibility of hidden variables. Beginning in the 1970s, and continuing to today, experiment after experiment confirmed the standard quantum mechanical predictions. Einstein’s objection was overruled by the court of nature.

    Still, many physicists expressed discomfort with Bohr’s view (commonly referred to as the Copenhagen interpretation of quantum mechanics). One particularly dramatic challenge came from the physicist Hugh Everett III in 1957. He insisted that an experiment did not create one reality from the many quantum possibilities, but rather identified only one branch of reality. All the other experimental possibilities existed on other branches, all equally real. Humans perceive only their own particular branch, unaware of the others just as they are unaware of the rotation of the Earth. This “many worlds interpretation” was widely ignored at first but became popular decades later, with many adherents today.

    Since Everett’s work, numerous other interpretations of quantum theory have been offered. Some emphasize the “reality” of the wave function, the mathematical expression used for predicting the odds of different possibilities. Others emphasize the role of the math as describing the knowledge about reality accessible to experimenters.

    Some interpretations attempt to reconcile the many worlds view with the fact that humans perceive only one reality. In the 1980s, physicists including H. Dieter Zeh and Wojciech Zurek identified the importance of a quantum system’s interaction with its external environment, a process called quantum decoherence. Some of a particle’s many possible realities rapidly evaporate as it encounters matter and radiation in its vicinity. Soon only one of the possible realities remains consistent with all the environmental interactions, explaining why on the human scale of time and size only one such reality is perceived.

    This insight spawned the “consistent histories” interpretation, pioneered by Robert Griffiths and developed in more elaborate form by Murray Gell-Mann and James Hartle. It is widely known among physicists but has received little wider popularity and has not deterred the pursuit of other interpretations. Scientists continue to grapple with what quantum math means for the very nature of reality.

    Using principles of quantum information theory, a particle’s quantum state can be replicated at a distant location, a feat known as quantum teleportation. (Image: Max Löffler)

    It from quantum bit

    In the 1990s, the quest for quantum clarity took a new turn with the rise of quantum information theory. Physicist John Archibald Wheeler, a disciple of Bohr, had long emphasized that specific realities emerged from the fog of quantum possibilities by irreversible amplifications — such as an electron definitely establishing its location by leaving a mark after hitting a detector. Wheeler suggested that reality as a whole could be built up from such processes, which he compared to yes or no questions — is the electron here? Answers corresponded to bits of information, the 1s and 0s used by computers. Wheeler coined the slogan “it from bit” to describe the link between existence and information.

    Taking the analogy further, one of Wheeler’s former students, Benjamin Schumacher, devised the notion of a quantum version of the classical bit of information. He introduced the quantum bit, or qubit, at a conference in Dallas in 1992.
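
    Where a classical bit is definitely 0 or definitely 1, a qubit is described by the standard superposition (included for reference):

    ```latex
    % General state of a single qubit
    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^{2} + |\beta|^{2} = 1
    ```

    A measurement returns 0 with probability |\alpha|^2 and 1 with probability |\beta|^2, collapsing the superposition to the observed value.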

    Schumacher’s qubit provided a basis for building computers that could process quantum information. Such “quantum computers” had previously been envisioned, in different ways, by physicists Paul Benioff, Richard Feynman and David Deutsch. In 1994, mathematician Peter Shor showed how a quantum computer manipulating qubits could crack the toughest secret codes, launching a quest to design and build quantum computers capable of that and other clever computing feats. By the early 21st century, rudimentary quantum computers had been built; the latest versions can perform some computing tasks but are not powerful enough yet to make current cryptography methods obsolete. For certain types of problems, though, quantum computing may soon achieve superiority over standard computers.

    Quantum computing’s realization has not resolved the debate over quantum interpretations. Deutsch believed that quantum computers would support the many worlds view. Hardly anyone else agrees, though. And decades of quantum experiments have not provided any support for novel interpretations — all the results comply with the traditional quantum mechanics expectations. Quantum systems preserve different values for certain properties until one is measured, just as Bohr insisted. But nobody is completely satisfied, perhaps because the 20th century’s other pillar of fundamental physics, Einstein’s theory of gravity (general relativity), does not fit in quantum theory’s framework.

    For decades now, the quest for a quantum theory of gravity has fallen short of success, despite many promising ideas. Most recently a new approach suggests that the geometry of spacetime, the source of gravity in Einstein’s theory, may in some way be built from the entanglement of quantum entities. If so, the mysterious behavior of the quantum world defies understanding in terms of ordinary events in space and time because quantum reality creates spacetime, rather than occupying it. In that case, human observers witness an artificial, emergent reality that gives the impression of events happening in space and time while the true, inaccessible reality doesn’t have to play by the spacetime rules.

    In a crude way this view echoes that of Parmenides, the ancient Greek philosopher who taught that all change is an illusion. Our senses show us the “way of seeming,” Parmenides declared; only logic and reason can reveal “the way of truth.” Parmenides didn’t reach that insight by doing the math, of course (he said it was explained to him by a goddess). But he was a crucial figure in the history of science, initiating the use of rigorous deductive reasoning and relying on it even when it led to conclusions that defied sensory experience.

    Yet as some of the other ancient Greeks realized, the world of the senses does offer clues about the reality we can’t see. “Phenomena are a sight of the unseen,” Anaxagoras said. As Carroll puts it, in modern terms, “the world as we experience it” is certainly related to “the world as it really is.”

    “But the relationship is complicated,” he says, “and it’s real work to figure it out.”

    In fact, it took two millennia of hard work for the Greek revolution in explaining nature to mature into Newtonian science’s mechanistic understanding of reality. Three centuries later quantum physics revolutionized science’s grasp of reality to a comparable extent. Yet the lack of agreement on what it all means suggests that perhaps science needs to dig a little deeper still.

  • Computer model seeks to explain the spread of misinformation and suggest countermeasures

    It starts with a superspreader, and winds its way through a network of interactions, eventually leaving no one untouched. Those who have been exposed previously may only experience mild effects.
    No, it’s not a virus. It’s the contagious spread of misinformation and disinformation — misinformation that’s fully intended to deceive.
    Now Tufts University researchers have come up with a computer model that remarkably mirrors the way misinformation spreads in real life. The work might provide insight on how to protect people from the current contagion of misinformation that threatens public health and the health of democracy, the researchers say.
    “Our society has been grappling with widespread beliefs in conspiracies, increasing political polarization, and distrust in scientific findings,” said Nicholas Rabb, a Ph.D. computer science student at Tufts School of Engineering and lead author of the study, which came out January 7 in the journal PLOS ONE. “This model could help us get a handle on how misinformation and conspiracy theories are spread, to help come up with strategies to counter them.”
    Scientists who study the dissemination of information often take a page from epidemiologists, modeling the spread of false beliefs on how a disease spreads through a social network. Most of those models, however, treat everyone in the network as equally ready to take in any new belief passed on by their contacts.
    The Tufts researchers instead based their model on the notion that our pre-existing beliefs can strongly influence whether we accept new information. Many people reject factual information supported by evidence if it takes them too far from what they already believe. Health-care workers have commented on the strength of this effect, observing that some patients dying from COVID cling to the belief that COVID does not exist.
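    A toy version of that idea can be sketched in a few lines. The code below is a hypothetical illustration in the spirit of the Tufts model, not the code published with the PLOS ONE paper: agents on a random network adopt an incoming claim with a probability that decays with the claim's distance from their current belief, rather than accepting everything they hear as in classic contagion models.

    ```python
    # Toy belief-contagion sketch (hypothetical illustration, not the published model):
    # acceptance probability decays with the distance between the incoming claim
    # and an agent's pre-existing belief.

    import math
    import random
    import networkx as nx  # assumed available for the network structure

    random.seed(1)
    G = nx.erdos_renyi_graph(n=200, p=0.05, seed=1)

    # Beliefs on a small discrete scale (the 0-6 range here is an assumption).
    belief = {node: random.randint(0, 6) for node in G.nodes}
    CLAIM = 6        # the claim being pushed into the network
    SPREADER = 0     # the "superspreader" node
    belief[SPREADER] = CLAIM

    def accept_probability(current, incoming, sharpness=1.0):
        """Chance of adopting the incoming claim; shrinks exponentially with
        how far the claim sits from the agent's current belief."""
        return math.exp(-sharpness * abs(incoming - current))

    # Let the claim propagate through neighbor-to-neighbor exposure.
    for _ in range(10):
        for node in G.nodes:
            exposed = any(belief[nb] == CLAIM for nb in G.neighbors(node))
            if exposed and random.random() < accept_probability(belief[node], CLAIM):
                belief[node] = CLAIM

    adopters = sum(1 for b in belief.values() if b == CLAIM)
    print(f"{adopters} of {G.number_of_nodes()} agents adopted the claim")
    ```

    Lowering the sharpness parameter makes agents more credulous; raising it reproduces the observed resistance to information that sits far from existing beliefs.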

  • New model examines the effects of toxicants on populations in polluted rivers

    When designing environmental policies to limit the damage of river pollution, it is paramount to assess the specific risks that particular pollutants pose to different species. However, rigorously testing the effects of toxicants — like insecticides, plastic debris, pathogens, and chemicals — on entire groups of organisms without severely damaging their whole ecosystems is simply not feasible. Mathematical modeling can provide a flexible way to assess toxicants’ impact on river populations without endangering the environment.
    In a paper published today in the SIAM Journal on Applied Mathematics, Peng Zhou (Shanghai Normal University) and Qihua Huang (Southwest University, Chongqing) develop a model that describes the interactions between a population and a toxicant in an advective environment — a setting in which a fluid tends to transport material in one direction, like a river. Such a model can help scientists study how the way in which a pollutant moves through a river affects the wellbeing and distribution of the river’s inhabitants.
    Much of the previous experimental research on the ecological risks of toxicants has been performed on individual organisms in controlled laboratory conditions over a fairly short-term basis. The design of environmental management strategies, however, requires an understanding of toxicants’ impact on the health of entire exposed natural populations in the long term. Fortunately, there is an intermediary. “Mathematical models play a crucial role in translating individual responses to population-level impacts,” Huang said.
    The existing models that describe the way in which toxicants affect population dynamics generally ignore many of the properties of water bodies. But in doing so, they are missing a big piece of the puzzle. “In reality, numerous hydrological and physical characteristics of water bodies can have a substantial impact on the concentration and distribution of a toxicant,” Huang said. “[For example], once a toxicant is released into a river, several dispersal mechanisms — such as diffusion and transport — are present that may aid in the spread of the toxicant.”
    Similarly, the models that mathematicians often use to portray the transport of pollutants through a river also do not include all of the necessary components for this study. These are reaction-advection-diffusion equation models, whose solutions can show how pollutants distribute and vary under different influences like changes in the rate of water flow. While such models enable researchers to predict the evolution of toxicant concentrations and assess their impact on the environment, they do not consider toxicant influence on the dynamics of affected populations. Zhou and Huang thus expanded upon this type of model, adding new elements that allowed them to explore the interaction between a toxicant and a population in a polluted river.
    The authors’ model consists of two reaction-diffusion-advection equations — one that governs the population’s dispersal and growth under the toxicant’s influence, and another that describes the processes that the toxicant experiences. “As far as we know, our model represents the first effort to model the population-toxicant interactions in an advective environment by using reaction-diffusion-advection equations,” Zhou said. “This new model could potentially open a [novel] line of research.”
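    Written out from that description (this generic form is reconstructed for illustration, not copied from the paper), the coupled system looks like:

    ```latex
    % Generic population-toxicant reaction-diffusion-advection model
    % u(x,t): population density, c(x,t): toxicant concentration, x: distance downstream
    \frac{\partial u}{\partial t} = D_{u}\frac{\partial^{2} u}{\partial x^{2}}
        - q_{u}\frac{\partial u}{\partial x} + g(u, c),
    \qquad
    \frac{\partial c}{\partial t} = D_{c}\frac{\partial^{2} c}{\partial x^{2}}
        - q_{c}\frac{\partial c}{\partial x} + h(u, c)
    ```

    The D terms model diffusion, the q terms model advection (downstream transport), g couples the population's growth and mortality to toxicant exposure, and h captures toxicant input, decay, and uptake. Varying q_u against q_c is what lets the authors compare pollutants that are more or less sensitive to the flow than the organisms are.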
    The model allows Zhou and Huang to tweak different factors and investigate the resulting changes to the ecosystem. They tried altering the river’s flow speed and the advection rate — i.e., the rate at which the toxicant or organisms are carried downstream — and observing these parameters’ influence on the persistence and distribution of both the population and the toxicant. These theoretical results can provide insights that could help inform ecological policies when taken in concert with other information.
    One scenario that the researchers studied involved a toxicant that had a much slower advection rate than the population and thus was not washed away as easily. The model showed that, intuitively, the population density decreases with increasing water flow because more individuals are carried downstream and out of the river area in question. However, the concentration of the toxicant increases with increasing flow speed because it resists the downstream current, and the organisms are often swept away before they can take it up.
    In the opposite case, the toxicant has a faster advection rate and is therefore much more sensitive to water flow speed than the population. Increasing the water flow then reduces the toxicant concentration by sweeping the pollutants away. For a medium flow speed, the highest population density occurs downstream because the water flow plays a trade-off role; it transports more toxicants away but also carries more individuals downstream.
    This demonstrates that a higher sensitivity of a pollutant to water flow is generally more advantageous to population persistence. “In the absence of toxicants, it is generally known that the higher the flow speed, the more individuals will be washed out of the river,” Zhou said. “However, our findings suggest that, for a given toxicant level, population abundance may increase as flow rate increases.”
    By providing this model with the parameters for certain species and pollutants, one may be able to determine criteria regarding the water quality that is necessary to maintain aquatic life. This outcome could ultimately aid in the development of policy guidelines surrounding the target species and toxicants. “The findings here offer the basis for effective decision-making tools for water and environment managers,” Huang said. Managers could connect the results from the model with other factors, such as what may happen to the pollutant after it washes downstream.
    Further extensions to Zhou and Huang’s new model could make it even more applicable to real river ecosystems — for example, by allowing the flow velocity and release of toxicants to vary over time, or accounting for the different ways in which separate species may respond to the same pollutant. This mathematical model’s capability to find the population-level effects of toxicants might play a critical part in the accurate assessment of pollutants’ risk to rivers and their inhabitants.

  • Climate change communication should focus less on specific numbers

    What’s in a number? The goals of the 2021 United Nations’ climate summit in Glasgow, Scotland, called for nations to keep a warming limit of 1.5 degrees Celsius “within reach.” But when it comes to communicating climate change to the public, some scientists worry that too much emphasis on a specific number is a poor strategy.

    Focusing on one number obscures a more important point, they say: Even if nations don’t meet this goal to curb global climate change, any progress is better than none at all. Maybe it’s time to stop talking so much about one number.

    On November 13, the United Nations’ 26th annual climate change meeting, or COP26, ended in a new climate deal, the Glasgow Climate Pact. In that pact, the 197 assembled nations reaffirmed a common “ideal” goal: limiting global warming to no more than 1.5 degrees C by 2100, relative to preindustrial times (SN: 12/17/18).

    Holding temperature increases to 1.5 degrees C, researchers have found, would be a significant improvement over limiting warming to 2 degrees C, as agreed upon in the 2015 Paris Agreement (SN: 12/12/15). The more stringent limit would mean fewer global hazards, from extreme weather to the speed of sea level rise to habitat loss for species (SN: 12/17/18).

    The trouble is that current national pledges to reduce greenhouse gas emissions are nowhere near enough to meet either of those goals. Even accounting for the most recent national pledges to cut emissions, the average global temperature by 2100 is likely to be between 2.2 and 2.7 degrees C warmer than it was roughly 150 years ago (SN: 10/26/21).

    And that glaring disparity is leading not just to fury and frustration for many, but also to despair and pervasive feelings of doom, says paleoclimatologist Jessica Tierney of the University of Arizona in Tucson.

    “It’s something I’ve been thinking about for a while, but I think it was definitely made sort of more front and center with COP,” Tierney says. She describes one news story in the wake of the conference that “mentioned 1.5 degrees C, and then said this is the threshold over which scientists have told us that catastrophic climate change will occur.”

    The article reveals a fundamental misunderstanding of what the agreed-upon limit really represents, Tierney explains. “A lot of my students, for example, are really worried about climate change, and they are really worried about passing some kind of boundary. People have this idea that if you pass that boundary, you sort of tip over a cliff.”

    The climate system certainly has tipping points — thresholds past which, for example, an ice sheet begins to collapse and it’s not possible to stop or reverse the process. But, Tierney says, “we really should start communicating more about the continuum of climate change. Obviously, less warming is better.” However, “if we do blow by 1.5, we don’t need to panic. It’s okay if we can stop at 1.6 or 1.7.”

    Tierney notes that climate communications expert Susan Hassol, director of the Colorado-based nonprofit Climate Communication, has likened the approach to missing an exit while driving on the highway. “If you miss the 1.5 exit, you just slow down and take the next one, or the next one,” Tierney says. “It’s still better than hitting the gas.”

    Target numbers do have some uses, notes climate scientist Joeri Rogelj of Imperial College London. After decades of international climate negotiations and wrangling over targets and strategies, the world has now agreed that 1.5 degrees C of warming is a desirable target for many countries, says Rogelj, who was one of the lead authors on the Intergovernmental Panel on Climate Change’s 2018 special report on global warming.

    A global temperature limit “is a good proxy for avoiding certain impacts,” he adds. “These numbers are basically how to say this.”

    But Rogelj agrees that focusing too much on a particular number may be counterproductive, even misleading. “There is a lot of layered meaning under those numbers,” he says. “The true interests, the true goals of countries are not those numbers, but avoiding the impacts that underlie them.”

    And framing goals as where we should be by the end of the century — such as staying below 1.5 degrees C by the year 2100 — can give too much leeway to stall on reducing emissions. For example, such framing implies the planet could blow past the temperature limit by mid-century and rely on still-unproven carbon dioxide removal strategies to bring warming back down in the next few decades, Rogelj and colleagues wrote in 2019 in Nature.

    Banking on future technologies that have yet to be developed is worrisome, Rogelj notes. After all, some warming-related extreme events, such as heat waves, are more reversible than others, such as sea level rise (SN: 8/9/21). Heat wave incidence may decrease once carbon is removed from the atmosphere, but the seas will stay high.

    Rogelj acknowledges that it’s a challenge to communicate the urgency of taking action to reduce emissions now without spinning off into climate catastrophe or cliff edge narratives. For his part, Rogelj says he’s trying to tackle this challenge by adding a hefty dose of reality in his scientific presentations, particularly those aimed at nonscientists.

    He starts with pictures of forest fires and floods in Europe from 2021. “I say, ‘Look, this is today, 1.1 degrees warmer than preindustrial times,’” Rogelj explains. “‘Do you think this is safe? Today is not safe. And so, 1.5 won’t be safer than today; it will be worse than today. But it will be better than 1.6. And 1.6 won’t be the end of the world.’ And that kind of makes people think about it a bit differently.”