More stories

  • A coral pollution study unexpectedly helped explain Hurricane Maria’s fury

    Hurricane Maria struck the island of Puerto Rico early on September 20, 2017, with 250-kilometer-per-hour winds, torrential rains and a storm surge up to three meters high. In its wake: nearly 3,000 people dead, an almost yearlong power outage and over $90 billion in damages to homes, businesses and essential infrastructure, including roads and bridges.

    Geologist and diver Milton Carlo took shelter at his house in Cabo Rojo on the southwest corner of the island with his wife, daughter and infant grandson. He watched the raging winds of the Category 4 hurricane lift his neighbor’s SUV into the air, and remembers those hours as some of the worst of his life.

    For weeks, the rest of the world was in the dark about the full extent of the devastation, because Maria had destroyed the island’s main weather radar and almost all cell phone towers.

    Far away on the U.S. West Coast, in Santa Cruz, Calif., oceanographer Olivia Cheriton watched satellite radar images of Maria passing over the instruments she and her U.S. Geological Survey team had anchored a few kilometers southwest of Puerto Rico. The instruments, placed offshore from the seaside town of La Parguera, were there to track pollution circulating around some of the island’s endangered corals.

    More than half a year went by before she learned the improbable fate of those instruments: They had survived and had captured data revealing hurricane-related ocean dynamics that no scientist had ever recorded.

    The wind-driven coastal currents interacted with the seafloor in a way that prevented Maria from drawing cold water from the depths of the sea up to the surface. The sea surface stayed as warm as bathwater. Heat is a hurricane’s fuel source, so a warmer sea surface leads to a more intense storm. As Cheriton figured out later, the phenomenon she stumbled upon likely played a role in maintaining Maria’s Category 4 status as it raked Puerto Rico for eight hours.

    “There was absolutely no plan to capture the impact of a storm like Maria,” Cheriton says. “In fact, if we somehow could’ve known that a storm like that was going to occur, we wouldn’t have put hundreds of thousands of dollars’ worth of scientific instrumentation in the water.”

    A storm’s path is guided by readily observable, large-scale atmospheric features such as trade winds and high-pressure zones. Its intensity, on the other hand, is driven by weather events inside the hurricane and wave action deep below the ocean’s surface. The findings by Cheriton and colleagues, published May 2021 in Science Advances, help explain why hurricanes often get stronger before making landfall and can therefore help forecasters make more accurate predictions.

    Reef pollution

    Cheriton’s original research objective was to figure out how sea currents transport polluted sediments from Guánica Bay — where the Lajas Valley drains into the Caribbean Sea — to the pristine marine ecosystems 10 kilometers west in La Parguera Natural Reserve, famous for its bioluminescent waters.

    Endangered elkhorn and mountainous star corals, called “the poster children of Caribbean reef decline” by marine geologist Clark Sherman, live near shore in some of the world’s highest recorded concentrations of now-banned industrial chemicals. Those polychlorinated biphenyls, or PCBs, hinder coral reproduction, growth, feeding and defensive responses, says Sherman, of the University of Puerto Rico–Mayagüez.

    Elkhorn coral (left) and mountainous star coral (right) were once ubiquitous in the Caribbean. Their numbers have dropped greatly due to bleaching and disease. Pollution is partly to blame.  FROM LEFT: NICK HOBGOOD/WIKIMEDIA COMMONS (CC BY-SA 3.0); NOAA FISHERIES

    Half of corals in the Caribbean have died since monitoring began in the 1970s, and pollution is a major cause, according to an April 2020 study in Science Advances. Of particular interest to Cheriton, Sherman and their colleagues was whether the pollution had reached deepwater, or mesophotic, reefs farther offshore, which could be a refuge for coral species that were known to be dying in shallower areas.

    The main artery for this pollution is the Rio Loco — which translates to “Crazy River.” It spews a toxic runoff of eroded sediments from the Lajas Valley’s dirt roads and coffee plantations into Guánica Bay, which supports a vibrant fishing community. Other possible contributors to the pollution — oil spills, a fertilizer plant, sewage and now-defunct sugar mills — are the subject of investigations by public health researchers and the U.S. Environmental Protection Agency.

    In June 2017, the team convened in La Parguera to install underwater sensors to measure and track the currents in this threatened marine environment. From Sherman’s lab on a tiny islet overrun with iguanas the size of house cats, he and Cheriton, along with team leader and USGS research geologist Curt Storlazzi and USGS physical scientist Joshua Logan, launched a boat into choppy seas.

    Marine geologist Clark Sherman dives amid colonies of healthy great star corals, black corals, a large sea fan and a variety of sponges along the steep island shelf of southwest Puerto Rico. Sherman helped investigate whether pollution was reaching these deepwater reefs. E. TUOHY/UNIV. OF PUERTO RICO–MAYAGÜEZ

    At six sites near shore, Storlazzi, Sherman and Logan dove to the seafloor and used epoxy to anchor pressure gauges and batonlike current meters. Together the instruments measured hourly temperature, wave height and current speed. The team then moved farther offshore where the steep island shelf drops off at a 45-degree angle to a depth of 60 meters, but the heavy ocean chop scuttled their efforts to install instruments there.

    In June 2017, research geologist Curt Storlazzi (left) and physical scientist Joshua Logan (right) prepare to dive near Puerto Rico’s Guánica Bay to install instruments for monitoring currents suspected of delivering pollution to coral reefs. USGS

    For help working in the difficult conditions, Sherman enlisted two expert divers for a second attempt: Carlo, the geologist and diving safety officer, and marine scientist Evan Tuohy, both of the University of Puerto Rico–Mayagüez. The two were able to install the most important and largest piece, a hydroacoustic instrument comprising several drums fastened to a metal grid, which tracked the direction and speed of currents every minute using pulsating sound waves. A canister containing temperature and salinity sensors took readings every two minutes. Above this equipment, a line of electronic thermometers extended to within 12 meters of the surface, registering temperature every five meters vertically, every few seconds.

    The instruments installed by Storlazzi, Logan and others collected unexpected underwater ocean observations during Hurricane Maria. An acoustic Doppler current profiler (left) used pulsating sound waves to measure the direction and speed of currents at the shelf break and slope site about 12 kilometers offshore of La Parguera. A Marotte current meter (right) measured wave height, current speed and temperature at six spots close to shore. USGS

    Working in concert, the instruments gave a high-resolution, seafloor-to-surface snapshot of the ocean’s hydrodynamics on a near-continuous basis. The equipment had to sit level on the sloping seafloor so as not to skew the measurements and remain firmly in place. Little did the researchers know that the instruments would soon be battered by one of the most destructive storms in history.

    Becoming Maria

    The word hurricane derives from the Caribbean Taino people’s Huricán, god of evil. Some of the strongest of these Atlantic tropical cyclones begin where scorching winds from the Sahara clash with moist subtropical air over the island nation of Cape Verde off western Africa. The worst of these atmospheric disturbances create severe thunderstorms with giant cumulonimbus clouds that flatten out against the stratosphere. Deflected by Earth’s rotation, they begin to circle counterclockwise around each other — a phenomenon known as the Coriolis effect.

    Weather conditions that summer had already spawned two monster hurricanes: Harvey and Irma. By mid-September, the extremely warm sea surface — 29° Celsius or hotter in some places — gave up its heat energy by way of evaporation into Maria’s rushing winds. All hurricanes begin as an area of low pressure, which in turn sucks in more wind, accelerating the rise of hot air, or convection. Countervailing winds known as shear can sometimes topple the cone of moist air spiraling upward. But that didn’t happen, so Maria continued to grow in size and intensity.

    Meteorologists hoped that Maria would lose force as it moved across the Caribbean, weakened by the wake of cooler water Irma had churned up two weeks earlier. Instead, Maria tracked south, steaming toward the eastern Caribbean island of Dominica. In the roughly 15 hours before making landfall there, its maximum sustained wind speed doubled, reaching a house-leveling 260 kilometers per hour. That doubling intensified the storm from a milder (but still dangerous) Category 1 to a strong Category 5.

    NOAA’s computer forecasting models did not anticipate such rapid intensification. Irma had also raged with unforeseen intensity.

    After striking Dominica hard, Maria’s eyewall broke down, replaced by an outer band of whipping thunderstorms. This slightly weakened Maria to 250 kilometers per hour before it hit Puerto Rico, while expanding the diameter of the storm’s eyewall — the area of strong winds and heaviest precipitation — to 52 kilometers. That’s close to the width of the island.

    Hurricane Maria made landfall on Puerto Rico early in the morning on September 20, 2017, and cut across the island diagonally toward the northwest. Its eyewall generated maximum sustained winds of 250 kilometers per hour and spanned almost the width of the island. CIRA/NOAA

    It’s still not fully understood why Maria suddenly went berserk. Various theories point to the influence of hot towers — convective bursts of heat energy from thunderclouds that punch up into the stratosphere — or deep warm pools, buoyant freshwater eddies spilling out of the Amazon and Orinoco rivers into the Atlantic, where currents carry these pockets of hurricane-fueling heat to the Gulf of Mexico and the Caribbean Sea.

    But even though these smaller-scale events may have a big impact on intensity, they aren’t fully accounted for in weather models, says Hua Leighton, a scientist at the National Oceanic and Atmospheric Administration’s hurricane research division and the University of Miami’s Cooperative Institute for Marine and Atmospheric Studies. Leighton develops forecasting models and investigates rapid intensification of hurricanes.

    “We cannot measure everything in the atmosphere,” Leighton says.

    Without accurate data on all the factors that drive hurricane intensity, computer models can’t easily predict when the catalyzing events will occur, she says. Nor can models account for everything that happens inside the ocean during a hurricane. They don’t have the data.

    Positioning instruments just before a hurricane hits is a major challenge. But NOAA is making progress. It has launched a new generation of hurricane weather buoys in the western North Atlantic and remotely piloted surface vehicles called Saildrones that examine the air-sea interface between hurricanes and the ocean (SN: 6/8/19, p. 24).

    Underwater, NOAA uses other drones, or gliders, to profile the vast areas regularly traversed by tropical storms. These gliders collected 13,200 temperature and salinity readings in 2020. By contrast, the instruments that the team set in Puerto Rico’s waters in 2017 collected over 250 million data points, including current velocity and direction — a rare and especially valuable glimpse of hurricane-induced ocean dynamics at a single location.

    A different view

    After the storm passed, Storlazzi was sure the hurricane had destroyed his instruments. They weren’t designed to take that kind of punishment. The devices generally work in much calmer conditions, not the massive swells generated by Maria, which could increase water pressure to a level that would almost certainly crush instrument sensors.

    But remarkably, the instruments were battered but not lost. Sherman, Carlo and Tuohy retrieved them after Maria passed and put them in crates awaiting the research group’s return.

    Milton Carlo (left) and Evan Tuohy (right), shown in an earlier deepwater dive, helped place the current-monitoring instruments at the hard-to-reach sites where hurricane data were collected. MIKE ECHEVARRIA

    When Storlazzi and USGS oceanographer Kurt Rosenberger pried open the instrument casings in January 2018, no water gushed out. Good sign. The electronics appeared intact. And the lithium batteries had powered the rapid-fire sampling enterprise for the entire six-month duration. The researchers quickly downloaded a flood of data, backed it up and started transmitting it to Cheriton, who began sending back plots and graphs of what the readings showed.

    Floodwaters from the massive rains brought by Maria had pushed a whole lot of polluted sediment to the reefs outside Guánica Bay, spiking PCB concentrations and threatening coral health. As of a few months after the storm, the pollution hadn’t reached the deeper reefs.

    Then the researchers realized that their data told another story: what happens underwater during a massive hurricane. They presumed that other researchers had previously captured a profile of the churning ocean depths beneath a hurricane at the edge of a tropical island.

    Remarkably, that was not the case.

    “Nobody’s even measured this, let alone reported it in any published literature,” Cheriton says. The team began to explore the hurricane data not knowing where it might lead.

    “What am I looking at here?” Cheriton kept asking herself as she plotted and analyzed temperature, current velocity and salinity values using computer algorithms. The temperature gradient that showed the ocean’s internal or underwater waves was different than anything she’d seen before.

    Oceanographer Olivia Cheriton realized that data on ocean currents told a new story about Hurricane Maria. O.M. CHERITON

    During the hurricane, the top 20 meters of the Caribbean Sea had consistently remained at or above 26° C, a few degrees warmer than the layers beneath. But the surface waters should have been cooled if, as expected, Maria’s winds had acted like a big spoon, mixing the warm surface with cold water stirred up from the seafloor 50 to 80 meters below. Normally, the cooler surface temperature restricts the heat supply, weakening the hurricane. But the cold water wasn’t reaching the surface.

    To try to make sense of what she was seeing, Cheriton imagined herself inside the data, in a protective bubble on the seafloor with the instruments as Maria swept over. Storlazzi worked alongside her analyzing the data, but focused on the sediments circulating around the coral reefs.

    Cheriton was listening to “An Awesome Wave” by indie-pop band Alt-J and getting goosebumps while the data swirled before them. Drawing on instincts from her undergraduate astronomy training, she focused her mind’s eye on a constellation of data overhead and told Storlazzi to do the same.

    “Look up Curt!” she said.

    Up at the crest of the island shelf, where the seafloor drops off, the current velocity data revealed a broad stream of water gushing from the shore at almost 1 meter per second, as if from a fire hose. Several hours before Maria arrived, the wind-driven current had reversed direction and was now moving an order of magnitude faster. The rushing surface water thus became a barrier, trapping the cold water beneath it.

    As a result, the surface stayed warm, increasing the force of the hurricane. The cooler layers below then started to pile up vertically into distinct layers, one on top of the other, beneath the gushing waters above.

    Cheriton calculated that, because of the fire hose phenomenon, the coastal waters in this area contributed on average 65 percent more to Maria’s intensity than they would have otherwise.

    Oceanographer Travis Miles of Rutgers University in New Brunswick, N.J., who was not involved in the research, calls Cheriton and the team’s work a “frontier study” that draws researchers’ attention to near-shore processes. Miles can relate to Cheriton and her team’s accidental hurricane discovery from personal experience: When his water quality–sampling gliders wandered into Hurricane Irene’s path in 2011, they revealed that the ocean off the Jersey Shore had cooled in front of the storm. Irene’s onshore winds had induced seawater mixing across the broad continental shelf and lowered sea surface temperatures.

    The Puerto Rico data show that offshore winds over a steep island shelf produced the opposite effect and should help researchers better understand storm-induced mixing of coastal areas, says NOAA senior scientist Hyun-Sook Kim, who was not involved in the research. It can help with identifying deficiencies in the computer models she relies on when providing guidance to storm-tracking meteorologists at the National Hurricane Center in Miami and the Joint Typhoon Warning Center in Hawaii.

    And the unexpected findings also could help scientists get a better handle on coral reefs and the role they play in protecting coastlines. “The more we study the ocean, especially close to the coast,” Carlo says, “the more we can improve conditions for the coral and the people living on the island.”

  • Humans may not be able to handle as much heat as scientists thought

    More than 2,000 people dead from extreme heat and wildfires raging in Portugal and Spain. High temperature records shattered from England to Japan. Overnights that fail to cool.

    Brutal heat waves are quickly becoming the hallmark of the summer of 2022.

    And even as climate change continues to crank up the temperature, scientists are working fast to understand the limits of humans’ resilience to heat extremes. Recent research suggests that heat stress tolerance in people may be lower than previously thought. If true, millions more people could be at risk of succumbing to dangerous temperatures sooner than expected.

    “Bodies are capable of acclimating over a period of time” to temperature changes, says Vivek Shandas, an environmental planning and climate adaptation researcher at Portland State University in Oregon. Over geologic time, there have been many climate shifts that humans have weathered, Shandas says. “[But] we’re in a time when these shifts are happening much more quickly.”

    Just halfway through 2022, heat waves have already ravaged many countries. The heat arrived early in southern Asia: In April, Wardha, India, saw a high of 45° Celsius (113° Fahrenheit); in May, temperatures in Nawabshah, Pakistan, rose to 49.5° C (121.1° F).

    Extreme heat alerts blared across Europe beginning in June and continuing through July, the rising temperatures exacerbating drought and sparking wildfires. The United Kingdom shattered its hottest-ever record July 19 when temperatures reached 40.3° C in the English village of Coningsby. The heat fueled fires in France, forcing thousands to evacuate from their homes. 

    And the litany goes on: Japan experienced its worst June heat wave since record-keeping began in 1875, leading to the country’s highest-ever recorded June temperature of 40.2° C.  China’s coastal megacities, from Shanghai to Chengdu, were hammered by heat waves in July as temperatures in the region also rose above 40° C. And in the United States, a series of heat waves gripped the Midwest, the South and the West in June and July. Temperatures soared to 42° C in North Platte, Neb., and to 45.6° C in Phoenix.

    The current global rate of warming on Earth is unprecedented (SN: 7/24/19). And scientists have long predicted that human-caused climate change will increase the occurrence of heat waves. Globally, humans’ exposure to extreme heat tripled from 1983 to 2016, particularly in South Asia.

    The heat already is taking an increasing toll on human health. It can cause heat cramps, heat exhaustion and heat stroke, which is often fatal. Dehydration can lead to kidney and heart disease. Extreme heat can even change how we behave, increasing aggression and decreasing our ability to focus (SN: 8/18/21).

    Staying cool

    The human body has various ways to shed excess heat and keep the core of the body at an optimal temperature of about 37° C (98.6° F). The heart pumps faster, speeding up blood flow that carries heat to the skin (SN: 4/3/18). Air passing over the skin can wick away some of that heat. Evaporative cooling — sweating — also helps.

    But there’s a limit to how much heat humans can endure. In 2010, scientists estimated the theoretical heat stress limit to be a “wet bulb” temperature of 35° C. Wet bulb temperatures depend on a combination of humidity and “dry bulb” air temperature measured by a thermometer. Those variables mean a place could hit a wet bulb temperature of 35° C in different ways — for instance, if the air is that temperature and there’s 100 percent humidity, or if the air temperature is 45° C and there’s 50 percent humidity. The difference is due to evaporative cooling.

    When water evaporates from the skin or another surface, it steals away energy in the form of heat, briefly cooling that surface. That means that in drier regions, the wet bulb temperature — where that ephemeral cooling effect happens readily — will be lower than the actual air temperature. In humid regions, however, wet and dry bulb temperatures are similar, because the air is so moist it’s difficult for sweat to evaporate quickly.
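
    To see those numbers reproduced, here is a minimal sketch in Python using Stull’s 2011 empirical approximation for wet bulb temperature (an outside curve fit, not a method used by the researchers in this story). Under that assumption it roughly recovers both routes to a 35° C wet bulb described above, as well as the hot-dry example quoted later in the story.

    ```python
    import math

    def wet_bulb_stull(temp_c: float, rh_percent: float) -> float:
        """Approximate wet bulb temperature (°C) from dry bulb temperature (°C)
        and relative humidity (%), using Stull's 2011 empirical fit.

        The fit is reasonable for roughly 5-99 percent humidity and -20 to 50 °C;
        it is a curve fit, not a full psychrometric calculation.
        """
        t, rh = temp_c, rh_percent
        return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
                + math.atan(t + rh)
                - math.atan(rh - 1.676331)
                + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
                - 4.686035)

    # The two routes to a ~35° C wet bulb described above:
    print(round(wet_bulb_stull(35, 100), 1))  # ~35.1
    print(round(wet_bulb_stull(45, 50), 1))   # ~35.2
    # The hot-dry case cited later (about 50° C air at roughly 10 percent humidity):
    print(round(wet_bulb_stull(50, 10), 1))   # ~24.3, i.e. roughly a 25° C wet bulb
    ```

    The large gap between air and wet bulb temperature in the dry case is the evaporative cooling headroom described above; in the humid cases the two readings nearly coincide.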

    So when thinking about heat stress on the body, scientists use wet bulb temperatures because they are a measure of how much cooling through evaporation is possible in a given climate, says Daniel Vecellio, a climate scientist at Penn State.

    “Both hot/dry and warm/humid environments can be equally dangerous,” Vecellio says — and this is where the body’s different cooling strategies come into play. In hot, dry areas, where the outside temperature may be much hotter than skin temperature, human bodies rely entirely on sweating to cool down, he says. In warm, humid areas, where the air temperature may actually be cooler than skin temperatures (but the humidity makes it seem warmer than it is), the body can’t sweat as efficiently. Instead, the cooler air passing over the skin can draw away the heat.

    How hot is too hot?

    Given the complexity of the body’s cooling system, and the diversity of human bodies, there isn’t really a one-size-fits-all threshold temperature for heat stress for everybody. “No one’s body runs at 100 percent efficiency,” Vecellio says. Different body sizes, the ability to sweat, age and acclimation to a regional climate all have a role.

    Still, for the last decade, that theoretical wet bulb 35° C number has been considered to be the point beyond which humans can no longer regulate their bodies’ temperatures. But recent laboratory-based research by Vecellio and his colleagues suggests that a general, real-world threshold for human heat stress is much lower, even for young and healthy adults.

    The researchers tracked heat stress in two dozen subjects ranging in age from 18 to 34, under a variety of controlled climates. In the series of experiments, the team varied humidity and temperature conditions within an environmental chamber, sometimes holding temperature constant while varying the humidity, and sometimes vice versa.

    The subjects exerted themselves within the chamber just enough to simulate minimal outdoor activity, walking on a treadmill or pedaling slowly on a bike with no resistance. During these experiments, which lasted for 1.5 to two hours, the researchers measured the subjects’ skin temperatures using wireless probes and assessed their core temperatures using a small telemetry pill that the subjects swallowed.

    In warm and humid conditions, the subjects in the study were unable to tolerate heat stress at wet bulb temperatures closer to 30° or 31° C, the team estimates. In hot and dry conditions, that wet bulb temperature was even lower, ranging from 25° to 28° C, the researchers reported in the February Journal of Applied Physiology. For context, in a very dry environment at about 10 percent humidity, a wet bulb temperature of 25° C would correspond to an air temperature of about 50° C (122° F).

    These results suggest that there is much more work to be done to understand what humans can endure under real-world heat and humidity conditions, but that the threshold may be much lower than thought, Vecellio says. The 2010 study’s theoretical finding of 35° C may still be “the upper limit,” he adds. “We’re showing the floor.”

    And that’s for young, healthy adults doing minimal activity. Thresholds for heat stress are expected to be lower for outdoor workers required to exert themselves, or for the elderly or children. Assessing laboratory limits for more at-risk people is the subject of ongoing work for Vecellio and his colleagues.

    A worker wipes away sweat in Toulouse, France, on July 13. An intense heat wave swept across Europe in mid-July, engulfing Spain, Portugal, France, England and other countries. VALENTINE CHAPUIS/AFP via Getty Images

    If the human body’s tolerance for heat stress is generally lower than scientists have realized, that could mean millions more people will be at risk from the deadliest heat sooner than scientists have realized. As of 2020, there were few reports of wet bulb temperatures around the world reaching 35° C, but climate simulations project that limit could be regularly exceeded in parts of South Asia and the Middle East by the middle of the century.

    Some of the deadliest heat waves in the last two decades were at lower wet bulb temperatures: Neither the 2003 European heat wave, which caused an estimated 30,000 deaths, nor the 2010 Russian heat wave, which killed over 55,000 people, exceeded wet bulb temperatures of 28° C.

    Protecting people

    How best to inform the public about heat risk is “the part that I find to be tricky,” says Shandas, who wasn’t involved in Vecellio’s research. Shandas developed the scientific protocol for the National Integrated Heat Health Information System’s Urban Heat Island mapping campaign in the United States.

    It’s very useful to have this physiological data from a controlled, precise study, Shandas says, because it allows us to better understand the science behind humans’ heat stress tolerance. But physiological and environmental variability still make it difficult to know how best to apply these findings to public health messaging, such as extreme heat warnings, he says. “There are so many microconsiderations that show up when we’re talking about a body’s ability to manage [its] internal temperature.”

    One of those considerations is the ability of the body to quickly acclimate to a temperature extreme. Regions that aren’t used to extreme heat may experience greater mortality, even at lower temperatures, simply because people there aren’t used to the heat. The 2021 heat wave in the Pacific Northwest wasn’t just extremely hot — it was extremely hot for that part of the world at that time of year, which makes it more difficult for the body to adapt, Shandas says (SN: 6/29/21).

    Heat that arrives unusually early and right on the heels of a cool period can also be more deadly, says Larry Kalkstein, a climatologist at the University of Miami and the chief heat science advisor for the Washington, D.C.–based nonprofit Adrienne Arsht-Rockefeller Foundation Resilience Center. “Often early season heat waves in May and June are more dangerous than those in August and September.”

    One way to improve communities’ resilience to the heat may be to treat heat waves like other natural disasters — including giving them names and severity rankings (SN: 8/14/20). As developed by an international coalition known as the Extreme Heat Resilience Alliance, those rankings form the basis for a new type of heat wave warning that explicitly considers the factors that impact heat stress, such as wet bulb temperature and acclimation, rather than just temperature extremes.

    The rankings also consider factors such as cloud cover, wind and how hot the temperatures are overnight. “If it’s relatively cool overnight, there’s not as much negative health outcome,” says Kalkstein, who created the system. But overnight temperatures aren’t getting as low as they used to in many places. In the United States, for example, the average minimum temperatures at nighttime are now about 0.8° C warmer than they were during the first half of the 20th century, according to the country’s Fourth National Climate Assessment, released in 2018 (SN: 11/28/18).

    By naming heat waves like hurricanes, officials hope to increase citizens’ awareness of the dangers of extreme heat. Heat wave rankings could also help cities tailor their interventions to the severity of the event. Six cities are currently testing the system’s effectiveness: four in the United States, plus Athens, Greece, and Seville, Spain. On July 24, with temperatures heading toward 42° C, Seville became the first city in the world to officially name a heat wave, sounding the alarm for Heat Wave Zoe.

    As 2022 continues to smash temperature records around the globe, such warnings may come not a moment too soon.

  • Ancient penguin bones reveal unprecedented shrinkage in key Antarctic glaciers

    Antarctica’s Pine Island and Thwaites glaciers are losing ice more quickly than they have at any time in the last few thousand years, ancient penguin bones and limpet shells suggest.

    Scientists are worried that the glaciers, two of Antarctica’s fastest-shrinking ones, are in the process of unstable, runaway retreat. By reconstructing the history of the glaciers using the old bones and shells, researchers wanted to find out whether these glaciers have ever been smaller than they are today.

    “If the ice has been smaller in the past, and did readvance, that shows that we’re not necessarily in runaway retreat” right now, says glacial geologist Brenda Hall of the University of Maine in Orono. The new result, described June 9 in Nature Geoscience, “doesn’t give us any comfort,” Hall says. “We can’t refute the hypothesis of a runaway retreat.”

    Pine Island and Thwaites glaciers sit in a broad ocean basin shaped like a bowl, deepening toward the middle. This makes the ice vulnerable to warm currents of dense, salty water that hug the ocean floor (SN: 4/9/21). Scientists have speculated that as the glaciers retreat farther inland, they could tip into an irreversible collapse (SN: 12/13/21).  That collapse could play out over centuries and raise the sea level by roughly a meter.

    Researchers dated ancient shorelines (seen here as the series of small ridges in the rocky terrain between the foreground boulders and background snow) on islands roughly 100 kilometers from Pine Island and Thwaites glaciers in Antarctica to help figure out if the glaciers are in the process of unstable, runaway retreat. James Kirkham

    To reconstruct how the glaciers have changed over thousands of years, the researchers turned to old penguin bones and shells, collected by Scott Braddock, a glacial geologist in Hall’s lab, during a research cruise in 2019 on the U.S. icebreaker Nathaniel B. Palmer.

    One afternoon, Braddock clambered from a bobbing inflatable boat onto the barren shores of Lindsey 1 — one of a dozen or more rocky islands that sit roughly 100 kilometers from where Pine Island Glacier terminates in the ocean. As he climbed the slope, his boots slipped over rocks covered in penguin guano and dotted with dingy white feathers. Then, he came upon a series of ridges — rocks and pebbles that were piled up by waves during storms thousands of years before — that marked ancient shorelines.

    Twelve thousand years ago, just as the last ice age was ending, this island would have been entirely submerged in the ocean. But as nearby glaciers shed billions of metric tons of ice, the removal of that weight allowed Earth’s crust to spring up like a bed mattress — pushing Lindsey 1 and other nearby islands out of the water, a few millimeters per year.

    As Lindsey 1 rose, a series of shorelines formed on the edges of the island — and then were lifted, one after another, out of reach of the waves. By measuring the ages and heights of those stranded shorelines, the researchers could tell how quickly the island had risen. Because the rate of uplift is determined by the amount of ice being lost from nearby glaciers, this would reveal how quickly Pine Island and Thwaites glaciers had retreated — and whether they had gotten smaller than they are today and then readvanced.

    Braddock dug into the pebbly ridges, collecting ancient cone-shaped limpet shells and marble-sized fragments of penguin bones deposited when the shorelines formed. Back in Maine, he and his colleagues radiocarbon dated those objects to estimate the ages of the shorelines. Ultimately, the researchers dated nearly two dozen shorelines, spread across several islands in the region.

    These dates showed that the oldest and highest beach formed 5,500 years ago. Since that time, up until the last few decades, the islands have risen at a steady rate of about 3.5 millimeters per year. That is far slower than the 20 to 40 millimeters per year at which the land around Pine Island and Thwaites is currently rising, suggesting that after thousands of years of relative stability, the rate of ice loss from nearby glaciers has skyrocketed with the onset of rapid human-caused warming.
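
    The long-term rate is simple arithmetic on the dated shorelines: elevation gained divided by time elapsed. The sketch below assumes a hypothetical elevation for the oldest beach, chosen only so the output echoes the roughly 3.5 millimeters per year reported above; the study’s actual beach heights are not quoted in this story.

    ```python
    def uplift_rate_mm_per_yr(elevation_m: float, age_years: float) -> float:
        """Average uplift rate implied by a raised shoreline: height above
        modern sea level divided by the shoreline's radiocarbon age."""
        return elevation_m * 1000 / age_years

    # Hypothetical elevation; the 5,500-year age comes from the story.
    long_term = uplift_rate_mm_per_yr(elevation_m=19.0, age_years=5500)
    print(f"Long-term rate: {long_term:.1f} mm/yr")  # ~3.5 mm/yr

    # Modern uplift rates quoted in the story, for comparison:
    for modern in (20, 40):
        print(f"{modern} mm/yr is about {modern / long_term:.0f}x the long-term rate")
    ```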

    “We’re going into unknown territory,” Braddock says. “We don’t have an analog to compare what’s going on today with what happened in the past.”

    Slawek Tulaczyk, a glaciologist at the University of California, Santa Cruz, sees the newly dated shorelines as “an important piece of information.” But he cautions against overinterpreting the results. While these islands are 100 kilometers from Pine Island and Thwaites, they are less than 50 kilometers from several smaller glaciers — and changes in these closer glaciers might have obscured whatever was happening at Pine Island and Thwaites long ago. He suspects that Pine Island and Thwaites could still have retreated and then readvanced a few dozen kilometers: “I don’t think this study settles it.”

  • Scientists hope to mimic the most extreme hurricane conditions

    Winds howl at over 300 kilometers per hour, battering at a two-story wooden house and ripping its roof from its walls. Then comes the water. A 6-meter-tall wave engulfs the structure, knocking the house off its foundation and washing it away.

    That’s the terrifying vision of researchers planning a new state-of-the-art facility to re-create the havoc wreaked by the most powerful hurricanes on Earth. In January, the National Science Foundation awarded a $12.8 million grant to researchers to design a facility that can simulate wind speeds of at least 290 km/h — and can, at the same time, produce deadly, towering storm surges.

    No facility exists that can produce such a one-two punch of extreme wind and water. But it’s an idea whose time has come — and not a moment too soon.

    “It’s a race against time,” says disaster researcher Richard Olson, director of extreme events research at Florida International University, or FIU, in Miami.

    Hurricanes are being made worse by human-caused climate change: They’re getting bigger, wetter, stronger and slower (SN: 9/13/18; SN: 11/11/20). Scientists project that the 2022 Atlantic Ocean hurricane season, spanning June 1 to November 30, will be the seventh straight season with more storms than average. Recent seasons have been marked by an increase in rapidly intensifying hurricanes linked to warming ocean waters (SN: 12/21/20).

    Those trends are expected to continue as the Earth heats up further, researchers say. And coastal communities around the world need to know how to prepare: how to build structures — buildings, bridges, roads, water and energy systems — that are resilient to such punishing winds and waves.

    To help with those preparations, FIU researchers are leading a team of wind and structural engineers, coastal and ocean engineers, computational modelers and resilience experts from around the United States to work out how best to simulate these behemoths. Combining extreme wind and water surges into one facility is uncharted territory, says Ioannis Zisis, a wind engineer at FIU. “There is a need to push the envelope,” Zisis says. But as for how exactly to do it, “the answer is simple: We don’t know. That’s what we want to find out.”

    Prepping for “Category 6”

    It’s not that such extreme storms haven’t been seen on Earth. Just in the last few years, Hurricanes Dorian (2019) and Irma (2017) in the Atlantic Ocean and super Typhoon Haiyan (2013) in the Pacific Ocean have brought storms with wind speeds well over 290 km/h. Such ultraintense storms are sometimes referred to as “category 6” hurricanes, though that’s not an official designation.

    The National Oceanic and Atmospheric Administration, or NOAA, rates hurricanes in the Atlantic and eastern Pacific oceans on a scale of 1 to 5, based on their wind speeds and how much damage those winds might do. Each category spans an increment of roughly 30 km/h.  

    Category 1 hurricanes, with wind speeds of 119 to 153 km/h, produce “some damage,” bringing down some power lines, toppling trees and perhaps knocking roof shingles or vinyl siding off a house. Category 5 storms, with winds starting at 252 km/h, cause “catastrophic damage,” bulldozing buildings and potentially leaving neighborhoods uninhabitable for weeks to months.
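
    In code, the scale is just a lookup against wind speed thresholds. The sketch below is illustrative only: the Category 1 range and the Category 5 floor are the figures quoted in this story, while the intermediate cutoffs are the commonly published NOAA values converted to km/h.

    ```python
    def saffir_simpson_category(wind_kmh: float) -> int | None:
        """Return the Saffir-Simpson category (1-5) for a sustained wind
        speed in km/h, or None below hurricane strength.

        Only the Category 1 range (119-153 km/h) and the Category 5 floor
        (252 km/h) appear in the story; the other cutoffs are the standard
        published NOAA thresholds converted to km/h.
        """
        for floor, category in ((252, 5), (209, 4), (178, 3), (154, 2), (119, 1)):
            if wind_kmh >= floor:
                return category
        return None

    print(saffir_simpson_category(250))  # 4: Maria's winds at Puerto Rico landfall
    print(saffir_simpson_category(260))  # 5: Maria near its peak
    print(saffir_simpson_category(300))  # 5: Dorian-class winds still read as "category 5"
    ```

    Anything at or above 252 km/h lands in the same bucket, which is exactly the gap Olson describes below.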

    But 5 is as high as it gets on the official scale; after all, what could be more devastating than catastrophic damage? That means that even monster storms like 2019’s Hurricane Dorian, which flattened the Bahamas with wind speeds of up to nearly 300 km/h, are still considered category 5 (SN: 9/3/19).

    “Strictly speaking, I understand that [NOAA doesn’t] see the need for a category 6,” Olson says. But there is a difference in public perception, he says. “I see it as a different type of storm, a storm that is simply scarier.”

    And labels aside, the need to prepare for these stronger storms is clear, Olson says. “I don’t think anybody wants to be explaining 20 years from now why we didn’t do this,” he says. “We have challenged nature. Welcome to payback.”

    Superstorm simulation

    FIU already hosts the Wall of Wind, a huge hurricane simulator housed in a large hangar anchored at one end by an arc of 12 massive yellow fans. Even at low wind speeds — say, around 50 km/h — the fans generate a loud, unsettling hum. At full blast, those fans can generate wind speeds of up to 252 km/h — equivalent to a low-grade category 5 hurricane.

    Inside, researchers populate the hangar with structures mimicking skyscrapers, houses and trees, or shapes representing the bumps and dips of the ground surface. Engineers from around the world visit the facility to test out the wind resistance of their own creations, watching as the winds pummel at their structural designs.

    Twelve fans tower over one end of the Wall of Wind, a large experimental facility at Florida International University in Miami. There, winds as fast as 252 kilometers per hour let researchers re-create conditions experienced during a low-grade category 5 hurricane. NSF-NHERI Wall of Wind/FIU

    It’s one of eight facilities in a national network of laboratories that study the potential impacts of wind, water and earthquake hazards, collectively called the U.S. Natural Hazards Engineering Research Infrastructure, or NHERI.

    The Wall of Wind is designed for full-scale wind testing of entire structures. Another wind machine, hosted at the University of Florida in Gainesville, can zoom in on the turbulent behavior of winds right at the boundary between the atmosphere and ground. Then there are the giant tsunami- and storm surge–simulating water wave tanks at Oregon State University in Corvallis.

    The new facility aims to build on the shoulders of these giants, as well as on other experimental labs around the country. The design phase is projected to take four years, as the team ponders how to ramp up wind speeds — possibly with more, or more powerful fans than the Wall of Wind’s — and how to combine those gale-force winds and massive water tanks in one experimental space.

    Existing labs that study wind and waves together, albeit on a much smaller scale, can offer some insight into that aspect of the design, says Forrest Masters, a wind engineer at the University of Florida and the head of that institution’s NHERI facility.

    This design phase will also include building a scaled-down version of the future lab as proof of concept. Building the full-scale facility will require a new round of funding and several more years.

    Past studies of the impacts of strong windstorms tend to take one of three approaches: making field observations of the aftermath of a given storm; building experimental facilities to re-create storms; and using computational simulations to visualize how those impacts might play out over large geographical regions. Each of these approaches has strengths and limitations, says Tracy Kijewski-Correa, a disaster risk engineer at the University of Notre Dame in Indiana.

    “In this facility, we want to bring together all of these methodologies,” to get as close as possible to recreating what Mother Nature can do, Kijewski-Correa says.  

    It’s a challenging engineering problem, but an exciting one. “There’s a lot of enthusiasm for this in the broader scientific community,” Masters says. “If it gets built, nothing like it will exist.”

  • Farmers in India cut their carbon footprint with trees and solar power

    In 2007, 22-year-old P. Ramesh’s groundnut farm was losing money. As was the norm in most of India (and still is), Ramesh was using a cocktail of pesticides and fertilizers across his 2.4 hectares in the Anantapur district of southern India. In this desert-like area, which gets less than 600 millimeters of rainfall most years, farming is a challenge.

    “I lost a lot of money growing groundnuts through chemical farming methods,” says Ramesh, who goes by the first letter of his father’s name followed by his first name, as is common in many parts of southern India. The chemicals were expensive and his yields low.

    Then in 2017, he dropped the chemicals. “Ever since I took up regenerative agricultural practices like agroforestry and natural farming, both my yield and income have increased,” he says.

    Agroforestry involves planting woody perennials (trees, shrubs, palms, bamboos, etc.) alongside agricultural crops (SN: 7/3/21 & 7/17/21, p. 30). One natural farming method calls for replacing all chemical fertilizers and pesticides with organic matter such as cow dung, cow urine and jaggery, a type of solid dark sugar made from sugarcane, to boost soil nutrient levels. Ramesh also expanded his crops, originally groundnuts and some tomatoes, by adding papaya, millets, okra, eggplant (called brinjal locally) and other crops.

    Farmers in Anantapur, India, pose with the natural fertilizer they use on their crops. Called Ghanajeevamritam, it contains jaggery, cow dung, cow urine and sometimes flour from dried beans. M. Shaikshavali

    With help from the nonprofit Accion Fraterna Ecology Centre in Anantapur, which works with farmers who want to try sustainable farming, Ramesh increased his profits enough to buy more land, expanding his parcel to about four hectares. Like the thousands of other farmers practicing regenerative farming across India, Ramesh has managed to nourish his depleted soil, while his new trees help keep carbon out of the atmosphere, thus playing a small but important role in reducing India’s carbon footprint. Recent studies have shown that the carbon sequestration potential of agroforestry is as much as 34 percent higher than standard forms of agriculture.

    In western India, more than 1,000 kilometers from Anantapur, in Dhundi village in Gujarat, 36-year-old Pravinbhai Parmar is using his rice farm for climate change mitigation. By installing solar panels, he no longer uses diesel to power his groundwater pumps. And he has an incentive to pump only the water he needs because he can sell the electricity he doesn’t use.

    If all farmers like Parmar shifted to solar, India’s carbon emissions, which are 2.88 billion metric tons per year, could drop by between 45 million and 62 million tons annually, according to a 2020 report in Carbon Management. So far, the country has about 250,000 solar irrigation pumps out of an estimated 20 million to 25 million total groundwater pumps.
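
    As a quick sense check of scale, the sketch below uses only the two figures quoted above to show what that projected drop would represent as a share of India’s annual emissions.

    ```python
    # Share of India's annual emissions that full adoption of solar irrigation
    # pumps could cut, using only the figures quoted above (2020 Carbon Management report).
    total_emissions_mt = 2_880        # 2.88 billion metric tons, in millions of tons
    projected_savings_mt = (45, 62)   # projected annual drop, millions of tons

    low, high = (s / total_emissions_mt * 100 for s in projected_savings_mt)
    print(f"Roughly {low:.1f} to {high:.1f} percent of annual emissions")  # ~1.6 to 2.2 percent
    ```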

    For a nation that has to provide for what will soon be the world’s largest population, growing food while trying to bring down already high greenhouse gas emissions from agricultural practices is difficult. Today, agriculture and livestock account for 14 percent of India’s gross national greenhouse gas emissions. Adding in the electricity used by the agriculture sector brings this figure up to 22 percent.

    Ramesh and Parmar are part of a small but growing group of farmers getting assistance from government and nongovernmental programs to change how they farm. There’s still a ways to go to reach the estimated 146 million others who cultivate 160 million hectares of arable land in India. But these farmers’ success stories are testimony that one of India’s largest emitting sectors can change.

    Pravinbhai Parmar (center) poses with fellow farmers who are part of the solar irrigation program in Dhundi village, Gujarat. IWMI-TATA Program, Shashwat Cleantech and Dhundi Saur Urja Utpadak Sahkari Mandali

    Feeding the soil, sustaining farmers

    India’s farmers are already deeply feeling the effects of climate change, coping with dry spells, erratic rainfall and increasingly frequent heat waves and tropical cyclones. “When we talk about climate-smart agriculture, we are largely talking about how it has reduced emissions,” says Indu Murthy, sector head for climate, environment and sustainability at the Center for Study of Science, Technology and Policy, a think tank in Bengaluru. But such a system should also help farmers “cope with unexpected changes and weather patterns,” she says.

    This, in many ways, is the philosophy driving a variety of sustainable and regenerative agricultural practices under the agroecology umbrella. Natural farming and agroforestry are two components of this system that are finding more and more takers across India’s varied landscapes, says Y.V. Malla Reddy, director of Accion Fraterna Ecology Centre.

    “For me, the important change is the change in attitude of people towards trees and vegetation in the last few decades,” Reddy says. “In the ’70s and ’80s, people were not really conscious of the value of the trees, but now they consider trees, especially fruit and utilitarian trees, as also a source of income.” Reddy has advocated for sustainable farming in India for close to 50 years. Certain types of trees, such as pongamia, subabul and avisa, have economic benefits apart from their fruits; they provide fodder for livestock and biomass for fuel.

    Reddy’s organization has provided assistance to more than 60,000 Indian farming families to practice natural farming and agroforestry on almost 165,000 hectares. Calculation of the soil carbon sequestration potential of their work is ongoing. But a 2020 report by India’s Ministry of Environment, Forest and Climate Change notes that these farming practices can help India reach its goal of having 33 percent forest and tree cover to meet its carbon sequestration commitments under the Paris climate agreement by 2030.

    Regenerative agriculture is a relatively inexpensive way to reduce carbon dioxide in the atmosphere, as compared with other solutions. Regenerative farming costs $10 to $100 per ton of carbon dioxide removed from the atmosphere, compared with $100 to $1,000 per ton of carbon dioxide for technologies that mechanically remove carbon from the air, according to a 2020 analysis in Nature Sustainability. Such farming not only makes sense for the environment, but chances are the farmers’ earnings will also increase as they shift to regenerative agriculture, Reddy says.

    Farms in Kanumpalli village in Anantapur district grow multiple crops using natural farming methods. M. Shaikshavali

    Farmers from the Baiga and Gondh tribal communities in Dholbajja panchayat, India, harvest chiraita, or Andrographis paniculata, a plant used for medicinal purposes. Their Indigenous community recently took up agroforestry and sustainable farming methods. Elsa Remijn photographer, provided by Commonland

    Growing solar

    Establishing agroecology practices to see an effect on carbon sequestration can take years or decades. But using renewable energy in farming can quickly reduce emissions. For this reason, the nonprofit International Water Management Institute, IWMI, launched the program Solar Power as Remunerative Crop in Dhundi village in 2016.

    “The biggest threat climate change presents, specifically to farmers, is the uncertainty that it brings,” says Shilp Verma, an IWMI researcher of water, energy and food policies based in Anand. “Any agricultural practice that will help farmers cope with uncertainty will improve resilience to climate change.” Farmers have more funds to deal with insecure conditions when they can pump groundwater in a climate-friendly way that also provides incentives for keeping some water in the ground. “If you pump less, then you can sell the surplus energy to the grid,” he says. Solar power becomes an income source.

    Growing rice, especially lowland rice, which is grown on flooded land, requires a lot of water. On average it takes about 1,432 liters of water to produce one kilogram of rice, according to the International Rice Research Institute. The organization says that irrigated rice receives an estimated 34 to 43 percent of the world’s total irrigation water. India is the largest extractor of groundwater in the world, accounting for 25 percent of global extraction. When diesel pumps do the extracting, carbon is emitted into the atmosphere. Parmar and his fellow farmers used to have to buy that fuel to keep their pumps going.

    “We used to spend 25,000 rupees [about $330] a year for running our diesel-powered water pumps. This used to really cut into our profits,” Parmar says. When IWMI asked him in 2015 to participate in a pilot solar-powered irrigation project with zero carbon emissions, Parmar was all ears.

    Since then, Parmar and six fellow farmers in Dhundi have sold more than 240,000 kilowatt-hours to the state and earned more than 1.5 million rupees ($20,000). Parmar’s annual income has doubled from 100,000–150,000 rupees on average to 200,000–250,000 rupees.

    The boost is helping him educate his children, one of whom is pursuing a degree in agriculture — an encouraging sign in a country where farming is out of vogue with the younger generation. As Parmar says, “Solar power is timely, less polluting and also provides us an additional income. What is not to like about it?”

    This aerial image shows solar panels installed among crops to power groundwater pumps and offer a new income source for farmers in western India’s Dhundi village. IWMI-TATA Program, Shashwat Cleantech and Dhundi Saur Urja Utpadak Sahkari Mandali

    Parmar has learned to maintain and fix the panels and the pumps himself. Neighboring villages now ask for his help when they want to set up solar-powered pumps or need pump repairs. “I am happy that others are also following our lead. Honestly, I feel quite proud that they call me to help them with their solar pump systems.”

    IWMI’s project in Dhundi has been so successful that the state of Gujarat started replicating the scheme in 2018 for all interested farmers under an initiative called Suryashakti Kisan Yojana, which translates to solar power project for farmers. And India’s Ministry of New and Renewable Energy now subsidizes and provides low-interest loans for solar-powered irrigation among farmers.

    “The main thing about climate-smart agriculture is that everything we do has to have less carbon footprint,” says Aditi Mukherji, Verma’s colleague and an author of February’s report from the Intergovernmental Panel on Climate Change (SN: 3/26/22, p. 7). “That is the biggest challenge. How do you make something with a low carbon footprint, without having a negative impact on income and productivity?” Mukherji is the regional project leader for Solar Irrigation for Agricultural Resilience in South Asia, an IWMI project looking at various solar irrigation solutions in South Asia.

    Back in Anantapur, “there is also a visible change in the vegetation in our district,” Reddy says. “Earlier, there might not be any trees till the eye can see in many parts of the district. Now there is no place which doesn’t have at least 20 trees in your line of sight. It’s a small change, but extremely significant for our dry region.” And Ramesh and other farmers now enjoy a stable, sustainable income from farming.

    A family in the village of Muchurami in Anantapur district, India, display vegetables harvested through natural farming methods. The vegetables include pumpkins, peas, spinach, and bottle gourds. M. Shaikshavali

    “When I was growing groundnuts, I used to sell it to the local markets,” Ramesh says. He now sells directly to city dwellers through WhatsApp groups. And one of India’s largest online grocery stores, bigbasket.com, and others have started purchasing directly from him to meet a growing demand for organic and “clean” fruits and vegetables.

    “I’m confident now that my children too can take up farming and make a good living if they want to,” Ramesh says. “I didn’t feel the same way before discovering these nonchemical farming practices.”

  • Replacing some meat with microbial protein could help fight climate change

    “Fungi Fridays” could save a lot of trees — and take a bite out of greenhouse gas emissions. Eating one-fifth less red meat and instead munching on microbial proteins derived from fungi or algae could cut annual deforestation in half by 2050, researchers report May 5 in Nature.

    Raising cattle and other ruminants contributes methane and nitrous oxide to the atmosphere, while clearing forests for pasture lands adds carbon dioxide (SN: 4/4/22; SN: 7/13/21). So the hunt is on for environmentally friendly substitutes, such as lab-grown hamburgers and cricket farming (SN: 9/20/18; SN: 5/2/19).

    Another alternative is microbial protein, made from cells cultivated in a laboratory and nurtured with glucose. Fermented fungal spores, for example, produce a dense, doughy substance called mycoprotein, while fermented algae produce spirulina, a dietary supplement.

    Cell-cultured foods do require sugar from croplands, but studies show that mycoprotein produces fewer greenhouse gas emissions and uses less land and water than raising cattle, says Florian Humpenöder, a climate modeler at Potsdam Institute for Climate Impact Research in Germany. However, a full comparison of foods’ future environmental impacts also requires accounting for changes in population, lifestyle, dietary patterns and technology, he says.

    So Humpenöder and colleagues incorporated projected socioeconomic changes into computer simulations of land use and deforestation from 2020 through 2050. Then they simulated four scenarios, substituting microbial protein for 0 percent, 20 percent, 50 percent or 80 percent of the global red meat diet by 2050.

    A little substitution went a long way, the team found: Just 20 percent microbial protein substitution cut annual deforestation rates — and associated CO2 emissions — by 56 percent from 2020 to 2050.

    Eating more microbial proteins could be part of a portfolio of strategies to address the climate and biodiversity crises — alongside measures to protect forests and decarbonize electricity generation, Humpenöder says.

  • in

    How much does eating meat affect nations’ greenhouse gas emissions?

    The food we eat is responsible for an astounding one-third of global greenhouse gas emissions caused by human activities, according to two comprehensive studies published in 2021.

    “When people talk about food systems, they always think about the cow in the field,” says statistician Francesco Tubiello, lead author of one of the reports, appearing in last June’s Environmental Research Letters. True, cows are a major source of methane, which, like other greenhouse gases, traps heat in the atmosphere. But methane, carbon dioxide and other planet-warming gases are released from several other sources along the food production chain.

    Before 2021, scientists like Tubiello, of the Food and Agriculture Organization of the United Nations, were well aware that agriculture and related land use changes made up roughly 20 percent of the planet’s greenhouse gas emissions. Such land use changes include cutting down forests to make way for cattle grazing and pumping groundwater to flood fields for the sake of agriculture.

    But new modeling techniques used by Tubiello and colleagues, plus a study from a group at the European Commission that Tubiello worked with, brought to light another big driver of emissions: the food supply chain. All the steps that take food from the farm to our plates to the landfill — transportation, processing, cooking and food waste — bring food-related emissions up from 20 percent to 33 percent.

    To slow climate change, the foods we eat deserve major attention, just like fossil fuel burning, says Amos Tai, an environmental scientist at the Chinese University of Hong Kong. The fuller picture of food-related emissions demonstrates that the world needs to make drastic changes to the food system if we are to reach international goals for reducing global warming.

    Change from developing countries

    Scientists have gained a clearer understanding of global human-related emissions in recent years through databases like EDGAR, or Emissions Database for Global Atmospheric Research, developed by the European Union. The database covers every country’s greenhouse gas-emitting activities, from energy production to landfill waste, from 1970 to the present. EDGAR uses a unified methodology to calculate emissions for all economic sectors, says Monica Crippa, a scientific officer at the European Commission’s Joint Research Centre.

    Crippa and colleagues, with help from Tubiello, built a companion database of food system–related emissions called EDGAR-FOOD. Using that database, the researchers arrived at the same one-third estimate as Tubiello’s group.

    Crippa’s team’s calculations, reported in Nature Food in March 2021, split food system emissions into four broad categories: land (including both agriculture and related land use changes), energy (used for producing, processing, packaging and transporting goods), industry (including the production of chemicals used in farming and materials used to package food) and waste (from unused food).

    The land sector is the biggest culprit in food system emissions, Crippa says, accounting for about 70 percent of the global total. But the picture looks different across nations. The United States and other developed countries rely on highly centralized megafarms for much of their food production, so the energy, industry and waste categories make up more than half of these countries’ food system emissions.

    In developing countries, agriculture and changing land use are far greater contributors. Emissions in historically less developed countries have also been rising in the last 30 years, as these countries have cut down wild areas to make way for industrial farming and started eating more meat, another major contributor to emissions with impacts across all four categories.

    As a result, agriculture and related landscape shifts have driven major increases in food system emissions among developing countries in recent decades, while emissions in developed countries have not grown.

    For instance, China’s food emissions shot up by almost 50 percent from 1990 to 2018, largely due to a rise in meat-eating, according to the EDGAR-FOOD database. In 1980, the average Chinese person ate about 30 grams of meat a day, Tai says. In 2010, the average person in China ate almost five times as much, or just under 150 grams of meat a day.

    Top-emitting economies

    In recent years, Crippa says, six economies, the top emitters, have been responsible for more than half of total global food emissions. These economies, in order, are China, Brazil, the United States, India, Indonesia and the European Union. The immense populations of China and India help drive their high numbers. Brazil and Indonesia make the list because large swaths of their rainforests have been cut down to make room for farming. When those trees come down, vast amounts of carbon flow into the atmosphere (SN: 7/3/21 & 7/17/21, p. 24).

    The United States and the European Union are on the list because of heavy meat consumption. In the United States, meat and other animal products contribute the vast majority of food-related emissions, says Richard Waite, a researcher at the World Resources Institute’s food program in Washington, D.C.

    Waste is also a huge issue in the United States: More than one-third of food produced never actually gets eaten, according to a 2021 report from the U.S. Environmental Protection Agency. When food goes uneaten, the resources used to produce, transport and package it are wasted. Plus, the uneaten food goes into landfills, which produce methane, carbon dioxide and other gases as the food decomposes.

    Meat consumption drives emissions

    Climate advocates who want to reduce food emissions often focus on meat consumption, as animal products lead to far greater emissions than plants. Animal production uses more land than plant production, and “meat production is heavily inefficient,” Tai says.

    “If we eat 100 calories of grain, like maize or soybeans, we get that 100 calories,” he explains. All the energy from the food is delivered directly to the person who eats it. But if the 100 calories’ worth of grain is instead fed to a cow or a pig, when the animal is killed and processed for food, just one-tenth of the energy from that 100 calories of grain goes to the person eating the animal.
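
    The arithmetic behind that comparison is simple enough to spell out. Below is a minimal sketch that treats the one-tenth figure from Tai’s example as a fixed feed-to-food conversion factor; real ratios vary by animal and production system, so the numbers are purely illustrative.

    ```python
    # Illustrative only: the 10-to-1 energy loss comes from the example above;
    # actual feed conversion varies widely by animal and production system.
    GRAIN_CALORIES = 100   # calories of grain grown on a field
    FEED_TO_FOOD = 0.10    # assumed share of feed energy reaching the plate as meat

    eaten_directly = GRAIN_CALORIES
    routed_through_livestock = GRAIN_CALORIES * FEED_TO_FOOD

    print(f"Eaten as grain:           {eaten_directly} calories")
    print(f"Routed through an animal: {routed_through_livestock:.0f} calories")
    ```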

    Methane production from “the cow in the field” is another part of meat’s climate toll: Cows release the gas via their manure, burps and flatulence. Methane traps more heat per ton emitted than carbon dioxide, Tubiello says, so emissions from cattle farms can have an outsize impact (SN: 11/28/15, p. 22). These livestock emissions account for about one-third of global methane emissions, according to a 2021 U.N. report.
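
    How much more heat is usually expressed through the “CO2 equivalent” unit that appears in the figures below. Here is a minimal sketch of that conversion, assuming a 100-year global warming potential of roughly 28 for methane; the article gives no exact factor, and published values differ by a few units depending on the assessment.

    ```python
    # Assumed factor: methane's 100-year global warming potential is commonly
    # cited as roughly 28 times that of CO2; the article gives no exact value.
    METHANE_GWP_100 = 28

    def to_co2_equivalent(methane_tonnes: float) -> float:
        """Express a quantity of methane as tonnes of CO2 equivalent."""
        return methane_tonnes * METHANE_GWP_100

    # One tonne of methane counts the same as roughly 28 tonnes of CO2.
    print(to_co2_equivalent(1.0))  # -> 28.0
    ```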

    Shifting from meats to plants

    U.S. residents should consider how they can shift to what Brent Kim calls “plant-forward” diets. “Plant-forward doesn’t mean vegan. It means reducing animal product intake, and increasing the share of plant foods that are on the plate,” says Kim, program officer at the Johns Hopkins Center for a Livable Future.

    Kim and colleagues estimated food emissions by diet and food group for 140 countries and territories, using a modeling framework similar to EDGAR-FOOD. However, their framework includes only food production emissions (agriculture and land use), not the processing, transportation and other pieces of the food system incorporated in EDGAR-FOOD.

    Producing the average U.S. resident’s diet generates more than 2,000 kilograms of greenhouse gas emissions per year, the researchers reported in 2020 in Global Environmental Change. The group measured emissions in terms of “CO2 equivalents,” a standardized unit allowing for direct comparisons between CO2 and other greenhouse gases like methane.

    Going meatless one day a week brings that figure down to about 1,600 kilograms of CO2 equivalents per year, per person. Going vegan — a diet without any meat, dairy or other animal products — cuts it by 87 percent, to under 300 kilograms. Going even two-thirds vegan offers a sizable drop, to 740 kilograms of CO2 equivalents.

    Kim’s modeling also offers a “low food chain” option, which brings emissions down to about 300 kilograms of CO2 equivalents per year, per person. Eating low on the food chain combines a mostly plant-based diet with animal products that come from more climate-friendly sources that do not disturb ecological systems. Examples include insects, smaller fish like sardines, and oysters and other mollusks.
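
    Put side by side, those per-person figures amount to steep percentage cuts from the roughly 2,000-kilogram U.S. baseline. Here is a minimal sketch using the numbers quoted above; the fully vegan value is taken as about 260 kilograms, consistent with the reported 87 percent reduction, and all figures are approximate.

    ```python
    # Annual per-person diet footprints in kilograms of CO2 equivalents, as
    # reported for the United States; the vegan value of 260 is inferred from
    # the quoted 87 percent cut and is approximate.
    BASELINE = 2000  # average U.S. diet

    diets = {
        "meatless one day a week": 1600,
        "two-thirds vegan": 740,
        "low food chain": 300,
        "fully vegan": 260,
    }

    for name, footprint in diets.items():
        cut = 100 * (1 - footprint / BASELINE)
        print(f"{name:<24} {footprint:>5} kg CO2e per year ({cut:.0f}% below the average diet)")
    ```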

    Tai agrees that not everybody needs to become a vegetarian or vegan to save the planet, as meat can have important cultural and nutritional value. If you want to “start from the biggest polluter,” he says, focus on cutting beef consumption.

    But enough people need to make these changes to “send a signal back to the market” that consumers want more plant-based options, Tubiello says. Policy makers at the federal, state and local levels can also encourage climate-friendly farming practices, reduce food waste in government operations and take other actions to cut down the resources used in food production, Waite says.

    For example, the World Resources Institute, where Waite works, is part of an initiative called the Cool Food Pledge, in which companies, universities and city governments have signed on to reduce the climate impacts of the food they serve. The institutions agree to track the food they purchase every year to ensure they are progressing toward their goals, Waite says.

    Developed countries like the United States — which have been heavy meat consumers for decades — can have a big impact by changing food choices. Indeed, a paper published in Nature Food in January shows that if the populations of 54 high-income nations switched to a plant-focused diet, annual emissions from these countries’ agricultural production could drop by more than 60 percent.

  • in

    Coastal cities around the globe are sinking

    Coastal cities around the globe are sinking by up to several centimeters per year, on average, satellite observations reveal. The one-two punch of subsiding land and rising seas means that these coastal regions are at greater risk for flooding than previously thought, researchers report in the April 16 Geophysical Research Letters.

    Matt Wei, an earth scientist at the University of Rhode Island in Narragansett, and colleagues studied 99 coastal cities on six continents. “We tried to balance population and geographic location,” he says. While subsidence has been measured in cities previously, earlier research has tended to focus on just one city or region. This investigation is different, Wei says. “It’s one of the first to really use data with global coverage.”

    Wei and his team relied on observations made from 2015 to 2020 by a pair of European satellites. Instruments onboard beam microwave signals toward Earth and then record the waves that bounce back. By measuring the timing and intensity of those reflected waves, the team determined the height of the ground with millimeter accuracy. And because each satellite flies over the same part of the planet every 12 days, the researchers were able to trace how the ground deformed over time.
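
    In essence, each repeat pass adds a point to a height time series for every patch of ground, and the subsidence rate falls out of a trend fit. Below is a minimal sketch of that idea, with synthetic numbers standing in for the radar data: the 12-day revisit interval comes from the article, while the 2-centimeter-per-year sinking rate and the noise level are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for repeat-pass height measurements: one sample every
    # 12 days over five years, for ground sinking at 2 cm per year, plus
    # millimeter-scale measurement noise.
    days = np.arange(0, 5 * 365, 12)
    true_rate_cm_per_year = -2.0
    heights_cm = true_rate_cm_per_year * days / 365.0 + rng.normal(0.0, 0.1, days.size)

    # A least-squares line through the time series recovers the rate.
    slope_per_day, _ = np.polyfit(days, heights_cm, 1)
    print(f"Estimated subsidence rate: {slope_per_day * 365:.2f} cm per year")
    ```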

    The largest subsidence rates — up to five centimeters per year — are mostly in Asian cities like Tianjin, China; Karachi, Pakistan; and Manila, Philippines, the team found. What’s more, one-third, or 33, of the analyzed cities are sinking in some places by more than a centimeter per year.

    That’s a worrying trend, says Darío Solano-Rojas, an earth scientist at the National Autonomous University of Mexico in Mexico City who was not involved in the research. These cities are being hit with a double whammy: At the same time that sea levels are rising due to climate change, the land is sinking (SN: 8/15/18). “Understanding that part of the problem is a big deal,” Solano-Rojas says.

    Wei and his colleagues think that the subsidence is largely caused by people. When the researchers looked at Google Earth imagery of the regions within cities that were rapidly sinking, the team saw mostly residential or commercial areas. That’s a tip-off that the culprit is groundwater extraction, the team concluded. Landscapes tend to settle as water is pumped out of aquifers (SN: 10/22/12).

    But there’s reason to be hopeful. In the past, cities such as Shanghai and Indonesia’s Jakarta were sinking by more than 10 centimeters per year, on average. But now subsidence in those places has slowed, possibly due to recent governmental regulations limiting groundwater extraction.