More stories

  • Vac to the future

    Scientists love a challenge. Or a friendly competition.
    Scientists at La Jolla Institute for Immunology (LJI) recently published the results of a competition that put researchers to the test. For the competition, part of the NIH-funded Computational Models of Immunity network, teams of researchers from different institutions offered up their best predictions regarding B. pertussis (whooping cough) vaccination.
    Each team tried to answer the same set of questions about vaccine responses in a diverse set of clinical study participants. Which study participants would show the highest antibody response to B. pertussis toxin 14 days post-vaccination? Which participants would show the highest increase of monocytes in their blood one day post-vaccination? And so on.
    The teams were given data on the study participants’ age, sex, and characteristics of their immune status prior to vaccination. They then developed computational models to predict vaccine responses in different patient groups.
    “We asked, ‘What do you think is the most important factor that drives vaccination outcome?’” says LJI Professor Bjoern Peters, Ph.D., who led the recent Cell Reports Methods study. “The idea was to make the teams really put their money where their mouth is.”
    Multiple computational models to predict vaccine responses have been developed previously, many of them based on complex patterns in immune state before and after vaccination. Surprisingly, the best predictor in the competition was based on a very simple correlation: antibody responses decrease with the calendar age of study participants.
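    To make that age-based baseline concrete, here is a minimal sketch of how such a predictor might be built and scored; the column names and the use of a Spearman rank correlation are illustrative assumptions, not the competition’s actual code:

        # Minimal sketch of an age-only baseline, in the spirit of the winning model.
        # Column names ("age", day-14 titers) are hypothetical, not the CMI-PB schema.
        import pandas as pd
        from scipy.stats import spearmanr

        def predict_rank_by_age(baseline: pd.DataFrame) -> pd.Series:
            """Rank participants by predicted response: younger -> higher antibody response."""
            return (-baseline["age"]).rank()

        def score(baseline: pd.DataFrame, measured_day14_titer: pd.Series) -> float:
            """Spearman correlation between the age-based ranking and measured responses."""
            rho, _ = spearmanr(predict_rank_by_age(baseline), measured_day14_titer)
            return rho
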
    The result may seem anti-climactic, but the competition sheds light on where more vaccine research is needed. “We know calendar age is important, but we still see a lot of variability in vaccination responses that we can’t explain,” says Peters.

    The competition has also helped rally scientists around further B. pertussis vaccine research. In the United States, B. pertussis vaccines were reformulated in the 1990s to address relatively minor side effects. Research suggests the newer acellular design (aP vaccine) may not be as effective as the older whole-cell design (wP vaccine) at preventing disease transmission and infection.
    “We don’t know what’s missing from this current vaccine,” says Peters. “That’s an open question.”
    The prediction competition is shaping up to be an annual event, and previous entrants have gone back to the data to further hone their predictions. Perhaps, Peters hopes, this closer look at exactly what drives higher antibody responses in younger people can lead to better vaccines for all patient groups.
    “We are hoping to use this competition not just as a way to examine the capacity of people to predict vaccination outcomes, but also as a way to address an important public health question,” says Peters.
    The Peters Lab and the CMI-PB Team are currently finishing up their second invited challenge. They will host a public contest in or around August 2024. Researchers can join them at https://www.cmi-pb.org/
    Additional authors of the study, “A multi-omics systems vaccinology resource to develop and test computational models of immunity,” include Pramod Shinde, Ferran Soldevila, Joaquin Reyna, Minori Aoki, Mikkel Rasmussen, Lisa Willemsen, Mari Kojima, Brendan Ha, Jason A. Greenbaum, James A. Overton, Hector Guzman-Orozco, Somayeh Nili, Shelby Orfield, Jeremy P. Gygi, Ricardo da Silva Antunes, Alessandro Sette, Barry Grant, Lars Rønn Olsen, Anna Konstorum, Leying Guan, Ferhat Ay, and Steven H. Kleinstein.
    This study was supported by the National Institutes of Health’s (NIH) National Institute of Allergy and Infectious Diseases (NIAID; grants U01AI150753, U01AI141995, and U19AI142742).

  • Information overload is a personal and societal danger

    We are all aware of the dangers of pollution to our air, water, and earth. In a letter recently published in Nature Human Behaviour, scientists are advocating for the recognition and mitigation of another type of environmental pollution that poses comparable personal and societal dangers: information overload.
    With the internet at our fingertips via smartphones, we are exposed to an unprecedented amount of data, far beyond our ability to process it. The result is an inability to evaluate information and make decisions. It can also lead us to limit our social activities and to feel dissatisfied with our jobs, unmotivated, and generally negative. Economists estimate that it all comes at a global cost of about $1 trillion. On top of the emotional and cognitive effects, contextual and environmental considerations may add to the personal and economic costs.
    The idea to explore information overload was incubated at a meeting of an international group of scientists two years ago, all of whom were supported by an E.U. grant for international collaboration. The E.U. team selected partners abroad, including, for the third time, Rensselaer Polytechnic Institute’s Network Science and Technology Center (NeST) in the United States, led by Boleslaw Szymanski, Ph.D., professor of computer science.
    The researchers compare information overload to other historical shifts in society: open publishing brought about the need to filter out low-quality research from the vast number of accessible publications, the Industrial Revolution gave rise to air pollution, and environmental activists have helped usher in legal and economic changes to help curb pollution. Similarly, so-called “information pollution” or “data smog” must be addressed.
    Through the lens of computer science, there are at least three levels of information overload: “neural and cognitive mechanisms on the individual level… information and decisions at the group level… (and) societal level interactions among individuals, groups, and information providers.” These levels do not operate independently, so the flow of information can be treated as a multilevel network in which interactions among nodes at and across levels can give rise to abrupt changes. The researchers cite teamwork as an example: one team member’s information overload may hinder the whole group’s performance. It is a complex problem.
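    As a rough, purely illustrative rendering of that multilevel-network framing (not the authors’ model), the toy sketch below wires information providers, individuals, and a group together; the node names, capacities, and overload rule are invented for illustration:

        # Toy multilevel network: information providers -> individuals -> a group.
        # All structure, capacities and thresholds are invented for illustration.
        import networkx as nx

        G = nx.DiGraph()
        G.add_nodes_from(["alice", "bob", "carol"], level="individual", capacity=5)
        G.add_nodes_from(["team_a"], level="group")
        G.add_nodes_from(["provider_x", "provider_y"], level="provider")
        G.add_edges_from([
            ("provider_x", "alice"), ("provider_x", "bob"),
            ("provider_y", "bob"), ("provider_y", "carol"),
            ("alice", "team_a"), ("bob", "team_a"), ("carol", "team_a"),
        ])

        def overloaded(node, items_per_source=3):
            """An individual is overloaded when incoming items exceed their capacity."""
            incoming = G.in_degree(node) * items_per_source
            return incoming > G.nodes[node].get("capacity", float("inf"))

        # One overloaded member is enough to hinder the whole group's performance.
        team_ok = not any(overloaded(n) for n in G.predecessors("team_a"))
        print("team_a performing normally:", team_ok)   # False: bob is overloaded
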
    “We are calling for action in science, education, and legislation,” said Szymanski. “We need further interdisciplinary research on information overload. Information ecology must be taught in school. We also need to start the conversation on legislative possibilities, akin to the Clean Air Act in the U.K. decades ago.”
    “Information overload can have severe implications,” said Curt Breneman, Ph.D., dean of Rensselaer’s School of Science. “It begins by eroding our emotional health, job performance, and satisfaction, subsequently influencing the actions of groups and ultimately, entire societies. I hope that Dr. Szymanski’s letter, written with colleagues from across the world, will raise public awareness of the problem and enable solutions to be studied and implemented.”
    Szymanski was joined in authoring the letter by Janusz A. Hołyst of Warsaw University of Technology, the principal investigator of the E.U. grant; Philipp Mayr of the Leibniz Institute for the Social Sciences; Michael Thelwall of the University of Sheffield; Ingo Frommholz of the University of Wolverhampton; Shlomo Havlin and Alon Sela of Bar-Ilan University; Yoed N. Kenett of Technion — Israel Institute of Technology; Denis Helic of Modul University Vienna; Aljoša Rehar and Sebastijan R. Maček of the Slovenian Press Agency; Przemysław Kazienko and Tomasz Kajdanowicz of Wroclaw University of Science and Technology; Przemysław Biecek of Warsaw University of Technology and the University of Warsaw; and Julian Sienkiewicz of Warsaw University of Technology.

  • Advanced army robots more likely to be blamed for deaths

    Advanced killer robots are more likely to be blamed for civilian deaths than military machines, new research has revealed.
    The University of Essex study shows that high-tech bots will be held more responsible for fatalities in identical incidents.
    Led by the Department of Psychology’s Dr Rael Dawtry, the study highlights the impact of autonomy and agency.
    It showed that people perceive robots to be more culpable if they are described in a more advanced way.
    It is hoped the study — published in The Journal of Experimental Social Psychology — will help influence lawmakers as technology advances.
    Dr Dawtry said: “As robots are becoming more sophisticated, they are performing a wider range of tasks with less human involvement.
    “Some tasks, such as autonomous driving or military uses of robots, pose a risk to people’s safety, which raises questions about how — and where — responsibility will be assigned when people are harmed by autonomous robots.

    “This is an important, emerging issue for law and policy makers to grapple with, for example around the use of autonomous weapons and human rights.
    “Our research contributes to these debates by examining how ordinary people explain robots’ harmful behaviour and showing that the same processes underlying how blame is assigned to humans also lead people to assign blame to robots.”
    As part of the study, Dr Dawtry presented different scenarios to more than 400 people.
    One saw them judge whether an armed humanoid robot was responsible for the death of a teenage girl.
    During a raid on a terror compound, its machine guns “discharged” and fatally hit the civilian.
    When reviewing the incident, the participants blamed a robot more when it was described in more sophisticated terms despite the outcomes being the same.

    Other studies showed that simply labelling a variety of devices ‘autonomous robots’ led people to hold them more accountable than when they were labelled ‘machines’.
    Dr Dawtry added: “These findings show that how robots’ autonomy is perceived — and in turn, how blameworthy robots are — is influenced, in a very subtle way, by how they are described.
    “For example, we found that simply labelling relatively simple machines, such as those used in factories, as ‘autonomous robots’, led people to perceive them as agentic and blameworthy, compared to when they were labelled ‘machines’.
    “One implication of our findings is that, as robots become more objectively sophisticated, or are simply made to appear so, they are more likely to be blamed.”

  • Alzheimer’s drug fermented with help from AI and bacteria moves closer to reality

    Galantamine is a common medication used by people with Alzheimer’s disease and other forms of dementia around the world to treat their symptoms. Unfortunately, synthesizing the active compounds in a lab at the scale needed isn’t commercially viable. The active ingredient is extracted from daffodils through a time-consuming process, and unpredictable factors, such as weather and crop yields, can affect supply and price of the drug.
    Now, researchers at The University of Texas at Austin have developed tools — including an artificial intelligence system and glowing biosensors — that could one day harness microbes to do all the work instead.
    In a paper in Nature Communications, researchers outline a process using genetically modified bacteria to create a chemical precursor of galantamine as a byproduct of the microbe’s normal cellular metabolism. Essentially, the bacteria are programmed to convert food into medicinal compounds.
    “The goal is to eventually ferment medicines like this in large quantities,” said Andrew Ellington, a professor of molecular biosciences and author of the study. “This method creates a reliable supply that is much less expensive to produce. It doesn’t have a growing season, and it can’t be impacted by drought or floods.”
    Danny Diaz, a postdoctoral fellow with the Deep Proteins research group in UT’s Institute for Foundations of Machine Learning (IFML), developed an AI system called MutComputeX that is key to the process. It identifies how to mutate proteins inside the bacteria to improve their efficiency and operating temperature in order to maximize production of a needed medicinal chemical.
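    The article does not describe MutComputeX’s interface, so the following is only a generic sketch of the kind of mutation-ranking loop such a system enables; the scoring function and protein sequence are hypothetical placeholders, not the real tool:

        # Generic mutation-ranking sketch; NOT the MutComputeX API.
        # score_mutation is a hypothetical stand-in for a learned structure-based model.
        from itertools import product

        AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

        def score_mutation(sequence: str, position: int, new_residue: str) -> float:
            """Hypothetical score: higher = mutation predicted to improve the enzyme."""
            # Placeholder heuristic so the sketch runs; a real model would be learned.
            return (position % 7) / 7.0 + (0.1 if new_residue in "ILVF" else 0.0)

        def top_mutations(sequence: str, k: int = 5):
            """Enumerate single-point mutants and return the k highest-scoring ones."""
            candidates = [
                (pos, sequence[pos], aa, score_mutation(sequence, pos, aa))
                for pos, aa in product(range(len(sequence)), AMINO_ACIDS)
                if aa != sequence[pos]
            ]
            return sorted(candidates, key=lambda c: c[-1], reverse=True)[:k]

        print(top_mutations("MKTAYIAKQR"))   # hypothetical 10-residue toy sequence
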
    “This system helped identify mutations that would make the bacteria more efficient at producing the target molecule,” Diaz said. “In some cases, it was up to three times as efficient as the natural system found in daffodils.”
    The process of harnessing microbes to produce useful byproducts is nothing new. Brewers use yeast to make alcohol, and bacteria help create cheese and yogurt. Microbial fermentation is currently used to make certain types of insulin for diabetes treatment, hormones and recombinant proteins used in several drugs such as autoimmune treatments, and even vaccines. But applying AI in the process is relatively new and expands what is possible with microbial fermentation.

    The research team genetically modified E. coli to produce 4′-O-methylnorbelladine, a chemical building block of galantamine. The complex molecule belongs to a family of compounds extracted from daffodils that have medicinal uses in treating conditions such as cancer, fungal infections and viral infections, but using microbial fermentation to create a chemical in this family is new.
    The scientists also created a fluorescent biosensor to quickly detect and analyze which bacteria were producing the desired chemicals and how much. When the biosensor, a specially created protein, comes into contact with the chemical researchers wanted to create, it glows green.
    “The biosensor allows us to test and analyze samples in seconds when it used to take something like five minutes each,” said Simon d’Oelsnitz, a postdoctoral researcher formerly at UT Austin and now at Harvard University, the first author of the paper. “And the machine learning program allows us to easily narrow candidates from tens of thousands to tens. Put together, these are really powerful tools.”
    Wantae Kim, Daniel Acosta, Tyler Dangerfield, Mason Schechter, James Howard, Hannah Do, James Loy, Hal Alper and Y. Jessie Zhang of UT and Matthew Minus of Prairie View A&M University were also authors of the paper. The research was supported by the National Institute of Standards and Technology, the Air Force Office of Scientific Research and the National Institutes of Health, and the National Science Foundation supports IFML. Computing resources were provided by Advanced Micro Devices.
    Those involved in this research have submitted required financial disclosure forms with the University, and Ellington, Diaz and d’Oelsnitz have filed a patent application on materials described in this text. Diaz and d’Oelsnitz are each involved with startups related to this research.

  • AI for astrophysics: Algorithms help chart the origins of heavy elements

    The origin of heavy elements in our universe is theorized to be the result of neutron star collisions, which produce conditions hot and dense enough for free neutrons to merge with atomic nuclei and form new elements in a split-second window of time. Testing this theory and answering other astrophysical questions requires predictions for a vast range of masses of atomic nuclei. Los Alamos National Laboratory scientists are front and center in using machine learning algorithms (an application of artificial intelligence) to successfully model the atomic masses of the entire nuclide chart — the combination of all possible protons and neutrons that defines elements and their isotopes.
    “Many thousands of atomic nuclei that have yet to be measured may exist in nature,” said Matthew Mumpower, a theoretical physicist and co-author on several recent papers detailing atomic masses research. “Machine learning algorithms are very powerful, as they can find complex correlations in data, a result that theoretical nuclear physics models struggle to efficiently produce. These correlations can provide information to scientists about ‘missing physics’ and can in turn be used to strengthen modern nuclear models of atomic masses.”
    Simulating the rapid neutron-capture process
    Most recently, Mumpower and his colleagues, including former Los Alamos summer student Mengke Li and postdoc Trevor Sprouse, authored a paper in Physics Letters B that described simulating an important astrophysical process with a physics-based machine learning mass model. The r process, or rapid neutron-capture process, is the astrophysical process that occurs in extreme environments, like those produced by neutron star collisions. Heavy elements may result from this “nucleosynthesis”; in fact, half of the heavy isotopes up to bismuth and all of thorium and uranium in the universe may have been created by the r process.
    But modeling the r process requires theoretical predictions of atomic masses that are currently beyond experimental reach. The team’s physics-informed machine-learning approach trains a model on a random selection of masses from the Atomic Mass Evaluation, a large database of measured masses. The researchers then use the model’s predicted masses to simulate the r process. This allowed the team to simulate r-process nucleosynthesis with machine-learned mass predictions for the first time — a significant feat, as machine learning predictions generally break down when extrapolating.
    “We’ve shown that machine learning atomic masses can open the door to predictions beyond where we have experimental data,” Mumpower said. “The critical piece is that we tell the model to obey the laws of physics. By doing so, we enable physics-based extrapolations. Our results are on par with or outperform contemporary theoretical models and can be immediately updated when new data is available.”
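    The article does not spell out the model’s architecture, but the general idea of a physics baseline plus a learned correction can be sketched as follows; the liquid-drop formula here is a standard stand-in for the physics constraint, and the data columns are hypothetical, so this is not the Los Alamos model itself:

        # Sketch of "physics baseline + learned residual" mass modeling.
        # Assumes a DataFrame `ame` with hypothetical columns Z, N and
        # binding_energy_MeV drawn from the Atomic Mass Evaluation.
        import numpy as np
        import pandas as pd
        from sklearn.ensemble import GradientBoostingRegressor

        def liquid_drop_binding(Z, N):
            """Bethe-Weizsacker liquid-drop binding energy (MeV), the physics baseline."""
            A = Z + N
            aV, aS, aC, aA, aP = 15.75, 17.8, 0.711, 23.7, 11.18
            pairing = np.where((Z % 2 == 0) & (N % 2 == 0), 1,
                      np.where((Z % 2 == 1) & (N % 2 == 1), -1, 0))
            return (aV * A - aS * A**(2 / 3) - aC * Z * (Z - 1) / A**(1 / 3)
                    - aA * (N - Z)**2 / A + pairing * aP / np.sqrt(A))

        def fit_residual_model(ame: pd.DataFrame) -> GradientBoostingRegressor:
            """Learn the difference between measured binding energies and the baseline."""
            X = ame[["Z", "N"]].to_numpy()
            residual = ame["binding_energy_MeV"] - liquid_drop_binding(ame["Z"], ame["N"])
            return GradientBoostingRegressor().fit(X, residual)

        def predict_binding(model, Z, N):
            """Physics baseline plus learned correction; extrapolation stays physics-anchored."""
            return liquid_drop_binding(Z, N) + model.predict(np.column_stack([Z, N]))
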
    Investigating nuclear structures
    The r-process simulations complement the research team’s application of machine learning to related investigations of nuclear structure. In a recent article in Physical Review C selected as an Editor’s Suggestion, the team used machine learning algorithms to reproduce nuclear binding energies with quantified uncertainties; that is, they were able to ascertain the energy needed to separate an atomic nucleus into protons and neutrons, along with an associated error bar for each prediction. The algorithm thus provides information that would otherwise take significant computational time and resources to obtain from current nuclear modeling.
    In related work, the team used their machine learning model to combine precision experimental data with theoretical knowledge. These results have motivated some of the first experimental campaigns at the new Facility for Rare Isotope Beams, which seeks to expand the known region of the nuclear chart and uncover the origin of the heavy elements.

  • Robot ANYmal can do parkour and walk across rubble

    ANYmal has for some time had no problem coping with the stony terrain of Swiss hiking trails. Now researchers at ETH Zurich have taught this quadrupedal robot some new skills: it is proving rather adept at parkour, an increasingly popular sport based on using athletic manoeuvres to smoothly negotiate obstacles in an urban environment. ANYmal is also proficient at dealing with the tricky terrain commonly found on building sites or in disaster areas.
    To teach ANYmal these new skills, two teams, both from the group led by ETH Professor Marco Hutter of the Department of Mechanical and Process Engineering, followed different approaches.
    Exhausting the mechanical options
    Working in one of the teams is ETH doctoral student Nikita Rudin, who does parkour in his free time. “Before the project started, several of my researcher colleagues thought that legged robots had already reached the limits of their development potential,” he says, “but I had a different opinion. In fact, I was sure that a lot more could be done with the mechanics of legged robots.”
    With his own parkour experience in mind, Rudin set out to further push the boundaries of what ANYmal could do. And he succeeded, by using machine learning to teach the quadrupedal robot new skills. ANYmal can now scale obstacles and perform dynamic manoeuvres to jump back down from them.
    In the process, ANYmal learned like a child would — through trial and error. Now, when presented with an obstacle, ANYmal uses its camera and artificial neural network to determine what kind of impediment it’s dealing with. It then performs movements that seem likely to succeed based on its previous training.
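    In code terms, that perceive-classify-act loop might look something like the sketch below; the obstacle classes, skills, and stand-in classifier are invented for illustration and are not ETH Zurich’s system:

        # Conceptual perceive -> classify -> act loop; not the ETH Zurich code.
        # Obstacle classes, skills and the stand-in classifier are invented.
        import numpy as np

        OBSTACLE_CLASSES = ["box", "gap", "platform"]
        SKILLS = {"box": "climb", "gap": "jump", "platform": "vault"}

        def classify_obstacle(depth_image: np.ndarray) -> str:
            """Stand-in for the onboard network mapping a camera frame to an obstacle type."""
            # Placeholder heuristic so the sketch runs; the real system uses a trained network.
            idx = int(depth_image.mean() * len(OBSTACLE_CLASSES)) % len(OBSTACLE_CLASSES)
            return OBSTACLE_CLASSES[idx]

        def choose_skill(depth_image: np.ndarray) -> str:
            """Pick the manoeuvre most likely to succeed for the detected obstacle."""
            return SKILLS[classify_obstacle(depth_image)]

        frame = np.random.rand(64, 64)               # fake depth image
        print("selected manoeuvre:", choose_skill(frame))
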
    Is that the full extent of what’s technically possible? Rudin suggests that this is largely the case for each individual new skill, but he adds that there is still plenty of room for improvement, such as allowing the robot to move beyond solving predefined problems and instead asking it to negotiate difficult terrain like rubble-strewn disaster areas.
    Combining new and traditional technologies
    Getting ANYmal ready for precisely that kind of application was the goal of the other project, conducted by Rudin’s colleague and fellow ETH doctoral student Fabian Jenelten. But rather than relying on machine learning alone, Jenelten combined it with a tried-and-tested approach used in control engineering known as model-based control. This provides an easier way of teaching the robot accurate manoeuvres, such as how to recognise and get past gaps and recesses in piles of rubble. In turn, machine learning helps the robot master movement patterns that it can then flexibly apply in unexpected situations. “Combining both approaches lets us get the most out of ANYmal,” Jenelten says.
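    A hedged sketch of how such a combination could be wired together is shown below; the proportional controller, the stand-in policy, and the uncertainty-based blending weight are illustrative assumptions, not the actual ANYmal controller:

        # Sketch of blending model-based control with a learned policy; not the
        # actual ANYmal controller. Interfaces and gains are invented for illustration.
        import numpy as np

        def model_based_action(state: np.ndarray, target: np.ndarray) -> np.ndarray:
            """Precise tracking controller, e.g. for stepping over a known gap."""
            kp = 0.8                                  # proportional gain (illustrative)
            return kp * (target - state)

        def learned_action(state: np.ndarray) -> np.ndarray:
            """Stand-in for a trained policy network handling unexpected situations."""
            return np.tanh(0.1 * state)               # placeholder so the sketch runs

        def hybrid_action(state, target, terrain_uncertainty: float) -> np.ndarray:
            """Lean on the model where terrain is well understood, on learning otherwise."""
            w = float(np.clip(terrain_uncertainty, 0.0, 1.0))
            return (1 - w) * model_based_action(state, target) + w * learned_action(state)

        state, target = np.zeros(12), np.ones(12)     # fake 12-dimensional robot state
        print(hybrid_action(state, target, terrain_uncertainty=0.3)[:3])
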
    As a result, the quadrupedal robot is now better at gaining a sure footing on slippery surfaces or unstable boulders. ANYmal will soon also be deployed on building sites or anywhere that is too dangerous for people — for instance, to inspect a collapsed house in a disaster area.

  • Scientists use novel technique to create new energy-efficient microelectronic device

    Breakthrough could help lead to the development of new low-power semiconductors or quantum devices.
    As the integrated circuits that power our electronic devices get more powerful, they are also getting smaller. This trend in microelectronics has only accelerated in recent years as scientists try to fit ever more semiconducting components on a chip.
    Microelectronics face a key challenge because of their small size. To avoid overheating, microelectronics need to consume only a fraction of the electricity of conventional electronics while still operating at peak performance.
    Researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have achieved a breakthrough that could allow for a new kind of microelectronic material to do just that. In a new study published in Advanced Materials, the Argonne team proposed a new kind of “redox gating” technique that can control the movement of electrons in and out of a semiconducting material.
    “Redox” refers to a chemical reaction that causes a transfer of electrons. Microelectronic devices typically rely on an electric “field effect” to control the flow of electrons to operate. In the experiment, the scientists designed a device that could regulate the flow of electrons from one end to another by applying a voltage — essentially, a kind of pressure that pushes electricity — across a material that acted as a kind of electron gate. When the voltage reached a certain threshold, roughly half of a volt, the material would begin to inject electrons through the gate from a source redox material into a channel material.
    By using the voltage to modify the flow of electrons, the semiconducting device could act like a transistor, switching between more conducting and more insulating states.
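    As a toy illustration of that switching behaviour (and not the device physics reported in the paper), the sketch below models conductance as a smooth switch around the roughly half-volt threshold mentioned above; the steepness value is invented:

        # Toy model of a voltage-gated switch: conductance rises sharply once the gate
        # voltage crosses a ~0.5 V threshold. Numbers are illustrative, not from the paper.
        import numpy as np

        V_THRESHOLD = 0.5      # volts, roughly the threshold described in the article
        STEEPNESS = 40.0       # how abruptly the device switches (invented value)

        def relative_conductance(v_gate: float) -> float:
            """0 = insulating state, 1 = fully conducting state."""
            return 1.0 / (1.0 + np.exp(-STEEPNESS * (v_gate - V_THRESHOLD)))

        for v in (0.0, 0.4, 0.5, 0.6, 0.8):
            print(f"gate = {v:.1f} V -> relative conductance ~ {relative_conductance(v):.3f}")
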
    “The new redox gating strategy allows us to modulate the electron flow by an enormous amount even at low voltages, offering much greater power efficiency,” said Argonne materials scientist Dillon Fong, an author of the study. “This also prevents damage to the system. We see that these materials can be cycled repeatedly with almost no degradation in performance.”
    “Controlling the electronic properties of a material also has significant advantages for scientists seeking emergent properties beyond conventional devices,” said Argonne materials scientist Wei Chen, one of the study’s co-corresponding authors.

    “The subvolt regime, which is where this material operates, is of enormous interest to researchers looking to make circuits that act similarly to the human brain, which also operates with great energy efficiency,” he said.
    The redox gating phenomenon could also be useful for creating new quantum materials whose phases could be manipulated at low power, said Argonne physicist Hua Zhou, another co-corresponding author of the study. Moreover, the redox gating technique may extend across versatile functional semiconductors and low-dimensional quantum materials composed of sustainable elements.
    Work done at Argonne’s Advanced Photon Source, a DOE Office of Science user facility, helped characterize the redox gating behavior.
    Additionally, Argonne’s Center for Nanoscale Materials, also a DOE Office of Science user facility, was used for materials synthesis, device fabrication and electrical measurements of the device.
    A paper based on the study, “Redox Gating for Colossal Carrier Modulation and Unique Phase Control,” appeared in the Jan. 6, 2024 issue of Advanced Materials. In addition to Fong, Chen and Zhou, contributing authors include Le Zhang, Changjiang Liu, Hui Cao, Andrew Erwin, Anand Bhattacharya, Luping Yu, Liliana Stan, Chongwen Zou and Matthew V. Tirrell.
    The work was funded by DOE’s Office of Science, Office of Basic Energy Sciences, and Argonne’s laboratory-directed research and development program.

  • Supply chain disruptions will further exacerbate economic losses from climate change

    Global GDP loss from climate change will increase exponentially the warmer the planet gets when its cascading impact on global supply chains is factored in, finds a new study led by UCL researchers.
    The study, published in Nature, is the first to chart the “indirect economic losses” that climate change inflicts on global supply chains, losses that will reach regions otherwise less affected by projected warming.
    These previously unquantified disruptions in supply chains will further exacerbate projected economic losses due to climate change, bringing a projected net economic loss of between $3.75 trillion and $24.7 trillion in adjusted 2020 dollars by 2060, depending on how much carbon dioxide gets emitted.
    Senior author Professor Dabo Guan (UCL Bartlett School of Sustainable Construction) said: “These projected economic impacts are staggering. These losses get worse the more the planet warms, and when you factor in the effects on global supply chains it shows how everywhere is at economic risk.”
    As the global economy has grown more interconnected, disruptions in one part of the world have knock-on effects elsewhere in the world, sometimes in unexpected ways. Crop failures, labour slowdowns and other economic disruptions in one region can affect the supplies of raw materials flowing to other parts of the world that depend on them, disrupting manufacturing and trade in faraway regions. This is the first study to analyse and quantify the propagation of these disruptions from climate change, as well as their economic impacts.
    The more the Earth warms, the worse off economically it becomes, with compounding damage and economic losses climbing exponentially over time. Climate change disrupts the global economy primarily through health costs from heat exposure, work stoppages when it is too hot to work, and economic disruptions cascading through supply chains.
    The researchers compared expected economic losses across three projected global warming scenarios, called “Shared Socioeconomic Pathways,” based on low, medium and high projected global emissions levels. The best-case scenario would see global temperatures rise by only 1.5 degrees C over preindustrial levels by 2060; the middle track, which most experts believe Earth is on now, would see a rise of around 3 degrees C; and the worst-case scenario would see a rise of 7 degrees C.

    By 2060, projected economic losses will be nearly five times as high under the highest emissions path as under the lowest, with losses getting progressively worse the warmer it gets. By 2060, total GDP losses will amount to 0.8% under 1.5 degrees of warming, 2.0% under 3 degrees of warming and 3.9% under 7 degrees of warming.
    The team calculated that supply chain disruptions also get progressively worse the warmer the climate gets, accounting for a greater and greater proportion of economic losses. By 2060, supply chain losses will amount to 0.1% of total global GDP (13% of the total GDP lost) under 1.5 degrees of warming, 0.5% of total GDP (25% of the total GDP lost) under 3 degrees, and 1.5% of total GDP (38% of the total GDP lost) under 7 degrees.
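    A quick arithmetic check of those shares, using only the percentages quoted above:

        # Supply-chain losses as a share of total GDP losses, from the figures above.
        scenarios = {          # warming level: (supply-chain loss % of GDP, total GDP loss %)
            "1.5 C": (0.1, 0.8),
            "3 C":   (0.5, 2.0),
            "7 C":   (1.5, 3.9),
        }
        for warming, (supply_chain, total) in scenarios.items():
            share = 100 * supply_chain / total
            print(f"{warming}: supply chains ~ {share:.1f}% of total GDP loss")
        # prints 12.5%, 25.0% and 38.5%, consistent with the study's rounded 13%, 25% and 38%
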
    Co-lead author, Dr Daoping Wang of King’s College London, said: “The negative impacts of extreme heat sometimes occur quietly on global supply chains, even escaping our notice altogether. Our developed Disaster Footprint model tracks and visually represents these impacts, underlining the imperative for global collaborative efforts in adapting to extreme heat.”
    For example, although extreme heat events occur more often in low-latitude countries, high-latitude regions, such as Europe or the United States, are also at significant risk. Future extreme heat is likely to cost Europe and the US about 2.2% and about 3.5% of their GDP respectively under the high emission scenario. The UK would lose about 1.5% of its GDP, with chemical products, tourism and electrical equipment industries suffering the greatest losses. Some of these losses originate from supply chain fluctuations caused by extreme heat in countries close to the equator.
    The direct human cost is likewise significant. Even under the lowest path, 2060 will see 24% more days of extreme heatwaves and an additional 590,000 heatwave deaths annually, while under the highest path there would be more than twice as many heatwaves and an expected 1.12 million additional annual heatwave deaths. These impacts will not be evenly distributed around the world, but countries situated near to the equator will bear the brunt of climate change, particularly developing countries.
    Co-lead author, Yida Sun from Tsinghua University said: “Developing countries suffer disproportionate economic losses compared to their carbon emissions. As multiple nodes in developing countries are hit simultaneously, economic damage can spread rapidly through the global value chain.”
    The researchers highlighted two illustrative examples of industries that are part of supply chains at risk from climate change: Indian food production and tourism in the Dominican Republic.

    The Indian food industry is heavily reliant on imports of fats and oils from Indonesia and Malaysia, Brazilian sugar, as well as vegetables, fruits and nuts from Southeast Asia and Africa. These supplier countries are among those most affected by climate change, diminishing India’s access to raw materials, which will diminish its food exports. As a result, the economies of countries reliant on these foods will feel the pinch of diminished supply and higher prices.
    The Dominican Republic is expected to see a decline in tourism as its climate grows too warm to attract vacationers. In a nation whose economy is heavily reliant on tourism, this slowdown will hurt industries including manufacturing, construction, insurance, financial services, and electronic equipment.
    Professor Guan said: “This research is an important reminder that preventing every additional degree of climate change is critical. Understanding which nations and industries are most vulnerable is crucial for devising effective and targeted adaptation strategies.”