More stories

  • Data transfer system connects silicon chips with a hair's-width cable

    Researchers have developed a data transfer system that can transmit information 10 times faster than a USB cable. The new link pairs high-frequency silicon chips with a polymer cable as thin as a strand of hair. The system may one day boost energy efficiency in data centers and lighten the loads of electronics-rich spacecraft.
    The research was presented at this month’s IEEE International Solid-State Circuits Conference. The lead author is Jack Holloway ’03, MNG ’04, who completed his PhD in MIT’s Department of Electrical Engineering and Computer Science (EECS) last fall and currently works for Raytheon. Co-authors include Ruonan Han, associate professor and Holloway’s PhD adviser in EECS, and Georgios Dogiamis, a senior researcher at Intel.
    The need for snappy data exchange is clear, especially in an era of remote work. “There’s an explosion in the amount of information being shared between computer chips — cloud computing, the internet, big data. And a lot of this happens over conventional copper wire,” says Holloway. But copper wires, like those found in USB or HDMI cables, are power-hungry — especially when dealing with heavy data loads. “There’s a fundamental tradeoff between the amount of energy burned and the rate of information exchanged.” Despite a growing demand for fast data transmission (beyond 100 gigabits per second) through conduits longer than a meter, Holloway says the typical solution has been “increasingly bulky and costly” copper cables.
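    One way to make that tradeoff concrete is the energy-per-bit figure of merit: a link's power draw divided by its data rate. The sketch below is illustrative arithmetic only; the power budget and data rates are hypothetical, not measurements from the study.

```python
# Energy-per-bit figure of merit for a wired link (illustrative numbers only;
# the 5 W budget and the data rates below are hypothetical, not from the study).

def energy_per_bit_pj(power_watts: float, data_rate_gbps: float) -> float:
    """Return the energy spent per transmitted bit, in picojoules."""
    joules_per_bit = power_watts / (data_rate_gbps * 1e9)
    return joules_per_bit * 1e12  # J -> pJ

# At a fixed 5 W budget, raising the rate from 10 Gb/s to 100 Gb/s shrinks the
# allowable energy per bit from 500 pJ to 50 pJ -- the tradeoff Holloway describes.
for rate in (10.0, 100.0):
    print(f"{rate:>5.0f} Gb/s -> {energy_per_bit_pj(5.0, rate):.0f} pJ/bit")
```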
    One alternative to copper wire is fiber-optic cable, though that has its own problems. Whereas copper wires use electrical signaling, fiber-optics use photons. That allows fiber-optics to transmit data quickly and with little energy dissipation. But silicon computer chips generally don’t play well with photons, making interconnections between fiber-optic cables and computers a challenge. “There’s currently no way to efficiently generate, amplify, or detect photons in silicon,” says Holloway. “There are all kinds of expensive and complex integration schemes, but from an economics perspective, it’s not a great solution.” So, the researchers developed their own.
    The team’s new link draws on the benefits of both copper and fiber-optic conduits, while ditching their drawbacks. “It’s a great example of a complementary solution,” says Dogiamis. Their conduit is made of plastic polymer, so it’s lighter and potentially cheaper to manufacture than traditional copper cables. But when the polymer link is operated with sub-terahertz electromagnetic signals, it’s far more energy-efficient than copper in transmitting a high data load. The new link’s efficiency rivals that of fiber optics, but it has a key advantage: “It’s compatible directly with silicon chips, without any special manufacturing,” says Holloway.
    The team engineered such low-cost chips to pair with the polymer conduit. Typically, silicon chips struggle to operate at sub-terahertz frequencies. Yet the team’s new chips generate those high-frequency signals with enough power to transmit data directly into the conduit. That clean connection from the silicon chips to the conduit means the overall system can be manufactured with standard, cost-effective methods, the researchers say.
    The new link also beats out copper and fiber optic in terms of size. “The cross-sectional area of our cable is 0.4 millimeters by a quarter millimeter,” says Han. “So, it’s super tiny, like a strand of hair.” Despite its slim size, it can carry a hefty load of data, since it sends signals over three different parallel channels, separated by frequency. The link’s total bandwidth is 105 gigabits per second, nearly an order of magnitude faster than a copper-based USB cable. Dogiamis says the cable could “address the bandwidth challenges as we see this megatrend toward more and more data.”
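    The headline bandwidth figure is simple channel aggregation. A minimal sketch of the arithmetic, assuming the three frequency channels carry equal shares and taking 10 Gb/s as a representative copper USB rate (both assumptions are for illustration, not figures quoted by the team):

```python
# Aggregate bandwidth of the polymer link versus a copper USB reference.
# The equal per-channel split and the 10 Gb/s USB figure are assumptions
# made here for illustration.

num_channels = 3            # frequency-separated parallel channels
total_gbps = 105.0          # reported total bandwidth

per_channel_gbps = total_gbps / num_channels      # ~35 Gb/s each, if split evenly
usb_reference_gbps = 10.0
speedup = total_gbps / usb_reference_gbps         # ~10x, "nearly an order of magnitude"

print(f"{per_channel_gbps:.0f} Gb/s per channel, {speedup:.1f}x the USB reference")
```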
    In future work, Han hopes to make the polymer conduits even faster by bundling them together. “Then the data rate will be off the charts,” he says. “It could be one terabit per second, still at low cost.”
    The researchers suggest “data-dense” applications, like server farms, could be early adopters of the new links, since they could dramatically cut data centers’ high energy demands. The link could also be a key solution for the aerospace and automotive industries, which place a premium on small, light devices. And one day, the link could replace the consumer electronic cables in homes and offices, thanks to the link’s simplicity and speed. “It’s far less costly than [copper or fiber optic] approaches, with significantly wider bandwidth and lower loss than conventional copper solutions,” says Holloway. “So, high fives all round.”
    This research was funded, in part, by Intel, Raytheon, the Naval Research Laboratory, and the Office of Naval Research.

  • Twin atoms: A source for entangled particles

    Heads or tails? If we toss two coins into the air, the result of one coin toss has nothing to do with the result of the other. Coins are independent objects. In the world of quantum physics, things are different: quantum particles can be entangled, in which case they can no longer be regarded as independent individual objects; they can only be described as one joint system.
    For years, it has been possible to produce entangled photons — pairs of light particles that move in completely different directions but still belong together. Spectacular results have been achieved, for example in the field of quantum teleportation or quantum cryptography. Now, a new method has been developed at TU Wien (Vienna) to produce entangled atom pairs — and not just atoms which are emitted in all directions, but well-defined beams. This was achieved with the help of ultracold atom clouds in electromagnetic traps.
    Entangled particles
    “Quantum entanglement is one of the essential elements of quantum physics,” says Prof. Jörg Schmiedmayer from the Institute of Atomic and Subatomic Physics at TU Wien. “If particles are entangled with each other, then even if you know everything there is to know about the total system, you still cannot say anything at all about one specific particle. Asking about the state of one particular particle makes no sense, only the overall state of the total system is defined.”
    There are different methods of creating quantum entanglement. For example, special crystals can be used to create pairs of entangled photons: a photon with high energy is converted by the crystal into two photons of lower energy — this is called “down conversion.” This allows large numbers of entangled photon pairs to be produced quickly and easily.
    Entangling atoms, however, is much more difficult. Individual atoms can be entangled using complicated laser operations — but then you only get a single pair of atoms. Random processes can also be used to create quantum entanglement: if two particles interact with each other in a suitable way, they can turn out to be entangled afterwards. Molecules can be broken up, creating entangled fragments. But these methods cannot be controlled. “In this case, the particles move in random directions. But when you do experiments, you want to be able to determine exactly where the atoms are moving,” says Jörg Schmiedmayer.

    The twin pair
    Controlled twin pairs could now be produced at TU Wien with a novel trick: a cloud of ultracold atoms is created and held in place by electromagnetic forces on a tiny chip. “We manipulate these atoms so that they do not end up in the state with the lowest possible energy, but in a state of higher energy,” says Schmiedmayer. From this excited state, the atoms then spontaneously return to the ground state with the lowest energy.
    However, the electromagnetic trap is constructed in such a way that this return to the ground state is physically impossible for a single atom — this would violate the conservation of momentum. The atoms can therefore only get transferred to the ground state as pairs and fly away in opposite directions, so that their total momentum remains zero. This creates twin atoms that move exactly in the direction specified by the geometry of the electromagnetic trap on the chip.
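    In outline, the selection rule is just momentum and energy bookkeeping. The relations below are an idealised sketch (free atoms of mass m initially at rest, trap geometry ignored), not the full theory of the experiment:

```latex
% Idealised pair-emission bookkeeping: a single atom at rest cannot de-excite
% into a moving ground-state atom without violating momentum conservation,
% but a pair can recoil back to back.
\begin{align}
  \vec{p}_1 + \vec{p}_2 &= 0 \quad\Rightarrow\quad \vec{p}_2 = -\vec{p}_1, \\
  2\,\Delta E &\approx \frac{|\vec{p}_1|^2}{2m} + \frac{|\vec{p}_2|^2}{2m}
              = \frac{|\vec{p}_1|^2}{m},
\end{align}
% where \Delta E is the energy released per atom; the pair shares it equally
% and flies apart along the direction set by the trap.
```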
    The double-slit experiment
    The trap consists of two elongated, parallel waveguides. The pair of twin atoms may have been created in the left or in the right waveguide — or, as quantum physics allows, in both simultaneously. “It’s like the well-known double-slit experiment, where you shoot a particle at a wall with two slits,” says Jörg Schmiedmayer. “The particle can pass through both the left and the right slit at the same time, behind which it interferes with itself, and this creates wave patterns that can be measured.”
    The same principle can be used to prove that the twin atoms are indeed entangled particles: only if you measure the entire system — i.e. both atoms at the same time — can you detect the wave-like superpositions typical of quantum phenomena. If, on the other hand, you restrict yourself to a single particle, the wave superposition disappears completely.
    “This shows us that in this case it makes no sense to look at the particles individually,” explains Jörg Schmiedmayer. “In the double-slit experiment, the superpositions disappear as soon as you measure whether the particle goes through the left or the right slit. As soon as this information is available, the quantum superposition is destroyed. It is very similar here: if the atoms are entangled and you only measure one of them, you could theoretically still use the other atom to measure whether they both originated in the left or the right part of the trap. Therefore, the quantum superpositions are destroyed.”
    Now that it has been proven that ultracold atom clouds can indeed be used to reliably produce entangled twin atoms in this way, further quantum experiments are to be carried out with these atom pairs — similar to those that have already been possible with photon pairs.

  • Scientists begin building highly accurate digital twin of our planet

    To become climate neutral by 2050, the European Union launched two ambitious programmes: the “Green Deal” and the “Digital Strategy.” As a key component of their successful implementation, climate scientists and computer scientists launched the “Destination Earth” initiative, which will start in mid-2021 and is expected to run for up to ten years. During this period, a highly accurate digital model of the Earth, a digital twin of the Earth, is to be created to map climate development and extreme events as accurately as possible in space and time.
    Observational data will be continuously incorporated into the digital twin in order to make the digital Earth model more accurate at monitoring the evolution of the Earth system and predicting possible future trajectories. But in addition to the observation data conventionally used for weather and climate simulations, the researchers also want to integrate new data on relevant human activities into the model. The new “Earth system model” will represent virtually all processes on the Earth’s surface as realistically as possible, including the influence of humans on water, food and energy management, and the processes in the physical Earth system.
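    "Continuously incorporated" here is essentially data assimilation: blending a model forecast with new observations, weighted by how much each is trusted. The scalar sketch below is a generic textbook illustration of that idea, not the Destination Earth system; all numbers are made up.

```python
# Scalar analogue of data assimilation: blend a model forecast with a new
# observation, weighting each by the inverse of its assumed error variance.
# Generic textbook illustration; values are made up, not Destination Earth data.

model_forecast = 14.2      # model's predicted value (e.g., surface temperature, deg C)
observation    = 15.0      # newly measured value of the same quantity
forecast_var   = 1.0       # assumed forecast error variance
obs_var        = 0.5       # assumed observation error variance

gain = forecast_var / (forecast_var + obs_var)          # how far to move toward the observation
analysis = model_forecast + gain * (observation - model_forecast)

print(f"analysis = {analysis:.2f} deg C (gain = {gain:.2f})")
```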
    Information system for decision-making
    The digital twin of the Earth is intended to be an information system that develops and tests scenarios that show more sustainable development and thus better inform policies. “If you are planning a two-metre-high dike in the Netherlands, for example, I can run through the data in my digital twin and check whether the dike will in all likelihood still protect against expected extreme events in 2050,” says Peter Bauer, deputy director for Research at the European Centre for Medium-Range Weather Forecasts (ECMWF) and co-initiator of Destination Earth. The digital twin will also be used for strategic planning of fresh water and food supplies or wind farms and solar plants.
    The driving forces behind Destination Earth are the ECMWF, the European Space Agency (ESA), and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). Together with other scientists, Bauer is driving the climate science and meteorological aspects of the Earth’s digital twin, but they also rely on the know-how of computer scientists from ETH Zurich and the Swiss National Supercomputing Centre (CSCS), namely ETH professors Torsten Hoefler, from the Institute for High Performance Computing Systems, and Thomas Schulthess, Director of CSCS.
    In order to take this big step in the digital revolution, Bauer emphasises the need for earth sciences to be married to the computer sciences. In a recent publication in Nature Computational Science, the team of researchers from the earth and computer sciences discusses which concrete measures they would like to use to advance this “digital revolution of earth-system sciences,” where they see the challenges and what possible solutions can be found.

    Weather and climate models as a basis
    In their paper, the researchers look back on the steady development of weather models since the 1940s, a success story that took place quietly. Meteorologists pioneered, so to speak, simulations of physical processes on the world’s largest computers. As a physicist and computer scientist, CSCS’s Schulthess is therefore convinced that today’s weather and climate models are ideally suited to showing many more scientific disciplines completely new ways to use supercomputers efficiently.
    In the past, weather and climate modelling used different approaches to simulate the Earth system. Climate models represent a very broad set of physical processes, but they typically neglect the small-scale processes that are essential for the more precise weather forecasts, which in turn focus on a smaller number of processes. The digital twin will bring both areas together and enable high-resolution simulations that depict the complex processes of the entire Earth system. To achieve this, however, the codes of the simulation programmes must be adapted to new technologies that promise much greater computing power.
    With the computers and algorithms available today, the highly complex simulations can hardly be carried out at the planned extremely high resolution of one kilometre, because for decades code development has stagnated from a computer science perspective. Climate research benefited from being able to gain higher performance from new generations of processors without having to fundamentally change its programmes. This free performance gain with each new processor generation stopped about 10 years ago. As a result, today’s programmes can often only utilise 5 per cent of the peak performance of conventional processors (CPUs).
    To achieve the necessary improvements, the authors emphasise the need for co-design, i.e. developing hardware and algorithms together and simultaneously, as CSCS has successfully demonstrated over the last ten years. They suggest paying particular attention to generic data structures, optimised spatial discretisation of the grid to be calculated, and optimisation of the time step lengths. The scientists further propose separating the codes for solving the scientific problem from the codes that optimally perform the computation on the respective system architecture. This more flexible programme structure would allow a faster and more efficient switch to future architectures.
    Profiting from artificial intelligence
    The authors also see great potential in artificial intelligence (AI). It can be used, for example, for data assimilation, the processing of observation data, the representation of uncertain physical processes in the models, and data compression. AI thus makes it possible to speed up the simulations and filter out the most important information from large amounts of data. Additionally, the researchers expect that the use of machine learning will not only make the calculations more efficient but also help describe the physical processes more accurately.
    The scientists see their strategy paper as a starting point on the path to a digital twin of the Earth. Among the computer architectures available today and those expected in the near future, supercomputers based on graphics processing units (GPUs) appear to be the most promising option. The researchers estimate that operating a digital twin at full scale would require a system with about 20,000 GPUs, consuming an estimated 20 MW of power. For both economic and ecological reasons, such a computer should be operated at a location where CO2-neutral generated electricity is available in sufficient quantities.
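    As a back-of-the-envelope check of those two estimates (only the 20,000-GPU and 20 MW figures come from the article; the derived numbers are illustrative arithmetic):

```python
# Back-of-the-envelope check of the full-scale digital-twin estimates.
# Only the 20,000 GPUs and 20 MW come from the article; the derived numbers
# are illustrative arithmetic.

num_gpus = 20_000
total_power_w = 20e6

watts_per_gpu = total_power_w / num_gpus              # ~1 kW per GPU at system level
gwh_per_year = total_power_w * 24 * 365 / 1e9         # energy if run continuously

print(f"{watts_per_gpu:.0f} W per GPU (including system overhead), "
      f"~{gwh_per_year:.0f} GWh per year of continuous operation")
```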

    Story Source:
    Materials provided by ETH Zurich. Original written by Simone Ulmer. Note: Content may be edited for style and length.

  • Impact of online communities

    The Governance Lab (The GovLab) at the NYU Tandon School of Engineering released a report, “The Power of Virtual Communities,” which examines the role online groups play in creating opportunities for people to build new kinds of meaningful communities they often could not form in real space.
    This first-of-its-kind research was built on interviews with 50 Facebook community leaders in 17 countries, 26 global experts from academia and industry, unique access to Facebook’s underlying research, and an original global survey conducted by YouGov of 15,000 people in 15 countries who are currently members of online and in-person communities. In 11 of those countries, a majority of respondents said that the most meaningful communities to which they belong are primarily online.
    “Around the world, people who are otherwise voiceless in physical space are becoming powerful leaders of groups that confer a true sense of meaning and belonging for their members,” said Beth Simone Noveck, director of The GovLab. “This brief report, which tells the stories of several of those leaders and how they govern global communities, is, we hope, the beginning of greater and much needed study of online groups and their impact on social and political life.”
    Many of these Facebook groups cut across traditional social groupings and bring together people around a shared trait or interest:
    Female IN (FIN) was created as a safe space for women in the Nigerian diaspora to discuss and seek support for problems associated with such challenges as relationship struggles, health issues, abuse, grief and loss. It grew by word of mouth into a 1.8 million-person community with members in more than 100 countries.
    Surviving Hijab encourages its 920,000 female members to take up or continue wearing the Muslim head covering in the face of political and social criticism.
    Blind PenPals enables its 7,000 blind and visually impaired members to share stories and advice.
    Canterbury Residents Group acts as a public square in the British city of Canterbury and has 38,000 members, about the same size as the city’s population.
    Subtle Asian Traits, which began as a modest initiative among nine young Australians of Chinese background to share funny memes about their Asian heritage, has expanded to a group of 1.82 million people who discuss and share the experience of growing up Asian in mostly majority-White societies.
    The GovLab’s report findings note that:
    Membership in online communities confers a strong sense of community, the lack of physical proximity notwithstanding.
    Online groups are a still fluid form of human organization that in many cases attract members and leaders who are marginalized in the physical societies they inhabit, and who use the platform to build new kinds of communities that would be difficult to form otherwise.
    Many of these groups have counter-cultural norms and are what political scientists might call “cross-cleavage” communities. These groups cut across traditional social groupings, and bring together people normally divided by geography around a shared trait or interest.
    The flexible affordances of online platforms have enabled new kinds of leaders to emerge in these groups with unique skills in moderating often divisive dialogues, sometimes among millions of members.
    Most groups are run as a labor of love: many leaders are neither trained nor paid, the rules that govern their internal operations are often uncodified, and the hosting platform — in this case Facebook — holds significant power over their operations and future.
    These groups, some of which have huge memberships, remain emergent and largely unrecognized: they are outside traditional power structures, institutions and forms of governance.
    More research is needed to understand whether and how these groups will operate as genuine communities over the long term, especially given the tensions that derive from conducting public life on a private platform such as Facebook, and how such groups and their leaders can be supported to ensure they provide maximum voice, participation and benefit to their members.
    Further, results from the YouGov survey and the interviews with group leaders indicated that the three most essential traits and behaviors for leaders to exhibit were welcoming differences of opinions, being visible and communicating well, and acting ethically at all times.
    This report, published in six languages, further shines a light on the role these leaders play and why it is important to support them in running their communities.

    Story Source:
    Materials provided by NYU Tandon School of Engineering. Note: Content may be edited for style and length.

  • An intelligent soft material that curls under pressure or expands when stretched

    Plants and animals can rapidly respond to changes in their environment, such as a Venus flytrap snapping shut when a fly touches it. However, replicating similar actions in soft robots requires complex mechanics and sensors. Now, researchers reporting in ACS Applied Materials & Interfaces have printed liquid metal circuits onto a single piece of soft polymer, creating an intelligent material that curls under pressure or mechanical strain.
    Ideally, soft robots could mimic intelligent and autonomous behaviors in nature, combining sensing and controlled movement. But the integration of sensors and the moving parts that respond can be clunky or require an external computer. A single-unit design is needed that responds to environmental stimuli, such as mechanical pressure or stretching. Liquid metals could be the solution, and some researchers have already investigated their use in soft robots. These materials can be used to create thin, flexible circuits in soft materials, and the circuits can rapidly produce heat when an electric current is generated, either from an electrical source or from pressure applied to the circuit. When the soft circuits are stretched, the current drops, cooling the material. To make a soft robot capable of autonomous, intelligent movement, Chao Zhao, Hong Liu and colleagues wanted to integrate liquid metal circuits with liquid crystal elastomers (LCE) — polymers that can undergo large changes to their shape when heated or cooled.
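    The thermal response described here is ordinary Joule heating. A minimal sketch of why stretching cools the material, assuming a fixed drive voltage and a constant-volume trace whose resistance grows with the square of the stretch ratio (idealised geometry and made-up numbers, not the device parameters from the paper):

```python
# Idealised Joule-heating sketch for a stretchable liquid-metal trace driven at
# a fixed voltage. Assumes the trace keeps constant volume, so resistance scales
# with the square of the stretch ratio. Numbers are illustrative only.

V = 2.0       # drive voltage in volts (assumed)
R0 = 1.0      # unstretched trace resistance in ohms (assumed)

for stretch in (1.0, 1.5, 2.0):          # 1.0 = unstretched
    R = R0 * stretch**2                  # constant-volume wire: R grows as L^2
    current = V / R
    power = V * current                  # dissipated heat, P = V^2 / R
    print(f"stretch x{stretch:.1f}: R = {R:.2f} ohm, I = {current:.2f} A, P = {power:.2f} W")
```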
    The researchers applied a nickel-infused gallium-indium alloy onto an LCE and magnetically moved the liquid metal into lines to form an uninterrupted circuit. A silicone sealant that changed from pink to dark red when warmed kept the circuit protected and in place. In response to a current, the soft material curled as the temperature increased, and the film turned redder over time. The team used the material to develop autonomous grippers that perceived and responded to pressure or stretching applied to the circuits. The grippers could pick up small round objects and then drop them when the pressure was released or the material was stretched. Finally, the researchers formed the film into a spiral shape. When pressure was applied to the circuit at the bottom of the spiral, it unfurled with a rotating motion, as the spiral’s temperature increased. The researchers say that these pressure- and stretch-sensitive materials could be adapted for use in soft robots performing complex tasks or locomotion.

    Story Source:
    Materials provided by American Chemical Society. Note: Content may be edited for style and length.

  • Quantum systems learn joint computing

    Researchers realize the first quantum-logic computer operation between two separate quantum modules in different laboratories.
    Today’s quantum computers contain up to several dozen memory and processing units, the so-called qubits. Severin Daiss, Stefan Langenfeld, and colleagues from the Max Planck Institute of Quantum Optics in Garching have successfully interconnected two such qubits located in different labs into a distributed quantum computer by linking the qubits with a 60-meter-long optical fiber. Over such a distance they realized a quantum-logic gate — the basic building block of a quantum computer. This makes the system the world’s first prototype of a distributed quantum computer.
    The limitations of previous qubit architectures
    Quantum computers are considerably different from traditional “binary” computers: Future realizations of them are expected to easily perform specific calculations for which traditional computers would take months or even years — for example in the field of data encryption and decryption. While the performance of binary computers results from large memories and fast computing cycles, the success of the quantum computer rests on the fact that one single memory unit — a quantum bit, also called “qubit” — can contain superpositions of different possible values at the same time. Therefore, a quantum computer does not only calculate one result at a time, but instead many possible results in parallel. The more qubits are interconnected in a quantum computer, the more complex the calculations it can perform.
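    The parallelism described above comes from the size of a quantum register's state description: n qubits require 2^n complex amplitudes. The loop below is generic quantum-computing bookkeeping, not a model of the Garching experiment:

```python
# An n-qubit register is described by 2**n complex amplitudes, which is why
# adding qubits grows the accessible state space exponentially.
# Generic bookkeeping, not specific to the experiment described here.

for n in (1, 2, 10, 50):
    print(f"{n:>2} qubits -> {2**n:,} complex amplitudes")
```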
    The basic computing operations of a quantum computer are quantum-logic gates between two qubits. Such an operation changes — depending on the initial state of the qubits — their quantum mechanical states. For a quantum computer to be superior to a normal computer for various calculations, it would have to reliably interconnect many dozens, or even thousands, of qubits for equally many thousands of quantum operations. Despite great successes, all laboratories are still struggling to build such a large and reliable quantum computer, since every additional qubit makes it much harder to build the quantum computer in a single set-up. The qubits are implemented, for instance, with single atoms, superconducting elements, or light particles, all of which need to be isolated perfectly from each other and the environment. The more qubits are arranged next to one another, the harder it is to isolate them while still controlling them from outside.
    Data line and processing unit combined
    One way to overcome the technical difficulties in the construction of quantum computers is presented in a new study in the journal Science by Severin Daiss, Stefan Langenfeld and colleagues from the research group of Gerhard Rempe at the Max Planck Institute of Quantum Optics in Garching. In this work, supported by the Institute of Photonic Sciences (Castelldefels, Spain), the team succeeded in connecting two qubit modules across a 60-meter distance in such a way that they effectively form a basic quantum computer with two qubits. “Across this distance, we perform a quantum computing operation between two independent qubit setups in different laboratories,” Daiss emphasizes. This opens up the possibility of merging smaller quantum computers into a joint processing unit.
    Simply coupling distant qubits to generate entanglement between them has been achieved in the past, but now the connection can additionally be used for quantum computations. For this purpose, the researchers employed modules consisting of a single atom as a qubit, positioned between two mirrors. Between these modules, they send a single light quantum, a photon, through the optical fiber. This photon is then entangled with the quantum states of the qubits in the different modules. Subsequently, the state of one of the qubits is changed according to the measured state of the “ancilla photon,” realizing a quantum mechanical CNOT operation with a fidelity of 80 percent. A next step would be to connect more than two modules and to host more qubits in the individual modules.
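    For reference, the gate the experiment implements is the standard two-qubit CNOT. The numpy sketch below shows an ideal, noise-free CNOT acting on a superposition; it is a textbook illustration only and does not model the photonic ancilla, the 60-meter fiber, or the 80 percent fidelity:

```python
import numpy as np

# Ideal CNOT in the computational basis |00>, |01>, |10>, |11>.
# Textbook illustration only; it does not model the photonic-ancilla protocol.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # control: (|0> + |1>)/sqrt(2)
zero = np.array([1, 0], dtype=complex)                # target:  |0>
state_in = np.kron(plus, zero)

state_out = CNOT @ state_in
print(np.round(state_out, 3))   # [0.707 0 0 0.707]: a maximally entangled pair
```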
    Higher performance quantum computers through distributed computing
    Team leader and institute director Gerhard Rempe believes the result will make it possible to further advance the technology: “Our scheme opens up a new development path for distributed quantum computing.” It could, for instance, enable the construction of a distributed quantum computer consisting of many modules with few qubits each, interconnected with the newly introduced method. This approach could circumvent the limitation of existing quantum computers to integrate more qubits into a single setup and could therefore allow more powerful systems.

    Story Source:
    Materials provided by Max-Planck-Gesellschaft. Note: Content may be edited for style and length.

  • Most important global supply chain linkages

    In today’s global economy, production of goods depends on inputs from many trade partners around the world. Companies and governments need a deeper understanding of the global value chain to reduce costs, maintain a profitable production system, and anticipate ripple effects of disruptions in the supply chain.
    Applied economists from the University of Illinois have developed a new model for in-depth analysis of global supply chain linkages across countries and industries, providing a rich tool that delivers valuable insights for businesses and policy makers around the world.
    “We live in a time when production processes are very much fragmented. In order to end up with one type of good, a car for example, many inputs are assembled abroad and imported from different places around the world. For instance, a car sold by leading U.S. companies may have anywhere from just 2% to 85% of U.S. and Canadian parts in it,” says Sandy Dall’Erba, professor in the Department of Agricultural and Consumer Economics and director of the Regional Economics Applications Laboratory (REAL) at U of I. Dall’Erba is co-author of the study.
    “Coordination of the entire supply chain system becomes more and more complicated and sensitive to disruptions at any stage throughout the process. If just one element in your supply chain is missing, it will have a ripple effect on the entire industry,” Dall’Erba notes. “An example of this was the global semiconductor shortage that recently forced U.S. automakers to halt production.”
    The researchers started with a widely used economic growth model called shift-share decomposition and expanded its components to include interregional and inter-sectoral linkages. This allows them to identify, for each industrial sector and each country, whether the growth of the sector of interest is due to supply chain linkages at the domestic level or the international level. The latter can be further split between linkages with trade agreement partners (such as NAFTA for the U.S.) and countries from the rest of the world, highlighting the benefits of trade agreements.
    “When we apply our technique to understand the drivers of growth in a particular sector, we not only can say whether it is growing faster or slower than another sector or region, we can also identify other sectors that are important for the growth of this particular sector,” says Claudia Montania, the study’s lead author. Montania was a visiting scholar in REAL when she conducted the study and is currently a researcher at the United Nations Development Accelerator Lab in Asuncion, Paraguay.

    Traditional shift-share decomposition includes information about changes in the industry mix and in region-specific features such as taxes, regulations, or characteristics of the labor force. But it does not include connections among different regions or different industry sectors.
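    For readers unfamiliar with the baseline method, the sketch below shows the classic three-component shift-share decomposition (national growth, industry mix, regional shift). It is the traditional model the article contrasts with, not the authors' extended version with interregional and inter-sectoral linkages, and all numbers are made up:

```python
# Classic three-component shift-share decomposition: actual regional change is
# split into a national-growth share, an industry-mix effect, and a residual
# regional shift. This is the traditional baseline, NOT the extended model with
# interregional and inter-sectoral linkages; all numbers are made up.

regional_emp_start = {"food_mfg": 10_000, "machinery": 5_000}
regional_emp_end   = {"food_mfg": 10_800, "machinery": 5_100}
national_growth_rate = 0.03                          # all-industry national growth
national_industry_rates = {"food_mfg": 0.05, "machinery": 0.01}

for sector, e0 in regional_emp_start.items():
    actual_change  = regional_emp_end[sector] - e0
    national_share = e0 * national_growth_rate
    industry_mix   = e0 * (national_industry_rates[sector] - national_growth_rate)
    regional_shift = actual_change - national_share - industry_mix
    print(f"{sector}: national {national_share:+.0f}, mix {industry_mix:+.0f}, "
          f"regional shift {regional_shift:+.0f}")
```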
    “The information provided by the traditional shift-share model is not enough,” Dall’Erba notes. “For example, it would be a mistake to study only the food manufacturing sector in order to know what is happening in that sector, because it obviously depends on grain and livestock production which, in turn, depends on water and fertilizers among other inputs.
    “In addition, grains are not always used for food manufacturing but they may end up as fuel. The supply chain of any sector is intertwined with that of many other sectors,” he adds.
    In the paper, Dall’Erba and Montania apply their model to country-sector linkages in the European Union, allowing them to compare three levels of connections (domestic, within the EU, and with the rest of the world) and to identify which ones matter most for each sector. The analysis included 35 industrial sectors in 15 countries from 1995 to 2006.
    Overall, the researchers found the most important linkages were among EU trade partners; the second-most important were domestic ties; and the least important linkages were with the rest of the world. They emphasize the results vary across sectors and countries. For example, the supply-chain linkages in place to manufacture a French car are different from those that exist for a German car. Their multi-dynamic model can provide detailed, specific information for each country-sector combination as needed for preemptive and tailored planning and policy making.
    “Knowing which type of linkages are the most important for your product or your sector can be very useful for local governments, for companies, and for producers, because you can make better plans to achieve the expected growth for your sector,” Montania states. “You can also promote trade and diplomatic relationships in regions where you have strong sectoral linkages.”
    Dall’Erba points out this information can help countries and industries protect against supply chain disruptions. Those can occur in many forms, ranging from natural disasters such as drought or earthquake to political upheaval, trade wars, and even the global pandemic. For instance, the extreme disruption airline companies experienced as demand for air travel dropped in 2020 means both Boeing and Airbus have significantly reduced their production, and so have the multiple companies manufacturing airplane components, from fuselages to seat belts.
    “COVID-19 has pushed several governments to consider bringing back some industries in order to get better control over all the supply chain links. However, it is not necessarily a viable option as many companies have already de-located their unskilled labor-intensive production to low-wage countries while maintaining high-skilled workers at home,” Dall’Erba concludes.

  • Climate change helped some dinosaurs migrate to Greenland

    A drop in carbon dioxide levels may have helped sauropodomorphs, early relatives of the largest animals ever to walk the Earth, migrate thousands of kilometers north past once-forbidding deserts around 214 million years ago.
    Scientists pinpointed the timing of the dinosaurs’ journey from South America to Greenland by correlating rock layers with sauropodomorph fossils to changes in Earth’s magnetic field. Using that timeline, the team found that the creatures’ northward push coincides with a dramatic decrease in CO2, which may have removed climate-related barriers, the team reports February 15 in Proceedings of the National Academy of Sciences.
    The sauropodomorphs were a group of long-necked, plant-eating dinosaurs that included massive sauropods such as Seismosaurus as well as their smaller ancestors (SN: 11/17/20). About 230 million years ago, sauropodomorphs lived mainly in what is now northern Argentina and southern Brazil. But at some point, these early dinosaurs picked up and moved as far north as Greenland.
    Exactly when they could have made that journey has been a puzzle, though. “In principle, you could’ve walked from where they were to the other hemisphere, which was something like 10,000 kilometers away,” says Dennis Kent, a geologist at Columbia University. Back then, Greenland and the Americas were smooshed together into the supercontinent Pangea. There were no oceans blocking the way, and mountains were easy to get around, he says. If the dinosaurs had walked at the slow pace of one to two kilometers per day, it would have taken them approximately 20 years to reach Greenland.
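    That rough travel-time figure is easy to check. The sketch below simply redoes the arithmetic for the quoted 10,000-kilometer distance at a pace of one to two kilometers per day:

```python
# Quick check of the quoted estimate: ~10,000 km at 1-2 km per day.
distance_km = 10_000
for pace_km_per_day in (1.0, 2.0):
    years = distance_km / pace_km_per_day / 365
    print(f"at {pace_km_per_day:.0f} km/day: ~{years:.0f} years")
# Prints roughly 27 and 14 years, bracketing the ~20-year figure in the text.
```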

    But during much of the Late Triassic Epoch, which spans 233 million to 215 million years ago, Earth’s carbon dioxide levels were incredibly high — as much as 4,000 parts per million. (In comparison, CO2 levels currently are about 415 parts per million.) Climate simulations have suggested that level of CO2 would have created hyper-arid deserts and severe climate fluctuations, which could have acted as a barrier to the giant beasts. With vast deserts stretching north and south of the equator, Kent says, there would have been few plants available for the herbivores to survive the journey north for much of that time period.
    Previous estimates suggested that these dinosaurs migrated to Greenland around 225 million to 205 million years ago. To get a more precise date, Kent and his colleagues measured magnetic patterns in ancient rocks in South America, Arizona, New Jersey, Europe and Greenland — all locales where sauropodomorph fossils have been discovered. These patterns record the orientation of Earth’s magnetic field at the time of the rock’s formation. By comparing those patterns with previously excavated rocks whose ages are known, the team found that sauropodomorphs showed up in Greenland around 214 million years ago.
    Image caption: Vertebrate fossils from the Late Triassic have been found at a number of sites around the world, some of which are marked (black dots) on a map showing how the continents were arranged about 220 million years ago. New dating of rocks at sites in South America and Greenland pinpoints when long-necked dinosaurs known as sauropodomorphs migrated north. (Credit: Dennis Kent and Lars Clemmensen)
    That more precise date for the sauropodomorphs’ migration may explain why it took them so long to start the trek north — and how they survived the journey: Earth’s climate was changing rapidly at that time.
    Around the time that sauropodomorphs appeared in Greenland, carbon dioxide levels plummeted within a few million years to 2,000 parts per million, making the climate more travel-friendly to herbivores, the team reports. The reason for this drop in carbon dioxide — which appears in climate records from South America and Greenland — is unknown, but it allowed for an eventual migration northward.
    “We have evidence for all of these events, but the confluence in timing is what is remarkable here,” says Morgan Schaller, a geochemist at Rensselaer Polytechnic Institute in Troy, N.Y., who was not involved with this study. These new findings, he says, also help solve the mystery of why plant eaters stayed put during a time that meat eaters roamed freely.
    “This study reminds us that we can’t understand evolution without understanding climate and environment,” says Steve Brusatte, a vertebrate paleontologist and evolutionary biologist at the University of Edinburgh, also not involved with the study. “Even the biggest and most awesome creatures that ever lived were still kept in check by the whims of climate change.”