More stories

  • Wally Broecker divined how the climate could suddenly shift

    It was the mid-1980s, at a meeting in Switzerland, when Wally Broecker’s ears perked up. Scientist Hans Oeschger was describing an ice core drilled at a military radar station in southern Greenland. Layer by layer, the 2-kilometer-long core revealed what the climate there was like thousands of years ago. Climate shifts, inferred from the amounts of carbon dioxide and of a form of oxygen in the core, played out surprisingly quickly — within just a few decades. It seemed almost too fast to be true.      

    Broecker returned home, to Columbia University’s Lamont-Doherty Earth Observatory, and began wondering what could cause such dramatic shifts. Some of Oeschger’s data turned out to be incorrect, but the seed they planted in Broecker’s mind flowered — and ultimately changed the way scientists think about past and future climate.

    A geochemist who studied the oceans, Broecker proposed that the shutdown of a major ocean circulation pattern, which he named the great ocean conveyor, could cause the North Atlantic climate to change abruptly. In the past, he argued, melting ice sheets released huge pulses of water into the North Atlantic, turning the water fresher and halting circulation patterns that rely on salty water. The result: a sudden atmospheric cooling that plunged the region, including Greenland, into a big chill. (In the 2004 movie The Day After Tomorrow, an overly dramatized oceanic shutdown coats the Statue of Liberty in ice.)

    It was a leap of insight unprecedented for the time, when most researchers had yet to accept that climate could shift abruptly, much less ponder what might cause such shifts.

    Broecker not only explained the changes seen in the Greenland ice core, he also went on to found a new field. He prodded, cajoled and brought together other scientists to study the entire climate system and how it could shift on a dime. “He was a really big thinker,” says Dorothy Peteet, a paleoclimatologist at NASA’s Goddard Institute for Space Studies in New York City who worked with Broecker for decades. “It was just his genuine curiosity about how the world worked.”

    Broecker was born in 1931 into a fundamentalist family who believed the Earth was 6,000 years old, so he was not an obvious candidate to become a pathbreaking geoscientist. Because of his dyslexia, he relied on conversations and visual aids to soak up information. Throughout his life, he did not use computers, a linchpin of modern science, yet became an expert in radiocarbon dating. And, contrary to the siloing common in the sciences, he worked expansively to understand the oceans, the atmosphere, the land, and thus the entire Earth system.

    By the 1970s, scientists knew that humans were pouring excess carbon dioxide into the atmosphere, through burning fossil fuels and cutting down carbon-storing forests, and that those changes were tinkering with Earth’s natural thermostat. Scientists knew that climate had changed in the past; geologic evidence over billions of years revealed hot or dry, cold or wet periods. But many scientists focused on long-term climate changes, paced by shifts in the way Earth rotates on its axis and circles the sun — both of which change the amount of sunlight the planet receives. A highly influential 1976 paper referred to these orbital shifts as the “pacemaker of the ice ages.”

    Ice cores from Antarctica and Greenland changed the game. In 1969, Willi Dansgaard of the University of Copenhagen and colleagues reported results from a Greenland ice core covering the last 100,000 years. They found large, rapid fluctuations in oxygen-18 that suggested wild temperature swings. Climate could oscillate quickly, it seemed — but it took another Greenland ice core and more than a decade before Broecker had the idea that the shutdown of the great ocean conveyor system could be to blame.

    Pulled from southern Greenland beginning in 1979, the Dye-3 ice core revealed that abrupt climate change had occurred in the past. (The drill used to retrieve the core is shown.) Credit: The Niels Bohr Institute

    Broecker proposed that such a shutdown was responsible for a known cold snap that started around 12,900 years ago. As the Earth began to emerge from its orbitally influenced ice age, water melted off the northern ice sheets and washed into the North Atlantic. Ocean circulation halted, plunging Europe into a sudden chill, he said. The period, which lasted just over a millennium, is known as the Younger Dryas after an Arctic flower that thrived during the cold snap. It was the last hurrah of the last ice age.

    Evidence that an ocean conveyor shutdown could cause dramatic climate shifts soon piled up in Broecker’s favor. For instance, Peteet found evidence of rapid Younger Dryas cooling in bogs near New York City — thus establishing that the cooling was not just a European phenomenon but also extended to the other side of the Atlantic. Changes were real, widespread and fast.

    By the late 1980s and early ’90s, there was enough evidence supporting abrupt climate change that two major projects — one European, one American — began to drill a pair of fresh cores into the Greenland ice sheet. Richard Alley, a geoscientist at Penn State, remembers working through the layers and documenting small climatic changes over thousands of years. “Then we hit the end of the Younger Dryas and it was like falling off a cliff,” he says. It was “a huge change after many small changes,” he says. “Breathtaking.”

    The new Greenland cores cemented scientific recognition of abrupt climate change. Though the shutdown of the ocean conveyor could not explain all abrupt climate changes that had ever occurred, it showed how a single physical mechanism could trigger major planet-wide disruptions. It also opened discussions about how rapidly climate might change in the future.

    Broecker, who died in 2019, spent his last decades exploring abrupt shifts that are already happening. He worked, for example, with billionaire Gary Comer, who during a yacht trip in 2001 was shocked by the shrinking of Arctic sea ice, to brainstorm new directions for climate research and climate solutions.

    Broecker knew more than almost anyone about what might be coming. He often described Earth’s climate system as an angry beast that humans are poking with sticks. And one of his most famous papers was titled “Climatic change: Are we on the brink of a pronounced global warming?”

    It was published in 1975.

  • Quantum information theory: Quantum complexity grows linearly for an exponentially long time

    Physicists know about the huge chasm between quantum physics and the theory of gravity. However, in recent decades, theoretical physics has provided some plausible conjecture to bridge this gap and to describe the behaviour of complex quantum many-body systems, for example black holes and wormholes in the universe. Now, a theory group at Freie Universität Berlin and HZB, together with Harvard University, USA, has proven a mathematical conjecture about the behaviour of complexity in such systems, increasing the viability of this bridge. The work is published in Nature Physics.
    “We have found a surprisingly simple solution to an important problem in physics,” says Prof. Jens Eisert, a theoretical physicist at Freie Universität Berlin and HZB. “Our results provide a solid basis for understanding the physical properties of chaotic quantum systems, from black holes to complex many-body systems,” Eisert adds.
    Using only pen and paper, i.e. purely analytically, the Berlin physicists Jonas Haferkamp, Philippe Faist, Naga Kothakonda and Jens Eisert, together with Nicole Yunger Halpern (Harvard, now Maryland), have succeeded in proving a conjecture that has major implications for complex quantum many-body systems. “This plays a role, for example, when you want to describe the volume of black holes or even wormholes,” explains Jonas Haferkamp, PhD student in the team of Eisert and first author of the paper.
    Complex quantum many-body systems can be reconstructed by circuits of so-called quantum bits. The question, however, is: how many elementary operations are needed to prepare the desired state? On the surface, it seems that this minimum number of operations — the complexity of the system — is always growing. Physicists Adam Brown and Leonard Susskind from Stanford University formulated this intuition as a mathematical conjecture: the quantum complexity of a many-particle system should first grow linearly for astronomically long times and then — for even longer — remain in a state of maximum complexity. Their conjecture was motivated by the behaviour of theoretical wormholes, whose volume seems to grow linearly for an eternally long time. In fact, it is further conjectured that complexity and the volume of wormholes are one and the same quantity from two different perspectives. “This redundancy in description is also called the holographic principle and is an important approach to unifying quantum theory and gravity. Brown and Susskind’s conjecture on the growth of complexity can be seen as a plausibility check for ideas around the holographic principle,” explains Haferkamp.
    The group has now shown that the quantum complexity of random circuits indeed increases linearly with time until it saturates at a point in time that is exponential in the system size. Such random circuits are a powerful model for the dynamics of many-body systems. The difficulty in proving the conjecture arises from the fact that it can hardly be ruled out that there are “shortcuts,” i.e. random circuits with much lower complexity than expected. “Our proof is a surprising combination of methods from geometry and those from quantum information theory. This new approach makes it possible to solve the conjecture for the vast majority of systems without having to tackle the notoriously difficult problem for individual states,” says Haferkamp.
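    To make the conjectured growth law concrete, here is a minimal numerical sketch (an illustration only, not the paper's proof): complexity rises linearly with time and then plateaus at a ceiling exponential in the number of qubits. The unit growth rate and the 2**n ceiling are illustrative assumptions.

    ```python
    import numpy as np

    # Toy model of the Brown-Susskind growth law (illustration, not the proof):
    # complexity grows linearly in time, then plateaus at a value exponential
    # in the system size n. The rate and the 2**n ceiling are assumptions.
    def conjectured_complexity(t, n_qubits, rate=1.0):
        c_max = 2.0 ** n_qubits            # saturation value, exponential in n
        return np.minimum(rate * t, c_max)

    n = 10
    for t in (1e1, 1e2, 1e3, 1e4, 1e5):
        print(f"t = {t:8.0e}  complexity ~ {conjectured_complexity(t, n):.0f}")
    # Linear growth up to ~2**10 = 1024, then a plateau for all later times.
    ```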
    “The work in Nature Physics is a nice highlight of my PhD,” adds the young physicist, who will take up a position at Harvard University at the end of the year. As a postdoc, he can continue his research there, preferably in the classic way with pen and paper and in exchange with the best minds in theoretical physics.
    Story Source:
    Materials provided by Helmholtz-Zentrum Berlin für Materialien und Energie. Note: Content may be edited for style and length.

  • Chaos theory provides hints for controlling the weather

    Under a project led by the RIKEN Center for Computational Science, researchers have used computer simulations to show that weather phenomena such as sudden downpours could potentially be modified by making small adjustments to certain variables in the weather system. They did this by taking advantage of a system known as a “butterfly attractor” in chaos theory, where a system can occupy one of two states — like the wings of a butterfly — and switches back and forth between the two depending on small changes in certain conditions.
    While weather predictions have reached levels of high accuracy thanks to methods such as supercomputer-based simulations and data assimilation, where observational data is incorporated into simulations, scientists have long hoped to be able to control the weather. Research in this area has intensified due to climate change, which has led to more extreme weather events such as torrential rain and storms.
    There are methods at present for weather modification, but they have had limited success. Seeding the atmosphere to induce rain has been demonstrated, but it is only possible when the atmosphere is already in a state where it might rain. Geoengineering projects have been envisioned, but have not been carried out due to concerns about what unpredicted long-term effects they might have.
    As a promising approach, researchers from the RIKEN team have looked to chaos theory to create realistic possibilities for mitigating weather events such as torrential rain. Specifically, they have focused on a phenomenon known as a butterfly attractor, proposed by mathematician and meteorologist Edward Lorenz, one of the founders of modern chaos theory. Essentially, this refers to a system that can adopt one of two orbits that look like the wings of a butterfly, and that can switch between those orbits seemingly at random based on small fluctuations in the system.
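    To make the picture concrete, here is a minimal sketch of the Lorenz-63 system with its standard parameters (an illustration of the attractor's sensitivity, not the RIKEN team's actual weather-model experiments): two runs that start a millionth apart can end up on different wings.

    ```python
    import numpy as np

    # Minimal Lorenz-63 sketch (standard parameters sigma=10, rho=28, beta=8/3).
    # Illustrates the butterfly attractor's sensitivity, not RIKEN's experiments.
    def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
        return state + dt * deriv          # simple forward-Euler step

    a = np.array([1.0, 1.0, 20.0])         # the control run: "nature"
    b = a + np.array([1e-6, 0.0, 0.0])     # perturbed run: one-millionth nudge in x

    for _ in range(10000):                 # integrate 50 time units
        a, b = lorenz_step(a), lorenz_step(b)

    # The sign of x says which wing of the butterfly the state is on; the two
    # runs can land on different wings despite the tiny initial difference.
    print("control wing:  ", "left" if a[0] < 0 else "right")
    print("perturbed wing:", "left" if b[0] < 0 else "right")
    ```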
    To perform the work, the RIKEN team ran one weather simulation to serve as the control, or “nature” itself, and then ran other simulations with small variations in a number of variables describing convection — how heat moves through the system. They discovered that small changes in several of the variables together could lead the system to end up in a particular state once a certain amount of time had elapsed.
    According to Takemasa Miyoshi of the RIKEN Center for Computational Science, who led the team, “This opens the path to research into the controllability of weather and could lead to weather control technology. If realized, this research could help us prevent and mitigate extreme windstorms, such as torrential rains and typhoons, whose risks are increasing with climate change.”
    “We have built a new theory and methodology for studying the controllability of weather,” he continues. “Based on the observing system simulation experiments used in previous predictability studies, we were able to design an experiment to investigate predictability based on the assumption that the true values (nature) cannot be changed, but rather that we can change the idea of what can be changed (the object to be controlled).”
    Looking to the future, he says, “In this case we used an ideal low-dimensional model to develop a new theory, and in the future we plan to use actual weather models to study the possible controllability of weather.”
    The work, published in Nonlinear Processes in Geophysics, was done as part of the Moonshot R&D Millennia program, contributing to the new Moonshot goal #8.
    Story Source:
    Materials provided by RIKEN. Note: Content may be edited for style and length.

  • Design of protein binders from target structure alone

    A team of scientists has created a powerful new method for generating protein drugs. Using computers, they designed molecules that can target important proteins in the body, such as the insulin receptor, as well as vulnerable proteins on the surface of viruses. This solves a long-standing challenge in drug development and may lead to new treatments for cancer, diabetes, infection, inflammation, and beyond.
    The research, appearing March 24 in the journal Nature, was led by scientists in the laboratory of David Baker, professor of biochemistry at the University of Washington School of Medicine and a recipient of the 2021 Breakthrough Prize in Life Sciences.
    “The ability to generate new proteins that bind tightly and specifically to any molecular target that you want is a paradigm shift in drug development and molecular biology more broadly,” said Baker.
    Antibodies are today’s most common protein-based drugs. They typically function by binding to a specific molecular target, which then becomes either activated or deactivated. Antibodies can treat a wide range of health disorders, including COVID-19 and cancer, but generating new ones is challenging. Antibodies can also be costly to manufacture.
    A team led by two postdoctoral scholars in the Baker lab, Longxing Cao and Brian Coventry, combined recent advances in the field of computational protein design to arrive at a strategy for creating new proteins that bind molecular targets in a manner similar to antibodies. They developed software that can scan a target molecule, identify potential binding sites, generate proteins targeting those sites, and then screen from millions of candidate binding proteins to identify those most likely to function.
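    In outline, that workflow resembles the toy sketch below. Every function here is a hypothetical placeholder with random scoring, not the Baker lab's actual software; it only mirrors the scan, generate, score, and filter structure described above.

    ```python
    import random

    random.seed(0)

    def find_binding_sites(target):
        # Placeholder: pretend the target surface yields a few candidate sites.
        return [f"{target}/site{i}" for i in range(3)]

    def generate_designs(site, n):
        # Placeholder: emit n candidate binder identifiers for one site.
        return [f"{site}/design{i}" for i in range(n)]

    def predict_binding_score(design):
        # Placeholder for a physics- or ML-based affinity/stability predictor.
        return random.random()

    def design_binders(target, n_per_site=1000, top_k=5):
        candidates = [d for s in find_binding_sites(target)
                      for d in generate_designs(s, n_per_site)]
        # Keep only the designs most likely to function in the lab.
        return sorted(candidates, key=predict_binding_score, reverse=True)[:top_k]

    print(design_binders("insulin-receptor"))
    ```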
    The team used the new software to generate high-affinity binding proteins against 12 distinct molecular targets. These targets include important cellular receptors such as TrkA, EGFR, Tie2, and the insulin receptor, as well as proteins on the surface of the influenza virus and SARS-CoV-2 (the virus that causes COVID-19).
    “When it comes to creating new drugs, there are easy targets and there are hard targets,” said Cao, who is now an assistant professor at Westlake University. “In this paper, we show that even very hard targets are amenable to this approach. We were able to make binding proteins to some targets that had no known binding partners or antibodies.”
    In total, the team produced over half a million candidate binding proteins for the 12 selected molecular targets. Data collected on this large pool of candidate binding proteins was used to improve the overall method.
    “We look forward to seeing how these molecules might be used in a clinical context, and more importantly how this new method of designing protein drugs might lead to even more promising compounds in the future,” said Coventry.
    The research team included scientists from the University of Washington School of Medicine, Yale University School of Medicine, Stanford University School of Medicine, Ghent University, The Scripps Research Institute, and the National Cancer Institute, among other institutions.
    This work was supported in part by The Audacious Project at the Institute for Protein Design, Open Philanthropy Project, National Institutes of Health (HHSN272201700059C, R01AI140245, R01AI150855, R01AG063845), Defense Advanced Research Projects Agency (HR0011835403 contract FA8750-17-C-0219), Defense Threat Reduction Agency (HDTRA1-16-C-0029), Schmidt Futures, Gates Ventures, Donald and Jo Anne Petersen Endowment, and an Azure computing gift for COVID-19 research provided by Microsoft.

  • Innovative AI technology aids personalized care for diabetes patients needing complex drug treatment

    Hitachi, Ltd., University of Utah Health, and Regenstrief Institute, Inc. today announced the development of an AI method to improve care for patients with type 2 diabetes mellitus who need complex treatment. One in 10 adults worldwide has been diagnosed with type 2 diabetes, but a smaller number require multiple medications to control blood glucose levels and avoid serious complications, such as loss of vision and kidney disease.
    For this smaller group of patients, physicians may have limited clinical decision-making experience or evidence-based guidance for choosing drug combinations. One solution is to expand the number of patients whose data can support the development of general principles to guide decision-making. Combining patient data from multiple healthcare institutions, however, requires deep expertise in artificial intelligence (AI) and wide-ranging experience in developing machine learning models using sensitive and complex healthcare data.
    Hitachi, U of U Health, and Regenstrief researchers partnered to develop and test a new AI method that analyzed electronic health record data across Utah and Indiana and learned generalizable treatment patterns of type 2 diabetes patients with similar characteristics. Those patterns can now be used to help determine an optimal drug regimen for a specific patient.
    Some of the results of this study are published in the peer-reviewed medical journal, Journal of Biomedical Informatics, in the article, “Predicting pharmacotherapeutic outcomes for type 2 diabetes: An evaluation of three approaches to leveraging electronic health record data from multiple sources.”
    Hitachi had been working with U of U Health for several years on development of a pharmacotherapy selection system for diabetes treatment. However, the system was not always able to accurately predict more complex and less prevalent treatment patterns because it did not have enough data. In addition, it was not easy to use data from multiple facilities, as it was necessary to account for differences in patient disease states and therapeutic drugs prescribed among facilities and regions. To address these challenges, the project partnered with Regenstrief to enrich the data it was working with.
    The new AI method initially groups patients with similar disease states and then analyzes their treatment patterns and clinical outcomes. It then matches the patient of interest to the disease state groups and predicts the range of potential outcomes for the patient depending on various treatment options. The researchers evaluated how well the method worked in predicting successful outcomes given drug regimens administered to patients with diabetes in Utah and Indiana. The algorithm was able to support medication selection for more than 83 percent of patients, even when two or more medications were used together.
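    Schematically, the approach resembles the toy sketch below, with synthetic data and k-means clustering as stand-ins; the published method's actual features, grouping, and outcome modeling differ.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Toy stand-in for the approach above: cluster patients into disease-state
    # groups, then score each regimen by its success rate within the group that
    # a new patient falls into. Synthetic data; not the published model.
    rng = np.random.default_rng(0)
    features = rng.normal(size=(500, 3))     # e.g. standardized HbA1c, BMI, eGFR
    regimens = rng.integers(0, 3, size=500)  # which of 3 drug regimens was given
    success = rng.integers(0, 2, size=500)   # did the regimen control glucose?

    groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit(features)

    def predict_outcomes(patient):
        """Per-regimen success rate among patients in the same group."""
        g = groups.predict(patient.reshape(1, -1))[0]
        peers = groups.labels_ == g
        return {r: float(success[peers & (regimens == r)].mean()) for r in range(3)}

    print(predict_outcomes(rng.normal(size=3)))
    ```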
    In the future, the research team expects to help patients with diabetes who require complex treatment in checking the efficacy of various drug combinations and then, with their doctors, deciding on a treatment plan that is right for them. This will lead not only to better management of diabetes but also to increased patient engagement, compliance, and quality of life.
    The three parties will continue to evaluate and improve the effectiveness of the new AI method and contribute to future patient care through further research in healthcare informatics.
    Hitachi will accelerate efforts, including the practical application of this technology, through collaboration between its healthcare and IT business divisions and R&D group. GlobalLogic Inc., a Hitachi Group company and leader in digital engineering that is promoting healthcare-related projects in the U.S., will also deepen its collaboration in this field. Through these efforts, the entire Hitachi Group will contribute to the health and safety of people.
    Story Source:
    Materials provided by Regenstrief Institute. Note: Content may be edited for style and length.

  • Quantum physics sets a speed limit to electronics

    Semiconductor electronics is getting faster and faster — but at some point, physics no longer permits any increase. The speed can definitely not be increased beyond one petahertz (one million gigahertz), even if the material is excited in an optimal way with laser pulses.
    How fast can electronics be? When computer chips work with ever shorter signals and time intervals, at some point they come up against physical limits. The quantum-mechanical processes that enable the generation of electric current in a semiconductor material take a certain amount of time. This puts a limit to the speed of signal generation and signal transmission.
    TU Wien (Vienna), TU Graz and the Max Planck Institute of Quantum Optics in Garching have now been able to explore these limits, confirming that the speed cannot be pushed beyond one petahertz even with optimal laser excitation. This result has now been published in the scientific journal Nature Communications.
    Fields and currents
    Electric current and light (i.e. electromagnetic fields) are always interlinked. This is also the case in microelectronics: In microchips, electricity is controlled with the help of electromagnetic fields. For example, an electric field can be applied to a transistor, and depending on whether the field is switched on or off, the transistor either allows electrical current to flow or blocks it. In this way, an electromagnetic field is converted into an electrical signal.
    In order to test the limits of this conversion of electromagnetic fields to current, laser pulses — the fastest, most precise electromagnetic fields available — are used, rather than transistors.
    “Materials are studied that initially do not conduct electricity at all,” explains Prof. Joachim Burgdörfer from the Institute for Theoretical Physics at TU Wien. “These are hit by an ultra-short laser pulse with a wavelength in the extreme UV range. This laser pulse shifts the electrons into a higher energy level, so that they can suddenly move freely. That way, the laser pulse turns the material into an electrical conductor for a short period of time.” As soon as there are freely moving charge carriers in the material, they can be moved in a certain direction by a second, slightly longer laser pulse. This creates an electric current that can then be detected with electrodes on both sides of the material.
    These processes happen extremely fast, on a time scale of atto- or femtoseconds. “For a long time, such processes were considered instantaneous,” says Prof. Christoph Lemell (TU Wien). “Today, however, we have the necessary technology to study the time evolution of these ultrafast processes in detail.” The crucial question is: How fast does the material react to the laser? How long does the signal generation take and how long does one have to wait until the material can be exposed to the next signal? The experiments were carried out in Garching and Graz, the theoretical work and complex computer simulations were done at TU Wien.
    Time or energy — but not both
    The experiment leads to a classic uncertainty dilemma, as often occurs in quantum physics: in order to increase the speed, extremely short UV laser pulses are needed, so that free charge carriers are created very quickly. However, using extremely short pulses implies that the amount of energy which is transferred to the electrons is not precisely defined. The electrons can absorb very different energies. “We can tell exactly at which point in time the free charge carriers are created, but not in which energy state they are,” says Christoph Lemell. “Solids have different energy bands, and with short laser pulses many of them are inevitably populated by free charge carriers at the same time.”
    Depending on how much energy they carry, the electrons react quite differently to the electric field. If their exact energy is unknown, it is no longer possible to control them precisely, and the current signal that is produced is distorted — especially at high laser intensities.
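    The dilemma can be made quantitative with the time-energy uncertainty relation, ΔE · Δt ≳ ħ/2. A back-of-envelope sketch, with illustrative pulse durations:

    ```python
    # Back-of-envelope: minimum energy spread for a pulse of duration dt, from
    # Delta_E * Delta_t >= hbar / 2. Pulse durations below are illustrative.
    HBAR_EV_S = 6.582e-16          # reduced Planck constant, eV*s

    def min_energy_spread_ev(dt_seconds):
        return HBAR_EV_S / (2.0 * dt_seconds)

    for dt in (1e-12, 1e-15, 1e-16, 1e-18):   # 1 ps, 1 fs, 100 as, 1 as
        print(f"pulse {dt:.0e} s  ->  Delta_E >= {min_energy_spread_ev(dt):.3g} eV")
    # Around one femtosecond (a ~petahertz cycle) the spread is already ~0.3 eV,
    # comparable to band-structure features, so precise control is lost.
    ```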
    “It turns out that about one petahertz is an upper limit for controlled optoelectronic processes,” says Joachim Burgdörfer. Of course, this does not mean that it is possible to produce computer chips with a clock frequency of just below one petahertz. Realistic technical upper limits are most likely considerably lower. Even though the laws of nature determining the ultimate speed limits of optoelectronics cannot be outsmarted, they can now be analyzed and understood with sophisticated new methods.

  • Simply printing high-performance perovskite-based transistors

    The printing press has contributed immensely to the advancement of humankind, elevating politics, economics, and culture. Today, it goes beyond simply printing books or documents, extending its influence to the realm of cutting-edge technology. Most notably, high-performance components in various smart devices have been successfully printed and have attracted much attention. And now, a technology to print perovskite-based devices — considered a challenge until now — has been proposed.
    A POSTECH research team led by Professor Yong-Young Noh and Ph.D. candidates Ao Liu and Huihui Zhu (Department of Chemical Engineering), in collaboration with Professor Myung-Gil Kim (School of Advanced Materials Science and Engineering) of Sungkyunkwan University, has improved the performance of a p-type semiconductor transistor using inorganic metal halide perovskite. One of the biggest advantages of the new technology is that it enables solution-processed perovskite transistors to be simply printed as semiconductor-like circuits.
    Perovskite-based transistors control the current by combining p-type semiconductors, in which positively charged holes carry the current, with n-type semiconductors. Compared to the n-type semiconductors that have been actively studied so far, fabricating high-performance p-type semiconductors has been a challenge.
    Many researchers have tried to utilize perovskite in the p-type semiconductor for its excellent electrical conductivity, but its poor electrical performance and reproducibility have hindered commercialization.
    To overcome this issue, the researchers used the modified inorganic metal halide caesium tin triiodide (CsSnI3) to develop the p-type perovskite semiconductor and fabricated the high-performance transistor based on it. The transistor exhibits a hole mobility of 50 cm²V⁻¹s⁻¹ or higher and an on/off current ratio of more than 10⁸, the highest performance among the perovskite semiconductor transistors developed so far.
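    For a sense of what such numbers mean, here is a textbook square-law transistor estimate using the reported mobility; the gate capacitance, geometry, and bias are assumed for illustration and do not describe the paper's measured device.

    ```python
    # Square-law FET estimate of on-current for the reported ~50 cm^2/(V*s) hole
    # mobility. Gate capacitance, W/L, and overdrive voltage are assumptions.
    mu = 50e-4       # hole mobility: 50 cm^2/(V*s) converted to m^2/(V*s)
    c_ox = 1e-4      # gate capacitance per unit area, F/m^2 (assumed)
    w_over_l = 10.0  # channel width-to-length ratio (assumed)
    v_ov = 2.0       # gate overdrive |V_GS - V_T| in volts (assumed)

    i_on = 0.5 * mu * c_ox * w_over_l * v_ov ** 2   # saturation drain current
    print(f"estimated on-current: {i_on * 1e6:.0f} microamps")
    # With an on/off ratio above 10**8, the corresponding off-current would sit
    # below ~1e-13 A, i.e. sub-picoampere leakage.
    ```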
    By making the material into a solution, the researchers succeeded in simply printing the p-type semiconductor transistor as if printing a document. This method is not only convenient but also cost-effective, which can lead to the commercialization of perovskite devices in the future.
    “The newly developed semiconductor material and transistor can be widely applicable as logic circuits in high-end displays and in wearable electronic devices, and also be used in stacked electronic circuits and optoelectronic devices by stacking them vertically with silicon semiconductors,” explained Professor Yong-Young Noh on the significance of the study.
    This study was conducted with the support from the Mid-Career Researcher Program, Creative Materials Discovery Program, Next-generation Intelligence-Type Semiconductor Development Program, and the Basic Research Lab Program of the National Research Foundation of Korea, and from Samsung Display Corporation.
    Story Source:
    Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.

  • Keeping the light from fading

    Scientists from Nara Institute of Science and Technology created a new approach to compensate for variations in illumination while scanning cathedral stained-glass windows. This work may be applied to other objects of cultural significance to help capture their colors in the most lifelike way.
    It’s hard to think of a more inspirational experience than watching the sun slowly set through historic stained-glass windows, such as those found in the cathedrals of Europe. While the changing light levels may be breathtaking, they also make high-resolution scans of the windows more challenging. That is, if the scanning process requires minutes or even hours to complete, variations in the natural illumination can lead to inconsistent results.
    Now, a team of researchers led by Nara Institute of Science and Technology has developed a new calibration method to help compensate for changes in the sun’s illumination over the course of the scan. “It can take hours to capture thousands of spectral channels pixel by pixel. Thus, the measurement can be significantly affected by the perturbations in natural light,” first author Takuya Funatomi says.
    The researchers set out to capture hyperspectral images of the famous stained-glass windows in the Amiens Cathedral in France. With some window panels dating back to the 13th century, the location has been designated a UNESCO World Heritage Site. A whisk-broom scanner was used to acquire hyperspectral images. This kind of sensor uses a movable mirror to slowly scan across an object, measuring pixels one at a time as their light is reflected onto a single detector, with the sky in the background. When applied to outdoor cultural heritage sites, however, temporal illumination variations become an issue due to the lengthy measurement time. Hyperspectral scanning is not limited to the wavelengths of light that are visible to humans. For this research, the team used a spectrometer that recorded more than 2,000 channels over a spectrum ranging from about 200 nm to 1100 nm, which includes ultraviolet, visible and infrared light.
    An extra single-column scan was added to help calibrate the images. Using matrix methods, the researchers could remove the temporal illumination variations. This allowed for much more accurate results compared with simply normalizing the total brightness, because each color might be affected differently by the changing light. “Our method provides a new modality for the digital preservation of large cultural assets,” senior author Yasuhiro Mukaigawa says. The method can be easily adapted to other situations in which outdoor scanning has to occur over long time periods.
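    A simplified NumPy sketch of the idea (a toy model, not the authors' exact matrix formulation): treat each measurement as scene reflectance times a time-varying illumination, estimate the drift from the repeatedly scanned reference column, and divide it back out.

    ```python
    import numpy as np

    # Toy model of the calibration: each scanned column j is attenuated by the
    # illumination at the time it was measured. A reference column, re-scanned
    # at every time step, reveals the drift so it can be divided out.
    rng = np.random.default_rng(1)
    scene = rng.uniform(0.2, 1.0, size=(100, 100))           # true per-pixel signal
    drift = 1.0 + 0.3 * np.sin(np.linspace(0.0, 3.0, 100))   # illumination vs. time

    measured = scene * drift[np.newaxis, :]                  # column j scanned at time j
    ref_scans = scene[:, :1] * drift[np.newaxis, :]          # reference column, re-scanned

    ref_means = ref_scans.mean(axis=0)
    est_drift = ref_means / ref_means[0]                     # drift relative to first scan
    corrected = measured / est_drift[np.newaxis, :]

    print("max error before:", float(np.abs(measured - scene).max()))
    print("max error after: ", float(np.abs(corrected - scene).max()))
    ```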
    Story Source:
    Materials provided by Nara Institute of Science and Technology. Note: Content may be edited for style and length.