More stories

  • Double locked: Polymer hydrogels secure confidential information

    The development of highly secure yet simple and inexpensive encryption technology to prevent data leaks and forgeries is decidedly challenging. In the journal Angewandte Chemie, a research team has introduced a “double lock” based on thermoresponsive polymer hydrogels that encrypts information so that it can be read only within a specific temperature and time window.
    In addition to digital encryption methods, physical methods play an important role. Their decoding is typically based on external stimuli such as light or heat. Multiple stimuli offer more security, but make reading the data cumbersome and complex. “Addition of the time domain greatly raises the chance to achieve the unity of security and simpleness,” according to the team led by Zhikun Zheng, Xudong Chen, and Wei Liu, at Sun Yat-sen University (Guangzhou, China). “We were inspired by the baking of bread: delicious bread can only be produced if the baking temperature is not too low or too high and the baking time is not too short or too long.”
    For their novel “double encryption system” they use thermoresponsive polymer hydrogels — cross-linked chain molecules with water incorporated into their “gaps.” Above or below a specific temperature, the clear gels become opaque due to partial unmixing. There are LCST and UCST gels, which have lower or upper critical solution temperatures, respectively. The type of phase transition and the critical temperature can be controlled via the content of -CO-NH2 groups in the main chain of the polymer hydrogels. The density of cross-linking determines the rate of the phase transition.
    As an example of a locked label, the team used transparent acrylic plates with grooves in the pattern of a QR code. Three different gels were placed in defined areas of the pattern: a UCST gel with a phase change at around 40 °C and two LCST gels with a phase change at 33 °C (one with a fast phase transition and one with a slow one). Below 20 °C, the UCST gel is opaque but strongly shrunken, so the pattern is deformed and unreadable. Between 20 and 33 °C, it swells and the part of the code formed by this gel becomes readable; the second part of the code, formed by the “fast” LCST gel, still cannot be read. Only heating above 33 °C makes the LCST gels turn opaque. Now timing comes into play: only the pattern of the “fast” LCST gel carries the correct second part of the information. At 37 °C, it becomes readable after about half a minute, and the complete code can be read. About three minutes later, however, the “slow” LCST gel also becomes opaque and adds false information that renders the code unreadable. Above 40 °C, both LCST gels become opaque simultaneously, while the UCST gel turns transparent, so the code again cannot be read.
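    The decoding rule boils down to a simple check on temperature and hold time. The Python sketch below encodes the windows quoted above; the exact thresholds and the assignment of gels to the two halves of the code are illustrative assumptions, not the paper’s precise parameters.
```python
def region_opaque(gel, temp_c, seconds):
    """Is a gel-filled region opaque after holding at temp_c for `seconds`?"""
    if gel == "UCST":            # opaque below its ~40 C critical temperature
        return temp_c < 40
    if gel == "LCST_fast":       # clouds within ~30 s above ~33 C
        return temp_c > 33 and seconds >= 30
    if gel == "LCST_slow":       # clouds only after a few minutes above ~33 C
        return temp_c > 33 and seconds >= 200
    raise ValueError(gel)

def qr_readable(temp_c, seconds):
    first_part = region_opaque("UCST", temp_c, seconds) and temp_c >= 20  # swollen, undeformed
    second_part = region_opaque("LCST_fast", temp_c, seconds)             # true information
    decoy = region_opaque("LCST_slow", temp_c, seconds)                   # false information
    return first_part and second_part and not decoy

for t, s in [(25, 60), (37, 10), (37, 60), (37, 300), (45, 60)]:
    print(f"{t} C after {s} s ->", "readable" if qr_readable(t, s) else "locked")
# Only 37 C held for about a minute (inside both the temperature window and the
# time window) decodes the complete pattern.
```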
    This encryption can only be decoded if the specific temperature and time windows are known. The source of heat for decoding in this example could be an infrared lamp, a water bath, a hair dryer, or even the human body. If sealed to prevent the evaporation of water, these inexpensive labels are theoretically suitable for long-term use.
    Story Source:
    Materials provided by Wiley. Note: Content may be edited for style and length.

  • Surprising semiconductor properties revealed with innovative new method

    A research team probing the properties of a semiconductor combined with a novel thin oxide film has observed a surprising new source of conductivity from oxygen atoms trapped inside.
    Scott Chambers, a materials scientist at the Department of Energy’s Pacific Northwest National Laboratory, reported the team’s discovery at the Spring 2022 meeting of the American Physical Society. The research finding is described in detail in the journal Physical Review Materials.
    The discovery has broad implications for understanding the role of thin oxide films in future semiconductor design and manufacture. Specifically, semiconductors used in modern electronics come in two basic flavors — n-type and p-type — depending on the electronic impurity added during crystal growth. Modern electronic devices use both n- and p-type silicon-based materials. But there is ongoing interest in developing other types of semiconductors. Chambers and his team were testing germanium in combination with a specialized thin crystalline film of lanthanum-strontium-zirconium-titanium-oxide (LSZTO).
    “We are reporting on a powerful tool for probing semiconductor structure and function,” said Chambers. “Hard X-ray photoelectron spectroscopy revealed in this case that atoms of oxygen, an impurity in the germanium, dominate the properties of the material system when germanium is joined to a particular oxide material. This was a big surprise.”
    Using the Diamond Light Source on the Harwell Science and Innovation Campus in Oxfordshire, England, the research team discovered they could learn a great deal more about the electronic properties of the germanium/LSZTO system than was possible using the typical methods.
    “When we tried to probe the material with conventional techniques, the much higher conductivity of germanium essentially caused a short circuit,” Chambers said. “As a result, we could learn something about the electronic properties of the Ge, which we already know a lot about, but nothing about the properties of the LSZTO film or the interface between the LSZTO film and the germanium — which we suspected might be very interesting and possibly useful for technology.”
    A new role for hard X-rays
    The so-called “hard” X-rays produced by the Diamond Light Source could penetrate the material and generate information about what was going on at the atomic level.
    “Our results were best interpreted in terms of oxygen impurities in the germanium being responsible for a very interesting effect,” Chambers said. “The oxygen atoms near the interface donate electrons to the LSZTO film, creating holes, or the absence of electrons, in the germanium within a few atomic layers of the interface. These specialized holes resulted in behavior that totally eclipsed the semiconducting properties of both n- and p-type germanium in the different samples we prepared. This, too, was a big surprise.”
    The interface, where the thin-film oxide and the base semiconductor come together, is where interesting semiconducting properties often emerge. The challenge, according to Chambers, is to learn how to control the fascinating and potentially useful electric fields that form at these interfaces by modifying the electric field at the surface. Ongoing experiments at PNNL are probing this possibility.
    While the samples used in this research are unlikely to have immediate commercial applications, the techniques and scientific discoveries made are expected to pay dividends in the longer term, Chambers said. The new scientific knowledge will help materials scientists and physicists better understand how to design new semiconductor material systems with useful properties.
    PNNL researchers Bethany Matthews, Steven Spurgeon, Mark Bowden, Zihua Zhu and Peter Sushko contributed to the research. The study was supported by the Department of Energy Office of Science. Some experiments and sample preparation were performed at the Environmental Molecular Sciences Laboratory, a Department of Energy Office of Science user facility located at PNNL. Electron microscopy was performed in the PNNL Radiochemical Processing Laboratory. Collaborators Tien-Lin Lee and Judith Gabel performed experiments at the Diamond Light Source. Additional collaborators included the University of Texas at Arlington’s Matt Chrysler and Joe Ngai, who prepared the samples.
    Story Source:
    Materials provided by DOE/Pacific Northwest National Laboratory. Original written by Karyn Hede. Note: Content may be edited for style and length.

  • A UN report shows climate change’s escalating toll on people and nature

    Neither adaptation by humankind nor mitigation alone is enough to reduce the risk from climate impacts, hundreds of the world’s scientists say. Nothing less than a concerted, global effort to both drastically curb carbon emissions and proactively adapt to climate change can stave off the most disastrous consequences, according to the latest report from the United Nations’ Intergovernmental Panel on Climate Change, or IPCC.

    That dire warning comes as the effects of climate change on people and nature are playing out across the globe in a more widespread and severe manner than previously anticipated. And the most vulnerable communities — often low-income or Indigenous — are being hit the hardest, the report says.

    “It’s the strongest rebuttal that we’ve seen yet of this idea that we can just adapt our way out of climate change and we don’t have to mitigate emissions,” says Anne Christianson, the director of international climate policy at the Center for American Progress in Washington, D.C., who was not involved in the report.

    A consortium of 270 scientists from 67 countries synthesized the report after reviewing over 34,000 studies. Released February 28 as part of the IPCC’s sixth assessment of climate science, the report details how the impacts of climate change are playing out today in different regions and assesses the capacities of communities and regions to adapt.

    Many countries understand the need for climate adaptation. And modern solutions, such as the building of urban gardens or adoption of agroforestry, where implemented, appear to show promise. But, the report finds, efforts to adapt are, by and large, reactionary, small and drastically underfunded. As a result, about 3.3 billion to 3.6 billion people remain highly vulnerable to climate risks such as extreme weather events, sea level rise and food and water shortages. The need for adaptation is greatest — and growing larger — in low-income regions, most notably in parts of Africa, South Asia, small island states and Central and South America.

    The report also underscores the importance of involving those who are impacted the most in climate plans. “We can no longer just make these decisions at the highest level; we need to include local stakeholders, Indigenous groups, local communities and those who are most at risk from climate change, such as women, racial minorities, the elderly and children,” Christianson says.

    Last August, a previous report, also part of the IPCC’s sixth assessment, covered the physical science underpinning climate change (SN: 8/9/21). In that report, scientists stated loud and clear that there was no time to waste. By 2030, carbon emissions need to be cut in half, compared with 2017 levels, to prevent global temperatures from climbing 1.5° Celsius above the preindustrial baseline, the report found. Beyond that baseline, the capacity for humankind and nature to adapt severely deteriorates. In a bit of good news, the authors of that 2021 report also found that if all carbon emissions were to cease today, global temperatures would stop rising in about three years, not the 30 to 40 years once thought. In other words, we can make a big difference in very little time.

    Still, climate change is already affecting many parts of Earth. And some of the consequences aren’t going away anytime soon. Sea level will continue to rise for decades, driven in part by the runaway melting of Greenland’s ice sheet (SN: 9/30/20). By 2050, sea level along U.S. coastlines will have risen by 25 to 30 centimeters, or roughly one foot, the National Oceanic and Atmospheric Administration estimates.

    The latest IPCC report reveals that the effects of climate change, which include an increased frequency of wildfires (such as these in Turkey), are more widespread and severe than had been expected. (Yasin Akgul/AFP via Getty Images)

    Extreme weather events and climate-fueled wildfires have already caused mass mortalities of corals and other animals and trees, and pushed entire species toward the brink of extinction (SN: 3/9/21). What’s more, climate change is forcing many people to relocate, as well as detrimentally affecting mental health and spreading disease as vectors such as mosquitoes shift to new habitats (SN: 5/12/20; SN: 10/7/19).

    Adaptation is especially needed in cities, which are growing and expected to contain two-thirds of the world’s population by 2050, including climate refugees from elsewhere, the new report finds. Urban communities are becoming increasingly vulnerable to extreme heat waves, urban heat island effects, floods and storm surges (SN: 9/18/21).

    Outside of cities, the breakdown of ecosystems and loss of biodiversity severely impacts the people who rely on natural systems for their livelihoods, the report emphasizes. Farmers in the global south are finding it increasingly challenging to grow crops as a result of droughts, heat waves, floods and sea-level rise (SN: 9/24/21). People who make their living fishing are being forced to travel greater distances to pursue species that are altering their natural ranges as ocean temperatures warm.

    Key to adapting to these impacts is the restoration and preservation of natural ecosystems, the report states. Conserving 30 to 50 percent of the planet’s land, ocean and freshwater ecosystems will help support biodiversity and enhance climate resilience (SN: 4/22/20). Preserving mangrove forests, for instance, along less developed coastlines sequesters large amounts of carbon and protects against storm surges (SN:5/7/21, SN: 6/4/20).

    “The truth is that nature can be our savior,” said Inger Andersen, executive director of the U.N. Environment Programme, at a February 28 news conference announcing the report’s release. “But only if we save it first.”

    Still, the natural world and many of the “services” it provides to humankind, such as carbon storage and flood control, begin to break down more rapidly at about 1.5° C above preindustrial temperatures, the report notes. And the window to prevent that from happening is closing. “We are on a trajectory to losing many of these systems and the services they provide,” says Borja Reguero, a coastal science researcher at the University of California, Santa Cruz, who reviewed the report.

    What that means is there is no time to waste. “We simultaneously need to reduce our greenhouse gas emissions, adapt to reduce the risks of climate change and also address losses and damages that are already being experienced,” Adelle Thomas, a climate scientist at the University of the Bahamas in Nassau, said at a February 27 news briefing. Thomas is the lead author of the new report’s chapter on key risks across sectors and regions.

    “And we have a very limited amount of time left to do this,” she stressed.

  • New approach to flexible robotics and metamaterials design mimics nature, encourages sustainability

    A new study challenges the conventional approach to designing soft robotics and a class of materials called metamaterials by harnessing the power of computer algorithms. Researchers from the University of Illinois Urbana-Champaign and the Technical University of Denmark can now build multimaterial structures without relying on human intuition or trial and error, producing highly efficient actuators and energy absorbers that mimic designs found in nature.
    The study, led by Illinois civil and environmental engineering professor Shelly Zhang, uses optimization theory and an algorithm-based design process called topology optimization. Also known as digital synthesis, the design process builds composite structures that can precisely achieve complex prescribed mechanical responses.
    The study results are published in the Proceedings of the National Academy of Sciences.
    “The complex mechanical responses called for in soft robotics and metamaterials require the use of multiple materials — but building these types of structures can be a challenge,” Zhang said. “There are so many materials to choose from, and determining the optimal combination of materials to fit a specific function presents an overwhelming amount of data for a researcher to process.”
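    As a toy illustration of the kind of search the algorithm automates, the Python sketch below tunes a per-element mix of a soft and a stiff material in a small spring model so that the assembly hits a prescribed overall stiffness. It uses a SIMP-style material interpolation and plain gradient descent; it is a schematic stand-in for digital synthesis on a trivially small problem, not the authors’ method or code.
```python
import numpy as np

n = 8                                        # design elements (springs acting in parallel)
k_soft, k_stiff, penal = 1.0, 10.0, 3.0      # two candidate materials and a SIMP-style penalty
k_target = 30.0                              # prescribed overall stiffness to match

def overall_stiffness(x):
    # interpolate each element's stiffness between the soft and stiff material
    k = k_soft + np.clip(x, 0.0, 1.0) ** penal * (k_stiff - k_soft)
    return np.sum(k)                         # parallel springs simply add

def objective(x):
    return (overall_stiffness(x) - k_target) ** 2

x = np.full(n, 0.5)                          # start from an even mix everywhere
step, eps = 1e-4, 1e-6
for _ in range(500):
    grad = np.array([(objective(x + eps * e) - objective(x)) / eps for e in np.eye(n)])
    x = np.clip(x - step * grad, 0.0, 1.0)   # gradient step with box constraints on the mix

print(np.round(x, 2), round(overall_stiffness(x), 2))   # material layout that hits ~k_target
```
    Real topology optimization applies the same recipe, material interpolation plus gradient-based updates, to many thousands of finite-element densities and to far richer prescribed responses such as buckling or energy absorption.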
    Zhang’s team set its sights on designing macroscale structures with the prescribed properties of swift stiffening, large-scale deformation buckling, multiphase stability and long-lasting force plateaus.
    The new digital synthesis process generated structures with optimal geometric characteristics composed of the optimal materials for the prescribed functions.

  • Physicists bring a once-theoretical effect of quantum matter into observable reality

    Physicists at UC Santa Barbara have become the first to experimentally observe a quirky behavior of the quantum world: a “quantum boomerang” effect that occurs when particles in a disordered system are kicked out of their locations. Instead of landing elsewhere, as one might expect, the particles turn around, return to where they started and stop there.
    “It’s really a fundamentally quantum mechanical effect,” said atomic physicist David Weld, whose lab produced the effect and documented it in a paper published in Physical Review X. “There’s no classical explanation for this phenomenon.”
    The boomerang effect has its roots in a phenomenon that physicist Philip Anderson predicted roughly 60 years ago, a disorder-induced behavior called Anderson localization which inhibits transport of electrons. The disorder, according to the paper’s lead author Roshan Sajjad, can be the result of imperfections in a material’s atomic lattice, whether they be impurities, defects, misalignments or other disturbances.
    “This type of disorder will keep them from basically dispersing anywhere,” Sajjad said. As a result, the electrons localize instead of zipping along the lattice, turning what would otherwise be a conducting material into an insulator. The quantum boomerang effect was predicted a few years ago to arise from this rather sticky quantum condition.
    Launching electrons in a disordered system away from their localized positions and following them to observe their behavior is extremely difficult, if not currently impossible, but the Weld Lab had a few tricks up its sleeve. Using a gas of 100,000 ultracold lithium atoms suspended in a standing wave of light and “kicking” the atoms to emulate a so-called quantum kicked rotor (“similar to a periodically kicked pendulum,” both Weld and Sajjad said), the researchers were able to create the lattice and the disorder and to observe the launch and return of the boomerang. They worked in momentum space, a method that sidesteps some experimental difficulties without changing the underlying physics of the boomerang effect.
    “In normal, position space, if you’re looking for the boomerang effect, you’d give your electron some finite velocity and then look for whether it came back to the same spot,” Sajjad explained. “Because we’re in momentum space, we start with a system that is at zero average momentum, and we look for some departure followed by a return to zero average momentum.”
    Using their quantum kicked rotor they pulsed the lattice a few dozen times, noting an initial shift in average momentum. Over time and despite repeated kicks, however, average momentum returned to zero.
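    A flavor of this physics can be reproduced numerically. The Python sketch below is a minimal split-step simulation of a quantum kicked rotor with generic, illustrative parameters (not the experiment’s): starting from a zero-momentum state, the kinetic energy grows over the first several kicks and then saturates, which is the dynamical localization in momentum space that plays the role of Anderson localization here.
```python
import numpy as np

N = 2048                                   # angular grid points / momentum states
theta = 2 * np.pi * np.arange(N) / N
m = np.fft.fftfreq(N, d=1.0 / N)           # integer momentum quantum numbers

K = 15.0                                   # kick strength (illustrative, chaotic regime)
hbar_eff = 2.89                            # effective Planck constant (illustrative)

kick = np.exp(-1j * (K / hbar_eff) * np.cos(theta))   # kick operator, applied in angle space
free = np.exp(-1j * hbar_eff * m**2 / 2.0)            # free flight, applied in momentum space

psi = np.ones(N, dtype=complex) / np.sqrt(N)          # zero-momentum initial state

energy = []
for _ in range(60):                        # a few dozen kicks, as in the experiment
    psi = kick * psi
    psi_m = np.fft.fft(psi) / np.sqrt(N)   # unitary transform to momentum space
    psi_m *= free
    prob = np.abs(psi_m) ** 2
    energy.append(0.5 * np.sum(prob * (hbar_eff * m) ** 2))   # kinetic energy <p^2>/2
    psi = np.fft.ifft(psi_m * np.sqrt(N))  # back to angle space for the next kick

print([round(e, 1) for e in energy[:5]], [round(e, 1) for e in energy[-5:]])
# Early kicks heat the cloud; later kicks barely do, the signature of localization.
```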

  • BioCro software for growing virtual crops improved

    A team from the University of Illinois has revamped the popular crop growth simulation software BioCro, making it a more user-friendly and efficient way to predict crop yield. The updated version, BioCro II, allows modelers to use the technology much more easily and includes faster and more accurate algorithms.
    “In the original BioCro, all the math that the modelers were using was mixed into the programming language, which many people weren’t familiar with, so it was easy to make mistakes,” said Justin McGrath, a Research Plant Physiologist for the U.S. Department of Agriculture, Agricultural Research Service (USDA-ARS) at Illinois. “BioCro II separates those so modelers can do less programming and can instead focus on the equations.”
    Separating the equations from the programming language allows researchers to try new simulations more easily. For example, if a project is looking at how a gene can help plants to use light more efficiently, the equations for that specific gene can be added to existing models, rather than having to change the entire model to include the new information. This development also allows for the software to operate well with other models, a large improvement from the original BioCro.
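    The design principle, equations kept in small swappable modules combined by a generic driver, can be sketched in a few lines of generic code. The Python example below illustrates that idea only; it is not BioCro II’s actual interface, equations or parameter values.
```python
import math

def light_use(state, params):
    """Toy equation module: daily biomass gain from intercepted light."""
    intercepted = state["light"] * (1.0 - math.exp(-params["k"] * state["leaf_area"]))
    return {"biomass": params["rue"] * intercepted}

def leaf_growth(state, params):
    """Toy equation module: leaf-area expansion proportional to current biomass."""
    return {"leaf_area": params["leaf_fraction"] * params["sla"] * state["biomass"]}

def run_model(modules, state, params, days):
    """Generic driver: collect each module's daily increments, then apply them."""
    for _ in range(days):
        increments = {}
        for module in modules:
            for key, delta in module(state, params).items():
                increments[key] = increments.get(key, 0.0) + delta
        for key, delta in increments.items():
            state[key] += delta
    return state

params = {"k": 0.6, "rue": 1.5, "leaf_fraction": 0.4, "sla": 0.02}   # made-up values
state = {"light": 20.0, "leaf_area": 0.1, "biomass": 1.0}
print(run_model([light_use, leaf_growth], state, params, days=30))
# Testing a hypothetical gene that improves light use means swapping in a different
# light_use module; the driver and the other equations stay untouched.
```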
    In a recent study, published in in silico Plants, McGrath and his team discuss all the improvements they made to the original BioCro software, and why they were necessary to improve modeling capabilities for researchers.
    “If you’ve got a gene and you’re wondering how much it can improve yield, you have a tiny piece in the context of the whole plant. Modeling lets you take that one change, put it in the plant and compare yield with and without that change,” said Edward Lochocki, lead author on the paper and postdoctoral researcher for RIPE. “With the updates we’ve made in BioCro II, if you have ten gene changes to make, you can look at all of them quickly and gauge relative importance before moving the work into the field.”
    This work is part of Realizing Increased Photosynthetic Efficiency (RIPE), an international research project that aims to increase global food production by developing food crops that turn the sun’s energy into food more efficiently, with support from the Bill & Melinda Gates Foundation, Foundation for Food & Agriculture Research, and U.K. Foreign, Commonwealth & Development Office.
    “BioCro II represents a complete revamp of the original BioCro, eliminating significant duplication of code, improving the efficiency of code, and eliminating hard-wired parameters,” said RIPE Director Stephen Long, Ikenberry Endowed University Chair of Crop Sciences and Plant Biology at Illinois’ Carl R. Woese Institute for Genomic Biology. “All these changes make it much easier to use the model for new species and cultivars, as well as link to other models, as indeed recently demonstrated by adapting BioCro II for soybean.”
    With the latest updates, crop modeling with BioCro II will give researchers the ability to test ideas quickly and get results to farmers faster.
    RIPE is led by the University of Illinois in partnership with The Australian National University, Chinese Academy of Sciences, Commonwealth Scientific and Industrial Research Organisation, Lancaster University, Louisiana State University, University of California, Berkeley, University of Cambridge, University of Essex, and U.S. Department of Agriculture, Agricultural Research Service.

  • Metasurface-based antenna turns ambient radio waves into electric power

    Researchers have developed a new metasurface-based antenna that represents an important step toward making it practical to harvest energy from radio waves, such as the ones used in cell phone networks or Bluetooth connections. This technology could potentially provide wireless power to sensors, LEDs and other simple devices with low energy requirements.
    “By eliminating wired connections and batteries, these antennas could help reduce costs, improve reliability and make some electrical systems more efficient,” said research team leader Jiangfeng Zhou from the University of South Florida. “This would be useful for powering smart home sensors such as those used for temperature, lighting and motion or sensors used to monitor the structure of buildings or bridges, where replacing a battery might be difficult or impossible.”
    In the journal Optical Materials Express, the researchers report that lab tests of their new antenna showed that it can harvest 100 microwatts of power, enough to power simple devices, from low power radio waves. This was possible because the metamaterial used to make the antenna exhibits perfect absorption of radio waves and was designed to work with low intensities.
    “Although more work is needed to miniaturize the antenna, our device crosses a key threshold of 100 microwatts of harvested power with high efficiency using ambient power levels found in the real world,” said Clayton Fowler, the team member who fabricated the sample and performed the measurements. “The technology could also be adapted so that a radio wave source could be provided to power or charge devices around a room.”
    Harvesting energy from the air
    Scientists have been trying to capture energy from radio waves for quite some time, but it has been difficult to obtain enough energy to be useful. This is changing thanks to the development of metamaterials and the ever-growing number of ambient sources of radio frequency energy available, such as cell phone networks, Wi-Fi, GPS, and Bluetooth signals.
    “With the huge explosion in radio wave-based technologies, there will be a lot of waste electromagnetic emissions that could be collected,” said Zhou. “This, combined with advancements in metamaterials, has created a ripe environment for new devices and applications that could benefit from collecting this waste energy and putting it to use.”
    Metamaterials use small, carefully designed structures to interact with light and radio waves in ways that naturally occurring materials do not. To make the energy-harvesting antenna, the researchers used a metamaterial designed for high absorption of radio waves that drives a higher voltage across the device’s diode. This improved its efficiency at turning radio waves into power, particularly at low intensity.
    Testing with ambient power levels
    For lab tests of the device, which measured 16 cm by 16 cm, the researchers measured the amount of power harvested while changing the power and frequency of a radio source between 0.7 and 2.0 GHz. They demonstrated the ability to harvest 100 microwatts of power from radio waves with an intensity of just 0.4 microwatts per square centimeter, approximately the intensity of the radio waves about 100 meters from a cell phone tower.
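    A quick back-of-the-envelope check, assuming the quoted intensity is uniform over the full 16 cm by 16 cm aperture, shows why that figure points to a conversion efficiency consistent with the metamaterial’s near-perfect absorption:
```python
area_cm2 = 16 * 16                    # antenna aperture: 256 cm^2
intensity_uW_per_cm2 = 0.4            # reported ambient-level intensity
incident_uW = area_cm2 * intensity_uW_per_cm2
print(incident_uW)                    # about 102 microwatts falling on the antenna
# Harvesting ~100 of the ~102 incident microwatts corresponds to a radio-wave-to-DC
# conversion efficiency near 100% under this simple uniform-illumination assumption.
```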
    “We also placed a cell phone very close to the antenna during a phone call, and it captured enough energy to power an LED during the call,” said Zhou. “Although it would be more practical to harvest energy from cell phone towers, this demonstrated the power capturing abilities of the antenna.”
    Because the current version of the antenna is much larger than most of the devices it would potentially power, the researchers are working to make it smaller. They would also like to make a version that could collect energy from multiple types of radio waves simultaneously so that more energy could be gathered.
    Story Source:
    Materials provided by Optica. Note: Content may be edited for style and length.

  • Spintronics: Innovative crystals for future computer electronics

    While modern computers are already very fast, they also consume vast amounts of electricity. For some years now, a new technology called spintronics has been much talked about; although still in its infancy, it could one day revolutionise computer technology. The word is a portmanteau of “spin” and “electronics,” because in these components information is carried not by electrons flowing through computer chips but by the electrons’ spin. A team of researchers including staff from Goethe University Frankfurt has now identified materials whose magnetic responses are surprisingly fast, making them promising for spintronics. The results have been published in the journal Nature Materials.
    “You have to imagine the electron spins as if they were tiny magnetic needles which are attached to the atoms of a crystal lattice and which communicate with one another,” says Cornelius Krellner, Professor for Experimental Physics at Goethe University Frankfurt. How these magnetic needles interact with one another depends fundamentally on the properties of the material. To date, spintronics research has focused above all on ferromagnetic materials, in which, much like in iron magnets, the magnetic needles prefer to point in the same direction. In recent years, however, attention has increasingly turned to so-called antiferromagnets, because these materials promise even faster and more efficient switching than other spintronic materials.
    With antiferromagnets the neighbouring magnetic needles always point in opposite directions. If an atomic magnetic needle is pushed in one direction, the neighbouring needle turns to face in the opposite direction. This in turn causes the next but one neighbour to point in the same direction as the first needle again. “As this interplay takes place very quickly and with virtually no friction loss, it offers considerable potential for entirely new forms of electronic componentry,” explains Krellner.
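    The alternating arrangement can be pictured with a toy model (not the paper’s actual system): a short chain of Ising-like “needles” with antiferromagnetic coupling J > 0, where the energy of each neighbouring pair is J times the product of their orientations (+1 or -1), and is therefore lowest when they point in opposite directions.
```python
import numpy as np

J, N = 1.0, 12                         # toy antiferromagnetic coupling and chain length

def energy(spins):
    # nearest-neighbour interaction energy: J * s_i * s_(i+1), summed along the chain
    return J * np.sum(spins[:-1] * spins[1:])

# Push the first needle "up" and let each neighbour respond in turn.
spins = np.zeros(N, dtype=int)
spins[0] = +1
for i in range(1, N):
    spins[i] = -spins[i - 1]           # each needle points opposite to the previous one

aligned = np.ones(N, dtype=int)        # all needles parallel, for comparison
print(spins, energy(spins))            # alternating (Neel) order, energy -(N-1)*J
print(aligned, energy(aligned))        # parallel needles, energy +(N-1)*J, unfavourable here
```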
    Crystals containing atoms from the group of rare earths are regarded as especially interesting candidates for spintronics, as these comparatively heavy atoms have strong magnetic moments; chemists call the corresponding electron states 4f orbitals. The rare-earth metals, some of which are neither rare nor expensive, include elements such as praseodymium and neodymium, which are also used in magnet technology. The research team has now studied a total of seven materials with different rare-earth atoms, from praseodymium to holmium.
    The problem in the development of spintronic materials is that perfectly designed crystals are required for such components, as the smallest discrepancies immediately have a negative impact on the overall magnetic order in the material. This is where the expertise in Frankfurt came into play. “The rare earths melt at about 1000 degrees Celsius, but the rhodium that is also needed for the crystal does not melt until about 2000 degrees Celsius,” says Krellner. “This is why customary crystallisation methods do not function here.”
    Instead the scientists used hot indium as a solvent. The rare earths, as well as the rhodium and silicon that are required, dissolve in this at about 1500 degrees Celsius. The graphite crucible was kept at this temperature for about a week and then gently cooled. As a result the desired crystals grew in the form of thin disks with an edge length of two to three millimetres. These were then studied by the team with the aid of X-rays produced on the Berlin synchrotron BESSY II and on the Swiss Light Source of the Paul Scherrer Institute in Switzerland.
    “The most important finding is that in the crystals which we have grown the rare-earth atoms react magnetically with one another very quickly and that the strength of these reactions can be specifically adjusted through the choice of atoms,” says Krellner. This opens up the path for further optimisation; after all, spintronics is still purely fundamental research and years away from the production of commercial components.
    There are still a great many problems to be solved on the path to market maturity, however. For instance, the crystals, which are produced in blazing heat, only deliver convincing magnetic properties at temperatures below minus 170 degrees Celsius. “We suspect that the operating temperatures can be raised significantly by adding iron atoms or similar elements,” says Krellner. “But it remains to be seen whether the magnetic properties are then just as positive.” Thanks to the new results, however, the researchers now have a better idea of where it makes sense to change parameters.
    Story Source:
    Materials provided by Goethe University Frankfurt. Note: Content may be edited for style and length.