More stories

  • Climate change made Europe’s flash floods in July more likely

    Climate change has increased the likelihood of heavy downpours in Western Europe such as the July rains that led to devastating flash floods, researchers affiliated with the World Weather Attribution network report August 23. Such extreme rains are 1.2 to 9 times more likely to happen — and those downpours are 3 to 19 percent heavier — as a result of human-caused climate change, the team found.

    The World Weather Attribution network conducts rapid analyses of extreme events to assess the contribution of climate change (SN: 7/7/21). The new study focused on two regions where record-setting rains fell July 12–15 and triggered floods that killed more than 200 people.

    In a single day, an average of 93 millimeters of rain fell near Germany’s Ahr and Erft rivers; in just two days, 106 millimeters of rain fell in Belgium’s Meuse River region. With many river measurement stations destroyed, the researchers assessed the contribution of climate change to the intense rainfall using climate simulations that compare conditions with and without human-caused climate change.
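
    The “more likely” figures come from comparing the probability of rains this heavy in simulations of today’s climate against simulations of a counterfactual climate without human influence. Here is a minimal sketch of that probability-ratio logic, using synthetic Gumbel-distributed rainfall whose parameters are invented for illustration (they are not the study’s values, only tuned so the factual return period lands near the once-in-400-years ballpark quoted below):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic one-day rainfall maxima (mm) from two hypothetical ensembles:
    # "factual" runs include human-caused warming, "counterfactual" runs omit it.
    factual = rng.gumbel(loc=40, scale=9, size=1_000_000)
    counterfactual = rng.gumbel(loc=37, scale=8.5, size=1_000_000)

    threshold = 93.0  # observed one-day rainfall near the Ahr and Erft (mm)

    p_fact = np.mean(factual >= threshold)
    p_counter = np.mean(counterfactual >= threshold)

    print(f"probability ratio: {p_fact / p_counter:.1f}")  # "x times more likely"
    print(f"return period now: {1 / p_fact:.0f} years")
    ```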

    That intense rainfall might occur once every 400 years under current climate conditions, but those odds are likely to increase as the world continues to warm, said coauthor Maarten van Aalst on August 23 at a news conference on the report. It’s “still a rare event, but a rare event we should prepare for,” said van Aalst, a climate and disaster risk researcher at the University of Twente in the Netherlands and the director of the Red Cross Red Crescent Climate Centre.

    That finding is consistent with data cited in the Intergovernmental Panel on Climate Change’s sixth assessment report, which notes that as global temperatures continue to rise, western and central Europe will see more intense rainfall events (SN: 8/9/21).

  • Mathematicians build an algorithm to ‘do the twist’

    Mathematicians at the Center for Advanced Mathematics for Energy Research Applications (CAMERA) at Lawrence Berkeley National Laboratory (Berkeley Lab) have developed a mathematical algorithm to decipher the rotational dynamics of twisting particles in large complex systems from the X-ray scattering patterns observed in highly sophisticated X-ray photon correlation spectroscopy (XPCS) experiments.
    These experiments (designed to study the properties of suspensions and solutions of colloids, macromolecules, and polymers) have been established as key scientific drivers of many of the ongoing coherent light source upgrades within the U.S. Department of Energy (DOE) complex. The new mathematical methods, developed by the CAMERA team of Zixi Hu, Jeffrey Donatelli, and James Sethian, have the potential to reveal far more information about the function and properties of complex materials than was previously possible.
    Particles in a suspension undergo Brownian motion, jiggling around as they move (translate) and spin (rotate). The sizes of these random fluctuations depend on the shape and structure of the materials and contain information about dynamics, with applications across molecular biology, drug discovery, and materials science.
    XPCS works by focusing a coherent beam of X-rays to capture light scattered off of particles in suspension. A detector picks up the resulting speckle patterns, which contain tiny fluctuations in the signal that encode detailed information about the dynamics of the observed system. To capitalize on this capability, the teams behind the upcoming coherent light source upgrades at Berkeley Lab’s Advanced Light Source (ALS), Argonne’s Advanced Photon Source (APS), and SLAC’s Linac Coherent Light Source are all planning some of the world’s most advanced XPCS experiments, taking advantage of these sources’ unprecedented coherence and brightness.
    But once you collect the data from all these images, how do you get any useful information out of them? A workhorse technique to extract dynamical information from XPCS is to compute what’s known as the temporal autocorrelation, which measures how the pixels in the speckle patterns change after a certain passage of time. The autocorrelation function stitches the still images together, just as an old-time movie comes to life as closely related postcard images fly by.
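    As a concrete, much-simplified picture of that workhorse step, here is a minimal sketch of the standard temporal intensity autocorrelation g2(tau) computed from a stack of detector frames. The synthetic data and array shapes are illustrative assumptions, not the CAMERA team’s algorithm:

    ```python
    import numpy as np

    def g2(frames, max_lag):
        """g2(tau) = <I(t) I(t+tau)> / <I(t)>^2, averaged over time and pixels.

        frames: array of shape (T, H, W) holding T speckle images.
        """
        mean_sq = frames.mean(axis=0) ** 2                      # <I>^2 per pixel
        result = np.empty(max_lag)
        for tau in range(1, max_lag + 1):
            corr = (frames[:-tau] * frames[tau:]).mean(axis=0)  # <I(t)I(t+tau)>
            result[tau - 1] = (corr / mean_sq).mean()           # pixel average
        return result

    # Toy usage: 500 synthetic 64x64 frames of uncorrelated speckle-like noise.
    # Uncorrelated frames give g2 ~ 1 at every lag; real dynamics appear as a
    # decay from a higher plateau toward 1 on the system's relaxation timescale.
    rng = np.random.default_rng(1)
    frames = rng.exponential(1.0, size=(500, 64, 64))
    print(g2(frames, max_lag=5))
    ```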
    Current algorithms have mainly been limited to extracting translational motions; think of a pogo stick jumping from spot to spot. However, no previous algorithms were capable of extracting “rotational diffusion” information about how structures spin and rotate, information that is critical to understanding the function and dynamical properties of a physical system. Getting at this hidden information is a major challenge.

  • Statistics say large pandemics are more likely than we thought

    The COVID-19 pandemic may be the deadliest viral outbreak the world has seen in more than a century. But statistically, such extreme events aren’t as rare as we may think, asserts a new analysis of novel disease outbreaks over the past 400 years.
    The study, appearing the week of Aug. 23 in the Proceedings of the National Academy of Sciences, used a newly assembled record of past outbreaks to estimate the intensity of those events and the yearly probability of them recurring.
    It found the probability of a pandemic with similar impact to COVID-19 is about 2% in any year, meaning that someone born in the year 2000 would have about a 38% chance of experiencing one by now. And that probability is only growing, which the authors say highlights the need to adjust perceptions of pandemic risks and expectations for preparedness.
    “The most important takeaway is that large pandemics like COVID-19 and the Spanish flu are relatively likely,” said William Pan, Ph.D., associate professor of global environmental health at Duke and one of the paper’s co-authors. Understanding that pandemics aren’t so rare should raise the priority of efforts to prevent and control them in the future, he said.
    The study, led by Marco Marani, Ph.D., of the University of Padua in Italy, used new statistical methods to measure the scale and frequency of disease outbreaks over the past four centuries for which there was no immediate medical intervention. Their analysis, which covered a murderer’s row of pathogens including plague, smallpox, cholera, typhus and novel influenza viruses, found considerable variability in the rate at which pandemics have occurred in the past. But they also identified patterns that allowed them to describe the probabilities of similar-scale events happening again.
    In the case of the deadliest pandemic in modern history, the Spanish flu, which killed more than 30 million people between 1918 and 1920, the probability of a pandemic of similar magnitude occurring ranged from 0.3% to 1.9% per year over the time period studied. Put another way, those figures mean it is statistically likely that a pandemic of such extreme scale will occur within the next 400 years.
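    A quick back-of-envelope check of those figures, assuming a constant yearly probability and independent years (the study itself allows the risk to vary over time, so this is only a sanity check):

    ```python
    def prob_at_least_one(p_yearly, years):
        """P(at least one event) over `years` independent years."""
        return 1 - (1 - p_yearly) ** years

    # COVID-scale pandemic at ~2% per year: someone born in 2000 had,
    # by 2021, roughly a 1 - 0.98**21 ~ 35% chance of living through one;
    # the study's ~38% also folds in the growing yearly risk.
    print(prob_at_least_one(0.02, 21))

    # Spanish-flu-scale pandemic at 0.3% to 1.9% per year over 400 years:
    for p in (0.003, 0.019):
        print(p, prob_at_least_one(p, 400))  # ~0.70 and ~0.9995
    ```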

  • Layered graphene with a twist displays unique quantum confinement in 2-D

    Scientists studying two different configurations of bilayer graphene — the two-dimensional (2-D), atom-thin form of carbon — have detected electronic and optical interlayer resonances. In these resonant states, electrons bounce back and forth between the two atomic planes in the 2-D interface at the same frequency. By characterizing these states, they found that twisting one of the graphene layers by 30 degrees relative to the other, instead of stacking the layers directly on top of each other, shifts the resonance to a lower energy. From this result, just published in Physical Review Letters, they deduced that the distance between the two layers increased significantly in the twisted configuration, compared to the stacked one. When this distance changes, so do the interlayer interactions, influencing how electrons move in the bilayer system. An understanding of this electron motion could inform the design of future quantum technologies for more powerful computing and more secure communication.
    “Today’s computer chips are based on our knowledge of how electrons move in semiconductors, specifically silicon,” said first and co-corresponding author Zhongwei Dai, a postdoc in the Interface Science and Catalysis Group at the Center for Functional Nanomaterials (CFN) at the U.S. Department of Energy (DOE)’s Brookhaven National Laboratory. “But the physical properties of silicon are reaching a physical limit in terms of how small transistors can be made and how many can fit on a chip. If we can understand how electrons move at the small scale of a few nanometers in the reduced dimensions of 2-D materials, we may be able to unlock another way to utilize electrons for quantum information science.”
    At a few nanometers, or billionths of a meter, the size of a material system is comparable to that of the wavelength of electrons. When electrons are confined in a space with dimensions of their wavelength, the material’s electronic and optical properties change. These quantum confinement effects are the result of quantum mechanical wave-like motion rather than classical mechanical motion, in which electrons move through a material and are scattered by random defects.
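    A textbook particle-in-a-box estimate shows why a few nanometers is the scale at which this matters. The numbers below use the free-electron mass for simplicity (electrons in graphene actually behave as massless Dirac particles, so this is only an order-of-magnitude illustration, not the paper’s calculation):

    ```python
    h = 6.626e-34    # Planck constant (J*s)
    m_e = 9.109e-31  # free-electron mass (kg)
    k_B = 1.381e-23  # Boltzmann constant (J/K)
    eV = 1.602e-19   # joules per electronvolt

    L = 3e-9         # confinement length: "a few nanometers"

    # Ground-state energy of a particle in a 1-D box: E_1 = h^2 / (8 m L^2)
    E1 = h**2 / (8 * m_e * L**2) / eV
    kT = k_B * 300 / eV  # room-temperature thermal energy

    print(f"E1 = {E1 * 1e3:.0f} meV, kT(300 K) = {kT * 1e3:.0f} meV")
    # E1 (~42 meV) exceeds kT (~26 meV), so the discrete quantum levels, not
    # classical thermal motion, set the electronic and optical response.
    ```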
    For this research, the team selected a simple material model — graphene — to investigate quantum confinement effects, applying two different probes: electrons and photons (particles of light). To probe both electronic and optical resonances, they used a special substrate onto which the graphene could be transferred. Co-corresponding author and CFN Interface Science and Catalysis Group scientist Jurek Sadowski had previously designed this substrate for the Quantum Material Press (QPress). The QPress is an automated tool under development in the CFN Materials Synthesis and Characterization Facility for the synthesis, processing, and characterization of layered 2-D materials. Conventionally, scientists exfoliate 2-D material “flakes” from 3-D parent crystals (e.g., graphene from graphite) on a silicon dioxide substrate several hundred nanometers thick. However, this substrate is insulating, and thus electron-based interrogation techniques don’t work. So, Sadowski and CFN scientist Chang-Yong Nam and Stony Brook University graduate student Ashwanth Subramanian deposited a conductive layer of titanium oxide only three nanometers thick on the silicon dioxide substrate.
    “This layer is transparent enough for optical characterization and determination of the thickness of exfoliated flakes and stacked monolayers while conductive enough for electron microscopy or synchrotron-based spectroscopy techniques,” explained Sadowski.
    At the University of Pennsylvania, the Charlie Johnson Group (Rebecca W. Bushnell Professor of Physics and Astronomy Charlie Johnson, postdoc Qicheng Zhang, and former postdoc Zhaoli Gao, now an assistant professor at the Chinese University of Hong Kong) grew the graphene on metal foils and transferred it onto the titanium oxide/silicon dioxide substrate. When graphene is grown in this way, all three domains (single layer, stacked, and twisted) are present.

  • Compact system designed for high-precision, robot-based surface measurements

    Researchers have developed a lightweight optical system for 3D inspection of surfaces with micron-scale precision. The new measurement tool could greatly enhance quality control inspection for high-tech products including semiconductor chips, solar panels and consumer electronics such as flat panel televisions.
    Because vibrations make it difficult to capture precision 3D measurements on the production line, samples are periodically taken for analysis in a lab. However, any defective products made while waiting for results must be discarded.
    To create a system that could operate in the vibration-prone environment of an industrial manufacturing plant, researchers headed by Georg Schitter from Technische Universität Wien in Austria combined a compact 2D fast steering mirror with a high precision 1D confocal chromatic sensor.
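    In rough terms, the steering mirror provides the lateral (x, y) position of each measurement point while the confocal sensor supplies the distance along the beam. The small-angle sketch below is purely illustrative; the pivot-at-origin geometry and all names are assumptions, not the published design, which would require a full calibration model:

    ```python
    import numpy as np

    def surface_point(theta_x, theta_y, distance):
        """Map mirror tilts (rad) and a confocal distance reading (mm) to a
        3-D point, for an idealized mirror pivoting at the origin.
        A tilt of theta deflects the reflected beam by 2 * theta.
        """
        direction = np.array([np.tan(2 * theta_x), np.tan(2 * theta_y), 1.0])
        direction /= np.linalg.norm(direction)
        return distance * direction  # sensor measures range along the beam

    # Scanning a 3x3 grid of tilt angles yields a patch of surface points.
    for tx in (-0.01, 0.0, 0.01):
        for ty in (-0.01, 0.0, 0.01):
            print(surface_point(tx, ty, distance=40.0))
    ```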
    “Robot-based inline inspection and measurement systems such as what we developed can enable 100% quality control in industrial production, replacing current sample-based methods,” said Ernst Csencsics, who co-led the research team with Daniel Wertjanz. “This creates a production process that is more efficient because it saves energy and resources.”
    As described in The Optical Society (OSA) journal Applied Optics, the new system is designed to be mounted on a tracking platform attached to a robotic arm for contactless 3D measurements of arbitrary shapes and surfaces. It weighs just 300 grams and measures 75 × 63 × 55 millimeters, about the size of an espresso cup.
    “Our system can measure 3D surface topographies with an unprecedented combination of flexibility, precision, and speed,” said Wertjanz, who is pursuing a PhD on this research topic. “This creates less waste because manufacturing problems can be identified in real time, and processes can be quickly adapted and optimized.”

  • Mathematical model predicts best way to build muscle

    Researchers have developed a mathematical model that can predict the optimum exercise regime for building muscle.
    The researchers, from the University of Cambridge, used methods of theoretical biophysics to construct the model, which can predict how much a given amount of exertion will cause a muscle to grow and how long that growth will take. The model could form the basis of a software product in which users could optimise their exercise regimes by entering a few details of their individual physiology.
    The model is based on earlier work by the same team, which found that a component of muscle called titin is responsible for generating the chemical signals which affect muscle growth.
    The results, reported in the Biophysical Journal, suggest that there is an optimal weight at which to do resistance training for each person and each muscle growth target. Muscles can only be near their maximal load for a very short time, and it is the load integrated over time which activates the cell signalling pathway that leads to synthesis of new muscle proteins. But below a certain value, the load is insufficient to cause much signalling, and exercise time would have to increase exponentially to compensate. The value of this critical load is likely to depend on the particular physiology of the individual.
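    The trade-off described here (little signalling below the critical load, but less and less time available near maximal load) can be caricatured numerically. Every functional form and constant below is invented for illustration; this is not the authors’ titin-based model:

    ```python
    import numpy as np

    F = np.linspace(0.05, 1.0, 400)  # load, as a fraction of maximal load
    F_crit = 0.3                     # toy critical load; below it, little signalling

    # Time a load can be sustained before fatigue: long at low loads,
    # vanishing near maximal load.
    sustain_time = (1.0 - F) ** 2 / (F + 0.05)

    # Signalling rate: zero below the critical load, rising above it.
    rate = np.clip(F - F_crit, 0.0, None)

    # "Load integrated over time": total growth signal per bout.
    signal = rate * sustain_time
    print(f"toy optimum near {F[np.argmax(signal)]:.0%} of maximal load")
    ```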
    We all know that exercise builds muscle. Or do we? “Surprisingly, not very much is known about why or how exercise builds muscles: there’s a lot of anecdotal knowledge and acquired wisdom, but very little in the way of hard or proven data,” said Professor Eugene Terentjev from Cambridge’s Cavendish Laboratory, one of the paper’s authors.
    When exercising, the higher the load, the more repetitions, or the greater the frequency, the greater the increase in muscle size. However, even when looking at the whole muscle, why this happens, or by how much, isn’t known. The answers to both questions get even trickier as the focus goes down to a single muscle or its individual fibres.

  • Smallest biosupercapacitor provides energy for biomedical applications

    The miniaturization of microelectronic sensor technology, microelectronic robots, and intravascular implants is progressing rapidly. However, it also poses major challenges for research. One of the biggest is the development of tiny but efficient energy storage devices that enable the operation of autonomously working microsystems, for example in ever smaller areas of the human body. In addition, these energy storage devices must be biocompatible if they are to be used in the body at all. Now there is a prototype that combines these essential properties. The breakthrough was achieved by an international research team led by Prof. Dr. Oliver G. Schmidt, who holds the Professorship of Materials Systems for Nanoelectronics at Chemnitz University of Technology, initiated the Center for Materials, Architectures and Integration of Nanomembranes (MAIN) there, and is director at the Leibniz Institute for Solid State and Materials Research (IFW) Dresden. The Leibniz Institute of Polymer Research Dresden (IPF) was also involved in the study as a cooperation partner.
    In the current issue of Nature Communications, the researchers report on the smallest microsupercapacitor to date, which already functions in (artificial) blood vessels and can be used as an energy source for a tiny sensor system that measures pH.
    This storage system opens up possibilities for intravascular implants and microrobotic systems for next-generation biomedicine that could operate in hard-to-reach small spaces deep inside the human body. For example, real-time monitoring of blood pH can help detect early tumor growth. “It is extremely encouraging to see how new, extremely flexible, and adaptive microelectronics is making it into the miniaturized world of biological systems,” says research group leader Prof. Dr. Oliver G. Schmidt, who is extremely pleased with this research success.
    The fabrication of the samples and the investigation of the biosupercapacitor were largely carried out at the Research Center MAIN at Chemnitz University of Technology.
    “The architecture of our nano-bio supercapacitors offers the first potential solution to one of the biggest challenges — tiny integrated energy storage devices that enable the self-sufficient operation of multifunctional microsystems,” says Dr. Vineeth Kumar, researcher in Prof. Schmidt’s team and a research associate at the MAIN research center.
    Smaller than a speck of dust — voltage comparable to a AAA battery
    Developing ever smaller energy storage devices in the submillimeter range, so-called nano-biosupercapacitors (nBSCs), for even smaller microelectronic components is more than just a major technical challenge. As a rule, such supercapacitors rely on materials that are not biocompatible, such as corrosive electrolytes, and they discharge quickly in the event of defects or contamination. Both aspects make them unsuitable for biomedical applications in the body. So-called biosupercapacitors (BSCs) offer a solution. They have two outstanding properties: they are fully biocompatible, which means that they can be used in body fluids such as blood and in further medical studies.

  • Discovery could improve reliability of future smart electronics

    An undergraduate student from the University of Surrey has discovered a way to suppress hot-carrier effects that have plagued devices that use thin-film transistor architecture — such as smartwatches and solar panels.
    Hot-carrier effects occur when unwanted electron energy builds up in certain regions of transistors, resulting in devices performing unreliably.
    In her final-year project, Lea Motte studied a new device, the multimodal transistor, an alternative to conventional thin-film transistors, invented and developed by PhD candidate Eva Bestelink and supervisor Dr Radu Sporea at Surrey.
    Lea used a defining feature of multimodal transistors, the separation of controls for introducing electrons into the device and allowing them to move across the transistor. Through computer simulations, Lea discovered that choosing the right voltage to apply to the transport control region can prevent unwanted hot-carrier effects. In addition, it ensures that the current through the transistor remains constant in a wide range of operating conditions.
    In a paper published in the journal Advanced Electronic Materials, PhD student Eva Bestelink systematically studies Lea’s discovery of the unusual behaviour in multimodal transistors by confirming it with measurements in microcrystalline silicon transistors and performing extensive device simulations to understand the device physics that underpins its unique ability.
    This discovery means that future technologies that use multimodal transistors could be more power-efficient, and it could lead to high-performance amplifiers, which are essential for measuring signals from environmental and biological sensors.
    Eva Bestelink, lead author of the study from the University of Surrey, said:
    “We now have a better understanding of what the multimodal transistor can offer when made with materials that cause numerous challenges to regular devices.
    “For circuit designers, this work offers insight into how to operate the device for optimum performance. In the long term, the multimodal transistor offers an alternative for emerging high-performance materials, where traditional solutions are no longer applicable.”
    Story Source: Materials provided by University of Surrey.