More stories

  •

    Stressed teens benefit from coping online, but a little goes a long way

    New research published in the journal Clinical Psychological Science reveals that teenagers (ages 13-17) in low socioeconomic settings who spend a moderate amount of time online after a stressful experience deal with adversity far better than those who spend many hours online or avoid digital technology altogether.
    “Adolescents are smart, and they make use of technology to their own advantage. Because adolescents in disadvantaged settings tend to have fewer local supports, the study sought to find out whether online engagement helped reduce their stress,” said lead author Kathryn Modecki with Griffith University’s Menzies Health Institute and School of Applied Psychology. “There has been a tendency to assume that technology use by teens is negative and harmful, but such a broad assumption isn’t borne out by what we know about the developmental stage of adolescence.”
    To gather firsthand data on teens and technology, the researchers provided iPhones to more than 200 adolescents living in low socioeconomic settings. The teens were instructed to report on their technology use, stressors, and emotions five times a day for a week while using the iPhones exactly as they would use personal smartphones. The data were used to compare the emotional states of adolescents who used technology moderately, excessively, or not at all when coping with stress.
    The results revealed that adolescents who engaged with technology in moderation in the hours after a stressful situation bounced back more readily and experienced smaller surges in negative emotions, like sadness and worry, compared to adolescents who didn’t use technology or who routinely used technology as a coping mechanism.
    “We found a just-right ‘Goldilocks’ effect in which moderate amounts of online coping helped mitigate surges in negative emotions and dips in happiness,” said Modecki. “In the face of daily stressors, when adolescents engaged in emotional support seeking, they experienced better short-term stress relief.”
    According to the researchers, the online space serves not just as a short-term distraction but as a resource for adolescents to find support and information about what is troubling them. By leveling the playing field for accessing that information and support, this coping strategy may be especially pertinent for teens in low-income settings.
    Story Source:
    Materials provided by Association for Psychological Science. Note: Content may be edited for style and length.

  •

    Scientists develop an energy harvesting technology based on ferromagnetic resonance

    Researchers from the Graduate School of Engineering, Osaka City University, have succeeded in storing electricity using the voltage generated by the conversion phenomenon of ferromagnetic resonance (FMR) in an ultrathin magnetic film several tens of nanometers thick.
    The research was conducted under the leadership of Prof. Eiji Shikoh. “We are interested in efficiently using the Earth’s natural resources to harvest energy,” states the professor, “and capturing the energy from electromagnetic waves that surround us through the electromotive force (EMF) they generate in magnetic films under FMR shows potential as one such way.” Their research was published in the journal AIP Advances.
    Ferromagnetic resonance is a state in which applying electromagnetic waves and a static magnetic field to a magnetic medium causes the magnetic moments inside the medium to precess at the same frequency as the applied electromagnetic waves. As a technique, it is often used to probe the magnetic properties of a variety of media, from bulk ferromagnetic materials to nanoscale magnetic thin films.
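As a hedged back-of-envelope illustration (not taken from the study itself), the resonance condition described above can be sketched numerically: in the simplest free-spin approximation, the precession frequency is proportional to the applied static field, with a proportionality constant of roughly 28 GHz per tesla.

```python
# Illustrative sketch of the FMR condition, assuming the simplest
# free-electron-spin approximation: f = (gamma / 2*pi) * B.
# The constant below (~28 GHz/T) is the free-electron value; real alloy
# films deviate from it, so this is a rough guide only.

GYROMAGNETIC_RATIO_GHZ_PER_T = 28.0  # gamma / (2*pi) for a free electron spin

def fmr_frequency_ghz(static_field_tesla: float) -> float:
    """Precession frequency (GHz) at which resonance occurs for a given field."""
    return GYROMAGNETIC_RATIO_GHZ_PER_T * static_field_tesla

# A 0.1 T static field puts the resonance in the microwave band (a few GHz),
# which is why microwave/ESR equipment is used to drive FMR.
print(fmr_frequency_ghz(0.1))
```

This also shows why tuning either the field or the wave frequency moves a film into or out of the resonance condition, as the experiment below exploits.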
    “Research has shown that an EMF is generated in a ferromagnetic metal (FM) that is under FMR,” states Yuta Nogi, first author of the study, “and we explored energy storage possibilities using two FMs that are highly durable, well understood, and thus commonly used in FMR research — an iron-nickel (Ni80Fe20) and iron-cobalt (Co50Fe50) alloy thin film.”
    First, the team confirmed that the two alloy films generated electricity under ferromagnetic resonance, with Ni80Fe20 producing about 28 microvolts and Co50Fe50 about 6 microvolts. To excite FMR, they used an electron spin resonance device to apply the electromagnetic waves, and the device's electromagnet to supply the static magnetic field. Connecting a storage battery directly to the sample film via a conductor, the team observed that both FM samples successfully stored energy after being in a state of FMR for 30 minutes. However, as the resonance time extended, the amount of energy stored with the iron-nickel alloy film did not change, while the iron-cobalt alloy film saw a steady increase.
    “This is due to the respective magnetic field ranges for the FMR excitation,” concludes Prof. Shikoh. Upon investigating the different energy storage characteristics of the thin films, the team found that, when the films were in the same thermal states during the experiments, Co50Fe50 could maintain FMR in a detuned condition, while Ni80Fe20 fell outside the FMR excitation range. “By appropriately controlling the thermal conditions of the FM film,” continues the professor, “EMF generation under ferromagnetic resonance can be used as an energy harvesting technology.”
    Another interesting point about this research is that the team focused on EMF generation itself, independent of its origin. This means that as long as the FMR conditions are met, energy can be stored from electromagnetic waves we interact with daily — for example the Wi-Fi at your favorite café.
    Story Source:
    Materials provided by Osaka City University. Note: Content may be edited for style and length.

  •

    ‘Flashes of Creation’ recounts the Big Bang theory’s origin story

    Flashes of Creation, by Paul Halpern. Basic Books, $30.

    The Big Bang wasn’t always a sure bet. For several decades in the 20th century, researchers wrestled with how the cosmos originated, or whether there was a beginning at all. At the forefront of that debate stood physicists George Gamow and Fred Hoyle: one advocated for an expanding universe that sprouted from a hot, dense state; the other for a cosmos that is eternal and unchanging. Both pioneered contemporary cosmology, laid the groundwork for our understanding of where atoms come from, and brought science to the masses.

    In Flashes of Creation, physicist Paul Halpern recounts Gamow’s and Hoyle’s interwoven stories. The book bills itself as a “joint biography,” but that is a disservice. While Gamow and Hoyle are the central characters, the book is a meticulously researched history of the Big Bang as an idea: from theoretical predictions in the 1920s, to the discovery of its microwave afterglow in 1964, and beyond to the realization in the late 1990s that the expansion of the universe is accelerating.

    Although the development of cosmology was the work of far more than just two scientists, Halpern would be hard-pressed to pick two better mascots. George Gamow was an aficionado of puns and pranks and had a keen sense of how to explain science with charm and whimsy (SN: 8/28/18). The fiercely stubborn Fred Hoyle had a darker, more cynical wit, with an artistic side that showed through in science fiction novels and even the libretto of an opera. Both wrote popular science books — Gamow’s Mr Tompkins series, which explores modern physics through the titular character’s dreams, is a milestone of the genre — and took to the airwaves to broadcast the latest scientific thinking into people’s homes.


    “Gamow and Hoyle were adventurous loners who cared far more about cosmic mysteries than social conventions,” Halpern writes. “Each, in his own way, was a polymath, a rebel, and a master of science communication.”

    While the Big Bang is now entrenched in the modern zeitgeist, it wasn’t always so. The idea can be traced to Georges Lemaître, a physicist and priest who proposed in 1927 that the universe is expanding. A few years later, he suggested that perhaps the cosmos began with all of its matter in a single point — the “primeval atom,” he called it. In the 1940s, Gamow latched on to the idea as a way to explain how all the atomic elements came to be, forged in the “fireball” that would have filled the cosmos in its earliest moments. Hoyle balked at the notion of a moment of creation, convinced that the universe has always existed — and always will exist — in pretty much the same state we find it today. He even coined the term “Big Bang” as a put-down during a 1949 BBC radio broadcast. The elements, Hoyle argued, were forged in stars.

    As far as the elements go, both were right. “One wrote the beginning of the story of element creation,” Halpern writes, “and the other wrote the ending.” We now know that hydrogen and helium nuclei emerged in overwhelming abundance during the first few minutes following the Big Bang. Stars took care of the rest.

    Halpern treats Gamow and Hoyle with reverence and compassion. Re-created scenes provide insight into how both approached science and life. We learn how Gamow, ever the scientist, roped in physicist Niels Bohr to test ideas about why movie heroes always drew their gun faster than villains — a test that involved staging a mock attack with toy pistols. We sit in with Hoyle and colleagues while they discuss a horror film, Dead of Night, whose circular timeline inspired their ideas about an eternal universe.

    In the mid-20th century, two astronomers emerged as spokesmen for dueling ideas about the origin of the cosmos. George Gamow (left) was a passionate defender of the Big Bang theory, arguing that the universe evolved from a hot, dense state. Fred Hoyle (right) upheld the rival “steady state” model, insisting that the universe is eternal and unchanging. (From left: AIP Emilio Segrè Visual Archives, George Gamow Collection; AIP Emilio Segrè Visual Archives, Clayton Collection)

    And Halpern doesn’t shy away from darker moments, inviting readers to know these scientists as flawed human beings. Gamow’s devil-may-care attitude wore on his colleagues, and his excessive drinking took its toll. Hoyle, in his waning decades, embraced outlandish ideas, suggesting that epidemics come from space and that a dinosaur fossil had been tampered with to show an evolutionary link to birds. And he went to his grave in 2001 still railing against the Big Bang.

    Capturing the history of the Big Bang theory is no easy task, but Halpern pulls it off. The biggest mark against the book, in fact, may be its scope. In pulling in all the other characters and side plots that drove 20th century cosmology, Halpern sometimes loses track of Gamow and Hoyle for long stretches. A bit more editing could have sharpened the book’s focus.

    But to anyone interested in how the idea of the Big Bang grew — or how any scientific paradigm changes — Flashes of Creation is a treat and a worthy tribute to two scientific mavericks.

    Buy Flashes of Creation from Bookshop.org. Science News is a Bookshop.org affiliate and will earn a commission on purchases made from links in this article.

  •

    Climate change made Europe’s flash floods in July more likely

    Climate change has increased the likelihood of heavy downpours in Western Europe such as the July rains that led to devastating flash floods, researchers affiliated with the World Weather Attribution network report August 23. Such extreme rains are 1.2 to 9 times more likely to happen — and those downpours are 3 to 19 percent heavier — as a result of human-caused climate change, the team found.

    The World Weather Attribution network conducts quick analyses of extreme events to assess the contribution of climate change (SN: 7/7/21). The new study focused on two regions where record-setting rains fell July 12–15 and triggered floods that killed more than 200 people.

    In a single day, an average 93 millimeters of rain fell near Germany’s Ahr and Erft rivers; in just two days, 106 millimeters of rain fell in Belgium’s Meuse River region. With many river measurement stations destroyed, the researchers focused on assessing the contribution of climate change to the intense rainfall using climate simulations comparing conditions with and without human-caused climate change.

    That intense rainfall might occur once every 400 years under current climate conditions, but those odds are likely to increase as the world continues to warm, said coauthor Maarten van Aalst on August 23 at a news conference on the report. It’s “still a rare event, but a rare event we should prepare for,” said van Aalst, a climate and disaster risk researcher at the University of Twente in the Netherlands and the director of the Red Cross Red Crescent Climate Centre.

    That finding is consistent with data cited in the Intergovernmental Panel on Climate Change’s sixth assessment report, which notes that as global temperatures continue to rise, western and central Europe will see more intense rainfall events (SN: 8/9/21).

  •

    Mathematicians build an algorithm to ‘do the twist’

    Mathematicians at the Center for Advanced Mathematics for Energy Research Applications (CAMERA) at Lawrence Berkeley National Laboratory (Berkeley Lab) have developed a mathematical algorithm to decipher the rotational dynamics of twisting particles in large complex systems from the X-ray scattering patterns observed in highly sophisticated X-ray photon correlation spectroscopy (XPCS) experiments.
    These experiments — designed to study the properties of suspensions and solutions of colloids, macromolecules, and polymers — have been established as key scientific drivers to many of the ongoing coherent light source upgrades occurring within the U.S. Department of Energy (DOE). The new mathematical methods, developed by the CAMERA team of Zixi Hu, Jeffrey Donatelli, and James Sethian, have the potential to reveal far more information about the function and properties of complex materials than was previously possible.
    Particles in a suspension undergo Brownian motion, jiggling around as they move (translate) and spin (rotate). The sizes of these random fluctuations depend on the shape and structure of the materials and contain information about dynamics, with applications across molecular biology, drug discovery, and materials science.
    XPCS works by focusing a coherent beam of X-rays to capture light scattered off of particles in suspension. A detector picks up the resulting speckle patterns, which contain many tiny fluctuations in the signal that encode detailed information about the dynamics of the observed system. To capitalize on this capability, the upcoming coherent light source upgrades at Berkeley Lab’s Advanced Light Source (ALS), Argonne’s Advanced Photon Source (APS), and SLAC’s Linac Coherent Light Source are all planning some of the world’s most advanced XPCS experiments, taking advantage of these sources’ unprecedented coherence and brightness.
    But once you collect the data from all these images, how do you get any useful information out of them? A workhorse technique to extract dynamical information from XPCS is to compute what’s known as the temporal autocorrelation, which measures how the pixels in the speckle patterns change after a certain passage of time. The autocorrelation function stitches the still images together, just as an old-time movie comes to life as closely related postcard images fly by.
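The temporal autocorrelation described above can be sketched in a few lines. This is a minimal, assumed form of the standard XPCS quantity (not the CAMERA team's actual code): for each time lag, the speckle intensity is correlated with itself that many frames later and normalized by the squared mean intensity.

```python
# Minimal sketch of the XPCS temporal autocorrelation,
#   g2(tau) = <I(t) * I(t + tau)> / <I(t)>^2,
# computed for a single pixel's intensity time series. This is the
# textbook definition, not the CAMERA algorithm discussed in the article.
import numpy as np

def g2(intensity: np.ndarray, max_lag: int) -> np.ndarray:
    """Return g2 at lags 1..max_lag for a 1-D intensity time series."""
    mean_sq = intensity.mean() ** 2
    values = []
    for tau in range(1, max_lag + 1):
        # Average product of the signal with a tau-shifted copy of itself.
        values.append(np.mean(intensity[:-tau] * intensity[tau:]) / mean_sq)
    return np.array(values)

# For an uncorrelated (rapidly fluctuating) signal, g2 settles near 1;
# correlated dynamics would decay toward 1 more slowly.
rng = np.random.default_rng(0)
signal = 1.0 + 0.5 * rng.standard_normal(10_000)
curve = g2(signal, max_lag=5)
```

How quickly g2 decays toward its baseline is what encodes the diffusion dynamics; extracting rotational, rather than just translational, motion from that decay is the gap the new algorithm targets.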
    Current algorithms have mainly been limited to extracting translational motions; think of a pogo stick jumping from spot to spot. However, no previous algorithms were capable of extracting “rotational diffusion” information about how structures spin and rotate, information that is critical to understanding the function and dynamical properties of a physical system. Getting at this hidden information is a major challenge.

  •

    Statistics say large pandemics are more likely than we thought

    The COVID-19 pandemic may be the deadliest viral outbreak the world has seen in more than a century. But statistically, such extreme events aren’t as rare as we may think, asserts a new analysis of novel disease outbreaks over the past 400 years.
    The study, appearing the week of Aug. 23 in the Proceedings of the National Academy of Sciences, used a newly assembled record of past outbreaks to estimate the intensity of those events and the yearly probability of them recurring.
    It found the probability of a pandemic with similar impact to COVID-19 is about 2% in any year, meaning that someone born in the year 2000 would have about a 38% chance of experiencing one by now. And that probability is only growing, which the authors say highlights the need to adjust perceptions of pandemic risks and expectations for preparedness.
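The arithmetic behind that compounding risk is simple to sketch. Under the (simplifying, assumed) treatment of years as independent trials, the chance of experiencing at least one such pandemic over n years is one minus the chance of none occurring in any year; the exact exposure window behind the paper's 38% figure is not stated here, so the sample values below are illustrative only.

```python
# Illustrative arithmetic only: if a COVID-scale pandemic has probability
# p in any given year, and years are treated as independent trials, the
# probability of at least one such event over n years is 1 - (1 - p)^n.
# The independence assumption and the exposure windows below are ours,
# not necessarily the paper's exact model.

def prob_at_least_one(p_per_year: float, n_years: int) -> float:
    return 1.0 - (1.0 - p_per_year) ** n_years

# At 2% per year the risk compounds quickly: roughly a third over two
# decades, and growing steadily with each additional year of exposure.
print(prob_at_least_one(0.02, 21))
print(prob_at_least_one(0.02, 24))
```

The same formula shows why even a "rare" 1-in-400-year event becomes likely over a long enough horizon.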
    “The most important takeaway is that large pandemics like COVID-19 and the Spanish flu are relatively likely,” said William Pan, Ph.D., associate professor of global environmental health at Duke and one of the paper’s co-authors. Understanding that pandemics aren’t so rare should raise the priority of efforts to prevent and control them in the future, he said.
    The study, led by Marco Marani, Ph.D., of the University of Padua in Italy, used new statistical methods to measure the scale and frequency of disease outbreaks for which there was no immediate medical intervention over the past four centuries. Their analysis, which covered a murderer’s row of pathogens including plague, smallpox, cholera, typhus and novel influenza viruses, found considerable variability in the rate at which pandemics have occurred in the past. But they also identified patterns that allowed them to describe the probabilities of similar-scale events happening again.
    In the case of the deadliest pandemic in modern history — the Spanish flu, which killed more than 30 million people between 1918 and 1920 — the probability of a pandemic of similar magnitude occurring ranged from 0.3% to 1.9% per year over the time period studied. Put another way, those figures mean it is statistically likely that a pandemic of such extreme scale would occur within the next 400 years.

  •

    Layered graphene with a twist displays unique quantum confinement in 2-D

    Scientists studying two different configurations of bilayer graphene — the two-dimensional (2-D), atom-thin form of carbon — have detected electronic and optical interlayer resonances. In these resonant states, electrons bounce back and forth between the two atomic planes in the 2-D interface at the same frequency. By characterizing these states, they found that twisting one of the graphene layers by 30 degrees relative to the other, instead of stacking the layers directly on top of each other, shifts the resonance to a lower energy. From this result, just published in Physical Review Letters, they deduced that the distance between the two layers increased significantly in the twisted configuration, compared to the stacked one. When this distance changes, so do the interlayer interactions, influencing how electrons move in the bilayer system. An understanding of this electron motion could inform the design of future quantum technologies for more powerful computing and more secure communication.
    “Today’s computer chips are based on our knowledge of how electrons move in semiconductors, specifically silicon,” said first and co-corresponding author Zhongwei Dai, a postdoc in the Interface Science and Catalysis Group at the Center for Functional Nanomaterials (CFN) at the U.S. Department of Energy (DOE)’s Brookhaven National Laboratory. “But the physical properties of silicon are reaching a physical limit in terms of how small transistors can be made and how many can fit on a chip. If we can understand how electrons move at the small scale of a few nanometers in the reduced dimensions of 2-D materials, we may be able to unlock another way to utilize electrons for quantum information science.”
    At a few nanometers, or billionths of a meter, the size of a material system is comparable to that of the wavelength of electrons. When electrons are confined in a space with dimensions of their wavelength, the material’s electronic and optical properties change. These quantum confinement effects are the result of quantum mechanical wave-like motion rather than classical mechanical motion, in which electrons move through a material and are scattered by random defects.
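A hedged back-of-envelope calculation (not from the paper) makes the "comparable to the wavelength of electrons" claim concrete: the de Broglie wavelength of an electron with a given kinetic energy follows from lambda = h / sqrt(2 m E), and for electron-volt-scale energies it lands at roughly a nanometer, right at the confinement scale discussed above.

```python
# Illustrative de Broglie wavelength of a free electron:
#   lambda = h / p,  with p = sqrt(2 * m * E).
# This free-electron estimate ignores band-structure effects in graphene;
# it is only meant to show why wave effects appear at a few nanometers.
import math

PLANCK_H = 6.626e-34        # Planck constant, J*s
ELECTRON_MASS = 9.109e-31   # electron rest mass, kg
EV_TO_JOULE = 1.602e-19     # conversion factor, J per eV

def de_broglie_wavelength_nm(energy_ev: float) -> float:
    momentum = math.sqrt(2.0 * ELECTRON_MASS * energy_ev * EV_TO_JOULE)
    return PLANCK_H / momentum * 1e9  # meters -> nanometers

# An electron with ~1 eV of kinetic energy has a wavelength of about a
# nanometer, comparable to the few-nanometer scales in these 2-D systems.
print(de_broglie_wavelength_nm(1.0))
```

Because the wavelength shrinks only as the square root of the energy, confinement effects persist across a broad range of electron energies at these length scales.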
    For this research, the team selected a simple material model — graphene — to investigate quantum confinement effects, applying two different probes: electrons and photons (particles of light). To probe both electronic and optical resonances, they used a special substrate onto which the graphene could be transferred. Co-corresponding author and CFN Interface Science and Catalysis Group scientist Jurek Sadowski had previously designed this substrate for the Quantum Material Press (QPress). The QPress is an automated tool under development in the CFN Materials Synthesis and Characterization Facility for the synthesis, processing, and characterization of layered 2-D materials. Conventionally, scientists exfoliate 2-D material “flakes” from 3-D parent crystals (e.g., graphene from graphite) on a silicon dioxide substrate several hundred nanometers thick. However, this substrate is insulating, and thus electron-based interrogation techniques don’t work. So, Sadowski and CFN scientist Chang-Yong Nam and Stony Brook University graduate student Ashwanth Subramanian deposited a conductive layer of titanium oxide only three nanometers thick on the silicon dioxide substrate.
    “This layer is transparent enough for optical characterization and determination of the thickness of exfoliated flakes and stacked monolayers while conductive enough for electron microscopy or synchrotron-based spectroscopy techniques,” explained Sadowski.
    In the Charlie Johnson Group at the University of Pennsylvania — Rebecca W. Bushnell Professor of Physics and Astronomy Charlie Johnson, postdoc Qicheng Zhang, and former postdoc Zhaoli Gao (now an assistant professor at the Chinese University of Hong Kong) — grew the graphene on metal foils and transferred it onto the titanium oxide/silicon dioxide substrate. When graphene is grown in this way, all three domains (single layer, stacked, and twisted) are present.

  •

    Compact system designed for high-precision, robot-based surface measurements

    Researchers have developed a lightweight optical system for 3D inspection of surfaces with micron-scale precision. The new measurement tool could greatly enhance quality control inspection for high-tech products including semiconductor chips, solar panels and consumer electronics such as flat panel televisions.
    Because vibrations make it difficult to capture precision 3D measurements on the production line, samples are periodically taken for analysis in a lab. However, any defective products made while waiting for results must be discarded.
    To create a system that could operate in the vibration-prone environment of an industrial manufacturing plant, researchers headed by Georg Schitter from Technische Universität Wien in Austria combined a compact 2D fast steering mirror with a high precision 1D confocal chromatic sensor.
    “Robot-based inline inspection and measurement systems such as what we developed can enable 100% quality control in industrial production, replacing current sample-based methods,” said Ernst Csencsics, who co-led the research team with Daniel Wertjanz. “This creates a production process that is more efficient because it saves energy and resources.”
    As described in The Optical Society (OSA) journal Applied Optics, the new system is designed to be mounted on a tracking platform placed on a robotic arm for contactless 3D measurements of arbitrary shapes and surfaces. It weighs just 300 grams and measures 75 × 63 × 55 millimeters, about the size of an espresso cup.
    “Our system can measure 3D surface topographies with unprecedented combination of flexibility, precision, and speed,” said Wertjanz, who is pursuing a PhD on this research topic. “This creates less waste because manufacturing problems can be identified in real-time, and processes can be quickly adapted and optimized.”
    From lab to fab