More stories

  • ‘Flashes of Creation’ recounts the Big Bang theory’s origin story

    Flashes of Creation
    Paul Halpern
    Basic Books, $30

    The Big Bang wasn’t always a sure bet. For several decades in the 20th century, researchers wrestled with how to interpret cosmic origins, or whether there even was a beginning at all. At the forefront of that debate stood physicists George Gamow and Fred Hoyle: One advocated for an expanding universe that sprouted from a hot, dense state; the other for a cosmos that is eternal and unchanging. Both pioneered contemporary cosmology, laid the groundwork for our understanding of where atoms come from and brought science to the masses.

    In Flashes of Creation, physicist Paul Halpern recounts Gamow’s and Hoyle’s interwoven stories. The book bills itself as a “joint biography,” but that label undersells it. While Gamow and Hoyle are the central characters, the book is a meticulously researched history of the Big Bang as an idea: from theoretical predictions in the 1920s, to the discovery of its microwave afterglow in 1964, and beyond to the realization in the late 1990s that the expansion of the universe is accelerating.

    Although the development of cosmology was the work of far more than just two scientists, Halpern would be hard-pressed to pick two better mascots. George Gamow was an aficionado of puns and pranks and had a keen sense of how to explain science with charm and whimsy (SN: 8/28/18). The fiercely stubborn Fred Hoyle had a darker, more cynical wit, with an artistic side that showed through in science fiction novels and even the libretto of an opera. Both wrote popular science books — Gamow’s Mr Tompkins series, which explores modern physics through the titular character’s dreams, is a milestone of the genre — and took to the airwaves to broadcast the latest scientific thinking into people’s homes.


    “Gamow and Hoyle were adventurous loners who cared far more about cosmic mysteries than social conventions,” Halpern writes. “Each, in his own way, was a polymath, a rebel, and a master of science communication.”

    While the Big Bang is now entrenched in the modern zeitgeist, it wasn’t always so. The idea can be traced to Georges Lemaître, a physicist and priest who proposed in 1927 that the universe is expanding. A few years later, he suggested that perhaps the cosmos began with all of its matter in a single point — the “primeval atom,” he called it. In the 1940s, Gamow latched on to the idea as a way to explain how all the atomic elements came to be, forged in the “fireball” that would have filled the cosmos in its earliest moments. Hoyle balked at the notion of a moment of creation, convinced that the universe has always existed — and always will exist — in pretty much the same state we find it today. He even coined the term “Big Bang” as a put-down during a 1949 BBC radio broadcast. The elements, Hoyle argued, were forged in stars.

    As far as the elements go, both were right. “One wrote the beginning of the story of element creation,” Halpern writes, “and the other wrote the ending.” We now know that hydrogen and helium nuclei emerged in overwhelming abundance during the first few minutes following the Big Bang. Stars took care of the rest.

    Halpern treats Gamow and Hoyle with reverence and compassion. Re-created scenes provide insight into how both approached science and life. We learn how Gamow, ever the scientist, roped in physicist Niels Bohr to test ideas about why movie heroes always drew their gun faster than villains — a test that involved staging a mock attack with toy pistols. We sit in with Hoyle and colleagues while they discuss a horror film, Dead of Night, whose circular timeline inspired their ideas about an eternal universe.

    In the mid-20th century, two astronomers emerged as spokesmen for dueling ideas about the origin of the cosmos. George Gamow (left) was a passionate defender of the Big Bang theory, arguing that the universe evolved from a hot, dense state. Fred Hoyle (right) upheld the rival “steady state model,” insisting that the universe is eternal and unchanging. Credit, from left: AIP Emilio Segrè Visual Archives, George Gamow Collection; AIP Emilio Segrè Visual Archives, Clayton Collection

    And Halpern doesn’t shy away from darker moments, inviting readers to know these scientists as flawed human beings. Gamow’s devil-may-care attitude wore on his colleagues, and his excessive drinking took its toll. Hoyle, in his waning decades, embraced outlandish ideas, suggesting that epidemics come from space and that a dinosaur fossil had been tampered with to show an evolutionary link to birds. And he went to his grave in 2001 still railing against the Big Bang.

    Capturing the history of the Big Bang theory is no easy task, but Halpern pulls it off. The biggest mark against the book, in fact, may be its scope. In pulling in all the other characters and side plots that drove 20th century cosmology, Halpern sometimes loses sight of Gamow and Hoyle for long stretches. A bit more editing could have sharpened the book’s focus.

    But to anyone interested in how the idea of the Big Bang grew — or how any scientific paradigm changes — Flashes of Creation is a treat and a worthy tribute to two scientific mavericks.


  • Climate change made Europe’s flash floods in July more likely

    Climate change has increased the likelihood of heavy downpours in Western Europe such as the July rains that led to devastating flash floods, researchers affiliated with the World Weather Attribution network report August 23. Such extreme rains are 1.2 to 9 times more likely to happen — and those downpours are 3 to 19 percent heavier — as a result of human-caused climate change, the team found.

    The World Weather Attribution network conducts quick analyses of extreme events to assess the contribution of climate change (SN: 7/7/21). The new study focused on two regions where record-setting rains fell July 12–15 and triggered floods that killed more than 200 people.

    In a single day, an average of 93 millimeters of rain fell near Germany’s Ahr and Erft rivers; in just two days, 106 millimeters of rain fell in Belgium’s Meuse River region. With many river measurement stations destroyed, the researchers assessed the contribution of climate change to the intense rainfall itself, using climate simulations that compare conditions with and without human-caused warming.
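
    The logic of such an attribution study can be illustrated with a short calculation. The sketch below is a minimal, hypothetical illustration rather than the team’s actual code: the distributions and all numbers except the 93-millimeter observation are invented, and the point is simply to compare how often a fixed rainfall threshold is exceeded in a simulated climate with human influence versus one without.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical one-day rainfall maxima (mm) from two model ensembles:
        # a "factual" world with human-caused warming, a "counterfactual" one without.
        factual = rng.gumbel(loc=40.0, scale=12.0, size=10_000)
        counterfactual = rng.gumbel(loc=35.0, scale=10.0, size=10_000)

        threshold = 93.0  # observed one-day rainfall near the Ahr and Erft rivers (mm)

        # Empirical exceedance probabilities in each ensemble.
        p_factual = (factual >= threshold).mean()
        p_counter = (counterfactual >= threshold).mean()

        # Probability ratio: how much more likely the event is with warming.
        print("probability ratio:", p_factual / p_counter)

        # Intensity change at a fixed rarity (the 99.75th percentile, i.e. a
        # 1-in-400 event if each sample is an annual maximum).
        q = 99.75
        change = np.percentile(factual, q) / np.percentile(counterfactual, q) - 1
        print(f"intensity change: {100 * change:.1f}%")

    Reported results like “1.2 to 9 times more likely” and “3 to 19 percent heavier” are, in essence, ranges for these two quantities across models and methods.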

    That intense rainfall might occur once every 400 years under current climate conditions, but those odds are likely to increase as the world continues to warm, said coauthor Maarten van Aalst on August 23 at a news conference on the report. It’s “still a rare event, but a rare event we should prepare for,” said van Aalst, a climate and disaster risk researcher at the University of Twente in the Netherlands and the director of the Red Cross Red Crescent Climate Centre.

    That finding is consistent with data cited in the Intergovernmental Panel on Climate Change’s sixth assessment report, which notes that as global temperatures continue to rise, western and central Europe will see more intense rainfall events (SN: 8/9/21).

  • Mathematicians build an algorithm to ‘do the twist’

    Mathematicians at the Center for Advanced Mathematics for Energy Research Applications (CAMERA) at Lawrence Berkeley National Laboratory (Berkeley Lab) have developed an algorithm that deciphers the rotational dynamics of twisting particles in large complex systems. The algorithm extracts this information from the X-ray scattering patterns observed in highly sophisticated X-ray photon correlation spectroscopy (XPCS) experiments.
    These experiments — designed to study the properties of suspensions and solutions of colloids, macromolecules, and polymers — have been established as key scientific drivers for many of the ongoing coherent light source upgrades within the U.S. Department of Energy (DOE). The new mathematical methods, developed by the CAMERA team of Zixi Hu, Jeffrey Donatelli, and James Sethian, have the potential to reveal far more information about the function and properties of complex materials than was previously possible.
    Particles in a suspension undergo Brownian motion, jiggling around as they move (translate) and spin (rotate). The sizes of these random fluctuations depend on the shape and structure of the materials and contain information about dynamics, with applications across molecular biology, drug discovery, and materials science.
    XPCS works by focusing a coherent beam of X-rays to capture light scattered off of particles in suspension. A detector picks up the resulting speckle patterns, which contain tiny fluctuations in the signal that encode detailed information about the dynamics of the observed system. To capitalize on this capability, the upcoming coherent light source upgrades at Berkeley Lab’s Advanced Light Source (ALS), Argonne’s Advanced Photon Source (APS), and SLAC’s Linac Coherent Light Source are all planning some of the world’s most advanced XPCS experiments, taking advantage of these facilities’ unprecedented coherence and brightness.
    But once you collect the data from all these images, how do you get any useful information out of them? A workhorse technique to extract dynamical information from XPCS is to compute what’s known as the temporal autocorrelation, which measures how the pixels in the speckle patterns change after a certain passage of time. The autocorrelation function stitches the still images together, just as an old-time movie comes to life as closely related postcard images fly by.
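    As a rough illustration of that workhorse step, the temporal autocorrelation (often written g2(τ)) can be computed per pixel from a stack of detector frames and then averaged. The Python sketch below shows the standard calculation in its simplest form; it is not the CAMERA team’s algorithm, and the array shapes and synthetic data are invented for illustration.

        import numpy as np

        def g2(frames: np.ndarray) -> np.ndarray:
            """Temporal autocorrelation g2(tau), averaged over pixels.

            frames: array of shape (T, H, W) holding T speckle images.
            Returns g2 for lags tau = 1 .. T-1.
            """
            T = frames.shape[0]
            mean_sq = frames.mean(axis=0) ** 2                      # <I>^2 per pixel
            out = np.empty(T - 1)
            for tau in range(1, T):
                corr = (frames[:-tau] * frames[tau:]).mean(axis=0)  # <I(t) I(t+tau)>
                out[tau - 1] = (corr / mean_sq).mean()              # average over pixels
            return out

        # Synthetic example: 200 frames of 64x64 "speckle" intensities.
        frames = np.random.default_rng(1).poisson(5.0, size=(200, 64, 64)).astype(float)
        print(g2(frames)[:5])  # stays near 1.0: this noise is uncorrelated in time

    For genuinely dynamic speckle, g2(τ) decays with lag τ, and the decay rate encodes the diffusion; the new CAMERA work goes further by pulling rotational information out of such correlations.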
    Current algorithms have mainly been limited to extracting translational motions; think of a pogo stick jumping from spot to spot. However, no previous algorithms were capable of extracting “rotational diffusion” information about how structures spin and rotate — information that is critical to understanding the function and dynamical properties of a physical system. Getting at this hidden information is a major challenge.

  • Statistics say large pandemics are more likely than we thought

    The COVID-19 pandemic may be the deadliest viral outbreak the world has seen in more than a century. But statistically, such extreme events aren’t as rare as we may think, asserts a new analysis of novel disease outbreaks over the past 400 years.
    The study, appearing the week of Aug. 23 in the Proceedings of the National Academy of Sciences, used a newly assembled record of past outbreaks to estimate the intensity of those events and the yearly probability of them recurring.
    It found the probability of a pandemic with similar impact to COVID-19 is about 2% in any year, meaning that someone born in the year 2000 would have about a 38% chance of experiencing one by now. And that probability is only growing, which the authors say highlights the need to adjust perceptions of pandemic risks and expectations for preparedness.
    “The most important takeaway is that large pandemics like COVID-19 and the Spanish flu are relatively likely,” said William Pan, Ph.D., associate professor of global environmental health at Duke and one of the paper’s co-authors. Understanding that pandemics aren’t so rare should raise the priority of efforts to prevent and control them in the future, he said.
    The study, led by Marco Marani, Ph.D., of the University of Padua in Italy, used new statistical methods to measure the scale and frequency of disease outbreaks over the past four centuries for which there was no immediate medical intervention. Their analysis, which covered a murderer’s row of pathogens including plague, smallpox, cholera, typhus and novel influenza viruses, found considerable variability in the rate at which pandemics have occurred in the past. But they also identified patterns that allowed them to describe the probabilities of similar-scale events happening again.
    In the case of the deadliest pandemic in modern history — the Spanish flu, which killed more than 30 million people between 1918 and 1920 — the probability of a pandemic of similar magnitude occurring ranged from 0.3% to 1.9% per year over the time period studied. Put another way, those figures mean it is statistically likely that a pandemic of such extreme scale would occur within the next 400 years.
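    The arithmetic behind those headline probabilities is simple compounding: with a constant yearly probability p, the chance of at least one event in n years is 1 - (1 - p)^n. The quick sketch below is my own illustration of the figures quoted above, not code from the paper; note that the study also lets the yearly probability grow over time, which is how 2% per year can yield a roughly 38% chance rather than the flat-rate 35%.

        def chance_within(p_yearly: float, years: int) -> float:
            """Probability of at least one event in `years` at a constant yearly rate."""
            return 1.0 - (1.0 - p_yearly) ** years

        # COVID-scale pandemic, ~2% per year: someone born in 2000, by 2021.
        print(chance_within(0.02, 21))    # ~0.35 at a flat rate

        # Spanish-flu-scale pandemic, 0.3% to 1.9% per year, over 400 years.
        print(chance_within(0.003, 400))  # ~0.70
        print(chance_within(0.019, 400))  # ~1.00, hence "statistically likely"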

  • Layered graphene with a twist displays unique quantum confinement in 2-D

    Scientists studying two different configurations of bilayer graphene — the two-dimensional (2-D), atom-thin form of carbon — have detected electronic and optical interlayer resonances. In these resonant states, electrons bounce back and forth between the two atomic planes in the 2-D interface at the same frequency. By characterizing these states, they found that twisting one of the graphene layers by 30 degrees relative to the other, instead of stacking the layers directly on top of each other, shifts the resonance to a lower energy. From this result, just published in Physical Review Letters, they deduced that the distance between the two layers increased significantly in the twisted configuration, compared to the stacked one. When this distance changes, so do the interlayer interactions, influencing how electrons move in the bilayer system. An understanding of this electron motion could inform the design of future quantum technologies for more powerful computing and more secure communication.
    “Today’s computer chips are based on our knowledge of how electrons move in semiconductors, specifically silicon,” said first and co-corresponding author Zhongwei Dai, a postdoc in the Interface Science and Catalysis Group at the Center for Functional Nanomaterials (CFN) at the U.S. Department of Energy (DOE)’s Brookhaven National Laboratory. “But the physical properties of silicon are reaching a physical limit in terms of how small transistors can be made and how many can fit on a chip. If we can understand how electrons move at the small scale of a few nanometers in the reduced dimensions of 2-D materials, we may be able to unlock another way to utilize electrons for quantum information science.”
    At a few nanometers, or billionths of a meter, the size of a material system is comparable to that of the wavelength of electrons. When electrons are confined in a space with dimensions of their wavelength, the material’s electronic and optical properties change. These quantum confinement effects are the result of quantum mechanical wave-like motion rather than classical mechanical motion, in which electrons move through a material and are scattered by random defects.
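    For a sense of the scale argument, the textbook particle-in-a-box result (a standard formula, not a calculation from this study) gives the allowed energies of an electron confined to a one-dimensional well of width $L$ as $E_n = \frac{n^2 h^2}{8 m L^2}$ for $n = 1, 2, 3, \ldots$, where $h$ is Planck’s constant and $m$ is the electron mass. Because the level spacing scales as $1/L^2$, shrinking $L$ to a few nanometers turns an effectively continuous spectrum into well-separated, discrete levels, which is why confinement at this scale changes a material’s electronic and optical properties.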
    For this research, the team selected a simple material model — graphene — to investigate quantum confinement effects, applying two different probes: electrons and photons (particles of light). To probe both electronic and optical resonances, they used a special substrate onto which the graphene could be transferred. Co-corresponding author and CFN Interface Science and Catalysis Group scientist Jurek Sadowski had previously designed this substrate for the Quantum Material Press (QPress). The QPress is an automated tool under development in the CFN Materials Synthesis and Characterization Facility for the synthesis, processing, and characterization of layered 2-D materials. Conventionally, scientists exfoliate 2-D material “flakes” from 3-D parent crystals (e.g., graphene from graphite) on a silicon dioxide substrate several hundred nanometers thick. However, this substrate is insulating, and thus electron-based interrogation techniques don’t work. So, Sadowski and CFN scientist Chang-Yong Nam and Stony Brook University graduate student Ashwanth Subramanian deposited a conductive layer of titanium oxide only three nanometers thick on the silicon dioxide substrate.
    “This layer is transparent enough for optical characterization and determination of the thickness of exfoliated flakes and stacked monolayers while conductive enough for electron microscopy or synchrotron-based spectroscopy techniques,” explained Sadowski.
    The Charlie Johnson Group at the University of Pennsylvania — Rebecca W. Bushnell Professor of Physics and Astronomy Charlie Johnson, postdoc Qicheng Zhang, and former postdoc Zhaoli Gao (now an assistant professor at the Chinese University of Hong Kong) — grew the graphene on metal foils and transferred it onto the titanium oxide/silicon dioxide substrate. When graphene is grown in this way, all three domains (single layer, stacked, and twisted) are present.

  • Compact system designed for high-precision, robot-based surface measurements

    Researchers have developed a lightweight optical system for 3D inspection of surfaces with micron-scale precision. The new measurement tool could greatly enhance quality control inspection for high-tech products including semiconductor chips, solar panels and consumer electronics such as flat panel televisions.
    Because vibrations make it difficult to capture precision 3D measurements on the production line, samples are periodically taken for analysis in a lab. However, any defective products made while waiting for results must be discarded.
    To create a system that could operate in the vibration-prone environment of an industrial manufacturing plant, researchers headed by Georg Schitter from Technische Universität Wien in Austria combined a compact 2D fast steering mirror with a high precision 1D confocal chromatic sensor.
    “Robot-based inline inspection and measurement systems such as what we developed can enable 100% quality control in industrial production, replacing current sample-based methods,” said Ernst Csencsics, who co-led the research team with Daniel Wertjanz. “This creates a production process that is more efficient because it saves energy and resources.”
    As described in The Optical Society (OSA) journal Applied Optics, the new system is designed to be mounted on a tracking platform placed on a robotic arm for contactless 3D measurements of arbitrary shapes and surfaces. It weighs just 300 grams and measures 75 x 63 x 55 millimeters, about the size of an espresso cup.
    “Our system can measure 3D surface topographies with unprecedented combination of flexibility, precision, and speed,” said Wertjanz, who is pursuing a PhD on this research topic. “This creates less waste because manufacturing problems can be identified in real-time, and processes can be quickly adapted and optimized.”

  • Mathematical model predicts best way to build muscle

    Researchers have developed a mathematical model that can predict the optimum exercise regime for building muscle.
    The researchers, from the University of Cambridge, used methods of theoretical biophysics to construct the model, which can predict how much a given amount of exertion will cause a muscle to grow and how long that growth will take. The model could form the basis of a software product, where users could optimise their exercise regimes by entering a few details of their individual physiology.
    The model is based on earlier work by the same team, which found that a component of muscle called titin is responsible for generating the chemical signals which affect muscle growth.
    The results, reported in the Biophysical Journal, suggest that there is an optimal weight at which to do resistance training for each person and each muscle growth target. Muscles can only be near their maximal load for a very short time, and it is the load integrated over time which activates the cell signalling pathway that leads to synthesis of new muscle proteins. But below a certain value, the load is insufficient to cause much signalling, and exercise time would have to increase exponentially to compensate. The value of this critical load is likely to depend on the particular physiology of the individual.
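    That description maps onto a simple threshold-and-integrate picture. The sketch below is a toy illustration of the reasoning in the paragraph above, not the model from the paper: the rate function, the critical-load value, and the assumption that sustainable exercise time shrinks linearly with load are all invented for illustration.

        import numpy as np

        def growth_signal(load: float, minutes: float, critical_load: float = 0.4) -> float:
            """Toy model: signalling accumulates only while load exceeds a critical value.

            load and critical_load are fractions of maximal load.
            """
            if load <= critical_load:
                return 0.0  # below the critical load, little to no signalling
            sustainable = minutes * (1.0 - load)  # near-maximal loads sustained only briefly
            return (load - critical_load) * sustainable  # load integrated over usable time

        # Sweep loads to find this toy model's optimum for a 30-minute session.
        loads = np.linspace(0.0, 1.0, 101)
        signals = [growth_signal(l, 30.0) for l in loads]
        print(f"optimal load fraction: {loads[int(np.argmax(signals))]:.2f}")  # 0.70 here

    Even this toy version reproduces the qualitative claim: there is an interior optimum that sits between the critical load and the maximal load.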
    We all know that exercise builds muscle. Or do we? “Surprisingly, not very much is known about why or how exercise builds muscles: there’s a lot of anecdotal knowledge and acquired wisdom, but very little in the way of hard or proven data,” said Professor Eugene Terentjev from Cambridge’s Cavendish Laboratory, one of the paper’s authors.
    When exercising, the higher the load, the more repetitions, or the greater the frequency, the greater the increase in muscle size. However, even for the whole muscle, why this happens and by how much isn’t known. The answers to both questions get even trickier as the focus goes down to a single muscle or its individual fibres.

  • Smallest biosupercapacitor provides energy for biomedical applications

    The miniaturization of microelectronic sensor technology, microelectronic robots or intravascular implants is progressing rapidly. However, it also poses major challenges for research. One of the biggest is the development of tiny but efficient energy storage devices that enable the operation of autonomously working microsystems — for example, in ever smaller regions of the human body. In addition, these energy storage devices must be biocompatible if they are to be used in the body at all. Now there is a prototype that combines these essential properties. The breakthrough was achieved by an international research team led by Prof. Dr. Oliver G. Schmidt, head of the Professorship of Materials Systems for Nanoelectronics at Chemnitz University of Technology, initiator of the Center for Materials, Architectures and Integration of Nanomembranes (MAIN) at Chemnitz University of Technology and director at the Leibniz Institute for Solid State and Materials Research (IFW) Dresden. The Leibniz Institute of Polymer Research Dresden (IPF) was also involved in the study as a cooperation partner.
    In the current issue of Nature Communications, the researchers report on the smallest microsupercapacitor to date, which already functions in (artificial) blood vessels and can be used as an energy source for a tiny sensor system to measure pH.
    This storage system opens up possibilities for intravascular implants and microrobotic systems for next-generation biomedicine that could operate in hard-to-reach small spaces deep inside the human body. For example, real-time detection of blood pH can help predict early tumor growth. “It is extremely encouraging to see how new, extremely flexible, and adaptive microelectronics is making it into the miniaturized world of biological systems,” says research group leader Prof. Dr. Oliver G. Schmidt, who is extremely pleased with this research success.
    The fabrication of the samples and the investigation of the biosupercapacitor were largely carried out at the Research Center MAIN at Chemnitz University of Technology.
    “The architecture of our nano-bio supercapacitors offers the first potential solution to one of the biggest challenges — tiny integrated energy storage devices that enable the self-sufficient operation of multifunctional microsystems,” says Dr. Vineeth Kumar, researcher in Prof. Schmidt’s team and a research associate at the MAIN research center.
    Smaller than a speck of dust — voltage comparable to a AAA battery
    Ever smaller energy storage devices in the submillimeter range — so-called “nano-supercapacitors” (nBSC) — for even smaller microelectronic components are not just a major technical challenge. As a rule, these supercapacitors do not use biocompatible materials but, for example, corrosive electrolytes, and they discharge quickly in the event of defects or contamination. Both aspects make them unsuitable for biomedical applications in the body. So-called biosupercapacitors (BSCs) offer a solution. They have two outstanding properties: they are fully biocompatible, which means that they can be used in body fluids such as blood and employed in further medical studies.