More stories

  • In Einstein's footsteps and beyond

    In physics, as in life, it’s always good to look at things from different perspectives.
    Since the beginning of quantum physics, how light moves and interacts with the matter around it has mostly been described and understood mathematically through the lens of its energy. In 1900, Max Planck used energy to explain how light is emitted by heated objects, a seminal study in the foundations of quantum mechanics. In 1905, Albert Einstein used energy when he introduced the concept of the photon.
    But light has another, equally important quality known as momentum. And, as it turns out, when you take momentum away, light starts behaving in really interesting ways.
    An international team of physicists led by Michaël Lobet, a research associate at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), and Eric Mazur, the Balkanski Professor of Physics and Applied Physics at SEAS, is re-examining the foundations of quantum physics from the perspective of momentum and exploring what happens when the momentum of light is reduced to zero.
    The research is published in Light: Science & Applications.
    Any object with mass and velocity has momentum, from atoms to bullets to asteroids, and momentum can be transferred from one object to another. A gun recoils when a bullet is fired because the momentum of the bullet is transferred to the gun. At the microscopic scale, an atom recoils when it emits light because of the momentum acquired by the photon. Atomic recoil, first described by Einstein when he developed the quantum theory of radiation, is a fundamental phenomenon that governs light emission.
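    To get a feel for the size of this recoil, momentum conservation is all you need: a photon of wavelength λ carries momentum p = h/λ, and the emitting atom picks up an equal and opposite kick. The sketch below is a back-of-envelope illustration only; the rubidium-87 D2 line is an arbitrary choice of example, not a number from the paper.

    ```python
    # Back-of-envelope estimate of atomic recoil (illustrative, not from the paper).
    # A photon of wavelength lam carries momentum p = h / lam; conservation of
    # momentum gives the emitting atom an equal and opposite recoil velocity.

    h = 6.626e-34      # Planck constant, J*s
    lam = 780e-9       # 780 nm, the rubidium-87 D2 line (arbitrary example)
    m_atom = 1.44e-25  # mass of a rubidium-87 atom, kg

    p_photon = h / lam            # photon momentum, ~8.5e-28 kg*m/s
    v_recoil = p_photon / m_atom  # recoil velocity of the atom, m/s

    print(f"photon momentum: {p_photon:.2e} kg*m/s")
    print(f"recoil velocity: {v_recoil * 1e3:.1f} mm/s")  # ~5.9 mm/s
    ```

    A kick of a few millimeters per second is tiny, but it is measurable, which is what makes atomic recoil such a useful probe of light emission.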

  • New device achieves the greatest height of any known jumper, engineered or biological

    A mechanical jumper developed by UC Santa Barbara engineering professor Elliot Hawkes and collaborators is capable of reaching roughly 100 feet (30 meters), the greatest height of any jumper to date, engineered or biological. The feat represents a fresh approach to the design of jumping devices and advances the understanding of jumping as a form of locomotion.
    “The motivation came from a scientific question,” said Hawkes, who as a roboticist seeks to understand the many possible methods for a machine to be able to navigate its environment. “We wanted to understand what the limits were on engineered jumpers.” While there are centuries’ worth of studies on biological jumpers (that would be us in the animal kingdom), and decades’ worth of research on mostly bio-inspired mechanical jumpers, he said, the two lines of inquiry have been kept somewhat separate.
    “There hadn’t really been a study that compares and contrasts the two and how their limits are different — whether engineered jumpers are really limited to the same laws that biological jumpers are,” Hawkes said.
    Their research is published in the journal Nature.
    Big Spring, Tiny Motor
    Biological systems have long served as the first and best models for locomotion, and that has been especially true for jumping, defined by the researchers as a “movement created by forces applied to the ground by the jumper, while maintaining a constant mass.” Many engineered jumpers have focused on duplicating the designs provided by evolution, and to great effect.
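    To see the energy budget such a jump implies, ignore air drag and equate the stored energy to the gravitational potential energy at the apex, E = m·g·h. The numbers below are purely illustrative; the 30-gram mass is an assumption made for the example, not a reported specification of the device.

    ```python
    # Hypothetical energy budget for a 30 m jump (illustrative numbers only).
    # Ignoring air drag, stored energy E converts to potential energy m*g*h,
    # so the required energy is E = m * g * h and launch speed is sqrt(2*g*h).

    g = 9.81   # gravitational acceleration, m/s^2
    m = 0.030  # assumed jumper mass of 30 g (an assumption, not a spec)
    h = 30.0   # reported jump height, m

    E_required = m * g * h         # minimum stored energy, J (~8.8 J)
    v_launch = (2 * g * h) ** 0.5  # launch speed needed, m/s (~24 m/s)

    print(f"stored energy: {E_required:.1f} J")
    print(f"launch speed:  {v_launch:.1f} m/s")
    ```

    In practice air drag pushes both numbers higher, so a real device must store somewhat more energy than this idealized estimate suggests.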

  • Plug-and-play organ-on-a-chip can be customized to the patient

    Engineered tissues have become a critical component for modeling diseases and testing the efficacy and safety of drugs in a human context. A major challenge for researchers has been how to model body functions and systemic diseases with multiple engineered tissues that can physiologically communicate — just like they do in the body. However, it is essential to provide each engineered tissue with its own environment so that the specific tissue phenotypes can be maintained for weeks to months, as required for biological and biomedical studies. Making the challenge even more complex is the necessity of linking the tissue modules together to facilitate their physiological communication, which is required for modeling conditions that involve more than one organ system, without sacrificing the individual engineered tissue environments.
    Novel plug-and-play multi-organ chip, customized to the patient
    Up to now, no one has been able to meet both conditions. Today, a team of researchers from Columbia Engineering and Columbia University Irving Medical Center reports that it has developed a model of human physiology in the form of a multi-organ chip consisting of engineered human heart, bone, liver, and skin tissues that are linked by vascular flow with circulating immune cells, allowing recapitulation of interdependent organ functions. The researchers have essentially created a plug-and-play multi-organ chip, the size of a microscope slide, that can be customized to the patient. Because disease progression and responses to treatment vary greatly from one person to another, such a chip will eventually enable personalized optimization of therapy for each patient. The study is the cover story of the April 2022 issue of Nature Biomedical Engineering.
    “This is a huge achievement for us — we’ve spent ten years running hundreds of experiments, exploring innumerable great ideas, and building many prototypes, and now at last we’ve developed this platform that successfully captures the biology of organ interactions in the body,” said project leader Gordana Vunjak-Novakovic, University Professor and the Mikati Foundation Professor of Biomedical Engineering, Medical Sciences, and Dental Medicine.
    Inspired by the human body
    Taking inspiration from how the human body works, the team built a human tissue-chip system in which matured heart, liver, bone, and skin tissue modules are linked by recirculating vascular flow, allowing the interdependent organs to communicate just as they do in the human body. The researchers chose these tissues because they have distinctly different embryonic origins and structural and functional properties, and because they are adversely affected by cancer treatment drugs, presenting a rigorous test of the proposed approach.

  • Physicists embark on a hunt for a long-sought quantum glow

    For “Star Wars” fans, the sight of stars streaking past the cockpit of the Millennium Falcon as it jumps to hyperspace is a canonical image. But what would a pilot actually see if she could accelerate in an instant through the vacuum of space? According to a prediction known as the Unruh effect, she would more likely see a warm glow.
    Since the 1970s when it was first proposed, the Unruh effect has eluded detection, mainly because the probability of seeing the effect is infinitesimally small, requiring either enormous accelerations or vast amounts of observation time. But researchers at MIT and the University of Waterloo believe they have found a way to significantly increase the probability of observing the Unruh effect, which they detail in a study appearing in Physical Review Letters.
    Rather than waiting for the effect to occur spontaneously, as others have attempted in the past, the team proposes stimulating the phenomenon in a very particular way that enhances the Unruh effect while suppressing other competing effects. The researchers liken their idea to throwing an invisibility cloak over other conventional phenomena, which should then reveal the much less obvious Unruh effect.
    If it can be realized in a practical experiment, this new stimulated approach, with an added layer of invisibility (or “acceleration-induced transparency,” as described in the paper), could vastly increase the probability of observing the Unruh effect. Instead of waiting longer than the age of the universe for an accelerating particle to produce a warm glow as the Unruh effect predicts, the team’s approach would shave that wait time down to a few hours.
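    The scale of the challenge follows from the standard Unruh temperature formula, T = ħa / (2πck_B): the glow's temperature is proportional to the observer's proper acceleration. The snippet below evaluates the textbook formula (it says nothing about the team's new stimulated scheme) to show how faint the effect is even at extreme accelerations.

    ```python
    # Unruh temperature T = hbar * a / (2 * pi * c * k_B) for a uniformly
    # accelerating observer (textbook formula, not this paper's new scheme).

    import math

    hbar = 1.055e-34  # reduced Planck constant, J*s
    c = 2.998e8       # speed of light, m/s
    k_B = 1.381e-23   # Boltzmann constant, J/K

    def unruh_temperature(a):
        """Temperature (K) of the Unruh glow at proper acceleration a (m/s^2)."""
        return hbar * a / (2 * math.pi * c * k_B)

    # Even an acceleration of 1e20 m/s^2, about 1e19 times Earth's gravity,
    # produces a glow of only ~0.4 K.
    print(f"T = {unruh_temperature(1e20):.2f} K")
    ```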
    “Now at least we know there is a chance in our lifetimes where we might actually see this effect,” says study co-author Vivishek Sudhir, assistant professor of mechanical engineering at MIT, who is designing an experiment to catch the effect based on the group’s theory. “It’s a hard experiment, and there’s no guarantee that we’d be able to do it, but this idea is our nearest hope.”
    The study’s co-authors also include Barbara Šoda and Achim Kempf of the University of Waterloo. More

  • AI may detect earliest signs of pancreatic cancer

    An artificial intelligence (AI) tool developed by Cedars-Sinai investigators accurately predicted who would develop pancreatic cancer based on what their CT scan images looked like years before they were diagnosed with the disease. The findings, which may help prevent death through early detection of one of the most challenging cancers to treat, are published in the journal Cancer Biomarkers.
    “This AI tool was able to capture and quantify very subtle, early signs of pancreatic ductal adenocarcinoma in CT scans years before occurrence of the disease. These are signs that the human eye would never be able to discern,” said Debiao Li, PhD, director of the Biomedical Imaging Research Institute, professor of Biomedical Sciences and Imaging at Cedars-Sinai, and senior and corresponding author of the study. Li is also the Karl Storz Chair in Minimally Invasive Surgery in Honor of George Berci, MD.
    Pancreatic ductal adenocarcinoma is not only the most common type of pancreatic cancer, but it’s also the most deadly. Less than 10% of people diagnosed with the disease live more than five years after being diagnosed or starting treatment. But recent studies have reported that finding the cancer early can increase survival rates by as much as 50%. There currently is no easy way to find pancreatic cancer early, however.
    People with this type of cancer may experience symptoms such as general abdominal pain or unexplained weight loss, but these symptoms are often ignored or overlooked as signs of the cancer since they are common in many health conditions.
    “There are no unique symptoms that can provide an early diagnosis for pancreatic ductal adenocarcinoma,” said Stephen J. Pandol, MD, director of Basic and Translational Pancreas Research and program director of the Gastroenterology Fellowship Program at Cedars-Sinai, and another author of the study. “This AI tool may eventually be used to detect early disease in people undergoing CT scans for abdominal pain or other issues.”
    The investigators reviewed electronic medical records to identify people who were diagnosed with the cancer within the last 15 years and who underwent CT scans six months to three years prior to their diagnosis. These CT images were considered normal at the time they were taken. The team identified 36 patients who met these criteria, the majority of whom had CT scans done in the ER because of abdominal pain.
    The AI tool was trained to analyze these pre-diagnostic CT images from people with pancreatic cancer and compare them with CT images from 36 people who didn’t develop the cancer. The investigators reported that the model was 86% accurate in identifying people who would eventually be found to have pancreatic cancer and those who would not develop the cancer.
    The AI model picked up on variations on the surface of the pancreas between people with cancer and healthy controls. These textural differences could be the result of molecular changes that occur during the development of pancreatic cancer.
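    The article does not disclose the model's internals, but texture-based CT classification of this kind is commonly built by computing statistical texture features over a segmented region and training a standard classifier on them. The sketch below is a hypothetical stand-in for that general approach, using random placeholder data; none of it is the study's actual pipeline.

    ```python
    # Hypothetical texture-feature classifier in the spirit of the article
    # (not the Cedars-Sinai model). Assumes 2D pancreas regions of interest
    # (ROIs) have already been segmented from CT scans and scaled to uint8.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def texture_features(roi):
        """Gray-level co-occurrence texture statistics for one CT ROI."""
        glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        props = ("contrast", "homogeneity", "energy", "correlation")
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    # Placeholder data standing in for 36 pre-diagnostic scans + 36 controls.
    rng = np.random.default_rng(0)
    rois = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(72)]
    labels = np.array([1] * 36 + [0] * 36)  # 1 = later developed cancer

    X = np.array([texture_features(r) for r in rois])
    scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    ```

    On real features rather than random noise, this cross-validated accuracy is the figure to compare against the 86% the study reports.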
    “Our hope is this tool could catch the cancer early enough to make it possible for more people to have their tumor completely removed through surgery,” said Touseef Ahmad Qureshi, PhD, a scientist at Cedars-Sinai and first author of the study.
    The investigators are currently collecting data from thousands of patients at healthcare sites throughout the U.S. to continue to study the AI tool’s prediction capability.
    Funding: The study was funded by the Board of Counselors of Cedars-Sinai Medical Center, the Cedars-Sinai Samuel Oschin Comprehensive Cancer Institute and the National Institutes of Health under award number R01 CA260955.
    Source: Cedars-Sinai Medical Center.

  • COVID-19 lockdown measures affect air pollution from cities differently

    The COVID-19 pandemic and the public response to it created large shifts in how people travel. In some areas, however, these travel restrictions appear to have had little effect on air pollution, and some cities saw worse air quality than ever.
    In the journal Chaos, published by AIP Publishing, researchers in China created a network model drawn from the traffic index and air quality index of 21 cities across six regions of their country to quantify how traffic emissions from one city affect another. They saw the COVID-19 lockdowns as a rare opportunity to use real-world data to better explain the relationship between traffic and air pollution.
    “Air pollution is a typical ‘commons governance’ issue,” said author Jingfang Fan. “The impact of the pandemic has led cities to implement different traffic restriction policies, one after another, which naturally forms a controlled experiment to reveal their relationship.”
    To address these questions, they turned to a weighted climate network framework, modeling each city as a node and using pre-pandemic data from 2019 alongside data from 2020. They added a two-layer network that incorporated different regions, lockdown stages, and outbreak levels.
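    One simplified way to picture such a network: make each city a node and weight a directed edge from city A to city B by the strongest lagged correlation between A's traffic index and B's air quality index. The toy sketch below follows that idea with random placeholder series and a handful of example cities; the paper's two-layer weighted framework is considerably more elaborate.

    ```python
    # Toy weighted city network (illustrative only; not the paper's model).
    # Edge A -> B holds the peak lagged correlation between A's traffic index
    # and B's air quality index, so in-strength measures outside influence.

    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(0)
    cities = ["Beijing", "Tianjin", "Shijiazhuang", "Chengdu", "Chongqing", "Wuhan"]
    days = 180
    traffic = {c: rng.standard_normal(days) for c in cities}  # placeholder data
    aqi = {c: rng.standard_normal(days) for c in cities}      # placeholder data

    def peak_lagged_corr(x, y, max_lag=7):
        """Max |Pearson correlation| of x against y over lags of 0..max_lag days."""
        n = len(x)
        return max(abs(np.corrcoef(x[:n - lag], y[lag:])[0, 1])
                   for lag in range(max_lag + 1))

    G = nx.DiGraph()
    for src in cities:
        for dst in cities:
            if src != dst:
                G.add_edge(src, dst, weight=peak_lagged_corr(traffic[src], aqi[dst]))

    # In-strength: how strongly a city's air quality is driven by others' traffic.
    in_strength = dict(G.in_degree(weight="weight"))
    print(max(in_strength, key=in_strength.get))
    ```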
    Surrounding traffic conditions influenced air quality in Beijing-Tianjin-Hebei, the Chengdu-Chongqing Economic Circle, and central China after the outbreak. Pollution tended to peak in cities as they made initial progress in containing the virus.
    During this time, pollution in Beijing-Tianjin-Hebei and central China lessened over time. Beijing-Tianjin-Hebei, however, saw another spike as control measures for outbound traffic from Wuhan and Hubei were lifted.
    “Air pollution in big cities, such as Beijing and Shanghai, is more affected by other cities,” said author Saini Yang. “This is contrary to what we generally think, that air pollution in big cities is mainly caused by their own conditions, including traffic congestion.”
    Author Weiping Wang hopes the team’s work inspires other interdisciplinary teams to find new ways of approaching problems in environmental science. The researchers will look to improve their model with a higher degree of detail for traffic emissions.
    “Our discovery is that in order to reduce air pollution, it is not only necessary to improve our own urban traffic and increase green travel; we also need the joint efforts of surrounding cities,” said author Na Ying. “Everyone is important in the governance of commons.”
    Source: American Institute of Physics.

  • Existing infrastructure will be unable to support future demand for high-speed internet

    Researchers have shown that the UK’s existing copper network cables can support faster internet speeds, but only to a limit. They say additional investment is urgently needed if the government is serious about its commitment to making high-speed internet available to all.
    The researchers, from the University of Cambridge and BT, have established the maximum speed at which data can be transmitted through existing copper cables. This limit would allow for faster internet than the speeds currently achievable using standard infrastructure; however, it will not be able to support high-speed internet in the longer term.
    The team found that the ‘twisted pair’ copper cables that reach every house and business in the UK are physically limited in their ability to support higher frequencies, which in turn support higher data rates.
    While full-fibre internet is currently available to around one in four households, it is expected to take at least two decades before it reaches every home in the UK. In the meantime, however, existing infrastructure can be improved to temporarily support high-speed internet.
    The results, reported in the journal Nature Communications, both establish a physical limit on the UK’s ubiquitous copper cables, and emphasise the importance of immediate investment in future technologies.
    The Cambridge-led team used a combination of computer modelling and experiments to determine whether it was possible to get higher speeds out of existing copper infrastructure, and found that the cables can carry a maximum frequency of about 5 GHz, well above the currently used spectrum, which lies below 1 GHz. Above 5 GHz, however, the copper cables start to behave like antennas.
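    The reason a frequency ceiling translates into a speed ceiling is the Shannon limit: achievable capacity grows with usable bandwidth as C = B·log2(1 + SNR). The quick comparison below uses an assumed signal-to-noise ratio purely for illustration; the study's actual channel modelling is far more detailed.

    ```python
    # Shannon capacity C = B * log2(1 + SNR) for two bandwidths (illustrative).
    # The 30 dB SNR is an assumed figure, not a measurement from the study.

    import math

    def shannon_capacity_gbps(bandwidth_hz, snr_db):
        """Upper bound on error-free data rate, in Gb/s."""
        snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
        return bandwidth_hz * math.log2(1 + snr) / 1e9

    for band_ghz in (1, 5):  # today's sub-1 GHz spectrum vs. the ~5 GHz limit
        rate = shannon_capacity_gbps(band_ghz * 1e9, snr_db=30)
        print(f"{band_ghz} GHz of bandwidth: up to ~{rate:.0f} Gb/s")
    ```

    Moving from 1 GHz to the 5 GHz physical limit scales the ceiling roughly fivefold, which squares with the authors' framing of copper as a temporary measure rather than a long-term answer.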

  • Electronics can grow on trees thanks to nanocellulose paper semiconductors

    Semiconducting nanomaterials with 3D network structures have high surface areas and lots of pores that make them excellent for applications involving adsorbing, separating, and sensing. However, simultaneously controlling the electrical properties and creating useful micro- and macro-scale structures, while achieving excellent functionality and end-use versatility, remains challenging. Now, Osaka University researchers, in collaboration with The University of Tokyo, Kyushu University, and Okayama University, have developed a nanocellulose paper semiconductor that provides both nano-micro-macro trans-scale designability of the 3D structures and wide tunability of the electrical properties. Their findings are published in ACS Nano.
    Cellulose is a natural, easy-to-source material derived from wood. Cellulose nanofibers (nanocellulose) can be made into sheets of flexible nanocellulose paper (nanopaper) with dimensions like those of standard A4 paper. Nanopaper does not conduct an electric current; however, heating can introduce conducting properties. Unfortunately, this exposure to heat can also disrupt the nanostructure.
    The researchers have therefore devised a treatment process that allows them to heat the nanopaper without damaging the structures of the paper from the nanoscale up to the macroscale.
    “An important property for the nanopaper semiconductor is tunability because this allows devices to be designed for specific applications,” explains study author Hirotaka Koga. “We applied an iodine treatment that was very effective for protecting the nanostructure of the nanopaper. Combining this with spatially controlled drying meant that the pyrolysis treatment did not substantially alter the designed structures and the selected temperature could be used to control the electrical properties.”
    The researchers used origami (paper folding) and kirigami (paper cutting) techniques to provide playful examples of the flexibility of the nanopaper at the macrolevel. A bird and box were folded, shapes including an apple and snowflake were punched out, and more intricate structures were produced by laser cutting. This demonstrated the level of detail possible, as well as the lack of damage caused by the heat treatment.
    Examples of successful applications showed nanopaper semiconductor sensors incorporated into wearable devices to detect exhaled moisture breaking through facemasks and moisture on the skin. The nanopaper semiconductor was also used as an electrode in a glucose biofuel cell and the energy generated lit a small bulb.
    “The structure maintenance and tunability that we have been able to show is very encouraging for the translation of nanomaterials into practical devices,” says Associate Professor Koga. “We believe that our approach will underpin the next steps in sustainable electronics made entirely from plant materials.”
    Source: Osaka University.