More stories

  • Virtual patient ‘surrogates’ can personalize cancer treatments

    Scientists have developed mathematical models that act as patient ‘surrogates’ for evaluating potential prostate cancer treatments.
    The research, published today in eLife, could ultimately help clinicians choose the most effective drug combination before starting treatment, potentially improving patients’ responses and avoiding drug resistance.
    Researchers used an approach called Boolean modelling, which is already used to describe the dynamics of complex cell-signalling processes. But existing models have been generic and have not accounted for the differences between individual patients’ diseases or how they respond to treatment.
    “The dream has always been to use more and more complex models and data until we can have digital twins, or virtual humans or surrogates — a simulation that helps select the proper clinical treatment for a given patient with high degrees of specificity or sensitivity,” explains Arnau Montagud, who was a researcher at Institut Curie, Paris, France, at the time the study was carried out, and is now at the Barcelona Supercomputing Center (BSC), Spain. “We wanted to know if our method of tailoring Boolean models of cell signalling was accurate enough to discriminate between different patients, and whether the models could be used as testbeds to rank personalised drug treatments.”
    To begin, the team used data from The Cancer Genome Atlas (TCGA) and other databases to create a network of all relevant pathways involved in prostate cell signalling. Then they converted this into a generic Boolean model in which every node in the network is assigned one of two values — 0 (inactivated or absent) or 1 (activated or present). Data from 488 prostate cancer patients from TCGA were used to create 488 patient-specific Boolean models. For example, where a patient’s tumour carried an inactivating mutation in a specific gene, the corresponding node in the network was set to 0.
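    To make this concrete, below is a minimal sketch of how such personalisation can work, using a hypothetical four-node network with made-up gene names; the study’s actual models are far larger and were built and simulated with dedicated Boolean-modelling software.

```python
# Minimal sketch of patient-specific Boolean modelling, using a
# hypothetical four-node network (the published models are far larger
# and were built and simulated with dedicated Boolean-modelling tools).

# Generic model: each node's next state is a Boolean function of the
# current state vector; input nodes simply map to themselves.
RULES = {
    "GrowthSignal":  lambda s: s["GrowthSignal"],               # input node
    "AKT":           lambda s: s["GrowthSignal"] and not s["PTEN"],
    "PTEN":          lambda s: s["PTEN"],                       # tumour suppressor
    "Proliferation": lambda s: s["AKT"],                        # phenotype readout
}

INITIAL = {"GrowthSignal": True, "AKT": False, "PTEN": True,
           "Proliferation": False}

def personalise(state, mutations):
    """Clamp nodes to a patient's mutation profile: a loss-of-function
    mutation fixes the corresponding node at 0 (absent/inactive)."""
    return {**state, **{gene: False for gene in mutations}}

def simulate(mutations, steps=10):
    state = personalise(INITIAL, mutations)
    for _ in range(steps):                      # synchronous updates
        state = {n: f(state) for n, f in RULES.items()}
        state = personalise(state, mutations)   # mutations stay clamped
    return state

# A patient whose tumour carries a PTEN loss-of-function mutation ends
# up in an attractor with the proliferation phenotype switched on:
print(simulate(mutations={"PTEN"}))
```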
    Having built these models, the team looked in each patient model for genes that, when inhibited, would block growth or encourage death of cancer cells. They narrowed these genes down to a list of targets of existing drugs, and ran simulations to predict what would happen if the drugs were combined. This allowed them to compare the effects of individual drugs on each patient, and to propose certain drugs that would work for specific patients or for groups of patients. Inactivation of some of the genes had a greater effect in some patients compared with others, highlighting opportunities for personalised drug treatments. The simulations also spotted patterns linked to the grade of patients’ tumours as measured by the Gleason score, suggesting it might be possible to tailor drug treatments to prostate cancer patients according to their score in the future.
    Testing whether these treatment predictions hold true in patients would require a clinical trial, so the team instead built eight different personalised prostate cancer cell line models from publicly available data. As with the patient models, they looked for commonly occurring mutations in the cell lines that influenced cancer cell growth or death. This resulted in the identification of 17 proteins that could be targeted with drugs.
    Next, to investigate whether drugs targeting these proteins would have the anticipated effects, they mimicked different drug dosages in the model by turning each node down from 100% active to 0% active and looking at the effects on growth, death and spread of the cancer cells. When they carried out the same experiments in real cell lines, the results confirmed that blocking the identified targets had the differential effects on cell growth and survival that the models predicted. Moreover, the models could predict synergistic effects of treatments that act on different nodes in the network, which could help to identify promising drug combinations for future investigation.
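    This kind of in-silico dose-response scan can be illustrated on the same toy network as before: the drug inhibits its target node with some probability at each update step, and the proliferation phenotype is averaged over many stochastic runs. This is again a simplification with hypothetical names, not the study’s actual simulation method.

```python
# In-silico dose-response scan on the toy network above: the drug
# inhibits its target node with probability `dose` at each update, and
# the proliferation phenotype is averaged over many stochastic runs.
import random

RULES = {
    "GrowthSignal":  lambda s: s["GrowthSignal"],
    "AKT":           lambda s: s["GrowthSignal"] and not s["PTEN"],
    "PTEN":          lambda s: s["PTEN"],
    "Proliferation": lambda s: s["AKT"],
}
INITIAL = {"GrowthSignal": True, "AKT": False, "PTEN": True,
           "Proliferation": False}

def run_once(mutations, target, dose, steps=20):
    state = {**INITIAL, **{g: False for g in mutations}}
    for _ in range(steps):
        state = {n: f(state) for n, f in RULES.items()}
        state.update({g: False for g in mutations})   # mutations stay fixed
        if random.random() < dose:                    # drug engages its target
            state[target] = False
    return state["Proliferation"]

def response(mutations, target, dose, runs=500):
    return sum(run_once(mutations, target, dose) for _ in range(runs)) / runs

# Simulated proliferation for a PTEN-mutant patient at rising doses of
# a hypothetical AKT inhibitor (higher dose -> less proliferation):
for dose in (0.0, 0.5, 1.0):
    print(f"dose {dose:.0%}: proliferation ~ {response({'PTEN'}, 'AKT', dose):.2f}")
```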
    “Our personalised models suggest single and combined drug treatments for individual prostate cancer patients,” concludes Laurence Calzone, a researcher at Institut Curie, and a co-senior author of the study alongside Julio Saez-Rodriguez from Heidelberg University, Germany. “These advances are incremental steps towards having digital twins that will help clinicians before they go to the patient’s bedside, allowing them to capture patient individualities and test and rank different drug treatments.”
    Story Source:
    Materials provided by eLife. Note: Content may be edited for style and length.

  • Algorithm could shorten quality testing and research in many industries by months

    A machine-learning algorithm developed at Sandia National Laboratories could provide auto manufacturing, aerospace and other industries a faster and more cost-efficient way to test bulk materials.
    The technique was published recently in the scientific journal Materials Science and Engineering: A.
    Production stoppages are costly. So, manufacturers screen materials like sheet metal for formability before using them to make sure the material will not crack when it is stamped, stretched and strained as it’s formed into different parts. Companies often use commercial simulation software calibrated to the results of various mechanical tests, said Sandia scientist David Montes de Oca Zapiain, the lead author on the paper. However, these tests can take months to complete.
    And while certain high-fidelity computer simulations can assess formability in only a few weeks, companies need access to a supercomputer and specialized expertise to run them, Montes de Oca Zapiain said.
    Sandia has shown that machine learning can dramatically cut the time and resources needed to calibrate commercial software, because the algorithm does not need information from mechanical tests, said Montes de Oca Zapiain. Nor does the method need a supercomputer. Additionally, it opens a new path to faster research and development.
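    The article does not describe the algorithm itself, but the general surrogate idea can be sketched as follows: fit a regression model to previously computed high-fidelity results so that calibration parameters for a new material can be predicted without mechanical tests. Every feature, target and model choice below is hypothetical, not Sandia’s method.

```python
# Generic surrogate-model sketch (illustrative only, not Sandia's
# algorithm): learn a mapping from microstructure descriptors to a
# material-model calibration parameter from precomputed data, then
# predict the parameter for a new material without mechanical tests.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in training data; in practice this would come from a library
# of high-fidelity simulations and/or archived mechanical tests.
X = rng.random((200, 3))   # e.g. grain size, texture index, phase fraction
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.2 * X[:, 2]
     + rng.normal(0.0, 0.01, 200))

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

new_material = np.array([[0.4, 0.7, 0.2]])
print("predicted calibration parameter:", model.predict(new_material)[0])
```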
    “You could efficiently use this algorithm to potentially find lighter materials with minimal resources without sacrificing safety or accuracy,” Montes de Oca Zapiain said.

  • Liquid electronics: Wrapping droplets in graphene for printed microchips and wearable sensors

    New research from physicists at the University of Sussex will ‘significantly advance’ the new technology area of liquid electronics, enhancing the functionality and sustainability of potential applications in printed electronics, wearable health monitors and even batteries.
    In their research paper published in ACS Nano, the Sussex scientists built on their previous work wrapping emulsion droplets with graphene and other 2D materials, reducing the coatings to atomically thin nanosheet layers. In doing so they created electrically conducting liquid emulsions with the lowest graphene loading ever reported for such networks: just 0.001 vol%.
    This means that the resulting liquid electronic technologies, whether strain sensors to monitor physical performance and health, electronic devices printed from emulsion droplets, or potentially more efficient and longer-lasting electric vehicle batteries, will be both cheaper and more sustainable, because they will require less graphene or other 2D nanosheet material to coat the droplets.
    Another significant development is that the scientists can now make these electronic droplet networks using any liquids, whereas previous research focused on conventional oils and water. Having discovered how to control which liquid droplets are wrapped in graphene, they can now design the emulsions specifically for the desired application.
    Dr Sean Ogilvie, Research Fellow in Material Physics in the University of Sussex School of Mathematical and Physical Sciences and lead author of the paper, explains the science behind the development: “The potential of 2D materials, such as graphene, is in their electronic properties and their processability; we developed a process to harness the surface area of our nanosheet dispersions to stabilise emulsion droplets with ultra-thin coatings.
    “The tuneability of these emulsions allows us to wrap 2D materials around any liquid droplets to exploit their electronic properties. This includes emulsion inks, in which we’ve discovered that droplets can be deposited without the coffee-ring effect that hinders printing of conventional functional inks, potentially enabling single-droplet films for printed transistors and other electronic devices.
    “Another exciting development for our research group is that we can now also design and control our emulsions towards specific applications such as wrapping soft polymers such as silicone for wearable strain sensors that exhibit increased sensitivity at low graphene loading, and we are also investigating emulsion assembly of battery electrode materials to enhance the robustness of these energy storage devices.”
    Professor of Experimental Physics at the University of Sussex, Alan Dalton, who was first inspired by the making of a salad dressing to explore the potential of adding graphene to liquid emulsions, explains why this development is exciting: “In bringing the graphene coatings of the liquid droplets down to atomically-thin layers and in opening wide the potential for real-world applications by being able to do so with any liquid material, this research development will significantly advance the emerging and scientifically exciting field of liquid electronics.”
    Story Source:
    Materials provided by University of Sussex. Original written by Alice Ingall. Note: Content may be edited for style and length.

  • An innovative technology for 6G communication networks

    Carrying data streams in the terahertz (THz) spectral region could meet the ever-growing demand for unprecedented data transfer rates, i.e. terabits per second (Tb/s), since it offers a higher available bandwidth. However, it is extremely challenging to develop physical components that go beyond the most elementary processing functionalities for constructing future communication systems at THz frequencies. Postdoctoral researcher Junliang Dong and an international team of scientists, under the supervision of Professor Roberto Morandotti at the Institut national de la recherche scientifique (INRS), have developed a new waveguide to overcome those limitations. Their work, a first in the field, was published in the journal Nature Communications.
    Engraving the waveguide
    In the paper, the scientists proposed a novel approach for realizing broadband THz signal processing in metal-wire waveguides by engineering the wire surfaces. These waveguides act like pipes for electromagnetic waves, confining their propagation.
    “We demonstrate that, by engraving judiciously designed grooves with multiscale structures directly on the metal-wires, we can change which frequencies are reflected or transmitted (i.e., a THz Bragg grating) without adding any material to the waveguide.”
    -Junliang Dong
    This concept, exploited here for the first time in the THz regime, allows unprecedented flexibility in manipulating THz pulses propagating within the waveguides, which in turn enables more complex signal-processing functionalities. For example, one could imagine “holographic messaging” being to 6G what SMS and voice mail were to 1G and 2G.
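    For a sense of scale, the groove period needed for a Bragg resonance at a given THz frequency can be estimated from the standard Bragg condition f_B = c / (2 · n_eff · Λ). The numbers below are illustrative, assuming an effective index near 1 for an air-clad metal-wire guide; they are not taken from the paper.

```python
# Back-of-envelope groove period for a THz Bragg grating on metal wires.
# Standard Bragg condition: f_B = c / (2 * n_eff * period).
c = 299_792_458.0      # speed of light, m/s
n_eff = 1.0            # assumed effective index (air-clad metal wires)
f_bragg = 0.3e12       # illustrative target resonance: 0.3 THz

period = c / (2 * n_eff * f_bragg)
print(f"groove period ~ {period * 1e6:.0f} um")   # ~500 um
```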
    Besides transporting the data streams, innovative THz waveguides can provide versatile signal-processing functionalities. The distinct advantages of metal-wire waveguides, including structural simplicity, tolerance to bending, as well as similarity to cables for connections, make them very promising. However, the tight confinement limits the possible ways to manipulate the propagating THz waves.
    A universal approach
    As a proof of concept, the researchers introduce a completely new waveguide geometry: the four-wire waveguide (FWWG), which can sustain two independent, orthogonally polarized waves (vertical and horizontal) that do not interfere with each other. This is the first demonstration of polarization-division multiplexing in THz waveguides; in other words, two channels of information can be transmitted over a single transmission path. Most importantly, by engraving Bragg gratings onto the wires, the two channels can be manipulated independently.
    “Our device represents the first THz waveguide architecture, with a new metal-based design, which supports polarization-division multiplexing. In particular, the capability of realizing such a complex signal-processing functionality, i.e., the independent manipulation of multiplexed THz signals, has never been achieved elsewhere,” concludes Professor Morandotti.
    This universal approach to broadband THz signal processing, in combination with novel waveguide designs, paves the way to next-generation networks. It will allow for fascinating application scenarios, such as multi-channel transmission of uncompressed ultra-high-definition video, ultra-high-speed short-distance data transfer between devices, and chip-to-chip communications.
    Story Source:
    Materials provided by Institut national de la recherche scientifique – INRS. Original written by Audrey-Maude Vezina. Note: Content may be edited for style and length.

  • Researchers create molecule that can pave the way for mini-transistors

    Researchers at Lund University in Sweden have succeeded in building a logic-gate function, similar to that of a transistor, into a single, simple hydrocarbon molecule. The discovery could make electric components on a molecular scale possible in the future. The results are published in Nature Communications.
    Manufacturing very small components is an important challenge in both research and development. One example is transistors — the smaller they are, the faster and more energy efficient our computers become. But is there a limit to how small logic gates can become? And is it possible to create electric machines on a molecular scale? Yes, perhaps, is the answer from a chemistry research team at Lund University.
    “We have developed a simple hydrocarbon molecule that changes its form, and at the same time goes from insulating to conductive, when exposed to electric potential. The successful formula was to design a so-called anti-aromatic ring in a molecule so that it becomes more robust and can both receive and relay electrons,” says Daniel Strand, chemistry researcher at Lund University.
    Many organic molecules contain aromatic benzene rings, i.e. flat rings made up of six carbon atoms; graphene is a simple example. However, such molecules do not change properties or shape when subjected to an electric potential. The research group therefore chose to look at hydrocarbons made up of rings with eight carbon atoms, which are anti-aromatic and bent into a tub shape. If two electrons are injected into such a molecule, it flattens and goes from insulating to conducting, a function similar to a transistor switching from 0 to 1.
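    The electron counting behind this follows textbook Hückel rules (not taken from the paper): rings with 4n+2 π electrons are aromatic, rings with 4n π electrons are anti-aromatic, so injecting two electrons into an eight-π-electron ring moves it into the aromatic class, consistent with the flattening described above.

```python
# Textbook Hückel electron counting (illustrative; not from the paper):
# planar rings with 4n+2 pi electrons are aromatic, 4n anti-aromatic.
def huckel_class(pi_electrons):
    if pi_electrons % 4 == 2:
        return "aromatic (4n+2)"
    if pi_electrons % 4 == 0:
        return "anti-aromatic (4n)"
    return "neither"

print(huckel_class(8))       # eight-carbon ring: anti-aromatic, tub-shaped
print(huckel_class(8 + 2))   # after injecting two electrons: aromatic
```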
    “A unique aspect of the molecules is that they are so simple. They consist only of carbon and hydrogen atoms, which makes them easier to produce synthetically,” says Daniel Strand.
    The discovery means researchers can now think about how to develop both electrical switches and new mechanical systems at the single-molecule level using anti-aromatic hydrocarbons.
    “Molecules that change form in response to electric potential lead to exciting possibilities. One can imagine energy-efficient computer architectures and in the future perhaps electric machines on a molecular scale,” concludes Daniel Strand.
    Story Source:
    Materials provided by Lund University. Note: Content may be edited for style and length.

  • ‘From Data to Quanta’ defends Niels Bohr’s view of quantum mechanics

    From Data to Quanta
    Slobodan Perović
    Univ. of Chicago, $45

    Ever since Max Planck introduced the idea of the quantum to the world, physicists have argued about whether reality is more like sand or water.

    Planck’s famous 1900 discovery that energy is grainy — at least when absorbed or emitted — moved him to label those smallest grains of energy “quanta.” But he believed that once emitted, as in light from a fire, those grains merged into smooth, continuous waves, just as water seems a smooth liquid to human perception. Einstein, on the other hand, insisted that light quanta traveled through space on their own, behaving like particles later called photons.

    By the mid-1920s, both the wave and particle views of light had gained experimental support, with the additional paradox that electrons — supposedly particles — could sometimes disguise themselves as waves.

    Into this arena of controversy stepped the famed Danish physicist Niels Bohr, the pioneer of exploring the architecture of the atom. Bohr announced that resolving the wave-particle paradox required a new view of reality, in which both notions shared a role in explaining experimental phenomena. In experiments designed to observe waves, waves you would find, whether electrons or light. In experiments designed to detect particles, you’d see particles. But in no experiment could you demonstrate both at once. Bohr called this viewpoint the principle of complementarity, and it successfully guided the pursuit of quantum mechanics during the following decades.

    More recently, as philosopher Slobodan Perović recounts in From Data to Quanta, Bohr’s success has been questioned by some physicists and philosophers and even popular science writers (SN: 1/19/19, p. 26). Complementarity has been derided as an incoherent application of vague philosophy expressed in incomprehensible language. But as Perović’s investigations reveal, such criticisms are rarely rooted in any deep understanding of Bohr’s methods. Rather than Bohr’s philosophy contaminating his science, Perović argues, it is his opponents’ philosophical prejudices that have led to misstatements, misunderstandings and misrepresentations of Bohr’s physics. And Bohr can’t be understood by attempting to understand his philosophy, Perović asserts, because philosophy did not guide him — experiments did.

    In fact, Bohr’s drive to understand the wave-particle paradox was fueled by a deep devotion to comprehending the experimental evidence in its totality. It was the same approach the younger Bohr took when developing his model of the atom in 1913 (SN: 7/13/13, p. 20). Various experiments suggested properties of the atom that seemed irreconcilable. But Bohr forged those experimental clues into a “master hypothesis” that produced a thoroughly novel understanding of the atom and its structure.


    Perović describes how Bohr’s process began with lower-level hypotheses stemming from features directly given by experiment. Spectral lines — different specific colors of light emitted by atoms — led to basic hypotheses that some vibratory process, of an atom itself or its constituents, produced electromagnetic radiation exhibiting precise patterns. Intermediate hypotheses about the structure of the atom did not explain such lines, though. And then Ernest Rutherford, on the basis of experiments in his lab, inferred that an atom was mostly empty space. It contained a dense, tiny central nucleus encompassing most of the mass, while lightweight electrons orbited at a distance. But that hypothesis didn’t mesh with the precise patterns of spectral lines. And such an atom would be unstable, persisting for less than a millisecond. From all these disparate experiment-based hypotheses, Bohr applied Planck’s quantum idea to construct a master hypothesis. He reconciled the spectral lines and Rutherford’s nuclear atom with a new atomic model, in which electrons maintained stability of the atom but jumped from one orbit to another, emitting specific patterns of spectral lines in the process.
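    The quantitative payoff of that master hypothesis can be checked with nothing more than the Rydberg formula, which Bohr’s model reproduced: electron jumps between orbits emit photons at exactly hydrogen’s observed spectral lines. The snippet below is standard textbook physics, not material from Perović’s book.

```python
# Wavelengths of hydrogen's spectral lines from the Rydberg formula,
# which Bohr's 1913 model reproduced (textbook physics, illustrative).
R_H = 1.0967758e7  # Rydberg constant for hydrogen, 1/m

def wavelength_nm(n_upper, n_lower):
    inv_wavelength = R_H * (1 / n_lower**2 - 1 / n_upper**2)
    return 1e9 / inv_wavelength

# Balmer series (jumps down to n=2): hydrogen's visible lines.
for n in (3, 4, 5):
    print(f"n={n} -> n=2: {wavelength_nm(n, 2):.1f} nm")
# ~656, 486 and 434 nm: the observed H-alpha, H-beta and H-gamma lines.
```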

    As Perović demonstrates, Bohr followed a similar course in arriving at complementarity. While numerous experiments showed that light was a wave, by the early 1920s other experiments established that X-rays, highly energetic light, collided with electrons just as though both were particles (momentum and energy were conserved in the collisions just as the particle view required). Bohr’s master hypothesis, complementarity, seemed the only way forward.

    Throughout the book, Perović relates how Bohr has been misinterpreted, his views misleadingly conflated with those of others (like John von Neumann and Werner Heisenberg), and his philosophy incorrectly portrayed as antirealist — suggesting that only observations brought reality into existence. Bohr never said any such thing, and in fact cautioned against using language so loosely.

    Perović’s account offers a thorough survey of other historical investigations into Bohr’s work and draws liberally from Bohr’s own writings. It’s a nuanced and insightful presentation of the interplay of experiment and theory in the scientific process. This book is not easy reading, though. It’s not the place to seek clear explanations of quantum physics and Bohr’s interpretation of it. Perović opts for scholarly thoroughness and careful reasoning with a propensity for long sentences. But then again, Bohr’s writings were no breeze, either. In fact, a major complaint against Bohr has been expressed by authors who say his writings are very difficult to understand. It’s unfortunate that so many seem to think that because they can’t understand Bohr, he must have been wrong. Perović’s book provides a useful antidote to that attitude.


  • Researchers use supercomputers for largest-ever turbulence simulation of its kind

    From designing new airplane wings to better understanding how fuel sprays ignite in a combustion engine, researchers have long been interested in better understanding how chaotic, turbulent motions impact fluid flows under a variety of conditions. Despite decades of focused research on the topic, physicists still consider a fundamental understanding of turbulence statistics to be among the last major unsolved challenges in physics.
    Due to its complexity, researchers have come to rely on a combination of experiments, semi-empirical turbulence models, and computer simulation to advance the field. Supercomputers have played an essential role in advancing researchers’ understanding of turbulence physics, but even today’s most computationally expensive approaches have limitations.
    Recently, researchers at the Technical University of Darmstadt (TU Darmstadt), led by Prof. Dr. Martin Oberlack, and the Universitat Politècnica de València, headed by Prof. Dr. Sergio Hoyas, started using a new approach for understanding turbulence, and with the help of supercomputing resources at the Leibniz Supercomputing Centre (LRZ), the team was able to run the largest turbulence simulation of its kind. Specifically, the team generated turbulence statistics through this large simulation of the Navier-Stokes equations, which provided the critical database for underpinning a new theory of turbulence.
    “Turbulence is statistical, because of the random behaviour we observe,” Oberlack said. “We believe Navier-Stokes equations do a very good job of describing it, and with it we are able to study the entire range of scales down to the smallest scales, but that is also the problem — all of these scales play a role in turbulent motion, so we have to resolve all of it in simulations. The biggest problem is resolving the smallest turbulent scales, which decrease inversely with Reynolds number (a number that indicates how turbulent a fluid is moving, based on a ratio of velocity, length scale, and viscosity). For airplanes like the Airbus A380, the Reynolds number is so large and thus the smallest turbulent scales are so small that they cannot be represented even on SuperMUC-NG.”
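    The scale gap Oberlack describes can be quantified with the classic Kolmogorov estimates from turbulence theory (standard results, not taken from the paper): the smallest eddies shrink roughly as η ~ L·Re^(−3/4), so the number of grid points a direct simulation needs grows roughly as Re^(9/4). The numbers below are illustrative.

```python
# Classic Kolmogorov estimates (standard turbulence theory, not from
# the TU Darmstadt paper): smallest eddy size eta ~ L * Re**(-3/4);
# grid points needed for a 3D direct simulation: N ~ Re**(9/4).
def kolmogorov(L, Re):
    eta = L * Re ** -0.75     # smallest eddy size, metres
    points = Re ** 2.25       # rough 3D grid-point count
    return eta, points

for name, L, Re in [("lab channel flow", 0.1, 1e4),
                    ("airliner wing",    5.0, 1e8)]:
    eta, points = kolmogorov(L, Re)
    print(f"{name}: eta ~ {eta:.1e} m, grid points ~ {points:.1e}")
# The wing case needs ~1e18 grid points, far beyond any supercomputer.
```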
    Statistical averages show promise for closing an unending equation loop
    In 2009, while visiting the University of Cambridge, Oberlack had an epiphany: while thinking about turbulence, he thought about symmetry theory, a concept that forms the fundamental basis of all areas of physics research. In essence, a symmetry in mathematics is a transformation that leaves the form of an equation unchanged, so the equations yield the same result under different arrangements or operating conditions.

  • Time crystals leave the lab

    We have all seen crystals, whether a simple grain of salt or sugar, or an elaborate and beautiful amethyst. These crystals are made of atoms or molecules repeating in a symmetrical three-dimensional pattern called a lattice, in which atoms occupy specific points in space. By forming a periodic lattice, carbon atoms in a diamond, for example, break the symmetry of the space they sit in. Physicists call this “breaking symmetry.”
    Scientists have recently discovered that a similar effect can be witnessed in time. Symmetry breaking, as the name suggests, can arise only where some sort of symmetry exists. In the time domain, a cyclically changing force or energy source naturally produces a temporal pattern.
    Breaking of the symmetry occurs when a system driven by such a force faces a déjà vu moment, but not with the same period as that of the force. ‘Time crystals’ have in the past decade been pursued as a new phase of matter, and more recently observed under elaborate experimental conditions in isolated systems. These experiments require extremely low temperatures or other rigorous conditions to minimize undesired external influences, called noise.
    In order for scientists to learn more about time crystals and employ their potential in technology, they need to find ways to produce time crystalline states and keep them stable outside the laboratory.
    Cutting-edge research led by UC Riverside and published this week in Nature Communications has now observed time crystals in a system that is not isolated from its ambient environment. This major achievement brings scientists one step closer to developing time crystals for use in real-world applications.
    “When your experimental system has energy exchange with its surroundings, dissipation and noise work hand-in-hand to destroy the temporal order,” said lead author Hossein Taheri, an assistant research professor of electrical and computer engineering in UC Riverside’s Marlan and Rosemary Bourns College of Engineering. “In our photonic platform, the system strikes a balance between gain and loss to create and preserve time crystals.”
    The all-optical time crystal is realized using a disk-shaped magnesium fluoride glass resonator one millimeter in diameter. When bombarded by two laser beams, the researchers observed subharmonic spikes, or frequency tones between the two laser beams, that indicated breaking of temporal symmetry and creation of time crystals.
    The UCR-led team utilized a technique called self-injection locking of the two lasers to the resonator to achieve robustness against environmental effects. Signatures of the temporally repeating state of this system can readily be measured in the frequency domain. The proposed platform therefore simplifies the study of this new phase of matter.
    Without the need for a low temperature, the system can be moved outside a complex lab for field applications. One such application could be highly accurate measurements of time. Because frequency and time are mathematical inverses of each other, accuracy in measuring frequency enables accurate time measurement.
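    As a rough illustration of that frequency-time link (with made-up numbers, not measurements from the study), a clock whose fractional frequency error is Δf/f accumulates a timing error of roughly (Δf/f) × elapsed time.

```python
# Toy illustration of the frequency-time link: a clock with fractional
# frequency error df/f drifts by about (df/f) * elapsed_time.
# The numbers are illustrative, not measured values from the study.
fractional_error = 1e-12      # df/f of a hypothetical oscillator
seconds_per_day = 86_400

drift = fractional_error * seconds_per_day
print(f"timing drift: ~{drift * 1e9:.0f} ns per day")   # ~86 ns/day
```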
    “We hope that this photonic system can be utilized in compact and lightweight radiofrequency sources with superior stability as well as in precision timekeeping,” said Taheri.
    The open-access Nature Communications paper is titled “All-optical dissipative discrete time crystals.” Taheri was joined in the research by Andrey B. Matsko at NASA’s Jet Propulsion Laboratory, Lute Maleki at OEwaves Inc. in Pasadena, Calif., and Krzysztof Sacha at Jagiellonian University in Poland.
    Story Source:
    Materials provided by University of California – Riverside. Original written by Holly Ober. Note: Content may be edited for style and length.