More stories

  • Discovery of a mechanism for making superconductors more resistant to magnetic fields

    Superconductivity is known to be easily destroyed by strong magnetic fields. NIMS, Osaka University and Hokkaido University have jointly discovered that a superconductor with atomic-scale thickness can retain its superconductivity even when a strong magnetic field is applied to it. The team has also identified a new mechanism behind this phenomenon. These results may facilitate the development of superconducting materials resistant to magnetic fields and topological superconductors composed of superconducting and magnetic materials.
    Superconductivity has been used in various technologies, such as magnetic resonance imaging (MRI) and highly sensitive magnetic sensors. Topological superconductors, a special type of superconductor, have been attracting great attention in recent years. They are able to retain quantum information for a long time and can be used in combination with magnetic materials to form qubits that may enable quantum computers to perform very complex calculations. However, superconductivity is easily destroyed by strong magnetic fields or magnetic materials in close proximity. It is therefore desirable to develop a topological superconducting material resistant to magnetic fields.
    The research team recently fabricated crystalline films of indium, a common superconducting material, with atomic-scale thickness. The team then discovered a new mechanism that prevents the superconductivity of these films from being destroyed by a strong magnetic field. When a magnetic field is applied to a superconducting material, it interacts with the electron spins, changing the material's electronic energy and destroying its superconductivity. However, when a superconducting material is thinned to a two-dimensional atomic layer, the spin and the momentum of the electrons in the layer are coupled, causing the electron spins to rotate frequently. This offsets the changes in electronic energy induced by the magnetic field and thus preserves superconductivity. This mechanism can enhance the critical magnetic field — the maximum magnetic field strength above which superconductivity disappears — to 16-20 tesla, approximately triple the generally accepted theoretical value. The mechanism is expected to have a wide range of applications, as it was observed in an ordinary superconducting material and requires neither special crystalline structures nor strong electronic correlations.
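    The "generally accepted theoretical value" mentioned above is presumably the Pauli (Clogston-Chandrasekhar) paramagnetic limit for spin-singlet superconductors. The back-of-the-envelope estimate below is a textbook illustration, not a calculation from the paper, and it assumes the films' critical temperature is comparable to that of bulk indium (roughly 3.4 K).
    ```latex
    % Textbook Pauli-limit estimate (illustrative; not from the paper).
    % Superconductivity is destroyed when the Zeeman energy gained by polarizing
    % the electron spins matches the superconducting condensation energy:
    \[
      \tfrac{1}{2}\,\chi_n B_P^{2} \;=\; \tfrac{1}{2}\,N(0)\,\Delta_0^{2},
      \qquad \chi_n = 2\mu_B^{2} N(0)
      \quad\Longrightarrow\quad
      \mu_B B_P = \frac{\Delta_0}{\sqrt{2}} .
    \]
    % With the BCS gap $\Delta_0 \approx 1.76\,k_B T_c$ this gives
    \[
      B_P\,[\mathrm{T}] \;\approx\; 1.8 \times T_c\,[\mathrm{K}],
    \]
    % i.e. on the order of 6 T for a critical temperature near 3.4 K, so critical
    % fields of 16-20 T exceed this estimate by roughly a factor of three.
    % Spin-momentum coupling in the atomic layer suppresses the Zeeman depairing,
    % which is what allows the limit to be exceeded.
    ```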
    Based on these results, we plan to develop superconducting thin films capable of resisting even stronger magnetic fields. We also intend to create a hybrid device composed of superconducting and magnetic materials that is needed for the development of topological superconductors: a vital component in next-generation quantum computers.
    Story Source:
    Materials provided by National Institute for Materials Science, Japan. Note: Content may be edited for style and length.

  • New statistical method eases data reproducibility crisis

    A reproducibility crisis is ongoing in scientific research, where many studies may be difficult or impossible to replicate and thereby validate, especially when the study involves a very large sample size. For example, to evaluate the validity of a high-throughput genetic study’s findings, scientists must be able to replicate the study and achieve the same results. Now researchers at Penn State and the University of Minnesota have developed a statistical tool that can accurately estimate the replicability of a study, thus eliminating the need to duplicate the work and effectively mitigating the reproducibility crisis.
    The team used its new method, described in a paper published today (March 30) in Nature Communications, to confirm the findings of a 2019 study on the genetic factors that contribute to smoking and drinking addiction, but noted that it can also be applied to other genome-wide association studies — studies that investigate the genetic underpinnings of diseases.
    “While we applied the method to study smoking and drinking addiction-related outcomes, it could benefit other similar large-scale consortia studies, including current studies on the host genetic contribution to COVID-19 symptoms,” said Dajiang Liu, associate professor of public health sciences and biochemistry and molecular biology, Penn State.
    According to Liu, to detect patterns in genome-wide association studies it is important to obtain data from a large number of individuals. Scientists often acquire these data by combining many existing similarly designed studies, which is what Liu and his colleagues did for the 2019 smoking and drinking addiction study that ultimately comprised 1.2 million individuals.
    “We worked really hard to collect all of the patient samples that we could manage,” said Liu, noting that the data came from biobanks, epidemiology studies and direct-to-consumer genetic testing companies, such as 23andMe. However, he added, since the team used all of the available studies in its analysis, there were none left over to use as comparisons for validation. “Our statistical method allows researchers to assess the replicability of genetic association signals without a replication dataset,” he said. “It helps to maximize the power of genetic studies, as no samples need to be reserved for replication; instead, all samples can be used for discoveries.”
    The team’s method, which they call MAMBA (Meta-Analysis Model-Based Assessment of replicability), evaluates the strength and consistency of the associations between atypical bits of DNA, called single nucleotide polymorphisms (SNPs), and disease traits such as addiction. Specifically, MAMBA calculates the probability that, if an experiment were repeated with a different set of individuals, the relationships between the SNPs and those individuals’ traits would be the same as, or similar to, those in the first experiment.
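    The published MAMBA software is not reproduced here, but the flavor of the calculation can be illustrated with a deliberately simplified two-group mixture: with prior probability pi a SNP carries a real effect shared across the contributing studies, otherwise its true effect is zero, and the posterior probability of the first case serves as a replicability score. The function name, priors and example numbers below are assumptions for illustration; MAMBA's actual model also accounts for outlier studies and is fit jointly across all SNPs.
    ```python
    # Simplified replicability posterior for a single SNP (illustrative only;
    # not the MAMBA implementation).
    import numpy as np
    from scipy.stats import multivariate_normal, norm

    def replicability_posterior(beta, se, pi=0.01, sigma0=0.05):
        """Posterior probability that per-study effect estimates reflect a real,
        shared effect rather than noise around zero (two-component mixture)."""
        beta, se = np.asarray(beta, float), np.asarray(se, float)
        n = len(beta)
        # Null component: independent noise around zero.
        log_l0 = norm.logpdf(beta, loc=0.0, scale=se).sum()
        # Replicable component: a common true effect mu ~ N(0, sigma0^2) shared by
        # all studies, so estimates are jointly Gaussian with cov D + sigma0^2 * 11^T.
        cov1 = np.diag(se**2) + sigma0**2 * np.ones((n, n))
        log_l1 = multivariate_normal.logpdf(beta, mean=np.zeros(n), cov=cov1)
        # Combine on the log scale for numerical stability.
        log_num = np.log(pi) + log_l1
        log_den = np.logaddexp(log_num, np.log(1 - pi) + log_l0)
        return float(np.exp(log_num - log_den))

    # Consistent signal across five studies vs. inconsistent noise.
    print(replicability_posterior([0.04, 0.05, 0.03, 0.06, 0.04],
                                  [0.01, 0.012, 0.011, 0.015, 0.01]))
    print(replicability_posterior([0.04, -0.01, 0.00, 0.01, -0.02],
                                  [0.01, 0.012, 0.011, 0.015, 0.01]))
    ```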

  • Topological protection of entangled two-photon light in photonic topological insulators

    In a joint effort, researchers from the Humboldt-Universität (Berlin), the Max Born Institute (Berlin) and the University of Central Florida (USA) have revealed the necessary conditions for the robust transport of entangled states of two-photon light in photonic topological insulators, paving the way towards noise-resistant transport of quantum information. The results have appeared in Nature Communications.
    Originally discovered in condensed matter systems, topological insulators are two-dimensional materials that support scattering-free (uni-directional) transport along their edges, even in the presence of defects and disorder. In essence, topological insulators are finite lattice systems where, given a suitable termination of the underlying infinite lattice, edge states are formed that lie in a well-defined energy gap associated with the bulk states, i.e. these edge states are energetically separated from the bulk states.
    Importantly, single-particle edge states in such systems are topologically protected from scattering: they cannot scatter into the bulk due to their energy lying in the gap, and they cannot scatter backwards because backward propagating edge states are either absent or not coupled to the forward propagating edge states.
    The feasibility of engineering complex Hamiltonians using integrated photonic lattices, combined with the availability of entangled photons, raises the intriguing possibility of employing topologically protected entangled states in optical quantum computing and information processing (Science 362, 568 (2018); Optica 6, 955 (2019)).
    Achieving this goal, however, is highly nontrivial as topological protection does not straightforwardly extend to multi-particle (back-)scattering. At first, this fact appears to be counterintuitive because, individually, each particle is protected by topology whilst, jointly, entangled (correlated) particles become highly susceptible to perturbations of the ideal lattice. The underlying physical principle behind this apparent ‘discrepancy’ is that, quantum-mechanically, identical particles are described by states that satisfy an exchange symmetry principle.
    In their work, the researchers make several fundamental advances towards understanding and controlling topological protection in the context of multi-particle states. First, they identify physical mechanisms which induce a vulnerability of entangled states in topological photonic lattices and present clear guidelines for maximizing entanglement without sacrificing topological protection. Second, they establish and demonstrate a threshold-like behavior of entanglement vulnerability and identify conditions for robust protection of highly entangled two-photon states. To be precise, they explore the impact of disorder on a range of two-photon states that extend from the fully correlated to the fully anti-correlated limits, thereby also covering a completely separable state. For their analysis, they consider two topological lattices, one periodic and one aperiodic. In the periodic case they consider the Haldane model, and for the aperiodic case they study a square lattice whose single-particle dynamics corresponds to the quantum Hall effect.
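    The Haldane model mentioned above is the canonical two-band lattice model hosting topologically protected edge states. As a single-particle illustration of that setting (not the authors' code, and with assumed parameter values and one common sign convention), the sketch below builds the Haldane Bloch Hamiltonian, estimates the bulk band gap that energetically separates edge states from bulk states, and checks the standard condition for the topologically nontrivial phase.
    ```python
    # Haldane model on the honeycomb lattice: bulk gap and phase check
    # (illustrative sketch; conventions and parameters are assumptions).
    import numpy as np

    t1, t2, M, phi = 1.0, 0.1, 0.0, np.pi / 2   # NN hopping, NNN hopping, mass, NNN phase

    # Nearest- and next-nearest-neighbour vectors of the honeycomb lattice (bond length 1).
    delta = np.array([[0.5, np.sqrt(3) / 2], [0.5, -np.sqrt(3) / 2], [-1.0, 0.0]])
    nu = np.array([delta[1] - delta[2], delta[2] - delta[0], delta[0] - delta[1]])

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    def haldane_hamiltonian(k):
        """Two-band Bloch Hamiltonian H(k) = d0*I + d.sigma (one common convention)."""
        kd, kn = delta @ k, nu @ k
        d0 = 2 * t2 * np.cos(phi) * np.cos(kn).sum()
        dx = t1 * np.cos(kd).sum()
        dy = t1 * np.sin(kd).sum()
        dz = M - 2 * t2 * np.sin(phi) * np.sin(kn).sum()
        return d0 * np.eye(2) + dx * sx + dy * sy + dz * sz

    # Minimum direct gap over a k grid covering the Brillouin zone.
    ks = np.linspace(-2 * np.pi, 2 * np.pi, 101)
    gap = min(np.diff(np.linalg.eigvalsh(haldane_hamiltonian(np.array([kx, ky]))))[0]
              for kx in ks for ky in ks)

    # Standard phase boundary: the gap closes at |M| = 3*sqrt(3)*|t2*sin(phi)|.
    topological = abs(M) < 3 * np.sqrt(3) * abs(t2 * np.sin(phi))
    print(f"minimum direct gap ~ {gap:.3f} (units of t1); topological phase: {topological}")
    ```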
    The results offer a clear roadmap for generating robust wave packets tailored to the particular disorder at hand. Specifically, they establish limits on the stability of entangled states up to relatively high degrees of entanglement that offer practical guidelines for generating useful entangled states in topological photonic systems. Further, these findings demonstrate that in order to maximize entanglement without sacrificing topological protection, the joint spectral correlation map of two-photon states must fit inside a well-defined topological window of protection.

  • Unique AI method for generating proteins will speed up drug development

    Artificial Intelligence is now capable of generating novel, functionally active proteins, thanks to recently published work by researchers from Chalmers University of Technology, Sweden.
    “What we are now able to demonstrate offers fantastic potential for a number of future applications, such as faster and more cost-efficient development of protein-based drugs,” says Aleksej Zelezniak, Associate Professor at the Department of Biology and Biological Engineering at Chalmers.
    Proteins are large, complex molecules that play a crucial role in all living cells, building, modifying, and breaking down other molecules naturally inside our cells. They are also widely used in industrial processes and products, and in our daily lives.
    Protein-based drugs are very common — the diabetes drug insulin is one of the most prescribed. Some of the most expensive and effective cancer medicines are also protein-based, as well as the antibody formulas currently being used to treat COVID-19.
    From computer design to working proteins in just a few weeks
    Current methods used for protein engineering rely on introducing random mutations to protein sequences. However, with each additional random mutation introduced, the protein activity declines.

  • Mathematical modeling used to analyze dynamics of CAR T-cell therapy

    Chimeric antigen receptor T-cell therapy, or CAR T, is a relatively new type of therapy approved to treat several types of aggressive B cell leukemias and lymphomas. Many patients have strong responses to CAR T; however, some have only a short response and develop disease progression quickly. Unfortunately, it is not completely understood why these patients have progression. In an article published in Proceedings of the Royal Society B, Moffitt Cancer Center researchers use mathematical modeling to help explain why CAR T cells work in some patients and not in others.
    CAR T is a type of personalized immunotherapy that uses a patient’s own T cells to target cancer cells. T cells are harvested from a patient and genetically modified in a laboratory to add a specific receptor that targets cancer cells. The patient then undergoes lymphodepletion with chemotherapy to lower some of their existing normal immune cells to help with expansion of the CAR T cells that are infused back into the patient, where they can get to work and attack the tumor.
    Mathematical modeling has been used to help predict how CAR T cells will behave after being infused back into patients; however, no studies have yet considered how interactions between the normal T cells and CAR T cells impact the dynamics of the therapy, in particular how the nonlinear T cell kinetics factor into the chances of therapy success. Moffitt researchers integrated clinical data with mathematical and statistical modeling to address these unknown factors.
    The researchers demonstrate that CAR T cells are effective because they rapidly expand after being infused back into the patient; however, the modified T cells are shown to compete with existing normal T cells, which can limit their ability to expand.
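    The press release does not give the model's equations, so the sketch below is only a toy competition system meant to illustrate the qualitative point above: CAR T cells expand rapidly, compete with recovering normal T cells for shared resources, and kill tumor cells, so deeper lymphodepletion leaves more room for CAR T expansion. It is not the Moffitt group's model; the equations, parameter names and values are all assumptions.
    ```python
    # Toy CAR T / normal T cell / tumor competition model (illustrative only).
    #   C' = rC*C*(1 - (C+N)/K) - dC*C    CAR T cells expand, compete, then decay
    #   N' = rN*N*(1 - (C+N)/K)           normal T cells recover after lymphodepletion
    #   T' = rT*T - kappa*C*T             tumor grows and is killed by CAR T cells
    from scipy.integrate import solve_ivp

    rC, rN, rT = 0.9, 0.1, 0.02      # per-day growth rates (assumed)
    K, dC, kappa = 1e9, 0.05, 1e-9   # shared carrying capacity, CAR T decay, kill rate (assumed)

    def rhs(t, y):
        C, N, T = y
        room = 1 - (C + N) / K       # shared resource limitation
        return [rC * C * room - dC * C,
                rN * N * room,
                rT * T - kappa * C * T]

    # Strong vs. weak lymphodepletion: fewer normal T cells present at infusion.
    for N0, label in [(1e6, "strong lymphodepletion"), (1e8, "weak lymphodepletion")]:
        sol = solve_ivp(rhs, (0, 200), [1e6, N0, 1e8], max_step=0.5)
        print(f"{label}: peak CAR T ~ {sol.y[0].max():.2e}, tumor at day 200 ~ {sol.y[2, -1]:.2e}")
    ```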
    “Treatment success critically depends on the ability of the CAR T cells to multiply in the patient, and this is directly dependent upon the effectiveness of lymphodepletion that reduces the normal T cells before CAR T infusion,” said Frederick Locke, M.D., co-lead study author and vice chair of the Blood and Marrow Transplant and Cellular Immunotherapy Department at Moffitt.
    In their model, the researchers discovered that tumor eradication is a random, yet potentially highly probable, event. Despite this randomness of cure, the authors demonstrated that differences in the timing and probability of cures are determined largely by variability among patient and disease factors. The model confirmed that cures tend to happen early, within 20 to 80 days, before CAR T cells decline in number, while disease progression tends to happen over a wider time range, between 200 and 500 days after treatment.
    The researchers’ model could also be used to test new treatments or propose refined clinical trial designs. For example, the researchers used their model to demonstrate that another round of CAR T-cell therapy would require a second chemotherapy lymphodepletion to improve patient outcomes.
    “Our model confirms the hypothesis that sufficient lymphodepletion is an important factor in determining durable response. Improving the adaptation of CAR T cells to expand more and survive longer in vivo could result in increased likelihood and duration of response,” explained Philipp Altrock, Ph.D., lead study author and assistant member of the Integrated Mathematical Oncology Department at Moffitt.
    Story Source:
    Materials provided by H. Lee Moffitt Cancer Center & Research Institute. Note: Content may be edited for style and length.

  • Deciphering the secrets of printed electronics

    Next-gen electronics is envisioned to be non-rigid, component-free, flexible, bendable, and easily integrable with different objects.
    Direct-write printing techniques provide a unique opportunity to enable this vision through the use of nanomaterial-based, so-called functional inks that can be tailored to add desired functionalities to various flexible substrates, such as textiles or plastic.
    The technology, known as Printed Electronics (PE), has existed for decades but has recently gained considerable attention due to innovations in material inks, process technology and design.
    To keep the research community abreast of the latest technological advancements in the area of droplet-based PE techniques for next-gen devices, researchers from Aarhus University have now published a comprehensive review of the technology in the scientific journal Advanced Materials.
    “Through this paper, we have tried to fill the existing void in literature by discussing techniques, material inks, ink properties, post processing, substrates and application to provide a complete guide. PE is an industry relevant technology and the gateway to future portable electronics, where advanced printers can print complex circuits on any material,” says Assistant Professor Shweta Agarwala, an expert in PE at the Department of Electrical and Computer Engineering at Aarhus University.
    PE is already being used for many different applications today. It is an attractive method for imparting electrical functionality onto any surface, and the major advantage of PE is that it is inexpensive and readily scalable.
    “PE offers a wide range of advantages over conventional lithography-based technologies. It provides much more production flexibility, it is cheaper and far simpler. More importantly, it opens up a plethora of new possibilities to print flexible electrical circuits directly onto a wide range of substrates such as plastics, papers, clothes, and quite literally any other planar and non-planar surfaces. The research area is moving forwards fast, and this publication provides an overview of how far we have progressed today,” says Hamed Abdolmaleki, a PhD student and first author of the paper.
    Even though PE is being used in more and more industries, and is considered very important in the electronics of the future, the technology is still in its infancy.
    For Shweta Agarwala, the sustainability aspect is very important for the future perspectives of electronics and PE technology:
    “PE is the way towards biodegradable electronics, and with this technology, we can address the huge societal problem that electronics already present, and which will only get more pressing in the future. The world is not only suffering from a huge amount of plastic pollution; it is also burdened by enormous pollution from electronics in all the devices we discard rapidly. In the review article, we have also discussed the emerging field of biodegradable substrates that will have huge environmental impact,” she adds.
    Story Source:
    Materials provided by Aarhus University. Original written by Jesper Bruun. Note: Content may be edited for style and length.

  • First steps towards revolutionary ULTRARAM™ memory chips

    A new type of universal computer memory — ULTRARAM™ — has taken a step closer towards development with a successful experiment by Lancaster physicists.
    Professor Manus Hayne, who is leading the research, commented: “These new results confirm the astonishing properties of ULTRARAM™, allowing us to demonstrate its potential as a fast and efficient non-volatile memory with high-endurance.”
    Currently, the two main types of memory, dynamic RAM (DRAM) and flash, have complementary characteristics and roles. DRAM is fast, so it is used for active (working) memory, but it is volatile, meaning that information is lost when power is removed. Indeed, DRAM continually ‘forgets’ and needs to be constantly refreshed. Flash is non-volatile, allowing you to carry data in your pocket, but it is very slow and wears out. It is well suited for data storage but cannot be used for active memory.
    "Universal memory" is a memory in which data is stored very robustly yet can also be changed easily; something that was widely considered to be unachievable until now.
    The Lancaster team has solved the paradox of universal memory by exploiting a quantum mechanical effect called resonant tunnelling that allows a barrier to switch from opaque to transparent by applying a small voltage.
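    The press release does not spell the physics out, but the textbook double-barrier picture conveys the switching idea; the expressions below are a generic illustration, not the device modelling from the paper.
    ```latex
    % Generic resonant-tunnelling picture (illustrative; not the paper's device model).
    % Off resonance, a barrier of height $V_0$ and width $d$ is exponentially opaque:
    \[
      T_{\mathrm{off}} \sim e^{-2\kappa d}, \qquad
      \kappa = \frac{\sqrt{2 m^{*} (V_0 - E)}}{\hbar} .
    \]
    % In a double-barrier structure, transmission near a quasi-bound level $E_n$ of the
    % well takes the Breit-Wigner form
    \[
      T(E) \approx \frac{\Gamma_L \Gamma_R}{(E - E_n)^2 + \bigl(\tfrac{\Gamma_L + \Gamma_R}{2}\bigr)^2},
    \]
    % which reaches $4\Gamma_L\Gamma_R/(\Gamma_L+\Gamma_R)^2 \to 1$ for symmetric barriers.
    % A small applied voltage shifts $E_n$ into (or out of) alignment with the electron
    % energy $E$, switching the barrier between transparent and opaque.
    ```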
    Their new non-volatile RAM, called ULTRARAM™, is a working implementation of so-called ‘universal memory’, promising all the advantages of DRAM and flash, with none of the drawbacks.
    In their latest work, published in IEEE Transactions on Electron Devices, the researchers have integrated ULTRARAM™ devices into small (4-bit) arrays for the first time. This has allowed them to experimentally verify a novel, patent-pending, memory architecture that would form the basis of future ULTRARAM™ memory chips.
    They have also modified the device design to take full advantage of the physics of resonant tunnelling, resulting in devices that are 2,000 times faster than the first prototypes, and with program/erase cycling endurance that is at least ten times better than flash, without any compromise in data retention.
    Story Source:
    Materials provided by Lancaster University. Note: Content may be edited for style and length.

  • Privacy-preserving 'encounter metrics' could slow down future pandemics

    When you bump into someone in the workplace or at your local coffee shop, you might call that an “encounter.” That’s the scientific term for it, too. As part of urgent efforts to fight COVID-19, a science is rapidly developing for measuring the number of encounters and the different levels of interaction in a group.
    At the National Institute of Standards and Technology (NIST), researchers are applying that science to a concept they have created called “encounter metrics.” They have developed an encrypted method that can be applied to a device such as your phone to help with the ultimate goal of slowing down or preventing future pandemics. The method is also applicable to the COVID-19 pandemic.
    Their research is explained in a pilot study published in the Journal of Research of NIST.
    Encounter metrics measure the levels of interactions between members of a population. A level of interaction could be the number of people in a bathroom who are talking to each other or a group of people walking down a hallway. There are numerous levels of interactions because there are so many different ways people can interact with one another in different environments.
    Mitigating the spread of an infectious disease rests on the assumption that reducing communication and interaction within a community is essential. Fewer interactions among people mean there is less chance of the disease spreading from one person to another. “We need to measure that. It’s important to develop technology to measure that and then see how we can use that technology to shape our working environment to slow future pandemics,” said NIST researcher René Peralta, an author of the NIST study.
    Picture two people walking from opposite ends of a hallway who meet in the middle. To record this encounter, each person could carry their own phone or a Bluetooth device that broadcasts a signal as soon as the encounter occurs. One way of labeling the encounter is through the exchange of device IDs, or pseudonyms: each device sends its own pseudonym. The pseudonyms could be changed every 10 minutes to help protect the person’s identity.
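    The study's actual protocol and cryptography are not reproduced here, but the basic bookkeeping described above (a random pseudonym per device that rotates every 10 minutes, and an encounter log of pseudonym pairs with timestamps) can be sketched as follows. The class and function names are hypothetical, and the Bluetooth transport and any further privacy-preserving aggregation are omitted.
    ```python
    # Minimal rotating-pseudonym encounter log (illustrative sketch; not NIST's implementation).
    import secrets
    import time

    ROTATION_SECONDS = 10 * 60   # pseudonyms change every 10 minutes, as described above

    class Device:
        def __init__(self, name):
            self.name = name            # local label only; never broadcast
            self.encounter_log = []     # (timestamp, own pseudonym, peer pseudonym)
            self._pseudonym = b""
            self._rotated_at = float("-inf")

        def pseudonym(self, now):
            """Return the current 16-byte random pseudonym, rotating it on schedule."""
            if now - self._rotated_at >= ROTATION_SECONDS:
                self._pseudonym = secrets.token_bytes(16)
                self._rotated_at = now
            return self._pseudonym

        def record_encounter(self, peer_pseudonym, now):
            self.encounter_log.append((now, self.pseudonym(now).hex(), peer_pseudonym.hex()))

    def encounter(a, b, now=None):
        """Two devices come within range: each broadcasts its pseudonym to the other."""
        now = time.time() if now is None else now
        pa, pb = a.pseudonym(now), b.pseudonym(now)
        a.record_encounter(pb, now)
        b.record_encounter(pa, now)

    # Two people meeting in the middle of a hallway, as in the example above.
    alice, bob = Device("alice-phone"), Device("bob-phone")
    encounter(alice, bob)
    print(alice.encounter_log)
    ```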