More stories


    Computational approach enables spatial mapping of single-cell data within tissues

    A new computational approach developed by researchers at The University of Texas MD Anderson Cancer Center successfully combines data from parallel gene-expression profiling methods to create spatial maps of a given tissue at single-cell resolution. The resulting maps can provide unique biological insights into the cancer microenvironment and many other tissue types.
    The study was published today in Nature Biotechnology and will be presented at the upcoming American Association for Cancer Research (AACR) Annual Meeting 2022 (Abstract 2129).
    The tool, called CellTrek, uses data from single-cell RNA sequencing (scRNA-seq) together with that of spatial transcriptomics (ST) assays — which measure spatial gene expression in many small groups of cells — to accurately pinpoint the location of individual cell types within a tissue. The researchers presented findings from analysis of kidney and brain tissues as well as samples of ductal carcinoma in situ (DCIS) breast cancer.
    “Single-cell RNA sequencing provides tremendous information about the cells within a tissue, but, ultimately, you want to know where these cells are distributed, particularly in tumor samples,” said senior author Nicholas Navin, Ph.D., professor of Genetics and Bioinformatics & Computational Biology. “This tool allows us to answer that question with an unbiased approach that improves upon currently available spatial mapping techniques.”
    Single-cell RNA sequencing is an established method to analyze the gene expression of many individual cells from a sample, but it cannot provide information on the location of cells within a tissue. On the other hand, ST assays can measure spatial gene expression by analyzing many small groups of cells across a tissue but are not capable of providing single-cell resolution.
    Current computational approaches, known as deconvolution techniques, can identify different cell types present from ST data, but they are not capable of providing detailed information at the single-cell level, Navin explained.
    Therefore, co-first authors Runmin Wei, Ph.D., and Siyuan He of the Navin Laboratory led the efforts to develop CellTrek as a tool to combine the unique advantages of scRNA-seq and ST assays and create accurate spatial maps of tissue samples.
    Using publicly available scRNA-seq and ST data from brain and kidney tissues, the researchers demonstrated that CellTrek achieved the most accurate and detailed spatial resolution of the methods evaluated. The CellTrek approach also was able to distinguish subtle gene expression differences within the same cell type to gain information on their heterogeneity within a sample.
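    As a rough intuition for what this kind of spatial mapping involves (this is not the published CellTrek algorithm, just a minimal nearest-spot assignment on made-up data), each single cell can be placed at the ST spot whose expression profile it correlates with best:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 6 single cells and 4 spatial spots, 5 shared genes.
# Rows are cells/spots, columns are gene-expression values.
cells = rng.random((6, 5))          # scRNA-seq profiles (cells x genes)
spots = rng.random((4, 5))          # ST profiles (spots x genes)
spot_xy = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # spot coordinates

def assign_cells_to_spots(cells, spots):
    """Assign each cell to the spot with the most similar expression
    profile (Pearson correlation) -- a crude stand-in for CellTrek's
    actual mapping procedure."""
    # Center and normalize each profile so dot products become correlations.
    c = cells - cells.mean(axis=1, keepdims=True)
    s = spots - spots.mean(axis=1, keepdims=True)
    c /= np.linalg.norm(c, axis=1, keepdims=True)
    s /= np.linalg.norm(s, axis=1, keepdims=True)
    sim = c @ s.T                   # cells x spots correlation matrix
    return sim.argmax(axis=1)       # best-matching spot per cell

best_spot = assign_cells_to_spots(cells, spots)
cell_xy = spot_xy[best_spot]        # inferred spatial position per cell
print(best_spot, cell_xy.shape)
```

    The real method must additionally resolve many cells per spot and correct for the two assays' different noise profiles, which is where the hard work lies.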
    The researchers also collaborated with Savitri Krishnamurthy, M.D., professor of Pathology, to apply CellTrek to study DCIS breast cancer tissues. In an analysis of 6,800 single cells and 1,500 ST regions from a single DCIS sample, the team learned that different subgroups of tumor cells were evolving in unique patterns within specific regions of the tumor. Analysis of a second DCIS sample demonstrated the ability of CellTrek to reconstruct the spatial tumor-immune microenvironment within a tumor tissue.
    “While this approach is not restricted to analyzing tumor tissues, there are obvious applications for better understanding cancer,” Navin said. “Pathology really drives cancer diagnoses and, with this tool, we’re able to map molecular data on top of pathological data to allow even deeper classifications of tumors and to better guide treatment approaches.”
    This research was supported by the National Institutes of Health/National Cancer Institute (R01CA240526, R01CA236864, CA016672), the Cancer Prevention and Research Institute of Texas (CPRIT) (RP180684), the Chan Zuckerberg Initiative SEED Network Grant, and the PRECISION Cancer Grand Challenges Grant. Navin is supported by the American Association for the Advancement of Science (AAAS) Martin and Rose Wachtel Cancer Research Award, the Damon Runyon-Rachleff Innovation Award, the Andrew Sabin Family Fellowship, and the Jack and Beverly Randall Prize for Excellence in Cancer Research. Wei is supported by a Damon Runyon Quantitative Biology Fellowship Award.
    Collaborating MD Anderson authors include Shanshan Bai, Emi Sei, Ph.D., and Min Hu, all of Genetics; and Ken Chen, Ph.D., of Bioinformatics. Additional authors include Alastair Thompson, M.D., of Baylor College of Medicine, Houston. The authors have no conflicts of interest.


    Tiny magnets could hold the secret to new quantum computers

    Magnetic interactions could point to miniaturizable quantum devices.
    From MRI machines to computer hard disk storage, magnetism has played a role in pivotal discoveries that reshape our society. In the new field of quantum computing, magnetic interactions could play a role in relaying quantum information.
    In new research from the U.S. Department of Energy’s (DOE) Argonne National Laboratory, scientists have achieved efficient quantum coupling between two distant magnetic devices, which can host a certain type of magnetic excitation called a magnon. Magnons are collective excitations of electron spins that travel through a magnetic material as spin waves. Coupling allows magnons to exchange energy and information. This kind of coupling may be useful for creating new quantum information technology devices.
    “Remote coupling of magnons is the first step, or almost a prerequisite, for doing quantum work with magnetic systems,” said Argonne senior scientist Valentine Novosad, an author of the study. ​”We show the ability for these magnons to communicate instantly with each other at a distance.”
    This instant communication does not require sending a message between the magnons, a signal that would be limited by the speed of light. It is analogous to what physicists call quantum entanglement.
    Following on from a 2019 study, the researchers sought to create a system that would allow magnetic excitations to talk to one another at a distance in a superconducting circuit. This would allow the magnons to potentially form the basis of a type of quantum computer. For the basic underpinnings of a viable quantum computer, researchers need the particles to be coupled and stay coupled for a long time.
    To achieve a strong coupling effect, the researchers built a superconducting circuit with two small yttrium iron garnet (YIG) magnetic spheres embedded in it. This material, which supports magnonic excitations, ensures efficient, low-loss coupling between the magnetic spheres.
    The two spheres are both magnetically coupled to a shared superconducting resonator in the circuit, which acts like a telephone line to create strong coupling between the two spheres even when they are almost a centimeter apart — about 30 times their diameter.
    “This is a significant achievement,” said Argonne materials scientist Yi Li, lead author of the study. ​”Similar effects can also be observed between magnons and superconducting resonators, but this time we did it between two magnon resonators without direct interaction. The coupling comes from indirect interaction between the two spheres and the shared superconducting resonator.”
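    The physics of this telephone-line coupling can be sketched with a three-mode toy model: two magnon modes that never interact directly, each coupled with strength g to a shared resonator detuned by Δ. Diagonalizing the toy Hamiltonian shows the resonator mediating an effective magnon-magnon splitting of roughly 2g²/Δ. The numbers below are illustrative, not the values from the Argonne experiment:

```python
import numpy as np

# Two magnon modes at w_m, each coupled with strength g to a shared
# superconducting resonator at w_r (frequencies in GHz, made up).
w_m, w_r, g = 5.0, 5.5, 0.05
H = np.array([
    [w_m, 0.0, g],    # magnon 1: no direct magnon-magnon term
    [0.0, w_m, g],    # magnon 2: no direct magnon-magnon term
    [g,   g,   w_r],  # shared resonator couples to both
])

freqs = np.sort(np.linalg.eigvalsh(H))

# The antisymmetric ("dark") magnon combination stays exactly at w_m;
# the symmetric ("bright") one is pushed by the resonator, so the two
# magnon-like modes split by about 2*g**2/(w_r - w_m) even though the
# spheres never talk to each other directly.
splitting = freqs[1] - freqs[0]
print(splitting, 2 * g**2 / (w_r - w_m))
```

    This indirect, resonator-mediated interaction is the "coupling comes from indirect interaction" that Li describes.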
    One additional improvement over the 2019 study involved the longer coherence of the magnons in the magnetic resonator. ​”If you speak in a cave, you may hear an echo,” said Novosad. ​”The longer that echo lasts, the longer the coherence.”
    “Before, we definitely saw a relationship between magnons and a superconducting resonator, but in this study their coherence times are much longer because of the use of the spheres, which is why we can see evidence of separated magnons talking to each other,” Li added.
    According to Li, because the magnetic spins are highly concentrated in the device, the study could point to miniaturizable quantum devices. ​”It’s possible that tiny magnets could hold the secret to new quantum computers,” he said.
    Story Source:
    Materials provided by DOE/Argonne National Laboratory. Original written by Jared Sagoff. Note: Content may be edited for style and length.


    Characterizing super-semi sandwiches for quantum computing

    “There is an international race to identify the best platform for controlling and processing quantum information for quantum computers, where superconductors play a prominent role,” says Duc Phan, PhD student at the Institute of Science and Technology Austria (ISTA) and first author of a new paper now published in Physical Review Letters. “Microsoft is working on topological qubits using superconductor-semiconductor sandwiches. However, before we can use them, we must understand the fundamental physics behind them.”
    Phan and his ISTA colleagues Jorden Senior and Andrew Higginbotham from the Condensed Matter and Quantum Circuits group conducted this study in close collaboration with partners from New York University and with theory support from Areg Ghazaryan and Maksym Serbyn of ISTA’s Quantum Dynamics group. They developed a technique to probe the quantum interactions in super-semi sandwiches, paving the way for new applications such as topological quantum bits based on so-called Majorana zero modes.
    Cold Environment
    For their experiment, the researchers created a microscopic sandwich made of an aluminium (Al) superconductor on top of an indium arsenide (InAs) semiconductor. Superconductors are materials that have no electrical resistance; to reach this state, they are cooled to close to absolute zero. Semiconductors like InAs or silicon can insulate or conduct electricity depending on their environment and the applied electric field.
    Just like a conventional sandwich that becomes more than the sum of its parts, the combined properties of Al and InAs become modified in super-semi sandwiches. At the interface between the Al superconductor and the InAs semiconductor, the proximity effect spills superconductivity into the semiconductor, creating new quantum states there. Until now, however, researchers had a hard time studying these states because they are concealed by the Al superconducting layer and could not be probed directly.
    “We found that by sending a current alternating billions of times a second through the vicinity of the sandwich, we could make the superconductor’s veil partially transparent and get feedback about the properties of the semiconductor,” explains Senior. “We also applied a magnetic field to create new quantum states we were looking for and developed a new model that explained our observations.”
    A new level of detail
    This first experimental result of the Higginbotham group since its establishment at ISTA lays the groundwork to study superconductor-semiconductor hybrid structures at a new level of detail. “The parameters we can infer from this could provide much-needed guidance to construct topological quantum bits based on Majorana zero modes,” says Senior. He also highlights that “ISTA is very well placed in this developing field because here experimental expertise, theoretical understanding, as well as excellent infrastructure provided by the state-of-the-art clean room — the kitchen for sandwich production — come together.”
    Phan and his colleagues are excited about what insights they will gain with their novel probing technique and what future applications may become possible once the fundamental physics of this exotic sandwich has been understood.
    Story Source:
    Materials provided by Institute of Science and Technology Austria. Note: Content may be edited for style and length.


    New technology to make charging electric cars as fast as pumping gas

    Whether it’s photovoltaics or fusion, sooner or later, human civilization must turn to renewable energies. This is deemed inevitable considering the ever-growing energy demands of humanity and the finite nature of fossil fuels. As such, much research has been pursued to develop alternative sources of energy, most of which use electricity as the main energy carrier. The extensive R&D in renewables has been accompanied by gradual societal changes as the world adopted new products and devices running on renewables. The most striking recent change is the rapid adoption of electric vehicles. While they were hardly seen on the roads even 10 years ago, millions of electric cars are now being sold annually. The electric car market is one of the most rapidly growing sectors, and it helped propel Elon Musk to become the wealthiest man in the world.
    Unlike traditional cars, which derive energy from the combustion of hydrocarbon fuels, electric vehicles rely on batteries as the storage medium for their energy. For a long time, batteries offered far lower energy density than hydrocarbons, which resulted in the very short range of early electric vehicles. However, gradual improvement in battery technologies eventually brought the driving range of electric cars to acceptable levels in comparison with gasoline-burning cars. It is no exaggeration to say that improved battery storage was one of the main technical bottlenecks that had to be solved in order to kickstart the current electric vehicle revolution.
    However, despite the vast improvements in battery technology, consumers of electric vehicles today face another difficulty: slow charging speed. Currently, cars take about 10 hours to fully recharge at home. Even the fastest superchargers at charging stations require 20 to 40 minutes to fully recharge a vehicle. This creates additional cost and inconvenience for customers.
    To address this problem, scientists looked for answers in the mysterious field of quantum physics. Their search led to the discovery that quantum technologies may offer new mechanisms to charge batteries at a faster rate. The concept of a “quantum battery” was first proposed in a seminal 2012 paper by Alicki and Fannes. It was theorized that quantum resources, such as entanglement, could be used to vastly speed up the battery charging process by charging all cells within the battery simultaneously in a collective manner.
    This is particularly exciting as modern large-capacity batteries can contain numerous cells. Such collective charging is not possible in classical batteries, where the cells are charged in parallel independently of one another. The advantage of collective over parallel charging can be measured by a ratio called the ‘quantum charging advantage’. Later, around 2017, it was noticed that there can be two possible sources behind this quantum advantage — namely ‘global operation’ (in which all the cells talk to all the others simultaneously, i.e., “all sitting at one table”) and ‘all-to-all coupling’ (every cell can talk with every other cell, but only in pairs, i.e., “many discussions, but every discussion has only two participants”). However, it was unclear whether both of these sources are necessary and whether there is any limit to the charging speed that can be achieved.
    Recently, scientists from the Center for Theoretical Physics of Complex Systems within the Institute for Basic Science (IBS) further explored these questions. The paper, which was chosen as an “Editor’s Suggestion” in the journal Physical Review Letters, showed that all-to-all coupling is irrelevant in quantum batteries and that the presence of global operations is the only ingredient in the quantum advantage. The group went further to pinpoint the exact source of this advantage while ruling out any other possibilities and even provided an explicit way of designing such batteries.
    In addition, the group was able to precisely quantify how much charging speed can be achieved in this scheme. While the maximum charging speed increases linearly with the number of cells in classical batteries, the study showed that quantum batteries employing global operation can achieve quadratic scaling in charging speed. To illustrate this, we will consider a typical electric vehicle with a battery that contains about 200 cells. Employing this quantum charging would lead to a 200 times speedup over classical batteries, which means that at home charging time would be cut from 10 hours to about 3 minutes. At high-speed charging stations, the charge time would be cut from 30 minutes to mere seconds.
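    The scaling argument above reduces to simple arithmetic. A minimal sketch, taking the article's figures (200 cells, a 10-hour home charge, linear classical scaling versus quadratic quantum scaling) at face value:

```python
def classical_speed(n):
    """Classical batteries: charging power grows linearly with cell count."""
    return n

def quantum_speed(n):
    """Quantum batteries with global operations: quadratic scaling."""
    return n ** 2

n_cells = 200                      # typical EV battery, per the article
advantage = quantum_speed(n_cells) / classical_speed(n_cells)
print(advantage)                   # → 200.0 (the 'quantum charging advantage')

home_hours = 10
print(home_hours * 60 / advantage)  # → 3.0 (minutes instead of 10 hours)
```

    The same ratio turns a 30-minute supercharger session into roughly 9 seconds, matching the article's "mere seconds" claim.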
    Researchers say that consequences can be far-reaching and that the implications of quantum charging can go well beyond electric cars and consumer electronics. For example, it may find key uses in future fusion power plants, which require large amounts of energy to be charged and discharged in an instant. Of course, quantum technologies are still in their infancy and there is a long way to go before these methods can be implemented in practice. Research findings such as these, however, create a promising direction and can incentivize the funding agencies and businesses to further invest in these technologies. If employed, it is believed that quantum batteries would completely revolutionize the way we use energy and take us a step closer to our sustainable future.
    Story Source:
    Materials provided by Institute for Basic Science. Note: Content may be edited for style and length.


    Artificial intelligence paves the way to discovering new rare-earth compounds

    Artificial intelligence advances how scientists explore materials. Researchers from Ames Laboratory and Texas A&M University trained a machine-learning (ML) model to assess the stability of rare-earth compounds. This work was supported by the Laboratory Directed Research and Development (LDRD) program at Ames Laboratory. The framework they developed builds on current state-of-the-art methods for experimenting with compounds and understanding chemical instabilities.
    Ames Lab has been a leader in rare-earths research since the middle of the 20th century. Rare earth elements have a wide range of uses including clean energy technologies, energy storage, and permanent magnets. Discovery of new rare-earth compounds is part of a larger effort by scientists to expand access to these materials.
    The present approach is based on machine learning (ML), a form of artificial intelligence (AI), which is driven by computer algorithms that improve through data usage and experience. Researchers used the upgraded Ames Laboratory Rare Earth database (RIC 2.0) and high-throughput density-functional theory (DFT) to build the foundation for their ML model.
    High-throughput screening is a computational scheme that allows researchers to test hundreds of models quickly. DFT is a quantum mechanical method used to investigate the thermodynamic and electronic properties of many-body systems. Based on this collection of information, the developed ML model uses regression learning to assess the phase stability of compounds.
    Tyler Del Rose, an Iowa State University graduate student, conducted much of the foundational research needed for the database by writing algorithms to search the web for information to supplement the database and DFT calculations. He also worked on experimental validation of the AI predictions and helped to improve the ML based models by ensuring they are representative of reality.
    “Machine learning is really important here because when we are talking about new compositions, ordered materials are all very well known to everyone in the rare earth community,” said Ames Laboratory Scientist Prashant Singh, who led the DFT plus machine learning effort with Guillermo Vazquez and Raymundo Arroyave. “However, when you add disorder to known materials, it’s very different. The number of compositions becomes significantly larger, often thousands or millions, and you cannot investigate all the possible combinations using theory or experiments.”
    Singh explained that the material analysis is based on a discrete feedback loop in which the AI/ML model is updated with a new DFT database built from real-time structural and phase information obtained from the experiments. This process ensures that information is carried from one step to the next and reduces the chance of making mistakes.
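    The DFT-plus-regression idea can be sketched in a few lines. This is a toy stand-in, not the team's actual RIC 2.0 workflow: synthetic composition features replace real descriptors, and a closed-form ridge regression stands in for whatever learner the group used:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the DFT training set: each row is a feature vector
# for one candidate composition (e.g., atomic radii, electronegativity),
# and y plays the role of a DFT-computed formation energy.
X = rng.random((50, 4))
true_w = np.array([0.8, -1.2, 0.3, 0.5])
y = X @ true_w + 0.01 * rng.standard_normal(50)

def ridge_fit(X, y, alpha=1e-3):
    """Closed-form ridge regression: a minimal stand-in for the
    regression learner used to score phase stability."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

w = ridge_fit(X, y)

# Score an unseen candidate composition; in the real pipeline this
# prediction would be fed back into the DFT/experiment loop.
candidate = np.array([0.5, 0.9, 0.2, 0.4])
print(candidate @ w)
```

    The feedback loop Singh describes amounts to appending newly validated (X, y) rows and refitting, so the model improves with each experimental round.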
    Yaroslav Mudryk, the project supervisor, said that the framework was designed to explore rare earth compounds because of their technological importance, but its application is not limited to rare-earths research. The same approach can be used to train an ML model to predict magnetic properties of compounds, process controls for transformative manufacturing, and optimize mechanical behaviors.
    “It’s not really meant to discover a particular compound,” Mudryk said. “It was, how do we design a new approach or a new tool for discovery and prediction of rare earth compounds? And that’s what we did.”
    Mudryk emphasized that this work is just the beginning. The team is exploring the full potential of this method, but they are optimistic that there will be a wide range of applications for the framework in the future.
    Story Source:
    Materials provided by DOE/Ames Laboratory. Note: Content may be edited for style and length.


    Researchers develop the world's first power-free frequency tuner using nanomaterials

    In a paper published today in Nature Communications, researchers at the University of Oxford and the University of Pennsylvania report a power-free, ultra-fast way of tuning frequency using functional nanowires.
    Think of an orchestra warming up before the performance. The oboe starts to play a perfect A note at a frequency of 440 Hz while all the other instruments adjust themselves to that frequency. Telecommunications technology relies on this very concept of matching the frequencies of transmitters and receivers. In practice, this is achieved when both ends of the communication link tune into the same frequency channel.
    In today’s colossal communications networks, the ability to reliably synthesise as many frequencies as possible and to rapidly switch from one to another is paramount for seamless connectivity.
    Researchers at the University of Oxford and the University of Pennsylvania have fabricated vibrating nanostrings of a chalcogenide glass (germanium telluride) that resonate at predetermined frequencies, just like guitar strings. To tune the frequency of these resonators, the researchers switch the atomic structure of the material, which in turn changes the mechanical stiffness of the material itself.
    This differs from existing approaches, which apply mechanical stress to the nanostrings, much like tuning a guitar with its tuning pegs. That approach translates directly into higher power consumption, because the applied tension is not permanent and a voltage is required to hold it.
    Utku Emre Ali, who completed the research as part of his doctoral work at the University of Oxford, said:
    ‘By changing how atoms bond with each other in these glasses, we are able to change the Young’s modulus within a few nanoseconds. Young’s modulus is a measure of stiffness, and it directly affects the frequency at which the nanostrings vibrate.’
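    The stiffness-frequency link quoted above follows from standard beam mechanics: for a doubly clamped string with no residual tension, the fundamental flexural frequency scales as √(E/ρ). A sketch with hypothetical numbers (the dimensions and moduli below are illustrative, not measured GeTe values):

```python
import math

def beam_frequency(E, rho, L, t):
    """Fundamental flexural frequency of a doubly clamped beam with no
    residual stress: f ≈ 1.028 * (t / L**2) * sqrt(E / rho)."""
    return 1.028 * (t / L**2) * math.sqrt(E / rho)

# Hypothetical nanostring: 10 um long, 100 nm thick, density 6000 kg/m^3.
L, t, rho = 10e-6, 100e-9, 6000.0

# Illustrative Young's moduli for the two atomic structures (Pa).
E_amorphous, E_crystal = 40e9, 90e9

f_a = beam_frequency(E_amorphous, rho, L, t)
f_c = beam_frequency(E_crystal, rho, L, t)
# The frequency shifts by sqrt(E_crystal / E_amorphous), with no
# sustained voltage needed to hold the new value.
print(f_c / f_a)
```

    Because the phase change is non-volatile, the retuned frequency persists with zero standby power, which is the advantage over peg-style stress tuning.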


    Making memory serve correctly: Fixing an inherent problem in next-generation magnetic RAM

    With the advent of the Internet of Things (IoT) era, many researchers are focused on making most of the technologies involved more sustainable. To reach this target of ‘green IoT,’ some of the building blocks of conventional electronics will have to be improved or radically changed to make them not only faster, but also more energy efficient. In line with this reasoning, many scientists worldwide are currently trying to develop and commercialize a new type of random-access memory (RAM) that will enable ultra-low-power electronics: magnetic RAMs.
    Each memory cell in a magnetic RAM stores either a ‘1’ or a ‘0’ depending on whether the magnetic orientations of two magnetic layers are equal or opposite to each other. Various types of magnetic RAM exist, and they mainly differ in how they modify the magnetic orientation of these layers when writing to a memory cell. In particular, spin-transfer torque RAM (STT-RAM) is one type of magnetic memory that is already being commercialized. However, to achieve even lower write currents and higher reliability, a new type of magnetic memory called spin-orbit torque RAM (SOT-RAM) is being actively researched.
    In SOT-RAM, by leveraging spin-orbit interactions, the write current can be immensely reduced, which lowers power consumption. Moreover, since the memory readout and write current paths are different, researchers initially thought that the potential disturbances on the stored values would also be small when either reading or writing. Unfortunately, this turned out not to be the case.
    In 2017, in a study led by Professor Takayuki Kawahara of Tokyo University of Science, Japan, researchers reported that SOT-RAMs face an additional source of disturbance when reading a stored value. In conventional SOT-RAMs, the readout current actually shares part of the path of the write current. When reading a value, the readout operation generates unbalanced spin currents due to the spin Hall effect. This can unintentionally flip the stored bit if the effect is large enough, making reading in SOT-RAMs less reliable.
    To address this problem, Prof. Kawahara and colleagues conducted another study, which was recently published in IEEE Transactions on Magnetics. The team came up with a new reading method for SOT-RAMs that can nullify this new source of readout disturbance. In short, their idea is to alter the original SOT-RAM structure to create a bi-directional read path. When reading a value, the read current flows out of the magnetic layers in two opposite directions simultaneously. In turn, the disturbances produced by the spin currents generated on each side end up cancelling each other out. An explainer video on the same topic can be watched here: https://youtu.be/Gbz4rDOs4yQ.
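    The cancellation idea can be captured in a toy model where the read disturbance is taken to be proportional to the net current through the shared path. The linear model and numbers below are illustrative only, not the device physics from the paper:

```python
def read_disturbance(currents, k=1.0):
    """Toy model: the spin-Hall torque acting on the stored bit is
    assumed proportional to the net read current in the channel."""
    return k * sum(currents)

# Conventional read: the full read current flows one way along part of
# the write path, producing a nonzero disturbance.
conventional = read_disturbance([1.0])

# Bi-directional read: the current is split into two equal halves that
# flow out of the magnetic layer in opposite directions, so the spin
# currents generated on each side cancel.
bidirectional = read_disturbance([0.5, -0.5])

print(conventional, bidirectional)
```

    The array-level caveat in the next paragraph is visible in this model too: if the cell's position makes the two paths slightly unequal (say 0.52 versus -0.48), a small residual disturbance survives, which is why the team tested realistic array structures.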
    In addition to cementing the theory behind this new source of readout disturbance, the researchers conducted a series of simulations to verify the effectiveness of their proposed method. They tested three different types of ferromagnetic materials for the magnetic layers and various device shapes. The results were very favorable, as Prof. Kawahara remarks: “We confirmed that the proposed method reduces the readout disturbance by at least 10 times for all material parameters and device geometries compared with the conventional read path in SOT-RAM.”
    To top things off, the research team checked the performance of their method in the type of realistic array structure that would be used in an actual SOT-RAM. This test is important because the read paths in an array structure would not be perfectly balanced depending on each memory cell’s position. The results show that a sufficient readout disturbance reduction is possible even when connecting about 1,000 memory cells together. The team is now working towards improving their method to reach a higher number of integrated cells.
    This study could pave the way toward a new era in low-power electronics, from personal computers and portable devices to large-scale servers. Satisfied with what they have achieved, Prof. Kawahara remarks: “We expect next-generation SOT-RAMs to employ write currents an order of magnitude lower than current STT-RAMs, resulting in significant power savings. The results of our work will help solve one of the inherent problems of SOT-RAMs, which will be essential for their commercialization.” 
    Story Source:
    Materials provided by Tokyo University of Science. Note: Content may be edited for style and length.


    AI provides accurate breast density classification

    An artificial intelligence (AI) tool can accurately and consistently classify breast density on mammograms, according to a study in Radiology: Artificial Intelligence.
    Breast density reflects the amount of fibroglandular tissue in the breast commonly seen on mammograms. High breast density is an independent breast cancer risk factor, and its masking effect of underlying lesions reduces the sensitivity of mammography. Consequently, many U.S. states have laws requiring that women with dense breasts be notified after a mammogram, so that they can choose to undergo supplementary tests to improve cancer detection.
    In clinical practice, breast density is visually assessed on two-view mammograms, most commonly with the American College of Radiology Breast Imaging-Reporting and Data System (BI-RADS) four-category scale, ranging from Category A for almost entirely fatty breasts to Category D for extremely dense. The system has limitations, as visual classification is prone to inter-observer variability, or the differences in assessments between two or more people, and intra-observer variability, or the differences that appear in repeated assessments by the same person.
    To overcome this variability, researchers in Italy developed software for breast density classification based on deep learning with convolutional neural networks, a sophisticated type of AI capable of discerning subtle patterns in images beyond the capabilities of the human eye. The researchers trained the software, known as TRACE4BDensity, under the supervision of seven experienced radiologists who independently visually assessed 760 mammographic images.
    External validation of the tool was performed by the three radiologists closest to the consensus on a dataset of 384 mammographic images obtained from a different center.
    TRACE4BDensity showed 89% accuracy in distinguishing between low density (BI-RADS categories A and B) and high density (BI-RADS categories C and D) breast tissue, with an agreement of 90% between the tool and the three readers. All disagreements were in adjacent BI-RADS categories.
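    The A/B-versus-C/D grouping and the distinction between exact and binary agreement can be made concrete with a small sketch. The reader and tool labels below are hypothetical, not the study's data:

```python
def density_group(category):
    """Map the four BI-RADS density categories to the binary grouping
    used in the study: A/B = low density, C/D = high density."""
    return "low" if category in ("A", "B") else "high"

# Hypothetical reader-vs-tool category assignments for ten mammograms.
reader = ["A", "B", "C", "D", "B", "C", "D", "A", "C", "B"]
tool   = ["A", "B", "C", "C", "B", "D", "D", "A", "B", "B"]

exact_agreement = sum(r == t for r, t in zip(reader, tool)) / len(reader)
binary_agreement = sum(
    density_group(r) == density_group(t) for r, t in zip(reader, tool)
) / len(reader)
print(exact_agreement, binary_agreement)
```

    Note how disagreements within the same group (D vs. C, C vs. D) lower the exact agreement but not the binary one; only a disagreement that crosses the B/C boundary affects the low/high classification that drives supplemental-screening decisions.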
    “The particular value of this tool is the possibility to overcome the suboptimal reproducibility of visual human density classification that limits its practical usability,” said study co-author Sergio Papa, M.D., from the Centro Diagnostico Italiano in Milan, Italy. “To have a robust tool that proposes the density assignment in a standardized fashion may help a lot in decision-making.”
    Such a tool would be particularly valuable, the researchers said, as breast cancer screening becomes more personalized, with density assessment accounting for one important factor in risk stratification.
    “A tool such as TRACE4BDensity can help us advise women with dense breasts to have, after a negative mammogram, supplemental screening with ultrasound, MRI or contrast-enhanced mammography,” said study co-author Francesco Sardanelli, M.D., from the IRCCS Policlinico San Donato in San Donato, Italy.
    The researchers plan additional studies to better understand the full capabilities of the software.
    “We would like to further assess the AI tool TRACE4BDensity, particularly in countries where regulations on breast density notification are not active, by evaluating the usefulness of such a tool for radiologists and patients,” said study co-author Christian Salvatore, Ph.D., senior researcher, University School for Advanced Studies IUSS Pavia and co-founder and chief executive officer of DeepTrace Technologies.
    “Development and Validation of an AI-driven Mammographic Breast Density Classification Tool Based on Radiologist Consensus.” Collaborating with Drs. Papa, Sardanelli and Salvatore were Veronica Magni, M.D., Matteo Interlenghi, M.Sc., Andrea Cozzi, M.D., Marco Alì, Ph.D., Alcide A. Azzena, M.D., Davide Capra, M.D., Serena Carriero, M.D., Gianmarco Della Pepa, M.D., Deborah Fazzini, M.D., Giuseppe Granata, M.D., Caterina B. Monti, M.D., Ph.D., Giulia Muscogiuri, M.D., Giuseppe Pellegrino, M.D., Simone Schiaffino, M.D., and Isabella Castiglioni, M.Sc., M.B.A.