More stories

  • Semiconductors reach the quantum world

    Quantum effects in superconductors could give semiconductor technology a new twist. Researchers at the Paul Scherrer Institute PSI and Cornell University in New York State have identified a composite material that could integrate quantum devices into semiconductor technology, making electronic components significantly more powerful. They publish their findings today in the journal Science Advances.
    Our current electronic infrastructure is based primarily on semiconductors. This class of materials emerged around the middle of the 20th century and has been improving ever since. Among the most important challenges in semiconductor electronics today are increasing the bandwidth of data transmission, improving energy efficiency and strengthening information security. Exploiting quantum effects is likely to bring a breakthrough.
    Quantum effects that can occur in superconducting materials are particularly worthy of consideration. Superconductors are materials in which the electrical resistance disappears when they are cooled below a certain temperature. The fact that quantum effects in superconductors can be utilised has already been demonstrated in the first quantum computers.
    To find possible successors for today’s semiconductor electronics, some researchers — including a group at Cornell University — are investigating so-called heterojunctions, i.e. structures made of two different types of materials. More specifically, they are looking at layered systems of superconducting and semiconducting materials. “It has been known for some time that you have to select materials with very similar crystal structures for this, so that there is no strain in the crystal lattice at the contact surface,” explains John Wright, who produced the heterojunctions for the new study at Cornell University.
    Two suitable materials in this respect are the superconductor niobium nitride (NbN) and the semiconductor gallium nitride (GaN). The latter already plays an important role in semiconductor electronics and is therefore well researched. Until now, however, it was unclear exactly how the electrons behave at the contact interface of these two materials — and whether it is possible that the electrons from the semiconductor interfere with the superconductivity and thus obliterate the quantum effects.
    “When I came across the research of the group at Cornell, I knew: here at PSI we can find the answer to this fundamental question with our spectroscopic methods at the ADRESS beamline,” explains Vladimir Strocov, researcher at the Synchrotron Light Source SLS at PSI.
    This is how the two groups came to collaborate. In their experiments, they eventually found that the electrons in both materials “keep to themselves.” No unwanted interaction that could potentially spoil the quantum effects takes place.
    Synchrotron light reveals the electronic structures
    The PSI researchers used a method well-established at the ADRESS beamline of the SLS: angle-resolved photoelectron spectroscopy using soft X-rays — or SX-ARPES for short. “With this method, we can visualise the collective motion of the electrons in the material,” explains Tianlun Yu, a postdoctoral researcher in Vladimir Strocov’s team, who carried out the measurements on the NbN/GaN heterostructure. Together with Wright, Yu is the first author of the new publication.
    The SX-ARPES method provides a kind of map whose spatial coordinates show the energy of the electrons in one direction and something like their velocity in the other; more precisely, their momentum. “In this representation, the electronic states show up as bright bands in the map,” Yu explains. The crucial research result: at the material boundary between the niobium nitride NbN and the gallium nitride GaN, the respective “bands” are clearly separated from each other. This tells the researchers that the electrons remain in their original material and do not interact with the electrons in the neighbouring material.
    “The most important conclusion for us is that the superconductivity in the niobium nitride remains undisturbed, even if this is placed atom by atom to match a layer of gallium nitride,” says Vladimir Strocov. “With this, we were able to provide another piece of the puzzle that confirms: This layer system could actually lend itself to a new form of semiconductor electronics that embeds and exploits the quantum effects that happen in superconductors.”
    Story Source:
    Materials provided by Paul Scherrer Institute. Original written by Laura Hennemann. Note: Content may be edited for style and length.

  • Machine learning used to predict synthesis of complex novel materials

    Scientists and institutions dedicate more resources each year to the discovery of novel materials to fuel the world. As natural resources diminish and the demand for higher value and advanced performance products grows, researchers have increasingly looked to nanomaterials.
    Nanoparticles have already found their way into applications ranging from energy storage and conversion to quantum computing and therapeutics. But given the vast compositional and structural tunability nanochemistry enables, serial experimental approaches to identify new materials impose insurmountable limits on discovery.
    Now, researchers at Northwestern University and the Toyota Research Institute (TRI) have successfully applied machine learning to guide the synthesis of new nanomaterials, eliminating barriers associated with materials discovery. The highly trained algorithm combed through a defined dataset to accurately predict new structures that could fuel processes in clean energy, chemical and automotive industries.
    “We asked the model to tell us what mixtures of up to seven elements would make something that hasn’t been made before,” said Chad Mirkin, a Northwestern nanotechnology expert and the paper’s corresponding author. “The machine predicted 19 possibilities, and, after testing each experimentally, we found 18 of the predictions were correct.”
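    The screening loop Mirkin describes can be caricatured in a few lines: a trained model scores candidate element mixtures, and only unseen, high-scoring mixtures are proposed for synthesis. Everything below — the element palette, the stand-in scoring function, the threshold — is illustrative, not the study’s actual model or chemistry:

```python
from itertools import combinations

# Illustrative palette and "already synthesized" set -- not the study's data.
ELEMENTS = ["Au", "Ag", "Cu", "Co", "Ni", "Pd", "Sn"]
KNOWN = {frozenset(["Au", "Ag"]), frozenset(["Cu", "Ni"])}

def model_score(mixture):
    """Stand-in for the trained model's predicted probability of synthesis.

    The real model was trained on experimental nanoparticle data; this toy
    heuristic just favors element-rich mixtures so the loop has something
    to rank.
    """
    return len(mixture) / len(ELEMENTS)

def predict_new_mixtures(threshold=0.5, max_elements=7):
    """Propose high-scoring mixtures of up to `max_elements` elements."""
    candidates = []
    for k in range(2, max_elements + 1):
        for combo in combinations(ELEMENTS, k):
            mix = frozenset(combo)
            if mix in KNOWN:          # only propose mixtures not made before
                continue
            if model_score(mix) >= threshold:
                candidates.append(sorted(mix))
    return candidates

new_mixtures = predict_new_mixtures()
```

    In the study itself, each prediction was then tested experimentally — the step no amount of scoring can replace.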
    The study, “Machine learning-accelerated design and synthesis of polyelemental heterostructures,” will be published December 22 in the journal Science Advances.
    Mirkin is the George B. Rathmann Professor of Chemistry in the Weinberg College of Arts and Sciences; a professor of chemical and biological engineering, biomedical engineering, and materials science and engineering at the McCormick School of Engineering; and a professor of medicine at the Feinberg School of Medicine. He also is the founding director of the International Institute for Nanotechnology.

  • Quantum marbles in a bowl of light

    Which factors determine how fast a quantum computer can perform its calculations? Physicists at the University of Bonn and the Technion — Israel Institute of Technology have devised an elegant experiment to answer this question. The results of the study are published in the journal Science Advances.
    Quantum computers are highly sophisticated machines that rely on the principles of quantum mechanics to process information. This should enable them to handle certain problems in the future that are completely unsolvable for conventional computers. But even for quantum computers, fundamental limits apply to the amount of data they can process in a given time.
    Quantum gates require a minimum time
    The information stored in conventional computers can be thought of as a long sequence of zeros and ones, the bits. In quantum mechanics it is different: The information is stored in quantum bits (qubits), which resemble a wave rather than a series of discrete values. Physicists also speak of wave functions when they want to precisely represent the information contained in qubits.
    In a traditional computer, information is linked together by so-called gates. Combining several gates allows elementary calculations, such as the addition of two bits. Information is processed in a very similar way in quantum computers, where quantum gates change the wave function according to certain rules.
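    The gate picture can be made concrete: a quantum gate is a unitary matrix acting on the amplitude vector (the wave function) of the qubits. A minimal single-qubit sketch, using the standard Hadamard gate as the example:

```python
import numpy as np

# A qubit's wave function: complex amplitudes over the states |0> and |1>.
psi = np.array([1.0, 0.0])             # start in |0>

# The Hadamard gate, a standard single-qubit quantum gate (a unitary matrix).
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

psi = H @ psi                          # the gate transforms the wave function
probs = np.abs(psi) ** 2               # measurement probabilities for |0>, |1>
```

    After the gate, the qubit is in an equal superposition: each outcome is measured with probability 1/2.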
    Quantum gates resemble their traditional relatives in another respect: “Even in the quantum world, gates do not work infinitely fast,” explains Dr. Andrea Alberti of the Institute of Applied Physics at the University of Bonn. “They require a minimum amount of time to transform the wave function and the information it contains.”
    More than 70 years ago, the Soviet physicists Leonid Mandelstam and Igor Tamm theoretically deduced this minimum time for transforming the wave function. Physicists at the University of Bonn and the Technion have now investigated this Mandelstam-Tamm limit for the first time with an experiment on a complex quantum system. To do this, they used cesium atoms that moved in a highly controlled manner. “In the experiment, we let individual atoms roll down like marbles in a light bowl and observe their motion,” explains Alberti, who led the experimental study.
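    The Mandelstam-Tamm bound has a compact form: the time $\tau_\perp$ a quantum state needs to evolve into an orthogonal (fully distinguishable) state is limited from below by the state’s energy uncertainty $\Delta E$:

```latex
\tau_{\perp} \;\ge\; \frac{\pi \hbar}{2\,\Delta E},
\qquad
\Delta E = \sqrt{\langle \hat{H}^{2} \rangle - \langle \hat{H} \rangle^{2}}
```

    The larger the energy spread of the wave function, the faster a quantum gate can in principle act — exactly the trade-off the cesium-atom experiment probes.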

  • Machine learning models quantum devices

    Technologies that take advantage of novel quantum mechanical behaviors are likely to become commonplace in the near future. These may include devices that use quantum information as input and output data, which require careful verification due to inherent uncertainties. Verification is more challenging when the device is time dependent, that is, when the output depends on past inputs. For the first time, researchers have used machine learning to dramatically improve the efficiency of verification for time-dependent quantum devices by incorporating a certain memory effect present in these systems.
    Quantum computers make headlines in the scientific press, but these machines are considered by most experts to still be in their infancy. A quantum internet, however, may be a little closer to the present. This would offer significant security advantages over our current internet, amongst other things. But even this will rely on technologies that have yet to see the light of day outside the lab. While many fundamentals of the devices that could create our quantum internet may have been worked out, many engineering challenges remain before they can be realized as products. But much research is underway to create tools for the design of quantum devices.
    Postdoctoral researcher Quoc Hoan Tran and Associate Professor Kohei Nakajima from the Graduate School of Information Science and Technology at the University of Tokyo have pioneered just such a tool, which they think could make verifying the behavior of quantum devices a more efficient and precise undertaking than it is at present. Their contribution is an algorithm that can reconstruct the workings of a time-dependent quantum device by simply learning the relationship between the quantum inputs and outputs. This approach is commonplace when exploring a classical physical system, but quantum information is generally tricky to store, which usually makes this approach impossible.
    “The technique to describe a quantum system based on its inputs and outputs is called quantum process tomography,” said Tran. “However, many researchers now report that their quantum systems exhibit some kind of memory effect where present states are affected by previous ones. This means that a simple inspection of input and output states cannot describe the time-dependent nature of the system. You could model the system repeatedly after every change in time, but this would be extremely computationally inefficient. Our aim was to embrace this memory effect and use it to our advantage rather than use brute force to overcome it.”
    Tran and Nakajima turned to machine learning and a technique called quantum reservoir computing to build their novel algorithm. This learns patterns of inputs and outputs that change over time in a quantum system and effectively guesses how these patterns will change, even in situations the algorithm has not yet witnessed. As it does not need to know the inner workings of a quantum system as a more empirical method might, but only the inputs and outputs, the team’s algorithm can be simpler and produce results faster as well.
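    Reservoir computing has a well-known classical cousin, the echo state network, which illustrates the same idea: a fixed dynamical system with fading memory turns an input time series into rich internal states, and only a simple linear readout is trained on the input-output pairs. A minimal classical sketch with illustrative parameters — the team’s algorithm operates on quantum states, not this toy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical stand-in for a reservoir: a random recurrent network. In quantum
# reservoir computing, a quantum system's dynamics plays this role instead.
N = 100
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # scale for fading memory
w_in = rng.normal(size=N)

def run_reservoir(inputs):
    """Drive the reservoir with a scalar input sequence; record its states."""
    r = np.zeros(N)
    states = []
    for u in inputs:
        r = np.tanh(W @ r + w_in * u)
        states.append(r.copy())
    return np.array(states)

# A target with a memory effect: the output is the input from 2 steps ago.
u = rng.uniform(-1.0, 1.0, size=600)
y = np.roll(u, 2)
y[:2] = 0.0

X = run_reservoir(u)
train = slice(50, 400)                               # discard the transient
w_out, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)

pred = X[400:] @ w_out                               # inputs never trained on
rmse = float(np.sqrt(np.mean((pred - y[400:]) ** 2)))
```

    Only `w_out` is ever fitted; the reservoir itself stays fixed, which is what keeps the method computationally cheap.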
    “At present, our algorithm can emulate a certain kind of quantum system, but hypothetical devices may vary widely in their processing ability and have different memory effects. So the next stage of research will be to broaden the capabilities of our algorithms, essentially making something more general purpose and thus more useful,” said Tran. “I am excited by what quantum machine learning methods could do, by the hypothetical devices they might lead to.”
    This work is supported by MEXT Quantum Leap Flagship Program (MEXT Q-LEAP) Grant Nos. JPMXS0118067394 and JPMXS0120319794.
    Story Source:
    Materials provided by University of Tokyo.

  • Could EKGs help doctors use AI to detect pulmonary embolisms?

    Pulmonary embolisms are dangerous, lung-clogging blood clots. In a pilot study, scientists at the Icahn School of Medicine at Mount Sinai showed for the first time that artificial intelligence (AI) algorithms can detect signs of these clots in electrocardiograms (EKGs), a finding that may one day help doctors with screening.
    The results published in the European Heart Journal — Digital Health suggested that new machine learning algorithms, which are designed to exploit a combination of EKG and electronic health record (EHR) data, may be more effective than currently used screening tests at determining whether moderate- to high-risk patients actually have pulmonary embolisms.
    The study was led by Sulaiman S. Somani, MD, a former medical student in the lab of Benjamin S. Glicksberg, PhD, Assistant Professor of Genetics and Genomic Sciences and a member of the Hasso Plattner Institute for Digital Health at Mount Sinai.
    Pulmonary embolisms happen when deep vein blood clots, usually formed in the legs or arms, break away and clog lung arteries. These clots can be lethal or cause long-term lung damage. Although some patients may experience shortness of breath or chest pain, these symptoms may also signal other problems that have nothing to do with blood clots, making it difficult for doctors to properly diagnose and treat cases. Moreover, current official diagnoses rely on computed tomography pulmonary angiograms (CTPAs), which are time-consuming chest scans that can only be performed at select hospitals and require patients to be exposed to potentially dangerous levels of radiation.
    To make diagnoses easier and more accessible, researchers have spent more than 20 years developing advanced computer programs, or algorithms, designed to help doctors determine whether at-risk patients are actually experiencing pulmonary embolisms. The results have been mixed. For example, algorithms that used EHRs have produced a wide range of success rates for accurately detecting clots and can be labor-intensive. Meanwhile, the more accurate ones depend heavily on data from the CTPAs.
    In this study the researchers found that fusing algorithms that rely on EKG and EHR data may be an effective alternative, because EKGs are widely available and relatively easy to administer.
    The researchers created and tested out various algorithms on data from 21,183 Mount Sinai Health System patients who showed moderate to highly suspicious signs of having pulmonary embolisms. While some algorithms were designed to use EKG data to screen for pulmonary embolisms, others were designed to use EHR data. In each situation, the algorithm learned to identify a pulmonary embolism case by comparing either EKG or EHR data with corresponding results from CTPAs. Finally, a third, fusion algorithm was created by combining the best-performing EKG algorithm with the best-performing EHR one.
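    One common way to fuse two trained classifiers — and a plausible reading of the setup described here — is “late fusion”: each parent model outputs a probability, and the fusion model combines them. The sketch below is a generic illustration with made-up stand-in models and hand-fixed weights, not the paper’s actual architecture:

```python
import math

def ekg_model(patient):
    """Stand-in for the best-performing EKG-based model; returns P(PE)."""
    return 1.0 / (1.0 + math.exp(-patient["ekg_score"]))

def ehr_model(patient):
    """Stand-in for the best-performing EHR-based model; returns P(PE)."""
    return 1.0 / (1.0 + math.exp(-patient["ehr_score"]))

def fusion_model(patient, w_ekg=0.5, w_ehr=0.5):
    # Late fusion: a weighted average of the parents' probabilities.
    # In practice the weights would be fit on held-out labeled data
    # (CTPA-confirmed cases), not fixed by hand as they are here.
    return w_ekg * ekg_model(patient) + w_ehr * ehr_model(patient)

# Hypothetical patient whose EKG looks suspicious but whose record does not:
p = fusion_model({"ekg_score": 2.0, "ehr_score": -1.0})
```

    The appeal of the design is that each parent model can be trained and validated on its own data stream before the cheap fusion step combines them.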
    The results showed that the fusion model not only outperformed its parent algorithms but was also better at identifying specific pulmonary embolism cases than the Wells’ Criteria, the Revised Geneva Score and three other currently used screening tests. The researchers estimated that the fusion model was anywhere from 15 to 30 percent more effective at accurately screening acute embolism cases, and the model performed best at predicting the most severe cases. Furthermore, the fusion model’s accuracy remained consistent regardless of whether race or sex was tested as a factor, suggesting it may be useful for screening a variety of patients.
    According to the authors, these results support the theory that EKG data may be effectively incorporated into new pulmonary embolism screening algorithms. They plan to further develop and test these algorithms out for potential utility in the clinic.
    This study was supported by the National Institutes of Health (TR001433).

  • A new platform for controlled design of printed electronics with 2D materials

    Scientists have shown how electricity is transported in printed 2D materials, paving the way for design of flexible devices for healthcare and beyond.
    A study, published today in Nature Electronics, led by Imperial College London and Politecnico di Torino researchers reveals the physical mechanisms responsible for the transport of electricity in printed two-dimensional (2D) materials.
    The work identifies what properties of 2D material films need to be tweaked to make electronic devices to order, allowing rational design of a new class of high-performance printed and flexible electronics.
    Silicon chips are the components that power most of our electronics, from fitness trackers to smartphones. However, their rigid nature limits their use in flexible electronics. Made of single-atom-thick layers, 2D materials can be dispersed in solution and formulated into printable inks, producing ultra-thin films that are extremely flexible, semi-transparent and with novel electronic properties.
    This opens up the possibility of new types of devices, such as those that can be integrated into flexible and stretchable materials, like clothes, paper, or even tissues in the human body.
    Previously, researchers have built several flexible electronic devices from printed 2D material inks, but these have been one-off ‘proof-of-concept’ components, built to show how one particular property, such as high electron mobility, light detection, or charge storage can be realised.

  • Computer simulation models potential asteroid collisions

    An asteroid impact can be enough to ruin anyone’s day, but several small factors can make the difference between an out-of-this-world story and total annihilation. In AIP Advances, by AIP Publishing, a researcher from the National Institute of Natural Hazards in China developed a computer simulation of asteroid collisions to better understand these factors.
    The computer simulation initially sought to replicate model asteroid strikes performed in a laboratory. After verifying the accuracy of the simulation, Duoxing Yang believes it could be used to predict the result of future asteroid impacts or to learn more about past impacts by studying their craters.
    “From these models, we learn generally a destructive impact process, and its crater formation,” said Yang. “And from crater morphologies, we could learn impact environment temperatures and its velocity.”
    Yang’s simulation was built using the space-time conservation element and solution element method, designed by NASA and used by many universities and government agencies, to model shock waves and other acoustic problems.
    The goal was to simulate a small rocky asteroid striking a larger metal asteroid at several thousand meters per second. Using his simulation, Yang was able to calculate the effects this would have on the metal asteroid, such as the size and shape of the crater.
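    For a sense of the energies involved, the impact energy scales as ½mv². With purely illustrative numbers — not parameters from Yang’s simulation — even a small rocky projectile at “several thousand meters per second” carries kiloton-range energy:

```python
import math

# Illustrative numbers only -- not the study's parameters.
density = 3000.0     # kg/m^3, typical rocky asteroid material (assumed)
radius = 5.0         # m, a small projectile
speed = 5000.0       # m/s, "several thousand meters per second"

mass = density * (4.0 / 3.0) * math.pi * radius ** 3   # ~1.6e6 kg
kinetic_energy = 0.5 * mass * speed ** 2               # joules
tnt_tons = kinetic_energy / 4.184e9                    # 1 ton TNT = 4.184 GJ
```

    What the full simulation adds is everything this back-of-the-envelope estimate ignores: shock-wave propagation, material response, and the resulting crater shape.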
    The simulation results were compared against mock asteroid impacts created experimentally in a laboratory. The simulation held up against these experimental tests, which means the next step in the research is to use the simulation to generate more data that can’t be produced in the laboratory.
    This data is being created in preparation for NASA’s Psyche mission, which aims to be the first spacecraft to explore an asteroid made entirely of metal. Unlike more familiar rocky asteroids, which are made of roughly the same materials as the Earth’s crust, metal asteroids are made of materials found in the Earth’s inner core. NASA believes studying such an asteroid can reveal more about the conditions found in the center of our own planet.
    Yang believes computer simulation models can generalize his results to all metal asteroid impacts and, in the process, answer several existing questions about asteroid interactions.
    “What kind of geochemistry components will be generated after impacts?” said Yang. “What kinds of impacts result in good or bad consequences for the local climate? Can we change the trajectory of asteroids heading toward us?”
    Story Source:
    Materials provided by American Institute of Physics.

  • Researchers develop new measurements for designing cooler electronics

    When cell phones, electric vehicle chargers, or other electronic devices get too hot, performance degrades, and eventually overheating can cause them to shut down or fail. To prevent that from happening, researchers are working to solve the problem of dissipating the heat produced during operation. Heat generated in the device has to flow out, ideally with little hindrance, to reduce the temperature rise. Often this thermal energy must cross several dissimilar materials along the way, and the interfaces between these materials can impede heat flow.
    A new study from researchers at the Georgia Institute of Technology, Notre Dame, University of California Los Angeles, University of California Irvine, Oak Ridge National Laboratory, and the Naval Research Laboratory observed interfacial phonon modes which only exist at the interface between silicon (Si) and germanium (Ge). This discovery, published in the journal Nature Communications, shows experimentally that decades-old conventional theories for interfacial heat transfer are not complete and the inclusion of these phonon modes are warranted.
    “The discovery of interfacial phonon modes suggests that the conventional models of heat transfer at interfaces, which only use bulk phonon properties, are not accurate,” said Zhe Cheng, a Ph.D. graduate from Georgia Tech’s George W. Woodruff School of Mechanical Engineering who is now a postdoc at the University of Illinois at Urbana-Champaign (UIUC). “There is more room for research at the interfaces. Even though these modes are localized, they can contribute to thermal conductance across interfaces.”
    The discovery opens a new pathway for consideration when engineering thermal conductance at interfaces for electronics cooling and other applications where phonons are majority heat carriers at material interfaces.
    “These results will lead to great progress in real-world engineering applications for thermal management of power electronics,” said co-author Samuel Graham, a professor in the Woodruff School of Mechanical Engineering at Georgia Tech and new dean of engineering at University of Maryland. “Interfacial phonon modes should exist widely at solid interfaces. The understanding and manipulation of these interface modes will give us the opportunity to enhance thermal conductance across technologically-important interfaces, for example, GaN-SiC, GaN-diamond, β-Ga2O3-SiC, and β-Ga2O3-diamond interfaces.”
    Presence of Interfacial Phonon Modes Confirmed in Lab
    The researchers observed the interfacial phonon modes experimentally at a high-quality Si-Ge epitaxial interface by using Raman spectroscopy and high-energy-resolution electron energy-loss spectroscopy (EELS). To figure out the role of interfacial phonon modes in heat transfer at interfaces, they used a technique called time-domain thermoreflectance in labs at Georgia Tech and UIUC to determine the temperature-dependent thermal conductance across these interfaces.
    They also observed a clean additional peak in the Raman spectroscopy measurements when they measured the sample with the Si-Ge interface, which was not observed when they measured a Si wafer and a Ge wafer with the same system. Both the observed interfacial modes and the thermal boundary conductance were fully captured by molecular dynamics (MD) simulations and were confined to the interfacial region as predicted by theory.
    “This research is the result of great team work with all the collaborators,” said Graham. “Without this team and the unique tools that were available to us, this work would not have been possible.”
    Moving forward the researchers plan to continue to pursue the measurement and prediction of interfacial modes, increase the understanding of their contribution to heat transfer, and determine ways to manipulate these phonon modes to increase thermal transport. Breakthroughs in this area could lead to better performance in semiconductors used in satellites, 5G devices, and advanced radar systems, among other devices.
    The epitaxial Si-Ge samples used in this research were grown at the U.S. Naval Research Lab. The TEM and EELS measurements were done at University of California, Irvine and Oak Ridge National Labs. The MD simulations were performed by the University of Notre Dame. The XRD study was done at UCLA.
    This work is financially supported by U.S. Office of Naval Research under a MURI project. The EELS study at UC Irvine is supported by U.S. Department of Energy.
    Story Source:
    Materials provided by Georgia Institute of Technology.