More stories

  • Rectifying AI’s usage in the quest for thermoelectric materials

    Using AI, a team of researchers has identified a thermoelectric material with potentially favorable performance values. The group navigated AI’s conventional pitfalls, providing a prime example of how AI can revolutionize materials science.
    Details of their finding were published in the journal Science China Materials on March 8, 2024.
    “Traditional methods of finding suitable materials involve trial and error, which is time-consuming and often expensive,” says Hao Li, associate professor at Tohoku University’s Advanced Institute for Materials Research (WPI-AIMR) and corresponding author of the paper. “AI transforms this by combing through databases to identify potential materials that can then be experimentally verified.”
    Still, challenges remain. Large-scale materials datasets sometimes contain errors, and overfitting of predicted temperature-dependent properties is another common pitfall. Overfitting occurs when a model learns to capture noise or random fluctuations in the training data rather than the underlying relationship. As a result, the model performs well on the training data but fails to generalize to new, unseen data. When predicting temperature-dependent properties, overfitting can therefore produce inaccurate predictions whenever the model encounters conditions outside the range of the training data.
    Li and his colleagues sought to overcome these issues while developing a thermoelectric material. Such materials convert heat energy into electrical energy, or vice versa, so an accurate description of temperature dependence is critical.
    “First, we performed a series of rational actions to identify and discard questionable data, obtaining 92,291 data points comprising 7,295 compositions and different temperatures from the Starrydata2 database — an online database that collects digital data from published papers,” states Li.
    Following this, the researchers implemented a composition-based cross-validation method. Crucially, they emphasized that data points with the same composition but different temperatures should not be split into different sets, to avoid overfitting.
    The researchers then built machine learning models using the Gradient Boosting Decision Tree method. The model achieved remarkable R² values of ~0.89, ~0.90, and ~0.89 on the training dataset, the test dataset, and new out-of-sample experimental data released in 2023, respectively, demonstrating its accuracy in predicting newly available materials.
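    As a rough illustration of the grouped data splitting and gradient-boosting workflow described above, here is a minimal Python sketch using scikit-learn. The dataset, column names, and descriptor values are invented stand-ins; the authors’ actual features and pipeline are not detailed in this article.

    ```python
    # Minimal sketch: keep all temperatures of one composition in the same
    # split, then fit a gradient-boosted tree model. All data are fabricated.
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import GroupShuffleSplit

    df = pd.DataFrame({
        "composition": ["Bi2Te3", "Bi2Te3", "PbTe", "PbTe", "SnSe", "SnSe"],
        "temperature_K": [300, 400, 300, 500, 400, 600],
        "descriptor": [0.1, 0.1, 0.4, 0.4, 0.7, 0.7],  # composition-derived feature
        "zT": [0.8, 1.0, 0.6, 1.1, 0.9, 1.4],          # target property
    })
    X, y = df[["temperature_K", "descriptor"]], df["zT"]

    # Group-aware split: rows sharing a composition never straddle the
    # train/test boundary, so the test score reflects unseen compositions.
    splitter = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
    train_idx, test_idx = next(splitter.split(X, y, groups=df["composition"]))

    model = GradientBoostingRegressor(random_state=0).fit(X.iloc[train_idx], y.iloc[train_idx])
    print("R^2 on held-out compositions:", model.score(X.iloc[test_idx], y.iloc[test_idx]))
    ```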
    “We could use this model to carry out a large-scale evaluation of the stable materials in the Materials Project database, predicting the potential thermoelectric performance of new materials and providing guidance for experiments,” says Xue Jia, an assistant professor at WPI-AIMR and co-author of the paper.
    Ultimately, the study illustrates the importance of following rigorous guidelines for data preprocessing and data splitting in machine learning when addressing pressing issues in materials science. The researchers are optimistic that their strategy can also be applied to other materials, such as electrocatalysts and batteries.

  • Quantum interference could lead to smaller, faster, and more energy-efficient transistors

    An international team of researchers from Queen Mary University of London, the University of Oxford, Lancaster University, and the University of Waterloo has developed a new single-molecule transistor that uses quantum interference to control the flow of electrons. The transistor, described in a paper published in Nature Nanotechnology, opens new possibilities for using quantum effects in electronic devices.
    Transistors are the basic building blocks of modern electronics. They are used to amplify and switch electrical signals, and they are essential for everything from smartphones to spaceships. However, the traditional method of making transistors, which involves etching silicon into tiny channels, is reaching its limits. As transistors get smaller, they become increasingly inefficient and susceptible to errors, as electrons can leak through the device even when it is supposed to be switched off, by a process known as quantum tunnelling. Researchers are exploring new types of switching mechanisms that can be used with different materials to remove this effect.
    In the nanoscale structures that Professor Jan Mol, Dr James Thomas, and their group study at Queen Mary’s School of Physical and Chemical Sciences, quantum mechanical effects dominate, and electrons behave as waves rather than particles. Taking advantage of these quantum effects, the researchers built a new transistor. The transistor’s conductive channel is a single zinc porphyrin, a molecule that can conduct electricity. The porphyrin is sandwiched between two graphene electrodes, and when a voltage is applied to the electrodes, electron flow through the molecule can be controlled using quantum interference.
    Interference is a phenomenon that occurs when two waves interact with each other and either cancel each other out (destructive interference) or reinforce each other (constructive interference). In the new transistor’s case, researchers switched the transistor on and off by controlling whether the electrons interfere constructively (on) or destructively (off) as they flow through the zinc porphyrin molecule.
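    To make the switching picture concrete, a toy calculation helps: if two pathways through a molecule contribute transmission amplitudes t1 and t2 with a relative phase phi, the total transmission scales as |t1 + t2*exp(i*phi)|². The sketch below uses assumed equal amplitudes, not values from the paper.

    ```python
    # Toy two-path interference model (not the paper's transport theory):
    # transmission ~ |t1 + t2 * exp(i*phi)|^2, so a relative phase of 0
    # (constructive) versus pi (destructive) switches the channel on or off.
    import numpy as np

    t1, t2 = 0.5, 0.5  # assumed equal path amplitudes
    for phi, label in [(0.0, "constructive (on)"), (np.pi, "destructive (off)")]:
        T = abs(t1 + t2 * np.exp(1j * phi)) ** 2
        print(f"{label}: transmission = {T:.3f}")
    # prints 1.000 and 0.000: destructive interference kills the leakage current
    ```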
    The researchers found that the new transistor has a very high on/off ratio, meaning that it can be turned on and off very precisely. Destructive quantum interference plays a crucial role in this by eliminating the leaky electron flow from quantum tunneling when the transistor is supposed to be switched off. They also found that the transistor is very stable: previous single-molecule transistors have demonstrated only a handful of switching cycles, whereas this device can be operated for hundreds of thousands of cycles without breaking down.
    “Quantum interference is a powerful phenomenon that has the potential to be used in a wide variety of electronics applications,” said lead author Dr James Thomas, Lecturer in Quantum Technologies at Queen Mary. “We believe that our work is a significant step towards realizing this potential.”
    “Our results show that quantum interference can be used to control the flow of electrons in transistors, and that this can be done in a way that is both efficient and reliable,” said co-author Professor Jan Mol. “This could lead to the development of new types of transistors that are smaller, faster, and more energy-efficient than current devices.”
    The researchers also found that the quantum interference effects could be used to improve the transistor’s subthreshold swing, a measure of how sensitive the transistor is to changes in the gate voltage. The lower the subthreshold swing, the more efficient the transistor. The researchers’ transistors had a subthreshold swing of 140 mV/dec, which is better than the subthreshold swings reported for other single-molecule transistors and comparable to that of larger devices made from materials such as carbon nanotubes.
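    The subthreshold swing is conventionally extracted as the gate voltage needed to change the drain current by one decade. A minimal sketch of that extraction follows; the transfer-curve numbers are made up to land near the reported 140 mV/dec and are not the authors’ data.

    ```python
    # Rough sketch of extracting a subthreshold swing from transfer-curve
    # data (gate voltage vs. drain current); all values are illustrative.
    import numpy as np

    V_g = np.array([0.00, 0.05, 0.10, 0.15, 0.20])               # gate voltage (V)
    I_d = np.array([1e-12, 2.3e-12, 5.2e-12, 1.2e-11, 2.7e-11])  # drain current (A)

    # Swing = dV_g / d(log10 I_d), conventionally quoted in mV per decade.
    slope = np.polyfit(np.log10(I_d), V_g, 1)[0]                 # volts per decade
    print(f"subthreshold swing ~ {slope * 1e3:.0f} mV/dec")      # ~140 mV/dec here
    ```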
    The research is still in its initial stages, but the researchers are optimistic that the new transistor could be used to create a new generation of electronic devices, with applications ranging from computers and smartphones to medical devices.

  • Novel quantum algorithm for high-quality solutions to combinatorial optimization problems

    Conventional quantum algorithms cannot feasibly solve combinatorial optimization problems (COPs) with constraints within the limited operation time of quantum computers. To address this issue, researchers have developed a novel algorithm called the post-processing variationally scheduled quantum algorithm. Its novelty lies in combining a post-processing technique with variational scheduling to achieve high-quality solutions to COPs in a short time.
    Combinatorial optimization problems (COPs), in which the task is to find the optimal solution to a complex problem, have applications in many fields, such as logistics, supply chain management, machine learning, materials design, and drug discovery. These problems are usually very computationally intensive on classical computers, so solving COPs with quantum computers has attracted significant attention from both academia and industry.
    Quantum computers exploit quantum superposition: their qubits can occupy the states 0 and 1, or any combination of the two, which allows them to tackle large problems quickly. However, when COPs involve constraints, conventional quantum algorithms like adiabatic quantum annealing struggle to obtain a near-optimal solution within the operation time of quantum computers. Recent advances in quantum technology have led to devices such as quantum annealers and gate-type quantum devices that provide suitable platforms for solving COPs. Unfortunately, these devices are susceptible to noise, which limits their applicability to quantum algorithms with low computational costs.
    To address this challenge, Assistant Professor Tatsuhiko Shirai and Professor Nozomu Togawa from the Department of Computer Science and Communications Engineering at Waseda University in Japan have recently developed a groundbreaking post-processing variationally scheduled quantum algorithm (pVSQA). “The two main methods for solving COPs with quantum devices are variational scheduling and post-processing. Our algorithm combines variational scheduling with a post-processing method that transforms infeasible solutions into feasible ones, allowing us to achieve near-optimal solutions for constrained COPs on both quantum annealers and gate-based quantum computers,” explains Dr. Shirai. Their study was published in the journal IEEE Transactions on Quantum Engineering on 13 March 2024.
    The innovative pVSQA algorithm first uses a quantum device to generate a variational quantum state via quantum computation. This state defines a probability distribution over candidate solutions, some feasible and some violating the COP’s constraints. Next, the post-processing method transforms the infeasible solutions into feasible ones, leaving a probability distribution over feasible solutions only. A classical computer then calculates the energy expectation value of the cost function under this new distribution, and iterating this procedure yields a near-optimal solution.
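    The division of labor can be sketched classically. In the toy below, random bitstrings stand in for measurement outcomes of the variational quantum state, a simple repair rule transforms infeasible samples into feasible ones, and a classical loop evaluates the energy expectation. The problem (pick exactly k of n items at minimum cost), the repair rule, and the sampler are illustrative stand-ins, not the constructions in the paper.

    ```python
    # Toy illustration of the post-processing idea: repair infeasible
    # samples into feasible ones, then evaluate the energy expectation.
    import random

    random.seed(0)
    n, k = 8, 3                      # constraint: choose exactly k of n items
    cost = [5, 2, 7, 1, 9, 3, 8, 4]  # arbitrary linear cost to minimize

    def repair(bits):
        """Flip bits greedily until exactly k are set (infeasible -> feasible)."""
        bits = bits[:]
        ones = [i for i, b in enumerate(bits) if b]
        zeros = [i for i, b in enumerate(bits) if not b]
        while len(ones) > k:                       # too many: drop costliest
            i = max(ones, key=lambda j: cost[j]); bits[i] = 0; ones.remove(i)
        while len(ones) < k:                       # too few: add cheapest
            i = min(zeros, key=lambda j: cost[j]); bits[i] = 1
            zeros.remove(i); ones.append(i)
        return bits

    # Stand-in for sampling the variational quantum state:
    samples = [[random.randint(0, 1) for _ in range(n)] for _ in range(200)]
    feasible = [repair(s) for s in samples]        # post-processing step
    energies = [sum(c for c, b in zip(cost, f) if b) for f in feasible]
    print("energy expectation:", sum(energies) / len(energies))
    print("best feasible sample:", min(energies))  # near-optimal after repair
    ```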
    The researchers analyzed the performance of the algorithm using a simulator and real quantum devices, namely a quantum annealer and a gate-type quantum device. The experiments revealed that pVSQA achieves near-optimal performance within a predetermined time on the simulator and outperforms conventional quantum algorithms without post-processing on the real quantum devices.
    Dr. Shirai highlights the potential applications of the algorithm, stating: “Drastic social transformations are urgently needed to address various social issues. Examples include the realization of a carbon-neutral society to solve climate change issues and the realization of sustainable development goals to address issues such as increased energy demand and food shortage. Efficiently solving combinatorial optimization problems is at the heart of achieving these transformations. Our new method will play a significant role in realizing these long-term social transformations.”
    In conclusion, this study marks a significant step forward in the use of quantum computers for solving COPs, holding promise for addressing complex real-world problems across various domains.

    Reference
    DOI: https://doi.org/10.1109/TQE.2024.3376721
    Authors: Tatsuhiko Shirai and Nozomu Togawa
    Affiliation: Department of Computer Science and Communications Engineering, Waseda University
    About Waseda University
    Located in the heart of Tokyo, Waseda University is a leading private research university that has been dedicated to academic excellence, innovative research, and civic engagement at both the local and global levels since 1882. The University has produced many changemakers in its history, including nine prime ministers and many leaders in business, science and technology, literature, sports, and film. Waseda has strong collaborations with overseas research institutions and is committed to advancing cutting-edge research and developing leaders who can contribute to the resolution of complex global social issues. The University has set a target of achieving a zero-carbon campus by 2032, in line with the Sustainable Development Goals (SDGs) adopted by the United Nations in 2015.

    To learn more about Waseda University, visit https://www.waseda.jp/top/en
    About Assistant Professor Tatsuhiko Shirai
    Tatsuhiko Shirai is currently an assistant professor in the Department of Computer Science and Communications Engineering at Waseda University in Japan. He obtained his master’s degree and Ph.D. in physics from the University of Tokyo in 2013 and 2016, respectively. In 2022, he received the SLDM Research Group Excellent Paper Award. His research interests include quantum algorithms, open quantum systems, and quantum computing. He is a member of The Physical Society of Japan.

  • Semiconductors at scale: New processor achieves remarkable speed-up in problem solving

    Annealing processors are designed specifically for combinatorial optimization problems, in which the task is to find the best solution from a finite set of possibilities. This has implications for practical applications in logistics, resource allocation, and the discovery of drugs and materials. When implemented in CMOS (a type of semiconductor technology), the components of an annealing processor must be fully “coupled,” and the complexity of this coupling directly limits the scalability of the processor.
    In a new IEEE Access study published on 30 January 2024, researchers led by Professor Takayuki Kawahara from Tokyo University of Science developed and successfully tested a scalable processor that divides the calculation across multiple LSI chips. The innovation was also presented at the IEEE 22nd World Symposium on Applied Machine Intelligence and Informatics (SAMI 2024) on 25 January 2024.
    According to Prof. Kawahara, “We want to achieve advanced information processing directly at the edge, rather than in the cloud, or to perform preprocessing at the edge for the cloud. Using the unique processing architecture announced by the Tokyo University of Science in 2020, we have realized a fully coupled LSI (Large Scale Integration) on one chip using 28nm CMOS technology. Furthermore, we devised a scalable method with parallel-operating chips and demonstrated its feasibility using FPGAs (Field-Programmable Gate Arrays) in 2022.”
    In the study, which was partially supported by JSPS KAKENHI Grant Number 22H01559, the Tokyo University of Science Entrepreneurship Grant (PoC Support Grant), and the Tokyo Metropolitan Government, the team created a scalable annealing processor from 36 calculation LSI chips fabricated in 22nm CMOS and one control FPGA. This technology enables the construction of large-scale, fully coupled semiconductor systems following the Ising model (a mathematical model of magnetic systems) with 4096 spins.
    The processor incorporates two distinct technologies developed at the Tokyo University of Science: a “spin thread method” that enables eight parallel solution searches, and a technique that roughly halves the number of chips required compared with conventional methods. Its power needs are also modest, operating at 10 MHz with a power consumption of 2.9 W (1.3 W for the core part). This was confirmed in practice using a vertex cover problem with 4096 vertices.
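    For intuition about the underlying computation, the sketch below emulates in software what the processor accelerates in hardware: a Metropolis search for low-energy states of a fully coupled Ising model with energy E(s) = -(1/2) sᵀ J s. The size, couplings, and cooling schedule are arbitrary toy choices, far smaller than the chip’s 4096 spins.

    ```python
    # Toy software emulation of fully coupled Ising annealing; all
    # parameters are illustrative, not the processor's architecture.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 64                                    # tiny stand-in for 4096 spins
    J = rng.normal(size=(n, n)); J = (J + J.T) / 2; np.fill_diagonal(J, 0)
    s = rng.choice([-1, 1], size=n)

    def energy(s):
        return -0.5 * s @ J @ s               # each spin pair counted once

    for T in np.geomspace(5.0, 0.05, 20000):  # simple geometric cooling schedule
        i = rng.integers(n)
        dE = 2 * s[i] * (J[i] @ s)            # energy change from flipping spin i
        if dE < 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]                      # Metropolis acceptance
    print("final energy:", energy(s))
    ```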
    In terms of power-performance ratio, the processor outperformed annealing emulation of a fully coupled Ising system on a PC (Core i7, 3.6 GHz) by a factor of 2,306, and it surpassed the core CPU and arithmetic chip by a factor of 2,186.
    The successful hardware verification of this processor suggests that its capacity can be scaled up further. According to Prof. Kawahara, who envisions the social implementation of this technology (for instance through a start-up business, joint research, and technology transfer), “In the future, we will develop this technology for a joint research effort targeting an LSI system with the computing power of a 2050-level quantum computer for solving combinatorial optimization problems. The goal is to achieve this without the need for air conditioning, large equipment, or cloud infrastructure, using current semiconductor processes. Specifically, we would like to achieve 2M (million) spins by 2030 and explore the creation of new digital industries using this.”
    In summary, the researchers have developed a scalable, fully coupled annealing processor incorporating 4096 spins on a single board with 36 CMOS chips. Key innovations, including the reduction in chip count and parallel operations for simultaneous solution searches, played a crucial role in this development.

  • Downscaling storage devices: Magnetic memory based on the chirality of spiral magnets

    A team of researchers has proposed a new concept for magnet-based memory devices, which might revolutionize information storage devices owing to their potential for large-scale integration, non-volatility, and high durability.
    Details of their findings were published in the journal Nature Communications on March 7, 2024.
    Spintronic devices, represented by magnetic random access memory (MRAM), utilize the magnetization direction of ferromagnetic materials to store information. Because of their non-volatility and low energy consumption, spintronic devices are likely to play a pivotal role in future information storage components.
    However, ferromagnet-based spintronics devices have a potential pitfall. Ferromagnets generate magnetic fields around them, which affect nearby ferromagnets. In an integrated magnetic device, this results in crosstalk between magnetic bits, which will limit the magnetic memory density.
    The research team, which comprised Hidetoshi Masuda, Takeshi Seki, Yoshinori Onose and others from Tohoku University’s Institute for Materials Research, and Jun-ichiro Ohe from Toho University, demonstrated that magnetic materials called helical magnets can be utilized for a magnetic memory device, which should resolve the magnetic field problem.
    In helical magnets, the directions of the atomic magnetic moments are ordered in a spiral. The right- or left-handedness of the spiral, called chirality, can be used to store information. The magnetic fields induced by the individual atomic magnetic moments cancel each other out, so helical magnets generate no macroscopic magnetic field. “Memory devices based on the handedness of helimagnets, free from crosstalk among bits, could pave a new pathway for improving memory density,” says Masuda.
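    The cancellation argument is easy to check numerically: moments that rotate by a fixed angle from site to site sum to essentially zero over whole turns of the spiral. The pitch and chirality below are illustrative values, not parameters of MnAu2.

    ```python
    # Numerical sanity check: a helical spin texture has (nearly) zero net
    # magnetization, because the rotating moments cancel over full turns.
    import numpy as np

    n_sites, turns, chirality = 120, 10, +1          # +1 right-, -1 left-handed
    theta = chirality * 2 * np.pi * turns * np.arange(n_sites) / n_sites
    moments = np.stack([np.cos(theta), np.sin(theta), np.zeros(n_sites)], axis=1)
    print("net moment:", moments.sum(axis=0))        # ~ [0, 0, 0]
    ```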
    The research team demonstrated that the chirality memory can be written and read out at room temperature. They fabricated epitaxial thin films of the room-temperature helimagnet MnAu2 and demonstrated switching of the chirality (the right- or left-handedness of the spiral) by electric current pulses under magnetic fields. Furthermore, they fabricated a bilayer device composed of MnAu2 and Pt (platinum) and demonstrated that the chirality memory can be read out as a resistance change, even without magnetic fields.
    “We have uncovered the potential capability of chirality memory in helical magnets for next-generation memory devices; it may offer high-density, non-volatile, and highly stable memory bits,” adds Masuda. “This will hopefully lead to future storage devices with ultrahigh information density and high reliability.”

  • Physicists develop modeling software to diagnose serious diseases

    Researchers at the Niels Bohr Institute, University of Copenhagen, and the University of Southern Denmark have recently published FreeDTS, a shared software package designed to model and study biological membranes at the mesoscale, the scale “in between” the larger macro level and the smaller micro level.
    The software fills an important gap among the available biomolecular modeling tools, enabling the modeling and understanding of many different biological processes involving cellular membranes, e.g., cell division.
    Membrane shape contains information about the physiological state of the cell and the overall health of an organism, so this new tool, with its wide array of applications, will enhance our understanding of cell behavior and open routes for the diagnostics of infections and diseases like Parkinson’s.
    The publication of FreeDTS is now reported in Nature Communications.
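    For a flavor of the physics that mesoscale membrane tools discretize, the sketch below evaluates the classic Helfrich bending energy, E = 2κ∫H²dA, for two textbook shapes. This is a generic continuum calculation, not FreeDTS code, and the rigidity value is a typical lipid-membrane figure rather than one from the paper.

    ```python
    # Back-of-the-envelope Helfrich bending energies for textbook shapes;
    # kappa ~ 20 kT is a typical lipid bilayer bending rigidity (assumed).
    import numpy as np

    kappa = 20.0                       # bending rigidity in units of kT

    def sphere_energy(R):
        H = 1.0 / R                    # mean curvature of a sphere
        return 2 * kappa * H**2 * 4 * np.pi * R**2   # = 8*pi*kappa, radius-independent

    def cylinder_energy(R, L):
        H = 1.0 / (2 * R)              # mean curvature of a cylinder
        return 2 * kappa * H**2 * 2 * np.pi * R * L  # grows with length

    print("sphere (any R):   ", sphere_energy(1.0), "kT")     # ~503 kT
    print("cylinder R=1 L=10:", cylinder_energy(1.0, 10.0), "kT")
    ```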
    Sharing a powerful tool that could have provided NBI with an advantage. Why?
    The software package that Weria Pezeshkian from the Niels Bohr Institute has been working on for the last five years, following an initial idea he developed with John Ipsen from the University of Southern Denmark, is now shared, laid open for every researcher in the field to use.
    Normally, competition for scientific results is fierce, and advances are kept secret until publication, so this seems like a very generous attitude indeed. So generous it might seem a bit naive.

    “It is a strange mix of respect for the ‘pioneers’ of the biomolecular modeling field and the fact that the field offers so many unanswered questions that it would seem almost disrespectful towards the scientific community to keep the tool to ourselves,” Weria Pezeshkian explains.
    “There are so many questions and bottlenecks to tackle before we reach the end goals that it is unlikely we would work on exactly the same problems. However, occasional overlap does occur, and it is a worthwhile price to pay for advancing the field.
    “But there is another aspect as well: one of the reasons our community, the biomolecular simulation and modeling community, has seen such a surge in popularity and such fast growth is that we have always strived to get more people into the game and to share ideas, results, methods, and often direct assistance, without expecting immediate personal gains.”
    Acknowledging Herman Berendsen
    Herman Berendsen (1934-2019) was a professor of physical chemistry at the University of Groningen (RUG). He was especially known for his contributions to the field of molecular modeling and his dedication to translating models into accessible applications.
    Berendsen was especially praised for his non-hierarchical and open approach, not only locally at his institute, where he was known for enabling the young researchers in his group, but also among the wider scientific community. He contributed to computer simulation methods that are still widely used to study the dynamics of biomolecules. Examples include his SPC (simple point charge) model of liquid water, and the “Berendsen” thermostat and barostat, which serve to keep the temperature and pressure constant during simulations.
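    The Berendsen thermostat mentioned above has a simple textbook form: after each simulation step, velocities are rescaled by λ = sqrt(1 + (Δt/τ)(T₀/T - 1)), nudging the instantaneous temperature T toward the target T₀ with coupling time τ. A generic sketch follows (not code from any of Berendsen’s own packages).

    ```python
    # Generic Berendsen-style velocity rescaling toward a target temperature.
    import numpy as np

    def berendsen_rescale(velocities, masses, T0, dt, tau, kB=1.0):
        """Return velocities rescaled one step toward target temperature T0."""
        n_dof = velocities.size
        kinetic = 0.5 * np.sum(masses[:, None] * velocities**2)
        T = 2.0 * kinetic / (n_dof * kB)          # instantaneous temperature
        lam = np.sqrt(1.0 + (dt / tau) * (T0 / T - 1.0))
        return velocities * lam

    rng = np.random.default_rng(1)
    v = rng.normal(scale=2.0, size=(100, 3))      # hot initial velocities
    m = np.ones(100)
    for _ in range(200):                          # repeated weak coupling cools the system
        v = berendsen_rescale(v, m, T0=1.0, dt=0.01, tau=0.1)
    print("final kinetic temperature ~", np.sum(m[:, None] * v**2) / v.size)
    ```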

    Also, he organized a series of workshops where pioneers in the field met to discuss and share their newest findings.
    Berendsen remains one of RUG’s most cited scholars. The applicability of his work ranges far beyond the field of physical chemistry; it is also used by mathematicians, computer scientists, and molecular life scientists, and in the development of medical applications.
    Biological membranes — what are they really?
    When you consider a cell, you can imagine a whole lot of small “factories” inside, called organelles, doing their thing, each surrounded by a membrane.
    The cell itself is also surrounded by a membrane, called the plasma membrane. But membranes are not just boundary surfaces: they actively participate in many processes. They are made from a myriad of different molecules, and they are dynamic, in motion all the time.
    Many diseases are associated with irregular membrane shapes and abnormal biomolecular organization, so the study of membranes can help us understand the state of a cell and the overall health of an organism. For instance, when a neuron shows increased spiking activity, indicating a higher energy demand, the structure of its mitochondria, the organelles responsible for generating cellular energy from food (often referred to as the powerhouses of the cell), undergoes changes.
    Moreover, certain diseases, Alzheimer’s for one, have been associated with changes in the shapes of mitochondrial membranes.
    Computer models will improve our abilities within diagnostics
    “For now, we are not able to see exactly what causes changes in membrane shape or how those changes relate to the diagnosis of a certain disease. But at some point in the future, the trial-and-error work in the lab will become minimal, because modelling will guide experiments with unimaginable accuracy as our models become more precise and computational power keeps increasing.
    “We will need a lot of adjustments and there is still a long way to go, so it is really nice to work within this sharing community, because we all work on different aspects of it,” Weria Pezeshkian explains.
    Weria continues with a word of caution: “This is probably stretching it a bit far, but possibly, in the future, by imaging our mitochondria, for example, and leveraging physics-based computer simulations, we may be able to say: this person has this disease with this specific genetic deficiency. So the perspective for computational modelling is rather great; we are not there yet, but we can see it on the horizon.”

  • Researchers invent artificial intelligence model to design new superbug-fighting antibiotics

    Researchers at McMaster University and Stanford University have invented a new generative artificial intelligence model which can design billions of new antibiotic molecules that are inexpensive and easy to build in the laboratory.
    The worldwide spread of drug-resistant bacteria has created an urgent need for new antibiotics, but even modern AI methods are limited in their ability to isolate promising chemical compounds, especially when researchers must also find ways to manufacture these new AI-designed drugs and test them in the lab.
    In a new study, published today in the journal Nature Machine Intelligence, researchers report they have developed a new generative AI model called SyntheMol, which can design new antibiotics to stop the spread of Acinetobacter baumannii, which the World Health Organization has identified as one of the world’s most dangerous antibiotic-resistant bacteria.
    Notoriously difficult to eradicate, A. baumannii can cause pneumonia, meningitis and infect wounds, all of which can lead to death. Researchers say few treatment options remain.
    “Antibiotics are a unique medicine. As soon as we begin to employ them in the clinic, we’re starting a timer before the drugs become ineffective, because bacteria evolve quickly to resist them,” says Jonathan Stokes, lead author on the paper and an assistant professor in McMaster’s Department of Biochemistry and Biomedical Sciences, who conducted the work with James Zou, an associate professor of biomedical data science at Stanford University.
    “We need a robust pipeline of antibiotics, and we need to discover them quickly and inexpensively. That’s where artificial intelligence plays a crucial role,” he says.
    The researchers developed the generative model to explore tens of billions of candidate molecules quickly and cheaply.

    They drew from a library of 132,000 molecular fragments, which fit together like Lego pieces but are all very different in nature. They then cross-referenced these molecular fragments with a set of 13 chemical reactions, enabling them to identify 30 billion two-way combinations of fragments to design new molecules with the most promising antibacterial properties.
    Each of the molecules designed by this model was in turn fed through another AI model trained to predict toxicity. The process yielded six molecules which display potent antibacterial activity against A. baumannii and are also non-toxic.
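    In spirit, the generate-then-filter loop looks like the sketch below: combine fragments through reaction templates, score each candidate with learned property models, and keep promising non-toxic hits. The fragments, “reactions,” and random scorers here are toy stand-ins; SyntheMol’s real search uses curated reaction templates and trained activity and toxicity models, and is far more sophisticated.

    ```python
    # Highly simplified generate-then-filter sketch; chemistry is faked.
    import itertools, random

    random.seed(0)
    fragments = ["fragA", "fragB", "fragC", "fragD"]   # stand-in building blocks
    reactions = [lambda a, b: f"{a}-amide-{b}",        # stand-in reaction templates
                 lambda a, b: f"{a}-ether-{b}"]

    def predicted_activity(mol):   # stand-in for a trained antibacterial model
        return random.random()

    def predicted_toxicity(mol):   # stand-in for a trained toxicity model
        return random.random()

    candidates = [rxn(a, b)
                  for a, b in itertools.combinations(fragments, 2)
                  for rxn in reactions]
    hits = [m for m in candidates
            if predicted_activity(m) > 0.6 and predicted_toxicity(m) < 0.5]
    print(len(candidates), "designed;", len(hits), "promising hits:", hits)
    ```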
    “SyntheMol not only designs novel molecules that are promising drug candidates, but it also generates the recipe for how to make each new molecule. Generating such recipes is a new approach and a game changer, because chemists do not know how to make AI-designed molecules,” says Zou, who co-authored the paper.
    The research is funded in part by the Weston Family Foundation, the Canadian Institutes of Health Research, and Marnix and Mary Heersink.

  • N-channel diamond field-effect transistor

    A NIMS research team has developed the world’s first n-channel diamond MOSFET (metal-oxide-semiconductor field-effect transistor). The device marks a key step toward CMOS (complementary metal-oxide-semiconductor, one of the most widely used technologies in computer chips) integrated circuits for harsh-environment applications, as well as toward diamond power electronics.
    Semiconductor diamond has outstanding physical properties, such as an ultrawide bandgap of 5.5 eV, high carrier mobilities, and high thermal conductivity, which make it promising for high-performance, high-reliability applications under extreme environmental conditions, such as high temperatures and high levels of radiation (e.g., in proximity to nuclear reactor cores). With diamond electronics, not only can the thermal management demands of conventional semiconductors be alleviated, but the devices are also more energy efficient and can endure much higher breakdown voltages and harsher environments. Meanwhile, as diamond growth technologies have advanced, power electronics, spintronics, and microelectromechanical system (MEMS) sensors operable under high-temperature and strong-radiation conditions have increased the demand for peripheral circuitry based on diamond CMOS devices for monolithic integration. Fabricating CMOS integrated circuits requires both p-channel and n-channel MOSFETs, just as in conventional silicon electronics. However, n-channel diamond MOSFETs had yet to be developed.
    The NIMS research team developed a technique to grow high-quality monocrystalline n-type diamond semiconductors with atomically smooth, flat terraces by doping diamond with a low concentration of phosphorus. Using this technique, the team succeeded in fabricating an n-channel diamond MOSFET for the first time in the world. The MOSFET is composed mainly of an n-channel diamond semiconductor layer atop another diamond layer doped with a high concentration of phosphorus; the latter layer significantly reduced the source and drain contact resistance. The team confirmed that the fabricated diamond MOSFET indeed functioned as an n-channel transistor. In addition, the team verified the excellent high-temperature performance of the MOSFET, as indicated by its field-effect mobility (an important transistor performance indicator) of approximately 150 cm²/V·s at 300 °C.
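    Field-effect mobility is typically extracted from the linear-regime transconductance via the textbook relation μ_FE = L·g_m/(W·C_ox·V_DS). The sketch below applies that formula to invented numbers; it is not the NIMS team’s measurement procedure or data.

    ```python
    # Textbook field-effect mobility extraction; all values are assumed
    # placeholders, not measurements from the diamond MOSFET paper.
    L_ch, W_ch = 10e-6, 100e-6        # channel length / width (m), assumed
    C_ox = 3.45e-4                    # gate capacitance per area (F/m^2), assumed
    V_ds = 0.1                        # drain-source voltage (V), linear regime

    V_g = [0.0, 1.0, 2.0]             # gate voltages (V)
    I_d = [1.0e-6, 1.6e-6, 2.2e-6]    # drain currents (A)

    g_m = (I_d[-1] - I_d[0]) / (V_g[-1] - V_g[0])    # transconductance (S)
    mu_fe = L_ch * g_m / (W_ch * C_ox * V_ds)        # mobility (m^2/(V s))
    print(f"mu_FE ~ {mu_fe * 1e4:.0f} cm^2/(V s)")
    ```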
    These achievements are expected to facilitate the development of CMOS integrated circuits for energy-efficient power electronics, spintronic devices, and MEMS sensors operating in harsh environments.