More stories

  • GPT-4 matches and sometimes outperforms expert methods for identifying cell types in single cells

    GPT-4 can accurately annotate cell types in single-cell RNA sequencing data, a step fundamental to single-cell analysis. Its annotations show high consistency with the time-consuming manual annotations produced by human experts from gene information, according to a study at Columbia University Mailman School of Public Health. The findings are published in the journal Nature Methods.
    GPT-4 is a large language model designed for language understanding and generation. Assessed across numerous tissue and cell types, GPT-4 produced cell type annotations that closely align with the manual annotations of human experts and surpass existing automatic algorithms. This capability could significantly reduce the effort and expertise needed for annotating cell types, a process that can take months. The researchers have also developed GPTCelltype, an R software package, to facilitate automated cell type annotation with GPT-4.
    “The process of annotating cell types for single cells is often time-consuming, requiring human experts to compare genes across cell clusters,” said Wenpin Hou, PhD, assistant professor of Biostatistics at Columbia Mailman School. “Although automated cell type annotation methods have been developed, manual methods to interpret scientific data remain widely used, and such a process can take weeks to months. We hypothesized that GPT-4 can accurately annotate cell types, transitioning the process from manual to a semi- or even fully automated procedure and be cost-efficient and seamless.”
    The researchers assessed GPT-4’s performance across ten datasets covering five species, hundreds of tissue and cell types, and both normal and cancer samples. GPT-4 was queried using GPTCelltype, the software tool developed by the researchers. For comparison, they also evaluated other GPT versions and used manual annotation as a reference.
    As a first step, the researchers explored the various factors that may affect GPT-4’s annotation accuracy. They found that GPT-4 performs best when given the top 10 differentially expressed genes, and that it shows similar accuracy across prompt strategies, including a basic prompt, a chain-of-thought-inspired prompt that includes reasoning steps, and a repeated prompt. GPT-4 matched manual analyses in over 75 percent of cell types in most studies and tissues, demonstrating its competency in generating expert-comparable cell type annotations. Low agreement between GPT-4 and manual annotations in some cell types does not necessarily mean that GPT-4’s annotation is incorrect: in one example involving stromal (connective tissue) cells, GPT-4 provided the more accurate annotation. GPT-4 was also notably faster.
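    The authors’ GPTCelltype tool is an R package; purely as an illustration of the prompting idea, here is a minimal Python sketch that assembles a basic prompt from each cluster’s top 10 marker genes. The query_llm stand-in and the example marker lists are hypothetical, not the authors’ code.
```python
# Illustrative sketch (not the authors' GPTCelltype R package): building a
# "basic" cell type annotation prompt from each cluster's top marker genes.
# query_llm() is a hypothetical stand-in for any GPT-4 API client.

def build_prompt(tissue: str, cluster_markers: dict[str, list[str]]) -> str:
    """Assemble the basic prompt: one line of top-10 marker genes per cluster."""
    lines = [f"Identify the cell type of {tissue} cells using the following markers."]
    for cluster, genes in cluster_markers.items():
        lines.append(f"Cluster {cluster}: {', '.join(genes[:10])}")  # top 10 genes
    lines.append("Give only the cell type name for each cluster.")
    return "\n".join(lines)

# Example (hypothetical) marker lists: canonical T cell and B cell markers.
markers = {
    "0": ["CD3D", "CD3E", "IL7R", "TRAC", "CD2", "LTB", "CD27", "CCR7", "LEF1", "TCF7"],
    "1": ["CD79A", "MS4A1", "CD74", "HLA-DRA", "CD19", "IGHM", "BANK1", "CD22", "TCL1A", "VPREB3"],
}
prompt = build_prompt("human PBMC", markers)
print(prompt)
# response = query_llm(prompt)  # hypothetical GPT-4 call, one label per cluster
```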
    Hou and her colleague also assessed GPT-4’s robustness in complex real-data scenarios and found that GPT-4 can distinguish between pure and mixed cell types with 93 percent accuracy, and can differentiate between known and unknown cell types with 99 percent accuracy. They also evaluated the reproducibility of GPT-4’s annotations using prior simulation studies: GPT-4 generated identical annotations for the same marker genes in 85 percent of cases. “All of these results demonstrate GPT-4’s robustness in various scenarios,” observed Hou.
    While GPT-4 surpasses existing methods, there are limitations to consider, according to Hou, including the difficulty of verifying GPT-4’s quality and reliability, because it discloses little about its training process.
    “Since our study focuses on the standard version of GPT-4, fine-tuning GPT-4 could further improve cell type annotation performance,” said Hou.
    Zhicheng Ji of Duke University School of Medicine is a co-author.
    The study was supported by the National Institutes of Health, grants U54AG075936 and R35GM150887.

  • Pairing crypto mining with green hydrogen offers clean energy boost

    Pairing cryptocurrency mining — notable for its outsize consumption of carbon-based fuel — with green hydrogen could provide the foundation for wider deployment of renewable energy, such as solar and wind power, according to a new Cornell University study.
    “Since current cryptocurrency operations now contribute heavily to worldwide carbon emissions, it becomes vital to explore opportunities for harnessing the widespread enthusiasm for cryptocurrency as we move toward a sustainable and climate-friendly future,” said Fengqi You, professor of energy systems engineering at Cornell.
    You and doctoral student Apoorv Lal are authors of “Climate Sustainability Through a Dynamic Duo: Green Hydrogen and Crypto Driving Energy Transition and Decarbonization,” which was published March 25 in the Proceedings of the National Academy of Sciences.
    Their research shows how linking the use of energy-intensive cryptocurrency mining with green hydrogen technology — the “dynamic duo,” they call it — can boost renewable energy sectors.
    “Building a green hydrogen infrastructure to help produce cryptocurrency can accelerate renewable energy and create a more sustainable energy landscape,” Lal said.
    Using clean energy sources to power blockchain mining operations and fuel the production of green hydrogen can lead to growing wind and solar capacity — and expand sustainable energy production across the country, the researchers said.
    In its current structure, mining blockchain-based cryptocurrency in the U.S. can use as much carbon-based energy as the entire country of Argentina, according to a 2022 White House Office of Science and Technology Policy report. Nearly all domestic crypto-mining electricity goes to power-hungry consensus mechanisms known as “proof of work,” which are used to verify crypto-assets.

    Preliminary estimates by the U.S. Energy Information Administration suggest that 2023 annual electricity consumption for cryptocurrency mining likely represents from 0.6% to 2.3% of all U.S. electricity consumption.
    “Acknowledging the substantial energy demands of cryptocurrency mining, our research proposes an innovative technology solution,” You said. “By leveraging cryptocurrencies as virtual energy carriers in tandem with using green hydrogen, we can transform what was once an environmental challenge into a dynamic force for climate mitigation and sustainability.”
    In their research, You and Lal examined individual U.S. states to assess potential energy strengths in each region.
    Supporting cryptocurrency mining can hasten the build-out of additional energy infrastructure, potentially creating 78.4 megawatt-hours of solar power for each Bitcoin mined in New Mexico, for example, and 265.8 megawatt-hours of wind power for each Bitcoin mined in Wyoming, according to the paper.
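    For a sense of scale, here is a back-of-the-envelope sketch using those per-Bitcoin coefficients; the mined-coin count is a made-up input for illustration, not a figure from the study.
```python
# Back-of-the-envelope sketch using the paper's state-level coefficients:
# renewable generation credited per Bitcoin mined (MWh/BTC). The mined-coin
# count below is a hypothetical input, not data from the study.

MWH_PER_BTC = {"New Mexico (solar)": 78.4, "Wyoming (wind)": 265.8}

btc_mined = 100  # hypothetical number of Bitcoins mined in each state
for state, mwh in MWH_PER_BTC.items():
    print(f"{state}: {btc_mined * mwh:,.1f} MWh of added renewable generation")
# New Mexico (solar): 7,840.0 MWh; Wyoming (wind): 26,580.0 MWh
```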
    “While cryptocurrency currently has a high dollar value (Bitcoin traded for more than $73,000 on March 13), you cannot hold it in your hand,” You said. “It’s virtual. Think of cryptocurrency and energy in the same way — much like a gift-card concept. Cryptocurrency also can hold an energy value, and that becomes an additional function.”
    To advance a sustainable future for blockchain-based cryptocurrency, the researchers said, stronger federal policies supporting climate goals and renewable energy are needed.
    “Coupled with green hydrogen, this approach to cryptocurrency not only mitigates its own environmental impact, but pioneers a sustainable path for renewable energy transition,” You said. “It’s a novel strategy.”
    You is a senior faculty fellow at the Cornell Atkinson Center for Sustainability. Funding for this work was provided by the National Science Foundation.

  • Pushing back the limits of optical imaging by processing trillions of frames per second

    Pushing for a higher speed isn’t just for athletes. Researchers, too, can achieve such feats with their discoveries. This is the case for Jinyang Liang, Professor at the Institut national de la recherche scientifique (INRS), and his team, whose research results have recently been published in Nature Communications.
    The group, based at INRS’ Énergie Matériaux Télécommunications Research Centre, has developed a new ultrafast camera system that can capture up to 156.3 trillion frames per second with astonishing precision. For the first time, 2D optical imaging of ultrafast demagnetization can be achieved in a single shot. The new device, called SCARF (for swept-coded aperture real-time femtophotography), captured transient absorption in a semiconductor and ultrafast demagnetization of a metal alloy. This new method will help push forward the frontiers of knowledge in a wide range of fields, including modern physics, biology, chemistry, materials science, and engineering.
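    As a quick sanity check on what that frame rate means, the interval between consecutive frames works out to a few femtoseconds:
```python
# At 156.3 trillion frames per second, the interval between consecutive
# frames is on the femtosecond scale.
frame_rate = 156.3e12                   # frames per second
dt = 1.0 / frame_rate                   # seconds between consecutive frames
print(f"{dt * 1e15:.2f} fs per frame")  # prints ~6.40 fs
```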
    Improving on past advances
    Professor Liang is known around the world as a pioneer of ultrafast imaging. Already, in 2018, he was the principal developer of a major breakthrough in the field, which laid the groundwork for the development of SCARF.
    Until now, ultrafast camera systems have mainly taken the approach of capturing frames sequentially, one by one. They would acquire data through brief, repeated measurements, then stitch everything together into a movie reconstructing the observed motion.
    “However, this approach can only be applied to inert samples or to phenomena that happen the exact same way each time. Fragile samples, not to mention non-repeatable phenomena or phenomena with ultrafast speeds, cannot be observed with this method,” said Professor Jinyang Liang, an expert in ultrafast and biophotonic imaging. “For example, phenomena such as femtosecond laser ablation, shock-wave interaction with living cells, and optical chaos cannot be studied this way.”
    The first tool developed by Professor Liang helped fill this gap. The T-CUP (trillion-frame-per-second compressed ultrafast photography) system was based on passive femtosecond imaging capable of acquiring ten trillion (10¹³) frames per second. This was a major first step towards ultrafast, single-shot real-time imaging.

    Yet challenges still remained.
    “Many systems based on compressed ultrafast photography have to cope with degraded data quality and must trade off sequence depth against field of view. These limitations are attributable to the operating principle, which requires simultaneously shearing the scene and the coded aperture,” said Miguel Marquez, postdoctoral fellow and co-first author of the study. “SCARF overcomes these challenges. Its imaging modality enables ultrafast sweeping of a static coded aperture while not shearing the ultrafast phenomenon. This provides full-sequence encoding rates of up to 156.3 THz to individual pixels on a camera with a charge-coupled device (CCD). These results can be obtained in a single shot at tunable frame rates and spatial scales in both reflection and transmission modes.”
    A range of applications
    SCARF makes it possible to observe unique phenomena that are ultrafast, non-repeatable, or difficult to reproduce, such as shock wave mechanics in living cells or matter. These advances could potentially be used to develop better pharmaceutics and medical treatments.
    What’s more, SCARF promises very appealing economic spin-offs. Two companies, Axis Photonique and Few-Cycle, are already working with Professor Liang’s team to produce a marketable version of their patent-pending discovery. This represents a great opportunity for Quebec to strengthen its already enviable position as a leader in photonics.
    The work was carried out in the Advanced Laser Light Source (ALLS) Laboratory in collaboration with Professor François Légaré, Director of the Énergie Matériaux Télécommunications Research Centre, and international colleagues Michel Hehn, Stéphane Mangin and Grégory Malinowski of the Institut Jean Lamour at the Université de Lorraine (France) and Zhengyan Li of Huazhong University of Science and Technology (China).
    This research was funded by the Natural Sciences and Engineering Research Council of Canada, the Canada Research Chairs Program, the Canada Foundation for Innovation, the Ministère de l’Économie et de l’Innovation du Québec, the Canadian Cancer Society, the Government of Canada’s New Frontiers in Research Fund, the Fonds de recherche du Québec – Nature et Technologies, and the Fonds de recherche du Québec – Santé.

  • Scientists deliver quantum algorithm to develop new materials and chemistry

    U.S. Naval Research Laboratory (NRL) scientists have published the Cascaded Variational Quantum Eigensolver (CVQE) algorithm in a recent Physical Review Research article; it is expected to become a powerful tool for investigating the physical properties of electronic systems.
    The CVQE algorithm is a variant of the Variational Quantum Eigensolver (VQE) algorithm that executes a set of quantum circuits only once, rather than at every iteration of the parameter optimization process, thereby increasing computational throughput.
    “Both algorithms produce a quantum state close to the ground state of a system, which is used to determine many of the system’s physical properties,” said John Stenger, Ph.D., a Theoretical Chemistry Section research physicist. “Calculations that previously took months can now be performed in hours.”
    The CVQE algorithm uses a quantum computer to probe the needed probability mass functions and a classical computer to perform the remaining calculations, including the energy minimization.
    “Finding the minimum energy is computationally hard as the size of the state space grows exponentially with the system size,” said Steve Hellberg, Ph.D., a Theory of Advanced Functional Materials Section research physicist. “Except for very small systems, even the world’s most powerful supercomputers are unable to find the exact ground state.”
    To address this challenge, scientists use a quantum computer with a qubit register whose state space also grows exponentially, in this case with the number of qubits. By representing the states of a physical system on the state space of the register, a quantum computer can simulate states in the exponentially large representation space of the system.
    Data can subsequently be extracted by quantum measurements. As quantum measurements are not deterministic, the quantum circuit executions must be repeated multiple times to estimate probability distributions describing the states, a process known as sampling. Variational quantum algorithms, including the CVQE algorithm, identify trial states by a set of parameters that are optimized to minimize the energy.
    “The key difference between the original VQE method and the new CVQE method is that the sampling and optimization processes have been decoupled in the latter such that the sampling can be performed exclusively on the quantum computer and the parameters processed exclusively on a classical computer,” said Dan Gunlycke, D.Phil., Theoretical Chemistry Section Head, who also leads the NRL quantum computing effort. “The new approach also has other benefits. The form of the solution space does not have to comport with the symmetry requirements of the qubit register, and therefore, it is much easier to shape the solution space and implement symmetries of the system and other physically motivated constraints, which will ultimately lead to more accurate predictions of electronic system properties.”
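    A schematic sketch of that decoupling, under stated assumptions: the bitstrings below are random stand-ins for quantum measurement samples collected once up front, and the toy energy function is an illustrative placeholder, not NRL’s cost model. The point is only that, once the samples are fixed, the entire optimization loop runs classically.
```python
# Schematic sketch of the decoupling idea behind CVQE (not NRL's code):
# quantum sampling happens once, up front; the variational parameters are
# then optimized purely classically against the fixed samples.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
samples = rng.integers(0, 2, size=(4096, 6))  # stand-in for measured bitstrings

def energy(theta: np.ndarray) -> float:
    # Toy parameterized energy estimate over the *fixed* sample set; in
    # plain VQE, re-evaluation would require fresh quantum circuit runs.
    weights = np.cos(theta @ samples.T) ** 2   # pseudo probability mass
    weights /= weights.sum()
    cost = samples.sum(axis=1) - 3.0           # arbitrary diagonal toy cost
    return float(weights @ cost)

result = minimize(energy, x0=np.zeros(6), method="COBYLA")
print("estimated ground-state energy:", result.fun)
```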
    Quantum computing is a component of quantum science, which has been designated as a Critical Technology Area within the USD(R&E) Technology Vision for an Era of Competition by the Under Secretary of Defense for Research and Engineering Heidi Shyu.
    “Understanding the properties of quantum-mechanical systems is essential in the development of new materials and chemistry for the Navy and Marine Corps,” Gunlycke said. “Corrosion, for instance, is an omnipresent challenge costing the Department of Defense billions every year. The CVQE algorithm can be used to study the chemical reactions causing corrosion and provide critical information to our existing anticorrosion teams in their quest to develop better coatings and additives.”

  • The world is one step closer to secure quantum communication on a global scale

    Researchers at the University of Waterloo’s Institute for Quantum Computing (IQC) have brought together two Nobel prize-winning research concepts to advance the field of quantum communication.
    Scientists can now efficiently produce nearly perfect entangled photon pairs from quantum dot sources.
    Entangled photons are particles of light that remain connected, even across large distances, and the 2022 Nobel Prize in Physics recognized experiments on this topic. Combining entanglement with quantum dots, a technology recognized with the Nobel Prize in Chemistry in 2023, the IQC research team aimed to optimize the process for creating entangled photons, which have a wide variety of applications, including secure communications.
    “The combination of a high degree of entanglement and high efficiency is needed for exciting applications such as quantum key distribution or quantum repeaters, which are envisioned to extend the distance of secure quantum communication to a global scale or link remote quantum computers,” said Dr. Michael Reimer, professor at IQC and Waterloo’s Department of Electrical and Computer Engineering. “Previous experiments only measured either near-perfect entanglement or high efficiency, but we’re the first to achieve both requirements with a quantum dot.”
    By embedding semiconductor quantum dots in a nanowire, the researchers built a source that produces near-perfect entangled photons 65 times more efficiently than previous work. The new source, developed in collaboration with the National Research Council of Canada in Ottawa, can be excited with lasers to generate entangled pairs on command. The researchers then used high-resolution single-photon detectors provided by Single Quantum in the Netherlands to boost the measured degree of entanglement.
    “Historically, quantum dot systems were plagued with a problem called fine structure splitting, which causes an entangled state to oscillate over time. This meant that measurements taken with a slow detection system would prevent the entanglement from being measured,” said Matteo Pennacchietti, a PhD student at IQC and Waterloo’s Department of Electrical and Computer Engineering. “We overcame this by combining our quantum dots with a very fast and precise detection system. We can basically take a timestamp of what the entangled state looks like at each point during the oscillations, and that’s where we have the perfect entanglement.”
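    For context, here is the standard textbook picture of fine structure splitting (our notation, not necessarily the paper’s): a splitting δ makes the emitted two-photon state precess between Bell states, which is why a slow detector washes the entanglement out and a fast timestamp recovers it.
```latex
% Two-photon state emitted by a quantum dot with fine-structure splitting
% \delta: the relative phase between |HH> and |VV> precesses in time.
\[
  |\psi(t)\rangle = \frac{1}{\sqrt{2}}\left( |HH\rangle + e^{\,i\delta t/\hbar}\,|VV\rangle \right)
\]
% A detector slow compared with h/\delta averages over this phase and sees
% reduced entanglement; timestamping each detection fixes t (and hence the
% phase), so every time bin corresponds to a maximally entangled state.
```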
    To showcase future communications applications, Reimer and Pennacchietti worked with Dr. Norbert Lütkenhaus and Dr. Thomas Jennewein, both IQC faculty members and professors in Waterloo’s Department of Physics and Astronomy, and their teams. Using the new quantum dot entanglement source, the researchers simulated a secure communications method known as quantum key distribution, demonstrating that the quantum dot source holds significant promise for the future of secure quantum communications.

  • Rectifying AI’s usage in the quest for thermoelectric materials

    Using AI, a team of researchers has identified a thermoelectric material with potentially favorable performance. The group was able to navigate AI’s conventional pitfalls, giving a prime example of how AI can revolutionize materials science.
    Details of their finding were published in the journal Science China Materials on March 8, 2024.
    “Traditional methods of finding suitable materials involve trial and error, which is time-consuming and often expensive,” says Hao Li, associate professor at Tohoku University’s Advanced Institute for Materials Research (WPI-AIMR) and corresponding author of the paper. “AI transforms this by combing through databases to identify potential materials that can then be experimentally verified.”
    Still, challenges remain. Large-scale materials datasets sometimes contain errors, and overfitting the predicted temperature-dependent properties is another common pitfall. Overfitting occurs when a model learns to capture noise or random fluctuations in the training data rather than the underlying pattern or relationship. As a result, the model performs well on the training data but fails to generalize to new, unseen data. When predicting temperature-dependent properties, overfitting can lead to inaccurate predictions whenever the model encounters conditions outside the range of the training data.
    Li and his colleagues sought to overcome these challenges in order to develop a thermoelectric material. These materials convert heat energy into electrical energy, or vice versa, so obtaining a highly accurate temperature dependence is critical.
    “First, we performed a series of rational actions to identify and discard questionable data, obtaining 92,291 data points comprising 7,295 compositions and different temperatures from the Starrydata2 database — an online database that collects digital data from published papers,” states Li.
    Following this, the researchers implemented a composition-based cross-validation method. Crucially, they emphasized that data points with the same compositions but different temperatures should not be split into different sets to avoid overfitting.
    The researchers then built machine learning models using the Gradient Boosting Decision Tree method. The model achieved remarkable R² values of 0.89, ~0.90, and ~0.89 on the training dataset, the test dataset, and new out-of-sample experimental data released in 2023, demonstrating the model’s accuracy in predicting newly available materials.
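    Here is a minimal scikit-learn sketch of such a composition-grouped split, using synthetic placeholder data rather than the Starrydata2 schema: GroupKFold keeps every temperature point of a given composition on the same side of the train/test split, which is the guard against the overfitting described above.
```python
# Minimal sketch of a composition-grouped split with a gradient boosting
# model. All data below is synthetic placeholder noise, so the printed
# scores are meaningless; the point is how the split is wired up.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(42)
n = 500
X = rng.normal(size=(n, 8))            # placeholder descriptors + temperature
y = rng.normal(size=n)                 # placeholder target property
groups = rng.integers(0, 60, size=n)   # composition ID: same ID = same compound

# GroupKFold never splits one composition's temperature points across folds.
for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups):
    model = GradientBoostingRegressor().fit(X[train_idx], y[train_idx])
    print(f"fold R^2 = {model.score(X[test_idx], y[test_idx]):.3f}")
```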
    “We could use this model to carry out a large-scale evaluation of the stable materials from the Materials Project database, predicting the potential thermoelectric performance of new materials and providing guidance for experiments,” states Xue Jia, Assistant Professor at WPI-AIMR, and co-author of the paper.
    Ultimately, the study illustrates the importance of following rigorous guidelines for data preprocessing and data splitting in machine learning so that it can address pressing issues in materials science. The researchers are optimistic that their strategy can also be applied to other materials, such as electrocatalysts and batteries.

  • Quantum interference could lead to smaller, faster, and more energy-efficient transistors

    An international team of researchers from Queen Mary University of London, the University of Oxford, Lancaster University, and the University of Waterloo has developed a new single-molecule transistor that uses quantum interference to control the flow of electrons. The transistor, described in a paper published in Nature Nanotechnology, opens new possibilities for using quantum effects in electronic devices.
    Transistors are the basic building blocks of modern electronics. They are used to amplify and switch electrical signals, and they are essential for everything from smartphones to spaceships. However, the traditional method of making transistors, which involves etching silicon into tiny channels, is reaching its limits. As transistors get smaller, they become increasingly inefficient and susceptible to errors, as electrons can leak through the device even when it is supposed to be switched off, by a process known as quantum tunnelling. Researchers are exploring new types of switching mechanisms that can be used with different materials to remove this effect.
    In the nanoscale structures that Professor Jan Mol, Dr James Thomas, and their group study at Queen Mary’s School of Physical and Chemical Sciences, quantum mechanical effects dominate, and electrons behave as waves rather than particles. Taking advantage of these quantum effects, the researchers built a new transistor. The transistor’s conductive channel is a single zinc porphyrin, a molecule that can conduct electricity. The porphyrin is sandwiched between two graphene electrodes, and when a voltage is applied to the electrodes, electron flow through the molecule can be controlled using quantum interference.
    Interference is a phenomenon that occurs when two waves interact with each other and either cancel each other out (destructive interference) or reinforce each other (constructive interference). In the new transistor’s case, researchers switched the transistor on and off by controlling whether the electrons interfere constructively (on) or destructively (off) as they flow through the zinc porphyrin molecule.
    The researchers found that the new transistor has a very high on/off ratio, meaning that it can be turned on and off very precisely. Destructive quantum interference plays a crucial role in this by eliminating the leaky electron flow from quantum tunnelling through the transistor when it is supposed to be switched off. They also found that the transistor is very stable: previous single-molecule transistors have only been able to demonstrate a handful of switching cycles, whereas this device can be operated for hundreds of thousands of cycles without breaking down.
    “Quantum interference is a powerful phenomenon that has the potential to be used in a wide variety of electronics applications,” said lead author Dr James Thomas, Lecturer in Quantum Technologies at Queen Mary. “We believe that our work is a significant step towards realizing this potential.”
    “Our results show that quantum interference can be used to control the flow of electrons in transistors, and that this can be done in a way that is both efficient and reliable,” said co-author Professor Jan Mol. “This could lead to the development of new types of transistors that are smaller, faster, and more energy-efficient than current devices.”
    The researchers also found that the quantum interference effects could be used to improve the transistor’s subthreshold swing, which is a measure of how sensitive the transistor is to changes in the gate voltage. The lower the subthreshold swing, the more efficient the transistor is. The researchers’ transistors had a subthreshold swing of 140 mV/dec, which is better than subthreshold swings reported for other single-molecule transistors, and comparable to larger devices made from materials such as carbon nanotubes.
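    For reference, the standard definition of subthreshold swing (a general formula, not taken from the paper): it is the gate-voltage change required for a tenfold change in drain current,
```latex
\[
  \mathrm{SS} = \left( \frac{\partial \log_{10} I_D}{\partial V_G} \right)^{-1}
\]
% The room-temperature thermionic limit is (ln 10) k_B T / q \approx 60 mV/dec,
% so the 140 mV/dec reported here sits within roughly a factor of 2.3 of it.
```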
    The research is still in its early stages, but the researchers are optimistic that the new transistor could be used to create a new generation of electronic devices, with applications ranging from computers and smartphones to medical devices.

  • Novel quantum algorithm for high-quality solutions to combinatorial optimization problems

    Conventional quantum algorithms cannot feasibly solve combinatorial optimization problems (COPs) with constraints within the operation time of quantum computers. To address this issue, researchers have developed a novel algorithm called the post-processing variationally scheduled quantum algorithm (pVSQA). Its novelty lies in combining a post-processing technique with variational scheduling to achieve high-quality solutions to COPs in a short time.
    Combinatorial optimization problems (COPs) have applications in many different fields such as logistics, supply chain management, machine learning, material design and drug discovery, among others, for finding the optimal solution to complex problems. These problems are usually very computationally intensive using classical computers and thus solving COPs using quantum computers has attracted significant attention from both academia and industry.
    Quantum computers take advantage of the quantum property of superposition, using qubits that can exist not only in the states 0 or 1 but in any combination of the two, to quickly solve large problems. However, when COPs involve constraints, conventional quantum algorithms like adiabatic quantum annealing struggle to obtain a near-optimal solution within the operation time of quantum computers. Recent advances in quantum technology have led to devices such as quantum annealers and gate-type quantum devices that provide suitable platforms for solving COPs. Unfortunately, they are susceptible to noise, which limits their applicability to quantum algorithms with low computational costs.
    To address this challenge, Assistant Professor Tatsuhiko Shirai and Professor Nozomu Togawa from the Department of Computer Science and Communications Engineering at Waseda University in Japan have recently developed a groundbreaking post-processing variationally scheduled quantum algorithm (pVSQA). “The two main methods for solving COPs with quantum devices are variational scheduling and post-processing. Our algorithm combines variational scheduling with a post-processing method that transforms infeasible solutions into feasible ones, allowing us to achieve near-optimal solutions for constrained COPs on both quantum annealers and gate-based quantum computers,” explains Dr. Shirai. Their study was published in the journal IEEE Transactions on Quantum Engineering on 13 March 2024.
    The innovative pVSQA algorithm uses a quantum device to first generate a variational quantum state via quantum computation. This is then used to generate a probability distribution function which consists of all the feasible and infeasible solutions that are within the constraints of the COP. Next, the post-processing method transforms the infeasible solutions into feasible ones, leaving the probability distribution with only feasible solutions. A classical computer is then used to calculate an energy expectation value of the cost function using this new probability distribution. Repeating this calculation results in a near-optimal solution.
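    A schematic sketch of that sample-repair-evaluate pipeline on a toy constrained problem (choose exactly k items at minimum cost); the random “samples” and the greedy repair rule are illustrative assumptions, not the authors’ implementation:
```python
# Schematic sketch of the pVSQA loop described above (illustrative only):
# sample candidate solutions, post-process infeasible ones into feasible
# ones, then evaluate the expected cost classically.

import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 3
cost = rng.uniform(1.0, 10.0, size=n)   # toy linear cost per item

def repair(x: np.ndarray) -> np.ndarray:
    """Post-processing: turn an infeasible bitstring into a feasible one
    (exactly k ones) by greedily flipping the cheapest-impact bits."""
    x = x.copy()
    while x.sum() > k:                   # too many ones: drop the costliest
        ones = np.flatnonzero(x)
        x[ones[np.argmax(cost[ones])]] = 0
    while x.sum() < k:                   # too few ones: add the cheapest
        zeros = np.flatnonzero(x == 0)
        x[zeros[np.argmin(cost[zeros])]] = 1
    return x

# Stand-in for sampling a variational quantum state: random bitstrings.
samples = rng.integers(0, 2, size=(2000, n))
feasible = np.array([repair(x) for x in samples])
energies = feasible @ cost               # cost of each feasible solution
print("expected cost:", energies.mean(), "| best sampled:", energies.min())
```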
    The researchers analyzed the performance of this algorithm using both a simulator and real quantum devices such as a quantum annealer and a gate-type quantum device. The experiments revealed that pVSQA achieves a near-optimal performance within a predetermined time on the simulator and outperforms conventional quantum algorithms without post-processing on real quantum devices.
    Dr. Shirai highlights the potential applications of the algorithm, stating: “Drastic social transformations are urgently needed to address various social issues. Examples include the realization of a carbon-neutral society to solve climate change issues and the realization of sustainable development goals to address issues such as increased energy demand and food shortage. Efficiently solving combinatorial optimization problems is at the heart of achieving these transformations. Our new method will play a significant role in realizing these long-term social transformations.”
    In conclusion, this study marks a significant step forward for using quantum computers for solving COPs, holding promise for addressing complex real-world problems across various domains.

    Reference
    DOI: https://doi.org/10.1109/TQE.2024.3376721
    Authors: Tatsuhiko Shirai and Nozomu Togawa
    Affiliations: Department of Computer Science and Communications Engineering, Waseda University
    About Waseda University
    Located in the heart of Tokyo, Waseda University is a leading private research university that has long been dedicated to academic excellence, innovative research, and civic engagement at both the local and global levels since 1882. The University has produced many changemakers in its history, including nine prime ministers and many leaders in business, science and technology, literature, sports, and film. Waseda has strong collaborations with overseas research institutions and is committed to advancing cutting-edge research and developing leaders who can contribute to the resolution of complex, global social issues. The University has set a target of achieving a zero-carbon campus by 2032, in line with the Sustainable Development Goals (SDGs) adopted by the United Nations in 2015.

    To learn more about Waseda University, visit https://www.waseda.jp/top/en
    About Assistant Professor Tatsuhiko Shirai
    Tatsuhiko Shirai is currently an Assistant Professor in the Department of Computer Science and Communications Engineering at Waseda University in Japan. He obtained his master’s and Ph.D. in Physics from the University of Tokyo in 2013 and 2016, respectively. In 2022, he received the SLDM Research Group Excellent Paper Award. His research interests include quantum algorithms, open quantum systems, and quantum computing. He is a member of the Physical Society of Japan.