More stories

  •

    The quest for an ideal quantum bit

    New qubit platform could transform quantum information science and technology.
    You are no doubt viewing this article on a digital device whose basic unit of information is the bit, either 0 or 1. Scientists worldwide are racing to develop a new kind of computer based on quantum bits, or qubits.
    In a recent Nature paper, a team led by the U.S. Department of Energy’s (DOE) Argonne National Laboratory has announced the creation of a new qubit platform formed by freezing neon gas into a solid at very low temperatures, spraying electrons from a light bulb’s filament onto the solid, and trapping a single electron there. This system shows great promise to be developed into ideal building blocks for future quantum computers.
    The quality requirements for qubits in a useful quantum computer are extremely demanding. Various forms of qubits exist today, but none of them is ideal.
    What would make an ideal qubit? According to Dafei Jin, an Argonne scientist and the principal investigator of the project, it would have at least three sterling qualities.
    It can remain in a simultaneous 0 and 1 state (remember Schrödinger’s cat!) for a long time. Scientists call this long “coherence.” Ideally, that time would be around a second, a duration we can perceive on a household clock in our daily life.

  •

    Deep learning model to predict adverse drug-drug interactions

    Prescribing multiple drugs, a practice known as polypharmacy, is often recommended for the treatment of complex diseases. However, after ingestion, multiple drugs may interact in undesirable ways, resulting in severe adverse effects or decreased clinical efficacy. Early detection of such drug-drug interactions (DDIs) is therefore essential to protect patients from adverse effects.
    Current computational models and neural network-based algorithms examine prior records of known drug interactions and identify the structures and side effects associated with them. These approaches assume that similar drugs have similar interactions, and so they flag drug combinations associated with similar adverse effects.
    Although understanding the mechanisms of DDIs at a molecular level is essential to predict their undesirable effects, current models rely on structures and properties of drugs, with predictive range limited to previously observed interactions. They do not consider the effect of DDIs on genes and cell functionality.
    To address these limitations, Associate Professor Hojung Nam and Ph.D. candidate Eunyoung Kim from the Gwangju Institute of Science and Technology in South Korea developed a deep learning-based model to predict DDIs based on drug-induced gene expression signatures. These findings were published in the Journal of Cheminformatics on March 4, 2022.
    The DeSIDE-DDI model consists of two parts: a feature generation model and a DDI prediction model. The feature generation model predicts a drug’s effect on gene expression by considering both the structure and properties of the drug while the DDI prediction model predicts various side effects resulting from drug combinations.
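In outline, that two-part pipeline could be sketched as below. Everything here is a hypothetical stand-in (the dimensions, weights, and function names are invented for illustration and are not the published DeSIDE-DDI code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the real model's sizes are not given in this summary.
N_STRUCT, N_GENES, N_EFFECTS = 128, 978, 10

# Stand-ins for trained weights (random here, learned in the real model).
W_gen = rng.normal(size=(N_STRUCT, N_GENES)) * 0.01
W_ddi = rng.normal(size=(2 * N_GENES, N_EFFECTS)) * 0.01

def generate_signature(drug_features):
    """Feature generation model: drug structure/property features ->
    expected drug-treated gene expression signature."""
    return np.tanh(drug_features @ W_gen)

def predict_ddi(sig_a, sig_b):
    """DDI prediction model: a pair of expression signatures ->
    a probability for each side-effect class."""
    logits = np.concatenate([sig_a, sig_b]) @ W_ddi
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid, multi-label output

drug_a = rng.normal(size=N_STRUCT)  # hypothetical drug feature vectors
drug_b = rng.normal(size=N_STRUCT)
probs = predict_ddi(generate_signature(drug_a), generate_signature(drug_b))
print(probs.shape)  # one probability per side-effect class
```

The key design point the paper describes is the intermediate gene-expression signature: the second model sees predicted drug-induced expression, not raw chemical structure.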
    Describing the key features of the model, Prof. Nam explains, “Our model considers the effects of drugs on genes by utilizing gene expression data, providing an explanation for why a certain pair of drugs causes DDIs. It can predict DDIs for currently approved drugs as well as for novel compounds. This way, the threats of polypharmacy can be resolved before new drugs are made available to the public.”
    What’s more, since not all compounds have drug-treated gene expression signatures, the model uses a pre-trained generation model to produce the expected drug-treated gene expressions.
    Discussing its real-life applications, Prof. Nam remarks, “This model can discern potentially dangerous drug pairs, acting as a drug safety monitoring system. It can help researchers define the correct usage of the drug in the drug development phase.”
    A model with such potential could transform how the safety of novel drugs is established in the future.
    Story Source:
    Materials provided by GIST (Gwangju Institute of Science and Technology). Note: Content may be edited for style and length.

  •

    Taste of the future: Robot chef learns to 'taste as you go'

    A robot ‘chef’ has been trained to taste food at different stages of the chewing process to assess whether it’s sufficiently seasoned.
    Working in collaboration with domestic appliances manufacturer Beko, researchers from the University of Cambridge trained their robot chef to assess the saltiness of a dish at different stages of the chewing process, imitating a similar process in humans.
    Their results could be useful in the development of automated or semi-automated food preparation by helping robots to learn what tastes good and what doesn’t, making them better cooks.
    When we chew our food, we notice changes in texture and taste. For example, biting into a fresh tomato at the height of summer releases juices, and as we chew, releasing both saliva and digestive enzymes, our perception of the tomato’s flavour changes.
    The robot chef, which has already been trained to make omelettes based on human tasters’ feedback, tasted nine different variations of a simple dish of scrambled eggs and tomatoes at three different stages of the chewing process, and produced ‘taste maps’ of the different dishes.
    The researchers found that this ‘taste as you go’ approach significantly improved the robot’s ability to quickly and accurately assess the saltiness of the dish compared with other electronic tasting technologies, which test only a single homogenised sample. The results are reported in the journal Frontiers in Robotics and AI.
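A toy simulation can illustrate why pooling readings across chewing stages may beat a single homogenised sample. The sensor model, noise levels, and numbers below are invented for this sketch and are not taken from the paper:

```python
import random

random.seed(1)

TRUE_SALTINESS = 0.30  # hypothetical ground truth for one dish

def sensor_reading(mixing):
    """Simulated saltiness probe: in this toy model, better-mixed food
    (later chewing stages) yields a less noisy reading."""
    noise = random.gauss(0, 0.15 * (1 - mixing) + 0.02)
    return TRUE_SALTINESS + noise

# One homogenised sample vs. readings at three chewing stages,
# five probe positions each, loosely mimicking a 'taste map'.
single = sensor_reading(mixing=1.0)

stages = [0.2, 0.5, 0.9]  # hypothetical degree of mixing per stage
taste_map = [sensor_reading(m) for m in stages for _ in range(5)]
pooled = sum(taste_map) / len(taste_map)

print(f"single sample: {single:.3f}, pooled estimate: {pooled:.3f}")
```

Averaging many noisy spatial and temporal readings reduces the variance of the estimate, which is the statistical intuition behind tasting "as you go" rather than once.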

  •

    New open-source software automates RNA analysis to speed up research and drug development

    Scientists at Scripps Research have unveiled a new software tool for studying RNA (ribonucleic acid) molecules, which have a host of critical roles in organisms. The open-source app, “Pytheas,” described May 3, 2022, in Nature Communications, speeds up the process of characterizing and quantifying RNAs in basic research and drug-development settings.
    The app is designed specifically to analyze RNA data generated through a method called mass spectrometry. “Mass spec” is commonly used to evaluate RNA molecules that are not simple chains of standard RNA nucleotides but are instead modified in some way. Among their demonstrations, the researchers showed that Pytheas can be used to swiftly identify and quantify modified RNA molecules like those in the current Pfizer and Moderna COVID-19 mRNA vaccines.
    “The analysis of RNA data from mass spectrometry has been a relatively laborious process, lacking the tools found in other areas of biological research, and so our aim with Pytheas is to bring the field into the 21st century,” says study senior author James Williamson, PhD, professor in the Department of Integrative Structural and Computational Biology, and vice president of Research and Academic Affairs at Scripps Research.
    The first authors of the study were Luigi D’Ascenzo, PhD, and Anna Popova, PhD, respectively a postdoctoral research associate and staff scientist in the Williamson lab during the study.
    RNA is chemically very similar to DNA, and RNA molecules in cells are heavily involved in the process of translating genes into proteins, as well as in fine-tuning gene activity. Additionally, RNA-based therapeutics — which include the Pfizer and Moderna vaccines — are viewed as a highly promising new class of medicines, capable in principle of hitting their biological targets more potently and selectively than traditional small-molecule drugs.
    A common tool for detecting RNA molecules that have chemical modifications is mass spectrometry, which can be used essentially to recognize the RNAs and their modifications based on their masses. Natural RNAs often have modifications that affect their functions, while RNAs used for vaccines and RNA-based drugs are almost always modified artificially to optimize their activity and reduce side effects. Up to now, methods for processing raw mass spectrometry data on modified RNAs have been relatively slow and manual — thus, very labor-intensive — in contrast to corresponding methods in the field of protein analysis, for example.
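The core idea, matching an observed mass against theoretical masses of candidate (possibly modified) fragments, can be sketched in a few lines. The residue masses are approximate textbook monoisotopic values and the candidate list is invented; this illustrates the principle, not Pytheas itself:

```python
# Approximate monoisotopic masses (Da) of RNA nucleotide residues in a chain;
# illustrative values, not the mass tables a real pipeline would use.
RESIDUE_MASS = {"A": 329.0525, "C": 305.0413, "G": 345.0474, "U": 306.0253}
H2O = 18.0106
METHYL = 14.0157  # mass shift added by one methylation (CH2)

def fragment_mass(seq, n_methyl=0):
    """Theoretical mass of an RNA fragment carrying n_methyl methylations."""
    return sum(RESIDUE_MASS[nt] for nt in seq) + H2O + n_methyl * METHYL

def match(observed, candidates, tol=0.01):
    """Return the (sequence, n_methyl) candidate whose theoretical mass
    lies within tol Da of the observed mass, or None if nothing fits."""
    for seq, n in candidates:
        if abs(observed - fragment_mass(seq, n)) <= tol:
            return seq, n
    return None

cands = [("GAU", 0), ("GAU", 1), ("CCA", 0)]
obs = fragment_mass("GAU", 1)  # pretend this mass came off the instrument
print(match(obs, cands))       # the mono-methylated GAU fragment
```

A real workflow scores thousands of such candidates against full spectra; automating that search is exactly the laborious step the software targets.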

  •

    Automated synthesis allows for discovery of unexpected charge transport behavior in organic molecules

    A cross-disciplinary UIUC team has demonstrated a major breakthrough in using automated synthesis to discover new molecules for organic electronics applications.
    The technology that enabled the discovery relies on an automated platform for rapid molecular synthesis at scale — which is a game-changer in the field of organic electronics and beyond. Using automated synthesis, the team was able to rapidly scan through a library of molecules with precisely defined structures, thereby uncovering, via single-molecule characterization experiments, a new mechanism for high conductance. The work was just reported in Nature Communications and is the first major result to emerge from the Molecule Maker Lab, which is located in the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign.
    The unexpectedly high conductance was uncovered in experiments led by Charles M. Schroeder, who is the James Economy Professor in materials science & engineering and a professor in chemical & biomolecular engineering. The project’s goal was to seek out new molecules with strong conductivity that might be suitable for use in molecular electronics or organic electronics applications. The team’s approach was to systematically append many different side chains to molecular backbones to understand how the side chains affected conductance.
    The first stage of the project consisted of synthesizing a large library of molecules to be characterized using single-molecule electronics experiments. If the synthesis had been done with conventional methods, it would have been a long, cumbersome process. That effort was avoided through use of the Molecule Maker Lab’s automated synthesis platform, which was designed to facilitate molecular discovery research that requires testing of large numbers of candidate molecules.
    Edward R. Jira, a Ph.D. student in chemical & biomolecular engineering who had a leading role in the project, explained the synthesis platform’s concept. “What’s really powerful… is that it leverages a building-block-based strategy where all of the chemical functionality that we’re interested in is pre-encoded in building blocks that are bench-stable, and you can have a large library of them sitting on a shelf,” he said. A single type of reaction is used repeatedly to couple the building blocks together as needed, and “because we have this diverse building block library that encodes a lot of different functionality, we can access a huge array of different structures for different applications.”
    As Schroeder put it, “Imagine snapping Legos together.”
    Co-author Martin D. Burke extended the Lego-brick analogy to explain why the synthesizer was so valuable to the experiments — and it wasn’t only because of the rapid production of the initial molecular library. “Because of the Lego-like approach for making these molecules, the team was able to understand why they are super-fast,” he explained. Once the surprisingly fast state was discovered, “using the ‘Legos,’ we could take the molecules apart piece by piece, and swap in different ‘Lego’ bricks — and thereby systematically understand the structure/function relationships that led to this ultrafast conductivity.”
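The building-block strategy lends itself to straightforward combinatorial enumeration: one coupling reaction snaps any side chain onto any backbone, so the accessible library is a Cartesian product of the blocks. A minimal sketch, with invented backbone and side-chain names (the paper's actual library is not listed in this summary):

```python
from itertools import product

# Hypothetical building blocks, invented for illustration.
backbones = ["oligophenylene", "oligothiophene", "oligofluorene"]
side_chains = ["H", "OMe", "NMe2", "F", "CN"]
lengths = [1, 2, 3]  # number of repeat units in the backbone

# A single coupling reaction joins any combination, so the candidate
# library is simply every (backbone, side chain, length) triple.
library = [f"{b}({s})_{n}" for b, s, n in product(backbones, side_chains, lengths)]

print(len(library))  # 3 * 5 * 3 = 45 candidate molecules
```

The point of the analogy is that library size grows multiplicatively with each new bench-stable block on the shelf, which is what makes automated synthesis so much faster than bespoke routes.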

  •

    Researchers develop smartphone-powered microchip for at-home medical diagnostic testing

    A University of Minnesota Twin Cities research team has developed a new microfluidic chip for diagnosing diseases that uses a minimal number of components and can be powered wirelessly by a smartphone. The innovation opens the door for faster and more affordable at-home medical testing.
    The researchers’ paper is published in Nature Communications, a peer-reviewed, open-access scientific journal published by Nature Research. The researchers are also working to commercialize the technology.
    Microfluidics involves the study and manipulation of liquids at a very small scale. One of the most popular applications in the field is developing “lab-on-a-chip” technology, or the ability to create devices that can diagnose diseases from a very small biological sample, blood or urine, for example.
    Scientists already have portable devices for diagnosing some conditions — rapid COVID-19 antigen tests, for one. However, a big roadblock to engineering more sophisticated diagnostic chips that could, for example, identify the specific strain of COVID-19 or measure biomarkers like glucose or cholesterol, is the fact that they need so many moving parts.
    Chips like these would require materials to seal the liquid inside, pumps and tubing to manipulate the liquid, and wires to activate those pumps — all materials that are difficult to scale down to the micro level. Researchers at the University of Minnesota Twin Cities were able to create a microfluidic device that functions without all of those bulky components.
    “Researchers have been extremely successful when it comes to electronic device scaling, but the ability to handle liquid samples has not kept up,” said Sang-Hyun Oh, a professor in the University of Minnesota Twin Cities Department of Electrical and Computer Engineering and senior author of the study. “It’s not an exaggeration that a state-of-the-art, microfluidic lab-on-a-chip system is very labor intensive to put together. Our thought was, can we just get rid of the cover material, wires, and pumps altogether and make it simple?”
    Many lab-on-a-chip technologies work by moving liquid droplets across a microchip to detect the virus pathogens or bacteria inside the sample. The University of Minnesota researchers’ solution was inspired by a peculiar real-world phenomenon with which wine drinkers will be familiar — the “legs,” or long droplets that form inside a wine bottle due to surface tension caused by the evaporation of alcohol.

  •

    Engineers develop new control electronics for quantum computers that improve performance, cut costs

    When designing a next-generation quantum computer, a surprisingly large problem is bridging the communication gap between the classical and quantum worlds. Such computers need specialized control and readout electronics to translate back and forth between the languages of the human operator and the quantum computer — but existing systems are cumbersome and expensive.
    However, a new system of control and readout electronics, known as the Quantum Instrumentation Control Kit, or QICK, developed by engineers at the U.S. Department of Energy’s Fermi National Accelerator Laboratory, has been shown to drastically improve quantum computer performance while cutting the cost of control equipment.
    “The development of the Quantum Instrumentation Control Kit is an excellent example of U.S. investment in joint quantum technology research with partnerships between industry, academia and government to accelerate pre-competitive quantum research and development technologies,” said Harriet Kung, DOE deputy director for science programs for the Office of Science and acting associate director of science for high-energy physics.
    The faster and more cost-efficient controls were developed by a team of Fermilab engineers led by senior principal engineer Gustavo Cancelo, in collaboration with the University of Chicago, with the goal of creating and testing a field-programmable gate array (FPGA)-based controller for quantum computing experiments. David Schuster, a physicist at the University of Chicago, led the university lab that helped with the specifications and verification on real hardware.
    “This is exactly the type of project that combines the strengths of a national laboratory and a university,” said Schuster. “There is a clear need for an open-source control hardware ecosystem, and it is being rapidly adopted by the quantum community.”
    Engineers designing quantum computers face the challenge of bridging the two seemingly incompatible worlds of quantum and classical computers. Quantum computers are based on the counterintuitive, probabilistic rules of quantum mechanics that govern the microscopic world, which enable them to perform calculations that ordinary computers cannot. Because people live in the macroscopic visible world where classical physics reigns, control and readout electronics act as the interpreter connecting these two worlds.

  •

    New approach may help clear hurdle to large-scale quantum computing

    Building a plane while flying it isn’t typically a goal for most, but for a team of Harvard-led physicists that general idea might be a key to finally building large-scale quantum computers.
    As described in a new paper in Nature, the research team, which includes collaborators from QuEra Computing, MIT, and the University of Innsbruck, developed a new approach for processing quantum information that allows them to dynamically change the layout of atoms in their system by moving and connecting them with each other in the midst of computation.
    This ability to shuffle the qubits (the fundamental building blocks of quantum computers and the source of their massive processing power) during the computation process while preserving their quantum state dramatically expands processing capabilities and allows for the self-correction of errors. Clearing this hurdle marks a major step toward building large-scale machines that leverage the bizarre characteristics of quantum mechanics and promise to bring about real-world breakthroughs in material science, communication technologies, finance, and many other fields.
    “The reason why building large scale quantum computers is hard is because eventually you have errors,” said Mikhail Lukin, the George Vasmer Leverett Professor of Physics, co-director of the Harvard Quantum Initiative, and one of the senior authors of the study. “One way to reduce these errors is to just make your qubits better and better, but another more systematic and ultimately practical way is to do something which is called quantum error correction. That means that even if you have some errors, you can correct these errors during your computation process with redundancy.”
    In classical computing, error correction is done by simply copying the information in a single binary digit, or bit, so it is clear when and where a bit failed. For example, a single 0 bit can be copied three times to read 000. If it then reads 001, it is clear where the error occurred, and it can be corrected. A foundational limitation of quantum mechanics, known as the no-cloning theorem, is that quantum information cannot be copied, making error correction difficult.
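The classical repetition-code example above can be written out in a few lines (a toy classical illustration, not the quantum codes the team built):

```python
def encode(bit):
    """Classical repetition code: copy one bit three times."""
    return [bit] * 3

def decode(bits):
    """Majority vote: any single flipped bit is outvoted and corrected."""
    return 1 if sum(bits) >= 2 else 0

received = [0, 0, 1]          # the '001' example from the text
print(decode(received))       # the flipped third bit is corrected back to 0
```

Quantum error correction must achieve the same redundancy without copying, which is why it spreads one logical qubit's information across many physical qubits instead.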
    The workaround the researchers implemented creates a sort of backup system for the atoms and their information, called a quantum error correction code. The researchers used their new technique to create many of these correction codes, including what’s known as a toric code, and spread them throughout the system.