More stories

  • New quantum algorithm surpasses the QPE norm

    Researchers have improved their recently developed quantum algorithm, bringing it to one-tenth the computational cost of Quantum Phase Estimation, and used it to directly calculate the vertical ionization energies of light atoms and molecules such as CO, O2, CN, F2, H2O, and NH3 to within 0.1 electron volts of precision.
    Quantum computers have attracted a lot of attention recently because they are expected to solve certain problems that are beyond the capabilities of conventional computers. Chief among these problems is determining the electronic states of atoms and molecules so they can be used more effectively in a variety of industries, from lithium-ion battery design to in silico technologies in drug development. A common way scientists have approached this problem is to calculate the total energies of the individual states of a molecule or atom and then take the difference in energy between those states. However, as molecules grow in size and complexity, the cost of such calculations grows beyond the capability of any conventional computer or currently established quantum algorithm. Theoretical predictions of total energies have therefore only been possible for molecules that are small and isolated from their natural environment.
    “For quantum computers to be a reality, their algorithms must be robust enough to accurately predict the electronic states of atoms and molecules, as they exist in nature,” state Kenji Sugisaki and Takeji Takui from the Graduate School of Science, Osaka City University.
    In December 2020, Sugisaki and Takui, together with their colleagues, led a team of researchers in developing a quantum algorithm they call the Bayesian eXchange coupling parameter calculator with Broken-symmetry wave functions (BxB), which predicts the electronic states of atoms and molecules by directly calculating energy differences. They noted that energy differences in atoms and molecules remain constant regardless of how large and complex the systems get, even though their total energies grow with system size. “With BxB, we avoided the common practice of calculating the total energies and targeted the energy differences directly, keeping computing costs within polynomial time,” they state. “Since then, our goal has been to improve the efficiency of our BxB software so it can predict the electronic states of atoms and molecules with chemical precision.”
    Using the computing cost of a well-known algorithm called Quantum Phase Estimation (QPE) as a benchmark, the team states, “we calculated the vertical ionization energies of small molecules such as CO, O2, CN, F2, H2O, and NH3 to within 0.1 electron volts (eV) of precision,” while using half the number of qubits, bringing the calculation cost on par with QPE.
    Their findings will be published online in the March edition of The Journal of Physical Chemistry Letters.
    Ionization energy is one of the most fundamental physical properties of atoms and molecules and an important indicator for understanding the strength and properties of chemical bonds and reactions. In short, accurately predicting the ionization energy lets us put chemicals to use beyond the current norm. In the past, it was necessary to calculate the energies of both the neutral and ionized states, but with the BxB quantum algorithm, the ionization energy can be obtained in a single calculation without inspecting the individual total energies of the neutral and ionized states. “From numerical simulations of the quantum logic circuit in BxB, we found that the computational cost for reading out the ionization energy is constant regardless of the atomic number or the size of the molecule,” the team states, “and that the ionization energy can be obtained with a high accuracy of 0.1 eV after modifying the length of the quantum logic circuit to be less than one tenth of QPE.” (See image for modification details)
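Why targeting the energy difference directly pays off can be seen with a toy error-propagation sketch. All numbers below are invented for illustration (they are not from the paper): when two large total energies are estimated separately and subtracted, their statistical errors add in quadrature, whereas a direct estimate of the small gap only has to resolve the gap itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical total energies (eV) of a neutral molecule and its cation.
E_neutral, E_cation = -2043.7, -2029.6
true_ip = E_cation - E_neutral  # vertical ionization energy: ~14.1 eV

# Indirect route: estimate each large total energy separately and subtract.
# The two errors add in quadrature.
sigma_total = 0.5  # eV of statistical error per total-energy estimate
ip_indirect = (E_cation + rng.normal(0, sigma_total, 100_000)) \
            - (E_neutral + rng.normal(0, sigma_total, 100_000))

# Direct route (the BxB idea, schematically): estimate the small gap itself,
# so the whole error budget is spent on ~14 eV rather than ~2000 eV.
sigma_diff = 0.05
ip_direct = true_ip + rng.normal(0, sigma_diff, 100_000)

print(f"indirect spread: {ip_indirect.std():.3f} eV")
print(f"direct spread:   {ip_direct.std():.3f} eV")
```

The indirect spread comes out around √(0.5² + 0.5²) ≈ 0.71 eV, an order of magnitude worse than the direct estimate, even though both routes target the same 14.1 eV quantity.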
    With the development of quantum computer hardware, Sugisaki and Takui, along with their team, are expecting the BxB quantum algorithm to perform high-precision energy calculations for large molecules that cannot be treated in real time with conventional computers.
    Story Source:
    Materials provided by Osaka City University. Note: Content may be edited for style and length.

  • Patient wait times reduced thanks to new study by engineers

    The first known study to explore optimal outpatient exam scheduling given the flexibility of inpatient exams has resulted in shorter wait times for magnetic resonance imaging (MRI) patients at Lahey Hospital & Medical Center in Burlington, Mass. A team of researchers from Dartmouth Engineering and Philips worked to identify sources of delays for MRI procedures at Lahey Hospital in order to optimize scheduling and reduce overall costs for the hospital by 23 percent.
    The Dartmouth-led study, “Stochastic programming for outpatient scheduling with flexible inpatient exam accommodation,” was sponsored by Philips and recently published by Health Care Management Science in collaboration with Lahey Hospital.
    “Excellence in service and positive patient experiences are a primary focus for the hospital. We continuously monitor various aspects of patient experiences and one key indicator is patient wait times,” said Christoph Wald, chair of the department of radiology at Lahey Hospital and professor of radiology at Tufts University Medical School. “With a goal of wanting to improve patient wait times, we worked with data science researchers at Philips and Dartmouth to help identify levers for improvement that might be achieved without impeding access.”
    Prior to working with the researchers, on an average weekday, outpatients at Lahey Hospital waited about 54 minutes from their arrival until the beginning of their exam. Researchers determined that one of the reasons for the routine delays was a complex scheduling system, which must cater to emergency room patients, inpatients, and outpatients; while exams for inpatients are usually flexible and can be delayed if necessary, other appointments cannot.
    “Mathematical models and algorithms are crucial to improve the efficiency of healthcare systems, especially in the current crisis we are going through. By analyzing the patient data, we found that delays were prominent because the schedule was not optimal,” said first author Yifei Sun, a Dartmouth Engineering PhD candidate. “This research uses optimization and simulation tools to help the MRI centers of Lahey Hospital better plan their schedule to reduce overall cost, which includes patient waiting time.”
    First, the researchers reviewed data to analyze and identify sources of delays. They then worked on developing a mathematical model to optimize the length of each exam slot and the placement of inpatient exams within the overall schedule. Finally, the researchers developed an algorithm to minimize the wait time and cost associated with exam delays for outpatients, the idle time of equipment, employee overtime, and cancelled inpatient exams.
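The central trade-off such a model navigates can be illustrated with a toy single-scanner simulation. The slot lengths, exam-duration distribution, and patient counts below are invented for illustration; this is not the paper's stochastic program:

```python
import random

def simulate_day(slot_len, exam_mean=32.0, n_patients=14, n_rep=2000, seed=1):
    """Toy single-scanner day: outpatients are booked every `slot_len`
    minutes, but exam durations are random, so slots that are too short
    let delays cascade down the schedule. Returns mean wait in minutes."""
    rng = random.Random(seed)
    total_wait = 0.0
    for _ in range(n_rep):
        free_at = 0.0
        for i in range(n_patients):
            arrival = i * slot_len          # scheduled appointment time
            start = max(arrival, free_at)   # waits if scanner still busy
            total_wait += start - arrival
            free_at = start + rng.expovariate(1.0 / exam_mean)
    return total_wait / (n_rep * n_patients)

# Longer slots trade scanner idle time for shorter patient waits;
# the optimization picks slot lengths that balance both costs.
for slot in (25, 30, 35, 40):
    print(f"slot {slot} min -> mean wait {simulate_day(slot):5.1f} min")
```

Running this shows mean waits falling steeply as slots lengthen past the mean exam time, which is why slot sizing and the placement of flexible inpatient exams are the levers worth optimizing.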
    “This iterative improvement process did result in measurable improvements of patient wait times,” said Wald. “The construction and use of a simulation model have been instrumental in educating the Lahey team about the benefits of dissecting workflow components to arrive at an optimized process outcome. We have extended this approach to identify bottlenecks in our interventional radiology workflow and to add additional capacity under the constraints of staffing schedules.”
    The researchers believe their solutions are broadly applicable, as the issue is common to many mid-sized hospitals throughout the country.
    “We also provided suggestions for hospitals that don’t have optimization tools or have different priorities, such as patient waiting times or idle machine times,” said Sun, who worked on the paper with her advisor Vikrant Vaze, the Stata Family Career Development Associate Professor of Engineering at Dartmouth.
    The other co-authors of the paper are: Usha Nandini Raghavan and Christopher S. Hall, both from Philips, and Patricia Doyle and Stacey Sullivan Richard of Lahey Hospital.
    Story Source:
    Materials provided by Thayer School of Engineering at Dartmouth. Original written by Julie Bonette. Note: Content may be edited for style and length.

  • From a window to a mirror: New material paves the way to faster computing

    Research led by the Cavendish Laboratory at the University of Cambridge has identified a material that could help tackle speed and energy, the two biggest challenges for computers of the future.
    Research in the field of light-based computing — using light instead of electricity for computation to go beyond the limits of today’s computers — is moving fast, but barriers remain in developing optical switching, the process by which light would be easily turned ‘on’ and ‘off’, reflecting or transmitting light on-demand.
    The study, published in Nature Communications, shows that a material known as Ta2NiSe5 could switch between a window and a mirror in a quadrillionth of a second when struck by a short laser pulse, paving the way for the development of ultra-fast switching in computers of the future.
    The material looks like a chunk of pencil lead and acts as an insulator at room temperature, which means that when infrared light strikes the material in this insulating state, it passes straight through like a window. However, when heated, the material becomes a metal that acts like a mirror and reflects light.
    “We knew that Ta2NiSe5 could switch between a window and a mirror when it was heated up, but heating an object is a very slow process,” said Dr Akshay Rao, Harding University Lecturer at the Cavendish Laboratory, who led the research. “What our experiments have shown is that a short laser pulse can also trigger this ‘flip’ in only 10⁻¹⁵ seconds. This is a million times faster than switches in our current computers.”
    The researchers were looking into the material’s behaviour to show the existence of a new phase of matter called an ‘excitonic insulator’, which has been experimentally challenging to find since it was first theorised in the 1960s.
    “This excitonic insulating phase looks in many ways like a very normal insulator, but one way to distinguish between an unusual and ordinary insulator is to see exactly how long it takes for it to become a metal,” said Rao. “For normal matter, going from an insulator to a metal is like melting an ice cube. The atoms themselves move positions and rearrange, making it a slow process. But in an excitonic insulator, this could happen very fast because the atoms themselves do not need to move to switch phases. If we could find a way to measure how fast this transition occurs, we could potentially unmask the excitonic insulator.”
    To do these experiments, the researchers used a sequence of very short laser pulses to first perturb the material and then measure how its reflection changed. At room temperature, they found that when Ta2NiSe5 was struck by a strong laser pulse, it exhibited signatures of the metallic state immediately, becoming a mirror on a timescale faster than they could resolve. This provided strong evidence for the excitonic insulating nature of Ta2NiSe5.
    “Not only does this work remove the material’s camouflage, opening up further studies into its unusual quantum mechanical behaviour, it also highlights this material’s unique capability of acting as an ultrafast switch,” said first author Hope Bretscher, also from the Cavendish Laboratory. “In fact, for the optical switch to be effective, not only must it transition quickly from the insulating to the metallic phase, but the reverse process must also be fast.
    “We found that Ta2NiSe5 returned to an insulating state rapidly, much faster than other candidate switch materials. This ability to go from mirror to window and back to mirror again makes it extremely enticing for computing applications.”
    “Science is a complicated and evolving process — and we think we’ve been able to take this discussion a step forward. Not only can we now better understand the properties of this material, but we have also uncovered an interesting potential application for it,” said co-author Professor Ajay Sood, from the Indian Institute of Science in Bangalore.
    “While practically producing quantum switches with Ta2NiSe5 may still be a long way off, having identified a new approach to the growing challenge of computers’ speed and energy use is an exciting development,” said Rao.

  • 'Swarmalation' used to design active materials for self-regulating soft robots

    During the swarming of birds or fish, each entity coordinates its location relative to the others so that the swarm moves as one larger, coherent unit. Fireflies, on the other hand, coordinate their temporal behavior: within a group, they eventually all flash on and off at the same time and thus act as synchronized oscillators.
    Few entities, however, coordinate both their spatial movements and inherent time clocks; the limited examples are termed “swarmalators,” which simultaneously swarm in space and oscillate in time. Japanese tree frogs are exemplary swarmalators: each frog changes both its location and its rate of croaking relative to all the other frogs in a group.
    Moreover, the frogs change shape when they croak: the air sac below their mouth inflates and deflates to make the sound. This coordinated behavior plays an important role during mating and hence, is vital to the frogs’ survival. In the synthetic realm there are hardly any materials systems where individual units simultaneously synchronize their spatial assembly, temporal oscillations and morphological changes. Such highly self-organizing materials are important for creating self-propelled soft robots that come together and cooperatively alter their form to accomplish a regular, repeated function.
    Chemical engineers at the University of Pittsburgh Swanson School of Engineering have now designed a system of self-oscillating flexible materials that display a distinctive mode of dynamic self-organization. In addition to exhibiting the swarmalator behavior, the component materials mutually adapt their overall shapes as they interact in a fluid-filled chamber. These systems can pave the way for fabricating collaborative, self-regulating soft robotic systems.
    The group’s research was published this week in the journal Proceedings of the National Academy of Sciences. Principal investigator is Anna C. Balazs, Distinguished Professor of Chemical and Petroleum Engineering and the John A. Swanson Chair of Engineering. Lead author is Raj Kumar Manna and co-author is Oleg E. Shklyaev, both post-doctoral associates.
    “Self-oscillating materials convert a non-periodic signal into the material’s periodic motion,” Balazs explained. “Using our computer models, we first designed micron and millimeter sized flexible sheets in solution that respond to a non-periodic input of chemical reactants by spontaneously undergoing oscillatory changes in location, motion and shape. For example, an initially flat, single sheet morphs into a three-dimensional shape resembling an undulating fish tail, which simultaneously oscillates back and forth across the microchamber.”
    The self-oscillations of the flexible sheets are powered by catalytic reactions in a fluidic chamber. The reactions on the surfaces of the sheet and chamber initiate a complex feedback loop: chemical energy from the reaction is converted into fluid flow, which transports and deforms the flexible sheets. The structurally evolving sheets in turn affect the motion of the fluid, which continues to deform the sheets.
    “What is really intriguing is that when we introduce a second sheet, we uncover novel forms of self-organization between vibrating structures,” Manna adds. In particular, the two sheets form coupled oscillators that communicate through the fluid to coordinate not only their location and temporal pulsations, but also synchronize their mutual shape changes. This behavior is analogous to that of the tree frog swarmalators that coordinate their relative spatial location, and time of croaking, which also involves a periodic change in the frog’s shape (with an inflated or deflated throat).
    “Complex dynamic behavior is a critical feature of biological systems,” Shklyaev says. “Stuff does not just come together and stop moving. Analogously, these sheets assemble at the proper time and place to form a larger, composite dynamic system. Moreover, this structure is self-regulating and can perform functions that a single sheet alone cannot carry out.”
    “For two or more sheets, the collective temporal oscillations and spatial behavior can be controlled by varying the size of the different sheets or the pattern of catalyst coating on the sheet,” says Balazs. These variations permit control over the relative phase of the oscillations, e.g., the oscillators can move in-phase or anti-phase.
    “These are very exciting results because the 2D sheets self-morph into 3D objects, which spontaneously translate a non-oscillating signal into ‘instructions’ for forming a larger aggregate whose shape and periodic motion are regulated by each of its moving parts,” she notes. “Our research could eventually lead to forms of bio-inspired computation — just as coupled oscillators are used to transmit information in electronics — but with self-sustained, self-regulating behavior.”
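The in-phase and anti-phase regimes described above can be sketched with the classic two-oscillator Kuramoto model. This is a generic stand-in for illustration; the study's sheets are coupled through chemically generated fluid flows, not through this equation:

```python
import math

def simulate(theta0, K, steps=4000, dt=0.01, w1=1.0, w2=1.2):
    """Two Kuramoto phase oscillators with natural frequencies w1, w2:
    d(theta_i)/dt = w_i + K * sin(theta_j - theta_i).
    Positive coupling K pulls the phases together (in-phase locking);
    negative K pushes them half a cycle apart (anti-phase locking).
    Returns the final phase gap in [0, 2*pi)."""
    t1, t2 = 0.0, theta0
    for _ in range(steps):
        gap = t2 - t1
        t1 += (w1 + K * math.sin(gap)) * dt
        t2 += (w2 - K * math.sin(gap)) * dt
    return (t2 - t1) % (2 * math.pi)

print(f"K=+1 phase gap: {simulate(1.0, +1.0):.2f} rad")  # near 0: in-phase
print(f"K=-1 phase gap: {simulate(1.0, -1.0):.2f} rad")  # near pi: anti-phase
```

Flipping the sign of the coupling flips the locked state between in-phase and anti-phase, analogous to how varying sheet sizes or catalyst patterns selects the relative phase of the oscillating sheets.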
    Video: https://www.youtube.com/watch?v=89Y9lVlEaBs

  • Second-wave COVID mortality dropped markedly in (most) wealthier zones

    Wealthier northeastern US states and Western European countries tended to have significantly lower mortality rates during second-wave COVID-19 infections, new research from the University of Sydney and Tsinghua University has shown. However, the pattern was not as general as expected, with notable exceptions to this trend in Sweden and Germany.
    Researchers say the change in mortality could have several explanations: European first-wave case counts were underestimated; first-wave deaths disproportionately affected the elderly; second-wave infections tended to affect younger people; and, with some exceptions, lower mortality rates occurred in countries with more socialised and equitable health systems. The researchers, Nick James, Max Menzies and Peter Radchenko, believe their new methodology could help epidemiologists analyse data consistently to assess the impact of COVID-19 mortality across populations.
    “We have been able to look at the mortality rates in a more dynamic way,” said Mr James from the University of Sydney.
    They have published their results today in the mathematical journal Chaos.
    “We take a time series of infection rates by country, apply an algorithmic approach to chop it up into first and later waves and then do some relatively simple optimisation and calculations to determine two different mortality numbers,” said Nick James, a PhD student in the School of Mathematics & Statistics at the University of Sydney.
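A minimal sketch of that wave-splitting-then-mortality pipeline is below. The trough-based splitter and the lagged case-fatality ratio are simplified stand-ins for the paper's algorithm, and the epidemic curve is synthetic:

```python
import numpy as np

def split_waves(cases):
    """Split a smoothed daily-case series into two waves at the deepest
    trough between its first and last local peaks (a simple stand-in for
    the paper's algorithmic turning-point detection)."""
    peaks = [i for i in range(1, len(cases) - 1)
             if cases[i] >= cases[i - 1] and cases[i] >= cases[i + 1]]
    first, last = peaks[0], peaks[-1]
    trough = first + int(np.argmin(cases[first:last + 1]))
    return cases[:trough], cases[trough:]

def mortality(cases, deaths, lag=14):
    """Naive lagged case-fatality ratio: deaths divided by the cases
    reported `lag` days earlier."""
    c = np.asarray(cases, float)[:-lag].sum()
    d = np.asarray(deaths, float)[lag:].sum()
    return d / c

t = np.arange(200)
# Synthetic two-wave epidemic: a spring peak and a larger autumn peak.
cases = 1000 * np.exp(-((t - 50) / 15) ** 2) \
      + 3000 * np.exp(-((t - 150) / 20) ** 2)
wave1, wave2 = split_waves(cases)
print(f"wave 1: {len(wave1)} days, wave 2: {len(wave2)} days")
```

Applied per country, this yields one mortality number per wave, which is what allows first- and second-wave mortality to be compared consistently across populations.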

  • Researchers enhance quantum machine learning algorithms

    A Florida State University professor’s research could help quantum computing fulfill its promise as a powerful computational tool.
    William Oates, the Cummins Inc. Professor in Mechanical Engineering and chair of the Department of Mechanical Engineering at the FAMU-FSU College of Engineering, and postdoctoral researcher Guanglei Xu found a way to automatically infer parameters used in an important quantum Boltzmann machine algorithm for machine learning applications.
    Their findings were published in Scientific Reports.
    The work could help build artificial neural networks that could be used for training computers to solve complicated, interconnected problems like image recognition, drug discovery and the creation of new materials.
    “There’s a belief that quantum computing, as it comes online and grows in computational power, can provide you with some new tools, but figuring out how to program it and how to apply it in certain applications is a big question,” Oates said.
    Quantum bits, unlike binary bits in a standard computer, can exist in more than one state at a time, a concept known as superposition. Measuring the state of a quantum bit — or qubit — causes it to lose that special state, so quantum computers work by calculating the probability of a qubit’s state before it is observed.
    Specialized quantum computers known as quantum annealers are one tool for doing this type of computing. They work by representing each state of a qubit as an energy level. The lowest energy state among its qubits gives the solution to a problem. The result is a machine that could handle complicated, interconnected systems that would take a regular computer a very long time to calculate — like building a neural network.
    One way to build neural networks is by using a restricted Boltzmann machine, an algorithm that uses probability to learn based on inputs given to the network. Oates and Xu found a way to automatically calculate an important parameter associated with effective temperature that is used in that algorithm. Restricted Boltzmann machines typically guess at that parameter instead, which requires testing to confirm and can change whenever the computer is asked to investigate a new problem.
    “That parameter in the model replicates what the quantum annealer is doing,” Oates said. “If you can accurately estimate it, you can train your neural network more effectively and use it for predicting things.”
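The spirit of inferring the effective temperature rather than guessing it can be sketched with a classical toy. The energy levels and temperature below are invented, and a real annealer's samples are only approximately Boltzmann-distributed:

```python
import numpy as np

rng = np.random.default_rng(42)

# Suppose a sampler returns states whose energies follow a Boltzmann
# distribution at some unknown effective temperature (here 2.5, in units
# where k_B = 1). We recover that temperature from the samples instead
# of guessing it, which is the spirit of the parameter-inference step.
true_T = 2.5
levels = np.arange(8.0)              # discrete energy levels 0..7
p = np.exp(-levels / true_T)
p /= p.sum()
samples = rng.choice(levels, size=50_000, p=p)

# Boltzmann statistics: log N(E) = const - E / T_eff, so the slope of a
# straight-line fit of log-counts against energy gives -1/T_eff.
counts = np.array([(samples == e).sum() for e in levels])
slope, _ = np.polyfit(levels, np.log(counts), 1)
T_est = -1.0 / slope
print(f"estimated effective temperature: {T_est:.2f}")  # close to 2.5
```

With the effective temperature pinned down this way, the Boltzmann machine's probabilities match what the hardware actually samples, so training no longer needs a per-problem trial-and-error calibration.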
    This research was supported by Cummins Inc. and used resources of the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility.
    Story Source:
    Materials provided by Florida State University. Original written by Bill Wellock. Note: Content may be edited for style and length.

  • Spontaneous superconducting currents in Sr2RuO4

    Superconductivity is a complete loss of electrical resistance. Superconductors are not merely very good metals: superconductivity is a fundamentally different electronic state. In normal metals, electrons move individually, and they collide with defects and vibrations in the lattice. In superconductors, electrons are bound together by an attractive force, which allows them to move together in a correlated way and avoid defects.
    In a very small number of known superconductors, the onset of superconductivity causes spontaneous electrical currents to flow. These currents are very different from those in a normal metal wire: they are built into the ground state of the superconductor, and so they cannot be switched off. For example, in a sheet of a superconducting material, currents might appear that flow around the edge, as shown in the figure.
    This is a very rare form of superconductivity, and it always indicates that the attractive interaction is something unusual. Sr2RuO4 is one famous material where this type of superconductivity is thought to occur. Although the transition temperature is low — Sr2RuO4 superconducts only below 1.5 Kelvin — the reason why it superconducts at all is completely unknown. To explain the superconductivity in this material has become a major test of physicists’ understanding of superconductivity in general. Theoretically, it is very difficult to obtain spontaneous currents in Sr2RuO4 from standard models of superconductivity, and so if they are confirmed then a new model for superconductivity — an attractive force that is not seen in other materials — might be required.
    The way that these electrical currents are detected is subtle. Subatomic particles known as muons are implanted into the sample. The spin of each muon then precesses in whatever magnetic field exists at the muon stopping site. In effect, the muons act as sensitive detectors of magnetic field, that can be placed inside the sample. From such muon implantation experiments it has been found that spontaneous magnetic fields appear when Sr2RuO4 becomes superconducting, which shows that there are spontaneous electrical currents.
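The arithmetic behind muons as field probes is easy to check. The field value below is a generic order of magnitude for spontaneous fields seen in such muon-spin experiments, not a measured value from this study:

```python
# Muon spin precession: the precession (Larmor) frequency is proportional
# to the local magnetic field, f = (gamma_mu / 2*pi) * B, with the muon
# gyromagnetic ratio gamma_mu / 2*pi of about 135.5 MHz per tesla.
GAMMA_MU_OVER_2PI = 135.5e6  # Hz per tesla

def precession_freq(B_tesla):
    """Precession frequency (Hz) of a muon spin in field B (tesla)."""
    return GAMMA_MU_OVER_2PI * B_tesla

# A tiny spontaneous field of ~0.1 gauss (1e-5 T) already gives a
# kHz-scale precession, which is what makes implanted muons such
# sensitive in-sample magnetometers.
print(f"{precession_freq(1e-5):.0f} Hz")
```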
    However, because the signal is subtle, researchers have questioned whether it is in fact real. The onset of superconductivity is a major change in the electronic properties of a material, and perhaps this subtle additional signal appeared only because the measurement apparatus was not properly tuned.
    In this work, researchers at the Max Planck Institute for Chemical Physics of Solids, the Technical University of Dresden, and the Paul Scherrer Institute (Switzerland) have shown that when uniaxial pressure is applied to Sr2RuO4, the spontaneous currents onset at a lower temperature than the superconductivity. In other words, the transition splits into two: first superconductivity, then spontaneous currents. This splitting has not been clearly demonstrated in any other material, and it is important because it shows definitively that the second transition is real. The spontaneous currents must be explained scientifically, not as a consequence of imperfect measurement. This may require a major re-write of our understanding of superconductivity.
    Story Source:
    Materials provided by Max Planck Institute for Chemical Physics of Solids. Note: Content may be edited for style and length.

  • Smart quantum technologies for secure communication

    Researchers from Louisiana State University have introduced a smart quantum technology for the spatial mode correction of single photons. In a paper featured on the cover of the March 2021 issue of Advanced Quantum Technologies, the authors exploit the self-learning and self-evolving features of artificial neural networks to correct the distorted spatial profile of single photons.
    The authors, PhD candidate Narayan Bhusal, postdoctoral researcher Chenglong You, graduate student Mingyuan Hong, undergraduate student Joshua Fabre, and Assistant Professor Omar S. Magaña-Loaiza of LSU — together with collaborators Sanjaya Lohani, Erin M. Knutson, and Ryan T. Glasser of Tulane University and Pengcheng Zhao of Qingdao University of Science and Technology — report on the potential of artificial intelligence to correct spatial modes at the single-photon level.
    “The random phase distortion is one of the biggest challenges in using spatial modes of light in a wide variety of quantum technologies, such as quantum communication, quantum cryptography, and quantum sensing,” said Bhusal. “In this paper, we use artificial neurons to correct distorted spatial modes of light at the single-photon level. Our method is remarkably effective and time-efficient compared to conventional techniques. This is an exciting development for the future of free-space quantum technologies.”
    The newly developed technique boosts the channel capacity of optical communication protocols that rely on structured photons.
    “One important goal of the Quantum Photonics Group at LSU is to develop robust quantum technologies that work under realistic conditions,” said Magaña-Loaiza. “This smart quantum technology demonstrates the possibility of encoding multiple bits of information in a single photon in realistic communication protocols affected by atmospheric turbulence. Our technique has enormous implications for optical communication and quantum cryptography. We are now exploring paths to implement our machine learning scheme in the Louisiana Optical Network Initiative (LONI) to make it smart, secure, and quantum.”
    “We are still in the fairly early stages of understanding the potential for machine learning techniques to play a role in quantum information science,” said Dr. Sara Gamble, program manager at the Army Research Office, an element of DEVCOM ARL. “The team’s result is an exciting step forward in developing this understanding, and it has the potential to ultimately enhance the Army’s sensing and communication capabilities on the battlefield.”
    Story Source:
    Materials provided by Louisiana State University. Note: Content may be edited for style and length.