More stories

  • Mysterious “quantum echo” in superconductors could unlock new tech

    Scientists at the U.S. Department of Energy’s Ames National Laboratory and Iowa State University have discovered an unexpected “quantum echo” in a superconducting material. The discovery provides insight into quantum behaviors that could be harnessed for next-generation quantum sensing and computing technologies.
    Superconductors are materials that carry electricity without resistance. Within these superconductors are collective vibrations known as “Higgs modes.” A Higgs mode is a quantum phenomenon that occurs when the material’s electron potential fluctuates in a way similar to a Higgs boson. These modes appear when a material undergoes a superconducting phase transition.
    Observing these vibrations has been a long-standing challenge for scientists because they exist for a very short time. They also have complex interactions with quasiparticles, which are electron-like excitations that emerge from the breakdown of superconductivity.
    However, using advanced terahertz (THz) spectroscopy techniques, the research team discovered a novel type of quantum echo, called the “Higgs echo,” in superconducting niobium materials used in quantum computing circuits.
    “Unlike conventional echoes observed in atoms or semiconductors, the Higgs echo arises from a complex interaction between the Higgs modes and quasiparticles, leading to unusual signals with distinct characteristics,” explained Jigang Wang, a scientist at Ames Lab and lead of the research team.
    According to Wang, the Higgs echo can remember and reveal hidden quantum pathways within the material. By using precisely timed pulses of THz radiation, his team was able to observe these echoes. Using these THz radiation pulses, they can also use the echoes to encode, store, and retrieve quantum information embedded within this superconducting material.
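    The timing of a conventional two-pulse echo offers a useful baseline for what the team measured. As a minimal illustrative sketch (a generic textbook model, not the team’s analysis), the Python snippet below drives an inhomogeneously broadened ensemble of oscillators with pulses at t = 0 and t = τ; the ensemble dephases, is phase-conjugated by the second pulse, and rephases into an echo at t ≈ 2τ. The reported Higgs echo is notable precisely because it deviates from this conventional behavior.
    ```python
    import numpy as np

    # Toy model of a conventional two-pulse echo (not the Higgs echo itself):
    # an inhomogeneous ensemble of oscillators dephases after a pulse at t = 0;
    # a second pulse at t = tau conjugates each phase, so the ensemble
    # rephases, emitting an echo, at t = 2 * tau.
    rng = np.random.default_rng(1)
    detunings = rng.normal(0.0, 1.0, 5000)   # random frequency offsets (arb. units)
    tau = 5.0                                # inter-pulse delay (arb. units)
    times = np.linspace(0, 4 * tau, 2000)

    signal = np.empty_like(times)
    for i, t in enumerate(times):
        # Before the second pulse each oscillator accumulates phase d * t;
        # afterwards the conjugated phase evolves as d * (t - 2 * tau).
        phase = detunings * (t if t < tau else t - 2 * tau)
        signal[i] = abs(np.exp(1j * phase).mean())  # macroscopic coherence

    late = len(times) // 4                   # search after the second pulse
    peak = times[signal[late:].argmax() + late]
    print(f"echo peaks at t = {peak:.2f} (expected 2*tau = {2 * tau:.2f})")
    ```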
    This research demonstrates the ability to control and observe quantum coherence in superconductors and paves the way for potential new methods of quantum information storage and processing.
    “Understanding and controlling these unique quantum echoes brings us a step closer to practical quantum computing and advanced quantum sensing technologies,” said Wang.
    This project was partially supported through the Superconducting Quantum Materials and Systems Center (SQMS).

  • Could your smartphone detect mental health risks before you notice them?

    Data passively collected from cell phone sensors can identify behaviors associated with a host of mental health disorders, from agoraphobia to generalized anxiety disorder to narcissistic personality disorder. New findings show that the same data can identify behaviors associated with a wider array of mental disorder symptoms.
    Colin E. Vize, assistant professor in the Department of Psychology in Pitt’s Kenneth P. Dietrich School of Arts and Sciences, is co-PI on this research, which broadens the scope of how clinicians might one day use this data to treat their patients.
    The work was led by first author Whitney Ringwald (SOC WK ’18G, A&S ’21G), a professor at the University of Minnesota who completed her graduate training at Pitt. Also on the team were former Pitt Professor Aidan Wright, now at the University of Michigan, and Grant King, one of Wright’s graduate students.
    “This is an important step in the right direction,” Vize said, “but there is a lot of work to be done before we can potentially realize any of the clinical promises of using sensors on smartphones to help inform assessment and treatment.”
    In theory, an app that could make use of such data would give clinicians access to substantially more, and more reliable, data about their patients’ lives between visits.
    “We’re not always the best reporters, we often forget things,” Vize said of filling out self-assessments. “But with passive sensing, we might be able to collect data unobtrusively, as people are going about their daily lives, without having to ask a lot of questions.”
    As a first step toward realizing such a tool, the researchers investigated whether they could infer if people were behaving in ways associated with certain mental health conditions. Previous research has connected passive sensor readings with behaviors that point to specific illnesses, including depression and post-traumatic stress disorder. This new work, published July 3 in the journal JAMA Network Open, expands upon that research, showing that sensor data can also be linked to symptoms that are not specific to any one mental health condition.

    This is important, Vize said, because many behaviors are associated with more than one disorder, and different people with the same disorder can look, act and feel very differently.
    “The disorder categories tend to not carve nature at its joints,” he said. “We can think more transdiagnostically, and that gives us a little more accurate picture of some of the symptoms that people are experiencing.”
    For this study, Vize and a team of researchers used a statistical analysis tool called Mplus to find correlations between sensor data and mental health symptoms reported at baseline. The scientists sought to determine whether sensor data correlated with a set of broad, evidence-based symptom dimensions: internalizing, detachment, disinhibition, antagonism, thought disorder and somatoform, or unexplained physical, symptoms.
    In addition to the six dimensions, they also looked at what has been called the p-factor. This is not a specific behavior or symptom; rather, it represents an ineffable, shared feature that runs across all kinds of mental health symptoms.
    “You can think about it sort of like a Venn diagram,” Vize said. If all the symptoms associated with all mental health issues were circles, the p-factor is the space where they all overlap. It is not a behavior in and of itself. “It’s essentially what’s shared across all dimensions.”
    The researchers made use of the Intensive Longitudinal Investigation of Alternative Diagnostic Dimensions study (ILIADD), which was conducted in Pittsburgh in the spring of 2023. From ILIADD, they analyzed the data of 557 people who had filled out self-assessments and shared data from their cell phones, including (but not limited to): GPS data indicating how long people stayed home and the maximum distance they traveled from home; time spent walking, running and stationary; how long their screens were on; how many calls they received and made; battery status; and sleep time. Using an app developed by researchers at the University of Oregon, the team was able to relate the sensor data to various mental health symptoms. Comparing the app’s findings to questionnaires filled out by participants, Vize and team determined that the six dimensions of mental health symptoms, which reflect symptoms represented among many disorders, did correlate with the sensor data.
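    The published analysis was run in Mplus, a commercial latent-variable modeling package. Purely as an illustration of the kind of baseline computation involved (not the study’s actual model), the Python sketch below cross-correlates passive-sensor features with symptom-dimension scores; the data and every column name are invented for the example.
    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    n = 557   # participants, matching the ILIADD subsample size

    # Synthetic stand-in data; in the study these came from phones and surveys.
    sensor = pd.DataFrame({
        "hours_at_home": rng.normal(14, 4, n),
        "minutes_walking": rng.normal(60, 20, n),
        "screen_on_hours": rng.normal(5, 2, n),
        "calls_per_day": rng.poisson(4, n).astype(float),
        "sleep_hours": rng.normal(7, 1, n),
    })
    symptoms = pd.DataFrame(
        rng.normal(0, 1, (n, 6)),
        columns=["internalizing", "detachment", "disinhibition",
                 "antagonism", "thought_disorder", "somatoform"],
    )

    # Cross-correlation matrix: each sensor feature vs. each symptom dimension.
    df = pd.concat([sensor, symptoms], axis=1)
    corr = df.corr().loc[sensor.columns, symptoms.columns]
    print(corr.round(2))
    ```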

    Interestingly, they also found that the sensor data correlated with the p-factor, a general marker of mental health problems. The implications of these findings are several-fold: ultimately, it may one day be possible to use this kind of technology to better understand symptoms in a patient whose presentation doesn’t fit the category of any single disorder.
    But for now, these data do not say anything about individuals’ mental health; they deal in averages. Mental health is complex, and behavior varies wildly. “These sensor analyses may more accurately describe some people than others,” Vize said.
    That’s one of the reasons Vize doesn’t see this kind of technology ever replacing a human clinician. “A lot of work in this area is focused on getting to the point where we can talk about, ‘How does this potentially enhance or supplement existing clinical care?’
    “Because I definitely don’t think it can replace treatment. It would be more of an additional tool in the clinician’s toolbox.”

  • This new camera sees the invisible in 3D without lenses

    Researchers have used the centuries-old idea of pinhole imaging to create a high-performance mid-infrared imaging system without lenses. The new camera can capture extremely clear pictures over a large range of distances and in low light, making it useful for situations that are challenging for traditional cameras.
    “Many useful signals are in the mid-infrared, such as heat and molecular fingerprints, but cameras working at these wavelengths are often noisy, expensive or require cooling,” said research team leader Heping Zeng from East China Normal University. “Moreover, traditional lens-based setups have a limited depth of field and need careful design to minimize optical distortions. We developed a high-sensitivity, lens-free approach that delivers a much larger depth of field and field of view than other systems.”
    In Optica, Optica Publishing Group’s journal for high-impact research, the researchers describe how they use light to form a tiny “optical pinhole” inside a nonlinear crystal, which also turns the infrared image into a visible one. Using this setup, they acquired clear mid-infrared images with a depth of field of over 35 cm and a field of view of more than 6 cm. They were also able to use the system to acquire 3D images.
    “This approach can enhance night-time safety, industrial quality control and environmental monitoring,” said research team member Kun Huang from East China Normal University. “And because it uses simpler optics and standard silicon sensors, it could eventually make infrared imaging systems more affordable, portable and energy efficient. It can even be applied with other spectral bands such as the far-infrared or terahertz wavelengths, where lenses are hard to make or perform poorly.”
    Pinhole imaging reimagined
    Pinhole imaging is one of the oldest image-making methods, first described by the Chinese philosopher Mozi in the 4th century BC. A traditional pinhole camera works by letting light pass through a tiny hole in a lightproof box, projecting an inverted image of the outside scene onto the opposite surface inside. Unlike lens-based imaging, pinhole imaging avoids distortion, has an infinite depth of field and works across a wide range of wavelengths.
    To bring these advantages to a modern infrared imaging system, the researchers used an intense laser to form an optical hole, or artificial aperture, inside a nonlinear crystal. Because of its special optical properties, the crystal converts the infrared image into visible light, so that a standard silicon camera can record it.

    The researchers say that the use of a specially designed crystal with a chirped-period structure, which can accept light rays from a broad range of directions, was key to achieving a large field of view. Also, the upconversion detection method naturally suppresses noise, which allows it to work even in very low light conditions.
    “Lensless nonlinear pinhole imaging is a practical way to achieve distortion-free, large-depth, wide-field-of-view mid-infrared imaging with high sensitivity,” said Huang. “The ultrashort synchronized laser pulses also provide a built-in ultrafast optical time gate that can be used for sensitive, time-of-flight depth imaging, even with very few photons.”
    After determining that an optical pinhole radius of about 0.20 mm produced sharp, well-defined details, the researchers used this aperture size to image targets 11 cm, 15 cm and 19 cm away. They achieved sharp imaging at the mid-infrared wavelength of 3.07 μm across all three distances, confirming a large depth range. They were also able to keep images sharp for objects placed up to 35 cm away, demonstrating a large depth of field.
    3D imaging without lenses
    The investigators then used their setup for two types of 3D imaging. For 3D time-of-flight imaging, they imaged a matte ceramic rabbit by using synchronized ultrafast pulses as an optical gate and were able to reconstruct the 3D shape with micron-level axial precision. Even when the input was reduced to about 1.5 photons per pulse — simulating very low-light conditions — the method still produced 3D images after correlation-based denoising.
    They also performed two-snapshot depth imaging by taking two pictures of a stacked “ECNU” target at slightly different object distances and using those to calculate the true sizes and depths. With this method, they were able to measure the depth of the objects over a range of about 6 centimeters, without using complex pulsed timing techniques.
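    The geometry behind two-snapshot depth imaging follows from ideal pinhole projection: image size scales as the sensor distance divided by the object distance, so two images taken with a known shift in object distance determine both the true size and the depth. The sketch below assumes that idealized model with made-up numbers; the paper’s actual reconstruction may differ in detail.
    ```python
    def depth_from_two_snapshots(s1, s2, delta, v):
        """Recover object distance and true size from ideal pinhole projection.

        s1, s2 : measured image sizes with the object at u and u + delta
        delta  : known shift in object distance between the two snapshots
        v      : fixed pinhole-to-sensor distance
        Ideal pinhole: image size s = S * v / u, so s1 / s2 = (u + delta) / u.
        """
        u = delta * s2 / (s1 - s2)   # object distance at the first snapshot
        size = s1 * u / v            # true object size
        return u, size

    # Example with invented numbers: a 2 cm target at 15 cm, sensor plane at 5 cm.
    s1 = 2.0 * 5.0 / 15.0            # image size at u = 15 cm
    s2 = 2.0 * 5.0 / 17.0            # image size after moving the object 2 cm back
    print(depth_from_two_snapshots(s1, s2, delta=2.0, v=5.0))  # -> (15.0, 2.0)
    ```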
    The researchers note that the mid-infrared nonlinear pinhole imaging system is still a proof-of-concept that requires a relatively complex and bulky laser setup. However, as new nonlinear materials and integrated light sources are developed, the technology should become far more compact and easier to deploy.
    They are now working to make the system faster, more sensitive and adaptable to different imaging scenarios. Their plans include boosting conversion efficiency, adding dynamic control to reshape the optical pinhole for different scenes, and extending the camera’s operation across a wider mid-infrared range.

  • The quantum internet just went live on Verizon’s network

    In a first-of-its-kind experiment, engineers at the University of Pennsylvania brought quantum networking out of the lab and onto commercial fiber-optic cables using the same Internet Protocol (IP) that powers today’s web. Reported in Science, the work shows that fragile quantum signals can run on the same infrastructure that carries everyday online traffic. The team tested their approach on Verizon’s campus fiber-optic network.
    The Penn team’s tiny “Q-chip” coordinates quantum and classical data and, crucially, speaks the same language as the modern web. That approach could pave the way for a future “quantum internet,” which scientists believe may one day be as transformative as the dawn of the online era.
    Quantum signals rely on pairs of “entangled” particles, linked so closely that measuring one instantly determines the state of the other. Harnessing that property could allow quantum computers to link up and pool their processing power, enabling advances like faster, more energy-efficient AI or designing new drugs and materials beyond the reach of today’s supercomputers.
    Penn’s work shows, for the first time on live commercial fiber, that a chip can not only send quantum signals but also automatically correct for noise, bundle quantum and classical data into standard internet-style packets, and route them using the same addressing system and management tools that connect everyday devices online.
    “By showing an integrated chip can manage quantum signals on a live commercial network like Verizon’s, and do so using the same protocols that run the classical internet, we’ve taken a key step toward larger-scale experiments and a practical quantum internet,” says Liang Feng, Professor in Materials Science and Engineering (MSE) and in Electrical and Systems Engineering (ESE), and the Science paper’s senior author.
    The Challenges of Scaling the Quantum Internet
    Erwin Schrödinger, who coined the term “quantum entanglement,” famously related the concept to a cat hidden in a box. If the lid is closed, and the box also contains radioactive material, the cat could be alive or dead. One way to interpret the situation is that the cat is both alive and dead; only opening the box confirms the cat’s state.

    That paradox is roughly analogous to the unique nature of quantum particles. Once measured, they lose their unusual properties, which makes scaling a quantum network extremely difficult.
    “Normal networks measure data to guide it towards the ultimate destination,” says Robert Broberg, a doctoral student in ESE and coauthor of the paper. “With purely quantum networks, you can’t do that, because measuring the particles destroys the quantum state.”
    Coordinating Classical and Quantum Signals
    To get around this obstacle, the team developed the “Q-Chip” (short for “Quantum-Classical Hybrid Internet by Photonics”) to coordinate “classical” signals, made of regular streams of light, and quantum particles. “The classical signal travels just ahead of the quantum signal,” says Yichi Zhang, a doctoral student in MSE and the paper’s first author. “That allows us to measure the classical signal for routing, while leaving the quantum signal intact.”
    In essence, the new system works like a railway, pairing regular light locomotives with quantum cargo. “The classical ‘header’ acts like the train’s engine, while the quantum information rides behind in sealed containers,” says Zhang. “You can’t open the containers without destroying what’s inside, but the engine ensures the whole train gets where it needs to go.”
    Because the classical header can be measured, the entire system can follow the same “IP” or “Internet Protocol” that governs today’s internet traffic. “By embedding quantum information in the familiar IP framework, we showed that a quantum internet could literally speak the same language as the classical one,” says Zhang. “That compatibility is key to scaling using existing infrastructure.”
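    As a purely illustrative sketch (the article does not describe the actual packet format, and every field name below is invented), routing on a classical header while leaving a quantum payload untouched might look like this:
    ```python
    from dataclasses import dataclass

    @dataclass
    class QuantumPayload:
        """Opaque handle to entangled photons in flight; reading it would
        collapse the quantum state, so the network never inspects it."""
        channel_id: int

    @dataclass
    class HybridPacket:
        src: str             # IP-style source address (illustrative)
        dst: str             # IP-style destination address
        header_pilot: float  # classical light measured for routing/correction
        payload: QuantumPayload

    def route(packet: HybridPacket, routing_table: dict[str, str]) -> str:
        # Only the classical header is read; the quantum payload rides behind,
        # like sealed cargo behind a locomotive, and is forwarded unmeasured.
        return routing_table[packet.dst]

    table = {"10.0.0.7": "port-2"}
    pkt = HybridPacket(src="10.0.0.1", dst="10.0.0.7",
                       header_pilot=0.93, payload=QuantumPayload(channel_id=42))
    print(route(pkt, table))   # -> port-2
    ```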
    Adapting Quantum Technology to the Real World

    One of the greatest challenges to transmitting quantum particles on commercial infrastructure is the variability of real-world transmission lines. Unlike laboratory environments, which can maintain ideal conditions, commercial networks frequently encounter changes in temperature, thanks to weather, as well as vibrations from human activities like construction and transportation, not to mention seismic activity.
    To counteract this, the researchers developed an error-correction method that takes advantage of the fact that interference to the classical header will affect the quantum signal in a similar fashion. “Because we can measure the classical signal without damaging the quantum one,” says Feng, “we can infer what corrections need to be made to the quantum signal without ever measuring it, preserving the quantum state.”
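    The principle can be caricatured in a few lines of Python. In this toy model (a cartoon under stated assumptions, not the paper’s scheme), the fiber imposes a single common phase drift on both signals; measuring the classical pilot yields an estimate of the drift, and applying its inverse restores the quantum state without ever measuring it:
    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assume the shared fiber imparts one common phase drift to both the
    # classical header and the quantum payload travelling just behind it.
    true_drift = rng.uniform(-np.pi, np.pi)

    # Classical pilot: measured directly (destructive measurement is fine here).
    pilot = np.exp(1j * true_drift) * (1 + 0.01 * rng.standard_normal())
    estimated_drift = np.angle(pilot)

    # Quantum payload, modelled as a single-qubit state vector: apply the
    # inverse of the inferred drift WITHOUT ever measuring the payload itself.
    qubit = np.array([1, 1j]) / np.sqrt(2)          # example state
    drifted = np.array([qubit[0], qubit[1] * np.exp(1j * true_drift)])
    corrected = np.array([drifted[0], drifted[1] * np.exp(-1j * estimated_drift)])

    fidelity = abs(np.vdot(qubit, corrected)) ** 2
    print(f"fidelity after correction: {fidelity:.4f}")   # close to 1.0
    ```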
    In testing, the system maintained transmission fidelities above 97%, showing that it could overcome the noise and instability that usually destroy quantum signals outside the lab. And because the chip is made of silicon and fabricated using established techniques, it could be mass produced, making the new approach easy to scale.
    “Our network has just one server and one node, connecting two buildings, with about a kilometer of fiber-optic cable installed by Verizon between them,” says Feng. “But all you need to do to expand the network is fabricate more chips and connect them to Philadelphia’s existing fiber-optic cables.”
    The Future of the Quantum Internet
    The main barrier to scaling quantum networks beyond a metro area is that quantum signals cannot yet be amplified without destroying their entanglement.
    While some teams have shown that “quantum keys,” special codes for ultra-secure communication, can travel long distances over ordinary fiber, those systems use weak coherent light to generate random numbers that cannot be copied, a technique that is highly effective for security applications but not sufficient to link actual quantum processors.
    Overcoming this challenge will require new devices, but the Penn study provides an important early step: showing how a chip can run quantum signals over existing commercial fiber using internet-style packet routing, dynamic switching and on-chip error mitigation that work with the same protocols that manage today’s networks.
    “This feels like the early days of the classical internet in the 1990s, when universities first connected their networks,” says Broberg. “That opened the door to transformations no one could have predicted. A quantum internet has the same potential.”
    This study was conducted at the University of Pennsylvania School of Engineering and Applied Science and was supported by the Gordon and Betty Moore Foundation (GBMF12960 and DOI 10.37807), Office of Naval Research (N00014-23-1-2882), National Science Foundation (DMR-2323468), Olga and Alberico Pompa endowed professorship, and PSC-CUNY award (ENHC-54-93).
    Additional co-authors include Alan Zhu, Gushi Li and Jonathan Smith of the University of Pennsylvania, and Li Ge of the City University of New York.

  • Scientists unveil breakthrough pixel that could put holograms on your smartphone

    New research from the University of St Andrews paves the way for holographic technology, with the potential to transform smart devices, communication, gaming and entertainment.
    In a study published recently in Light: Science & Applications, researchers from the School of Physics and Astronomy created a new optoelectronic device that combines holographic metasurfaces (HMs) with organic light-emitting diodes (OLEDs).
    Until now, holograms have been created using lasers. However, the researchers found that using OLEDs and HMs offers a simpler and more compact approach that is potentially cheaper and easier to apply, overcoming the main barriers to the wider use of hologram technology.
    Organic light-emitting diodes are thin-film devices widely used to make the colored pixels in mobile phone displays and some TVs. As flat, surface-emitting light sources, OLEDs are also used in emerging applications such as optical wireless communications, biophotonics and sensing, where their ability to integrate with other technologies makes them good candidates for miniaturized light-based platforms.
    A holographic metasurface is a thin, flat array of tiny structures called meta-atoms – roughly a thousandth of the width of a strand of hair – designed to manipulate the properties of light. Metasurfaces can create holograms, and their uses span diverse fields such as data storage, anti-counterfeiting, optical displays, high-numerical-aperture lenses (for example, in optical microscopy) and sensing.
    This, however, is the first time both have been used together to produce the basic building block of a holographic display.
    The researchers found that when each meta-atom is carefully shaped to control the properties of the light passing through it, it behaves as a pixel of the HM. As light travels through the HM, its properties are slightly modified at each pixel.

    Thanks to these modifications, it is possible to create a pre-designed image on the other side, exploiting the principle of light interference, whereby light waves create complicated patterns when they interact with each other.
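    How a pre-designed image can emerge from per-pixel phase modifications is illustrated by a standard phase-only hologram computation, the Gerchberg-Saxton algorithm. The sketch below is a generic textbook routine, not the St Andrews design method, and the target image is arbitrary.
    ```python
    import numpy as np

    def gerchberg_saxton(target, iterations=50):
        """Compute a phase-only hologram whose far-field intensity approximates
        `target`. Each hologram pixel only shifts the phase of the light passing
        through it, which is the role the meta-atoms play in the metasurface."""
        amp = np.sqrt(target / target.sum())        # desired far-field amplitude
        phase = np.random.default_rng(0).uniform(0, 2 * np.pi, target.shape)
        for _ in range(iterations):
            far_field = amp * np.exp(1j * phase)     # impose target amplitude
            holo = np.fft.ifft2(far_field)           # back to hologram plane
            holo = np.exp(1j * np.angle(holo))       # phase-only constraint
            phase = np.angle(np.fft.fft2(holo))      # updated far-field phase
        return np.angle(holo)                        # phase map per pixel

    # Arbitrary example target: a bright square on a dark background.
    target = np.zeros((64, 64))
    target[24:40, 24:40] = 1.0
    hologram_phase = gerchberg_saxton(target)
    print(hologram_phase.shape)   # 64x64 phase values, one per meta-atom "pixel"
    ```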
    Professor Ifor Samuel, from the School of Physics and Astronomy, said: “We are excited to demonstrate this new direction for OLEDs.  By combining OLEDs with metasurfaces, we also open a new way of generating holograms and shaping light.”
    Andrea Di Falco, professor in nano-photonics at the School of Physics and Astronomy, said: “Holographic metasurfaces are one of the most versatile material platforms to control light. With this work, we have removed one of the technological barriers that prevent the adoption of metamaterials in everyday applications. This breakthrough will enable a step change in the architecture of holographic displays for emerging applications, for example, in virtual and augmented reality.”
    Professor Graham Turnbull, from the School of Physics and Astronomy, said: “OLED displays normally need thousands of pixels to create a simple picture. This new approach allows a complete image to be projected from a single OLED pixel!”
    Until now, researchers could only make very simple shapes with OLEDs, which limited their usability in some applications. However, this breakthrough provides a path toward a miniaturized and highly integrated metasurface display.

  • Scientists brew “quantum ink” to power next-gen night vision

    Manufacturers of infrared cameras face a growing problem: the toxic heavy metals in today’s infrared detectors are increasingly banned under environmental regulations, forcing companies to choose between performance and compliance.
    This regulatory pressure is slowing the broader adoption of infrared detectors across civilian applications, just as demand in fields like autonomous vehicles, medical imaging and national security is accelerating.
    In a paper published in ACS Applied Materials & Interfaces, researchers at NYU Tandon School of Engineering reveal a potential solution that uses environmentally friendly quantum dots to detect infrared light without relying on mercury, lead, or other restricted materials.
    The researchers use colloidal quantum dots, which upend the age-old, expensive and tedious processing of infrared detectors. Traditional devices are fabricated through slow, ultra-precise methods that place atoms almost one by one across the pixels of a detector — much like assembling a puzzle piece by piece under a microscope.
    Colloidal quantum dots are instead synthesized entirely in solution, more like brewing ink, and can be deposited using scalable coating techniques similar to those used in roll-to-roll manufacturing for packaging or newspapers. This shift from painstaking assembly to solution-based processing dramatically reduces manufacturing costs and opens the door to widespread commercial applications.
    “The industry is facing a perfect storm where environmental regulations are tightening just as demand for infrared imaging is exploding,” said Ayaskanta Sahu, associate professor in the Department of Chemical and Biomolecular Engineering (CBE) at NYU Tandon and the study’s senior author. “This creates real bottlenecks for companies trying to scale up production of thermal imaging systems.”
    Another challenge the researchers addressed was making the quantum dot ink conductive enough to relay signals from incoming light. They achieved this using a technique called solution-phase ligand exchange, which tailors the quantum dot surface chemistry to enhance performance in electronic devices. Unlike traditional fabrication methods that often leave cracked or uneven films, this solution-based process yields smooth, uniform coatings in a single step — ideal for scalable manufacturing.

    The resulting devices show remarkable performance: they respond to infrared light on the microsecond timescale — for comparison, the human eye blinks at speeds hundreds of times slower — and they can detect signals as faint as a nanowatt of light.
    “What excites me is that we can take a material long considered too difficult for real devices and engineer it to be more competitive,” said graduate researcher Shlok J. Paul, lead author on the study. “With more time this material has the potential to shine deeper in the infrared spectrum where few materials exist for such tasks.”
    This work adds to earlier research from the same lead researchers that developed new transparent electrodes using silver nanowires. Those electrodes remain highly transparent to infrared light while efficiently collecting electrical signals, addressing one component of the infrared camera system.
    Combined with their earlier transparent electrode work, these developments address both major components of infrared imaging systems. The quantum dots provide environmentally compliant sensing capability, while the transparent electrodes handle signal collection and processing.
    This combination addresses challenges in large-area infrared imaging arrays, which require high-performance detection across wide areas and signal readout from millions of individual detector pixels. The transparent electrodes allow light to reach the quantum dot detectors while providing electrical pathways for signal extraction.
    “Every infrared camera in a Tesla or smartphone needs detectors that meet environmental standards while remaining cost-effective,” Sahu said. “Our approach could help make these technologies much more accessible.”
    The performance still falls short of the best heavy-metal-based detectors in some measurements. However, the researchers expect continued advances in quantum dot synthesis and device engineering could reduce this gap.
    In addition to Sahu and Paul, the paper’s authors are Letian Li, Zheng Li, Thomas Kywe, and Ana Vataj, all from NYU Tandon CBE. The work was supported by the Office of Naval Research and the Defense Advanced Research Projects Agency.

  • Caltech’s massive 6,100-qubit array brings the quantum future closer

    Quantum computers will need large numbers of qubits to tackle challenging problems in physics, chemistry, and beyond. Unlike classical bits, qubits can exist in two states at once — a phenomenon called superposition. This quirk of quantum physics gives quantum computers the potential to perform certain complex calculations better than their classical counterparts, but it also means the qubits are fragile. To compensate, researchers are building quantum computers with extra, redundant qubits to correct any errors. That is why robust quantum computers will require hundreds of thousands of qubits.
    Now, in a step toward this vision, Caltech physicists have created the largest qubit array ever assembled: 6,100 neutral-atom qubits trapped in a grid by lasers. Previous arrays of this kind contained only hundreds of qubits.
    This milestone comes amid a rapidly growing race to scale up quantum computers. There are several approaches in development, including those based on superconducting circuits, trapped ions, and neutral atoms, as used in the new study.
    “This is an exciting moment for neutral-atom quantum computing,” says Manuel Endres, professor of physics at Caltech. “We can now see a pathway to large error-corrected quantum computers. The building blocks are in place.” Endres is the principal investigator of the research published on September 24 in Nature. Three Caltech graduate students led the study: Hannah Manetsch, Gyohei Nomura, and Elie Bataille.
    The team used optical tweezers — highly focused laser beams — to trap thousands of individual cesium atoms in a grid. To build the array of atoms, the researchers split a laser beam into 12,000 tweezers, which together held 6,100 atoms in a vacuum chamber. “On the screen, we can actually see each qubit as a pinpoint of light,” Manetsch says. “It’s a striking image of quantum hardware at a large scale.”
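    The gap between 12,000 tweezers and 6,100 trapped atoms is consistent with the roughly fifty-percent loading typical of single-atom tweezers. As a toy illustration only (the experiment’s actual loading and rearrangement physics is more involved, and the probability below is an assumption), a simple Bernoulli model reproduces the numbers:
    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_tweezers = 12_000
    p_load = 0.5   # assumed per-tweezer loading probability (illustrative)

    # Each optical tweezer independently captures an atom with probability
    # p_load; collisional blockade prevents two atoms from sharing one trap.
    loaded = rng.random(n_tweezers) < p_load
    print(f"{loaded.sum()} of {n_tweezers} tweezers loaded")  # ~6,000, cf. 6,100
    ```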
    A key achievement was showing that this larger scale did not come at the expense of quality. Even with more than 6,000 qubits in a single array, the team kept them in superposition for about 13 seconds — nearly 10 times longer than what was possible in previous similar arrays — while manipulating individual qubits with 99.98 percent accuracy. “Large scale, with more atoms, is often thought to come at the expense of accuracy, but our results show that we can do both,” Nomura says. “Qubits aren’t useful without quality. Now we have quantity and quality.”
    The team also demonstrated that they could move the atoms hundreds of micrometers across the array while maintaining superposition. The ability to shuttle qubits is a key feature of neutral-atom quantum computers that enables more efficient error correction compared with traditional, hard-wired platforms like superconducting qubits.

    Manetsch compares the task of moving the individual atoms while keeping them in a state of superposition to balancing a glass of water while running. “Trying to hold an atom while moving is like trying to not let the glass of water tip over. Trying to also keep the atom in a state of superposition is like being careful to not run so fast that water splashes over,” she says.
    The next big milestone for the field is implementing quantum error correction at the scale of thousands of physical qubits, and this work shows that neutral atoms are a strong candidate to get there. “Quantum computers will have to encode information in a way that’s tolerant to errors, so we can actually do calculations of value,” Bataille says. “Unlike in classical computers, qubits can’t simply be copied due to the so-called no-cloning theorem, so error correction has to rely on more subtle strategies.”
    Looking ahead, the researchers plan to link the qubits in their array together in a state of entanglement, where particles become correlated and behave as one. Entanglement is a necessary step for quantum computers to move beyond simply storing information in superposition; entanglement will allow them to begin carrying out full quantum computations. It is also what gives quantum computers their ultimate power — the ability to simulate nature itself, where entanglement shapes the behavior of matter at every scale. The goal is clear: to harness entanglement to unlock new scientific discoveries, from revealing new phases of matter to guiding the design of novel materials and modeling the quantum fields that govern space-time.
    “It’s exciting that we are creating machines to help us learn about the universe in ways that only quantum mechanics can teach us,” Manetsch says.
    The new study, “A tweezer array with 6100 highly coherent atomic qubits,” was funded by the Gordon and Betty Moore Foundation, the Weston Havens Foundation, the National Science Foundation via its Graduate Research Fellowship Program and the Institute for Quantum Information and Matter (IQIM) at Caltech, the Army Research Office, the U.S. Department of Energy including its Quantum Systems Accelerator, the Defense Advanced Research Projects Agency, the Air Force Office for Scientific Research, the Heising-Simons Foundation, and the AWS Quantum Postdoctoral Fellowship. Other authors include Caltech’s Kon H. Leung, the AWS Quantum senior postdoctoral scholar research associate in physics, as well as former Caltech postdoctoral scholar Xudong Lv, now at the Chinese Academy of Sciences.

  • AI-powered smart bandage heals wounds 25% faster

    As a wound heals, it goes through several stages: clotting to stop bleeding, immune system response, scabbing, and scarring.
    A wearable device called “a-Heal,” designed by engineers at the University of California, Santa Cruz, aims to optimize each stage of the process. The system uses a tiny camera and AI to detect the stage of healing and deliver a treatment in the form of medication or an electric field. The system responds to the unique healing process of the patient, offering personalized treatment.
    The portable, wireless device could make wound therapy more accessible to patients in remote areas or with limited mobility. Initial preclinical results, published in the journal npj Biomedical Innovations, show the device successfully speeds up the healing process.
    Designing a-Heal
    A team of UC Santa Cruz and UC Davis researchers, sponsored by the DARPA-BETR program and led by UC Santa Cruz Baskin Engineering Endowed Chair and Professor of Electrical and Computer Engineering (ECE) Marco Rolandi, designed a device that combines a camera, bioelectronics, and AI for faster wound healing. The integration of all three in one device makes it a “closed-loop system” — one of the first of its kind for wound healing, as far as the researchers are aware.
    “Our system takes all the cues from the body, and with external interventions, it optimizes the healing progress,” Rolandi said.
    The device uses an onboard camera, developed by Associate Professor of ECE Mircea Teodorescu and described in a Communications Biology study, to take photos of the wound every two hours. The photos are fed into a machine learning (ML) model, which the researchers call the “AI physician,” developed by Associate Professor of Applied Mathematics Marcella Gomez and run on a nearby computer.

    “It’s essentially a microscope in a bandage,” Teodorescu said. “Individual images say little, but over time, continuous imaging lets AI spot trends, wound healing stages, flag issues, and suggest treatments.”
    The AI physician uses the image to diagnose the wound stage and compares that to where the wound should be along a timeline of optimal wound healing. If the image reveals a lag, the ML model applies a treatment: either medicine, delivered via bioelectronics, or an electric field, which can enhance cell migration toward wound closure.
    The treatment topically delivered through the device is fluoxetine, a selective serotonin reuptake inhibitor that controls serotonin levels in the wound and improves healing by decreasing inflammation and increasing wound tissue closure. The dose, determined in preclinical studies by the Isseroff group at UC Davis to optimize healing, is administered by bioelectronic actuators on the device, developed by Rolandi. An electric field, optimized to improve healing in prior work by UC Davis’ Min Zhao and Roslyn Rivkah Isseroff, is also delivered through the device.
    The AI physician determines the optimal dosage of medication to deliver and the magnitude of the applied electric field. After the therapy has been applied for a certain period of time, the camera takes another image, and the process starts again.
    While in use, the device transmits images and data such as healing rate to a secure web interface, so a human physician can intervene manually and fine-tune treatment as needed. The device attaches directly to a commercially available bandage for convenient and secure use.
    To assess the potential for clinical use, the UC Davis team tested the device in preclinical wound models. In these studies, wounds treated with a-Heal followed a healing trajectory about 25% faster than standard of care. These findings highlight the promise of the technology not only for accelerating closure of acute wounds, but also for jump-starting stalled healing in chronic wounds.

    AI reinforcement
    The AI model used for this system, developed by a team led by Associate Professor of Applied Mathematics Marcella Gomez, uses a reinforcement learning approach, described in a study in the journal Bioengineering, to mimic the diagnostic approach used by physicians.
    Reinforcement learning is a technique in which a model is designed to fulfill a specific end goal, learning through trial and error how to best achieve that goal. In this context, the model is given a goal of minimizing time to wound closure, and is rewarded for making progress toward that goal. It continually learns from the patient and adapts its treatment approach.
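    As a deliberately simplified illustration of that loop (a toy bandit-style learner, not the team’s published model; the dose levels and response function below are invented), trial-and-error dose selection rewarded by reductions in healing lag looks like this:
    ```python
    import random

    random.seed(0)
    DOSES = [0.0, 0.5, 1.0, 2.0]     # candidate dose levels (arbitrary units)
    q = {d: 0.0 for d in DOSES}      # running value estimate per dose
    counts = {d: 0 for d in DOSES}

    def healing_response(dose):
        """Stand-in for the wound: returns the reduction in healing lag after
        one treatment interval. Dose 1.0 is secretly best here; in the real
        system the feedback would come from the camera images."""
        return 1.0 - abs(dose - 1.0) + random.gauss(0, 0.1)

    for step in range(200):
        # Epsilon-greedy: mostly exploit the best-looking dose, sometimes explore.
        dose = random.choice(DOSES) if random.random() < 0.1 else max(q, key=q.get)
        reward = healing_response(dose)
        counts[dose] += 1
        q[dose] += (reward - q[dose]) / counts[dose]   # incremental mean update

    print(max(q, key=q.get))   # converges to the dose that best reduces the lag
    ```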
    The reinforcement learning model is guided by an algorithm that Gomez and her students created called Deep Mapper, described in a preprint study, which processes wound images to quantify the stage of healing in comparison to normal progression, mapping it along the trajectory of healing. As time passes with the device on a wound, it learns a linear dynamic model of the past healing and uses that to forecast how the healing will continue to progress.
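    The forecasting step can be caricatured as fitting a linear dynamical model to the trajectory so far and rolling it forward. Below is a minimal least-squares version of the idea with made-up one-dimensional “healing stage” data; the actual algorithm operates on features extracted from wound images.
    ```python
    import numpy as np

    # Made-up healing-stage trajectory (fraction of the way to closure over time).
    x = np.array([0.05, 0.12, 0.20, 0.27, 0.35, 0.41, 0.48])

    # Fit linear dynamics x[t+1] = a * x[t] + b by least squares on past pairs.
    A = np.column_stack([x[:-1], np.ones(len(x) - 1)])
    (a, b), *_ = np.linalg.lstsq(A, x[1:], rcond=None)

    # Roll the fitted model forward to forecast the next few intervals.
    forecast, state = [], x[-1]
    for _ in range(3):
        state = a * state + b
        forecast.append(round(float(state), 3))
    print(forecast)   # projected healing progression under the fitted dynamics
    ```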
    “It’s not enough to just have the image, you need to process that and put it into context. Then, you can apply the feedback control,” Gomez said.
    This technique makes it possible for the algorithm to learn in real-time the impact of the drug or electric field on healing, and guides the reinforcement learning model’s iterative decision making on how to adjust the drug concentration or electric-field strength.
    Now, the research team is exploring the potential for this device to improve healing of chronic and infected wounds.
    This research was supported by the Defense Advanced Research Projects Agency and the Advanced Research Projects Agency for Health.