More stories

  • A twist of light could power the next generation of memory devices

    Modern digital systems depend on information encoded in simple binary units of 0s and 1s. Any physical substance that can reliably switch between two different, stable configurations can in principle serve as a storage platform for that binary information.
    Ferroic materials fall into this category. These solids can be toggled between two distinct states. Well-known examples include ferromagnets, which switch between opposite magnetic orientations, and ferroelectrics, which can hold opposing electric polarizations. Their ability to respond to magnetic or electric fields makes ferroic materials essential components in many modern electronic and data storage devices.
    However, they are not without limitations: they are sensitive to external disturbances — such as strong magnetic fields near a hard drive — and their performance typically degrades over time. These challenges have motivated researchers to look for new storage approaches that are more resilient.
    Ferroaxial Materials and Their Unusual Vortex States
    Ferroaxial materials represent a newer branch of the ferroic family. Instead of relying on magnetic or electric polarization states, these materials contain vortices of electric dipoles. These vortices can point in two opposite directions while producing neither net magnetization nor net electric polarization. They are extremely stable and naturally resistant to external fields, but this same stability has made them very difficult to manipulate, limiting scientific progress in this area.
    Using Terahertz Light to Switch Ferroaxial States
    A team led by Andrea Cavalleri has now demonstrated a method to control these elusive states. The researchers used circularly polarized terahertz pulses to flip between clockwise and anti-clockwise ferroaxial domains in a material called rubidium iron dimolybdate (RbFe(MoO₄)₂).

    “We take advantage of a synthetic effective field that arises when a terahertz pulse drives ions in the crystal lattice in circles,” explains lead author Zhiyang Zeng. “This effective field is able to couple to the ferroaxial state, just like a magnetic field would switch a ferromagnet or an electric field would reverse a ferroelectric state,” he added.
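    In schematic terms (an illustrative sketch, not the paper’s detailed treatment): a circularly polarized pulse drives two degenerate phonon coordinates Q_a and Q_b ninety degrees out of phase, so the ions trace circles and the lattice carries a phonon angular momentum of the form

      L_{\mathrm{ph}} \propto Q_a\,\dot{Q}_b - Q_b\,\dot{Q}_a

    Reversing the pulse helicity reverses the sign of L_ph, and with it the direction of the effective field Zeng describes.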
    By changing the helicity, or twist, of the circularly polarized pulses, the team could stabilize either the clockwise or anti-clockwise arrangement of electric dipoles, “in this way enabling information storage in the two ferroic states,” notes fellow author Michael Först. “Because ferroaxials are free from depolarizing electric or stray magnetic fields, they are extremely promising candidates for stable, non-volatile data storage.”
    Implications for Future Ultrafast Information Technologies
    “This is an exciting discovery that opens up new possibilities for the development of a robust platform for ultrafast information storage,” says Andrea Cavalleri. He adds that the work also highlights the growing importance of circular phonon fields, first demonstrated by the group in 2017, as a powerful tool for manipulating unconventional material phases.
    This research was largely supported by the Max Planck Society and the Max-Planck Graduate Center for Quantum Materials, which fosters collaboration with the University of Oxford. Additional support comes from the Deutsche Forschungsgemeinschaft through the Cluster of Excellence ‘CUI: Advanced Imaging of Matter’. The MPSD is also a partner of the Center for Free-Electron Laser Science (CFEL) with DESY and the University of Hamburg.

  • Light has been hiding a magnetic secret for nearly 200 years

    Researchers at the Hebrew University of Jerusalem have found that the magnetic component of light plays a direct part in the Faraday Effect, overturning a 180-year belief that only light’s electric field was involved. Their work shows that light can exert magnetic influence on matter, not simply illuminate it. This insight could support advances in optics, spintronics, and emerging quantum technologies.
    The team’s findings, published in Nature’s Scientific Reports, show that the magnetic portion of light, not only its electric one, has a meaningful and measurable influence on how light interacts with materials. This result contradicts a scientific explanation that has shaped the understanding of the Faraday Effect since the nineteenth century.
    The study, led by Dr. Amir Capua and Benjamin Assouline of the university’s Institute of Electrical Engineering and Applied Physics, offers the first theoretical evidence that the oscillating magnetic field of light contributes directly to the Faraday Effect. This effect describes how the polarization of light rotates as it travels through a material placed in a constant magnetic field.
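    For reference, the size of that rotation is conventionally written in terms of the material’s Verdet constant:

      \beta = V\,B\,d

    where β is the polarization rotation angle, V the Verdet constant that encodes the material response, B the static magnetic field along the propagation direction, and d the path length through the sample.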
    How Light and Magnetism Interact
    “In simple terms, it’s an interaction between light and magnetism,” says Dr. Capua. “The static magnetic field ‘twists’ the light, and the light, in turn, reveals the magnetic properties of the material. What we’ve found is that the magnetic part of light has a first-order effect, it’s surprisingly active in this process.”
    For nearly two centuries, scientists attributed the Faraday Effect solely to the electric field of light interacting with electric charges in matter. The new study shows that the magnetic field of light also plays a direct role by interacting with atomic spins, a contribution long assumed to be insignificant.
    Calculating the Magnetic Contribution
    Using advanced calculations informed by the Landau-Lifshitz-Gilbert (LLG) equation, which describes how spins behave in magnetic materials, the researchers demonstrated that light’s magnetic field can generate magnetic torque within a material in a manner similar to a static magnetic field. Capua explains, “In other words, light doesn’t just illuminate matter, it magnetically influences it.”
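    For readers who want the starting point, the LLG equation has the standard form (written here in one common convention; in this study the oscillating magnetic field of the light contributes to the effective field acting on the spins):

      \frac{d\mathbf{m}}{dt} = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}} + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}

    where m is the unit magnetization vector, H_eff the effective field, γ the gyromagnetic ratio, and α the Gilbert damping constant.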

    To measure the extent of that influence, the team applied their theoretical model to Terbium Gallium Garnet (TGG), a crystal commonly used to study the Faraday Effect. Their analysis revealed that the magnetic component of light is responsible for about 17% of the observed rotation in the visible spectrum and as much as 70% in the infrared.
    New Pathways for Future Technologies
    “Our results show that light ‘talks’ to matter not only through its electric field, but also through its magnetic field, a component that has been largely overlooked until now,” says Benjamin Assouline.
    The researchers note that this revised understanding of light’s magnetic behavior could open doors for innovations in optical data storage, spintronics, and magnetic control using light. The work may also contribute to future developments in spin-based quantum computing.

  • Quantum computers just simulated physics too complex for supercomputers

    Scientists study matter under extreme conditions to uncover some of nature’s most fundamental behaviors. The Standard Model of particle physics contains the equations needed to describe these phenomena, but in many real situations such as fast-changing environments or extremely dense matter, those equations become too complex for even the most advanced classical supercomputers to handle.
    Quantum computing offers a promising alternative because, in principle, it can represent and simulate these systems far more efficiently. A major challenge, however, is finding reliable methods to set up the initial quantum state that a simulation needs. In this work, researchers achieved a first: they created scalable quantum circuits capable of preparing the starting state of particle collisions like those produced in particle accelerators. Their test focused on the strong interactions described by the Standard Model.
    The team began by determining the required circuits for small systems using classical computers. Once those designs were known, they applied the circuits’ scalable structure to build much larger simulations directly on a quantum computer. Using IBM’s quantum hardware, they successfully simulated key features of nuclear physics on more than 100 qubits.
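    As a rough illustration of the “design small, deploy large” idea (a minimal sketch in Python with Qiskit; the layer pattern and angles are placeholders of our own, not the circuits from the paper):

      # Sketch only: a translationally structured layer whose angles would be
      # optimized classically on a small chain, then reused on a large register.
      from qiskit import QuantumCircuit

      def scalable_layer(qc, theta_ry, theta_even, theta_odd):
          """Single-qubit rotations plus nearest-neighbour ZZ entanglers,
          applied identically across the whole register."""
          n = qc.num_qubits
          for q in range(n):
              qc.ry(theta_ry, q)
          for q in range(0, n - 1, 2):      # even bonds
              qc.rzz(theta_even, q, q + 1)
          for q in range(1, n - 1, 2):      # odd bonds
              qc.rzz(theta_odd, q, q + 1)

      # Placeholder angles, e.g. found variationally on a ~10-qubit system.
      layers = [(0.42, -0.31, 0.17), (0.23, -0.12, 0.08)]

      qc = QuantumCircuit(100)              # scaled up, as in the 100+ qubit runs
      for angles in layers:
          scalable_layer(qc, *angles)
      print(qc.num_qubits, qc.depth())

    Because each layer only couples nearby qubits, reusing parameters found on a small system is at least plausible on a bigger register; the paper’s actual circuits and parameters are, of course, specific to the physics being prepared.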
    Scalable Quantum Methods for High-Density Physics
    These scalable quantum algorithms open the door to simulations that were previously out of reach. The approach can be used to model the vacuum state before a particle collision, physical systems with extremely high densities, and beams of hadrons. Researchers anticipate that future quantum simulations built on these circuits will exceed what classical computing can accomplish.
    Such simulations could shed light on major open questions in physics, including the imbalance of matter and antimatter, the creation of heavy elements inside supernovae, and the behavior of matter at ultra-high densities. The same techniques may also help model other difficult systems, including exotic materials with unusual quantum properties.
    Nuclear physicists used IBM’s quantum computers to perform the largest digital quantum simulation ever completed. Their success stemmed in part from identifying patterns in physical systems, including symmetries and differences in length scales, which helped them design scalable circuits that prepare states with localized correlations. They demonstrated the effectiveness of this algorithm by preparing the vacuum state and hadrons within a one-dimensional version of quantum electrodynamics.
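    The one-dimensional electrodynamics referred to here is commonly called the Schwinger model; in one standard staggered-fermion, spin formulation (a reference form, conventions differ between papers), the Hamiltonian mapped onto qubits reads

      H = \frac{1}{2a}\sum_{n=1}^{N-1}\left(\sigma^{+}_{n}\sigma^{-}_{n+1} + \mathrm{h.c.}\right) + \frac{m}{2}\sum_{n=1}^{N}(-1)^{n}\sigma^{z}_{n} + \frac{a g^{2}}{2}\sum_{n=1}^{N-1}\Bigg(\sum_{k\le n}\frac{\sigma^{z}_{k}+(-1)^{k}}{2}\Bigg)^{2}

    where a is the lattice spacing, m the fermion mass, g the gauge coupling, and the last term is the electric-field energy left after solving Gauss’s law on an open chain.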

    Advancing from Small Models to Large-Scale Quantum Systems
    The team validated their circuit components by first testing them on small systems with classical computing tools, confirming that the resulting states could be systematically improved. They then expanded the circuits to handle more than 100 qubits and ran them on IBM’s quantum devices. Using the data from these simulations, scientists extracted properties of the vacuum with percent-level accuracy.
    They also used the circuits to generate pulses of hadrons, then simulated how those pulses evolved over time to track their propagation. These advances point toward a future in which quantum computers can carry out full dynamical simulations of matter under extreme conditions that lie well beyond the reach of classical machines.
    This research received support from the Department of Energy (DOE) Office of Science, Office of Nuclear Physics, InQubator for Quantum Simulation (IQuS) through the Quantum Horizons: QIS Research and Innovation for Nuclear Science Initiative, and the Quantum Science Center (QSC), a DOE and University of Washington National Quantum Information Science Research Center. Additional computing resources were provided by the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility, and by the Hyak supercomputer system at the University of Washington. The team also acknowledges the use of IBM Quantum services for this project.

  • Nanoscale trick makes “dark excitons” glow 300,000 times stronger

    A research group at the City University of New York and the University of Texas at Austin has found a method to make dark excitons, a class of previously unobserved excitonic states, emit bright light and be controlled with nanoscale precision. The study, published November 12 in Nature Photonics, points toward future technologies that could operate faster, use less energy, and shrink to even smaller sizes.
    Dark excitons form in ultra-thin semiconductor materials and normally remain undetectable because they release only faint light. Even so, scientists have long viewed them as promising for quantum information and advanced photonics because they interact with light in unusual ways, remain stable for relatively long periods, and experience less disruption from their surroundings, which helps reduce decoherence.
    Amplifying Dark Excitons With Nanoscale Design
    To bring these hidden states into view, the researchers created a tiny optical cavity built from gold nanotubes combined with a single layer of tungsten diselenide (WSe₂), a material just three atoms thick. This structure increased the brightness of dark excitons by an extraordinary factor of 300,000, making them clearly observable and allowing their behavior to be precisely controlled.
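    One textbook way to reason about how a cavity boosts emission (a general estimate, not a derivation of the 300,000-fold figure reported here) is the Purcell factor,

      F_{P} = \frac{3}{4\pi^{2}}\left(\frac{\lambda}{n}\right)^{3}\frac{Q}{V}

    which grows as the cavity quality factor Q rises and the mode volume V shrinks; plasmonic structures like the one described above gain most of their enhancement by squeezing V far below the wavelength scale.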
    “This work shows that we can access and manipulate light-matter states that were previously out of reach,” said principal investigator Andrea Alù, Distinguished and Einstein Professor of Physics at the CUNY Graduate Center and founding director of the Photonics Initiative at the Advanced Science Research Center at the CUNY Graduate Center (CUNY ASRC). “By turning these hidden states on and off at will and controlling them with nanoscale resolution, we open exciting opportunities to disruptively advance next-generation optical and quantum technologies, including for sensing and computing.”
    Electric and Magnetic Control of Hidden Quantum States
    The team also demonstrated that these dark excitons can be switched and adjusted using electric and magnetic fields. This level of control could support new designs for on-chip photonics, highly sensitive detectors, and secure quantum communication. Importantly, the method preserves the original characteristics of the material while still achieving record-setting improvements in light-matter coupling.

    “Our study reveals a new family of spin-forbidden dark excitons that had never been observed before,” said first author Jiamin Quan. “This discovery is just the beginning — it opens a path to explore many other hidden quantum states in 2D materials.”
    Solving a Debate in Plasmonics
    The findings also address a long-standing question of whether plasmonic structures can boost dark excitons without altering their fundamental nature when placed in close proximity. The researchers solved this by designing a plasmonic-excitonic heterostructure made with nanometer-thin boron nitride layers, which proved essential for revealing the newly identified dark excitons.
    The work received support from the Air Force Office of Scientific Research, the Office of Naval Research, and the National Science Foundation.

  • Combining western science with Indigenous knowledge could help the Arctic

    The Arctic char, a red-pink-bellied relative of trout and salmon, is a staple food source for millions of people living in the Arctic. But that relationship is being upended by climate change, as the Arctic is warming two to four times faster than the rest of the world.

    Marianne Falardeau, a polar marine ecologist at Université TÉLUQ in Quebec City, Canada, studies how climate change is reshaping boreal and polar marine ecosystems and the benefits those ecosystems provide to people, aiming to help northern communities adapt to the shifting environment. In 2022, she coauthored a study showing how to make small-scale fisheries in the Arctic more resilient in the face of climate change.

  • Life-saving research on extreme heat comes under fire

    Located just a few hours’ drive from the Canadian border, Missoula, Mont., is not known for sweltering temperatures. And yet heat waves are becoming more common in the mountainous region due to climate change, and researchers are concerned that a catastrophic heat event could soon shock the 120,000 or so people who call Missoula County home. Recent history reveals the cost of being unprepared for extreme heat; in 2021, the Pacific Northwest was caught off guard by the strongest heat wave the region had seen in a thousand years, resulting in more than 1,400 deaths.

    “We’ve come to understand that heat is a major threat to our region,” says Alli Kane, the Climate Action Program Coordinator for Missoula County. “And it’s something that we’re not prepared for.”

  • Princeton’s new quantum chip marks a major step toward quantum advantage

    Princeton engineers have created a superconducting qubit that remains stable three times longer than the best-performing designs available today. This improvement represents an important move toward building quantum computers that can operate reliably.
    “The real challenge, the thing that stops us from having useful quantum computers today, is that you build a qubit and the information just doesn’t last very long,” said Andrew Houck, leader of a federally funded national quantum research center, Princeton’s dean of engineering and co-principal investigator on the paper. “This is the next big jump forward.”
    In a Nov. 5 article published in Nature, the Princeton team reported that their qubit maintains coherence for more than 1 millisecond. This performance is triple the longest lifetime documented in laboratory experiments and nearly fifteen times greater than the standard used in industrial quantum processors. To confirm the result, the team constructed a functioning quantum chip based on the new qubit, demonstrating that the design can support error correction and scale toward larger systems.
    The researchers noted that their qubit is compatible with the architectures used by major companies such as Google and IBM. According to their analysis, replacing key components in Google’s Willow processor with Princeton’s approach could increase its performance by a factor of 1,000. Houck added that as quantum systems incorporate more qubits, the advantages of this design increase even more rapidly.
    Why Better Qubits Matter for Quantum Computing
    Quantum computers show promise for solving problems that traditional computers cannot address. Yet their current abilities remain limited because qubits lose their information before complex calculations can be completed. Extending coherence time is therefore essential for building practical quantum hardware. Princeton’s improvement represents the largest single gain in coherence time in more than ten years.
    Many labs are pursuing different qubit technologies, but Princeton’s design builds on a widely used approach known as the transmon qubit. Transmons, which operate as superconducting circuits held at extremely low temperatures, are known for being resistant to environmental interference and compatible with modern manufacturing tools.

    Despite these strengths, increasing the coherence time of transmon qubits has proven difficult. Recent results from Google showed that material defects now pose the main barrier to improving their newest processor.
    Tantalum and Silicon: A New Materials Strategy
    The Princeton team developed a two-part strategy to address these material challenges. First, they incorporated tantalum, a metal known for helping delicate circuits retain energy. Second, they replaced the standard sapphire substrate with high-purity silicon, a material foundational to the computing industry. Growing tantalum directly on silicon required solving several technical problems related to how the two materials interact, but the researchers succeeded and uncovered significant advantages in the process.
    Nathalie de Leon, co-director of Princeton’s Quantum Initiative and co-principal investigator of the project, said the tantalum-silicon design not only performs better than previous approaches but is also simpler to manufacture at scale. “Our results are really pushing the state of the art,” she said.
    Michel Devoret, chief scientist for hardware at Google Quantum AI, which provided partial funding, described the difficulty of extending the lifetime of quantum circuits. He noted that the challenge had become a “graveyard” of attempted solutions. “Nathalie really had the guts to pursue this strategy and make it work,” said Devoret, the 2025 Nobel Prize winner in physics.
    The project received primary funding from the U.S. Department of Energy National Quantum Information Science Research Centers and the Co-design Center for Quantum Advantage (C2QA), a center directed by Houck from 2021 to 2025 and where he now serves as chief scientist. The paper lists postdoctoral researcher Faranak Bahrami and graduate student Matthew P. Bland as co-lead authors.

    How Tantalum Improves Qubit Stability
    Houck, the Anthony H.P. Lee ’79 P11 P14 Professor of Electrical and Computer Engineering, explained that a quantum computer’s capability depends on two main factors. One is the total number of qubits that can be linked together. The other is how many operations each qubit can complete before errors accumulate. Improving the durability of a single qubit strengthens both of these factors. Longer coherence time directly supports scaling and more reliable error correction.
    Energy loss is the most common cause of failure in these systems. Microscopic surface defects in the metal can trap energy and disrupt the qubit during calculations. These disruptions multiply as more qubits are added. Tantalum is especially beneficial because it typically contains fewer of these defects than metals like aluminum. With fewer defects, the system produces fewer errors and simplifies the process of correcting the ones that remain.
    Houck and de Leon introduced tantalum for superconducting chips in 2021 with help from Princeton chemist Robert Cava, the Russell Wellman Moore Professor of Chemistry. Cava, who specializes in superconducting materials, became interested in the problem after hearing one of de Leon’s talks. Their conversations eventually led him to suggest tantalum as a promising material. “Then she went and did it,” Cava said. “That’s the amazing part.”
    Researchers across all three labs followed this idea and built a tantalum-based superconducting circuit on a sapphire substrate. The result showed a significant improvement in coherence time, approaching the previous world record.
    Bahrami noted that tantalum stands out because it is extremely durable and can withstand the harsh cleaning used to remove contamination during fabrication. “You can put tantalum in acid, and still the properties don’t change,” she said.
    Once contaminants were removed, the team evaluated the remaining energy losses. They found that the sapphire substrate was responsible for most of the remaining problems. Switching to high-purity silicon eliminated that source of loss, and the combination of tantalum and silicon, along with refined fabrication techniques, produced one of the biggest improvements ever achieved in a transmon qubit. Houck described the outcome as “a major breakthrough on the path to enabling useful quantum computing.”
    Houck added that because the benefits of the design increase exponentially as systems grow, replacing today’s industry-leading qubits with the Princeton version could allow a theoretical 1,000-qubit computer to operate about 1 billion times more effectively.
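    A back-of-the-envelope way to see that compounding (our own illustration with made-up numbers, not the team’s analysis): if a computation uses N qubits for d error-prone steps, its success probability scales roughly as (1 − p)^(Nd), where p is the error per qubit per step. Cutting p by a factor k therefore multiplies the success probability by

      \left(\frac{1-p/k}{1-p}\right)^{N d} \approx e^{(1-1/k)\,p\,N\,d} \qquad (p \ll 1)

    With, say, p = 10⁻³, k = 3, N = 1,000 and a depth of a few dozen steps, that ratio is already of order 10⁹, the same ballpark as the billion-fold figure quoted above; the gain sits in the exponent, which is why it grows so quickly with system size.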
    Silicon-Based Design Supports Industry-Scale Growth
    The project draws from three areas of expertise. Houck’s group focuses on the design and optimization of superconducting circuits. De Leon’s lab specializes in quantum metrology along with the materials and fabrication methods that determine qubit performance. Cava’s group has spent decades developing superconducting materials. By combining their strengths, the team produced results that none of the groups could have achieved individually. Their success has already attracted attention from the quantum industry.
    Devoret said collaborations between universities and companies are essential for moving advanced technologies forward. “There is a rather harmonious relationship between industry and academic research,” he said. University researchers can investigate the fundamental limits of quantum performance, while industry partners apply those findings to large-scale systems.
    “We’ve shown that it’s possible in silicon,” de Leon said. “The fact that we’ve shown what the critical steps are, and the important underlying characteristics that will enable these kinds of coherence times, now makes it pretty easy for anyone who’s working on scaled processors to adopt.”
    The paper “Millisecond lifetimes and coherence times in 2D transmon qubits” was published in Nature on Nov. 5. Along with de Leon, Houck, Cava, Bahrami, and Bland, the authors include Jeronimo G.C. Martinez, Paal H. Prestegaard, Basil M. Smitham, Atharv Joshi, Elizabeth Hedrick, Alex Pakpour-Tabrizi, Shashwat Kumar, Apoorv Jindal, Ray D. Chang, Ambrose Yang, Guangming Cheng and Nan Yao. This research received primary support from the U.S. Department of Energy, Office of Science, National Quantum Information Science Research Centers, Co-design Center for Quantum Advantage (C2QA), and partial support from Google Quantum AI.

  • Physicists reveal a new quantum state where electrons run wild

    Electricity keeps modern life running, from cars and phones to computers and nearly every device we rely on. It works through the movement of electrons traveling through a circuit. Although these particles are far too small to see, the electric current they produce flows through wires in a way that resembles water moving through a pipe.
    In some materials, however, this steady flow can suddenly lock into organized, crystal-like patterns. When electrons settle into these rigid arrangements, the material undergoes a shift in its state of matter and stops conducting electricity. Instead of acting like a metal, it behaves as an insulator. This unusual behavior provides scientists with valuable insight into how electrons interact and has opened the door to advances in quantum computing, high-performance superconductors used in energy and medical imaging, innovative lighting systems, and extremely precise atomic clocks.
    A group of physicists at Florida State University, including National High Magnetic Field Laboratory Dirac Postdoctoral Fellow Aman Kumar, Associate Professor Hitesh Changlani, and Assistant Professor Cyprian Lewandowski, has now identified the specific conditions that allow a special kind of electron crystal to form. In this state, electrons arrange themselves in a solid lattice yet can also shift into a more fluid form. This hybrid phase is called a generalized Wigner crystal, and the team’s findings appear in npj Quantum Materials, a Nature publication.
    How Electron Crystals Form
    Scientists have long known that electrons in thin, two-dimensional materials can solidify into Wigner crystals, a concept first proposed in 1934. Experiments in recent years have detected these structures, but researchers had not fully understood how they arise once additional quantum effects are considered.
    “In our study, we determined which ‘quantum knobs’ to turn to trigger this phase transition and achieve a generalized Wigner crystal, which uses a 2D moiré system and allows different crystalline shapes to form, like stripes or honeycomb crystals, unlike traditional Wigner crystals that only show a triangular lattice crystal,” Changlani said.
    To explore these conditions, the team relied on advanced computational tools at FSU’s Research Computing Center, an academic service unit of Information Technology Services, as well as the National Science Foundation’s ACCESS program (an advanced computing and data resource under the Office of Advanced Cyberinfrastructure). They used methods such as exact diagonalization, density matrix renormalization group, and Monte Carlo simulations to test how electrons behave under various scenarios.
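    To make the first of those methods concrete, here is a toy exact-diagonalization sketch in Python with NumPy (a small spin-1/2 Heisenberg chain chosen for brevity, not the moiré Hamiltonian the team actually studied):

      # Exact diagonalization of a small spin-1/2 Heisenberg chain.
      import numpy as np

      sx = np.array([[0, 1], [1, 0]]) / 2
      sy = np.array([[0, -1j], [1j, 0]]) / 2
      sz = np.array([[1, 0], [0, -1]]) / 2
      I2 = np.eye(2)

      def site_op(op, i, n):
          """Embed a single-site operator at site i of an n-site chain."""
          mats = [I2] * n
          mats[i] = op
          full = mats[0]
          for m in mats[1:]:
              full = np.kron(full, m)
          return full

      def heisenberg(n, J=1.0):
          """H = J * sum_i S_i . S_(i+1) on an open chain of n spins."""
          H = np.zeros((2 ** n, 2 ** n), dtype=complex)
          for i in range(n - 1):
              for s in (sx, sy, sz):
                  H += J * site_op(s, i, n) @ site_op(s, i + 1, n)
          return H

      H = heisenberg(10)              # 2^10 = 1024 basis states: easy on a laptop
      energies, states = np.linalg.eigh(H)
      print("ground-state energy per site:", energies[0] / 10)

    The basis doubles with every added site, which is why exact diagonalization is limited to small systems and methods like the density matrix renormalization group take over beyond that.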

    Processing Enormous Amounts of Quantum Data
    Quantum mechanics assigns two pieces of information to every electron, and when hundreds or thousands of electrons interact, the total amount of data becomes extremely large. The researchers used sophisticated algorithms to compress and organize this overwhelming information into networks that could be examined and interpreted.
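    To put a number on “extremely large”: each two-state (spin-like) degree of freedom doubles the count of basis states, so the Hilbert-space dimension grows as

      \dim \mathcal{H} = 2^{N}

    which already exceeds 10¹⁵ at N = 50 and outstrips any classical memory long before the hundreds or thousands of electrons mentioned above; that is the scale problem the compression and network techniques described next are built to tame.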
    “We’re able to mimic experimental findings via our theoretical understanding of the state of matter,” Kumar said. “We conduct precise theoretical calculations using state-of-the-art tensor network calculations and exact diagonalization, a powerful numerical technique used in physics to collect details about a quantum Hamiltonian, which represents the total quantum energy in a system. Through this, we can provide a picture for how the crystal states came about and why they’re favored in comparison to other energetically competitive states.”
    A New Hybrid: The Quantum Pinball Phase
    While studying the generalized Wigner crystal, the team uncovered another surprising state of matter. In this newly identified phase, electrons show both insulating and conducting behavior at the same time. Some electrons remain anchored in place within the crystal lattice, while others break free and move throughout the material. Their motion resembles a pinball ricocheting between stationary posts.
    “This pinball phase is a very exciting phase of matter that we observed while researching the generalized Wigner crystal,” Lewandowski said. “Some electrons want to freeze and others want to float around, which means that some are insulating and some are conducting electricity. This is the first time this unique quantum mechanical effect has been observed and reported for the electron density we studied in our work.”
    Why These Discoveries Matter

    These results expand scientists’ ability to understand and control how matter behaves at the quantum level.
    “What causes something to be insulating, conducting or magnetic? Can we transmute something into a different state?” Lewandowski said. “We’re looking to predict where certain phases of matter exist and how one state can transition to another — when you think of turning a liquid into gas, you picture turning up a heat knob to get water to boil into steam. Here, it turns out there are other quantum knobs we can play with to manipulate states of matter, which can lead to impressive advances in experimental research.”
    By adjusting these quantum knobs, or energy scales, researchers can push electrons from solid to liquid phases within these materials. Understanding Wigner crystals and their related states may shape the future of quantum technologies, including quantum computing and spintronics — a rapidly evolving area of condensed-matter physics that promises faster, more efficient nano-electronic devices with lower energy use and reduced manufacturing costs.
    The team aims to further explore how electrons cooperate and influence one another in complex systems. Their goal is to address fundamental questions that could ultimately drive innovations in quantum, superconducting, and atomic technologies.