More stories

  • Light has been hiding a magnetic secret for nearly 200 years

    Researchers at the Hebrew University of Jerusalem have found that the magnetic component of light plays a direct part in the Faraday Effect, overturning a 180-year belief that only light’s electric field was involved. Their work shows that light can exert magnetic influence on matter, not simply illuminate it. This insight could support advances in optics, spintronics, and emerging quantum technologies.
    The team’s findings, published in Nature’s Scientific Reports, show that the magnetic portion of light, not only its electric one, has a meaningful and measurable influence on how light interacts with materials. This result contradicts a scientific explanation that has shaped the understanding of the Faraday Effect since the nineteenth century.
    The study, led by Dr. Amir Capua and Benjamin Assouline of the university’s Institute of Electrical Engineering and Applied Physics, offers the first theoretical evidence that the oscillating magnetic field of light contributes directly to the Faraday Effect. This effect describes how the polarization of light rotates as it travels through a material placed in a constant magnetic field.
    How Light and Magnetism Interact
    “In simple terms, it’s an interaction between light and magnetism,” says Dr. Capua. “The static magnetic field ‘twists’ the light, and the light, in turn, reveals the magnetic properties of the material. What we’ve found is that the magnetic part of light has a first-order effect; it’s surprisingly active in this process.”
    For nearly two centuries, scientists attributed the Faraday Effect solely to the electric field of light interacting with electric charges in matter. The new study shows that the magnetic field of light also plays a direct role by interacting with atomic spins, a contribution long assumed to be insignificant.
    Calculating the Magnetic Contribution
    Using advanced calculations informed by the Landau-Lifshitz-Gilbert (LLG) equation, which describes how spins behave in magnetic materials, the researchers demonstrated that light’s magnetic field can generate magnetic torque within a material in a manner similar to a static magnetic field. Capua explains, “In other words, light doesn’t just illuminate matter, it magnetically influences it.”
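    For context, the LLG equation the calculation builds on is commonly written as follows. This is the standard textbook form; sign and damping conventions vary between sources, and the specific optical-field terms used in the study are not reproduced here:

    ```latex
    \frac{d\mathbf{m}}{dt}
      = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
      + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}
    ```

    Here \(\mathbf{m}\) is the unit magnetization, \(\gamma\) the gyromagnetic ratio, \(\alpha\) the Gilbert damping constant, and \(\mathbf{H}_{\mathrm{eff}}\) the effective magnetic field. The study’s key point is that the oscillating magnetic field of the light wave contributes to \(\mathbf{H}_{\mathrm{eff}}\), so light exerts torque on the spins much as a static field would.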

    To measure the extent of that influence, the team applied their theoretical model to Terbium Gallium Garnet (TGG), a crystal commonly used to study the Faraday Effect. Their analysis revealed that the magnetic component of light is responsible for about 17% of the observed rotation in the visible spectrum and as much as 70% in the infrared.
    New Pathways for Future Technologies
    “Our results show that light ‘talks’ to matter not only through its electric field, but also through its magnetic field, a component that has been largely overlooked until now,” says Benjamin Assouline.
    The researchers note that this revised understanding of light’s magnetic behavior could open doors for innovations in optical data storage, spintronics, and magnetic control using light. The work may also contribute to future developments in spin-based quantum computing.

  • Quantum computers just simulated physics too complex for supercomputers

    Scientists study matter under extreme conditions to uncover some of nature’s most fundamental behaviors. The Standard Model of particle physics contains the equations needed to describe these phenomena, but in many real situations such as fast-changing environments or extremely dense matter, those equations become too complex for even the most advanced classical supercomputers to handle.
    Quantum computing offers a promising alternative because, in principle, it can represent and simulate these systems far more efficiently. A major challenge, however, is finding reliable methods to set up the initial quantum state that a simulation needs. In this work, researchers achieved a first: they created scalable quantum circuits capable of preparing the starting state of a particle collision similar to those produced in particle accelerators. Their test focuses on the strong interactions described by the Standard Model.
    The team began by determining the required circuits for small systems using classical computers. Once those designs were known, they applied the circuits’ scalable structure to build much larger simulations directly on a quantum computer. Using IBM’s quantum hardware, they successfully simulated key features of nuclear physics on more than 100 qubits.
    Scalable Quantum Methods for High-Density Physics
    These scalable quantum algorithms open the door to simulations that were previously out of reach. The approach can be used to model the vacuum state before a particle collision, physical systems with extremely high densities, and beams of hadrons. Researchers anticipate that future quantum simulations built on these circuits will exceed what classical computing can accomplish.
    Such simulations could shed light on major open questions in physics, including the imbalance of matter and antimatter, the creation of heavy elements inside supernovae, and the behavior of matter at ultra-high densities. The same techniques may also help model other difficult systems, including exotic materials with unusual quantum properties.
    Nuclear physicists used IBM’s quantum computers to perform the largest digital quantum simulation ever completed. Their success stemmed in part from identifying patterns in physical systems, including symmetries and differences in length scales, which helped them design scalable circuits that prepare states with localized correlations. They demonstrated the effectiveness of this algorithm by preparing the vacuum state and hadrons within a one-dimensional version of quantum electrodynamics.

    Advancing from Small Models to Large-Scale Quantum Systems
    The team validated their circuit components by first testing them on small systems with classical computing tools, confirming that the resulting states could be systematically improved. They then expanded the circuits to handle more than 100 qubits and ran them on IBM’s quantum devices. Using the data from these simulations, scientists extracted properties of the vacuum with percent-level accuracy.
    They also used the circuits to generate pulses of hadrons, then simulated how those pulses evolved over time to track their propagation. These advances point toward a future in which quantum computers can carry out full dynamical simulations of matter under extreme conditions that lie well beyond the reach of classical machines.
    This research received support from the Department of Energy (DOE) Office of Science, Office of Nuclear Physics, InQubator for Quantum Simulation (IQuS) through the Quantum Horizons: QIS Research and Innovation for Nuclear Science Initiative, and the Quantum Science Center (QSC), a DOE and University of Washington National Quantum Information Science Research Center. Additional computing resources were provided by the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility, and by the Hyak supercomputer system at the University of Washington. The team also acknowledges the use of IBM Quantum services for this project.

  • Nanoscale trick makes “dark excitons” glow 300,000 times stronger

    A research group at the City University of New York and the University of Texas at Austin has found a method to make dark excitons, a class of previously hidden excited states in atomically thin semiconductors, emit bright light and be controlled with nanoscale precision. The study, published November 12 in Nature Photonics, points toward future technologies that could operate faster, use less energy, and shrink to even smaller sizes.
    Dark excitons form in ultra-thin semiconductor materials and normally remain undetectable because they release only faint light. Even so, scientists have long viewed them as promising for quantum information and advanced photonics because they interact with light in unusual ways, remain stable for relatively long periods, and experience less disruption from their surroundings, which helps reduce decoherence.
    Amplifying Dark Excitons With Nanoscale Design
    To bring these hidden states into view, the researchers created a tiny optical cavity built from gold nanotubes combined with a single layer of tungsten diselenide (WSe2), a material just three atoms thick. This structure increased the brightness of dark excitons by an extraordinary factor of 300,000, making them clearly observable and allowing their behavior to be precisely controlled.
    “This work shows that we can access and manipulate light-matter states that were previously out of reach,” said principal investigator Andrea Alù, Distinguished and Einstein Professor of Physics at the CUNY Graduate Center and founding director of the Photonics Initiative at the Advanced Science Research Center at the CUNY Graduate Center (CUNY ASRC). “By turning these hidden states on and off at will and controlling them with nanoscale resolution, we open exciting opportunities to disruptively advance next-generation optical and quantum technologies, including for sensing and computing.”
    Electric and Magnetic Control of Hidden Quantum States
    The team also demonstrated that these dark excitons can be switched and adjusted using electric and magnetic fields. This level of control could support new designs for on-chip photonics, highly sensitive detectors, and secure quantum communication. Importantly, the method preserves the original characteristics of the material while still achieving record-setting improvements in light-matter coupling.

    “Our study reveals a new family of spin-forbidden dark excitons that had never been observed before,” said first author Jiamin Quan. “This discovery is just the beginning — it opens a path to explore many other hidden quantum states in 2D materials.”
    Solving a Debate in Plasmonics
    The findings also address a long-standing question of whether plasmonic structures can boost dark excitons without altering their fundamental nature when placed in close proximity. The researchers solved this by designing a plasmonic-excitonic heterostructure made with nanometer-thin boron nitride layers, which proved essential for revealing the newly identified dark excitons.
    The work received support from the Air Force Office of Scientific Research, the Office of Naval Research, and the National Science Foundation.

  • Princeton’s new quantum chip marks a major step toward quantum advantage

    Princeton engineers have created a superconducting qubit that remains stable for three times longer than the strongest designs available today. This improvement represents an important move toward building quantum computers that can operate reliably.
    “The real challenge, the thing that stops us from having useful quantum computers today, is that you build a qubit and the information just doesn’t last very long,” said Andrew Houck, leader of a federally funded national quantum research center, Princeton’s dean of engineering and co-principal investigator on the paper. “This is the next big jump forward.”
    In a Nov. 5 article published in Nature, the Princeton team reported that their qubit maintains coherence for more than 1 millisecond. This performance is triple the longest lifetime documented in laboratory experiments and nearly fifteen times greater than the standard used in industrial quantum processors. To confirm the result, the team constructed a functioning quantum chip based on the new qubit, demonstrating that the design can support error correction and scale toward larger systems.
    The researchers noted that their qubit is compatible with the architectures used by major companies such as Google and IBM. According to their analysis, replacing key components in Google’s Willow processor with Princeton’s approach could increase its performance by a factor of 1,000. Houck added that as quantum systems incorporate more qubits, the advantages of this design increase even more rapidly.
    Why Better Qubits Matter for Quantum Computing
    Quantum computers show promise for solving problems that traditional computers cannot address. Yet their current abilities remain limited because qubits lose their information before complex calculations can be completed. Extending coherence time is therefore essential for building practical quantum hardware. Princeton’s improvement represents the largest single gain in coherence time in more than ten years.
    Many labs are pursuing different qubit technologies, but Princeton’s design builds on a widely used approach known as the transmon qubit. Transmons, which operate as superconducting circuits held at extremely low temperatures, are known for being resistant to environmental interference and compatible with modern manufacturing tools.

    Despite these strengths, increasing the coherence time of transmon qubits has proven difficult. Recent results from Google showed that material defects now pose the main barrier to improving their newest processor.
    Tantalum and Silicon: A New Materials Strategy
    The Princeton team developed a two-part strategy to address these material challenges. First, they incorporated tantalum, a metal known for helping delicate circuits retain energy. Second, they replaced the standard sapphire substrate with high-purity silicon, a material foundational to the computing industry. Growing tantalum directly on silicon required solving several technical problems related to how the two materials interact, but the researchers succeeded and uncovered significant advantages in the process.
    Nathalie de Leon, co-director of Princeton’s Quantum Initiative and co-principal investigator of the project, said the tantalum-silicon design not only performs better than previous approaches but is also simpler to manufacture at scale. “Our results are really pushing the state of the art,” she said.
    Michel Devoret, chief scientist for hardware at Google Quantum AI, which provided partial funding, described the difficulty of extending the lifetime of quantum circuits. He noted that the challenge had become a “graveyard” of attempted solutions. “Nathalie really had the guts to pursue this strategy and make it work,” said Devoret, the 2025 Nobel Prize winner in physics.
    The project received primary funding from the U.S. Department of Energy National Quantum Information Science Research Centers and the Co-design Center for Quantum Advantage (C2QA), a center directed by Houck from 2021 to 2025 and where he now serves as chief scientist. The paper lists postdoctoral researcher Faranak Bahrami and graduate student Matthew P. Bland as co-lead authors.

    How Tantalum Improves Qubit Stability
    Houck, the Anthony H.P. Lee ’79 P11 P14 Professor of Electrical and Computer Engineering, explained that a quantum computer’s capability depends on two main factors. One is the total number of qubits that can be linked together. The other is how many operations each qubit can complete before errors accumulate. Improving the durability of a single qubit strengthens both of these factors. Longer coherence time directly supports scaling and more reliable error correction.
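    As a rough illustration of how coherence time translates into usable operations, here is a back-of-envelope sketch. The 25-nanosecond gate time is an assumed round number, not a figure from the paper, and the 0.3 ms baseline is simply one third of the reported millisecond lifetime:

    ```python
    # Back-of-envelope estimate (hypothetical gate time, not from the paper):
    # a qubit's useful circuit depth is roughly coherence time / gate time.
    GATE_TIME_NS = 25  # assumed two-qubit gate duration

    for label, t_coh_us in [("previous record (~0.3 ms)", 300),
                            ("Princeton qubit (>1 ms)", 1000)]:
        ops = t_coh_us * 1000 / GATE_TIME_NS  # convert microseconds to ns
        print(f"{label}: ~{ops:,.0f} coherence-limited operations")
    ```

    Tripling the coherence time triples this budget for every qubit, which is why the gain compounds so quickly as error-corrected systems grow.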
    Energy loss is the most common cause of failure in these systems. Microscopic surface defects in the metal can trap energy and disrupt the qubit during calculations. These disruptions multiply as more qubits are added. Tantalum is especially beneficial because it typically contains fewer of these defects than metals like aluminum. With fewer defects, the system produces fewer errors and simplifies the process of correcting the ones that remain.
    Houck and de Leon introduced tantalum for superconducting chips in 2021 with help from Princeton chemist Robert Cava, the Russell Wellman Moore Professor of Chemistry. Cava, who specializes in superconducting materials, became interested in the problem after hearing one of de Leon’s talks. Their conversations eventually led him to suggest tantalum as a promising material. “Then she went and did it,” Cava said. “That’s the amazing part.”
    Researchers across all three labs followed this idea and built a tantalum-based superconducting circuit on a sapphire substrate. The result showed a significant improvement in coherence time, approaching the previous world record.
    Bahrami noted that tantalum stands out because it is extremely durable and can withstand the harsh cleaning used to remove contamination during fabrication. “You can put tantalum in acid, and still the properties don’t change,” she said.
    Once contaminants were removed, the team evaluated the remaining energy losses. They found that the sapphire substrate was responsible for most of the remaining problems. Switching to high-purity silicon eliminated that source of loss, and the combination of tantalum and silicon, along with refined fabrication techniques, produced one of the biggest improvements ever achieved in a transmon qubit. Houck described the outcome as “a major breakthrough on the path to enabling useful quantum computing.”
    Houck added that because the benefits of the design increase exponentially as systems grow, replacing today’s industry-leading qubits with the Princeton version could allow a theoretical 1,000-qubit computer to operate about 1 billion times more effectively.
    Silicon-Based Design Supports Industry-Scale Growth
    The project draws from three areas of expertise. Houck’s group focuses on the design and optimization of superconducting circuits. De Leon’s lab specializes in quantum metrology along with the materials and fabrication methods that determine qubit performance. Cava’s group has spent decades developing superconducting materials. By combining their strengths, the team produced results that none of the groups could have achieved individually. Their success has already attracted attention from the quantum industry.
    Devoret said collaborations between universities and companies are essential for moving advanced technologies forward. “There is a rather harmonious relationship between industry and academic research,” he said. University researchers can investigate the fundamental limits of quantum performance, while industry partners apply those findings to large-scale systems.
    “We’ve shown that it’s possible in silicon,” de Leon said. “The fact that we’ve shown what the critical steps are, and the important underlying characteristics that will enable these kinds of coherence times, now makes it pretty easy for anyone who’s working on scaled processors to adopt.”
    The paper “Millisecond lifetimes and coherence times in 2D transmon qubits” was published in Nature on Nov. 5. Along with de Leon, Houck, Cava, Bahrami, and Bland, the authors include Jeronimo G.C. Martinez, Paal H. Prestegaard, Basil M. Smitham, Atharv Joshi, Elizabeth Hedrick, Alex Pakpour-Tabrizi, Shashwat Kumar, Apoorv Jindal, Ray D. Chang, Ambrose Yang, Guangming Cheng and Nan Yao. This research received primary support from the U.S. Department of Energy, Office of Science, National Quantum Information Science Research Centers, Co-design Center for Quantum Advantage (C2QA), and partial support from Google Quantum AI.

  • Physicists reveal a new quantum state where electrons run wild

    Electricity keeps modern life running, from cars and phones to computers and nearly every device we rely on. It works through the movement of electrons traveling through a circuit. Although these particles are far too small to see, the electric current they produce flows through wires in a way that resembles water moving through a pipe.
    In some materials, however, this steady flow can suddenly lock into organized, crystal-like patterns. When electrons settle into these rigid arrangements, the material undergoes a shift in its state of matter and stops conducting electricity. Instead of acting like a metal, it behaves as an insulator. This unusual behavior provides scientists with valuable insight into how electrons interact and has opened the door to advances in quantum computing, high-performance superconductors used in energy and medical imaging, innovative lighting systems, and extremely precise atomic clocks.
    A group of physicists at Florida State University, including National High Magnetic Field Laboratory Dirac Postdoctoral Fellow Aman Kumar, Associate Professor Hitesh Changlani, and Assistant Professor Cyprian Lewandowski, has now identified the specific conditions that allow a special kind of electron crystal to form. In this state, electrons arrange themselves in a solid lattice yet can also shift into a more fluid form. This hybrid phase is called a generalized Wigner crystal, and the team’s findings appear in npj Quantum Materials, a Nature publication.
    How Electron Crystals Form
    Scientists have long known that electrons in thin, two-dimensional materials can solidify into Wigner crystals, a concept first proposed in 1934. Experiments in recent years have detected these structures, but researchers had not fully understood how they arise once additional quantum effects are considered.
    “In our study, we determined which ‘quantum knobs’ to turn to trigger this phase transition and achieve a generalized Wigner crystal, which uses a 2D moiré system and allows different crystalline shapes to form, like stripes or honeycomb crystals, unlike traditional Wigner crystals that only show a triangular lattice crystal,” Changlani said.
    To explore these conditions, the team relied on advanced computational tools at FSU’s Research Computing Center, an academic service unit of Information Technology Services, as well as the National Science Foundation’s ACCESS program (an advanced computing and data resource under the Office of Advanced Cyberinfrastructure). They used methods such as exact diagonalization, density matrix renormalization group, and Monte Carlo simulations to test how electrons behave under various scenarios.
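    Of the methods named above, exact diagonalization is the most direct: build the full Hamiltonian matrix and solve for its entire spectrum. The sketch below is a generic illustration on a tiny spin chain, not the authors’ code or model; electron-crystal studies apply the same idea to vastly larger Hamiltonians:

    ```python
    import numpy as np

    # Minimal exact-diagonalization sketch (illustrative only): ground-state
    # energy of a 4-site spin-1/2 Heisenberg chain with open boundaries.
    sx = np.array([[0, 1], [1, 0]]) / 2
    sy = np.array([[0, -1j], [1j, 0]]) / 2
    sz = np.array([[1, 0], [0, -1]]) / 2
    I2 = np.eye(2)

    def site_op(op, i, n):
        """Embed a single-site operator at position i of an n-site chain."""
        mats = [I2] * n
        mats[i] = op
        out = mats[0]
        for m in mats[1:]:
            out = np.kron(out, m)
        return out

    n = 4
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):  # nearest-neighbor S_i . S_{i+1} bonds
        for op in (sx, sy, sz):
            H += site_op(op, i, n) @ site_op(op, i + 1, n)

    energies = np.linalg.eigvalsh(H)  # full spectrum of the 16x16 matrix
    print(round(energies[0], 4))      # ground-state energy, about -1.616 for J = 1
    ```

    The catch is that the matrix dimension doubles with every added site, which is why the team pairs exact diagonalization with compression methods such as the density matrix renormalization group for larger systems.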

    Processing Enormous Amounts of Quantum Data
    Quantum mechanics assigns two pieces of information to every electron, and when hundreds or thousands of electrons interact, the total amount of data becomes extremely large. The researchers used sophisticated algorithms to compress and organize this overwhelming information into networks that could be examined and interpreted.
    “We’re able to mimic experimental findings via our theoretical understanding of the state of matter,” Kumar said. “We conduct precise theoretical calculations using state-of-the-art tensor network calculations and exact diagonalization, a powerful numerical technique used in physics to collect details about a quantum Hamiltonian, which represents the total quantum energy in a system. Through this, we can provide a picture for how the crystal states came about and why they’re favored in comparison to other energetically competitive states.”
    A New Hybrid: The Quantum Pinball Phase
    While studying the generalized Wigner crystal, the team uncovered another surprising state of matter. In this newly identified phase, electrons show both insulating and conducting behavior at the same time. Some electrons remain anchored in place within the crystal lattice, while others break free and move throughout the material. Their motion resembles a pinball ricocheting between stationary posts.
    “This pinball phase is a very exciting phase of matter that we observed while researching the generalized Wigner crystal,” Lewandowski said. “Some electrons want to freeze and others want to float around, which means that some are insulating and some are conducting electricity. This is the first time this unique quantum mechanical effect has been observed and reported for the electron density we studied in our work.”
    Why These Discoveries Matter

    These results expand scientists’ ability to understand and control how matter behaves at the quantum level.
    “What causes something to be insulating, conducting or magnetic? Can we transmute something into a different state?” Lewandowski said. “We’re looking to predict where certain phases of matter exist and how one state can transition to another — when you think of turning a liquid into gas, you picture turning up a heat knob to get water to boil into steam. Here, it turns out there are other quantum knobs we can play with to manipulate states of matter, which can lead to impressive advances in experimental research.”
    By adjusting these quantum knobs, or energy scales, researchers can push electrons from solid to liquid phases within these materials. Understanding Wigner crystals and their related states may shape the future of quantum technologies, including quantum computing and spintronics — a rapidly evolving area of condensed-matter physics that promises faster, more efficient nano-electronic devices with lower energy use and reduced manufacturing costs.
    The team aims to further explore how electrons cooperate and influence one another in complex systems. Their goal is to address fundamental questions that could ultimately drive innovations in quantum, superconducting, and atomic technologies.

  • AI creates the first 100-billion-star Milky Way simulation

    Researchers led by Keiya Hirashima at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan, working with partners from The University of Tokyo and Universitat de Barcelona in Spain, have created the first Milky Way simulation capable of tracking more than 100 billion individual stars across 10,000 years of evolution. The team achieved this milestone by pairing artificial intelligence (AI) with advanced numerical simulation techniques. Their model includes 100 times more stars than the most sophisticated earlier simulations and was generated more than 100 times faster.
    The work, presented at the international supercomputing conference SC ’25, marks a major step forward for astrophysics, high-performance computing, and AI-assisted modeling. The same strategy could also be applied to large-scale Earth system studies, including climate and weather research.
    Why Modeling Every Star Is So Difficult
    For many years, astrophysicists have aimed to build Milky Way simulations detailed enough to follow each individual star. Such models would allow researchers to compare theories of galactic evolution, structure, and star formation directly to observational data. However, simulating a galaxy accurately requires calculating gravity, fluid behavior, chemical element formation, and supernova activity across enormous ranges of time and space, which makes the task extremely demanding.
    Scientists have not previously been able to model a galaxy as large as the Milky Way while maintaining fine detail at the level of single stars. Current cutting-edge simulations can represent systems with the equivalent mass of about one billion suns, far below the more than 100 billion stars that make up the Milky Way. As a result, the smallest “particle” in those models usually represents a group of roughly 100 stars, which averages away the behavior of individual stars and limits the accuracy of small-scale processes. The challenge is tied to the interval between computational steps: to capture rapid events such as supernova evolution, the simulation must advance in very small time increments.
    Shrinking the timestep means dramatically greater computational effort. Even with today’s best physics-based models, simulating the Milky Way star by star would require about 315 hours for every 1 million years of galactic evolution. At that rate, generating 1 billion years of activity would take over 36 years of real time. Simply adding more supercomputer cores is not a practical solution, as energy use becomes excessive and efficiency drops as more cores are added.
    A New Deep Learning Approach
    To overcome these barriers, Hirashima and his team designed a method that blends a deep learning surrogate model with standard physical simulations. The surrogate was trained using high-resolution supernova simulations and learned to predict how gas spreads during the 100,000 years following a supernova explosion without requiring additional resources from the main simulation. This AI component allowed the researchers to capture the galaxy’s overall behavior while still modeling small-scale events, including the fine details of individual supernovae. The team validated the approach by comparing its results against large-scale runs on RIKEN’s Fugaku supercomputer and The University of Tokyo’s Miyabi Supercomputer System.

    The method offers true individual-star resolution for galaxies with more than 100 billion stars, and it does so with remarkable speed. Simulating 1 million years took just 2.78 hours, meaning that 1 billion years could be completed in approximately 115 days instead of 36 years.
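    The quoted runtimes can be checked with simple arithmetic; both per-million-year rates come straight from the figures above:

    ```python
    # Sanity-check the runtime figures quoted in the article.
    HOURS_PER_YEAR = 24 * 365.25

    # Conventional physics-based model: ~315 hours per 1 million simulated years.
    classical_hours = 315 * 1000              # 1 billion years = 1000 x 1 Myr
    print(classical_hours / HOURS_PER_YEAR)   # ~35.9 real years

    # AI-accelerated method: 2.78 hours per 1 million simulated years.
    ai_hours = 2.78 * 1000
    print(ai_hours / 24)                      # ~115.8 real days
    ```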
    Broader Potential for Climate, Weather, and Ocean Modeling
    This hybrid AI approach could reshape many areas of computational science that require linking small-scale physics with large-scale behavior. Fields such as meteorology, oceanography, and climate modeling face similar challenges and could benefit from tools that accelerate complex, multi-scale simulations.
    “I believe that integrating AI with high-performance computing marks a fundamental shift in how we tackle multi-scale, multi-physics problems across the computational sciences,” says Hirashima. “This achievement also shows that AI-accelerated simulations can move beyond pattern recognition to become a genuine tool for scientific discovery — helping us trace how the elements that formed life itself emerged within our galaxy.”

  • Chimps shock scientists by changing their minds with new evidence

    Chimpanzees may share more with human thinkers than researchers once realized. A new study published in Science presents compelling evidence that chimpanzees can revise their beliefs in a rational way when they encounter new information.
    The study, titled “Chimpanzees rationally revise their beliefs,” was carried out by an international team that included UC Berkeley Psychology Postdoctoral Researcher Emily Sanford, UC Berkeley Psychology Professor Jan Engelmann and Utrecht University Psychology Professor Hanna Schleihauf. Their results indicate that chimpanzees, similar to humans, adjust their decisions based on how strong the available evidence is, which is a central component of rational thinking.
    At the Ngamba Island Chimpanzee Sanctuary in Uganda, the researchers designed an experiment involving two boxes, one of which contained food. The chimps were first given a hint about which box held the reward. Later, they received a clearer and more convincing clue that pointed to the other box. Many of the animals changed their choice after receiving the stronger information.
    “Chimpanzees were able to revise their beliefs when better evidence became available,” said Sanford, a researcher in the UC Berkeley Social Origins Lab. “This kind of flexible reasoning is something we often associate with 4-year-old children. It was exciting to show that chimps can do this too.”
    Testing Whether Chimps Are Reasoning or Acting on Instinct
    To confirm that the animals were truly engaging in reasoning rather than reacting on impulse, the researchers used tightly controlled experiments combined with computational modeling. These methods helped rule out simpler explanations, such as the chimps favoring the most recent clue (recency bias) or simply responding to the easiest cue to notice. The modeling showed that their decisions followed patterns consistent with rational belief revision.
    “We recorded their first choice, then their second, and compared whether they revised their beliefs,” Sanford said. “We also used computational models to test how their choices matched up with various reasoning strategies.”
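    The paper’s actual models are not reproduced here, but the core logic of evidence-weighted belief revision can be sketched as a simple Bayesian update, with hypothetical likelihood numbers standing in for clue strength:

    ```python
    # Toy Bayesian sketch of belief revision (illustrative numbers only,
    # not the study's actual model). Two boxes, one hides food.
    prior = {"A": 0.5, "B": 0.5}

    def update(belief, likelihoods):
        """Bayes' rule: weight each hypothesis by the evidence, renormalize."""
        post = {h: belief[h] * likelihoods[h] for h in belief}
        z = sum(post.values())
        return {h: p / z for h in post}

    # Weak clue favoring box A (e.g., a faint sound near A).
    after_weak = update(prior, {"A": 0.7, "B": 0.3})
    choice_1 = max(after_weak, key=after_weak.get)      # picks "A"

    # Later, a stronger clue favoring box B (e.g., a direct glimpse of food).
    after_strong = update(after_weak, {"A": 0.1, "B": 0.9})
    choice_2 = max(after_strong, key=after_strong.get)  # flips to "B"

    print(choice_1, choice_2)
    ```

    A weak clue tips the belief toward box A; a stronger later clue overwhelms it and flips the choice to B. That flip, scaled by evidence strength rather than recency or salience, is the signature of rational revision the modeling tested for.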
    This work challenges long-held assumptions that rationality, defined as forming and updating beliefs based on evidence, belongs only to humans.

    “The difference between humans and chimpanzees isn’t a categorical leap. It’s more like a continuum,” Sanford said.
    Broader Implications for Learning, Childhood Development and AI
    Sanford believes these findings may influence how scientists think about a wide range of fields. Learning how primates update their beliefs could reshape ideas about how children learn and even how artificial intelligence systems are designed.
    “This research can help us think differently about how we approach early education or how we model reasoning in AI systems,” she said. “We shouldn’t assume children are blank slates when they walk into a classroom.”
    The next phase of the project will apply the same belief revision tasks to young children. Sanford’s team is now gathering data from two- to four-year-olds to see how they handle changing information compared with chimps.
    “It’s fascinating to design a task for chimps, and then try to adapt it for a toddler,” she said.

    Expanding the Study to Other Primates
    Sanford hopes to broaden the work to additional primate species, creating a comparative view of reasoning abilities across evolutionary branches. Her previous research spans topics from empathy in dogs to numerical understanding in children, and she notes that one theme continues to stand out: animals often demonstrate far more cognitive sophistication than people assume.
    “They may not know what science is, but they’re navigating complex environments with intelligent and adaptive strategies,” she said. “And that’s something worth paying attention to.”
    Other members of the research team include: Bill Thompson (UC Berkeley Psychology); Snow Zhang (UC Berkeley Philosophy); Joshua Rukundo (Ngamba Island Chimpanzee Sanctuary/Chimpanzee Trust, Uganda); Josep Call (School of Psychology and Neuroscience, University of St Andrews); and Esther Herrmann (School of Psychology, University of Portsmouth).


    A single beam of light runs AI with supercomputer power

    Tensor operations are a form of advanced mathematics that support many modern technologies, especially artificial intelligence. These operations go far beyond the simple calculations most people encounter. A helpful way to picture them is to imagine manipulating a Rubik’s cube in several dimensions at once by rotating, slicing, or rearranging its layers. Humans and traditional computers must break these tasks into sequences, but light can perform all of them at the same time.
    Today, tensor operations are essential for AI systems involved in image processing, language understanding, and countless other tasks. As the amount of data continues to grow, conventional digital hardware such as GPUs faces increasing strain in speed, energy use, and scalability.
    Researchers Demonstrate Single-Shot Tensor Computing With Light
    To address these challenges, an international team led by Dr. Yufeng Zhang from the Photonics Group at Aalto University’s Department of Electronics and Nanoengineering has developed a fundamentally new approach. Their method allows complex tensor calculations to be completed within a single movement of light through an optical system. The process, described as single-shot tensor computing, functions at the speed of light.
    “Our method performs the same kinds of operations that today’s GPUs handle, like convolutions and attention layers, but does them all at the speed of light,” says Dr. Zhang. “Instead of relying on electronic circuits, we use the physical properties of light to perform many computations simultaneously.”
    Encoding Information Into Light for High-Speed Computation
    The team accomplished this by embedding digital information into the amplitude and phase of light waves, transforming numerical data into physical variations within the optical field. As these light waves interact, they automatically carry out mathematical procedures such as matrix and tensor multiplication, which form the basis of deep learning. By working with multiple wavelengths of light, the researchers expanded their technique to support even more complex, higher-order tensor operations.
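    The arithmetic behind this idea can be sketched numerically: encode signed numbers as the amplitude and phase of complex field values, modulate one field by another, and let coherent superposition sum the products, which yields a matrix-vector multiplication in a single pass. The toy NumPy model below mimics only that arithmetic, not the team's optical hardware:

    ```python
    import numpy as np

    # Toy model: a signed number becomes a complex optical field
    # (magnitude = amplitude; the sign is carried by phase 0 or pi).
    def encode(x):
        x = np.asarray(x, dtype=float)
        return np.abs(x) * np.exp(1j * np.pi * (x < 0))

    # A weight "mask" and an input vector, as illustrative data.
    W = np.array([[1.0, -2.0],
                  [0.5,  3.0]])
    x = np.array([2.0, 1.0])

    # Each output element is a coherent superposition of weighted fields:
    # element-wise modulation followed by summation is a dot product.
    field_out = encode(W) * encode(x)   # one modulation step
    y = field_out.sum(axis=1).real      # coherent summation per row

    print(y)      # matches the electronic result of W @ x
    print(W @ x)
    ```

    In the physical system this modulate-and-superpose step happens as the light propagates, so all the products and sums occur simultaneously rather than one after another as on a digital processor.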

    “Imagine you’re a customs officer who must inspect every parcel through multiple machines with different functions and then sort them into the right bins,” Zhang says. “Normally, you’d process each parcel one by one. Our optical computing method merges all parcels and all machines together — we create multiple ‘optical hooks’ that connect each input to its correct output. With just one operation, one pass of light, all inspections and sorting happen instantly and in parallel.”
    Passive Optical Processing With Wide Compatibility
    One of the most striking benefits of this method is how little intervention it requires. The necessary operations occur on their own as the light travels, so the system does not need active control or electronic switching during computation.
    “This approach can be implemented on almost any optical platform,” says Professor Zhipei Sun, leader of Aalto University’s Photonics Group. “In the future, we plan to integrate this computational framework directly onto photonic chips, enabling light-based processors to perform complex AI tasks with extremely low power consumption.”
    Path Toward Future Light-Based AI Hardware
    Zhang notes that the ultimate objective is to adapt the technique to existing hardware and platforms used by major technology companies. He estimates that the method could be incorporated into such systems within 3 to 5 years.
    “This will create a new generation of optical computing systems, significantly accelerating complex AI tasks across a myriad of fields,” he concludes.
    The study was published in Nature Photonics on November 14th, 2025.