More stories

  •

    Antarctic krill eject more food when it’s contaminated with plastic

    Antarctic krill keep revealing new superpowers.

    Euphausia superba, the Southern Ocean’s ubiquitous krill species, sequester large amounts of carbon via their profuse poop. Now, scientists have identified another way in which the swimming crustaceans may modulate Earth’s climate: by sending their leftovers down to the bottom of the sea.

    Laboratory observations of krill’s filter-feeding behavior suggest that when food is plentiful — such as during a phytoplankton bloom — ejected “boluses” of leftover food also sequester carbon, researchers report October 7 in Biology Letters. More

  •

    Physicists just built a quantum lie detector. It works

    Can you prove whether a large quantum system truly behaves according to the weird and wonderful rules of quantum mechanics — or if it just looks like it does? In a groundbreaking study, physicists from Leiden, Beijing and Hangzhou found the answer to this question.
    You could call it a ‘quantum lie detector’: a Bell test, devised by the famous physicist John Bell. This test shows whether a machine, such as a quantum computer, is truly using quantum effects or just mimicking them.
    As quantum technologies become more mature, ever more stringent tests of quantumness become necessary. In this new study, the researchers took things to the next level, testing Bell correlations in systems with up to 73 qubits — the basic building blocks of a quantum computer.
    The study involved a global team: theoretical physicists Jordi Tura and Patrick Emonts and PhD candidate Mengyao Hu from Leiden University, together with colleagues from Tsinghua University (Beijing) and experimental physicists from Zhejiang University (Hangzhou).
    The world of quantum physics
    Quantum mechanics is the science that explains how the tiniest particles in the universe — like atoms and electrons — behave. It’s a world full of strange and counterintuitive ideas.
    One of those is quantum nonlocality, where particles appear to instantly affect each other, even when far apart. Although it sounds strange, it’s a real effect, and it won the Nobel Prize in Physics in 2022. This research focuses on proving the occurrence of nonlocal correlations, also known as Bell correlations.

    Clever experimenting
    It was an extremely ambitious plan, but the team’s well-optimized strategy made all the difference. Instead of trying to directly measure the complex Bell correlations, they focused on something quantum devices are already good at: minimizing energy.
    And it paid off. The team created a special quantum state using 73 qubits in a superconducting quantum processor and measured energies far below what would be possible in a classical system. The difference was striking — 48 standard deviations — making it almost impossible that the result was due to chance.
    But the team didn’t stop there. They went on to certify a rare and more demanding type of nonlocality – known as genuine multipartite Bell correlations. In this kind of quantum correlation, all qubits in the system must be involved, making it much harder to generate — and even harder to verify. Remarkably, the researchers succeeded in preparing a whole series of low-energy states that passed this test up to 24 qubits, confirming these special correlations efficiently.
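    The logic of such a test can be sketched in miniature with the two-qubit CHSH inequality (a textbook toy example, not the paper's 73-qubit, energy-based certificate): any local classical model obeys |S| ≤ 2, while an entangled quantum state reaches 2√2.

```python
import numpy as np

# Minimal CHSH sketch: two parties each choose between two measurement
# angles; the combination S of the four correlations cannot exceed 2 in
# any classical (local hidden variable) model.
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def spin(theta):
    """Spin observable at angle theta in the X-Z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Maximally entangled state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.array([1., 0., 0., 1.]) / np.sqrt(2)

def E(a, b):
    """Correlation <Phi+| A(a) (x) B(b) |Phi+>."""
    return phi @ np.kron(spin(a), spin(b)) @ phi

# Measurement angles that maximize the quantum violation
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4
S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(f"S = {S:.3f}")   # 2*sqrt(2) ~ 2.828, above the classical bound of 2
```

    The paper scales this idea up by recasting the Bell test as an energy-minimization problem, something quantum hardware is already good at.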
    This result shows that quantum computers are not just getting bigger — they are also becoming better at displaying and proving truly quantum behaviour.
    Why this matters
    This study proves that it’s possible to certify deep quantum behaviour in large, complex systems — something never done at this scale before. It’s a big step toward making sure quantum computers are truly quantum.
    These insights are more than just theoretical. Understanding and controlling Bell correlations could improve quantum communication, make cryptography more secure, and help develop new quantum algorithms. More

  •

    Scientists accidentally create a tiny “rainbow chip” that could supercharge the internet

    A few years ago, researchers in Michal Lipson’s lab noticed something remarkable.
    They were working on a project to improve LiDAR, a technology that uses lightwaves to measure distance. The lab was designing high-power chips that could produce brighter beams of light.
    “As we sent more and more power through the chip, we noticed that it was creating what we call a frequency comb,” says Andres Gil-Molina, a former postdoctoral researcher in Lipson’s lab.
    A frequency comb is a special type of light that contains many colors lined up next to each other in an orderly pattern, kind of like a rainbow. Dozens of colors — or frequencies of light — shine brightly, while the gaps between them remain dark. When you look at a frequency comb on a spectrogram, these bright frequencies appear as spikes, or teeth on a comb. This offers the tremendous opportunity of sending dozens of streams of data simultaneously. Because the different colors of light don’t interfere with each other, each tooth acts as its own channel.
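    The "teeth" are easy to see numerically: a train of identical pulses repeating at a rate f_rep has a spectrum whose bright lines sit exactly at multiples of f_rep. The numbers below are made up for illustration, not the chip's actual parameters.

```python
import numpy as np

# A pulse train repeating at f_rep produces spectral lines ("teeth")
# spaced exactly f_rep apart. All values here are illustrative.
fs = 10_000                      # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)    # 1 second of signal
f_rep = 100                      # repetition rate -> tooth spacing (Hz)
sigma = 2e-4                     # pulse width (s)

# Gaussian pulse repeated every 1/f_rep seconds (extra pulses at the
# edges keep the analysis window effectively periodic)
signal = sum(np.exp(-((t - k / f_rep) ** 2) / (2 * sigma ** 2))
             for k in range(-1, f_rep + 1))

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# The five strongest lines fall on the first multiples of f_rep
teeth = freqs[np.argsort(spectrum)[-5:]]
print(sorted((teeth / f_rep).round().astype(int).tolist()))   # [0, 1, 2, 3, 4]
```
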
    Today, creating a powerful frequency comb requires large and expensive lasers and amplifiers. In their new paper in Nature Photonics, Lipson, Eugene Higgins Professor of Electrical Engineering and professor of Applied Physics, and her collaborators show how to do the same thing on a single chip.
    “Data centers have created tremendous demand for powerful and efficient sources of light that contain many wavelengths,” says Gil-Molina, who is now a principal engineer at Xscape Photonics. “The technology we’ve developed takes a very powerful laser and turns it into dozens of clean, high-power channels on a chip. That means you can replace racks of individual lasers with one compact device, cutting cost, saving space, and opening the door to much faster, more energy-efficient systems.”
    “This research marks another milestone in our mission to advance silicon photonics,” Lipson said. “As this technology becomes increasingly central to critical infrastructure and our daily lives, this type of progress is essential to ensuring that data centers are as efficient as possible.”
    Cleaning up messy light

    The breakthrough started with a simple question: What’s the most powerful laser we can put on a chip?
    The team chose a type called a multimode laser diode, which is used widely in applications like medical devices and laser cutting tools. These lasers can produce enormous amounts of light, but the beam is “messy,” which makes it hard to use for precise applications.
    Integrating such a laser into a silicon photonics chip, where the light pathways are just a few microns — even hundreds of nanometers — wide, required careful engineering.
    “We used something called a locking mechanism to purify this powerful but very noisy source of light,” Gil-Molina says. The method relies on silicon photonics to reshape and clean up the laser’s output, producing a much cleaner, more stable beam, a property scientists call high coherence.
    Once the light is purified, the chip’s nonlinear optical properties take over, splitting that single powerful beam into dozens of evenly spaced colors, a defining feature of a frequency comb. The result is a compact, high-efficiency light source that combines the raw power of an industrial laser with the precision and stability needed for advanced communications and sensing.
    Why it matters now
    The timing for this breakthrough is no accident. With the explosive growth of artificial intelligence, the infrastructure inside data centers is straining to move information fast enough, for example, between processors and memory. State-of-the-art data centers are already using fiber optic links to transport data, but most of these still rely on single-wavelength lasers.

    Frequency combs change that. Instead of one beam carrying one data stream, dozens of beams can run in parallel through the same fiber. That’s the principle behind wavelength-division multiplexing (WDM), the technology that turned the internet into a global high-speed network in the late 1990s.
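    A toy sketch of the WDM idea, with made-up carrier frequencies and simple on-off keying standing in for real optical modulation: three bit streams share one line, each riding its own "color", and each is recovered independently by filtering around its carrier.

```python
import numpy as np

# Three on-off keyed data streams, each on its own carrier frequency,
# summed onto one line and recovered by band-pass filtering.
# Carriers, rates, and filter width are all illustrative.
fs, n_bits, sps = 100_000, 8, 1000        # sample rate, bits/channel, samples/bit
t = np.arange(n_bits * sps) / fs
carriers = [5_000, 10_000, 15_000]        # Hz: one "color" per channel
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=(3, n_bits))

# Multiplex: sum of amplitude-modulated carriers on one shared line
line = sum(np.repeat(b, sps) * np.cos(2 * np.pi * f * t)
           for b, f in zip(bits, carriers))

def demux(signal, f, width=2_000):
    """Recover one channel: band-pass around its carrier, envelope-detect, threshold."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    spec[np.abs(freqs - f) > width / 2] = 0          # keep only this channel's band
    envelope = np.abs(np.fft.irfft(spec, len(signal)))
    per_bit = envelope.reshape(n_bits, sps).mean(axis=1)
    return (per_bit > 0.3).astype(int)   # mean |cos| of an "on" bit is ~0.64

recovered = np.array([demux(line, f) for f in carriers])
print(bool((recovered == bits).all()))   # all three streams recovered
```

    Because the carriers are well separated, the band-pass around one channel never overlaps its neighbors, which is the same reason WDM colors don't interfere in a fiber.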
    By making high-power, multi-wavelength combs small enough to fit directly on a chip, Lipson’s team has made it possible to bring this capability into the most compact, cost-sensitive parts of modern computing systems. Beyond data centers, the same chips could enable portable spectrometers, ultra-precise optical clocks, compact quantum devices, and even advanced LiDAR systems.
    “This is about bringing lab-grade light sources into real-world devices,” says Gil-Molina. “If you can make them powerful, efficient, and small enough, you can put them almost anywhere.” More

  •

    These little robots literally walk on water

    Imagine a tiny robot, no bigger than a leaf, gliding across a pond’s surface like a water strider. One day, devices like this could track pollutants, collect water samples or scout flooded areas too risky for people.
    Baoxing Xu, professor of mechanical and aerospace engineering at the University of Virginia’s School of Engineering and Applied Science, is pioneering a way to build them. In a new study published in Science Advances, Xu’s research introduces HydroSpread, a first-of-its-kind fabrication method that has great potential to impact the growing field of soft robotics. This innovation allows scientists to make soft, floating devices directly on water, a technology that could be utilized in fields from health care to electronics to environmental monitoring.
    Until now, the thin, flexible films used in soft robotics had to be manufactured on rigid surfaces like glass and then peeled off and transferred to water, a delicate process that often caused films to tear. HydroSpread sidesteps this issue by letting liquid itself serve as the “workbench.” Droplets of liquid polymer naturally spread into ultrathin, uniform sheets on the water’s surface. With a finely tuned laser, Xu’s team can then carve these sheets into complex patterns — circles, strips, even the UVA logo — with remarkable precision.
    Using this approach, the researchers built two insect-like prototypes: HydroFlexor, which paddles across the surface using fin-like motions, and HydroBuckler, which “walks” forward on buckling legs, inspired by water striders. In the lab, the team powered these devices with an overhead infrared heater. As the films warmed, their layered structure bent or buckled, creating paddling or walking motions. By cycling the heat on and off, the devices could adjust their speed and even turn — proof that controlled, repeatable movement is possible. Future versions could be designed to respond to sunlight, magnetic fields or tiny embedded heaters, opening the door to autonomous soft robots that can move and adapt on their own.
    “Fabricating the film directly on liquid gives us an unprecedented level of integration and precision,” Xu said. “Instead of building on a rigid surface and then transferring the device, we let the liquid do the work to provide a perfectly smooth platform, reducing failure at every step.”
    The potential reaches beyond soft robots. By making it easier to form delicate films without damaging them, HydroSpread could open new possibilities for creating wearable medical sensors, flexible electronics and environmental monitors — tools that need to be thin, soft and durable in settings where traditional rigid materials don’t work.
    About the Researcher
    Baoxing Xu is a nationally recognized expert in mechanics, compliant structures and bioinspired engineering. His lab at UVA Engineering focuses on translating strategies from nature — such as the delicate mechanics of insect locomotion — into resilient, functional devices for human use.
    This work, supported by the National Science Foundation and 4-VA, was carried out within UVA’s Department of Mechanical and Aerospace Engineering. Graduate and undergraduate researchers in Xu’s group played a central role in the experiments, gaining hands-on experience with state-of-the-art fabrication and robotics techniques. More

  •

    Scientists finally found the “dark matter” of electronics

    In a world-first, researchers from the Femtosecond Spectroscopy Unit at the Okinawa Institute of Science and Technology (OIST) have directly observed the evolution of the elusive dark excitons in atomically thin materials, laying the foundation for new breakthroughs in both classical and quantum information technologies. Their findings have been published in Nature Communications. Professor Keshav Dani, head of the unit, highlights the significance: “Dark excitons have great potential as information carriers, because they are inherently less likely to interact with light, and hence less prone to degradation of their quantum properties. However, this invisibility also makes them very challenging to study and manipulate. Building on a previous breakthrough at OIST in 2020, we have opened a route to the creation, observation, and manipulation of dark excitons.”
    “In the general field of electronics, one manipulates electron charge to process information,” explains Xing Zhu, co-first author and PhD student in the unit. “In the field of spintronics, we exploit the spin of electrons to carry information. Going further, in valleytronics, the crystal structure of unique materials enables us to encode information into distinct momentum states of the electrons, known as valleys.” The ability to use the valley dimension of dark excitons to carry information positions them as promising candidates for quantum technologies. Dark excitons are by nature more resistant to environmental factors like thermal background than the current generation of qubits, potentially requiring less extreme cooling and making them less prone to decoherence, where the unique quantum state breaks down.
    Defining landscapes of energy with bright and dark excitons
    Over the past decade, progress has been made in the development of a class of atomically thin semiconducting materials known as TMDs (transition metal dichalcogenides). As with all semiconductors, atoms in TMDs are aligned in a crystal lattice, which confines electrons to a specific level (or band) of energy, such as the valence band. When exposed to light, the negatively charged electrons are excited to a higher energy state – the conduction band – leaving behind a positively charged hole in the valence band. The electrons and holes are bound together by electrostatic attraction, forming hydrogen-like quasiparticles called excitons. If certain quantum properties of the electron and hole match, i.e. they have the same spin configuration and inhabit the same ‘valley’ in momentum space (the energy minima that electrons and holes can occupy in the atomic crystal structure), the two recombine within a picosecond (1 ps = 10⁻¹² second), emitting light in the process. These are ‘bright’ excitons.
    However, if the quantum properties of the electron and hole do not match up, the electron and hole are forbidden from recombining on their own and do not emit light. These are characterized as ‘dark’ excitons. “There are two ‘species’ of dark excitons,” explains Dr. David Bacon, co-first author who is now at University College London, “momentum-dark and spin-dark, depending on where the properties of electron and hole are in conflict. The mismatch in properties not only prevents immediate recombination, allowing them to exist up to several nanoseconds (1 ns = 10⁻⁹ second – a much more useful timescale), but also makes dark excitons more isolated from environmental interactions.”
    “The unique atomic symmetry of TMDs means that when exposed to a state of light with a circular polarization, one can selectively create bright excitons only in a specific valley. This is the fundamental principle of valleytronics. However, bright excitons rapidly turn into numerous dark excitons that can potentially preserve the valley information. Which species of dark excitons are involved and to what degree they can sustain the valley information is unclear, but this is a key step in the pursuit of valleytronic applications,” explains Dr. Vivek Pareek, co-first author and OIST graduate who is now a Presidential Postdoctoral Fellow at the California Institute of Technology.
    Observing electrons at the femtosecond scale
    Using the world-leading TR-ARPES (time- and angle-resolved photoemission spectroscopy) setup at OIST, which includes a proprietary, table-top XUV (extreme ultraviolet) source, the team tracked the characteristics of all excitons over time after creating bright excitons in a specific valley of a TMD semiconductor, simultaneously quantifying the momentum, spin state, and population levels of electrons and holes – properties that had never been quantified simultaneously before.
    Their findings show that within a picosecond, some bright excitons are scattered by phonons (quantized crystal lattice vibrations) into different momentum valleys, rendering them momentum-dark. Later, spin-dark excitons dominate, where electrons have flipped spin within the same valley, persisting on nanosecond scales.
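    That sequence can be caricatured with a three-level rate model. The lifetimes below are order-of-magnitude stand-ins motivated by the timescales in the text (about a picosecond for bright excitons, nanoseconds for spin-dark ones); they are not fitted values from the study.

```python
# Three-level rate model: bright -> momentum-dark -> spin-dark.
# All lifetimes are illustrative order-of-magnitude guesses.
tau_bright = 1e-12      # bright excitons scatter away within ~1 ps
tau_mdark = 50e-12      # momentum-dark -> spin-dark conversion (assumed)
tau_sdark = 2e-9        # spin-dark lifetime, nanosecond scale

dt, steps = 1e-13, 40_000          # 0.1 ps steps, out to 4 ns
bright, mdark, sdark = 1.0, 0.0, 0.0
history = []
for i in range(steps):             # explicit Euler integration
    flow_bm = bright / tau_bright  # bright -> momentum-dark scattering
    flow_ms = mdark / tau_mdark    # momentum-dark -> spin-dark conversion
    bright -= flow_bm * dt
    mdark += (flow_bm - flow_ms) * dt
    sdark += (flow_ms - sdark / tau_sdark) * dt
    history.append((i * dt, bright, mdark, sdark))

# By 0.5 ns the bright and momentum-dark populations have emptied into
# the long-lived spin-dark reservoir.
_, b, m, s = history[5000]         # t = 0.5 ns
print(f"at 0.5 ns: bright={b:.3f}, momentum-dark={m:.3f}, spin-dark={s:.2f}")
```
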
    With this, the team has overcome the fundamental challenge of how to access and track dark excitons, laying the foundation for dark valleytronics as a field. Dr. Julien Madéo of the unit summarizes: “Thanks to the sophisticated TR-ARPES setup at OIST, we have directly accessed and mapped how and which dark excitons keep long-lived valley information. Future developments to read out the dark excitons’ valley properties will unlock broad dark valleytronic applications across information systems.” More

  •

    Scientists just recreated a wildfire that made its own weather

    On September 5, 2020, California’s Creek Fire grew so severe that it began producing its own weather system. The fire’s extreme heat produced an explosive thunderhead that spewed lightning strikes and further fanned the roaring flames, making containment elusive and endangering the lives of firefighters on the ground. These wildfire-born storms have become a growing part of fire seasons across the West, with lasting impacts on air quality, weather, and climate. Until now, scientists have struggled to replicate them in Earth system models, hindering our ability to predict their occurrence and understand their impacts on the global climate. Now, a new study provides a breakthrough by developing a novel wildfire-Earth system modeling framework.
    The research, published September 25th in Geophysical Research Letters, represents the first successful simulation of these wildfire-induced storms, known as pyrocumulonimbus clouds, within an Earth system model. Led by DRI scientist Ziming Ke, the study successfully reproduced the observed timing, height, and strength of the Creek Fire’s thunderhead – one of the largest known pyrocumulonimbus clouds seen in the U.S., according to NASA. The model also replicated multiple thunderstorms produced by the 2021 Dixie Fire, which occurred under very different conditions. Key to their findings was accounting for the way terrain and winds loft moisture into the higher reaches of the atmosphere, aiding cloud development.
    “This work is a first-of-its-kind breakthrough in Earth system modeling,” Ke said. “It not only demonstrates how extreme wildfire events can be studied within Earth system models, but also establishes DRI’s growing capability in Earth system model development — a core strength that positions the institute to lead future advances in wildfire-climate science.”
    When a pyrocumulonimbus cloud forms, it injects smoke and moisture into the upper atmosphere at magnitudes comparable to those of small volcanic eruptions, impacting the way Earth’s atmosphere receives and reflects sunlight. These fire aerosols can persist for months or longer, altering stratospheric composition. When transported to polar regions, they affect Antarctic ozone dynamics, modify clouds and albedo, and accelerate ice and snow melt, reshaping polar climate feedbacks. Scientists estimate that tens to hundreds of these storms occur globally each year, and that the trend of increasingly severe wildfires will only grow their numbers. Until now, failing to incorporate these storms into Earth system models has hindered our ability to understand this natural disturbance’s impact on global climate.
    The research team also included scientists from Lawrence Livermore National Laboratory, U.C. Irvine, and Pacific Northwest National Laboratory. Their breakthrough leveraged the Department of Energy’s (DOE) Energy Exascale Earth System Model (E3SM) to successfully capture the complex interplay between wildfires and the atmosphere.
    “Our team developed a novel wildfire-Earth system modeling framework that integrates high-resolution wildfire emissions, a one-dimensional plume-rise model, and fire-induced water vapor transport into DOE’s cutting-edge Earth system model,” Ke said. “This breakthrough advances high-resolution modeling of extreme hazards to improve national resilience and preparedness, and provides the framework for future exploration of these storms at regional and global scales within Earth system models.” More
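    The study's own plume-rise component is not reproduced here, but the classic Briggs formulas give a feel for what a one-dimensional plume-rise calculation does: turn a heat source's buoyancy flux and the ambient wind into an injection height. All input values below are hypothetical.

```python
# Generic Briggs buoyancy plume-rise estimate (a standard textbook
# formula shown only to illustrate the kind of quantity a 1-D plume-rise
# model computes; this is not the study's model or parameters).
def buoyancy_flux(width, exit_v, t_plume, t_air):
    """Buoyancy flux F (m^4/s^3) for a hot circular updraft of given diameter."""
    g = 9.81
    return g * exit_v * width**2 / 4 * (t_plume - t_air) / t_plume

def briggs_final_rise(F, wind):
    """Final plume rise (m) under neutral conditions (Briggs, as used in
    EPA dispersion models): two regimes split at F = 55 m^4/s^3."""
    if F < 55:
        return 21.425 * F**0.75 / wind
    return 38.71 * F**0.6 / wind

# Hypothetical fire-front values: 200 m wide updraft, 10 m/s exit
# velocity, 600 K plume over 300 K air, 5 m/s wind.
F = buoyancy_flux(width=200, exit_v=10, t_plume=600, t_air=300)
rise = briggs_final_rise(F, wind=5)
print(f"buoyancy flux ~ {F:.0f} m^4/s^3, plume rise ~ {rise:.0f} m")
```

    Even this crude estimate lands in the tens of kilometers for fire-scale heat sources, which is why pyrocumulonimbus clouds can inject smoke into the stratosphere.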

  •

    DOLPHIN AI uncovers hundreds of invisible cancer markers

    McGill University researchers have developed an artificial intelligence tool that can detect previously invisible disease markers inside single cells.
    In a study published in Nature Communications, the researchers demonstrate how the tool, called DOLPHIN, could one day be used by doctors to catch diseases earlier and guide treatment options.
    “This tool has the potential to help doctors match patients with the therapies most likely to work for them, reducing trial-and-error in treatment,” said senior author Jun Ding, assistant professor in McGill’s Department of Medicine and a junior scientist at the Research Institute of the McGill University Health Centre.
    Zooming in on genetic building blocks
    Disease markers are often subtle changes in RNA expression that can indicate when a disease is present, how severe it may become or how it might respond to treatment.
    Conventional gene-level methods of analysis collapse these markers into a single count per gene, masking critical variation and capturing only the tip of the iceberg, said the researchers.
    Now, advances in artificial intelligence have made it possible to capture the fine-grained complexity of single-cell data. DOLPHIN moves beyond gene-level, zooming in to see how genes are spliced together from smaller pieces called exons to provide a clearer view of cell states.

    “Genes are not just one block, they’re like Lego sets made of many smaller pieces,” said first author Kailu Song, a PhD student in McGill’s Quantitative Life Sciences program. “By looking at how those pieces are connected, our tool reveals important disease markers that have long been overlooked.”
    In one test case, DOLPHIN analyzed single-cell data from pancreatic cancer patients and found more than 800 disease markers missed by conventional tools. It was able to distinguish patients with high-risk, aggressive cancers from those with less severe cases, information that would help doctors choose the right treatment path.
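    A minimal illustration of why exon-level counts reveal more than per-gene totals (the gene, exons, and counts are invented for illustration, not taken from the study): two cells can have identical gene-level counts yet clearly different exon usage.

```python
import numpy as np

# Each row holds counts for the three exons of a hypothetical "GENE_A"
# in one cell.
exon_counts = np.array([
    [30, 0, 30],   # cell 1: skips exon 2 (one splice isoform)
    [20, 20, 20],  # cell 2: uses all three exons evenly
])

# Gene-level analysis collapses each row to a single number...
gene_counts = exon_counts.sum(axis=1)
print(gene_counts.tolist())        # [60, 60] -> the cells look identical

# ...but the normalized exon-usage profiles separate the cells clearly.
usage = exon_counts / exon_counts.sum(axis=1, keepdims=True)
isoform_gap = np.abs(usage[0] - usage[1]).sum()
print(isoform_gap > 0)             # True: the splicing difference is visible
```
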
    A step toward ‘virtual cells’
    More broadly, the breakthrough lays the foundation for achieving the long-term goal of building digital models of human cells. DOLPHIN generates richer single-cell profiles than conventional methods, enabling virtual simulations of how cells behave and respond to drugs before moving to lab or clinical trials, saving time and money.
    The researchers’ next step will be to expand the tool’s reach from a few datasets to millions of cells, paving the way for more accurate virtual cell models in the future.
    About the study
    “DOLPHIN advances single-cell transcriptomics beyond gene level by leveraging exon and junction reads” by Kailu Song and Jun Ding et al., was published in Nature Communications.
    This research was supported by the Meakins-Christie Chair in Respiratory Research, the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council of Canada and the Fonds de recherche du Québec. More