More stories

  • Scientists discover a way to simulate the Universe on a laptop

    As astronomers gather more data than ever before, studying the cosmos has become an increasingly complex task. A new tool is changing that. Researchers have now developed a way to analyze enormous cosmic data sets using only a laptop and a few hours of processing time.
    Leading this effort is Dr. Marco Bonici, a postdoctoral researcher at the Waterloo Centre for Astrophysics at the University of Waterloo. Bonici and an international team created Effort.jl, short for EFfective Field theORy surrogate. This tool uses advanced numerical techniques and smart data-preprocessing methods to deliver exceptional computational performance while maintaining the accuracy required in cosmology. The team designed it as a powerful emulator for the Effective Field Theory of Large-Scale Structure (EFTofLSS), allowing researchers to process vast datasets more efficiently than ever before.
    Turning Frustration Into Innovation
    The idea for Effort.jl emerged from Bonici’s experience running time-consuming computer models. Each time he adjusted even a single parameter, it could take days of extra computation to see the results. That challenge inspired him to build a faster, more flexible solution that could handle such adjustments in hours rather than days.
    “Using Effort.jl, we can run through complex data sets on models like EFTofLSS, which have previously needed a lot of time and computer power,” Bonici explained. “With projects like DESI and Euclid expanding our knowledge of the universe and creating even larger astronomical datasets to explore, Effort.jl allows researchers to analyze data faster, inexpensively and multiple times while making small changes based on nuances in the data.”
    Smarter Simulations for a Faster Universe
    Effort.jl belongs to a class of tools known as emulators. These are trained computational shortcuts that replicate the behavior of large, resource-intensive simulations but run dramatically faster. By using emulators, scientists can explore many possible cosmic scenarios in a fraction of the time and apply advanced techniques such as gradient-based sampling to study intricate physical models with greater efficiency.
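The emulator idea can be illustrated with a toy sketch in plain Python/NumPy (not Effort.jl itself, and using a made-up one-parameter "model"): precompute a handful of expensive simulation runs, fit a cheap surrogate to them, and answer all later queries from the surrogate.

```python
import numpy as np

# Toy "expensive simulation": a smooth function of one parameter.
# Purely illustrative; it stands in for a real model like EFTofLSS.
def expensive_model(theta):
    return np.sin(3 * theta) + 0.5 * theta**2

# "Train" the emulator: fit a cheap polynomial surrogate on a small
# grid of precomputed runs of the expensive model.
train_theta = np.linspace(-1, 1, 20)
train_out = expensive_model(train_theta)
coeffs = np.polyfit(train_theta, train_out, deg=8)

# Query the emulator at many new parameter values: evaluating the
# polynomial is far cheaper than re-running the simulation each time.
test_theta = np.linspace(-1, 1, 1000)
emulated = np.polyval(coeffs, test_theta)
truth = expensive_model(test_theta)

max_err = np.max(np.abs(emulated - truth))
print(f"worst-case emulator error: {max_err:.2e}")
```

Real emulators such as Effort.jl replace the polynomial with trained neural networks and are differentiable, which is what enables the gradient-based sampling mentioned above; the principle of "precompute once, query cheaply" is the same.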

    “We were able to validate the predictions coming out of Effort.jl by aligning them with those coming out of EFTofLSS,” Bonici said. “The margin of error was small and showed us that the calculations coming out of Effort.jl are strong. Effort.jl can also handle observational quirks like distortions in data and can be customized very easily to the needs of the researcher.”
    Human Expertise Still Matters
    Despite its impressive capabilities, Effort.jl is not a substitute for scientific understanding. Cosmologists still play a vital role in setting parameters, interpreting results, and applying physical insight to ensure meaningful conclusions. The combination of expert knowledge and computational power is what makes the system so effective.
    Looking ahead, Effort.jl is expected to take on even larger cosmological datasets and work alongside other analytical tools. Researchers also see potential for its methods in areas beyond astrophysics, including weather and climate modeling.
    The paper, “Effort.jl: a fast and differentiable emulator for the Effective Field Theory of the Large Scale Structure of the Universe,” was published in the Journal of Cosmology and Astroparticle Physics.

  • Deep Antarctic waters hold geometric communities of fish nests

    Carly Kay is the Fall 2025 science writing intern at Science News. She holds a bachelor’s degree in communication from the University of California, Santa Barbara and a master’s degree in science communication from the University of California, Santa Cruz.

  • A revolutionary DNA search engine is speeding up genetic discovery

    Rare genetic diseases can now be detected in patients, and tumor-specific mutations identified — a milestone made possible by DNA sequencing, which transformed biomedical research decades ago. In recent years, the introduction of new sequencing technologies (next-generation sequencing) has driven a wave of breakthroughs. During 2020 and 2021, for instance, these methods enabled the rapid decoding and worldwide monitoring of the SARS-CoV-2 genome.
    At the same time, an increasing number of researchers are making their sequencing results publicly accessible. This has led to an explosion of data, stored in major databases such as the American SRA (Sequence Read Archive) and the European ENA (European Nucleotide Archive). Together, these archives now hold about 100 petabytes of information — roughly equivalent to the total amount of text found across the entire internet, with a single petabyte equaling one million gigabytes.
    Until now, biomedical scientists needed enormous computing resources to search through these vast genetic repositories and compare them with their own data, making comprehensive searches nearly impossible. Researchers at ETH Zurich have now developed a way to overcome that limitation.
    Full-text search instead of downloading entire data sets
    The team created a tool called MetaGraph, which dramatically streamlines and accelerates the process. Instead of downloading entire datasets, MetaGraph enables direct searches within the raw DNA or RNA data — much like using an internet search engine. Scientists simply enter a genetic sequence of interest into a search field and, within seconds or minutes depending on the query, can see where that sequence appears in global databases.
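The search-engine analogy can be sketched with a toy k-mer index: every length-k substring of each dataset is recorded, and a query matches the datasets that contain all of its k-mers. This illustrates the principle only; MetaGraph's real index uses compressed graph structures, and the sample names below are invented.

```python
# Toy k-mer index: the basic idea behind sequence search engines.
from collections import defaultdict

K = 4

def kmers(seq, k=K):
    """All length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def build_index(datasets):
    """Map each k-mer to the set of dataset names containing it."""
    index = defaultdict(set)
    for name, seq in datasets.items():
        for km in kmers(seq):
            index[km].add(name)
    return index

def search(index, query):
    """Return datasets containing every k-mer of the query."""
    hits = [index.get(km, set()) for km in kmers(query)]
    return set.intersection(*hits) if hits else set()

datasets = {
    "sample_A": "ACGTACGTGGA",
    "sample_B": "TTGGACGTACC",
}
index = build_index(datasets)
print(search(index, "ACGTAC"))  # both samples share this subsequence
```

Building the index is the expensive one-time step; afterwards each query is a handful of dictionary lookups, which is why searches return in seconds rather than requiring bulk downloads.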
    “It’s a kind of Google for DNA,” explains Professor Gunnar Rätsch, a data scientist in ETH Zurich’s Department of Computer Science. Previously, researchers could only search for descriptive metadata and then had to download the full datasets to access raw sequences. That approach was slow, incomplete, and expensive.
    According to the study authors, MetaGraph is also remarkably cost-efficient. Representing all publicly available biological sequences would require only a few computer hard drives, and large queries would cost no more than about $0.74 per megabase.

    Because the new DNA search engine is both fast and accurate, it could significantly accelerate research — particularly in identifying emerging pathogens or analyzing genetic factors linked to antibiotic resistance. The system may even help locate beneficial viruses that destroy harmful bacteria (bacteriophages) hidden within these massive databases.
    Compression by a factor of 300
    In their study published on October 8 in Nature, the ETH team demonstrated how MetaGraph works. The tool organizes and compresses genetic data using advanced mathematical graphs that structure information more efficiently, similar to how spreadsheet software arranges values. “Mathematically speaking, it is a huge matrix with millions of columns and trillions of rows,” Rätsch explains.
    Creating indexes to make large datasets searchable is a familiar concept in computer science, but the ETH approach stands out for how it connects raw data with metadata while achieving an extraordinary compression rate of about 300 times. This reduction works much like summarizing a book — it removes redundancies while preserving the essential narrative and relationships, retaining all relevant information in a much smaller form.
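The compression intuition can be sketched with a toy run-length encoder over a sparse presence/absence matrix, with rows standing in for k-mers and columns for samples. The numbers and the encoding are illustrative only; MetaGraph's actual representation is far more sophisticated, which is how it reaches a factor of about 300.

```python
import numpy as np

# Toy illustration of why a k-mer presence matrix compresses well:
# most k-mers appear in only a few samples, so the 0/1 entries are
# highly redundant and even naive run-length encoding shrinks them.
rng = np.random.default_rng(0)
matrix = (rng.random((100_000, 8)) < 0.01).astype(np.uint8)

def run_length_encode(bits):
    """Encode a flat 0/1 array as (value, run length) pairs."""
    flat = bits.ravel()
    change = np.flatnonzero(np.diff(flat)) + 1
    starts = np.concatenate(([0], change))
    ends = np.concatenate((change, [flat.size]))
    return [(int(flat[s]), int(e - s)) for s, e in zip(starts, ends)]

runs = run_length_encode(matrix)
raw_bytes = matrix.size            # one byte per entry, uncompressed
encoded_bytes = 5 * len(runs)      # rough cost: 1 value byte + 4 length bytes
print(f"compression factor ≈ {raw_bytes / encoded_bytes:.0f}x")
```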
    “We are pushing the limits of what is possible in order to keep the data sets as compact as possible without losing necessary information,” says Dr. André Kahles, who, like Rätsch, is a member of the Biomedical Informatics Group at ETH Zurich. Unlike other DNA search tools currently under development, the ETH approach is scalable: the larger the amount of data queried, the less additional computing power the tool requires.
    Half of the data is already available now
    First introduced in 2020, MetaGraph has been steadily refined. The tool is now publicly accessible for searches (https://metagraph.ethz.ch/search) and already indexes millions of DNA, RNA, and protein sequences from viruses, bacteria, fungi, plants, animals, and humans. Currently, nearly half of all available global sequence datasets are included, with the remainder expected to follow by the end of the year. Since MetaGraph is open source, it could also attract interest from pharmaceutical companies managing large volumes of internal research data.
    Kahles even believes it is possible that the DNA search engine will one day be used by private individuals: “In the early days, even Google didn’t know exactly what a search engine was good for. If the rapid development in DNA sequencing continues, it may become commonplace to identify your balcony plants more precisely.”

  • Breakthrough optical processor lets AI compute at the speed of light

    Modern artificial intelligence (AI) systems, from robotic surgery to high-frequency trading, rely on processing streams of raw data in real time. Extracting important features quickly is critical, but conventional digital processors are hitting physical limits. Traditional electronics can no longer reduce latency or increase throughput enough to keep up with today’s data-heavy applications.
    Turning to Light for Faster Computing
    Researchers are now looking to light as a solution. Optical computing — using light instead of electricity to handle complex calculations — offers a way to dramatically boost speed and efficiency. One promising approach involves optical diffraction operators, thin plate-like structures that perform mathematical operations as light passes through them. These systems can process many signals at once with low energy use. However, maintaining the stable, coherent light needed for such computations at speeds above 10 GHz has proven extremely difficult.
    To overcome this challenge, a team led by Professor Hongwei Chen at Tsinghua University in China developed a groundbreaking device known as the Optical Feature Extraction Engine, or OFE2. Their work, published in Advanced Photonics Nexus, demonstrates a new way to perform high-speed optical feature extraction suitable for multiple real-world applications.
    How OFE2 Prepares and Processes Data
    A key advance in OFE2 is its innovative data preparation module. Supplying fast, parallel optical signals to the core optical components without losing phase stability is one of the toughest problems in the field. Fiber-based systems often introduce unwanted phase fluctuations when splitting and delaying light. The Tsinghua team solved this by designing a fully integrated on-chip system with adjustable power splitters and precise delay lines. This setup converts serial data into several synchronized optical channels. In addition, an integrated phase array allows OFE2 to be easily reconfigured for different computational tasks.
    Once prepared, the optical signals pass through a diffraction operator that performs the feature extraction. This process is similar to a matrix-vector multiplication, where light waves interact to create focused “bright spots” at specific output points. By fine-tuning the phase of the input light, these spots can be directed toward chosen output ports, enabling OFE2 to capture subtle variations in the input data over time.
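As a rough numerical illustration of the principle (not OFE2's actual design), a diffractive layer can be modeled as a fixed complex matrix acting on the input light field; photodetectors then read out intensities, and changing the input phases steers which output port is brightest.

```python
import numpy as np

# Toy sketch of diffraction-based computing: propagation through a
# diffractive layer behaves like multiplication by a complex matrix,
# and detectors measure |output field|^2 at each port. All values
# here are invented for illustration.
rng = np.random.default_rng(42)
n_in, n_out = 8, 4

# Fixed "diffraction operator": unit-modulus complex entries model
# the phase delays imprinted by the diffractive structure.
W = np.exp(1j * 2 * np.pi * rng.random((n_out, n_in)))

x = rng.random(n_in)                      # input signal amplitudes
for phase_setting in (0.0, np.pi / 2):
    phases = np.exp(1j * phase_setting * np.arange(n_in))
    field_out = W @ (x * phases)          # the matrix-vector multiply, "in optics"
    intensity = np.abs(field_out) ** 2    # what the photodetectors measure
    print(f"phase {phase_setting:.2f}: brightest port = {intensity.argmax()}")
```

In hardware this multiply happens as light crosses the structure, in a single pass at the speed of light, which is where the picosecond-scale latency comes from.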

    Record-Breaking Optical Performance
    Operating at an impressive 12.5 GHz, OFE2 achieves a single matrix-vector multiplication in just 250.5 picoseconds — the fastest known result for this type of optical computation. “We firmly believe this work provides a significant benchmark for advancing integrated optical diffraction computing to exceed a 10 GHz rate in real-world applications,” says Chen.
    The research team tested OFE2 across multiple domains. In image processing, it successfully extracted edge features from visual data, creating paired “relief and engraving” maps that improved image classification and increased accuracy in tasks such as identifying organs in CT scans. Systems using OFE2 required fewer electronic parameters than standard AI models, proving that optical preprocessing can make hybrid AI networks both faster and more efficient.
    The team also applied OFE2 to digital trading, where it processed live market data to generate profitable buy and sell actions. After being trained with optimized strategies, OFE2 converted incoming price signals directly into trading decisions, achieving consistent returns. Because these calculations happen at the speed of light, traders could act on opportunities with almost no delay.
    Lighting the Way Toward the Future of AI
    Together, these achievements signal a major shift in computing. By moving the most demanding parts of AI processing from power-hungry electronic chips to lightning-fast photonic systems, technologies like OFE2 could usher in a new era of real-time, low-energy AI. “The advancements presented in our study push integrated diffraction operators to a higher rate, providing support for compute-intensive services in areas such as image recognition, assisted healthcare, and digital finance. We look forward to collaborating with partners who have data-intensive computational needs,” concludes Chen.

  • Polar bears provide millions of kilograms of food for other Arctic species

    In a single year, one polar bear can leave roughly 300 kilograms of prey for other animals to dine on. Altogether, the carnivores provide 7.6 million kilograms of carrion for scavengers throughout the Arctic, researchers estimate.

    The findings, reported October 28 in Oikos, highlight the crucial role these apex predators play in feeding a vast array of species and hint at how that food web might be shaken as climate change warms the Arctic, endangering polar bear populations.

  • Australia’s tropical forests now emit CO₂, clouding the COP30 talks

    Australia’s tropical forests are the world’s first to flip a worrisome switch. The forests are now putting more carbon into the atmosphere than they are taking out, researchers report in the Oct. 16 Nature.

    That switch is a clanging alarm bell for the planet’s tropical forests, sounding as world leaders prepare to gather in the heart of the Amazon rainforest to wrangle over how to address the crisis of global climate change. The 30th annual United Nations Climate Change Conference, or COP30, begins November 10 in Belém, Brazil.

  • AI restores James Webb telescope’s crystal-clear vision

    Two PhD students from Sydney have helped restore the sharp vision of the world’s most powerful space observatory without ever leaving the ground. Louis Desdoigts, now a postdoctoral researcher at Leiden University in the Netherlands, and his colleague Max Charles celebrated their achievement with tattoos of the instrument they repaired inked on their arms — an enduring reminder of their contribution to space science.
    A Groundbreaking Software Fix
    Researchers at the University of Sydney developed an innovative software solution that corrected blurriness in images captured by NASA’s multi-billion-dollar James Webb Space Telescope (JWST). Their breakthrough restored the full precision of one of the telescope’s key instruments, achieving what would once have required a costly astronaut repair mission.
    This success builds on the JWST’s only Australian-designed component, the Aperture Masking Interferometer (AMI). Created by Professor Peter Tuthill from the University of Sydney’s School of Physics and the Sydney Institute for Astronomy, the AMI allows astronomers to capture ultra-high-resolution images of stars and exoplanets. It works by combining light from different sections of the telescope’s main mirror, a process known as interferometry. When the JWST began its scientific operations, researchers noticed that AMI’s performance was being affected by faint electronic distortions in its infrared camera detector. These distortions caused subtle image fuzziness, reminiscent of the Hubble Space Telescope’s well-known early optical flaw that had to be corrected through astronaut spacewalks.
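The interferometric idea can be sketched numerically: light from two sub-apertures of the mirror produces fringes whose spacing shrinks as the aperture separation grows, which is what buys the extra angular resolution. A toy two-aperture model, with all values invented for illustration:

```python
import numpy as np

# Toy aperture-masking interferometry: two coherent sub-apertures
# separated by a baseline produce an interference fringe pattern
# on the sky. Not the AMI's actual optical layout.
wavelength = 1.0          # arbitrary units
baseline = 20.0           # separation between the two sub-apertures
angles = np.linspace(-0.2, 0.2, 2001)   # on-sky angle (radians)

# Each aperture contributes a plane wave; the detected intensity is
# the squared magnitude of their coherent sum.
phase_diff = 2 * np.pi * baseline * np.sin(angles) / wavelength
intensity = np.abs(1 + np.exp(1j * phase_diff)) ** 2

# Fringe spacing scales as wavelength / baseline: a larger baseline
# gives finer fringes, hence sharper angular resolution.
print(f"peak intensity: {intensity.max():.2f}")
```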
    Solving a Space Problem from Earth
    Instead of attempting a physical repair, PhD students Louis Desdoigts and Max Charles, working with Professor Tuthill and Associate Professor Ben Pope (at Macquarie University), devised a purely software-based calibration technique to fix the distortion from Earth.
    Their system, called AMIGO (Aperture Masking Interferometry Generative Observations), uses advanced simulations and neural networks to replicate how the telescope’s optics and electronics function in space. By pinpointing an issue where electric charge slightly spreads to neighboring pixels — a phenomenon called the brighter-fatter effect — the team designed algorithms that digitally corrected the images, fully restoring AMI’s performance.
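A toy version of this kind of correction, assuming a simple known leak kernel (the real AMIGO pipeline instead learns a detector model with neural networks): model the charge spreading as a small convolution, then invert it in Fourier space.

```python
import numpy as np

# Toy model of the "brighter-fatter" effect and a software fix:
# charge spilling into neighboring pixels is modeled as a 3x3 blur
# kernel, and the correction deconvolves it. Illustrative only.
def blur_kernel(leak=0.05):
    """Each pixel leaks a fraction of its charge to its 8 neighbors."""
    k = np.full((3, 3), leak / 8)
    k[1, 1] = 1 - leak
    return k

def embed_kernel(shape, k):
    """Place the 3x3 kernel on a full-size grid, centered at (0, 0)."""
    kp = np.zeros(shape)
    kp[:3, :3] = k
    return np.roll(kp, (-1, -1), axis=(0, 1))

def convolve2d(img, k):
    """Periodic 2-D convolution via FFT (the detector 'blurring')."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(embed_kernel(img.shape, k))))

def correct(img, k):
    """Invert the blur by dividing in Fourier space (the software fix)."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) / np.fft.fft2(embed_kernel(img.shape, k))))

true_img = np.zeros((16, 16))
true_img[8, 8] = 1.0                            # a point source (a "star")
blurred = convolve2d(true_img, blur_kernel())   # detector spreads the charge
restored = correct(blurred, blur_kernel())      # software undoes the spread
print(f"residual after correction: {np.max(np.abs(restored - true_img)):.2e}")
```

The hard part in practice is that the true "kernel" is signal-dependent and unknown, which is why AMIGO fits a generative model of the optics and electronics rather than assuming a fixed blur as this sketch does.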

    “Instead of sending astronauts to bolt on new parts, they managed to fix things with code,” Professor Tuthill said. “It’s a brilliant example of how Australian innovation can make a global impact in space science.”
    Sharper Views of the Universe
    The results have been striking. With AMIGO in use, the James Webb Space Telescope has delivered its clearest images yet, capturing faint celestial objects in unprecedented detail. This includes direct images of a dim exoplanet and a red-brown dwarf orbiting the nearby star HD 206893, about 133 light years from Earth.
    A related study led by Max Charles further demonstrated AMI’s renewed precision. Using the improved calibration, the telescope produced sharp images of a black hole jet, the fiery surface of Jupiter’s moon Io, and the dust-filled stellar winds of WR 137 — showing that JWST can now probe deeper and clearer than before.
    “This work brings JWST’s vision into even sharper focus,” Dr. Desdoigts said. “It’s incredibly rewarding to see a software solution extend the telescope’s scientific reach — and to know it was possible without ever leaving the lab.”
    Dr. Desdoigts has now landed a prestigious postdoctoral research position at Leiden University in the Netherlands.
    Both studies have been published on the preprint server arXiv. Dr. Desdoigts’ paper has been peer-reviewed and will shortly be published in the Publications of the Astronomical Society of Australia. We have published this release to coincide with the latest round of James Webb Space Telescope General Observer, Survey and Archival Research programs.
    Associate Professor Benjamin Pope, who presented on these findings at SXSW Sydney, said the research team was keen to get the new code into the hands of researchers working on JWST as soon as possible.

  • Living computers powered by mushrooms

    Fungal networks could one day replace the tiny metal components that process and store computer data, according to new research.
    Mushrooms are known for their toughness and unusual biological properties, qualities that make them attractive for bioelectronics. This emerging field blends biology and technology to design innovative, sustainable materials for future computing systems.
    Turning Mushrooms Into Living Memory Devices
    Researchers at The Ohio State University recently discovered that edible fungi, such as shiitake mushrooms, can be cultivated and guided to function as organic memristors. These components act like memory cells that retain information about previous electrical states.
    Their experiments showed that mushroom-based devices could reproduce the same kind of memory behavior seen in semiconductor chips. They may also enable the creation of other eco-friendly, brain-like computing tools that cost less to produce.
    “Being able to develop microchips that mimic actual neural activity means you don’t need a lot of power for standby or when the machine isn’t being used,” said John LaRocco, lead author of the study and a research scientist in psychiatry at Ohio State’s College of Medicine. “That’s something that can be a huge potential computational and economic advantage.”
    The Promise of Fungal Electronics
    LaRocco noted that fungal electronics are not a brand-new idea, but they are becoming increasingly practical for sustainable computing. Because fungal materials are biodegradable and inexpensive to produce, they can help reduce electronic waste. In contrast, conventional semiconductors often require rare minerals and large amounts of energy to manufacture and operate.

    “Mycelium as a computing substrate has been explored before in less intuitive setups, but our work tries to push one of these memristive systems to its limits,” he said.
    The team’s findings were published in PLOS One.
    How Scientists Tested Mushroom Memory
    To test their capabilities, researchers grew samples of shiitake and button mushrooms. Once matured, they were dehydrated to preserve them and then attached to custom electronic circuits. The mushrooms were exposed to controlled electric currents at different voltages and frequencies.
    “We would connect electrical wires and probes at different points on the mushrooms because distinct parts of it have different electrical properties,” said LaRocco. “Depending on the voltage and connectivity, we were seeing different performances.”
    Surprising Results from Mushroom Circuits
    After two months of testing, the researchers found that their mushroom-based memristor could switch between electrical states up to 5,850 times per second with about 90% accuracy. Although performance decreased at higher electrical frequencies, the team noticed that connecting multiple mushrooms together helped restore stability — much like neural connections in the human brain.
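That redundancy effect can be sketched with a toy probabilistic model (all numbers invented, not the paper's measurements): each device's switching reliability falls as the drive frequency rises, and a majority vote over several devices recovers accuracy.

```python
import random

# Toy model of memristor-like switching: per-switch success
# probability decays with frequency; wiring several devices
# together and taking a majority vote restores reliability.
random.seed(1)

def switch_ok(freq_hz, f0=20_000):
    """One device: success probability decays linearly with frequency."""
    p = max(0.0, 1.0 - freq_hz / f0)
    return random.random() < p

def accuracy(freq_hz, n_devices=1, trials=10_000):
    """Fraction of trials where a majority of devices switch correctly."""
    ok = 0
    for _ in range(trials):
        votes = sum(switch_ok(freq_hz) for _ in range(n_devices))
        if votes > n_devices / 2:
            ok += 1
    return ok / trials

single = accuracy(5_850, n_devices=1)
networked = accuracy(5_850, n_devices=5)
print(f"single device: {single:.2f}, five devices voting: {networked:.2f}")
```

Majority voting over independent devices is the same redundancy trick used in fault-tolerant electronics, and it mirrors the stabilizing effect the researchers observed when connecting multiple mushrooms.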

    Qudsia Tahmina, co-author of the study and an associate professor of electrical and computer engineering at Ohio State, said the results highlight how easily mushrooms can be adapted for computing. “Society has become increasingly aware of the need to protect our environment and ensure that we preserve it for future generations,” said Tahmina. “So that could be one of the driving factors behind new bio-friendly ideas like these.”
    The flexibility mushrooms offer also suggests possibilities for scaling up fungal computing, Tahmina said. For instance, larger mushroom systems may be useful in edge computing and aerospace exploration, while smaller ones could enhance the performance of autonomous systems and wearable devices.
    Looking Ahead: The Future of Fungal Computing
    Although organic memristors are still in their early stages, scientists aim to refine cultivation methods and shrink device sizes in future work. Achieving smaller, more efficient fungal components will be key to making them viable alternatives to traditional microchips.
    “Everything you’d need to start exploring fungi and computing could be as small as a compost heap and some homemade electronics, or as big as a culturing factory with pre-made templates,” said LaRocco. “All of them are viable with the resources we have in front of us now.”
    Other Ohio State contributors to the study include Ruben Petreaca, John Simonis, and Justin Hill. The research was supported by the Honda Research Institute.