More stories

  • Living computers powered by mushrooms

    Fungal networks could one day replace the tiny metal components that process and store computer data, according to new research.
    Mushrooms are known for their toughness and unusual biological properties, qualities that make them attractive for bioelectronics. This emerging field blends biology and technology to design innovative, sustainable materials for future computing systems.
    Turning Mushrooms Into Living Memory Devices
    Researchers at The Ohio State University recently discovered that edible fungi, such as shiitake mushrooms, can be cultivated and guided to function as organic memristors. These components act like memory cells that retain information about previous electrical states.
    Their experiments showed that mushroom-based devices could reproduce the same kind of memory behavior seen in semiconductor chips. They may also enable the creation of other eco-friendly, brain-like computing tools that cost less to produce.
    “Being able to develop microchips that mimic actual neural activity means you don’t need a lot of power for standby or when the machine isn’t being used,” said John LaRocco, lead author of the study and a research scientist in psychiatry at Ohio State’s College of Medicine. “That’s something that can be a huge potential computational and economic advantage.”
    The Promise of Fungal Electronics
    LaRocco noted that fungal electronics are not a brand-new idea, but they are becoming increasingly practical for sustainable computing. Because fungal materials are biodegradable and inexpensive to produce, they can help reduce electronic waste. In contrast, conventional semiconductors often require rare minerals and large amounts of energy to manufacture and operate.

    “Mycelium as a computing substrate has been explored before in less intuitive setups, but our work tries to push one of these memristive systems to its limits,” he said.
    The team’s findings were published in PLOS One.
    How Scientists Tested Mushroom Memory
    To test the fungi’s capabilities, the researchers grew samples of shiitake and button mushrooms. Once the mushrooms matured, they were dehydrated to preserve them and then attached to custom electronic circuits. The mushrooms were exposed to controlled electric currents at different voltages and frequencies.
    “We would connect electrical wires and probes at different points on the mushrooms because distinct parts of it have different electrical properties,” said LaRocco. “Depending on the voltage and connectivity, we were seeing different performances.”
    Surprising Results from Mushroom Circuits
    After two months of testing, the researchers found that their mushroom-based memristor could switch between electrical states up to 5,850 times per second with about 90% accuracy. Although performance decreased at higher electrical frequencies, the team noticed that connecting multiple mushrooms together helped restore stability — much like neural connections in the human brain.
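The memory behavior described above can be sketched with a toy memristor model: a device whose resistance depends on the history of applied voltage, so it "remembers" past electrical states. This is a generic linear-drift illustration (the class name `ToyMemristor`, the parameter values, and the drift rule are all illustrative assumptions), not a model of the paper's fungal device:

```python
# Toy memristor sketch (illustrative only, not the fungal device itself).
# Internal state w in [0, 1] drifts with the charge passed through the
# device; resistance interpolates between R_on and R_off, so past
# voltages leave a lasting trace in the present resistance.

class ToyMemristor:
    def __init__(self, r_on=100.0, r_off=16000.0, w=0.5, mobility=20.0):
        self.r_on, self.r_off = r_on, r_off
        self.w = w                  # 0 = fully high-resistance, 1 = fully low
        self.mobility = mobility    # how fast state drifts per unit charge

    def resistance(self):
        return self.r_on * self.w + self.r_off * (1.0 - self.w)

    def apply(self, voltage, dt):
        i = voltage / self.resistance()       # Ohm's law at current state
        # State drifts in proportion to charge passed (linear-drift model),
        # clamped to the physical range [0, 1]
        self.w = min(1.0, max(0.0, self.w + self.mobility * i * dt))
        return i

m = ToyMemristor()
r_before = m.resistance()
for _ in range(100):        # sustained positive bias lowers resistance ("set")
    m.apply(1.0, dt=1.0)
r_set = m.resistance()
for _ in range(100):        # reverse bias raises it again ("reset")
    m.apply(-1.0, dt=1.0)
r_reset = m.resistance()
print(r_before > r_set)     # True: the device retained the "set" state
```

The two distinguishable resistance states are what lets a memristor act as a memory cell; in the study's terms, switching between them rapidly and reliably is what the mushroom devices were measured on.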

    Qudsia Tahmina, co-author of the study and an associate professor of electrical and computer engineering at Ohio State, said the results highlight how easily mushrooms can be adapted for computing. “Society has become increasingly aware of the need to protect our environment and ensure that we preserve it for future generations,” said Tahmina. “So that could be one of the driving factors behind new bio-friendly ideas like these.”
    The flexibility mushrooms offer also suggests possibilities for scaling up fungal computing, Tahmina said. For instance, larger mushroom systems may be useful in edge computing and aerospace exploration, and smaller ones in enhancing the performance of autonomous systems and wearable devices.
    Looking Ahead: The Future of Fungal Computing
    Although organic memristors are still in their early stages, scientists aim to refine cultivation methods and shrink device sizes in future work. Achieving smaller, more efficient fungal components will be key to making them viable alternatives to traditional microchips.
    “Everything you’d need to start exploring fungi and computing could be as small as a compost heap and some homemade electronics, or as big as a culturing factory with pre-made templates,” said LaRocco. “All of them are viable with the resources we have in front of us now.”
    Other Ohio State contributors to the study include Ruben Petreaca, John Simonis, and Justin Hill. The research was supported by the Honda Research Institute.

  • The math says life shouldn’t exist, but somehow it does

    A groundbreaking study is taking a fresh look at one of science’s oldest questions: how did life arise from nonliving material on early Earth? Researcher Robert G. Endres of Imperial College London has created a new mathematical framework suggesting that the spontaneous appearance of life may have been far less likely than many scientists once believed.
    The Improbable Odds of Life Emerging Naturally
    The research examines how extraordinarily difficult it would be for organized biological information to form under plausible prebiotic conditions. Endres illustrates this by comparing it to trying to write a coherent article for a leading science website by tossing random letters onto a page. As complexity increases, the probability of success quickly drops to near zero.
    To explore the issue, Endres applied principles from information theory and algorithmic complexity to estimate what it would take for the first simple cell, known as a protocell, to assemble itself from basic chemical ingredients. This approach revealed that the odds of such a process happening naturally are astonishingly low.
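The "tossing random letters" analogy can be made concrete with a quick back-of-the-envelope calculation. The alphabet sizes and sequence lengths below are illustrative assumptions, not figures from Endres's paper; the point is only that the odds of hitting one specific sequence by uniform random draws shrink exponentially with length:

```python
# Sketch of the random-assembly analogy (illustrative numbers only).
import math

def log10_odds(alphabet_size, length):
    # log10 of P(one exact target sequence) = -length * log10(alphabet_size)
    return -length * math.log10(alphabet_size)

# A 26-letter alphabet and a 100-character "sentence":
print(log10_odds(26, 100))   # about -141.5, i.e. one chance in ~10^141

# A minimal 100-unit polymer drawn from 4 nucleotide types:
print(log10_odds(4, 100))    # about -60.2, i.e. one chance in ~10^60
```

Even the smaller of these numbers dwarfs any plausible count of chemical trials on early Earth, which is the intuition behind framing the problem in terms of information and probability.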
    Why Chance Alone May Not Be Enough
    The findings suggest that random chemical reactions and natural processes may not fully explain how life appeared within the limited time available on early Earth. Because systems naturally tend toward disorder, building the intricate molecular organization required for life would have been a major challenge.
    Although this doesn’t mean that life’s origin was impossible, Endres argues that current scientific models may be missing key elements. He emphasizes that identifying the physical principles behind life’s emergence from nonliving matter remains one of the greatest unsolved problems in biological physics.

    Considering a Speculative Alternative
    The study also briefly considers directed panspermia, a controversial idea proposed by Francis Crick and Leslie Orgel. This hypothesis suggests that life could have been intentionally introduced to Earth by advanced extraterrestrial civilizations. While Endres acknowledges the idea as logically possible, he notes that it runs counter to Occam’s razor, the principle that favors simpler explanations.
    Rather than ruling out natural origins, the research provides a way to quantify how difficult the process may have been. It points to the potential need for new physical laws or mechanisms that could help overcome the immense informational and organizational barriers to life. The work represents an important move toward a more mathematically grounded understanding of how living systems might arise.
    A Continuing Mystery
    This study is a reminder that some of the most profound questions in science remain unanswered. By merging mathematics with biology, researchers are beginning to uncover new layers of insight into one of humanity’s oldest mysteries: how existence itself began.
    Adapted from an article originally published on Universe Today.

  • Stanford’s tiny eye chip helps the blind see again

    A tiny wireless chip placed at the back of the eye, combined with a pair of advanced smart glasses, has partially restored vision to people suffering from an advanced form of age-related macular degeneration. In a clinical study led by Stanford Medicine and international collaborators, 27 of the 32 participants regained the ability to read within a year of receiving the implant.
    With the help of digital features such as adjustable zoom and enhanced contrast, some participants achieved visual sharpness comparable to 20/42 vision.
    The study’s findings were published on Oct. 20 in the New England Journal of Medicine.
    A Milestone in Restoring Functional Vision
    The implant, named PRIMA and developed at Stanford Medicine, is the first prosthetic eye device to restore usable vision to individuals with otherwise untreatable vision loss. The technology enables patients to recognize shapes and patterns, a level of vision known as form vision.
    “All previous attempts to provide vision with prosthetic devices resulted in basically light sensitivity, not really form vision,” said Daniel Palanker, PhD, a professor of ophthalmology and a co-senior author of the paper. “We are the first to provide form vision.”
    The research was co-led by José-Alain Sahel, MD, professor of ophthalmology at the University of Pittsburgh School of Medicine, with Frank Holz, MD, of the University of Bonn in Germany, serving as lead author.

    How the PRIMA System Works
    The system includes two main parts: a small camera attached to a pair of glasses and a wireless chip implanted in the retina. The camera captures visual information and projects it through infrared light to the implant, which converts it into electrical signals. These signals substitute for the damaged photoreceptors that normally detect light and send visual data to the brain.
    The PRIMA project represents decades of scientific effort, involving numerous prototypes, animal testing, and an initial human trial.
    Palanker first conceived the idea two decades ago while working with ophthalmic lasers to treat eye disorders. “I realized we should use the fact that the eye is transparent and deliver information by light,” he said.
    “The device we imagined in 2005 now works in patients remarkably well.”
    Replacing Lost Photoreceptors
    Participants in the latest trial had an advanced stage of age-related macular degeneration known as geographic atrophy, which progressively destroys central vision. This condition affects over 5 million people worldwide and is the leading cause of irreversible blindness among older adults.

    In macular degeneration, the light-sensitive photoreceptor cells in the central retina deteriorate, leaving only limited peripheral vision. However, many of the retinal neurons that process visual information remain intact, and PRIMA capitalizes on these surviving structures.
    The implant, measuring just 2 by 2 millimeters, is placed in the area of the retina where photoreceptors have been lost. Unlike natural photoreceptors that respond to visible light, the chip detects infrared light emitted from the glasses.
    “The projection is done by infrared because we want to make sure it’s invisible to the remaining photoreceptors outside the implant,” Palanker said.
    Combining Natural and Artificial Vision
    This design allows patients to use both their natural peripheral vision and the new prosthetic central vision simultaneously, improving their ability to orient themselves and move around.
    “The fact that they see simultaneously prosthetic and peripheral vision is important because they can merge and use vision to its fullest,” Palanker said.
    Since the implant is photovoltaic — relying solely on light to generate electrical current — it operates wirelessly and can be safely placed beneath the retina. Earlier versions of artificial eye devices required external power sources and cables that extended outside the eye.
    Reading Again
    The new trial included 38 patients older than 60 who had geographic atrophy due to age-related macular degeneration and worse than 20/320 vision in at least one eye.
    Four to five weeks after implantation of the chip in one eye, patients began using the glasses. Though some patients could make out patterns immediately, all patients’ visual acuity improved over months of training.
    “It may take several months of training to reach top performance — which is similar to what cochlear implants require to master prosthetic hearing,” Palanker said.
    Of the 32 patients who completed the one-year trial, 27 could read and 26 demonstrated clinically meaningful improvement in visual acuity, which was defined as the ability to read at least two additional lines on a standard eye chart. On average, participants’ visual acuity improved by 5 lines; one improved by 12 lines.
    The participants used the prosthesis in their daily lives to read books, food labels and subway signs. The glasses allowed them to adjust contrast and brightness and magnify up to 12 times. Two-thirds reported medium to high user satisfaction with the device.
    Nineteen participants experienced side effects, including ocular hypertension (high pressure in the eye), tears in the peripheral retina and subretinal hemorrhage (blood collecting under the retina). None were life-threatening, and almost all resolved within two months.
    Future Visions
    For now, the PRIMA device provides only black-and-white vision, with no shades in between, but Palanker is developing software that will soon enable the full range of grayscale.
    “Number one on the patients’ wish list is reading, but number two, very close behind, is face recognition,” he said. “And face recognition requires grayscale.”
    He is also engineering chips that will offer higher resolution vision. Resolution is limited by the size of pixels on the chip. Currently, the pixels are 100 microns wide, with 378 pixels on each chip. The new version, already tested in rats, may have pixels as small as 20 microns wide, with 10,000 pixels on each chip.
    Palanker also wants to test the device for other types of blindness caused by lost photoreceptors.
    “This is the first version of the chip, and resolution is relatively low,” he said. “The next generation of the chip, with smaller pixels, will have better resolution and be paired with sleeker-looking glasses.”
    A chip with 20-micron pixels could give a patient 20/80 vision, Palanker said. “But with electronic zoom, they could get close to 20/20.”
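The pixel and zoom figures quoted above can be tied together with some illustrative arithmetic. The linear acuity-with-zoom scaling assumed here is a simplification for the sake of the sketch, not a device specification:

```python
# Illustrative arithmetic from the figures quoted in the article
# (a sketch, not a device spec).

current_pitch_um, current_pixels = 100, 378
next_pitch_um, next_pixels = 20, 10_000

# Pixel density scales with the inverse square of pitch, so the
# next-generation chip packs far more pixels into a given area:
density_gain = (current_pitch_um / next_pitch_um) ** 2
print(density_gain)              # 25.0: 25x more pixels per unit area

# If 20-micron pixels support roughly 20/80 vision, then a 4x
# electronic zoom would bring the effective acuity near 20/20:
base_denominator = 80
zoom = 4
print(base_denominator / zoom)   # 20.0, i.e. ~20/20
```

This is why the article pairs smaller pixels with electronic zoom: the chip sets the native resolution, and zoom trades field of view for effective sharpness.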
    Researchers from the University of Bonn, Germany; Hôpital Fondation A. de Rothschild, France; Moorfields Eye Hospital and University College London; Ludwigshafen Academic Teaching Hospital; University of Rome Tor Vergata; Medical Center Schleswig-Holstein, University of Lübeck; L’Hôpital Universitaire de la Croix-Rousse and Université Claude Bernard Lyon 1; Azienda Ospedaliera San Giovanni Addolorata; Centre Monticelli Paradis and L’Université d’Aix-Marseille; Intercommunal Hospital of Créteil and Henri Mondor Hospital; Knappschaft Hospital Saar; Nantes University; University Eye Hospital Tübingen; University of Münster Medical Center; Bordeaux University Hospital; Hôpital National des 15-20; Erasmus University Medical Center; University of Ulm; Science Corp.; University of California, San Francisco; University of Washington; University of Pittsburgh School of Medicine; and Sorbonne Université contributed to the study.
    The study was supported by funding from Science Corp., the National Institute for Health and Care Research, Moorfields Eye Hospital National Health Service Foundation Trust, and University College London Institute of Ophthalmology.

  • AI turns x-rays into time machines for arthritis care

    A new artificial intelligence system developed by researchers at the University of Surrey can forecast what a patient’s knee X-ray might look like one year in the future. This breakthrough could reshape how millions of people living with osteoarthritis understand and manage their condition.
    The research, presented at the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2025), describes a powerful AI model capable of generating realistic “future” X-rays along with a personalized risk score that estimates disease progression. Together, these outputs give doctors and patients a visual roadmap of how osteoarthritis may evolve over time.
    A Major Step Forward in Predicting Osteoarthritis Progression
    Osteoarthritis, a degenerative joint disorder that affects more than 500 million people globally, is the leading cause of disability among older adults. The Surrey system was trained on nearly 50,000 knee X-rays from about 5,000 patients, making it one of the largest datasets of its kind. It can predict disease progression roughly nine times faster than similar AI tools and operates with greater efficiency and accuracy. Researchers believe this combination of speed and precision could help integrate the technology into clinical practice more quickly.
    David Butler, the study’s lead author from the University of Surrey’s Centre for Vision, Speech and Signal Processing (CVSSP) and the Institute for People-Centred AI, explained:
    “We’re used to medical AI tools that give a number or a prediction, but not much explanation. Our system not only predicts the likelihood of your knee getting worse — it actually shows you a realistic image of what that future knee could look like. Seeing the two X-rays side by side — one from today and one for next year — is a powerful motivator. It helps doctors act sooner and gives patients a clearer picture of why sticking to their treatment plan or making lifestyle changes really matters. We think this can be a turning point in how we communicate risk and improve osteoarthritic knee care and other related conditions.”
    How the System Visualizes Change
    At the core of the new system is an advanced generative model known as a diffusion model. It creates a “future” version of a patient’s X-ray and identifies 16 key points in the joint to highlight areas being tracked for potential changes. This feature enhances transparency by showing clinicians exactly which parts of the knee the AI is monitoring, helping build confidence and understanding in its predictions.

    The Surrey team believes their approach could be adapted for other chronic diseases. Similar AI tools might one day predict lung damage in smokers or track the progression of heart disease, providing the same kind of visual insights and early warning that this system offers for osteoarthritis. Researchers are now seeking collaborations to bring the technology into hospitals and everyday healthcare use.
    Greater Transparency and Early Intervention
    Gustavo Carneiro, Professor of AI and Machine Learning at Surrey’s Centre for Vision, Speech and Signal Processing (CVSSP), said:
    “Earlier AI systems could estimate the risk of osteoarthritis progression, but they were often slow, opaque and limited to numbers rather than clear images. Our approach takes a big step forward by generating realistic future X-rays quickly and by pinpointing the areas of the joint most likely to change. That extra visibility helps clinicians identify high-risk patients sooner and personalize their care in ways that were not previously practical.”

  • As wildfires worsen, science can help communities avoid destruction

    Bright flecks of burning wood stream through the smoky air and toward a hapless house. Before the one-story structure, the glowing specks, each merely centimeters in size, seem insignificant. But each lofted ember is a seed of destruction. Researchers estimate that embers cause between 60 and 90 percent of home ignitions.

    Next to the house stands a trash bin, its lid propped open with sheets of cardboard inside. The fiery spores enter and in seconds flames sprout inside. Within minutes, a column of fire rises and licks the house’s sidewall. Black flaps of vinyl siding begin to peel and writhe. Burning chunks fall to the ground, and a crackling, smoldering fissure grows up the wall. Orange, blue and purple flames roar as they ascend toward the roof.

  • Quantum crystals could spark the next tech revolution

    Picture a future where factories can create materials and chemical compounds more quickly, at lower cost, and with fewer production steps. Imagine your laptop processing complex data in seconds or a supercomputer learning and adapting as efficiently as the human brain. These possibilities depend on one fundamental factor: how electrons behave inside materials. Researchers at Auburn University have now developed a groundbreaking type of material that allows scientists to precisely control these tiny charged particles. Their findings, published in ACS Materials Letters, describe how the team achieved adjustable coupling between isolated-metal molecular complexes, called solvated electron precursors, where electrons are not tied to specific atoms but instead move freely within open spaces.
    Electrons are central to nearly every chemical and technological process. They drive energy transfer, bonding, and electrical conductivity, serving as the foundation for both chemical synthesis and modern electronics. In chemical reactions, electrons enable redox processes, bond formation, and catalytic activity. In technology, managing how electrons move and interact underpins everything from electronic circuits and AI systems to solar cells and quantum computers. Typically, electrons are confined to atoms, which restricts their potential uses. However, in materials known as electrides, electrons move independently, opening the door to remarkable new capabilities.
    “By learning how to control these free electrons, we can design materials that do things nature never intended,” explains Dr. Evangelos Miliordos, Associate Professor of Chemistry at Auburn and senior author of the study, which was based on advanced computational modeling.
    To achieve this, the Auburn team created innovative material structures called Surface Immobilized Electrides by attaching solvated electron precursors to stable surfaces such as diamond and silicon carbide. This configuration makes the electronic characteristics of the electrides both durable and tunable. By changing how the molecules are arranged, electrons can either cluster into isolated “islands” that behave like quantum bits for advanced computing or spread into extended “seas” that promote complex chemical reactions.
    This versatility is what gives the discovery its transformative potential. One version could lead to the development of powerful quantum computers capable of solving problems beyond the reach of today’s technology. Another could provide the basis for cutting-edge catalysts that speed up essential chemical reactions, potentially revolutionizing how fuels, pharmaceuticals, and industrial materials are produced.
    “As our society pushes the limits of current technology, the demand for new kinds of materials is exploding,” says Dr. Marcelo Kuroda, Associate Professor of Physics at Auburn. “Our work shows a new path to materials that offer both opportunities for fundamental investigations on interactions in matter as well as practical applications.”
    Earlier versions of electrides were unstable and difficult to scale. By depositing them directly on solid surfaces, the Auburn team has overcome these barriers, proposing a family of materials structures that could move from theoretical models to real-world devices. “This is fundamental science, but it has very real implications,” says Dr. Konstantin Klyukin, Assistant Professor of Materials Engineering at Auburn. “We’re talking about technologies that could change the way we compute and the way we manufacture.”
    The theoretical study was led by faculty across chemistry, physics, and materials engineering at Auburn University. “This is just the beginning,” Miliordos adds. “By learning how to tame free electrons, we can imagine a future with faster computers, smarter machines, and new technologies we haven’t even dreamed of yet.”
    The study, “Electrides with Tunable Electron Delocalization for Applications in Quantum Computing and Catalysis,” was also coauthored by graduate students Andrei Evdokimov and Valentina Nesterova. It was supported by the U.S. National Science Foundation and Auburn University computing resources.

  • How a Yurok family played a key role in the world’s largest dam removal project

    The Water Remembers
    Amy Bowers Cordalis
    Little, Brown & Co., $30

    In September 2002, an estimated 34,000 to 78,000 adult Chinook salmon died in the Klamath River within the Yurok Reservation in Northern California. The U.S. government had diverted river water to farms during a drought. The resulting low levels and warm temperature of the water, coupled with the flow of toxic blue-green algae that bloomed in the reservoirs behind the river’s four dams, created the perfect conditions for “ich,” a parasitic gill rot disease, to spread and suffocate the fish. It was one of the largest fish kills recorded in U.S. history.

    The ecological disaster catalyzed an Indigenous-led movement to remove the dams, the oldest of which had choked the river, blocking fish migrations and tainting water quality, for over 100 years. In The Water Remembers, Yurok tribal member, activist and attorney Amy Bowers Cordalis shares an intimate look into her family’s and nation’s decades-long fight to restore the health of the Klamath and preserve their way of life — a multigenerational effort that culminated in the largest dam removal and river restoration project in history.

    The Yurok people believe it is their duty to live in balance with nature. They steward the Klamath and its surrounding ecosystems. In return, the river gives them sustenance, physically and spiritually. This sacred reciprocity is reflected in Yurok stories, Cordalis writes, which “teach that if the Klamath salmon and the Klamath River die, so will the Yurok people.”

    Cordalis’ reverence for the river, the salmon and the craft of fishing drips from every page of this memoir. She describes the thrill that overcomes her and other members of the Yurok Nation when salmon return to the Klamath River from the Pacific Ocean to spawn. Bobbing in a boat, gill net in hand, surrounded by trees, water and wildlife, is a spiritual practice.

    In 2002, tens of thousands of salmon died in the Klamath River from a gill rot disease called “ich.” The river’s four dams helped create the perfect conditions for the illness to spread. (Credit: Northcoast Environment Center)

    Every page is also stained with stories of historical injustice. For nearly two centuries, colonization, genocide and their lingering scars have threatened the Yurok’s way of life, from the United States’ theft of Yurok land since the 19th century to California’s mid-20th century ban on Yurok fishing to boost non-Indigenous logging and fishing businesses.

    Through it all, Cordalis’ family has resisted. Cordalis’ great-grandmother, Geneva Mattz, and her sons fished and sold bootlegged salmon throughout the ban. In the late 1960s, her great-uncle Ray Mattz sued California for violating his Indigenous rights by repeatedly arresting him for fishing on his ancestral land — a case that he won in the U.S. Supreme Court in 1973. The 2002 fish kill reinvigorated this tradition of resistance. Cordalis, then a 22-year-old intern at the Yurok Tribal Fisheries Department, witnessed the devastation firsthand. Her gruesome descriptions of the limp and rotting carcasses of thousands of salmon crowded on the riverbank convey the visceral and emotional response of the Yurok to what Cordalis deems an “ecocide.”