More stories

  • Brain-like learning found in bacterial nanopores

    Pore-forming proteins are widespread across living organisms. In humans, they are essential for immune defense, while in bacteria they often act as toxins that puncture cell membranes. These microscopic pores allow ions and molecules to move through membranes, controlling molecular traffic within cells. Because of their precision and control, scientists have adapted them as nanopore tools for biotechnology, such as in DNA sequencing and molecular sensing.
    Although biological nanopores have revolutionized biotechnology, they can behave in complex and sometimes erratic ways. Researchers still lack a complete understanding of how ions travel through them or why ion flow occasionally stops altogether.
    Two particularly puzzling behaviors have long intrigued scientists: rectification and gating. Rectification occurs when the flow of ions changes depending on the “sign” (plus or minus — positive or negative) of the voltage applied. Gating happens when the ion flow suddenly decreases or stops. These effects, especially gating, can disrupt nanopore-based sensing and have remained difficult to explain.
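    As an aside for readers who want a number to attach to “rectification”: in the nanopore literature it is commonly quantified as the ratio of the current magnitudes measured at opposite voltage polarities. The short Python sketch below uses invented current values; the metric itself is standard, but nothing here is taken from the study.

        # Illustrative rectification ratio from hypothetical current readings
        # (values invented for illustration; not data from the study).
        i_positive = 85.0   # current in picoamperes at +100 mV
        i_negative = -42.0  # current in picoamperes at -100 mV

        # A ratio of 1 means ions flow equally well in both directions;
        # the further from 1, the stronger the one-way-valve behavior.
        rectification_ratio = abs(i_positive) / abs(i_negative)
        print(f"rectification ratio ~ {rectification_ratio:.2f}")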
    A research team led by Matteo Dal Peraro and Aleksandra Radenovic at EPFL has now identified the physical mechanisms behind these two effects. Using a combination of experiments, simulations, and theoretical modeling, they found that both rectification and gating arise from the nanopore’s own electrical charges and the way those charges interact with the ions moving through the pore.
    Experimenting With Electric Charges
    The team studied aerolysin, a bacterial pore commonly used in sensing research. They modified the charged amino acids lining its interior to create 26 nanopore variants, each with a distinct charge pattern. By observing how ions traveled through these modified pores under different conditions, they were able to isolate key electrical and structural factors.
    To better understand how these effects evolve over time, the scientists applied alternating voltage signals to the nanopores. This approach allowed them to distinguish rectification, which occurs quickly, from gating, which develops more slowly. They then built biophysical models to interpret their data and reveal the mechanisms at work.

    How Nanopores Learn Like the Brain
    The researchers discovered that rectification happens because of how the charges along the inner surface influence ion movement, making it easier for ions to flow in one direction than the other, similar to a one-way valve. Gating, in contrast, occurs when a heavy ion flow disrupts the charge balance and destabilizes the pore’s structure. This temporary collapse blocks ion passage until the system resets.
    Both effects depend on the exact placement and type of electrical charge within the nanopore. By reversing the charge “sign,” the team could control when and how gating occurred. When they increased the pore’s rigidity, gating stopped completely, confirming that structural flexibility is key to this phenomenon.
    Toward Smarter Nanopores
    These findings open new possibilities for engineering biological nanopores with custom properties. Scientists can now design pores that minimize unwanted gating for applications in nanopore sensing, or deliberately use gating for bio-inspired computing. In one demonstration, the team created a nanopore that mimics synaptic plasticity, “learning” from voltage pulses much like a neural synapse. This discovery suggests that future ion-based processors could one day harness such molecular “learning” to power new forms of computing.

  • Dark energy might be changing, and so is the Universe

    Since the early 20th century, scientists have gathered convincing evidence that the Universe is expanding — and that this expansion is accelerating. The force responsible for this acceleration is called dark energy, a mysterious property of spacetime thought to push galaxies apart. For decades, the prevailing cosmological model, known as Lambda Cold Dark Matter (ΛCDM), has assumed that dark energy remains constant throughout cosmic history. This simple but powerful assumption has been the foundation of modern cosmology. Yet, it leaves one key question unresolved: what if dark energy changes over time instead of remaining fixed?
    Recent observations have started to challenge this long-held view. Data from the Dark Energy Spectroscopic Instrument (DESI) — an advanced project that maps the distribution of galaxies across the Universe — suggests the possibility of a dynamic dark energy (DDE) component. Such a finding would mark a significant shift from the standard ΛCDM model. While this points to a more intricate and evolving cosmic story, it also exposes a major gap in understanding: how a time-dependent dark energy might shape the formation and growth of cosmic structures remains unclear.
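    The article does not say which functional form of time variation was tested, but for orientation, the parameterization most often used in DESI-style analyses (and a reasonable mental model here) is the Chevallier-Polarski-Linder form, in which the dark energy equation of state evolves with the cosmic scale factor a as w(a) = w₀ + wₐ(1 - a); the standard ΛCDM case is recovered when w₀ = -1 and wₐ = 0, so any measured departure from those values would signal dynamic dark energy.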
    Simulating an Evolving Universe
    To explore this mystery, a team led by Associate Professor Tomoaki Ishiyama of Chiba University’s Digital Transformation Enhancement Council in Japan carried out one of the most extensive cosmological simulations ever performed. Collaborators included Francisco Prada of the Instituto de Astrofísica de Andalucía in Spain and Anatoly A. Klypin of New Mexico State University in the United States. Their study, published in Physical Review D (Volume 112, Issue 4), investigated how a time-varying dark energy could influence the evolution of the cosmos and help interpret future astronomical observations.
    Using Japan’s flagship supercomputer, Fugaku, the researchers executed three large, high-resolution N-body simulations, each with a computational volume eight times greater than previous work. One simulation followed the standard Planck-2018 ΛCDM model, while the other two incorporated dynamic dark energy. Comparing the DDE simulation that kept the Planck-2018 parameters fixed against the standard run let them isolate the effects of the changing dark energy component alone. The remaining DDE simulation used parameters drawn from DESI’s first-year data, revealing how an “updated” cosmological model might behave if dark energy truly varies with time.
    How a Small Change Can Reshape the Universe
    The results showed that the influence of dark energy variations alone was relatively subtle. However, once the researchers adjusted the cosmological parameters in line with DESI data — particularly increasing the matter density by about 10% — the differences became striking. A higher matter density strengthens gravitational attraction, which accelerates the formation of massive clusters of galaxies. In this scenario, the DESI-based DDE model predicted as much as 70% more massive clusters in the early Universe than the standard model. These clusters form the cosmic framework on which galaxies and galaxy groups assemble.

    The team also examined baryonic acoustic oscillations (BAOs) — patterns left behind by sound waves in the early Universe that serve as “cosmic rulers” for measuring distances. In the DESI-derived DDE simulation, the BAO peak shifted by 3.71% toward smaller scales, closely matching DESI’s actual observations. This strong agreement confirmed that the model not only reflects theoretical predictions but also aligns well with real-world data.
    Mapping Galaxy Clusters and Cosmic Structure
    In addition, the researchers analyzed how galaxies cluster throughout the cosmos. The DESI-based DDE model produced noticeably stronger clustering than the standard ΛCDM version, particularly on smaller scales. The enhanced clustering directly results from the higher matter density, which amplifies gravitational binding. This close match between simulation and observation further supports the validity of the dynamic dark energy model.
    Overall, the team’s findings clarify how both dark energy and matter density shape the large-scale structure of the Universe. “Our large simulations demonstrate that variations in cosmological parameters, particularly the matter density in the Universe, have a greater influence on structure formation than the DDE component alone,” says Dr. Ishiyama.
    Preparing for the Next Generation of Cosmic Surveys
    With new observational campaigns on the horizon, these simulations will play a crucial role in interpreting upcoming results. “In the near future, large-scale galaxy surveys from the Subaru Prime Focus Spectrograph and DESI are expected to significantly improve measurements of cosmological parameters. This study provides a theoretical basis for interpreting such upcoming data,” concludes Dr. Ishiyama.

  • Stanford discovers an extraordinary crystal that could transform quantum tech

    Stanford engineers have discovered a standout material, strontium titanate (STO), that performs even better in extreme cold. Instead of weakening, its optical and mechanical properties improve at cryogenic temperatures. STO outperforms every comparable material tested in low-temperature environments, revealing exceptional strength, stability, and tunability. Its unique capabilities could accelerate advances in quantum computing, laser systems, and space exploration, where high performance under freezing conditions is essential.
    Superconductivity and quantum computing have moved from theoretical physics into real-world innovation. The 2025 Nobel Prize in Physics recognized breakthroughs in superconducting quantum circuits that could lead to ultra-powerful computers. Yet many of these technologies only function at cryogenic temperatures (near absolute zero), where most materials lose their defining properties. Finding materials that perform under such extreme cold has long been one of science’s biggest hurdles.
    A Crystal That Defies the Cold
    In a new Science publication, engineers at Stanford University report a breakthrough with strontium titanate (STO), a material that not only maintains but enhances its optical and mechanical performance in freezing conditions. Instead of deteriorating, it becomes significantly more capable, outperforming other known materials by a wide margin. The researchers believe this discovery could open the door to a new class of light-based and mechanical cryogenic devices that propel quantum computing, space exploration, and other advanced technologies.
    “Strontium titanate has electro-optic effects 40 times stronger than the most-used electro-optic material today. But it also works at cryogenic temperatures, which is beneficial for building quantum transducers and switches that are current bottlenecks in quantum technologies,” explained the study’s senior author Jelena Vuckovic, professor of electrical engineering at Stanford.
    Pushing the Limits of Performance
    STO’s optical behavior is “non-linear,” meaning that when an electric field is applied, its optical and mechanical properties shift dramatically. This electro-optic effect allows scientists to adjust the frequency, intensity, phase, and direction of light in ways that other materials cannot. Such versatility could enable entirely new types of low-temperature devices.

    STO is also piezoelectric, meaning it physically expands and contracts in response to electric fields. This makes it ideal for developing new electromechanical components that function efficiently in extreme cold. According to the researchers, these capabilities could be especially valuable for use in the vacuum of space or in the cryogenic fuel systems of rockets.
    “At low temperature, not only is strontium titanate the most electrically tunable optical material we know of, but it’s also the most piezoelectrically tunable material,” said Christopher Anderson, co-first author and now a faculty member at the University of Illinois, Urbana-Champaign.
    An Overlooked Material Finds New Purpose
    Strontium titanate is not a newly discovered substance. It has been studied for decades and is inexpensive and abundant. “STO is not particularly special. It’s not rare. It’s not expensive,” said co-first author Giovanni Scuri, a postdoctoral scholar in Vuckovic’s lab. “In fact, it has often been used as a diamond substitute in jewelry or as a substrate for growing other, more valuable materials. Despite being a ‘textbook’ material, it performs exceptionally well in a cryogenic context.”
    The decision to test STO was guided by an understanding of what characteristics make materials highly tunable. “We knew what ingredients we needed to make a highly tunable material. We found those ingredients already existed in nature, and we simply used them in a new recipe. STO was the obvious choice,” Anderson said. “When we tried it, surprisingly, it matched our expectations perfectly.”
    Scuri added that the framework they developed could help identify or enhance other nonlinear materials for a variety of operating conditions.

    Record-Breaking Performance at Near Absolute Zero
    When tested at 5 Kelvin (-450°F), STO’s performance stunned researchers. Its nonlinear optical response was 20 times greater than that of lithium niobate, the leading nonlinear optical material, and nearly triple that of barium titanate, the previous cryogenic benchmark.
    To push its properties even further, the team replaced certain oxygen atoms in the crystal with heavier isotopes. This adjustment moved STO closer to a state called quantum criticality, producing even greater tunability.
    “By adding just two neutrons to exactly 33 percent of the oxygen atoms in the material, the resulting tunability increased by a factor of four,” Anderson said. “We precisely tuned our recipe to get the best possible performance.”
    Building the Future of Cryogenic Devices
    According to the team, STO also offers practical advantages that could make it appealing to engineers. It can be synthesized, structurally modified, and fabricated at wafer scale using existing semiconductor equipment. These features make it well-suited for next-generation quantum devices, such as laser-based switches used to control and transmit quantum information.
    The research was partially funded by Samsung Electronics and Google’s quantum computing division, both of which are searching for materials to advance their quantum hardware. The team’s next goal is to design fully functional cryogenic devices based on STO’s unique properties.
    “We found this material on the shelf. We used it and it was amazing. We understood why it was good. Then the cherry on the top — we knew how to do better, added that special sauce, and we made the world’s best material for these applications,” Anderson said. “It’s a great story.”
    Alongside Samsung and Google, the study received support from a Vannevar Bush Faculty Fellowship through the U.S. Department of Defense and the Department of Energy’s Q-NEXT program.
    Contributors include Aaron Chan and Lu Li from the University of Michigan; Sungjun Eun, Alexander D. White, Geun Ho Ahn, Amir Safavi-Naeini, and Kasper Van Gasse from Stanford’s E. L. Ginzton Laboratory; and Christine Jilly from the Stanford Nano Shared Facilities.

  • Artificial neurons that behave like real brain cells

    Scientists at the USC Viterbi School of Engineering and the School of Advanced Computing have created artificial neurons that reproduce the intricate electrochemical behavior of real brain cells. The discovery, published in Nature Electronics, marks a major milestone in neuromorphic computing, a field that designs hardware modeled after the human brain. This advancement could shrink chip sizes by orders of magnitude, cut energy use dramatically, and push artificial intelligence closer to achieving artificial general intelligence.
    Unlike digital processors or earlier neuromorphic chips that only simulate brain activity through mathematical models, these new neurons physically reproduce how real neurons operate. Just as natural brain activity is triggered by chemical signals, these artificial versions use actual chemical interactions to start computational processes. This means they are not just symbolic representations but tangible recreations of biological function.
    A New Class of Brain-Like Hardware
    The research, led by Professor Joshua Yang of USC’s Department of Computer and Electrical Engineering, builds on his earlier pioneering work on artificial synapses more than a decade ago. The team’s new approach centers on a device called a “diffusive memristor.” Their findings describe how these components could lead to a new generation of chips that both complement and enhance traditional silicon-based electronics. While silicon systems rely on electrons to perform computations, Yang’s diffusive memristors use the motion of atoms instead, creating a process that more closely resembles how biological neurons transmit information. The result could be smaller, more efficient chips that process information the way the brain does and potentially pave the way toward artificial general intelligence (AGI).
    In the brain, both electrical and chemical signals drive communication between nerve cells. When an electrical impulse reaches the end of a neuron at a junction called a synapse, it converts into a chemical signal to transmit information to the next neuron. Once received, that signal is converted back into an electrical impulse that continues through the neuron. Yang and his colleagues have replicated this complex process in their devices with striking accuracy. A major advantage of their design is that each artificial neuron fits within the footprint of a single transistor, whereas older designs required tens or even hundreds.
    In biological neurons, charged particles known as ions help create the electrical impulses that enable activity in the nervous system. The human brain relies on ions such as potassium, sodium, and calcium to make this happen.
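    To make the computational job of a “neuron” concrete, here is the textbook leaky integrate-and-fire model in a few lines of Python. It is a software caricature of the integrate-then-spike behavior that the new hardware reproduces physically; it is not a model of the diffusive memristor itself, and every name and number below is illustrative.

        # Textbook leaky integrate-and-fire neuron (illustrative only; not a model
        # of the diffusive memristor). The membrane potential integrates incoming
        # current, leaks back toward rest, and fires when it crosses a threshold.
        def simulate_lif(input_current, threshold=1.0, leak=0.1, dt=1.0):
            v, spike_times = 0.0, []
            for step, i_in in enumerate(input_current):
                v += dt * (i_in - leak * v)   # integrate the input, with leak
                if v >= threshold:            # threshold crossing: emit a spike
                    spike_times.append(step)
                    v = 0.0                   # reset after firing
            return spike_times

        print(simulate_lif([0.3] * 20))  # a steady drive yields a regular spike train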
    Using Silver Ions to Recreate Brain Dynamics
    In the new study, Yang — who also directs the USC Center of Excellence on Neuromorphic Computing — used silver ions embedded in oxide materials to generate electrical pulses that mimic natural brain functions. These include fundamental processes like learning, movement, and planning.

    “Even though it’s not exactly the same ions in our artificial synapses and neurons, the physics governing the ion motion and the dynamics are very similar,” says Yang.
    Yang explains, “Silver is easy to diffuse and gives us the dynamics we need to emulate the biosystem so that we can achieve the function of the neurons, with a very simple structure.” The new device that can enable a brain-like chip is called the “diffusive memristor” because of the ion motion and the dynamic diffusion that occurs with the use of silver.
    He adds that the team chose to use ion dynamics for building artificial intelligent systems “because that is what happens in the human brain, for a good reason, and since the human brain is the ‘winner in evolution,’ the most efficient intelligent engine.”
    “It’s more efficient,” says Yang.
    Why Efficiency Matters in AI Hardware
    Yang emphasizes that the issue with modern computing isn’t lack of power but inefficiency. “It’s not that our chips or computers are not powerful enough for whatever they are doing. It’s that they aren’t efficient enough. They use too much energy,” he explains. This is especially important given how much energy today’s large-scale artificial intelligence systems consume to process massive datasets.

    Yang goes on to explain that unlike the brain, “Our existing computing systems were never intended to process massive amounts of data or to learn from just a few examples on their own. One way to boost both energy and learning efficiency is to build artificial systems that operate according to principles observed in the brain.”
    If you are looking for pure speed, the electrons that run modern computing are the better choice for fast operations. But, he explains, “Ions are a better medium than electrons for embodying principles of the brain. Because electrons are lightweight and volatile, computing with them enables software-based learning rather than hardware-based learning, which is fundamentally different from how the brain operates.”
    In contrast, he says, “The brain learns by moving ions across membranes, achieving energy-efficient and adaptive learning directly in hardware, or more precisely, in what people may call ‘wetware’.”
    For example, a young child can learn to recognize handwritten digits after seeing only a few examples of each, whereas a computer typically needs thousands to achieve the same task. Yet, the human brain accomplishes this remarkable learning while consuming only about 20 watts of power, compared to the megawatts required by today’s supercomputers.
    Potential Impact and Next Steps
    Yang and his team see this technology as a major step toward replicating natural intelligence. However, he acknowledges that the silver used in these experiments is not yet compatible with standard semiconductor manufacturing processes. Future work will explore other ionic materials that can achieve similar effects.
    The diffusive memristors are efficient in both energy and size. A typical smartphone may contain around ten chips, each with billions of transistors switching on and off to perform calculations.
    “Instead [with this innovation], we just use a footprint of one transistor for each neuron. We are designing the building blocks that eventually lead us to reduce the chip size by orders of magnitude, reduce the energy consumption by orders of magnitude, so it can be sustainable to perform AI in the future, with a similar level of intelligence without burning energy that we cannot sustain,” says Yang.
    Now that we have demonstrated capable and compact building blocks, artificial synapses and neurons, the next step is to integrate large numbers of them and test how closely we can replicate the brain’s efficiency and capabilities. “Even more exciting,” says Yang, “is the prospect that such brain-faithful systems could help us uncover new insights into how the brain itself works.”

  • Breakthrough links magnetism and electricity for faster tech

    Engineers at the University of Delaware have uncovered a new way to connect magnetic and electric forces in computing, a finding that could pave the way for computers that run dramatically faster while consuming far less energy.
    Tiny Magnetic Waves Generate Electric Signals
    In a study published in Proceedings of the National Academy of Sciences, researchers from the university’s Center for Hybrid, Active and Responsive Materials (CHARM), a National Science Foundation-funded Materials Research Science and Engineering Center, report that magnons — tiny magnetic waves that move through solid materials — are capable of generating measurable electric signals.
    This discovery suggests that future computer chips could merge magnetic and electric systems directly, removing the need for the constant energy exchange that limits the performance of today’s devices.
    How Magnons Transmit Information
    Traditional electronics rely on the flow of charged electrons, which lose energy as heat when moving through circuits. In contrast, magnons carry information through the synchronized “spin” of electrons, creating wave-like patterns across a material. According to theoretical models developed by the UD team, when these magnetic waves travel through antiferromagnetic materials, they can induce electric polarization, effectively creating a measurable voltage.
    Toward Ultrafast, Energy-Efficient Computing
    Antiferromagnetic magnons can move at terahertz frequencies — around a thousand times faster than magnetic waves in conventional materials. This exceptional speed points to a promising path for ultrafast, low-power computing. The researchers are now working to verify their theoretical predictions through experiments and to investigate how magnons interact with light, which could lead to even more efficient ways of controlling them.

    Advancing Quantum Material Research
    This work contributes to CHARM’s larger goal of developing hybrid quantum materials for cutting-edge technologies. The center’s researchers study how different types of materials — such as magnetic, electronic, and quantum systems — can be combined and controlled to create next-generation technologies. CHARM’s goal is to design smart materials that respond to their environments and enable breakthroughs in computing, energy, and communication.
    The study’s co-authors are Federico Garcia-Gaitan, Yafei Ren, M. Benjamin Jungfleisch, John Q. Xiao, Branislav K. Nikolić, Joshua Zide, and Garnett W. Bryant (NIST/University of Maryland). Funding was provided by the National Science Foundation under award DMR-2011824.

  • Quantum light breakthrough could transform technology

    High-order harmonic generation (HHG) is a process that transforms light into much higher frequencies, allowing scientists to explore areas of the electromagnetic spectrum that are otherwise difficult to reach. However, generating terahertz (THz) frequencies using HHG has remained a major obstacle because most materials are too symmetrical to support this conversion.
    Graphene has long been a promising candidate for HHG research, but its perfect symmetry restricts it to producing only odd harmonics — frequencies that are odd multiples of the original light source. Even harmonics, which are essential for expanding practical uses of this technology, have been much harder to achieve.
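    The symmetry argument behind that restriction is worth one sentence of standard nonlinear optics (general reasoning, not taken from the paper): in a material with inversion symmetry the induced polarization must reverse sign when the driving field does, P(-E) = -P(E), so every even-order term in the expansion P = χ⁽¹⁾E + χ⁽²⁾E² + χ⁽³⁾E³ + … has to vanish, and a field oscillating at frequency ω can generate only odd multiples of ω. Breaking that symmetry, for example at a surface or interface, is what makes even harmonics possible.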
    Quantum Materials Break the Barrier
    In a recent study published in Light: Science & Applications, a research group led by Prof. Miriam Serena Vitiello has achieved a major advance in optical science. By working with exotic quantum materials, the team successfully extended HHG into new and previously unreachable parts of the electromagnetic spectrum.
    Their work centers on topological insulators (TIs), a special class of materials that behave as electrical insulators inside but conduct electricity along their surfaces. These materials exhibit unusual quantum behavior due to strong spin-orbit coupling and time-reversal symmetry. Although scientists had predicted that TIs could support advanced forms of harmonic generation, no one had yet demonstrated it experimentally — until now.
    Amplifying Light With Quantum Nanostructures
    The researchers designed specialized nanostructures called split ring resonators and integrated them with thin layers of Bi₂Se₃ and van der Waals heterostructures made from (InₓBi₁₋ₓ)₂Se₃. These resonators significantly intensified the incoming light, allowing the team to observe HHG at both even and odd THz frequencies, an exceptional accomplishment.

    They recorded frequency up-conversion between 6.4 THz (even) and 9.7 THz (odd), uncovering how both the symmetrical interior and the asymmetrical surface of the topological materials contribute to light generation. This result represents one of the first clear demonstrations of how topological effects can shape harmonic behavior in the THz range.
    Toward Next-Generation Terahertz Technology
    This experimental achievement not only validates long-standing theoretical predictions but also establishes a new foundation for developing compact terahertz light sources, sensors, and ultrafast optoelectronic components. It gives researchers a new way to study the complex interplay between symmetry, quantum states, and light-matter interactions at the nanoscale.
    As industries continue to demand smaller, faster, and more efficient devices, such progress highlights the growing potential of quantum materials to drive real-world innovation. The discovery also points toward the creation of compact, tunable terahertz light sources powered by optical methods — an advance that could reshape technologies in high-speed communications, medical imaging, and quantum computing.

  • Too much screen time may be hurting kids’ hearts

    More time using electronic devices or watching TV among children and young adults was linked with higher cardiometabolic disease risk, including high blood pressure, high cholesterol and insulin resistance, based on data from more than 1,000 participants in Denmark. The association between screen time and cardiometabolic risks was strongest in youth who slept fewer hours, suggesting that screen use may harm health by “stealing” time from sleep, researchers said. They added that the findings underscore the importance of addressing screen habits among young people as a potential way to protect long-term heart and metabolic health.
    Screen time tied to early heart and metabolic risks
    Children and teens who spend many hours on TVs, phones, tablets, computers or gaming systems appear to face higher chances of cardiometabolic problems, such as elevated blood pressure, unfavorable cholesterol levels and insulin resistance. The findings are reported in the Journal of the American Heart Association, an open-access, peer-reviewed journal of the American Heart Association.
    A 2023 scientific statement from the American Heart Association reported that “cardiometabolic risk is accruing at younger and younger ages,” and that only 29% of U.S. youth ages 2 to 19 had favorable cardiometabolic health in 2013-2018 National Health and Nutrition Examination Survey data.
    Danish cohorts show a consistent pattern
    An evaluation of more than 1,000 participants from two Danish studies found a clear connection: more recreational screen time was significantly associated with greater cardiovascular and overall cardiometabolic risk among children and adolescents.
    “Limiting discretionary screen time in childhood and adolescence may protect long-term heart and metabolic health,” said study lead author David Horner, M.D., Ph.D., a researcher at the Copenhagen Prospective Studies on Asthma in Childhood (COPSAC) at the University of Copenhagen in Denmark. “Our study provides evidence that this connection starts early and highlights the importance of having balanced daily routines.”
    What researchers measured

    The team analyzed two COPSAC groups: one of 10-year-olds from the 2010 cohort and one of 18-year-olds from the 2000 cohort. They examined how leisure screen use related to cardiometabolic risk factors. Screen time included watching TV and movies, gaming and time on phones, tablets or computers for fun.
    To capture overall risk, researchers created a composite cardiometabolic score based on multiple components of metabolic syndrome, including waist size, blood pressure, high-density lipoprotein or HDL “good” cholesterol, triglycerides and blood sugar levels. They adjusted for sex and age. The score reflects each participant’s risk relative to the study average (in standard deviations): 0 indicates average risk, and 1 indicates one standard deviation above average.
    Each hour adds up
    The analysis showed that every additional hour of recreational screen time was linked with an increase of about 0.08 standard deviations in the cardiometabolic score for the 10-year-olds and 0.13 standard deviations for the 18-year-olds. “This means a child with three extra hours of screen time a day would have roughly a quarter to half a standard-deviation higher risk than their peers,” Horner said.
    “It’s a small change per hour, but when screen time accumulates to three, five or even six hours a day, as we saw in many adolescents, that adds up,” he said. “Multiply that across a whole population of children, and you’re looking at a meaningful shift in early cardiometabolic risk that could carry into adulthood.”
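    The arithmetic behind those figures is simple enough to check directly; the short sketch below just multiplies the per-hour estimates quoted above by a hypothetical three extra hours a day (the scenario and variable names are ours, the slopes are the study’s).

        # Per-hour increase in the composite cardiometabolic z-score, as reported above.
        per_hour_sd = {"10-year-olds": 0.08, "18-year-olds": 0.13}
        extra_hours = 3  # hypothetical child with three extra hours of screen time a day

        for group, slope in per_hour_sd.items():
            print(f"{group}: +{extra_hours * slope:.2f} standard deviations")
        # 10-year-olds: +0.24 standard deviations (about a quarter of an SD)
        # 18-year-olds: +0.39 standard deviations (approaching half an SD)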
    Sleep appears to intensify the risk
    Short sleep and later bedtimes strengthened the relationship between screen time and cardiometabolic risk. Youth who slept less showed notably higher risk linked to the same amount of screen exposure.

    “In childhood, sleep duration not only moderated this relationship but also partially explained it: about 12% of the association between screen time and cardiometabolic risk was mediated through shorter sleep duration,” Horner said. “These findings suggest that insufficient sleep may not only magnify the impact of screen time but could be a key pathway linking screen habits to early metabolic changes.”
    Metabolic “fingerprint” linked to screen use
    In a machine learning analysis, investigators identified a distinctive pattern of blood metabolites that appeared to correlate with screen time.
    “We were able to detect a set of blood-metabolite changes, a ‘screen-time fingerprint,’ validating the potential biological impact of the screen time behavior,” he said. “Using the same metabolomics data, we also assessed whether screen time was linked to predicted cardiovascular risk in adulthood, finding a positive trend in childhood and a significant association in adolescence. This suggests that screen-related metabolic changes may carry early signals of long-term heart health risk.
    “Recognizing and discussing screen habits during pediatric appointments could become part of broader lifestyle counseling, much like diet or physical activity,” he said. “These results also open the door to using metabolomic signatures as early objective markers of lifestyle risk.”
    Practical guidance from experts
    Amanda Marma Perak, M.D., M.S.C.I., FAHA, chair of the American Heart Association’s Young Hearts Cardiovascular Disease Prevention Committee, who was not involved in this research, said focusing on sleep is a great starting point to change screen time patterns.
    “If cutting back on screen time feels difficult, start by moving screen time earlier and focusing on getting into bed earlier and for longer,” said Perak, an assistant professor of pediatrics and preventive medicine at Northwestern University Feinberg School of Medicine in Chicago.
    Adults can also set an example, she said. “All of us use screens, so it’s important to guide kids, teens and young adults to healthy screen use in a way that grows with them. As a parent, you can model healthy screen use — when to put it away, how to use it, how to avoid multitasking. And as kids get a little older, be more explicit, narrating why you put away your devices during dinner or other times together.
    “Make sure they know how to entertain and soothe themselves without a screen and can handle being bored! Boredom breeds brilliance and creativity, so don’t be bothered when your kids complain they’re bored. Loneliness and discomfort will happen throughout life, so those are opportunities to support and mentor your kids in healthy ways to respond that don’t involve scrolling.”
    Important caveats and next questions
    Because this work is observational, it reveals associations rather than direct cause and effect. In addition, screen use for the 10-year-olds and 18-year-olds was reported by parents through questionnaires, which may not perfectly reflect actual time spent on screens.
    Horner noted that future studies could test whether reducing screen exposure in the hours before bedtime, when screen light may disrupt circadian rhythms and delay sleep onset, helps lower cardiometabolic risk.
    Study details, background and design
    The two prospective research groups at COPSAC in Denmark consisted of mother-child pairs, with analysis of data collected at planned clinical visits and study assessments from the birth of the children through age 10 in the 2010 study group and age 18 in the 2000 study group. Through questionnaires, parents of children in the 10-year-old group and 18-year-olds detailed the number of hours the young participants spent watching TV or movies, gaming on a console/TV and using phones, tablets or computers for leisure.
    For the 2010 group, the number of hours of screen time was available for 657 children at age 6 and 630 children at age 10. Average screen time was two hours per day at age 6, and 3.2 hours per day at age 10, representing a significant increase over time. For the 2000 group of 18-year-olds, screen time was available for 364 individuals. Screen time at 18 years was significantly higher at an average of 6.1 hours per day. Sleep was measured by sensors over a 14-day period.

  • Scientists discover a way to simulate the Universe on a laptop

    As astronomers gather more data than ever before, studying the cosmos has become an increasingly complex task. A new innovation is changing that reality. Researchers have now developed a way to analyze enormous cosmic data sets using only a laptop and a few hours of processing time.
    Leading this effort is Dr. Marco Bonici, a postdoctoral researcher at the Waterloo Centre for Astrophysics at the University of Waterloo. Bonici and an international team created Effort.jl, short for EFfective Field theORy surrogate. This tool uses advanced numerical techniques and smart data-preprocessing methods to deliver exceptional computational performance while maintaining the accuracy required in cosmology. The team designed it as a powerful emulator for the Effective Field Theory of Large-Scale Structure (EFTofLSS), allowing researchers to process vast datasets more efficiently than ever before.
    Turning Frustration Into Innovation
    The idea for Effort.jl emerged from Bonici’s experience running time-consuming computer models. Each time he adjusted even a single parameter, it could take days of extra computation to see the results. That challenge inspired him to build a faster, more flexible solution that could handle such adjustments in hours rather than days.
    “Using Effort.jl, we can run through complex data sets on models like EFTofLSS, which have previously needed a lot of time and computer power,” Bonici explained. “With projects like DESI and Euclid expanding our knowledge of the universe and creating even larger astronomical datasets to explore, Effort.jl allows researchers to analyze data faster, inexpensively and multiple times while making small changes based on nuances in the data.”
    Smarter Simulations for a Faster Universe
    Effort.jl belongs to a class of tools known as emulators. These are trained computational shortcuts that replicate the behavior of large, resource-intensive simulations but run dramatically faster. By using emulators, scientists can explore many possible cosmic scenarios in a fraction of the time and apply advanced techniques such as gradient-based sampling to study intricate physical models with greater efficiency.
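    To make “emulator” concrete, the general recipe is: evaluate the expensive model once at a modest number of parameter settings, fit a fast surrogate to those evaluations, and then query the surrogate instead of the model. The Python sketch below illustrates that idea with a toy one-parameter problem; Effort.jl itself is a Julia package and its actual machinery is far more sophisticated, so everything here (the stand-in model, the grid, the interpolator) is ours for illustration only.

        # Toy emulator: replace an expensive function with a fast surrogate fitted
        # to a handful of precomputed evaluations. Purely illustrative; this is
        # not Effort.jl's actual method.
        import numpy as np

        def expensive_model(omega_m):
            # Stand-in for a slow cosmological calculation (imagine hours per call).
            return np.sin(3.0 * omega_m) + omega_m ** 2

        # "Training": evaluate the expensive model once on a coarse parameter grid.
        grid = np.linspace(0.1, 0.5, 9)
        table = np.array([expensive_model(x) for x in grid])

        def emulator(omega_m):
            # Fast surrogate: interpolate the precomputed table instead of rerunning.
            return np.interp(omega_m, grid, table)

        # Emulator calls are effectively instantaneous and stay close to the truth.
        print(expensive_model(0.31), emulator(0.31))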

    “We were able to validate the predictions coming out of Effort.jl by aligning them with those coming out of EFTofLSS,” Bonici said. “The margin of error was small and showed us that the calculations coming out of Effort.jl are strong. Effort.jl can also handle observational quirks like distortions in data and can be customized very easily to the needs of the researcher.”
    Human Expertise Still Matters
    Despite its impressive capabilities, Effort.jl is not a substitute for scientific understanding. Cosmologists still play a vital role in setting parameters, interpreting results, and applying physical insight to ensure meaningful conclusions. The combination of expert knowledge and computational power is what makes the system so effective.
    Looking ahead, Effort.jl is expected to take on even larger cosmological datasets and work alongside other analytical tools. Researchers also see potential for its methods in areas beyond astrophysics, including weather and climate modeling.
    The paper, “Effort.jl: a fast and differentiable emulator for the Effective Field Theory of the Large Scale Structure of the Universe,” was published in the Journal of Cosmology and Astroparticle Physics.