More stories

  • Unifying behavioral analysis through animal foundation models

    Behavioral analysis can reveal a great deal about the health status or motivations of a living being. A new technology developed at EPFL enables a single deep learning model to detect animal motion across many species and environments. This “foundation model,” called SuperAnimal, can be used for animal conservation, biomedicine, and neuroscience research.
    Despite the saying “straight from the horse’s mouth,” it’s impossible to get a horse to tell you whether it’s in pain or experiencing joy. Yet its body will express the answer in its movements. To a trained eye, pain manifests as a change in gait, while joy may show in the animal’s facial expressions. But what if we could automate this with AI? And what about AI models for cows, dogs, cats, or even mice? Automating the analysis of animal behavior not only removes observer bias, it also helps researchers reach the right answer more efficiently.
    Today marks the beginning of a new chapter in posture analysis for behavioral phenotyping. Mackenzie Mathis’ laboratory at EPFL publishes a Nature Communications article describing a particularly effective new open-source tool that requires no human annotations to get the model to track animals. Named “SuperAnimal,” it can automatically recognize the location of “keypoints” (typically joints) in a whole range of animals — over 45 species — and even in mythical ones!
    “The current pipeline allows users to tailor deep learning models, but this then relies on human effort to identify keypoints on each animal to create a training set,” explains Mackenzie Mathis. “This leads to duplicated labeling efforts across researchers and can lead to different semantic labels for the same keypoints, making merging data to train large foundation models very challenging. Our new method provides a new approach to standardize this process and train large-scale datasets. It also makes labeling 10 to 100 times more effective than current tools.”
    The “SuperAnimal method” is an evolution of a pose estimation technique that Mackenzie Mathis’ laboratory had already distributed under the name “DeepLabCut™.” You can read more about this game-changing tool and its origin in this new Nature technology feature.
    “Here, we have developed an algorithm capable of compiling a large set of annotations across databases and train the model to learn a harmonized language — we call this pre-training the foundation model,” explains Shaokai Ye, a PhD student researcher and first author of the study. “Then users can simply deploy our base model or fine-tune it on their own data, allowing for further customization if needed.”
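    For readers who want to try this, the workflow is roughly: pick a pretrained SuperAnimal model, run it on your video, and optionally fine-tune. The sketch below shows what that can look like through the DeepLabCut Python package; the exact function and argument names (video_inference_superanimal, superanimal_name, video_adapt) are assumptions based on recent releases and may differ in your installed version, so treat it as a starting point rather than a recipe.

    ```python
    # Hedged usage sketch -- API names below are assumptions based on recent
    # DeepLabCut releases; check the current documentation before running.
    import deeplabcut

    videos = ["my_dog_video.mp4"]  # any video of a supported species

    # Zero-shot keypoint tracking with a pretrained SuperAnimal model,
    # no manual labeling required.
    deeplabcut.video_inference_superanimal(
        videos,
        superanimal_name="superanimal_quadruped",  # or "superanimal_topviewmouse"
        video_adapt=True,  # optional self-supervised adaptation to the new footage
    )
    ```

    For unusual species or camera angles, the same pretrained weights can instead be fine-tuned on a small labeled set, which is the customization route described in the quote above.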
    These advances will make motion analysis much more accessible. “Veterinarians could be particularly interested, as well as those in biomedical research — especially when it comes to observing the behavior of laboratory mice. But it can go further,” says Mackenzie Mathis, mentioning neuroscience and… athletes (canine or otherwise)! Other species — birds, fish, and insects — are also within the scope of the model’s next evolution. “We will also leverage these models in natural language interfaces to build even more accessible and next-generation tools. For example, Shaokai and I, along with our co-authors at EPFL, recently developed AmadeusGPT, published at NeurIPS, which allows for querying video data with written or spoken text. Expanding this for complex behavioral analysis will be very exciting.” SuperAnimal is now available to researchers worldwide through its open-source distribution.

  • Controlling electronics with light: The magnetite breakthrough

    “Some time ago, we showed that it is possible to induce an inverse phase transition in magnetite,” says physicist Fabrizio Carbone at EPFL. “It’s as if you took water and you could turn it into ice by putting energy into it with a laser. This is counterintuitive as normally to freeze water you cool it down, i.e. remove energy from it.”
    Now, Carbone has led a research project to elucidate and control the microscopic structural properties of magnetite during such light-induced phase transitions. The study found that photoexcitation with specific light wavelengths can drive magnetite into distinct non-equilibrium metastable states (“metastable” meaning the state can change under certain conditions) called “hidden phases,” revealing a new protocol for manipulating material properties at ultrafast timescales. The findings, which could impact the future of electronics, are published in PNAS.
    What are “non-equilibrium states”? An “equilibrium state” is basically a stable state where a material’s properties do not change over time because the forces within it are balanced. When this is disrupted, the material (the “system,” to be accurate in terms of physics) is said to enter a non-equilibrium state, exhibiting properties that can border on the exotic and unpredictable.
    The “hidden phases” of magnetite
    A phase transition is a change in a material’s state, due to changes in temperature, pressure, or other external conditions. An everyday example is water going from solid ice to liquid or from liquid to gas when it boils.
    Phase transitions in materials usually follow predictable pathways under equilibrium conditions. But when materials are driven out of equilibrium, they can start showing so-called “hidden phases” — intermediate states that are not normally accessible. Observing hidden phases requires advanced techniques that can capture rapid and minute changes in the material’s structure.
    Magnetite (Fe3O4) is a well-studied material known for its intriguing metal-to-insulator transition at low temperatures — from conducting electricity to blocking it. This is known as the Verwey transition, and it changes magnetite’s electronic and structural properties significantly.

    With its complex interplay of crystal structure, charge, and orbital orders, magnetite can undergo this metal-insulator transition at around 125 K.
    Ultrafast lasers induce hidden transitions in magnetite
    “To understand this phenomenon better, we did this experiment where we directly looked at the atomic motions happening during such a transformation,” says Carbone. “We found out that laser excitation takes the solid into some different phases that don’t exist in equilibrium conditions.”
    The experiments used two different wavelengths of light: near-infrared (800 nm) and visible (400 nm). When excited with 800 nm light pulses, the magnetite’s structure was disrupted, creating a mix of metallic and insulating regions. In contrast, 400 nm light pulses made the magnetite a more stable insulator.
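    For orientation, a standard photon-energy conversion (not a figure from the study) shows how different the two excitations are: E = hc/λ ≈ 1240 eV·nm / λ, which gives roughly 1.55 eV per photon at 800 nm and roughly 3.1 eV at 400 nm, so the visible pulses deliver about twice the energy per photon as the near-infrared ones.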
    To monitor the structural changes in magnetite induced by laser pulses, the researchers used ultrafast electron diffraction, a technique that can “see” the movements of atoms in materials on sub-picosecond timescales (a picosecond is a trillionth of a second).
    The technique allowed the scientists to observe how the different wavelengths of laser light actually affect the structure of the magnetite on an atomic scale.

    Magnetite’s crystal structure is what is known as a “monoclinic lattice”: the unit cell is shaped like a skewed box with three unequal edges, where two of its angles are 90 degrees and the third is not.
    When the 800 nm light shone on the magnetite, it induced a rapid compression of the monoclinic lattice, transforming it towards a cubic structure. This transformation takes place in three stages over 50 picoseconds and suggests complex dynamic interactions within the material. Conversely, the 400 nm visible light caused the lattice to expand, reinforcing the monoclinic structure and creating a more ordered phase — a stable insulator.
    Fundamental implications and technological applications
    The study reveals that the electronic properties of magnetite can be controlled by selectively using different light wavelengths. Understanding these light-induced transitions provides valuable insights into the fundamental physics of strongly correlated systems.
    “Our study breaks ground for a novel approach to control matter at ultrafast timescale using tailored photon pulses,” write the researchers. Being able to induce and control hidden phases in magnetite could have significant implications for the development of advanced materials and devices. For instance, materials that can switch between different electronic states quickly and efficiently could be used in next-generation computing and memory devices.

  • Scientists at uOttawa develop innovative method to validate quantum photonic circuit performance

    A team of researchers from the University of Ottawa’s Nexus for Quantum Technologies Institute (NexQT), led by Dr. Francesco Di Colandrea under the supervision of Professor Ebrahim Karimi, associate professor of physics, has developed an innovative technique for evaluating the performance of quantum circuits. This significant advancement, recently published in the journal npj Quantum Information, represents a substantial leap forward in the field of quantum computing.
    In the rapidly evolving landscape of quantum technologies, ensuring the functionality and reliability of quantum devices is critical. The ability to characterize these devices with high accuracy and speed is essential for their efficient integration into quantum circuits and computers, impacting both fundamental studies and practical applications.
    Characterization helps determine whether a device operates as expected and reveals when it exhibits anomalies or errors. Identifying and addressing these issues is crucial for advancing the development of future quantum technologies.
    Traditionally, scientists have relied on Quantum Process Tomography (QPT), a method that requires a large number of “projective measurements” to reconstruct a device’s operations fully. However, the number of required measurements in QPT scales quadratically with the dimensionality of the operations, posing significant experimental and computational challenges, especially for high-dimensional quantum information processors.
    The University of Ottawa research team has pioneered an optimized technique named Fourier Quantum Process Tomography (FQPT). This method allows for the complete characterization of quantum operations with a minimal number of measurements. Instead of performing a large number of projective measurements, FQPT utilises a well-known map, the Fourier transform, to perform a portion of the measurements in two different mathematical spaces. The physical relation between these spaces enhances the information extracted from single measurements, significantly reducing the number of measurements needed. For instance, for processes with dimensions 2d (where d can be arbitrarily high), only seven measurements are required.
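    To see why a fixed budget of seven measurements matters, compare it with the quadratic growth quoted above for standard QPT. The toy calculation below simply plugs in the scalings described in this article; it is an illustration of the claimed scaling, not the paper's actual protocol.

    ```python
    # Illustrative comparison of measurement counts, using only the scalings
    # described in the article: quadratic in dimension for standard QPT versus
    # a constant seven for FQPT on the processes it targets.
    def qpt_measurements(dim: int) -> int:
        return dim ** 2      # grows quadratically with the process dimension

    def fqpt_measurements(dim: int) -> int:
        return 7             # constant, per the article, for the processes considered

    for d in (2, 8, 32, 128):
        print(f"dimension {d:4d}:  QPT ~ {qpt_measurements(d):6d}   FQPT = {fqpt_measurements(d)}")
    ```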
    To validate their technique, the researchers conducted a photonic experiment using optical polarisation to encode a qubit. The quantum process was realized as a complex space-dependent polarisation transformation, leveraging state-of-the-art liquid-crystal technology. This experiment demonstrated the flexibility and robustness of the method.
    “The experimental validation is a fundamental step to probe the technique’s resilience to noise, ensuring robust and high-fidelity reconstructions in realistic experimental scenarios,” said Francesco Di Colandrea, a postdoctoral fellow at the University of Ottawa.
    This novel technique represents a remarkable advancement in quantum computing. The research team is already working on extending FQPT to arbitrary quantum operations, including non-Hermitian and higher-dimensional implementations, and on applying AI techniques to increase accuracy and further reduce the number of measurements, a promising avenue for continued advances in quantum technology.

  • Changing climate will make home feel like somewhere else

    The impacts of climate change are being felt all over the world, but how will it impact how your hometown feels? An interactive web application from the University of Maryland Center for Environmental Science allows users to search 40,581 places and 5,323 metro areas around the globe to match the expected future climate in each city with the current climate of another location, providing a relatable picture of what is likely in store.
    You may have already experienced these changes where you live and may be wondering: What will the climate of the future be like where I live? How hot will summers be? Will it still snow in winter? And perhaps: How might things change if we act to reduce emissions? This web application helps provide answers to these questions.
    For example, if you live in Washington, D.C., you would need to travel to northern Louisiana to experience what Washington, D.C., will feel like by 2080, when summers are expected to be 11.5°F warmer. If you live in Shanghai, China, you would need to travel to northern Pakistan to experience what Shanghai’s climate could be like in 2080.
    “In 50 years, the northern hemisphere cities to the north are going to become much more like cities to the south. Everything is moving towards the equator in terms of the climate that’s coming for you,” said Professor Matthew Fitzpatrick. “And the closer you get to the equator, there are fewer and fewer good matches for climates in places like Central America, south Florida, and northern Africa. There is no place on earth representative of what those places will be like in the future.”
    A spatial ecologist, Fitzpatrick used climate-analog mapping, a statistical technique that matches the expected future climate at one location — your city of residence, for instance — with the current climate of another familiar location, to provide a place-based understanding of climate change. He used the most recent available data from the Intergovernmental Panel on Climate Change (IPCC), the United Nations body for assessing the science related to climate change, to illustrate anticipated temperature changes over 30 years under two different scenarios.
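    At its core, climate-analog mapping is a nearest-neighbor search: describe every city by a handful of climate variables, then find the city whose present-day values sit closest to your city's projected future values. The sketch below uses made-up numbers and a plain Euclidean distance in place of the standardized dissimilarity metrics used in the published analyses, purely to show the shape of the computation.

    ```python
    # Minimal sketch of climate-analog matching. The city values are invented
    # for illustration, and the distance metric is a simple stand-in for the
    # standardized climate-dissimilarity measures used in the real analysis.
    import numpy as np

    # [average summer high (F), average winter low (F)] -- illustrative only
    current_climate = {
        "Washington, DC": [88.0, 29.0],
        "Memphis, TN":    [92.0, 33.0],
        "Monroe, LA":     [93.0, 38.0],
    }
    future_dc_2080 = np.array([99.0, 38.0])   # hypothetical 2080 projection for Washington, DC

    names = list(current_climate)
    values = np.array(list(current_climate.values()))
    distances = np.linalg.norm(values - future_dc_2080, axis=1)
    print("closest present-day analog:", names[int(np.argmin(distances))])  # -> Monroe, LA
    ```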
    Because the answer to these questions depends on how climate is expected to change and the specific nature of those changes is uncertain, the app provides results for both high and reduced emissions scenarios, as well as for several different climate forecast models. You can map the best matches for your city for these different scenarios and models as well as map the similarity between your city’s future climate and present climates everywhere (based on the average of the five forecasts for each emission scenario).
    The first scenario that users can search is similar to our current trajectory and assumes very high greenhouse gas emissions, in which the planet is on track to warm around 9 degrees F by the end of this century. This scenario would make the earth warmer than it likely has been in millions of years. The second scenario is similar to what the planet would feel like if nations pursue the Paris Climate Accord goals: human-caused greenhouse gas emissions are reduced immediately and drastically, and the planet warms by only about 3 degrees F.
    “I hope that it continues to inform the conversation about climate change. I hope it helps people better understand the magnitude of the impacts and why scientists are so concerned,” said Fitzpatrick.
    Further information: https://fitzlab.shinyapps.io/cityapp/

  • Sweat health monitor measures levels of disease markers

    A wearable health monitor developed by Washington State University researchers can reliably measure levels of important biochemicals in sweat during physical exercise.
    The 3D-printed monitor could someday provide a simple and non-invasive way to track health conditions and diagnose common diseases, such as diabetes, gout, kidney disease or heart disease.
    Reporting in the journal ACS Sensors, the researchers accurately monitored the levels of volunteers’ glucose, lactate and uric acid, as well as their sweat rate, during exercise.
    “Diabetes is a major problem worldwide,” said Chuchu Chen, a WSU Ph.D. student and first author on the paper. “I think 3D printing can make a difference to the healthcare fields, and I wanted to see if we can combine 3D printing with disease detection methods to create a device like this.”
    For their proof of concept, the researchers used 3D printing to make the health monitors in a unique, one-step manufacturing process. They used a single-atom catalyst and enzymatic reactions to enhance the signal and measure low levels of the biomarkers. Three biosensors on the monitor change color to indicate the specific biochemical levels.
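    A colorimetric readout of this kind generally follows a familiar pattern: the measured color intensity of each sensor is compared against a calibration curve built from known standards. The sketch below is a generic illustration of that step with invented numbers, not the WSU team's actual calibration.

    ```python
    # Generic colorimetric calibration sketch -- the intensities and glucose
    # concentrations below are invented for illustration.
    import numpy as np

    cal_intensity = np.array([0.05, 0.18, 0.33, 0.47, 0.60])   # measured color intensity (a.u.)
    cal_glucose   = np.array([0.00, 0.25, 0.50, 0.75, 1.00])   # known glucose standards (mM)

    def intensity_to_glucose(intensity: float) -> float:
        """Interpolate a measured intensity onto the calibration curve."""
        return float(np.interp(intensity, cal_intensity, cal_glucose))

    print(intensity_to_glucose(0.40))   # ~0.63 mM on this made-up curve
    ```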
    Sweat contains many important metabolites that can indicate health conditions, and, unlike blood sampling, collecting it is non-invasive. Levels of uric acid in sweat can indicate the risk of developing gout, kidney disease or heart disease. Glucose levels are used to monitor diabetes, and lactate levels can indicate exercise intensity.
    “Sweat rate is also an important parameter and physiological indicator for people’s health,” said Kaiyan Qiu, Berry Assistant Professor in WSU’s School of Mechanical and Materials Engineering.

    But the amount of these chemicals in sweat is tiny and can be hard to measure, the researchers noted. While other sweat sensors have been developed, they are complex and need specialized equipment and expertise to fabricate. The sensors also have to be flexible and stretchable.
    “It’s novel to use single-atom catalysts to enhance the sensitivity and accuracy of the health monitor,” said Annie Du, research professor in WSU’s School of Mechanical and Materials Engineering. Qiu and Du led the study.
    The health monitor the researchers developed uses very tiny channels to measure the sweat rate and the biomarkers’ concentrations. Because the micro-channels are fabricated via 3D printing, they don’t require any supporting structures, which can cause contamination problems when they are removed, said Qiu.
    “We need to measure the tiny concentrations of biomarkers, so we don’t want these supporting materials to be present or to have to remove them,” he said. “That’s why we’re using a unique method to print the self-supporting microfluidic channels.”
    When the researchers compared the monitors on volunteers’ arms to lab results, they found that their monitor was accurately and reliably measuring the concentration of the chemicals as well as the sweating rate.
    While the researchers initially chose three biomarkers to measure, they can add more, and the biomarkers can be customized. The monitors were also comfortable for volunteers to wear.
    The researchers are now working to further improve the device design and validation. They are also hoping to commercialize the technology. The WSU Office of Commercialization has also filed a provisional patent application to protect the intellectual property associated with this technology.
    The work was funded by the National Science Foundation and the Centers for Disease Control and Prevention, as well as through startup funds.

  • Can AI learn like us?

    It reads. It talks. It collates mountains of data and recommends business decisions. Today’s artificial intelligence might seem more human than ever. However, AI still has several critical shortcomings.
    “As impressive as ChatGPT and all these current AI technologies are, in terms of interacting with the physical world, they’re still very limited. Even in things they do, like solve math problems and write essays, they take billions and billions of training examples before they can do them well,” explains Cold Spring Harbor Laboratory (CSHL) NeuroAI Scholar Kyle Daruwalla.
    Daruwalla has been searching for new, unconventional ways to design AI that can overcome such computational obstacles. And he might have just found one.
    The key was data movement. Nowadays, most of modern computing’s energy consumption comes from bouncing data around. In artificial neural networks, which are made up of billions of connections, data can have a very long way to go. So, to find a solution, Daruwalla looked for inspiration in one of the most computationally powerful and energy-efficient machines in existence — the human brain.
    Daruwalla designed a new way for AI algorithms to move and process data much more efficiently, based on how our brains take in new information. The design allows individual AI “neurons” to receive feedback and adjust on the fly rather than wait for a whole circuit to update simultaneously. This way, data doesn’t have to travel as far and gets processed in real time.
    “In our brains, our connections are changing and adjusting all the time,” Daruwalla says. “It’s not like you pause everything, adjust, and then resume being you.”
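    To make the contrast with standard training concrete, here is a generic sketch of a local update rule in the spirit described above: each layer adjusts immediately from signals available to it (here via fixed random feedback weights, in the style of feedback alignment), instead of waiting for a full end-to-end backward pass. It is an illustration of the general idea, not the specific rule from the CSHL study.

    ```python
    # Generic local-update sketch (feedback-alignment style), not the study's rule.
    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.1, size=(16, 8))   # input -> hidden weights
    W2 = rng.normal(scale=0.1, size=(8, 4))    # hidden -> output weights
    B  = rng.normal(scale=0.1, size=(4, 8))    # fixed random feedback pathway
    lr = 0.05

    def local_step(x, target):
        """One training step where each layer updates from locally available signals."""
        global W1, W2
        h = np.tanh(x @ W1)                    # hidden activity
        y = h @ W2                             # output
        err = y - target                       # output error
        W2 -= lr * np.outer(h, err)            # output layer: uses its own input and error
        W1 -= lr * np.outer(x, (err @ B) * (1 - h**2))   # hidden layer: error routed via fixed feedback
        return float((err ** 2).mean())

    x, target = rng.normal(size=16), rng.normal(size=4)
    for _ in range(5):
        print(round(local_step(x, target), 4))   # the error shrinks over repeated updates
    ```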
    The new machine-learning model provides evidence for a yet unproven theory that correlates working memory with learning and academic performance. Working memory is the cognitive system that enables us to stay on task while recalling stored knowledge and experiences.
    “There have been theories in neuroscience of how working memory circuits could help facilitate learning. But there isn’t something as concrete as our rule that actually ties these two together. And so that was one of the nice things we stumbled into here. The theory led out to a rule where adjusting each synapse individually necessitated this working memory sitting alongside it,” says Daruwalla.
    Daruwalla’s design may help pioneer a new generation of AI that learns like we do. That would not only make AI more efficient and accessible — it would also be somewhat of a full-circle moment for neuroAI. Neuroscience has been feeding AI valuable data since long before ChatGPT uttered its first digital syllable. Soon, it seems, AI may return the favor.

  • Creation of a power-generating, gel electret-based device

    A team of researchers from NIMS, Hokkaido University and Meiji Pharmaceutical University has developed a gel electret capable of stably retaining a large electrostatic charge. The team then combined this gel with highly flexible electrodes to create a sensor capable of perceiving low-frequency vibrations (e.g., vibrations generated by human motion) and converting them into output voltage signals. This device may potentially be used as a wearable healthcare sensor.
    Interest in the development of soft, lightweight, power-generating materials has been growing in recent years for use in soft electronics designed for various purposes, such as healthcare and robotics. Electret materials capable of stably retaining electrostatic charge may be used to develop vibration-powered devices without external power sources. NIMS has been leading efforts to develop a low-volatility, room-temperature alkyl-π liquid composed of a π-conjugated dye moiety and flexible yet branched alkyl chains (a type of hydrocarbon compound). The alkyl-π liquids exhibit excellent charge retention properties, can be applied to other materials (e.g., through painting and impregnation) and are easily formable. However, when these liquids have been combined with electrodes to create flexible devices, they have proven difficult to immobilize and seal, resulting in leakage issues. Moreover, the electrostatic charge retention capacities of alkyl-π liquids needed to be increased in order to improve their power generation capabilities.
    The research team recently succeeded in creating an alkyl-π gel by adding a trace amount of a low-molecular-weight gelator to an alkyl-π liquid. The elastic storage modulus of this gel was found to be 40 million times that of its liquid counterpart, making it far simpler to fix in place and seal. Moreover, the gel electret obtained by charging this gel achieved a 24% increase in charge retention compared to the base material (i.e., the alkyl-π liquid), thanks to the improved confinement of electrostatic charges within the gel. The team then combined flexible electrodes with the gel electret to create a vibration sensor. This sensor was able to perceive vibrations with frequencies as low as 17 Hz and convert them into an output voltage of 600 mV — 83% higher than the voltage generated by an alkyl-π liquid electret-based sensor.
    In future research, the team aims to develop wearable sensors capable of responding to subtle vibrations and various strain deformations by further improving the electret characteristics (i.e., charge capacity and charge lifetime) and the strength of the alkyl-π gel. Additionally, since this gel is recyclable and reusable as a vibration sensor material, its use is expected to help promote a circular economy.

  • A railroad of cells

    Looking under the microscope, a group of cells slowly moves forward in a line, like a train on the tracks. The cells navigate through complex environments. A new approach by a research team involving the Institute of Science and Technology Austria (ISTA) now shows how they do this and how they interact with each other. The experimental observations and the mathematical framework that followed are published in Nature Physics.
    The majority of the cells in the human body cannot move. Some specific ones, however, can go to different places. For example, in wound healing, cells move through the body to repair damaged tissue. They sometimes travel alone and sometimes in groups of different sizes. Although the process is increasingly understood, little is known about how cells interact while traveling and how they collectively navigate the complex environments found in the body. An interdisciplinary team of theoretical physicists at the Institute of Science and Technology Austria (ISTA) and experimentalists from the University of Mons in Belgium now has new insights.
    Much like social dynamics experiments, where understanding the interactions of a small group of people is easier than analyzing an entire society, the scientists studied the traveling behavior of a small group of cells in well-defined in vitro surroundings, i.e. outside a living organism, in a Petri dish equipped with interior features. Based on their findings, they developed a framework of interaction rules, which is now published in Nature Physics.
    Cells travel in trains
    David Brückner rushes back to his office to grab his laptop. “I think it’s better to show some videos of our experiments,” he says excitedly and presses play. The video shows a Petri dish. Microstripes — one-dimensional lanes guiding cell movement — are printed on the substrate beside a zebrafish scale made up of numerous cells. Special wound-healing cells, known as “keratocytes,” start to stretch away from the scale, forming branches into the lanes. “At first, cells stick together through adhesive molecules on their surface — it’s like they’re holding hands,” explains Brückner. Suddenly, the bond breaks off, and the cells assemble into tiny groups, moving forward like trains along tracks. “The length of the train is always different. Sometimes it’s two, sometimes it’s ten. It depends on the initial conditions.”
    Eléonore Vercruysse and Sylvain Gabriele from the University of Mons in Belgium observed this phenomenon while investigating keratocytes and their wound-healing features within different geometrical patterns. To help interpret these puzzling observations, they reached out to theoretical physicists David Brückner and Edouard Hannezo at ISTA.
    Cells have a steering wheel
    “There’s a gradient within each cell that determines where the cell is going. It’s called ‘polarity’ and it’s like the cell’s very own steering wheel,” says Brückner. “Cells communicate their polarity to neighboring cells, allowing them to move in concert.” But how they do so has remained a big puzzle in the field. Brückner and Hannezo started brainstorming. The two scientists developed a mathematical model combining the cells’ polarity, their interactions, and the geometry of their surroundings. They then transferred the framework into computer simulations, which helped them visualize different scenarios.

    The first thing the scientists in Austria looked at was the speed of the cell trains. The simulation revealed that the speed of the trains is independent of their length, whether they consist of two or ten cells. “Imagine if the first cell did all the work, dragging the others behind it; the overall performance would decrease,” says Hannezo. “But that’s not the case. Within the trains, all the cells are polarized in the same direction. They are aligned, in sync, and move forward smoothly.” In other words, the trains operate like an all-wheel drive rather than just a front-wheel drive.
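    A toy simulation makes the "all-wheel drive" picture easy to see: give each cell on a lane a polarity that both settles toward +1 or -1 and is pulled toward its neighbors' polarity, and the train quickly ends up with every cell pushing in the same direction, so its speed does not depend on how many cells it contains. The update rules below are a generic cartoon inspired by the description above, not the published Nature Physics model.

    ```python
    # Cartoon of a polarity-aligned cell train on a 1D lane (not the published model).
    import numpy as np

    rng = np.random.default_rng(1)
    n_cells = 8
    pos = np.sort(rng.uniform(0.0, 10.0, n_cells))   # positions along the lane
    pol = rng.uniform(-1.0, 1.0, n_cells)            # polarity: preferred direction of each cell
    speed, coupling, dt = 1.0, 2.0, 0.1

    for _ in range(400):
        left  = np.roll(pol, 1);  left[0]   = pol[0]    # neighbor polarities (clamped at the ends)
        right = np.roll(pol, -1); right[-1] = pol[-1]
        align    = 0.5 * (left + right) - pol           # pull toward the neighbors' polarity
        bistable = pol - pol ** 3                       # each cell settles at +1 or -1 on its own
        pol += (coupling * align + bistable) * dt
        pos += speed * pol * dt                         # every cell contributes to the motion

    print(np.round(pol, 2))   # typically all cells share one polarity: an aligned, self-propelled train
    ```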
    As a next step, the theoreticians examined the effects of increasing the width of the lanes and the cell clusters in their simulations. Compared to cells moving in a single file, clusters were much slower. The explanation is quite simple: the more cells are clustered together, the more they bump into each other. These collisions cause them to polarize away from each other and move in opposite directions. The cells are not aligned properly, which disrupts the flow of movement and drastically reduces the overall speed. This phenomenon was also observed in the in vitro experiments in the Belgian lab.
    Dead end? No problem for cell clusters
    From an efficiency standpoint, it sounds like moving in clusters is not ideal. However, the model predicted that clustering also has its benefits when cells navigate through complex terrain, as they do, for instance, in the human body. To test this, the scientists added a dead end, both in the experiments and in the simulations. “Trains of cells get to the dead end quickly, but struggle to change direction. Their polarization is well aligned, and it’s very hard for them to agree on switching around,” says Brückner. “Whereas in the cluster, quite a few cells are already polarized in the other direction, making the change of direction way easier.”
    Trains or clusters?
    Naturally, the question arises: when do cells move in clusters, and when do they move in trains? The answer is that both scenarios are observed in nature. For example, some developmental processes rely on clusters of cells moving from one side to the other, while others depend on small trains of cells moving independently. “Our model doesn’t only apply to a single process. Instead, it is a broadly applicable framework showing that placing cells in an environment with geometric constraints is highly instructive, as it challenges them and allows us to decipher their interactions with each other,” Hannezo adds.
    A small train packed with information
    Recent publications by the Hannezo group suggest that cell communication propagates in waves — an interplay between biochemical signals, physical behavior, and motion. The scientists’ new model now provides a physical foundation for these cell-to-cell interactions, possibly aiding in understanding the big picture. Based on this framework, the collaborators can delve deeper into the molecular players involved in this process. According to Brückner, the behaviors revealed by these small cell trains can help us understand large-scale movements, such as those seen in entire tissues.