More stories

  • Robotic device restores wavelike muscular function involved in processes like digestion, aiding patients with compromised organs

    A team of Vanderbilt researchers has developed a wirelessly activated device that mimics the wavelike muscular function in the esophagus and small intestine responsible for transporting food and viscous fluids for digestion.
    The soft-robotic prototype, which is driven by strong magnets controlled by a wearable external actuator, can aid patients suffering from blockages caused by tumors or those requiring stents. For example, traditional esophageal stents are metal tubes used in patients with esophageal cancer, most of whom are older adults. These patients risk food being blocked from entering the stomach, creating a dangerous situation in which food can instead enter the lungs.
    Restoring the natural motion of peristalsis, the wavelike muscular transport function that takes place inside tubular human organs, “paves the way for next-generation robotic medical devices to improve the quality of life especially for the aging population,” researchers wrote in a new paper in the journal Advanced Functional Materials describing the device.
    The study was led by Xiaoguang Dong, Assistant Professor of Mechanical Engineering, in collaboration with Vanderbilt University Medical Center colleague Dr. Rishi Naik, Assistant Professor of Medicine in the Division of Gastroenterology, Hepatology and Nutrition.
    The device itself consists of a soft sheet of small magnets arrayed in parallel rows that are activated in a precise undulating motion that produces the torque required to pump various solid and liquid cargoes. “Magnetically actuated soft robotic pumps that can restore peristalsis and seamlessly integrate with medical stents have not been reported before,” Dong and the researchers report in the paper.
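    As a rough illustration of that actuation principle, the sketch below (not the authors’ code; every parameter is invented) drives a row of magnet deflections with phase-offset sinusoids so that a crest travels down the sheet, the same kind of traveling wave that pumps cargo in peristalsis.
```python
import numpy as np

# Illustrative sketch (not the authors' code): a traveling wave across parallel
# magnet rows, the actuation pattern that produces peristalsis-like pumping.
# Row count, amplitude, wavelength, and frequency are made-up parameters.
n_rows = 12          # parallel rows of magnets along the tube axis
amplitude_mm = 2.0   # peak deflection of each row (assumed)
wavelength_rows = 6  # rows per spatial wavelength (assumed)
freq_hz = 0.5        # actuation frequency (assumed)

def row_deflections(t):
    """Deflection of each magnet row at time t: a wave traveling down the sheet."""
    k = 2 * np.pi / wavelength_rows                 # spatial frequency per row
    rows = np.arange(n_rows)
    return amplitude_mm * np.sin(k * rows - 2 * np.pi * freq_hz * t)

# Sampling the pattern over one period shows the crest moving row by row,
# which is what pushes a bolus of cargo along the channel.
for t in np.linspace(0.0, 1.0 / freq_hz, 5):
    crest = int(np.argmax(row_deflections(t)))
    print(f"t = {t:4.2f} s  crest at row {crest}")
```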
    Dong, who also holds appointments in Biomedical Engineering and Electrical and Computer Engineering, said further refinements of the device could aid in other biological processes that may have been compromised by disease. For example, he said the design could be used to help transport human eggs from the ovaries when muscular function in the fallopian tubes has been impaired. In addition, the researchers said with advanced manufacturing processes, the device could be scaled down to adapt to even narrower passageways.
    Vanderbilt University School of Engineering provided funding support. Oak Ridge National Laboratory provided facility support for this research. The research team is affiliated with the Vanderbilt Institute for Surgery and Engineering (VISE).

  • Digital babies created to improve infant healthcare

    Researchers at University of Galway have created digital babies to better understand infants’ health in their critical first 180 days of life.
    The team created 360 advanced computer models that simulate the unique metabolic processes of each baby.
    The digital babies are the first sex-specific computational whole-body models representing newborn and infant metabolism with 26 organs, six cell types, and more than 80,000 metabolic reactions.
    Real-life data from 10,000 newborns, including sex, birth weight and metabolite concentrations, enabled the creation and validation of the models, which can be personalised — enabling scientists to investigate an individual infant’s metabolism for precision medicine applications.
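    For readers unfamiliar with this kind of model, the toy sketch below shows the general idea in miniature, assuming the published models follow the constraint-based (flux-balance) formalism typical of whole-body metabolic reconstructions; the three-reaction network, nutrient bound, and “growth” objective here are invented stand-ins, not values from the study.
```python
import numpy as np
from scipy.optimize import linprog

# Toy flux-balance sketch (assumption: the infant models use a constraint-based,
# COBRA-style formalism, as is typical of whole-body metabolic reconstructions).
# The real models span ~80,000 reactions; this miniature network has three.
# Metabolites: A (nutrient), B (precursor). Reactions:
#   R1: uptake -> A     R2: A -> B     R3 (growth/biomass): B ->
S = np.array([[1, -1,  0],    # mass balance for A
              [0,  1, -1]])   # mass balance for B

uptake_limit = 10.0           # hypothetical nutrient supply, e.g. from breast-milk data
c = [0, 0, -1]                # linprog minimizes, so maximizing growth flux v3 means -v3
bounds = [(0, uptake_limit), (0, None), (0, None)]

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("growth flux:", res.x[2])   # limited by nutrient uptake (prints 10.0)
```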
    The work was conducted by a team of scientists at University of Galway’s Digital Metabolic Twin Centre and Heidelberg University, led by APC Microbiome Ireland principal investigator Professor Ines Thiele.
    The team’s research aims to advance precision medicine using computational modelling. They describe the computational modelling of babies as seminal, as it enhances understanding of infant metabolism and creates opportunities to improve the diagnosis and treatment of medical conditions during the early days of a baby’s life, such as inherited metabolic diseases.
    Lead author Elaine Zaunseder of Heidelberg University said: “Babies are not just small adults — they have unique metabolic features that allow them to develop and grow up healthy. For instance, babies need more energy for regulating body temperature due to, for example, their high surface-area-to-mass ratio, but they cannot shiver in the first six months of life, so metabolic processes must ensure the infant keeps warm.

    “Therefore, an essential part of this research work was to identify these metabolic processes and translate them into mathematical concepts that could be applied in the computational model. We captured metabolism in an organ-specific manner, which offers the unique opportunity to model organ-specific energy demands that are very different in infants compared to adults.
    “As nutrition is the fuel for metabolism, we can use breast milk data from real newborns in our models to simulate the associated metabolism throughout the baby’s entire body, including various organs. Based on their nutrition, we simulated the development of digital babies over six months and showed that they will grow at the same rate as real-world infants.”
    Professor Ines Thiele, study lead on the project, said: “New-born screening programmes are crucial for detecting metabolic diseases early on, enhancing infant survival rates and health outcomes. However, the variability observed in how these diseases manifest in babies underscores the urgent need for personalised approaches to disease management.
    “Our models allow researchers to investigate the metabolism of healthy infants as well as infants suffering from inherited metabolic diseases, including those investigated in newborn screening. When simulating the metabolism of infants with a disease, the models showed we can predict known biomarkers for these diseases. Furthermore, the models accurately predicted metabolic responses to various treatment strategies, showcasing their potential in clinical settings.”
    Elaine Zaunseder added: “This work is a first step towards establishing digital metabolic twins for infants, providing a detailed view of their metabolic processes. Such digital twins have the potential to revolutionise paediatric healthcare by enabling tailored disease management for each infant’s unique metabolic needs.”

  • With programmable pixels, novel sensor improves imaging of neural activity

    Neurons communicate electrically, so to understand how they produce brain functions such as memory, neuroscientists must track how their voltage changes — sometimes subtly — on the timescale of milliseconds. In a new paper in Nature Communications, MIT researchers describe a novel image sensor that can substantially enhance that ability.
    The invention led by Jie Zhang, a postdoctoral scholar in The Picower Institute for Learning and Memory lab of Sherman Fairchild Professor Matt Wilson, is a new take on the standard “CMOS” technology used in scientific imaging. In that standard approach, all pixels turn on and off at the same time — a configuration with an inherent trade-off in which fast sampling means capturing less light. The new chip enables each pixel’s timing to be controlled individually. That arrangement provides a “best of both worlds” in which neighboring pixels can essentially complement each other to capture all the available light without sacrificing speed.
    In experiments described in the study, Zhang and Wilson’s team demonstrates how “pixelwise” programmability enabled them to improve visualization of neural voltage “spikes,” which are the signals neurons use to communicate with each other, and even the more subtle, momentary fluctuations in their voltage that constantly occur between those spiking events.
    “Measuring with single-spike resolution is really important as part of our research approach,” said senior author Wilson, a Professor in MIT’s Departments of Biology and Brain and Cognitive Sciences (BCS), whose lab studies how the brain encodes and refines spatial memories both during wakeful exploration and during sleep. “Thinking about the encoding processes within the brain, single spikes and the timing of those spikes is important in understanding how the brain processes information.”
    For decades Wilson has helped to drive innovations in the use of electrodes to tap into neural electrical signals in real-time, but like many researchers he has also sought visual readouts of electrical activity because they can highlight large areas of tissue and still show which exact neurons are electrically active at any given moment. Being able to identify which neurons are active can enable researchers to learn which types of neurons are participating in memory processes, providing important clues about how brain circuits work.
    In recent years, neuroscientists including co-senior author Ed Boyden, Y. Eva Tan Professor of Neurotechnology in BCS and The McGovern Institute for Brain Research and a Picower Institute affiliate, have worked to meet that need by inventing “genetically encoded voltage indicators” (GEVIs), which make cells glow as their voltage changes in real-time. But as Zhang and Wilson have tried to employ GEVIs in their research, they’ve found that conventional CMOS image sensors were missing a lot of the action. If they operated too fast, they wouldn’t gather enough light. If they operated too slowly, they’d miss rapid changes.
    But image sensors have such fine resolution that many pixels are really looking at essentially the same place on the scale of a whole neuron, Wilson said. Recognizing that there was resolution to spare, Zhang applied his expertise in sensor design to invent an image sensor chip that would enable neighboring pixels to each have their own timing. Faster ones could capture rapid changes. Slower-working ones could gather more light. No action or photons would be missed. Zhang also cleverly engineered the required control electronics so they barely cut into the space available for light-sensitive elements on a pixel. This ensured the sensor’s high sensitivity under low-light conditions, Zhang said.

    Two demos
    In the study the researchers demonstrated two ways in which the chip improved imaging of voltage activity of mouse hippocampus neurons cultured in a dish. They ran their sensor head-to-head against an industry standard scientific CMOS image sensor chip.
    In the first set of experiments, the team sought to image the fast dynamics of neural voltage. On the conventional CMOS chip, each pixel had a zippy 1.25-millisecond exposure time. On the pixelwise sensor, each pixel in neighboring groups of four stayed on for 5 milliseconds, but their start times were staggered so that each one turned on and off 1.25 milliseconds later than the next. Because each pixel was on longer, it gathered more light, and because a new exposure began somewhere in the group every 1.25 milliseconds, the group effectively sampled just as fast. The result was a doubling of the signal-to-noise ratio for the pixelwise chip, which achieves high temporal resolution at a fraction of the sampling rate of conventional CMOS chips, Zhang said.
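    The arithmetic behind that staggering is simple enough to sketch. The snippet below is illustrative only, though the 5-millisecond and 1.25-millisecond figures come from the study; it prints the exposure windows of a four-pixel group and the resulting effective sampling interval.
```python
import numpy as np

# Staggered ("pixelwise") exposure scheme described above: four neighboring
# pixels each integrate for 5 ms, with start times offset by 1.25 ms, so a
# fresh exposure begins somewhere in the group every 1.25 ms.
exposure_ms = 5.0
stagger_ms = 1.25
group_size = 4

for p in range(group_size):
    starts = stagger_ms * p + np.arange(3) * exposure_ms   # first few frames
    windows = ", ".join(f"[{s:5.2f}, {s + exposure_ms:5.2f}]" for s in starts)
    print(f"pixel {p}: exposure windows (ms) {windows}")

# Each pixel still integrates light for the full 5 ms (the source of the
# signal-to-noise gain), while the group as a whole opens a new window every
# 1.25 ms, preserving the fast effective temporal resolution.
print("effective sampling interval (ms):", exposure_ms / group_size)
```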
    Moreover, the pixelwise chip detected neural spiking activities that the conventional sensor missed. And when the researchers compared the performance of each kind of sensor against the electrical readings made with a traditional patch clamp electrode, they found that the staggered pixelwise measurements better matched that of the patch clamp.
    In the second set of experiments, the team sought to demonstrate that the pixelwise chip could capture both the fast dynamics and the slower, more subtle “subthreshold” voltage variations neurons exhibit. To do so they varied the exposure durations of neighboring pixels in the pixelwise chip, ranging from 15.4 milliseconds down to just 1.9 milliseconds. In this way, fast pixels sampled every quick change (albeit faintly), while slower pixels integrated enough light over time to track even subtle, slower fluctuations. By integrating the data from each pixel, the chip was indeed able to capture both fast spiking and slower subthreshold changes, the researchers reported.
    The experiments with small clusters of neurons in a dish were only a proof of concept, Wilson said. His lab’s ultimate goal is to conduct brain-wide, real-time measurements of activity in distinct types of neurons in animals even as they are freely moving about and learning how to navigate mazes. The development of GEVIs and of image sensors like the pixelwise chip that can successfully take advantage of what they show is crucial to making that goal feasible.

    “That’s the idea of everything we want to put together: large-scale voltage imaging of genetically tagged neurons in freely behaving animals,” Wilson said.
    To achieve this, Zhang added, “We are already working on the next iteration of chips with lower noise, higher pixel counts, time-resolution of multiple kHz, and small form factors for imaging in freely behaving animals.”
    The research is advancing pixel by pixel.
    In addition to Zhang, Wilson, and Boyden, the paper’s other authors are Jonathan Newman, Zeguan Wang, Yong Qian, Pedro Feliciano-Ramos, Wei Guo, Takato Honda, Zhe Sage Chen, Changyang Linghu, Ralph Etienne-Cummings, and Eric Fossum.
    The Picower Institute for Learning and Memory, The JPB Foundation, the Alana Foundation, The Louis B. Thalheimer Fund for Translational Research, the National Institutes of Health, HHMI, Lisa Yang and John Doerr provided support for the research.

  • Discovery highlights ‘critical oversight’ in perceived security of wireless networks

    A research team led by Rice University’s Edward Knightly has uncovered an eavesdropping security vulnerability in high-frequency and high-speed wireless backhaul links, widely employed in critical applications such as 5G wireless cell phone signals and low-latency financial trading on Wall Street.
    Contrary to the common belief that these links are inherently secure due to their elevated positioning and highly directive millimeter-wave and sub-terahertz “pencil-beams,” the team exposed a novel method of interception using a metasurface-equipped drone dubbed MetaFly. Their findings were published by the world’s premier security conference, IEEE Symposium on Security and Privacy, in May 2024.
    “The implications of our research are far-reaching, potentially affecting a broad spectrum of companies, government agencies and individuals relying on these links,” said Knightly, the Sheafor-Lindsay Professor of Electrical and Computer Engineering and professor of computer science. “Importantly, understanding this vulnerability is the first step toward developing robust countermeasures.”
    Wireless backhaul links, crucial for the backbone of modern communication networks connecting end users to the main networks, have been assumed immune from eavesdropping because of their underlying physical and technological barriers.
    Knightly and electrical and computer engineering Ph.D. research assistant Zhambyl Shaikhanov, in collaboration with researchers at Brown University and Northeastern University, have demonstrated how a strong adversary can bypass these defenses with alarming ease. By deploying MetaFly, they intercepted high-frequency signals between rooftops in the Boston metropolitan area, leaving almost no trace.
    “Our discovery highlights a critical oversight in the perceived security of our wireless backhaul links,” Shaikhanov said.
    As wireless technology advances into the realms of 5G and beyond, ensuring the security of these networks is paramount. The Rice team’s work is a significant step toward understanding sophisticated threats such as MetaFly and toward safeguarding communication infrastructure.
    Other members of the research team include Sherif Badran, Northeastern graduate researcher and co-lead author; Josep M. Jornet, professor of electrical and computer engineering at Northeastern; Hichem Guerboukha, assistant professor of electrical and computer engineering at University of Missouri-Kansas City; and Daniel M. Mittleman, professor of engineering at Brown.

  • Liquid metal-based electronic logic device that mimics intelligent prey-capture mechanism of Venus flytrap

    A research team led by the School of Engineering of the Hong Kong University of Science and Technology (HKUST) has developed a liquid metal-based electronic logic device that mimics the intelligent prey-capture mechanism of Venus flytraps. Exhibiting memory and counting properties, the device can intelligently respond to various stimulus sequences without the need for additional electronic components. The intelligent strategies and logic mechanisms in the device provide a fresh perspective on understanding “intelligence” in nature and offer inspiration for the development of “embodied intelligence.”
    The unique prey-capture mechanism of Venus flytraps has always been an intriguing research focus in the realm of biological intelligence. This mechanism allows them to effectively distinguish between various external stimuli such as single and double touches, thereby distinguishing between environmental disturbances such as raindrops (single touch) and insects (double touches), ensuring successful prey capture. This functionality is primarily attributed to the sensory hairs on the carnivorous plants, which exhibit features akin to memory and counting, enabling them to perceive stimuli, generate action potentials (a change of electrical signals in cells in response to stimulus), and remember the stimuli for a short duration.
    Inspired by the internal electrical signal accumulation/decay model of Venus flytraps, Prof. SHEN Yajing, Associate Professor of the Department of Electronic and Computer Engineering (ECE) at HKUST, who led the research, joined hands with his former PhD student at City University of Hong Kong, Dr. YANG Yuanyuan, now Associate Professor at Xiamen University, to propose a liquid metal-based logic module (LLM) based on the extension/contraction deformation of liquid metal wires. The device employs liquid metal wires in sodium hydroxide solution as the conductive medium, controlling the length of the liquid metal wires based on electrochemical effects, thereby regulating cathode output according to the stimuli applied to the anode and gate. Research results demonstrate that the LLM itself can memorize the duration and interval of electrical stimuli, calculate the accumulation of signals from multiple stimuli, and exhibit significant logical functions similar to those of Venus flytraps.
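    One way to picture that accumulation/decay behaviour, independent of the liquid-metal hardware, is the toy leaky-integrator model below; the time constant, threshold, and touch amplitude are assumptions chosen only to reproduce the single-touch versus double-touch distinction, not values from the paper.
```python
import numpy as np

# Toy accumulation/decay ("leaky integrator") model of flytrap-style counting:
# each stimulus adds to an internal signal that decays over time, and the trap
# fires only if a second stimulus arrives before the first has decayed away.
# All parameters are assumptions for illustration.
tau_s = 20.0       # decay time constant, in seconds (assumed)
threshold = 1.5    # firing threshold; one touch (amplitude 1.0) is not enough

def trap_fires(touch_times, t_end=60.0, dt=0.1):
    signal, t, fired = 0.0, 0.0, False
    touches = sorted(touch_times)
    while t <= t_end:
        signal *= np.exp(-dt / tau_s)          # passive decay
        while touches and touches[0] <= t:     # apply any touch due at this step
            signal += 1.0
            touches.pop(0)
        fired = fired or signal >= threshold
        t += dt
    return fired

print(trap_fires([5.0]))        # one touch (a raindrop): no capture -> False
print(trap_fires([5.0, 15.0]))  # two touches 10 s apart (an insect)  -> True
print(trap_fires([5.0, 55.0]))  # two touches too far apart           -> False
```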
    To demonstrate, Prof. Shen and Dr. Yang constructed an artificial Venus flytrap system comprising the LLM intelligent decision-making device, switch-based sensory hair, and soft electric actuator-based petal, replicating the predation process of Venus flytraps. Furthermore, they showcased the potential applications of LLM in functional circuit integration, filtering, artificial neural networks, and more. Their work not only provides insights into simulating intelligent behaviors in plants, but also serves as a reliable reference for the development of subsequent biological signal simulator devices and biologically inspired intelligent systems.
    “When people mention ‘artificial intelligence’, they generally think of intelligence that mimics animal nervous systems. However, in nature, many plants can also demonstrate intelligence through specific material and structural combinations. Research in this direction provides a new perspective and approach for us to understand ‘intelligence’ in nature and construct ‘life-like intelligence’,” said Prof. Shen.
    “Several years ago, when Dr. Yang was still pursuing her PhD in my research group, we discussed the idea of constructing intelligent entities inspired by plants together. It is gratifying that after several years of effort, we have achieved the conceptual verification and simulation of Venus flytrap intelligence. However, it is worth noting that this work is still relatively preliminary, and there is much work to be done in the future, such as designing more efficient structures, reducing the size of devices, and improving system responsiveness,” added Prof. Shen.

  • The unexpected origins of a modern finance tool

    In the early 1600s, the officials running Durham Cathedral, in England, had serious financial problems. Soaring prices had raised expenses. Most cathedral income came from renting land to tenant farmers, who had long leases so officials could not easily raise the rent. Instead, church leaders started charging periodic fees, but these often made tenants furious. And the 1600s, a time of religious schism, was not the moment to alienate church members.
    But in 1626, Durham officials found a formula for fees that tenants would accept. If tenant farmers paid a fee equal to one year’s net value of the land, it earned them a seven-year lease. A fee equal to 7.75 years of net value earned a 21-year lease.
    This was a form of discounting, the now-common technique for evaluating the present and future value of money by assuming a certain rate of return on that money. The Durham officials likely got their numbers from new books of discounting tables. Volumes like this had never existed before, but suddenly local church officials were applying the technique up and down England.
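    For readers who want the arithmetic spelled out, the short sketch below computes the kind of present-value figures such tables listed; the 8 percent rate is purely illustrative, not the rate any seventeenth-century table actually assumed.
```python
# A minimal sketch of the calculation the discounting tables packaged: the
# present value of money due in the future, assuming a rate of return could be
# earned on it in the meantime. The 8% rate here is illustrative only.
rate = 0.08

def present_value(amount, years, r=rate):
    """Value today of `amount` received `years` from now."""
    return amount / (1 + r) ** years

def annuity_pv(payment, years, r=rate):
    """Value today of equal yearly payments (one "year's value" of the land,
    received each year) -- the kind of figure the lease tables tabulated."""
    return sum(present_value(payment, y, r) for y in range(1, years + 1))

print(round(present_value(100, 10), 2))  # 100 due a decade out is worth ~46 today
print(round(annuity_pv(1, 7), 2))        # ~5.21 "years' value" for 7 years of rent
print(round(annuity_pv(1, 21), 2))       # ~10.02 for 21 years
```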
    As financial innovation stories go, this one is unusual. Normally, avant-garde financial tools might come from, well, the financial avant-garde — bankers, merchants, and investors hunting for short-term profits, not clergymen.
    “Most people have assumed these very sophisticated calculations would have been implemented by hard-nosed capitalists, because really powerful calculations would allow you to get an economic edge and increase profits,” says MIT historian William Deringer, an expert in the deployment of quantitative reasoning in public life. “But that was not the primary or only driver in this situation.”
    Deringer has published a new research article about this episode, “Mr. Aecroid’s Tables: Economic Calculations and Social Customs in the Early Modern Countryside,” appearing in the current issue of the Journal of Modern History. In it, he uses archival research to explore how the English clergy started using discounting, and where. And one other question: Why?
    Enter inflation
    Today, discounting is a pervasive tool. A dollar in the present is worth more than a dollar a decade from now, since one can earn money investing it in the meantime. This concept heavily informs investment markets, corporate finance, and even the NFL draft (where trading this year’s picks yields a greater haul of future picks). As the historian William N. Goetzmann has written, the related idea of net present value “is the most important tool in modern finance.” But while discounting was known as far back as the mathematician Leonardo of Pisa (often called Fibonacci) in the 1200s, why were English clergy some of its most enthusiastic early adopters?

    The answer involves a global change in the 1500s: the “price revolution,” in which things began costing more, after a long period when prices had been constant. That is, inflation hit the world.
    “People up to that point lived with the expectation that prices would stay the same,” Deringer says. “The idea that prices changed in a systematic way was shocking.”
    For Durham Cathedral, inflation meant the organization had to pay more for goods while three-quarters of its revenues came from tenant rents, which were hard to alter. Many leases were complex, and some were locked in for a tenant’s lifetime. The Durham leaders did levy intermittent fees on tenants, but that led to angry responses and court cases.
    Meanwhile, tenants had additional leverage against the Church of England: religious competition following the Reformation. England’s political and religious schisms would lead it to a midcentury civil war. Maybe some private landholders could drastically increase fees, but the church did not want to lose followers that way.
    “Some individual landowners could be ruthlessly economic, but the church couldn’t, because it’s in the midst of incredible political and religious turmoil after the Reformation,” Deringer says. “The Church of England is in this precarious position. They’re walking a line between Catholics who don’t think there should have been a Reformation, and Puritans who don’t think there should be bishops. If they’re perceived to be hurting their flock, it would have real consequences. The church is trying to make the finances work but in a way that’s just barely tolerable to the tenants.”
    Enter the books of discounting tables, which allowed local church leaders to finesse the finances. Essentially, discounting more carefully calibrated the upfront fees tenants would periodically pay. Church leaders could simply plug in the numbers as compromise solutions.

    In this period, England’s first prominent discounting book with tables was published in 1613; its most enduring, Ambrose Acroyd’s “Table of Leasses and Interest,” dated to 1628-29. Acroyd was the bursar at Trinity College at Cambridge University, which as a landholder (and church-affiliated institution) faced the same issues concerning inflation and rent. Durham Cathedral began using off-the-shelf discounting formulas in 1626, resolving decades of localized disagreement as well.
    Performing fairness
    The discounting tables from books did not only work because the price was right. Once circulating clergy had popularized the notion throughout England, local leaders could justify using the books because others were doing it. The clergy were “performing fairness,” as Deringer puts it.
    “Strict calculative rules assured tenants and courts that fines were reasonable, limiting landlords’ ability to maximize revenues,” Deringer writes in the new article.
    To be sure, local church leaders in England were using discounting for their own economic self-interest. It just wasn’t the largest short-term economic self-interest possible. And it was a sound strategy.
    “In Durham they would fight with tenants every 20 years [in the 1500s] and come to a new deal, but eventually that evolves into these sophisticated mechanisms, the discounting tables,” Deringer adds. “And you get standardization. By about 1700, it seems like these procedures are used everywhere.”
    Thus, as Deringer writes, “mathematical tables for setting fines were not so much instruments of a capitalist transformation as the linchpin holding together what remained of an older system of customary obligations stretched nearly to breaking by macroeconomic forces.”
    Once discounting was widely introduced, it never went away. Deringer’s Journal of Modern History article is part of a larger book project he is currently pursuing, about discounting in many facets of modern life.
    Deringer was able to piece together the history of discounting in 17th-century England thanks in part to archival clues. For instance, Durham University owns a 1686 discounting book self-described as an update to Acroyd’s work; that copy was owned by a Durham Cathedral administrator in the 1700s. Of the 11 existing copies of Acroyd’s work, two are at Canterbury Cathedral and Lincoln Cathedral.
    Hints like that helped Deringer recognize that church leaders were very interested in discounting; his further research helped him see that this chapter in the history of discounting is not merely about finance; it also opens a new window into the turbulent 1600s.
    “I never expected to be researching church finances, I didn’t expect it to have anything to do with the countryside, landlord-tenant relationships, and tenant law,” Deringer says. “I was seeing this as an interesting example of a story about bottom-line economic calculation, and it wound up being more about this effort to use calculation to resolve social tensions.”

  • Study offers a better way to make AI fairer for everyone

    In a new paper, researchers from Carnegie Mellon University and Stevens Institute of Technology show a new way of thinking about the fair impacts of AI decisions. They draw on a well-established tradition known as social welfare optimization, which aims to make decisions fairer by focusing on the overall benefits and harms to individuals. This method can be used to evaluate the industry-standard assessment tools for AI fairness, which look at approval rates across protected groups.
    “In assessing fairness, the AI community tries to ensure equitable treatment for groups that differ in economic level, race, ethnic background, gender, and other categories,” explained John Hooker, professor of operations research at the Tepper School at Carnegie Mellon, who coauthored the study and presented the paper at the International Conference on the Integration of Constraint Programming, Artificial Intelligence, and Operations Research (CPAIOR) on May 29 in Uppsala, Sweden. The paper received the Best Paper Award.
    Imagine a situation where an AI system decides who gets approved for a mortgage or who gets a job interview. Traditional fairness methods might only ensure that the same percentage of people from different groups get approved. But what if being denied a mortgage has a much bigger negative impact on someone from a disadvantaged group than on someone from an advantaged group? By employing a social welfare optimization method, AI systems can make decisions that lead to better outcomes for everyone, especially for those in disadvantaged groups.
    The study focuses on “alpha fairness,” a method of finding a balance between being fair and getting the most benefit for everyone. Alpha fairness can be adjusted to balance fairness and efficiency more or less, depending on the situation.
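    For the mathematically inclined, the snippet below implements the standard alpha-fairness welfare function (a common textbook formulation; the paper’s exact setup may differ) and shows how raising alpha shifts the preferred allocation toward the worse-off group; the two allocations are hypothetical.
```python
import numpy as np

# Standard alpha-fairness social welfare function (a common formulation; the
# paper's exact setup may differ). alpha = 0 is pure efficiency (sum of
# utilities); larger alpha weights the worst-off more heavily; as alpha grows
# the criterion approaches max-min (Rawlsian) fairness.
def alpha_welfare(utilities, alpha):
    u = np.asarray(utilities, dtype=float)
    if np.isclose(alpha, 1.0):
        return np.log(u).sum()                       # proportional fairness
    return (u ** (1.0 - alpha)).sum() / (1.0 - alpha)

# Two hypothetical allocations of benefit across an advantaged and a
# disadvantaged group: A maximizes the total, B lifts the worse-off group.
A = [9.0, 1.0]
B = [6.0, 3.0]
for alpha in [0.0, 1.0, 2.0]:
    better = "A" if alpha_welfare(A, alpha) > alpha_welfare(B, alpha) else "B"
    print(f"alpha = {alpha}: preferred allocation is {better}")
```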
    Hooker and his coauthors show how social welfare optimization can be used to compare different assessments for group fairness currently used in AI. By using this method, we can understand the benefits of applying different group fairness tools in different contexts. It also ties these group fairness assessment tools to the larger world of fairness-efficiency standards used in economics and engineering.
    “Our findings suggest that social welfare optimization can shed light on the intensely discussed question of how to achieve group fairness in AI,” said study coauthor Leben.
    The study is important for both AI system developers and policymakers. Developers can create more equitable and effective AI models by adopting a broader approach to fairness and understanding the limitations of fairness measures. It also highlights the importance of considering social justice in AI development, ensuring that technology promotes equity across diverse groups in society.

  • How do you know where a fish goes?

    When scientists want to study the long-distance movement of marine animals, they instrument the animals with a small device called an acoustic transmitter — or tag — which emits unique signals or “pings.” These signals are picked up by receivers anchored to the seafloor that record the date and time of each detection when the tagged animal comes within range.
    Data collected by the receivers are stored until they are retrieved by researchers and shared across members of cooperative acoustic telemetry networks. This information provides valuable insights into animal behavior, migration patterns, habitat preferences and ecosystem dynamics — all of which are vital for conservation and wildlife management efforts.
    However, this method is not without limitations. Receivers must be physically retrieved to access the data they have collected. For marine studies, receivers are often placed near the coast for easy access, but their distribution can be uneven, with some areas having few receivers and others having many. This can lead to biased data collection, especially for animals that move across large distances.
    A pioneering study by researchers at Florida Atlantic University and the Smithsonian Environmental Research Center addresses these limitations by filling in the gaps in sporadic detection data and tackles the tradeoff between spatial coverage and cost. Using a movement model, researchers reconstructed animal tracks and leveraged an iterative process to measure the accuracy and precision of these reconstructions from acoustic telemetry data.
    Results of the study, published in the journal Methods in Ecology and Evolution, demonstrate how researchers can apply these techniques and measure the accuracy and precision of the methods to their study sites.
    For the study, researchers simulated animal tracks on a computer, then tested how well their method could accurately reconstruct the tracks if they received detection data only from a cooperative acoustic telemetry array. While most of the data used were simulated, they further tested their methodology with data from highly migratory blacktip sharks (Carcharhinus limbatus) to demonstrate how this method can be applied ecologically.
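    The sketch below mimics that simulation setup in miniature (it is not the study’s code): a random-walk track crosses a handful of fixed receivers, and only the timestamped detections made within range are logged, the sparse record a movement model must then reconstruct the track from; receiver positions and detection range are invented.
```python
import numpy as np

# Miniature version of the simulation setup described above (not the study's
# code): simulate a 2-D random-walk track, then log a "detection" whenever the
# animal passes within range of a fixed receiver. Positions and range are
# invented for illustration.
rng = np.random.default_rng(0)
receivers = np.array([[10.0, 0.0], [30.0, 5.0], [60.0, -5.0]])  # receiver x, y (km)
detection_range_km = 3.0

position = np.zeros(2)
detections = []
for step in range(200):
    position += rng.normal(loc=[0.5, 0.0], scale=1.0, size=2)   # eastward drift + noise
    dists = np.linalg.norm(receivers - position, axis=1)
    nearest = int(np.argmin(dists))
    if dists[nearest] <= detection_range_km:
        detections.append((step, nearest))   # (time step, receiver id), like a ping log

print(f"{len(detections)} detections out of 200 steps")
print("first few (step, receiver):", detections[:5])
```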
    Findings demonstrate that their novel method can improve track reconstruction, even in regions with uneven receiver coverage. The track reconstruction methods performed well in coastal regions from Palm Beach County to Long Island, minimizing the clustering effect of high densities of receivers and closing the gaps in some regions that were lacking receiver coverage. Performance was primarily affected by the presence or absence of receivers, and to a lesser extent by receiver density and water depth, depending on the grid resolution.

    “Our approach could significantly reduce gaps in data collection and improve the reliability of ecological insights,” said Beth Bowers, Ph.D., senior author and a post-doctoral researcher at the Smithsonian Environmental Research Center, who conducted the research as a Ph.D. student with her mentor, Stephen Kajiura, Ph.D., in FAU’s Charles E. Schmidt College of Science. “Importantly, this method doesn’t rely on costly field techniques such as motorboat towed tests, which makes it suitable for large-scale studies across diverse habitats.”
    This new method increases the utility of acoustic telemetry technology and provides a framework for future studies to assess the accuracy and precision of animal movement calculated from track reconstructions that use acoustic telemetry.
    “Results from our study will enable resource managers and others to infer the reliability of ecological results in their decision-making processes,” said Kajiura, co-author and a professor of biological sciences, FAU College of Science.
    To foster collaboration and innovation, the researchers have made their data repository accessible, empowering fellow scientists to adapt and apply the methodology to their respective study organisms and habitats, whether that encompasses marine, freshwater or terrestrial habitats.
    “Importantly, implications of our findings extend beyond marine environments, offering a transformative approach to wildlife monitoring across aquatic and terrestrial landscapes,” said Bowers.