More stories

  • Simulated human eye movement aims to train metaverse platforms

    Computer engineers at Duke University have developed virtual eyes that simulate how humans look at the world accurately enough for companies to train virtual reality and augmented reality programs. Called EyeSyn for short, the program will help developers create applications for the rapidly expanding metaverse while protecting user data.
    The results have been accepted and will be presented at the International Conference on Information Processing in Sensor Networks (IPSN), May 4-6, 2022, a leading annual forum on research in networked sensing and control.
    “If you’re interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that,” said Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke.
    “But training that kind of algorithm requires data from hundreds of people wearing headsets for hours at a time,” Gorlatova added. “We wanted to develop software that not only reduces the privacy concerns that come with gathering this sort of data, but also allows smaller companies who don’t have those levels of resources to get into the metaverse game.”
    The poetic insight describing eyes as the windows to the soul has been repeated since at least Biblical times for good reason: the tiny movements of our eyes and the dilation of our pupils provide a surprising amount of information. Human eyes can reveal if we’re bored or excited, where our concentration is focused, whether we’re expert or novice at a given task, or even if we’re fluent in a specific language.
    “Where you’re prioritizing your vision says a lot about you as a person, too,” Gorlatova said. “It can inadvertently reveal sexual and racial biases, interests that we don’t want others to know about, and information that we may not even know about ourselves.”
    Eye movement data is invaluable to companies building platforms and software in the metaverse. For example, reading a user’s eyes allows developers to tailor content to their engagement, or to reduce resolution in their peripheral vision to save computational power.
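    The reading example quoted above hints at how synthetic data can stand in for real headset recordings. The sketch below is hypothetical, not EyeSyn’s actual pipeline or feature set: it only shows the general idea of generating labelled synthetic gaze features for two activities, training a classifier on them, and reserving real eye-tracking traces for a small evaluation set.

        # Illustrative sketch only; feature names, distributions, and labels are hypothetical.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        def synthetic_gaze_features(n, dense_text):
            # Hypothetical per-window summaries: mean fixation duration (ms),
            # mean saccade amplitude (deg), fixations per second.
            if dense_text:   # e.g. "advanced literature"
                fix, sacc, rate = rng.normal(260, 40, n), rng.normal(2.0, 0.5, n), rng.normal(3.5, 0.5, n)
            else:            # e.g. "comic book": sparser text, larger gaze jumps
                fix, sacc, rate = rng.normal(180, 40, n), rng.normal(6.0, 1.5, n), rng.normal(2.0, 0.5, n)
            return np.column_stack([fix, sacc, rate])

        X = np.vstack([synthetic_gaze_features(500, True), synthetic_gaze_features(500, False)])
        y = np.array([1] * 500 + [0] * 500)   # 1 = dense text, 0 = comic

        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
        # A real workflow would now validate on a small set of genuine eye-tracking traces.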

  • Harnessing AI and Robotics to treat spinal cord injuries

    By employing artificial intelligence (AI) and robotics to formulate therapeutic proteins, a team led by Rutgers researchers has successfully stabilized an enzyme able to degrade scar tissue resulting from spinal cord injuries and promote tissue regeneration.
    The study, recently published in Advanced Healthcare Materials, details the team’s groundbreaking stabilization of the enzyme Chondroitinase ABC (ChABC), offering new hope for patients coping with spinal cord injuries.
    “This study represents one of the first times artificial intelligence and robotics have been used to formulate highly sensitive therapeutic proteins and extend their activity by such a large amount. It’s a major scientific achievement,” said Adam Gormley, the project’s principal investigator and an assistant professor of biomedical engineering at Rutgers School of Engineering (SOE) at Rutgers University-New Brunswick.
    Gormley said his research is also motivated, in part, by a personal connection to spinal cord injury.
    “I’ll never forget being at the hospital and learning a close college friend would likely never walk again after being paralyzed from the waist down after a mountain biking accident,” Gormley recalled. “The therapy we are developing may someday help people such as my friend lessen the scar on their spinal cords and regain function. This is a great reason to wake up in the morning and fight to further the science and potential therapy.”
    Shashank Kosuri, a biomedical engineering doctoral student at Rutgers SOE and a lead author of the study, noted that spinal cord injuries, or SCIs, can negatively impact the physical, psychological, and socio-economic well-being of patients and their families. Soon after an SCI, a secondary cascade of inflammation produces dense scar tissue that can inhibit or prevent nervous tissue regeneration.
    The enzyme successfully stabilized in the study, ChABC, is known to degrade scar tissue molecules and promote tissue regeneration, yet it is highly unstable at the human body temperature of 98.6° F (37° C) and loses all activity within a few hours. Kosuri noted that this necessitates multiple, expensive infusions at very high doses to maintain therapeutic efficacy.
    Synthetic copolymers are able to wrap around enzymes such as ChABC and stabilize them in hostile microenvironments. In order to stabilize the enzyme, the researchers utilized an AI-driven approach with liquid handling robotics to synthesize and test the ability of numerous copolymers to stabilize ChABC and maintain its activity at 98.6° F.
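    A minimal sketch of the kind of AI-guided screening loop described above, under stated assumptions: this is not the Rutgers team’s code, and the monomer encoding, candidate pool, and run_robot_assay() placeholder are hypothetical. It only shows the loop’s shape: fit a surrogate model to measured activity, let it nominate the next batch of copolymer recipes for the liquid-handling robot, and repeat.

        # Illustrative sketch only; not the published workflow.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)

        # Each candidate copolymer = fractions of three hypothetical monomers summing to 1.
        candidates = rng.dirichlet(np.ones(3), size=2000)

        def run_robot_assay(recipes):
            # Placeholder for robotic synthesis plus an activity assay at 37 C (98.6 F):
            # returns the fraction of ChABC activity retained (toy response surface).
            return np.clip(1.0 - 2.0 * np.abs(recipes[:, 0] - 0.4)
                           + rng.normal(0, 0.05, len(recipes)), 0, 1)

        tested = rng.choice(len(candidates), size=24, replace=False)   # seed batch
        X, y = candidates[tested], run_robot_assay(candidates[tested])

        for _ in range(4):
            surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
            batch = np.argsort(surrogate.predict(candidates))[-24:]    # most promising recipes
            # (A real loop would also avoid re-testing recipes and balance exploration.)
            X = np.vstack([X, candidates[batch]])
            y = np.concatenate([y, run_robot_assay(candidates[batch])])

        print("best retained activity found:", round(float(y.max()), 2))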
    While the researchers were able to identify several copolymers that performed well, Kosuri reported that one copolymer combination even retained 30% of the enzyme’s activity for up to one week, a promising result for patients seeking care for spinal cord injuries.
    The study received support from grants funded by the National Institutes of Health, the National Science Foundation, and The New Jersey Commission on Spinal Cord research. In addition to Gormley and Kosuri, the Rutgers research team also included SOE Professor Li Cai and Distinguished Professor Martin Yarmush, as well as several SOE-affiliated students. Faculty and students from Princeton University’s Department of Chemical and Biological Engineering also collaborated on the project.
    Story Source:
    Materials provided by Rutgers University. Original written by Emily Everson Layden.

  • Some deep-sea octopuses aren’t the long-haul moms scientists thought they were

    Octopuses living in the deep sea off the coast of California are breeding far faster than expected.

    The animals lay their eggs near geothermal springs, and the warmer water speeds up embryonic development, researchers report February 28 at the virtual 2022 Ocean Sciences Meeting. That reproductive sleight of hand means that the octopus moms brood for less than two years, instead of the estimated 12.

    In 2018, scientists working off the coast of California discovered thousands of deep-sea octopuses (Muusoctopus robustus) congregated on a patch of seafloor about 3,200 meters below the surface. Many of the grapefruit-sized animals were females brooding clutches of eggs, leading researchers to dub the site the Octopus Garden.

    But with water temperatures hovering around a frigid 1.6° Celsius, growth in this garden was predicted to be leisurely. In octopuses, embryonic development tends to slow down at low temperatures, says marine ecologist Jim Barry of the Monterey Bay Aquarium Research Institute in Moss Landing, Calif. “When you get really cold, down near zero, that’s when brood periods get really long.”

    The record for the longest brood period of any animal, just over four years, is held by a different species of octopus living in warmer water (SN: 7/30/14). M. robustus, thriving in the chilly depths of the Octopus Garden, was therefore a serious contender to snatch that title, Barry says. “If you look at its predicted brood period at 1.6° C, it’s over 12 years.”

    To verify what would be a record-setting stint of motherhood, Barry and his colleagues repeatedly visited the Octopus Garden from 2019 to 2021 using a remotely operated vehicle. The team trained cameras at the octopus eggs, which resemble white fingers, to monitor their rate of development. With one of the submersible’s robotic arms, the researchers also gently nudged dozens of octopuses aside and measured the water temperature in their nests.

    The team found that relatively warm water — up to 10.5° C — bathed all the egg clutches. The female octopuses are preferentially laying their eggs in streams of geothermally heated water, the researchers realized. That discovery was a tip-off that these animals are not the long-haul moms people thought them to be, Barry says. “We’re virtually certain these animals are breeding far more rapidly than you’d expect.”

    Image caption: Deep-sea octopuses (Muusoctopus robustus) brood clutches of eggs, which look like white fingers. Credit: Ocean Exploration Trust, NOAA

    Based on observations of the developing eggs, Barry and colleagues calculated that the moms brooded for only about 600 days, or about a year and a half. That is much faster than predicted, says Jeffrey Drazen, a deep-sea ecologist at the University of Hawaii at Manoa who was not involved in the research. “They’re cutting a huge amount of time off of their parental care period.”
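
    A back-of-the-envelope check (not a calculation from the study) shows how sensitive brood time is to temperature here. Using the reported figures, a 12-year prediction at 1.6° C versus roughly 600 observed days, and assuming the clutches actually developed near the warm end of the measured range, the implied Q10 (the factor by which development speeds up per 10-degree rise) comes out high:

        # Rough illustration using the numbers reported above; real embryonic
        # development is unlikely to follow a simple Q10 rule exactly.
        t_cold_years = 12.0          # predicted brood period at 1.6 C
        t_warm_years = 600 / 365.0   # roughly 600 observed brood days
        delta_T = 10.5 - 1.6         # difference between nest and ambient temperature, in C

        q10 = (t_cold_years / t_warm_years) ** (10.0 / delta_T)
        print(f"implied Q10 = {q10:.1f}")   # about 9: development speeds up ~9x per 10 C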

    There is also an evolutionary advantage to seeking out warmer water: Shorter brood periods mean that fewer eggs are likely to be gobbled up by predators. And these octopuses seem to know that, Barry says. “We believe they’re exploiting that thermal energy to improve reproductive success.”

    Only a few other marine animals, such as icefish in Antarctica’s Weddell Sea (SN: 1/13/22), are known to seek out warmer conditions when breeding. But there are probably other species that do the same, Drazen says. The challenge is finding them and their breeding grounds in the vast expanse of the deep ocean. “I imagine that as we keep looking, we will keep finding really interesting sites that are important to certain species,” he says.

  • Event horizons are tunable factories of quantum entanglement

    LSU physicists have leveraged quantum information theory techniques to reveal a mechanism for amplifying, or “stimulating,” the production of entanglement in the Hawking effect in a controlled manner. Furthermore, these scientists propose a protocol for testing this idea in the laboratory using artificially produced event horizons. These results were recently published in Physical Review Letters as “Quantum aspects of stimulated Hawking radiation in an analog white-black hole pair,” in which Ivan Agullo, Anthony J. Brady and Dimitrios Kranas present these ideas and apply them to optical systems containing the analog of a white-black hole pair.
    Black holes are some of the most mystifying objects in our universe, largely due to the fact that their inner-workings are hidden behind a completely obscuring veil — the black hole’s event horizon.
    In 1974, Stephen Hawking added more mystique to the character of black holes by showing that, once quantum effects are considered, a black hole isn’t really black at all but instead emits radiation, as if it were a hot body, gradually losing mass in the so-called “Hawking evaporation process.” Further, Hawking’s calculations showed that the emitted radiation is quantum mechanically entangled with the bowels of the black hole itself. This entanglement is the quantum signature of the Hawking effect. This astounding result is difficult, if not impossible, to test on astrophysical black holes, since the faint Hawking radiation is drowned out by other sources of radiation in the cosmos.
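    A standard textbook way to express this entanglement (a generic description of the Hawking mechanism, not equations taken from the new paper) treats the effect as two-mode squeezing of the vacuum: for each field mode of frequency \omega, the emerging state pairs every exterior Hawking quantum with a partner quantum behind the horizon,

        \lvert 0 \rangle \;\longrightarrow\; \frac{1}{\cosh r}\,\sum_{n=0}^{\infty} \tanh^{n} r \;\lvert n \rangle_{\text{interior}}\,\lvert n \rangle_{\text{exterior}},
        \qquad \tanh^{2} r = e^{-\hbar \omega / k_{B} T_{H}},

    where T_H is the Hawking temperature. In this language, “stimulating” the process means seeding the horizon with a non-vacuum input state, which modifies, and can amplify, the correlations relative to this spontaneous, vacuum-seeded case.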
    On the other hand, in the 1980s, a seminal article by William Unruh established that the spontaneous production of entangled Hawking particles occurs in any system that can support an effective event horizon. Such systems, which generally fall under the umbrella of “analog gravity systems,” opened a window for testing Hawking’s ideas in the laboratory.
    Serious experimental investigations into analog gravity systems — made of Bose-Einstein condensates, non-linear optical fibers, or even flowing water — have been underway for more than a decade. Stimulated and spontaneously-generated Hawking radiation has recently been observed in several platforms, but measuring entanglement has proved elusive due to its faint and fragile character.
    “We show that, by illuminating the horizon, or horizons, with appropriately chosen quantum states, one can amplify the production of entanglement in Hawking’s process in a tunable manner,” said Associate Professor Ivan Agullo. “As an example, we apply these ideas to the concrete case of a pair of analog white-black holes sharing an interior and produced within a non-linear optical material.”
    “Many of the quantum information tools used in this research were from my graduate research with Professor Jonathan P. Dowling,” said 2021 PhD alumnus Anthony Brady, postdoctoral researcher at the University of Arizona. “Jon was a charismatic character, and he brought his charisma and unconventionality into his science, as well as his advising. He encouraged me to work on eccentric ideas, like analog black holes, and see if I could meld techniques from various fields of physics — like quantum information and analog gravity — in order to produce something novel, or ‘cute,’ as he liked to say.”
    “The Hawking process is one of the richest physical phenomena connecting seemingly unrelated fields of physics from the quantum theory to thermodynamics and relativity,” said Dimitrios Kranas, LSU graduate student. “Analog black holes came to add an extra flavor to the effect providing us, at the same time, with the exciting possibility of testing it in the laboratory. Our detailed numerical analysis allows us to probe new features of the Hawking process, helping us understand better the similarities and differences between astrophysical and analog black holes.”
    Story Source:
    Materials provided by Louisiana State University.

  • Researchers map magnetic fields in 3D, findings could improve device storage capacity

    Researchers from the University of New Hampshire have mapped magnetic fields in three dimensions, a major step toward solving what they call the “grand challenge” of revealing 3D magnetic configuration in magnetic materials. The work has implications for improving diagnostic imaging and capacity in storage devices.
    “The number three really represents a breakthrough in this field,” said Jiadong Zang, associate professor of physics. “Our brain is a three-dimensional object. It’s ironic that all our devices are two-dimensional. They’re underperforming compared to our brains.”
    The study, published recently in the journal Nature Materials, provides the results of three years of high-performance numerical simulations, mapping the three-dimensional structure of a 100-nanometer magnetic tetrahedron sample using only three projection angles of electron beams. Zang points to computed tomography medical imaging, or CT scans, as an example: instead of sending multiple beams of X-rays to map tissues in the body, the same images could be produced with only three beams.
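    As a toy illustration of what a single projection angle provides (this is not the UNH reconstruction method, and the field below is random), an electron beam travelling along one axis effectively integrates the sample along that axis, collapsing the 3D magnetization into a 2D image; the hard inverse problem the study tackles is recovering the full 3D vector field from just three such views.

        # Toy forward model only: three axis-aligned projections of a random 3D field.
        import numpy as np

        rng = np.random.default_rng(2)
        m = rng.normal(size=(3, 32, 32, 32))   # toy magnetization, components (mx, my, mz) on a 32^3 grid

        proj_z = m.sum(axis=3)   # beam along z: one 2D image per component, shape (3, 32, 32)
        proj_y = m.sum(axis=2)   # beam along y
        proj_x = m.sum(axis=1)   # beam along x
        # Inverting these three projections back to m requires the physics constraints
        # and large-scale numerical simulation described in the paper.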
    Reducing electron beam exposure in fast three-dimensional magnetic imaging is one potential application for this collaborative research. The researchers’ findings also have implications for improving storage capacity of magnetic memory devices, which currently deposit circuits onto two-dimensional panels that are approaching maximum density.
    The method offered by this research will be a useful tool to detect and characterize three-dimensional magnetic circuits.
    Zang and Alexander Booth, a former UNH doctoral student, conducted the theoretical analysis. Researchers from Japan and the University of Wisconsin performed the physical experiments. Funds from the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences (BES) under award number DE-SC0020221 helped support Zang and Booth’s contributions to this research.
    The University of New Hampshire inspires innovation and transforms lives in our state, nation and world. More than 16,000 students from all 50 states and 71 countries engage with an award-winning faculty in top-ranked programs in business, engineering, law, health and human services, liberal arts and the sciences across more than 200 programs of study. A Carnegie Classification R1 institution, UNH partners with NASA, NOAA, NSF and NIH, and received $260 million in competitive external funding in FY21 to further explore and define the frontiers of land, sea and space.
    Story Source:
    Materials provided by University of New Hampshire. Original written by Beth Potier.

  • AI helped protect businesses from COVID-19 risks

    A new study has found that artificial intelligence (AI) apps helped protect small and medium-sized businesses against many of the risks that emerged during the COVID-19 pandemic — yet only a quarter of small firms currently use them.
    The research, undertaken by Anglia Ruskin University (ARU) and published in the journal Information Systems Frontiers, surveyed 317 small and medium-sized firms based in London. The study found the use of AI-powered apps was associated with a 3.1% reduced risk to business during the pandemic.
    The COVID-19 pandemic has created risks for economies and business operations, with customers stopping, reducing, or postponing purchases, thereby affecting supply chains and resulting in difficulties in sourcing alternative suppliers.
    Business risks were defined by a 60-point scale developed by the International Labour Organization (ILO) that measures the pandemic’s impact on staffing, processes such as working patterns, reduced profits, and threats to partnerships.
    AI software utilised by businesses includes chatbots that allow swift interaction with customers, apps that identify damaging fake reviews, and apps that use algorithms to improve customer targeting based on their habits, social media activities and profiles, online activities, and past transactions.
    The study found the use of AI apps to offer personalised shopping suggestions was associated with 2% lower business risks to profits caused by the COVID-19 pandemic. The use of AI apps to target audiences online was associated with 1.2% lower overall business risk.
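    The sketch below shows the general form of such a cross-sectional association analysis, under stated assumptions: it is not the published model, the data are synthetic, and every column name is hypothetical. The idea is simply to regress a 0-60 pandemic-risk score on indicators for AI app usage plus firm-level controls.

        # Illustrative sketch with synthetic data; the coefficients here mean nothing.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 317   # number of firms surveyed in the study

        df = pd.DataFrame({
            "uses_ai_recommendations": rng.integers(0, 2, n),
            "uses_ai_targeting": rng.integers(0, 2, n),
            "employees": rng.integers(1, 250, n),
            "years_operating": rng.integers(1, 40, n),
        })
        # Synthetic outcome on a 0-60 risk scale, with AI usage nudging risk downward.
        df["covid_risk_score"] = np.clip(
            35 - 1.8 * df.uses_ai_recommendations - 0.7 * df.uses_ai_targeting
            - 0.01 * df.employees + rng.normal(0, 5, n), 0, 60)

        model = smf.ols("covid_risk_score ~ uses_ai_recommendations + uses_ai_targeting"
                        " + employees + years_operating", data=df).fit()
        print(model.params)   # the AI-indicator coefficients capture the association of interest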
    However, the research revealed that only 26% of small enterprises were utilising AI applications, considerably lower than the 70.4% of medium-sized businesses.
    Lead author Professor Nick Drydakis, Director of the Centre for Pluralist Economics at Anglia Ruskin University, said: “SMEs can invest in AI technologies to track users’ habits and provide recommendations, improve customers’ purchasing decisions, search results, media communication and trade, raise sales, improve organisational performance, and lower costs.
    “AI can help SMEs to adapt to unprecedented conditions, meaning they can leverage technology to meet new types of demand, move at speed to pivot business operations, boost efficiency and reduce their business risks.
    “We found that SMEs’ business risks caused by the COVID-19 pandemic declined with the use of AI applications across a ten-item scale including marketing, sales, communication, predictions, pricing and cash flow, fake reviews, cybersecurity, recruitment, and legal services.
    “The outcomes proved true regardless of enterprise size, turnover, and years of operation, indicating that AI applications have helped SMEs to adapt to unprecedented conditions during the COVID-19 pandemic.
    “It seems investment in AI apps could be a smart move for the three quarters of small businesses that do not currently utilise them.”

  • Bendy robotic arm twisted into shape with help of augmented reality

    The flexible arm, which was designed and created at Imperial College London, can twist and turn in all directions, making it readily customisable for potential applications in manufacturing, spacecraft maintenance, and even injury rehabilitation.
    Instead of being constrained by rigid limbs and firm joints, the versatile arm is readily bendable into a wide variety of shapes. In practice, people working alongside the robot would manually bend the arm into the precise shape needed for each task, a level of flexibility made possible by the slippery layers of mylar sheets inside, which slide over one another and can lock into place. However, configuring the robot into specific shapes without guidance has proven to be difficult for users.
    To enhance the robot’s user-friendliness, researchers at Imperial’s REDS (Robotic manipulation: Engineering, Design, and Science) Lab have designed a system that shows users, in AR, how to configure their robot. Wearing mixed reality smartglasses, and aided by motion tracking cameras, users see templates and designs superimposed onto their real-world environment. They then adjust the robotic arm until it matches the template, which turns green on successful configuration so that the robot can be locked into place.
    Senior author of the paper Dr Nicolas Rojas, of Imperial’s Dyson School of Design Engineering, said: “One of the key issues in adjusting these robots is accuracy in their new position. We humans aren’t great at making sure the new position matches the template, which is why we looked to AR for help.
    “We’ve shown that AR can simplify working alongside our malleable robot. The approach gives users a range of easy-to-create robot positions, for all sorts of applications, without needing so much technical expertise.”
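    The core of that interaction can be captured in a few lines. The sketch below is hypothetical (the marker layout, tolerance, and renderer call are assumptions, not the REDS Lab code): tracked marker positions on the arm are compared with a stored template, and the AR overlay turns green once every point is within tolerance so the arm can be locked.

        # Hypothetical sketch of the template-matching check described above.
        import numpy as np

        TOLERANCE_M = 0.015   # 1.5 cm per tracked point (assumed value)

        def template_matched(tracked: np.ndarray, template: np.ndarray) -> bool:
            """tracked, template: (N, 3) arrays of marker positions in metres."""
            errors = np.linalg.norm(tracked - template, axis=1)
            return bool(np.all(errors < TOLERANCE_M))

        def update_overlay(tracked, template, renderer):
            colour = "green" if template_matched(tracked, template) else "red"
            renderer.draw_template(template, colour=colour)   # hypothetical AR render call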
    The researchers tested the system on five men aged 20-26 with experience in robotics but no experience with manipulating malleable robots specifically. The subjects were able to adjust the robot accurately, and the results are published in Robotics & Automation Magazine.
    Although the pool of participants was narrow, the researchers say their initial findings show that AR could be a successful approach to adapting malleable robots following further testing and user training.
    Bent into shape
    Potential applications include manufacturing, and building and vehicle maintenance. Because the arm is lightweight, it could also be used on spacecraft where low-weight instruments are preferred. It is also gentle enough that it could be used in injury rehabilitation, helping a patient perform an exercise while their physiotherapist performs another.
    Co-first authors and PhD researchers Alex Ranne and Angus Clark, also of the Dyson School of Design Engineering, said: “In many ways it can be seen as a detached, bendier, third arm. It could help in many situations where an extra limb might come in handy and help to spread the workload.”
    The researchers are still in the process of perfecting the robot as well as its AR component. Next, they will look into introducing touch and audio elements to the AR to boost its accuracy in configuring the robot.
    They are also looking into strengthening the robots. Although their flexibility and softness make them easier to configure, and perhaps even safer for humans to work alongside, they are less rigid while in the locked position, which could affect precision and accuracy.
    Story Source:
    Materials provided by Imperial College London. Original written by Caroline Brogan.

  • Robot 'bugs' that can go just about anywhere

    These ancient creatures can squeeze through the tiniest cracks, fit snugly into tight spaces and survive in harsh environments: There aren’t many spaces that are off-limits to an insect.
    That’s why researchers at the University of Pittsburgh have created tiny bug-inspired robots that can carry out tasks in hard-to-reach spaces and inhospitable environments.
    “These robots could be used to access confined areas for imaging or environmental evaluation, take water samples, or perform structural evaluations,” said Junfeng Gao, who led the work as a PhD student in industrial engineering at the Swanson School of Engineering. “Anywhere you want to access confined places—where a bug could go but a person could not—these machines could be useful.”
    For many creatures under a certain size—like trap-jaw ants, mantis shrimp, and fleas—jumping across a surface is more energy-efficient than crawling. Those impulsive movements were replicated in the robots, which are made of a polymeric artificial muscle. 
    “It’s akin to loading an arrow into a bow and shooting it—the robots latch on to build up energy and then release it in an impulsive burst to spring forward,” explained M. Ravi Shankar, professor of industrial engineering at Pitt whose lab led the research. “Usually, actuation in the artificial muscles we work with is fairly slow. We were drawn to the question, ‘How do we take this artificial muscle and use it to generate a jumping actuation rather than slow actuation?’” 
    The answer lay in the interplay of molecular order and geometry.
    “The curved composite shape of the polymer muscle allows it to build energy when it is powered. The way the molecules are aligned in the muscle draws inspiration from the natural world, where their combined actuation builds energy into the structure,” said Mohsen Tabrizi, co-author of the study and PhD student in industrial engineering at the Swanson School. “This is accomplished using no more than a few volts of electricity.”
    The versatile movement and lightweight structure enable the robots—which are about the size of a cricket—to move along shifting surfaces like sand as easily as hard surfaces, and even to hop across water.
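    A back-of-the-envelope sketch (with made-up numbers, not values from the paper) shows why latch-and-release actuation matters: elastic energy stored slowly in the curved polymer muscle is released all at once, and even a fraction of a millijoule is enough to launch a sub-gram robot several centimetres.

        # Hypothetical figures chosen only to illustrate the scaling.
        g = 9.81                    # m/s^2
        mass_kg = 0.5e-3            # assumed 0.5 g robot
        stored_energy_J = 1.0e-3    # assumed elastic energy latched into the muscle
        efficiency = 0.3            # assumed fraction converted to launch kinetic energy

        kinetic_energy = efficiency * stored_energy_J
        launch_speed = (2 * kinetic_energy / mass_kg) ** 0.5   # from E = (1/2) m v^2
        max_height = kinetic_energy / (mass_kg * g)            # ballistic ceiling, ignoring drag

        print(f"launch speed = {launch_speed:.2f} m/s, max height = {max_height * 100:.0f} cm")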
    The paper, “Molecularly Directed, Geometrically Latched, Impulsive Actuation Powers Sub-Gram Scale Motility,” (DOI: 10.1002/admt.202100979) was published in the journal Advanced Materials Technologies and was coauthored by Junfeng Gao, Arul Clement, Mohsen Tabrizi, and M. Ravi Shankar.
    Story Source:
    Materials provided by University of Pittsburgh. Original written by Maggie Lindenberg.