More stories

  • Finding simplicity within complexity

    Picture a tall, stately grandfather clock, its long pendulum swinging back and forth, over and again, keeping rhythm with the time. Scientists can describe that motion with an equation, or dynamical model, and though seemingly hundreds of factors contribute to the sway (the weight of the clock, the material of the pendulum, ad infinitum), only one variable is necessary to describe the motion of the pendulum and translate it into math: the angle of the swing. How long it took scientists and mathematicians to discover that is unknown; it could have taken years of testing each variable in the equation to determine the single one that matters for the sway.
    Now a University of Houston researcher is reporting a method to describe these kinds of complex systems with the smallest possible number of variables, sometimes reducing millions of candidates to a handful, and on rare occasions to just one. It's an advancement that can speed up science by making it more efficient to understand and predict the behavior of natural systems, and it has implications for accelerating an array of simulation-driven activities, from weather forecasting to aircraft production.
    “In the example of the grandfather clock, I can take a video of the pendulum swinging back and forth and from that video, automatically discover what is the right variable. Accurate models of system dynamics enable deeper understanding of these systems, as well as the ability to predict their future behavior,” says Daniel Floryan, Kalsi Assistant Professor of Mechanical Engineering, who reports the work in the journal Nature Machine Intelligence.
    To begin building the compact-yet-accurate models, one principle is fundamental: For every action, even those seemingly complex and random, there exists an underlying pattern that enables a compact representation of the system.
    “Our method finds the very most compact description that is mathematically possible, and that’s what differentiates our method from others,” said Floryan.
    Using ideas from machine learning and smooth manifold theory, the method makes simulations extremely fast and inexpensive.
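    As a rough illustration of the idea (a sketch, not Floryan's published method), the snippet below builds high-dimensional observations of a simulated pendulum and applies PCA, a linear stand-in for the paper's manifold-learning machinery, to show that only a handful of directions carry the data's variance. All names and parameters here are invented for the example.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    # Simulate a small-angle pendulum: theta(t) = A * cos(omega * t).
    t = np.linspace(0, 20, 2000)
    theta = 0.3 * np.cos(2.0 * t)

    # Lift the 1-D angle into a 100-D "pixel" space with a random smooth map,
    # mimicking the many redundant measurements a video frame provides.
    W = rng.normal(size=(1, 100))
    b = rng.normal(size=100)
    observations = np.tanh(theta[:, None] @ W + b)  # shape (2000, 100)

    # A nonlinear image of a 1-D curve concentrates its variance in very few
    # principal components, hinting at the single underlying variable.
    explained = np.cumsum(PCA().fit(observations).explained_variance_ratio_)
    n_components = int(np.searchsorted(explained, 0.999) + 1)
    print(f"components for 99.9% of the variance: {n_components}")  # small, despite 100 dims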
    In one application, Floryan simulated a reaction between two chemicals. When they met, the reaction produced complex behavior: a repetitive, rhythmic spiraling that required more than 20,000 variables to simulate. Floryan fed video of the reaction into his algorithm, and it discovered that just one variable was needed to understand the action: the time the spiral took to return to where it started, like the second hand on a watch.
    Weather prediction is another application: numerical weather models are computer simulations of the atmosphere built on complicated physics and fluid dynamics equations.
    “For weather prediction and climate modeling, if you have something that is much faster you can better model the earth’s climate and better predict what’s going to happen,” said Floryan.
    Story Source:
    Materials provided by University of Houston. Original written by Laurie Fickman.

  • Improving the accuracy of markerless gait analysis

    Gait analysis systems measure specific metrics, and the results drive clinical treatment for gait correction. However, detailed gait analysis requires expensive equipment, plenty of space, markers, and time. Measurements from markerless, video-based gait analysis systems, on the other hand, are inaccurate. To improve upon existing systems, researchers have now combined RGB camera-based pose estimation with an inertial measurement unit sensor for gait analysis, significantly reducing measurement errors.
    In people with gait disabilities (i.e., a pattern of walking — or gait — that is not normal), assessing gait speed, stride length, and joint kinematics is essential. Measurement of gait parameters over a period of time is critical to determine treatment effects, predict fall risk in elderly individuals, and plan physiotherapy treatments. In this regard, optoelectronic marker-based three-dimensional motion capture (3DMC) — a gait analysis tool — can accurately measure gait metrics. However, economic and time constraints, coupled with requirements for a large space, extensive equipment, and technical expertise, make 3DMC impractical in clinical settings. Alternative methods include inertial measurement unit (IMU)-based motion capture systems and RGB camera-based methods, which can measure gait without reflective markers when equipped with depth sensors. But these have their own drawbacks. IMU-based systems require many IMU sensors to be attached to body segments, reducing their feasibility, and compared to optoelectronic 3DMC systems, RGB camera-based methods are less accurate in measuring kinematic parameters such as lower limb joint angles.
    Hence, improved gait analysis systems are needed.
    To this end, a team of researchers comprising Dr. Masataka Yamamoto, Mr. Yuto Ishige, and Professor Hiroshi Takemura from the Faculty of Science and Technology, Tokyo University of Science, and Professor Koji Shimatani from the Prefectural University of Hiroshima, Japan, has developed a simple sensor-fusion method for accurate gait analysis. “We combined information from a small IMU sensor attached to the shoe with estimated information on the bones and joints of the lower limb, obtained by capturing the gait from a single RGB camera,” explains Dr. Yamamoto, the lead author of the study. In an article published in Volume 12 of Scientific Reports on October 21, 2022, the researchers detail this method and the results they achieved with it.
    The team used single RGB camera-based pose estimation by OpenPose (OP) and an IMU sensor on the foot to measure ankle joint kinematics under various gait conditions for sixteen healthy adult men between 21 and 23 years of age who did not have any limitation of physical activity. The participants’ gait parameters and lower limb joint angles during four gait conditions with varying gait speeds and foot progression angles were recorded using OP alone as well as with combined measurements from OP and the IMU, the team’s novel proposed method. Results from both techniques were compared against gait analysis using 3DMC, the current gold standard.
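    To make the fusion idea concrete, here is a minimal sketch, not the authors' published algorithm: a complementary filter that blends a camera-derived joint angle (noisy but drift-free) with an integrated IMU angular rate (smooth but drifting). The signals, function names, and the weight alpha are all assumptions for illustration.

    import numpy as np

    def fuse_angles(camera_angle_deg, gyro_rate_dps, dt=0.01, alpha=0.98):
        """Blend an integrated gyro rate (smooth but drifting) with camera
        angles (noisy but drift-free); alpha weights the gyro path."""
        fused = np.empty_like(camera_angle_deg)
        fused[0] = camera_angle_deg[0]
        for k in range(1, len(fused)):
            gyro_estimate = fused[k - 1] + gyro_rate_dps[k] * dt
            fused[k] = alpha * gyro_estimate + (1 - alpha) * camera_angle_deg[k]
        return fused

    # Synthetic check: true ankle angle, noisy camera measurement, biased gyro.
    t = np.arange(0, 5, 0.01)
    true_angle = 10 * np.sin(2 * np.pi * t)                    # degrees, ~1 Hz gait
    camera = true_angle + np.random.default_rng(1).normal(0, 2, t.size)
    gyro = np.gradient(true_angle, t) + 0.5                    # deg/s, with bias
    fused = fuse_angles(camera, gyro)
    print(f"camera RMSE: {np.sqrt(np.mean((camera - true_angle) ** 2)):.2f} deg")
    print(f"fused RMSE:  {np.sqrt(np.mean((fused - true_angle) ** 2)):.2f} deg")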
    The proposed combination method could measure gait parameters and lower limb joint angles in the sagittal plane (which divides the body into right and left halves). Moreover, the mean absolute errors of peak ankle joint angles calculated by the combination method were significantly smaller than those from OP alone in all four gait conditions. This is a significant development in gait analysis. “Our method has the potential to be used not only in medicine and welfare, but also to predict the decline of gait function in healthcare, for training and skill evaluation in gyms and sports facilities, and for accurate projection of human movements onto an avatar by integrating with virtual reality systems,” notes Dr. Yamamoto.
    With further research, this method can be adapted to clinical settings and a larger demographic.
    Story Source:
    Materials provided by Tokyo University of Science.

  • Using light to manipulate neuron excitability

    Nearly 20 years ago, scientists developed ways to stimulate or silence neurons by shining light on them. This technique, known as optogenetics, allows researchers to discover the functions of specific neurons and how they communicate with other neurons to form circuits.
    Building on that technique, MIT and Harvard University researchers have now devised a way to achieve longer-term changes in neuron activity. With their new strategy, they can use light exposure to change the electrical capacitance of the neurons’ membranes, which alters their excitability (how strongly or weakly they respond to electrical and physiological signals).
    Changes in neuron excitability have been linked to many processes in the brain, including learning and aging, and have also been observed in some brain disorders, including Alzheimer’s disease.
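    The link between membrane capacitance and excitability can be made concrete with a textbook leaky integrate-and-fire model, sketched below with illustrative parameter values (standard biophysics, not the researchers' method): doubling the capacitance slows the voltage response to the same input current and roughly halves the firing rate.

    def count_spikes(C, I=0.3e-9, R=1e8, v_th=0.02, dt=1e-4, T=1.0):
        """Euler-integrate dV/dt = (-V/R + I)/C and count threshold crossings."""
        v, spikes = 0.0, 0
        for _ in range(int(T / dt)):
            v += dt * (-v / R + I) / C
            if v >= v_th:            # spike: reset the membrane voltage
                v, spikes = 0.0, spikes + 1
        return spikes

    for C in (100e-12, 200e-12):     # farads; doubling the capacitance
        print(f"C = {C * 1e12:.0f} pF -> {count_spikes(C)} spikes per second")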
    “This new tool is designed to tune neuron excitability up and down in a light-controllable and long-term manner, which will enable scientists to directly establish the causality between the excitability of various neuron types and animal behaviors,” says Xiao Wang, the Thomas D. and Virginia Cabot Assistant Professor of Chemistry at MIT, and a member of the Broad Institute of MIT and Harvard. “Future application of our approach in disease models will tell whether fine-tuning neuron excitability could help reset abnormal brain circuits to normal.”
    Wang and Jia Liu, an assistant professor at Harvard School of Engineering and Applied Sciences, are the senior authors of the paper, which appears today in Science Advances.
    Chanan Sessler, an MIT graduate student in the Department of Chemistry; Yiming Zhou, a postdoc at the Broad Institute; and Wenbo Wang, a graduate student at Harvard, are the lead authors of the paper.

  • Soft robot detects damage, heals itself

    Cornell University engineers have created a soft robot capable of detecting when and where it was damaged — and then healing itself on the spot.
    “Our lab is always trying to make robots more enduring and agile, so they operate longer with more capabilities,” said Rob Shepherd, associate professor of mechanical and aerospace engineering. “If you make robots operate for a long time, they’re going to accumulate damage. And so how can we allow them to repair or deal with that damage?”
    Shepherd’s Organic Robotics Lab has developed stretchable fiber-optic sensors for use in soft robots and related components — from skin to wearable technology.
    For self-healing to work, Shepherd says, the key first step is that the robot must be able to identify that there is, in fact, something that needs to be fixed.
    To do this, researchers have pioneered a technique using fiber-optic sensors coupled with LED lights capable of detecting minute changes on the surface of the robot.
    These sensors are combined with a polyurethane urea elastomer that incorporates hydrogen bonds, for rapid healing, and disulfide exchanges, for strength.
    The resulting SHeaLDS — self-healing light guides for dynamic sensing — provides a damage-resistant soft robot that can self-heal from cuts at room temperature without any external intervention.
    To demonstrate the technology, the researchers installed the SHeaLDS in a soft robot resembling a four-legged starfish and equipped it with feedback control. The researchers then punctured one of its legs six times, after which the robot detected the damage and self-healed each cut in about a minute. The robot could also autonomously adapt its gait based on the damage it sensed.
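    As a hedged illustration of the sensing principle (not Cornell's actual code): if each light guide reports a transmitted intensity, a cut shows up as a drop from that sensor's baseline, which a simple threshold test can localize. The readings and threshold below are invented for the example.

    import numpy as np

    def detect_damage(intensity, baseline, threshold=0.15):
        """Return indices of sensors whose fractional intensity drop exceeds
        the threshold, signaling a probable cut in that light guide."""
        drop = (baseline - intensity) / baseline
        return np.flatnonzero(drop > threshold)

    baseline = np.array([1.00, 0.98, 1.02, 0.99])  # per-leg readings, arbitrary units
    current = np.array([0.99, 0.70, 1.01, 0.98])   # leg 1 has been cut
    print(detect_damage(current, baseline))        # -> [1]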
    While the material is sturdy, it is not indestructible.
    “They have similar properties to human flesh,” Shepherd said. “You don’t heal well from burning, or from things with acid or heat, because that will change the chemical properties. But we can do a good job of healing from cuts.”
    Shepherd plans to integrate SHeaLDS with machine learning algorithms capable of recognizing tactile events to eventually create “a very enduring robot that has a self-healing skin but uses the same skin to feel its environment to be able to do more tasks.”
    Story Source:
    Materials provided by Cornell University. Original written by David Nutt, courtesy of the Cornell Chronicle.

  • Coupled computer modeling can help more accurately predict coastal flooding, study demonstrates

    When Hurricane Florence hit the coast of North Carolina as a Category 1 storm in 2018, it set new records for rainfall, creating damaging 500-year flooding events along the Cape Fear River Basin.
    This is exactly the sort of weather event that Z. George Xue of the LSU Department of Oceanography and Coastal Sciences, or DOCS, believes his novel coupled computer modeling approach can more accurately predict, thereby helping communities plan for disasters. Xue said that, as far as he knows, his lab is the only one using this technique.
    Xue, along with DOCS graduate student Daoyang Bao and the rest of their research team, recently published a study using the events of Hurricane Florence to demonstrate the validity of this new approach in the Journal of Advances in Modeling Earth Systems.
    Improving the accuracy of flooding predictions can help in hurricane preparedness, said John C. Warner of the US Geological Survey, another collaborator on the study. “More accurate forecasts can help coastal managers to better alert communities of impending storms.”
    Xue said this breakthrough coupled modeling technique could provide long-term benefits to communities as well.
    “Our model can identify which region is most vulnerable in terms of compound flooding and provide not only short-term forecasts but also scenario analysis regarding future climate and sea level conditions,” he said.

  • Heart attack on a chip

    Researchers at the University of Southern California Alfred E. Mann Department of Biomedical Engineering have developed a “heart attack on a chip,” a device that could one day serve as a testbed to develop new heart drugs and even personalized medicines.
    “Our device replicates some key features of a heart attack in a relatively simple and easy-to-use system,” said Megan McCain, an associate professor of biomedical engineering and stem cell biology and regenerative medicine, who developed the device with postdoctoral researcher Megan Rexius-Hall.
    “This enables us to more clearly understand how the heart is changing after a heart attack. From there, we and others can develop and test drugs that will be most effective for limiting the further degradation of heart tissue that can occur after a heart attack,” added McCain.
    McCain, a “cardiac tissue engineer” whose previous work includes co-developing a heart on a chip, and Rexius-Hall detail their findings in a recently released article in the journal Science Advances titled “A Myocardial Infarct Border-Zone-On-A-Chip Demonstrates Distinct Regulation of Cardiac Tissue Function by an Oxygen Gradient.”
    America’s No. 1 killer
    Coronary heart disease is America’s No. 1 killer. In 2018, 360,900 Americans succumbed to it, making heart disease responsible for 12.6% of all deaths in the United States, according to the American Heart Association. Severe coronary heart disease can cause a heart attack, which accounts for much of this toll. Heart attacks occur when fat, cholesterol, and other substances in the coronary arteries severely reduce the flow of oxygen-rich blood to part of the heart. Between 2005 and 2014, an average of 805,000 Americans per year had heart attacks.

  • A novel, space-time coding antenna promotes 6G and secure wireless communications

    A research team co-led by a scientist at City University of Hong Kong (CityU) has developed a novel antenna that allows manipulation of the direction, frequency and amplitude of the radiated beam, and is expected to play an important role in the integration of sensing and communications (ISAC) for 6th-generation (6G) wireless communications.
    The structure and characteristics of traditional antennas cannot be changed once fabricated. However, the direction, frequency, and amplitude of the electromagnetic waves from this new-generation antenna, which is called a “sideband-free space-time-coding (STC) metasurface antenna,” can be changed through space-time coding (i.e. software control), enabling great user flexibility.
    The key to this innovative feature is that the response of the metasurface (artificial, thin-sheet material with sub-wavelength thickness and made of several sub-wavelength meta-atoms) can be changed by switching the meta-atoms on its surface between radiating and non-radiating states, like turning on and off switches, by controlling the electric current. This allows the STC metasurface antenna to realize complicated wave manipulation in the space and frequency domains through software control, and to create a desired radiation pattern and a highly directed beam.
    Professor Chan Chi-hou, Acting Provost and Chair Professor of Electronic Engineering in the Department of Electrical Engineering at CityU, who led the research, highlighted that the antenna relies on the successful combination of two research advances, namely amplitude-modulated (AM) leaky-wave antennas and space-time coding techniques.
    Dr Wu Gengbo, postdoctoral fellow in the State Key Laboratory of Terahertz and Millimeter Waves (SKLTMW) at CityU, first proposed the new concept of AM leaky-wave antennas in 2020 in his PhD studies at CityU. “The concept provides an analytical approach to synthesize antennas with the desired radiation patterns for different specific uses by simply changing the antennas’ shape and structure,” explained Dr Wu.
    But as with other antennas, once the AM leaky-wave antenna is fabricated, its radiation characteristics are fixed. At about that time, Dr Wu was joined at CityU by Dr Dai Junyan, from a research group led by Academician Cui Tiejun and Professor Cheng Qiang at Southeast University in Nanjing, China, which pioneered STC technologies. “Dr Dai’s expertise in space-time coding and digital metasurfaces to dynamically reconfigure antenna performance added a new, important dimension to the antenna research at the SKLTMW,” said Professor Chan, who is also Director of the SKLTMW at CityU.
    However, the time modulation of electromagnetic waves on metasurfaces usually generates unwanted harmonic frequencies, called sidebands. These sidebands carry part of the radiated electromagnetic wave energy and interfere with the useful communication channels of the antenna, leading to “spectrum pollution.” Professor Chan and his team therefore proposed a novel design, which makes use of a waveguide (a line for transmitting electromagnetic waves by successive reflection from the inner wall) and successfully suppresses the undesired harmonics, achieving a high-directivity beam and enabling secure communication.
    “With the AM leaky-wave antenna and space-time coding technologies, we achieve the designated radiation characteristics by controlling the on-off sequences and duration of the ‘switches’ on the antenna through software,” said Professor Chan.
    “A high-directivity beam can be generated with the new antenna, allowing a wide range of radiation performance without having to redesign the antenna, except for using different STC inputs,” added Dr Wu.
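    A rough numerical sketch of the amplitude-modulation idea (not the CityU design): if each radiating element is switched on for only a fraction of every coding period, its time-averaged contribution is weighted by that duty cycle, so an on-off sequence can taper the aperture and shape the beam. The element count, spacing, and Hamming taper below are illustrative assumptions.

    import numpy as np

    N = 32                   # radiating meta-atoms along the waveguide
    d = 0.5                  # element spacing, in wavelengths
    duty = np.hamming(N)     # on-time fraction of each element per STC period

    angles = np.deg2rad(np.linspace(-90, 90, 721))
    k = 2 * np.pi            # wavenumber, in units of 1/wavelength
    n = np.arange(N)

    # Time-averaged array factor: each element radiates in proportion to its
    # duty cycle, so the on-off coding acts as an amplitude taper.
    af = np.abs(duty @ np.exp(1j * k * d * np.outer(n, np.sin(angles))))
    af_db = 20 * np.log10(af / af.max())
    sidelobes = af_db[np.abs(np.rad2deg(angles)) > 10]  # outside the main beam
    print(f"peak sidelobe level: {sidelobes.max():.1f} dB")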
    The energy from the radiated beam of the STC metasurface antenna can be focused to a focal point with fixed or varying focal lengths, which can be used for real-time imaging and treated as a type of radar to scan the environment and feed back data. “The invention plays an important role in the ISAC for 6G wireless communications,” Professor Chan explained. “For example, the radiated beam can scan a person and create an image of the person, allowing mobile phone users to talk to each other with 3D hologram imaging. It also performs better against eavesdropping than the conventional transmitter architecture.”

  • Energy-efficient computing with tiny magnetic vortices

    A large share of the energy consumed today takes the form of electrical power for processing and storing data and for running the relevant terminal equipment and devices. According to predictions, this energy use will increase even further in the future. Innovative concepts, such as neuromorphic computing, employ energy-saving approaches to solve this problem. In a joint project undertaken by experimental and theoretical physicists at Johannes Gutenberg University Mainz (JGU), funded by an ERC Synergy Grant, one such approach, known as Brownian reservoir computing, has now been realized. The results were also recently featured as an Editors’ Highlight in the Devices section of the scientific journal Nature Communications.
    Brownian computing uses ambient thermal energy
    Brownian reservoir computing is a combination of two unconventional computing methods. Brownian computing exploits the fact that computer processes typically run at room temperature, which makes it possible to use the surrounding thermal energy and thus cut down on electricity consumption. The thermal energy used in the computing system is essentially the random movement of particles, known as Brownian motion, which explains the name of this computing method.
    Reservoir computing is ideal for exceptionally efficient data processing
    Reservoir computing utilizes the complex response of a physical system to external stimuli, resulting in an extremely resource-efficient way of processing data. Most of the computation is performed by the system itself, which does not require additional energy. Furthermore, this type of reservoir computer can easily be customized to perform various tasks as there is no need to adjust the solid-state system to suit specific requirements.
    A team headed by Professor Mathias Kläui of the Institute of Physics at Mainz University, supported by Professor Johan Mentink of Radboud University Nijmegen in the Netherlands, has now succeeded in developing a prototype that combines these two computing methods. This prototype is able to perform Boolean logic operations, which can be used as standard tests for the validation of reservoir computing.
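    For intuition about that validation task, here is a toy software reservoir (a sketch, not the skyrmion device): a fixed random network transforms inputs into high-dimensional states, and only a linear readout is trained, here to reproduce the XOR truth table. All sizes and weights are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    n_res = 50                                   # reservoir size

    W_in = rng.normal(size=(n_res, 2))           # fixed input weights
    W = rng.normal(size=(n_res, n_res)) * 0.1    # fixed, contracting recurrence

    def reservoir_state(u, steps=20):
        """Run the fixed nonlinear dynamics on input u; return the settled state."""
        x = np.zeros(n_res)
        for _ in range(steps):
            x = np.tanh(W @ x + W_in @ u)
        return x

    inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0.0, 1.0, 1.0, 0.0])           # XOR truth table

    S = np.stack([reservoir_state(u) for u in inputs])
    # Train only the linear readout (ridge regression); the reservoir stays fixed.
    readout = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ y)
    print(np.round(S @ readout))                 # expected: [0. 1. 1. 0.]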
    The solid-state system selected in this instance consists of metallic thin films exhibiting magnetic skyrmions. These magnetic vortices behave like particles and can be driven by electrical currents. The behavior of skyrmions is influenced not only by the applied current but also by their own Brownian motion. This Brownian motion of skyrmions can result in significantly increased energy savings as the system is automatically reset after each operation and prepared for the next computation.
    First prototype developed in Mainz
    Although there have been many theoretical concepts for skyrmion-based reservoir computing in recent years, the researchers in Mainz succeeded in developing the first functional prototype only when combining these concepts with the principle of Brownian computing. “The prototype is easy to produce from a lithographic point of view and can theoretically be reduced to a size of just nanometers,” said experimental physicist Klaus Raab. “We owe our success to the excellent collaboration between the experimental and theoretical physicists here at Mainz University,” emphasized theoretical physicist Maarten Brems. Project coordinator Professor Mathias Kläui added: “I’m delighted that the funding provided through a Synergy Grant from the European Research Council enabled us to collaborate with outstanding colleagues in the Department of Theoretical Physics in Nijmegen, and it was this collaboration that resulted in our achievement. I see great potential in unconventional computing, a field which also receives extensive support here at Mainz through funding from the Carl Zeiss Foundation for the Emergent Algorithmic Intelligence Center.”
    Story Source:
    Materials provided by Johannes Gutenberg Universitaet Mainz.