More stories

  • A broader definition of learning could help stimulate interdisciplinary research

We often conceive of learning through the lens of cramming for an exam or teaching a dog to sit, but humans and other mammals aren’t the only entities capable of adapting to their environment — schools of fish, robots, and even our genes can learn new behaviors, explain Jan De Houwer and Sean Hughes (Ghent University) in a new Perspectives on Psychological Science article. Embracing a broader definition of learning that includes any behavioral adaptation developed in response to regular features of an environment could help researchers collaborate across the fields of psychology, computer science, sociology, and genetics, De Houwer explained in an interview.
    “Most people think of learning as some kind of mechanism for the storage of new information, but this makes it very difficult to compare learning in different systems because different systems probably use different mechanisms for storing information,” De Houwer said. “We define learning as changes in the way a system responds to its environment — that is, as learned behavior.”
Much like Darwin’s theory of evolution, De Houwer and Hughes’ functional definition of learning focuses on how systems adapt to their environment, regardless of the mechanisms through which those adaptations may occur. The “system” in question could be an individual organism, a part of an organism such as a gene or the spinal cord, or a community of organisms. In fact, De Houwer added, evolution itself could be conceived of as a form of learning in which a species of animal is seen as a system that adapts to its environment.
    “Because our definition of learning is ‘mechanism-free,’ it allows for interactions between scientists who study learning in different systems,” De Houwer said. “It breaks the barriers between different sciences and allows for an exchange of ideas that is bound to promote the study of learning in general.”
    In addition to supporting comparisons between learning in different kinds of systems, this definition can also help researchers take a closer look at how these systems may influence each other’s learning, De Houwer and Hughes write. A corn plant may learn to become more drought resistant, for example, because its genes have an epigenetic response to dehydration that prompts its cells to retain more water, ultimately influencing the learned behavior of the entire plant.
    Learning can also occur at the group level, such as in a school of fish, because of the learning of some but not all members in that group, De Houwer added. A fish at the head of a school may learn to avoid a shipwreck after repeatedly finding sharks there, for example, whereas fish at the back of the school may perform a similar behavior by simply continuing to follow the fish ahead of them without learning about the shipwreck.
    This analysis can also be applied to the study of robots and artificial intelligence. Though each can be studied separately, the ability of a robot to learn how to navigate obstacles also depends on how its algorithm responds to the environment, the researchers explain.
    It is important to note, however, that a system cannot be described as learning just because its behavior has changed in response to the environment. A system can only be said to have learned something if it changes the way it responds to a stimulus as the result of regularities in its environment, such as repeated exposure to a stimulus or the co-occurrence of stimuli, De Houwer said. Learning researchers examine the conditions under which regularities in the environment change behavior, he continued.
    Developing a precise definition of learning can help scientists communicate existing findings and promote new interdisciplinary research, De Houwer and Hughes conclude.
    “Definitions are tools at the service of better science,” they write. “Our definition allows scientists to share knowledge and thereby explore new ways of studying learning in different systems.”
    Story Source:
Materials provided by Association for Psychological Science. Note: Content may be edited for style and length.

  • Scared of injections? Try a wearable soft robot to ease aversion to needles

While most of us are never without our smartphones, robots may also soon become indispensable companions. It certainly seems so based on recent experiments conducted by researchers in Japan, who developed a hand-held soft robot that can improve the experience of patients undergoing potentially unpleasant medical procedures, such as injections.
    During the campaign to encourage vaccination against COVID-19, public health officials recognized that some people are simply afraid of needles, which contributed to reduced vaccination rates. While the problems of patient anxiety and pain during medical procedures have been well studied, there remains a need to test and implement solutions to help patients.
In a study recently published in Scientific Reports, researchers at the University of Tsukuba developed a wearable soft robot for patients to use during treatments, in an attempt to ease their pain. When subjected to a moderate heat stimulus, study participants who wore the robot reported less pain than in trials in which they did not wear it. “Our results suggest that the use of wearable soft robots may reduce fear as well as alleviate the perception of pain during medical treatments, including vaccinations,” says senior author Professor Fumihide Tanaka.
The soft, fur-covered robot, which the scientists named Reliebo, was designed to be attached to the participant’s hand and contained small airbags that inflate in response to hand movements. The researchers tested its effectiveness under various conditions based on the clenching of the participant’s hand, while applying the painful thermal stimulus to the arm that was not holding the robot. They also measured levels of oxytocin and cortisol (which are biomarkers for stress) from the participants’ saliva samples. Additionally, subjective pain ratings were recorded on an assessment scale, and a survey was conducted to evaluate the participants’ fear of injections and psychological state before and after the experiments.
The researchers found that holding the robot improved the experience for participants regardless of the experimental conditions used, and they speculated that the feelings of well-being created by human touch may also have been activated by the robot. “It is well known that interpersonal touch can reduce pain and fear, and we believe that this effect can be achieved even with nonliving soft robots,” states Professor Tanaka. This may be useful when actual human contact is not feasible, such as during pandemics. Future versions of the robot might use a controlled gaze or even AR (augmented reality) technologies to help build a connection with the patient or distract them from pain perception in various situations.
This work was supported by JSPS KAKENHI Grant Numbers 20K21800 and 22K19784.
    Story Source:
Materials provided by University of Tsukuba. Note: Content may be edited for style and length.

  • Keeping electricity affordable on wireless charging highways

    Efficient pricing will be crucial to minimize energy costs for private operators who provide on-the-highway wireless charging for electric cars — and for consumers who will use this service, according to new Cornell University research in Applied Energy.
    Employing dynamic pricing strategies in the marketplace could save consumers as much as 6% off the retail electricity price, according to the new paper, which envisions future wireless charging highways that allay so-called “range anxiety” about low batteries on longer trips.
    “Electrifying transportation is great, since you can eliminate carbon emissions,” said senior author Oliver Gao, the Howard Simpson Professor of Civil and Environmental Engineering in Cornell Engineering. “You can energize your car while driving in the charging lane. But if you’re managing a charging highway that can provide energy to cars, you’re buying and selling electricity on an industrial scale. We’re trying to suggest a smart business strategy.”
    Electricity prices can change drastically within a day, according to Gao, who is a faculty fellow at the Cornell Atkinson Center for Sustainability.
    An efficient bidding strategy is crucial to minimizing the energy cost for operators of wireless charging roads. The primary goal of the new research is to design a competitive, price-sensitive demand bidding strategy for wireless charging road owners — who have electricity storage capacity.
    The paper, “Bidding Strategy for Wireless Charging Roads with Energy Storage in Real-Time Electricity Markets,” designs an efficient, price-sensitive way for a wireless charging road to participate in a real-time electricity market. The research suggests an algorithm to predict the real-time electricity load on a charging highway, in order to evaluate a price forecast and electricity availability.
    The proposed bidding strategy not only reduces the energy cost for operating a wireless charging road but helps to alleviate electricity load pressure on a power network.
    “Our paper comes from the perspective of running a gas station,” Gao said. “If you’re running a charging highway — or if you get the contract to run a charging highway — you’re buying electricity and you’re selling electricity. It’s dynamic. You either buy an hour ahead of time and then you sell it one hour later or you bid on electricity, you submit your bid, buy it and then you sell it.”
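The buy-low, sell-high cycle Gao describes can be caricatured as a simple threshold policy for an operator with on-site storage. The sketch below is purely illustrative: the function name, numbers, and decision rule are assumptions, and the paper’s actual strategy also involves forecasting real-time load and prices.

```python
# Illustrative sketch of a price-sensitive bidding policy for a
# wireless charging road with energy storage. All names, numbers,
# and thresholds here are hypothetical, not from the paper.

def bid_decision(price_forecast, current_price, storage_level,
                 storage_capacity, demand_kwh):
    """Decide how much energy (kWh) to buy in the current interval.

    Buy extra when prices are expected to rise (charge storage);
    draw down storage when the current price is high.
    """
    expected_future = sum(price_forecast) / len(price_forecast)
    if current_price < expected_future and storage_level < storage_capacity:
        # Cheap hour: serve demand and top up storage.
        return demand_kwh + (storage_capacity - storage_level)
    if current_price > expected_future and storage_level > 0:
        # Expensive hour: serve as much demand as possible from storage.
        return max(0.0, demand_kwh - storage_level)
    return demand_kwh

# Example: prices expected to rise, storage half full.
bid = bid_decision(price_forecast=[40, 55, 60], current_price=35,
                   storage_level=50, storage_capacity=100,
                   demand_kwh=200.0)
print(bid)  # buys demand plus 50 kWh of storage headroom -> 250.0
```

A real bidding strategy would of course have to respect market rules, forecast uncertainty, and storage charge/discharge limits, which this toy ignores.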
    Today, it takes a lot of time to charge an electric vehicle and — due to battery limitations — a car’s range may only be a few hundred miles. With wireless charging lanes, a driver may simply move into the charging lane, much like today’s high-occupancy vehicles move into special lanes to avoid dense traffic.
    Lowering energy costs imposes less pressure on the existing power grid, according to the paper. “These two merits can bring broad benefits to our society,” Gao said. “Cost reduction in operating wireless charging roads is likely to attract more investment in constructing these roads and lower the corresponding charging price — promoting overall electric vehicle adoption.
“The alleviation of pressure on the power grid is great news for the power industry,” Gao said, “which already suffers significant strain on its existing infrastructure.”
In addition to Gao, the authors are lead author Jie Shi, a former Cornell postdoctoral researcher, and Nanpeng Yu of the University of California, Riverside. The research was funded by Cornell University.
    Story Source:
Materials provided by Cornell University. Original written by Blaine Friedlander, courtesy of the Cornell Chronicle. Note: Content may be edited for style and length.

  • Can smartphones predict mortality risk?

Passive smartphone monitoring of people’s walking activity can be used to construct population-level models of health and mortality risk, according to a new study published October 20th in the open-access journal PLOS Digital Health by Bruce Schatz of the University of Illinois at Urbana-Champaign, USA, and colleagues.
    Previous studies have used measures of physical fitness, including walk tests and self-reported walk pace, to predict individual mortality risk. These metrics focus on quality rather than quantity of movement; measuring an individual’s gait speed has become a standard practice for certain clinical settings, for example. The rise of passive smartphone activity monitoring opens the possibility for population-level analyses using similar metrics.
In the new study, the researchers analyzed 100,000 participants in the UK Biobank national cohort who wore activity monitors with motion sensors for one week. Although a wrist-worn sensor is carried differently than a smartphone, both contain motion sensors that can extract information on walking intensity from short bursts of walking — a daily-living version of a walk test.
    The team was able to successfully validate predictive models of mortality risk using only 6 minutes per day of steady walking collected by the sensor, combined with traditional demographic characteristics. The equivalent of gait speed calculated from this passively collected data was a predictor of 5-year mortality independent of age and sex (pooled C-index 0.72). The predictive models used only walking intensity to simulate smartphone monitors.
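The pooled C-index of 0.72 the authors report measures how often a model correctly ranks pairs of participants by risk (a C-index of 0.5 is chance; 1.0 is perfect ranking). As a hedged illustration only (this is not the authors’ code), a minimal concordance-index computation might look like:

```python
def concordance_index(risk_scores, event_times, events):
    """Fraction of comparable pairs in which the higher-risk subject
    had the earlier observed event.

    events[i] is True if subject i's death was observed (not censored).
    Minimal illustration; production implementations handle ties and
    censoring more carefully.
    """
    concordant = comparable = 0.0
    n = len(risk_scores)
    for i in range(n):
        for j in range(n):
            # A pair is comparable if i's event was observed before j's time.
            if events[i] and event_times[i] < event_times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1       # ranked correctly
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5     # tie counts as half
    return concordant / comparable

# Higher risk scores paired with earlier deaths -> perfect concordance.
print(concordance_index([0.9, 0.7, 0.2], [1.0, 2.0, 5.0],
                        [True, True, False]))  # 1.0
```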
    “Our results show passive measures with motion sensors can achieve similar accuracy to active measures of gait speed and walk pace,” the authors say. “Our scalable methods offer a feasible pathway towards national screening for health risk.”
    Schatz adds, “I have spent a decade using cheap phones for clinical models of health status. These have now been tested on the largest national cohort to predict life expectancy at population scale.”
    Story Source:
Materials provided by PLOS. Note: Content may be edited for style and length.

  • Tapping hidden visual information: An all-in-one detector for thousands of colors

    Spectrometers are widely used throughout industry and research to detect and analyse light. Spectrometers measure the spectrum of light — its strength at different wavelengths, like the colours in a rainbow — and are an essential tool for identifying and analysing specimens and materials. Integrated on-chip spectrometers would be of great benefit to a variety of technologies, including quality inspection platforms, security sensors, biomedical analysers, healthcare systems, environmental monitoring tools, and space telescopes.
    An international research team led by researchers at Aalto University has developed high-sensitivity spectrometers with high wavelength accuracy, high spectral resolution, and broad operation bandwidth, using only a single microchip-sized detector. The research behind this new ultra-miniaturised spectrometer was published today in the journal Science.
    ‘Our single-detector spectrometer is an all-in-one device. We designed this optoelectronic-lab-on-a-chip with artificial intelligence replacing conventional hardware, such as optical and mechanical components. Therefore, our computational spectrometer does not require separate bulky components or array designs to disperse and filter light. It can achieve a high resolution comparable to benchtop systems but in a much smaller package,’ says Postdoctoral Researcher Hoon Hahn Yoon.
    ‘With our spectrometer, we can measure light intensity at each wavelength beyond the visible spectrum using a device at our fingertips. The device is entirely electrically controllable, so it has enormous potential for scalability and integration. Integrating it directly into portable devices such as smartphones and drones could advance our daily lives. Imagine that the next generation of our smartphone cameras could be fitted with hyperspectral cameras that outperform colour cameras,’ he adds.
    Shrinking computational spectrometers is essential for their use in chips and implantable applications. Professor Zhipei Sun, the head of the research team, says, ‘Conventional spectrometers are bulky because they need optical and mechanical components, so their on-chip applications are limited. There is an emerging demand in this field to improve the performance and usability of spectrometers. From this point of view, miniaturised spectrometers are very important to offer high performance and new functions in all fields of science and industry.’
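In computational spectrometers of this general kind, a single detector is read out in many electrically tuned states, each with a different pre-calibrated spectral response, and software inverts those readings to recover the spectrum. The toy reconstruction below uses plain least squares on synthetic data; it is a sketch under those assumptions, not the Aalto team’s AI-based algorithm.

```python
import numpy as np

# Toy computational-spectrometer reconstruction. One detector is read
# out in n_states tuned states; each state has a different calibrated
# spectral response. The unknown spectrum is recovered by solving the
# resulting linear system. Entirely illustrative.

rng = np.random.default_rng(0)
n_wavelengths, n_states = 50, 60

# Calibrated response of each detector state at each wavelength.
R = rng.random((n_states, n_wavelengths))

# A hidden test spectrum: two emission peaks of different heights.
wl = np.arange(n_wavelengths)
true_spectrum = np.exp(-((wl - 15) ** 2) / 8.0) \
              + 0.5 * np.exp(-((wl - 35) ** 2) / 8.0)

# One (noiseless) photocurrent reading per detector state.
readings = R @ true_spectrum

# Recover the spectrum by least squares.
recovered, *_ = np.linalg.lstsq(R, readings, rcond=None)
print(np.allclose(recovered, true_spectrum, atol=1e-6))  # True
```

With noisy readings or fewer states than wavelengths, the inversion becomes ill-posed, which is where the machine-learning reconstruction described in the article earns its keep.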
    Professor Pertti Hakonen adds that ‘Finland and Aalto have invested in photonics research in recent years. For example, there has been great support from the Academy of Finland’s Centre of Excellence on quantum technology, Flagship on Photonics Research and Innovation, InstituteQ, and the Otanano Infrastructure. Our new spectrometer is a clear demonstration of the success of these collaborative efforts. I believe that with further improvements in resolution and efficiency, these spectrometers could provide new tools for quantum information processing.’
In addition to Postdoctoral Researcher Hoon Hahn Yoon and Professors Zhipei Sun and Pertti Hakonen, the key Aalto members linked to the work included Postdoctoral Researchers Henry A. Fernandez and Faisal Ahmed, Doctoral Researchers Fedor Nigmatulin, Xiaoqi Cui, and Md Gius Uddin, and Professor Harri Lipsanen. Professor Ethan D. Minot, from Oregon State University, joined this work as a visiting scholar at Aalto University for one year. The international research team led by Aalto University also included Professors Weiwei Cai (Shanghai Jiao Tong University), Zongyin Yang (Zhejiang University), Hanxiao Cui (Sichuan University), Kwanpyo Kim (Yonsei University), and Tawfique Hasan (University of Cambridge).
    Story Source:
Materials provided by Aalto University. Note: Content may be edited for style and length.

  • New computing architecture: Deep learning with light

Ask a smart home device for the weather forecast, and it takes several seconds for the device to respond. One reason for this latency is that connected devices don’t have enough memory or power to store and run the enormous machine-learning models needed for the device to understand what a user is asking of it. The model is stored in a data center that may be hundreds of miles away, where the answer is computed and sent to the device.
    MIT researchers have created a new method for computing directly on these devices, which drastically reduces this latency. Their technique shifts the memory-intensive steps of running a machine-learning model to a central server where components of the model are encoded onto light waves.
    The waves are transmitted to a connected device using fiber optics, which enables tons of data to be sent lightning-fast through a network. The receiver then employs a simple optical device that rapidly performs computations using the parts of a model carried by those light waves.
    This technique leads to more than a hundredfold improvement in energy efficiency when compared to other methods. It could also improve security, since a user’s data do not need to be transferred to a central location for computation.
    This method could enable a self-driving car to make decisions in real-time while using just a tiny percentage of the energy currently required by power-hungry computers. It could also allow a user to have a latency-free conversation with their smart home device, be used for live video processing over cellular networks, or even enable high-speed image classification on a spacecraft millions of miles from Earth.
“Every time you want to run a neural network, you have to run the program, and how fast you can run the program depends on how fast you can pipe the program in from memory. Our pipe is massive — it corresponds to sending a full feature-length movie over the internet every millisecond or so. That is how fast data comes into our system. And it can compute as fast as that,” says senior author Dirk Englund, an associate professor in the Department of Electrical Engineering and Computer Science (EECS) and member of the MIT Research Laboratory of Electronics.

  • A drop in the sea of electrons

    Recent Australian-led research has provided a world’s first measurement of interactions between Fermi polarons in an atomically-thin 2D semiconductor, using ultrafast spectroscopy capable of probing complex quantum materials.
    Researchers at Swinburne University of Technology found the signatures of interactions between exciton-polarons in experiments on the 2D semiconductor monolayer tungsten-disulfide.
FLEET collaborators at Monash University and RMIT developed a theoretical model to explain the experimental signals. They found that repulsive interactions at long range are mediated by a phase-space filling effect, while attractive interactions at short range lead to the formation of a cooperatively bound exciton-exciton-electron state.
    The Material
    Tungsten-disulfide (WS2) comes from the family of semiconducting transition metal dichalcogenides (TMDCs). When the bulk material is exfoliated down to a single atomic monolayer (less than 1 nanometre thick), the physics of these 2D materials becomes really interesting, and controllable.
Much of the intriguing physics is described by the creation and interactions of quasiparticles*. Excitons are one such quasiparticle, and they dominate the optical response of monolayer WS2. Excitons are formed when electrons from the valence band are excited into the conduction band. The vacancy left behind (a hole) can then bind to the excited electron through Coulomb forces, forming the exciton.

  • How can flying insects and drones tell up from down?

    While drones typically use accelerometers to estimate the direction of gravity, the way flying insects achieve this has been shrouded in mystery until now, as they have no specific sense of acceleration. In this study, a European team of scientists1 led by the Delft University of Technology in the Netherlands and involving a CNRS researcher has shown that drones can assess gravity using visual motion detection and motion modelling together.
To develop this new principle, the scientists investigated optical flow: the visual movement that sweeps across the retina as an individual moves relative to their environment. For example, from a moving train, trees next to the tracks pass by faster than distant mountains. Optical flow alone, however, is not enough for an insect to know the direction of gravity.
However, the research team discovered that it is possible to find this direction by combining optical flow with a model of the insect’s movement, i.e. a prediction of how it will move. The article concludes that this principle makes it possible to find the direction of gravity in almost all situations, except in a few rare and specific cases, such as when the subject is completely immobile.
During such perfectly stationary flight, the inability to find the direction of gravity briefly destabilizes the drone and therefore puts it in motion, allowing it to recover the direction of gravity at the next instant. These movements generate slight oscillations reminiscent of insect flight.
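The prediction-correction idea can be caricatured in a few lines: optical flow supplies an observed velocity, a motion model predicts the velocity change due to known thrust, and whatever acceleration is left over must be gravity. Everything below (the function name, the 2D setup, the numbers) is a hypothetical toy, not the team’s actual estimator.

```python
import math

# Toy 2D illustration: optical flow gives velocity in the body frame,
# a motion model accounts for the known thrust acceleration, and the
# unexplained remainder of the acceleration is attributed to gravity.
# All names and numbers are hypothetical.

def estimate_gravity_direction(v_prev, v_curr, thrust_accel, dt):
    """Return the unit vector of the unexplained (gravity) acceleration,
    or None when the craft is perfectly stationary and the direction
    is unobservable."""
    ax = (v_curr[0] - v_prev[0]) / dt - thrust_accel[0]
    ay = (v_curr[1] - v_prev[1]) / dt - thrust_accel[1]
    norm = math.hypot(ax, ay)
    if norm == 0.0:
        return None  # the unobservable stationary case from the article
    return (ax / norm, ay / norm)

# Free fall with no thrust: the velocity change is purely gravity.
g_dir = estimate_gravity_direction(v_prev=(0.0, 0.0), v_curr=(0.0, -9.81),
                                   thrust_accel=(0.0, 0.0), dt=1.0)
print(g_dir)  # (0.0, -1.0)
```

The `None` branch mirrors the article’s point: a perfectly stationary craft yields no usable signal, and the resulting wobble is what restores observability a moment later.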
    Using this new principle in robotics could meet a major challenge that nature has also faced: How to obtain a fully autonomous system while limiting payload. Future drone prototypes would be lightened by not needing accelerometers, which is very promising for the smallest models of the size of an insect.
    Though this theory may explain how flying insects determine gravity, we still need confirmation that they actually use this mechanism. Specific new biological experiments are needed to prove the existence of these neural processes that are difficult to observe in flight. This publication shows how the synergy between robotics and biology can lead to technological advances and new biological research avenues.
    Notes
1 This research results from a European collaboration between two laboratories: the Micro Air Vehicle Laboratory at the Faculty of Aerospace Engineering at the Delft University of Technology in the Netherlands and the Institut des Sciences du Mouvement (CNRS/Aix Marseille Université) in France.
    Story Source:
Materials provided by CNRS. Note: Content may be edited for style and length.