More stories


    Wirelessly powered relay will help bring 5G technology to smart factories

    A recently developed wirelessly powered 5G relay could accelerate the development of smart factories, report scientists from Tokyo Tech. By adopting a lower operating frequency for wireless power transfer, the proposed relay design solves many of the current limitations, including range and efficiency. In turn, this allows for a more versatile and widespread arrangement of sensors and transceivers in industrial settings.
    One of the hallmarks of the Information Age is the transformation of industries towards a greater flow of information. This can be readily seen in high-tech factories and warehouses, where wireless sensors and transceivers are installed in robots, production machinery, and automatic vehicles. In many cases, 5G networks are used to orchestrate operations and communications between these devices.
    To avoid relying on cumbersome wired power sources, sensors and transceivers can be energized remotely via wireless power transfer (WPT). However, one problem with conventional WPT designs is that they operate at 24 GHz. At such high frequencies, transmission beams must be extremely narrow to avoid energy losses. Moreover, power can only be transmitted if there is a clear line of sight between the WPT system and the target device. Since 5G relays are often used to extend the range of 5G base stations, WPT needs to reach even further, which is yet another challenge for 24 GHz systems.
    To address the limitations of WPT, a research team from Tokyo Institute of Technology has come up with a clever solution. In a recent study, whose results were presented at the 2024 IEEE Symposium on VLSI Technology & Circuits, they developed a novel 5G relay that can be powered wirelessly at a lower frequency of 5.7 GHz. “By using 5.7 GHz as the WPT frequency, we can get wider coverage than conventional 24 GHz WPT systems, enabling a wider range of devices to operate simultaneously,” explains senior author and Associate Professor Atsushi Shirane.
    The proposed wirelessly powered relay is meant to act as an intermediary receiver and transmitter of 5G signals, which can originate from a 5G base station or wireless devices. The key innovation of this system is the use of a rectifier-type mixer, which performs 4th-order subharmonic mixing while also generating DC power.
    Notably, the mixer uses the received 5.7 GHz WPT signal as a local signal. With this local signal, together with multiplying circuits, phase shifters, and a power combiner, the mixer ‘down-converts’ a received 28 GHz signal into a 5.2 GHz signal. Then, this 5.2 GHz signal is internally amplified, up-converted to 28 GHz through the inverse process, and retransmitted to its intended destination.
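    The frequency plan follows from simple arithmetic: four times the 5.7 GHz WPT tone gives an effective 22.8 GHz local signal, which down-converts the 28 GHz carrier to 5.2 GHz and restores it on the way back out. A minimal sketch of this bookkeeping (our illustration, not code from the paper):

```python
# Frequency plan of the 4th-order subharmonic mixer (values in GHz).
# The mixer effectively multiplies the 5.7 GHz WPT tone by 4 to form
# a 22.8 GHz local signal, then mixes it with the 28 GHz 5G carrier.
WPT_FREQ = 5.7          # wireless power transfer tone, reused as local signal
RF_FREQ = 28.0          # received 5G carrier
SUBHARMONIC_ORDER = 4   # 4th-order subharmonic mixing

effective_lo = SUBHARMONIC_ORDER * WPT_FREQ   # effective local signal
if_freq = RF_FREQ - effective_lo              # down-converted intermediate frequency
retransmit = if_freq + effective_lo           # up-converted again for retransmission

print(f"effective LO: {effective_lo:.1f} GHz")
print(f"IF (down):    {if_freq:.1f} GHz")   # matches the 5.2 GHz in the text
print(f"RF (up):      {retransmit:.1f} GHz")
```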
    To drive these internal amplifiers, the proposed system first rectifies the 5.7 GHz WPT signal to produce DC power, which is managed by a dedicated power management unit. This ingenious approach offers several advantages, as Shirane highlights: “Since the 5.7 GHz WPT signal has less path loss than the 24 GHz signal, more power can be obtained from a rectifier. In addition, the 5.7 GHz rectifier has a lower loss than 24 GHz rectifiers and can operate at a higher power conversion efficiency.” Finally, this proposed circuit design allows for selecting the transistor size, bias voltage, matching, cutoff frequency of the filter, and load to maximize conversion efficiency and conversion gain simultaneously.
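    The path-loss advantage Shirane describes can be estimated with the standard Friis free-space model, in which loss for fixed-gain antennas grows as 20·log10(f). This back-of-envelope comparison is our illustration, not a figure from the paper:

```python
import math

def friis_path_loss_db(freq_ghz: float, distance_m: float) -> float:
    """Free-space path loss in dB for isotropic antennas: 20*log10(4*pi*d*f/c)."""
    c = 3e8                    # speed of light, m/s
    f = freq_ghz * 1e9
    return 20 * math.log10(4 * math.pi * distance_m * f / c)

d = 10.0  # example link distance in metres (illustrative)
loss_5p7 = friis_path_loss_db(5.7, d)
loss_24 = friis_path_loss_db(24.0, d)

# The frequency ratio alone sets the difference, independent of distance:
advantage_db = loss_24 - loss_5p7   # = 20*log10(24/5.7), roughly 12.5 dB
print(f"5.7 GHz loss at {d} m: {loss_5p7:.1f} dB")
print(f"24 GHz loss at {d} m:  {loss_24:.1f} dB")
print(f"advantage of 5.7 GHz:  {advantage_db:.1f} dB")
```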
    Through several experiments, the research team showcased the capabilities of their proposed relay. Fabricated in standard CMOS technology on a chip measuring only 1.5 mm by 0.77 mm, a single chip can output 6.45 mW at an input power of 10.7 dBm. Notably, multiple chips could be combined to achieve a higher power output. Considering its many advantages, the proposed 5.7 GHz WPT system could thus greatly contribute to the development of smart factories.


    Simplicity versus adaptability: Understanding the balance between habitual and goal-directed behaviors

    Both living creatures and AI-driven machines need to act quickly and adaptively in response to situations. In psychology and neuroscience, behavior can be categorized into two types — habitual (fast and simple but inflexible), and goal-directed (flexible but complex and slower). Daniel Kahneman, who won the Nobel Prize in Economic Sciences, referred to these two modes as System 1 and System 2. However, there is ongoing debate as to whether they are independent and conflicting entities or mutually supportive components.
    Scientists from the Okinawa Institute of Science and Technology (OIST) and Microsoft Research Asia in Shanghai have proposed a new AI method in which systems of habitual and goal-directed behaviors learn to help each other. In computer simulations that mimicked the exploration of a maze, the method quickly adapted to changing environments and also reproduced the behavior of humans and animals after they had become accustomed to a certain environment over a long period.
    The study, published in Nature Communications, not only paves the way for the development of systems that adapt quickly and reliably in the burgeoning field of AI, but also provides clues to how we make decisions in the fields of neuroscience and psychology.
    Drawing on the theory of “active inference,” which has attracted much attention recently, the scientists derived a model that integrates habitual and goal-directed systems for learning behavior in AI agents that perform reinforcement learning, a method of learning based on rewards and punishments. In the paper, they created a computer simulation mimicking a task in which mice explore a maze based on visual cues and are rewarded with food when they reach the goal.
    They examined how these two systems adapt and integrate while interacting with the environment, showing that they can achieve adaptive behavior quickly. It was observed that the AI agent collected data and improved its own behavior through reinforcement learning.
    What our brains prefer
    After a long day at work, we usually head home on autopilot (habitual behavior). However, if you have just moved house and are not paying attention, you might find yourself driving back to your old place out of habit. When you catch yourself doing this, you switch gears (goal-directed behavior) and reroute to your new home. Traditionally, these two behaviors are considered to work independently, resulting in behavior being either habitual and fast but inflexible, or goal-directed and flexible but slow.

    “The automatic transition from goal-directed to habitual behavior during learning is a very famous finding in psychology. Our model and simulations can explain why this happens: The brain would prefer behavior with higher certainty. As learning progresses, habitual behavior becomes less random, thereby increasing certainty. Therefore, the brain prefers to rely on habitual behavior after significant training,” Dr. Dongqi Han, a former PhD student at OIST’s Cognitive Neurorobotics Research Unit and first author of the paper, explained.
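    The certainty-based arbitration Dr. Han describes can be caricatured in a few lines: treat the habitual policy as a distribution over actions, measure its uncertainty as Shannon entropy, and fall back to costly goal-directed planning only while that entropy is high. This is a toy sketch of the idea, not the paper's active-inference model; the threshold and distributions are invented for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of a discrete action distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def choose_system(habitual_policy, threshold_bits=1.0):
    """Prefer the cheap habitual system once its policy is certain enough,
    otherwise fall back to costly goal-directed planning.
    (Threshold is an illustrative choice, not a value from the paper.)"""
    return "habitual" if entropy(habitual_policy) < threshold_bits else "goal-directed"

# Early in learning the habitual policy is nearly uniform (uncertain)...
early = [0.25, 0.25, 0.25, 0.25]
# ...after training it concentrates on one action (certain).
late = [0.9, 0.05, 0.03, 0.02]

print(choose_system(early))  # goal-directed
print(choose_system(late))   # habitual
```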
    For a new goal that the AI has not been trained on, it uses an internal model of the environment to plan its actions. It does not need to consider all possible actions but instead uses a combination of its habitual behaviors, which makes planning more efficient. This challenges traditional AI approaches, which require all possible goals to be explicitly included in training for them to be achieved. In this model, each desired goal can be achieved without explicit training by flexibly combining learned knowledge.
    “It’s important to achieve a kind of balance or trade-off between flexible and habitual behavior,” Prof. Jun Tani, head of the Cognitive Neurorobotics Research Unit, stated. “There could be many possible ways to achieve a goal, but considering all possible actions is very costly, so goal-directed behavior is limited by habitual behavior to narrow down the options.”
    Building better AI
    Dr. Han got interested in neuroscience and the gap between artificial and human intelligence when he started working on AI algorithms. “I started thinking about how AI can behave more efficiently and adaptably, like humans. I wanted to understand the underlying mathematical principles and how we can use them to improve AI. That was the motivation for my PhD research.”
    Understanding the difference between habitual and goal-directed behaviors has important implications, especially in the field of neuroscience, because it can shed light on neurological disorders such as ADHD, OCD, and Parkinson’s disease.
    “We are exploring the computational principles by which multiple systems in the brain work together. We have also seen that neuromodulators such as dopamine and serotonin play a crucial role in this process,” Prof. Kenji Doya, head of the Neural Computation Unit explained. “AI systems developed with inspiration from the brain and proven capable of solving practical problems can serve as valuable tools in understanding what is happening in the brains of humans and animals.”
    Dr. Han would like to help build better AI that can adapt its behavior to achieve complex goals. “We are very interested in developing AI that has near-human abilities when performing everyday tasks, so we want to address this human-AI gap. Our brains have two learning mechanisms, and we need to better understand how they work together to achieve our goal.”


    New material puts eco-friendly methanol conversion within reach

    Griffith University researchers have developed innovative, eco-friendly quantum materials that can drive the transformation of methanol into ethylene glycol.
    Ethylene glycol is an important chemical used to make polyester (including PET) and antifreeze agents, with global production exceeding 35 million tons annually and growing strongly.
    Currently, it’s mainly produced from petrochemicals through energy-intensive processes.
    Methanol (CH3OH) can be produced sustainably from CO2, agricultural biomass waste, and plastic waste through various methods such as hydrogenation, catalytic partial oxidation, and fermentation. As a fuel, methanol also serves as a circular hydrogen carrier and a precursor for numerous chemicals.
    Led by Professor Qin Li, the Griffith team’s method uses solar-driven photocatalysis to convert methanol into ethylene glycol under mild conditions.
    This process uses sunlight to drive chemical reactions, which minimises waste and maximises the use of renewable energy.
    While previous attempts at this conversion have faced challenges — such as the need for toxic or precious materials — Professor Li and the research team have identified a greener solution.

    “Climate change is a major challenge facing humanity today,” Professor Li said.
    “To tackle this, we need to focus on zero-emission power generation, low-emission manufacturing, and a circular economy. Methanol stands out as a crucial chemical that links these three strategies.
    “What we have created is a novel material that combines carbon quantum dots with zinc selenide quantum wells.”
    “This combination enhances the photocatalytic activity to more than four times that of carbon quantum dots alone, demonstrating the effectiveness of the new material,” lead author Dr Dechao Chen said.
    The approach has also shown high photocurrent, indicating efficient charge transfer within the material, crucial for driving the desired chemical reactions.
    Analyses confirmed the formation of ethylene glycol, showcasing the potential of this new method. It’s worth noting that the by-product of this reaction is green hydrogen.

    This discovery opens up new possibilities for using eco-friendly materials in photocatalysis, paving the way for sustainable chemical production.
    As a new quantum material, it also has the potential to lead to further advancements in photocatalysis, sensing, and optoelectronics.
    “Our research demonstrates a significant step towards green chemistry, showing how sustainable materials can be used to achieve important chemical transformations,” Professor Li said.
    “This could transform methanol conversion and contribute significantly to emissions reduction.”
    The findings, ‘Colloidal Synthesis of Carbon Dot-ZnSe Nanoplatelet Van der Waals Heterostructures for Boosting Photocatalytic Generation of Methanol-Storable Hydrogen’, have been published in the journal Small.


    Custom-made molecules designed to be invisible while absorbing near-infrared light

    Getting a molecule to do what you want it to do is not always easy. As an example, an organic molecule will absorb only certain wavelengths of light based on its arrangement of electrons, which can be difficult to fine-tune. Even so, the ability to make substances that respond to only specific ranges of the spectrum could lead to important new applications.
    There is currently significant interest in the design of new organic semiconducting materials for high-tech applications such as solar cells and transistors. In particular, molecules that can absorb near-infrared light but not visible light, and so are colorless, have applications in everything from chemotherapy to photodetectors. Some such compounds have already been developed but so far there has been no systematic process for making these molecules.
    In a study recently published in Advanced Science, researchers from SANKEN at Osaka University were able to systematically design a large, complex molecule that does not absorb visible light, meaning that it is completely colorless and transparent, but does absorb near-infrared radiation. This was accomplished by carefully constructing molecules that have suitable arrangements of electrons.
    The absorbance of light by an organic compound is based on electrons moving between regions around atoms known as orbitals. In this work, the researchers show a methodical approach to constructing molecules having orbitals that allow some ranges of light to be absorbed but not others.
    “The main challenge was finding a rational approach to constructing molecules with the desired electronic transitions,” says lead author of the study Soichi Yokoyama. “To do so, we focused on large structures having many delocalized electrons, using theoretical calculations to guide our selections.”
    These compounds were based on a so-called donor-acceptor-donor system and utilized a naphthobisthiadiazole group as the acceptor combined with either pyrrole or indenopyrrole donor groups along with boron bridges. This specialized structure allowed electrons to spread out over wider areas of the molecules, producing just the right type of light absorption. The new molecule was exhaustively characterized and was found not to absorb in the visible region of the spectrum but to absorb near-infrared light, as planned.
    “A somewhat similar molecule absorbing near-infrared radiation was reported some time ago,” explains Yutaka Ie, senior author, “but this compound also absorbed visible light and so appeared blue. Our goal was to find a molecule that showed no color at all, to allow specific applications. A combination of an extended polyene structure and orbital symmetry were key.”
    The molecules were found to act as semiconductors, and the pyrrole-based compound could also be used to construct a phototransistor responsive to near-infrared light. Many uses for organic compounds that show unique optoelectronic properties and specific light absorption characteristics are yet to be explored. This work is expected to pave the way for the future design of transparent, colorless molecules that respond to near-infrared light and lead to many new applications.


    AI recognizes athletes’ emotions

    Using computer-assisted neural networks, researchers at the Karlsruhe Institute of Technology (KIT) and the University of Duisburg-Essen have been able to accurately identify affective states from the body language of tennis players during games. For the first time, they trained a model based on artificial intelligence (AI) with data from actual games. Their study, published in the journal Knowledge-Based Systems, demonstrates that AI can assess body language and emotions with accuracy similar to that of humans. However, it also points to ethical concerns.
    For their study, “Recognizing affective states from the expressive behavior of tennis players using convolutional neural networks,” sports sciences, software development and computer science researchers from KIT and the University of Duisburg-Essen developed a special AI model. They used pattern-recognition programs to analyze video of tennis players recorded during actual games.
    Success Rate of 68.9 Percent
    “Our model can identify affective states with an accuracy of up to 68.9 percent, which is comparable and sometimes even superior to assessments made by both human observers and earlier automated methods,” said Professor Darko Jekauc of KIT’s Institute of Sports and Sports Science.
    An important and unique feature of the study is the project team’s use of real-life scenes instead of simulated or contrived situations to train their AI system. The researchers recorded video sequences of 15 tennis players in a specific setting, focusing on the body language displayed when a point was won or lost. The videos showed players with cues including lowered head, arms raised in exultation, hanging racket, or differences in walking speed; these cues could be used to identify the players’ affective states.
    After being fed with this data, the AI learned to associate the body language signals with different affective reactions and to determine whether a point had been won (positive body language) or lost (negative body language). “Training in natural contexts is a significant advance for the identification of real emotional states, and it makes predictions possible in real scenarios,” said Jekauc.
    Humans and Machines Recognize Negative Emotions Better Than Positive Ones
    Not only does the research show that AI algorithms may be able to surpass human observers in their ability to identify emotions in the future, it also revealed a further interesting aspect: both humans and AI are better at recognizing negative emotions. “The reason could be that negative emotions are easier to identify because they’re expressed in more obvious ways,” said Jekauc. “Psychological theories suggest that people are evolutionarily better adapted to perceive negative emotional expressions, for example because defusing conflict situations quickly is essential to social cohesion.”

    Ethical Aspects Need Clarification Before Use
    The study envisions a number of sports applications for reliable emotion recognition, such as improving training methods, team dynamics and performance, and preventing burnout. Other fields, including healthcare, education, customer service and automotive safety, could also benefit from reliable early detection of emotional states.
    “Although this technology offers the prospect of significant benefits, the potential risks associated with it also have to be taken into account, especially those relating to privacy and misuse of data,” Jekauc said. “Our study adhered strictly to existing ethical guidelines and data protection regulations. And with a view to future applications of such technology in practice, it will be essential to clarify ethical and legal issues ahead of time.”


    Molecular sponge for the electronics of the future

    Porous covalent organic frameworks (COFs) are a class of highly ordered, porous materials consisting of organic molecules that are linked by covalent bonds to form a network. They enable the construction of functional materials with molecular precision. Similar to metal-organic frameworks (MOFs), which were discovered around 25 years ago and have already reached market maturity, COFs possess highly promising structural, optical and electronic properties for numerous applications, for example in gas and liquid storage, catalysis, sensor technology and energy applications.

    Previous research on COFs has generally focussed on the construction of rigid frameworks with static material properties. Dr Florian Auras and his team at the Chair of Molecular Functional Materials at TUD have now developed a design strategy for dynamic two-dimensional COFs that can open and close their pores in a controlled manner, similar to a sponge. “The main aim of the study was to equip these frameworks, which are normally very precisely ordered but rigid, with exactly the right degree of flexibility so that their structure can be switched from compact to porous. By adding solvent to the molecular sponge, we can now temporarily and reversibly change the local geometry as well as optical properties such as colour or fluorescence,” says Florian Auras, explaining his research approach.
    The ability to switch the structural and optoelectronic properties of the materials back and forth in a targeted manner makes the materials particularly interesting for future applications in electronics and information technology. “Our research results form the basis for our further research into stimuli-responsive polymers, particularly with the aim of realising switchable quantum states. When working on COFs, I am always fascinated by how precisely their properties can be manipulated by controlling the molecular structure,” adds Auras.


    Study finds US does not have housing shortage, but shortage of affordable housing

    The United States is experiencing a housing shortage. At least, that is the case according to common belief — and is even the basis for national policy, as the Biden administration has stated plans to address the housing supply shortfall.
    But new research from the University of Kansas finds that most of the nation’s markets have ample housing in total, but nearly all lack enough units affordable to very low-income households.
    Kirk McClure, professor of public affairs & administration emeritus at KU, and Alex Schwartz of The New School co-wrote a study published in the journal Housing Policy Debate. They examined U.S. Census Bureau data from 2000 to 2020 to compare the number of households formed to the number of housing units added to determine if there were more households needing homes than units available.
    The researchers found only four of the nation’s 381 metropolitan areas experienced a housing shortage in the study time frame, as did only 19 of the country’s 526 “micropolitan” areas — those with 10,000-50,000 residents.
    The findings suggest that addressing high housing prices and low incomes is more urgent for easing housing affordability problems than simply building more homes, the authors wrote.
    “There is a commonly held belief that the United States has a shortage of housing. This can be found in the popular and academic literature and from the housing industry,” McClure said. “But the data shows that the majority of American markets have adequate supplies of housing available. Unfortunately, not enough of it is affordable, especially for low-income and very low-income families and individuals.”
    McClure and Schwartz also examined households in two categories: Very low income, defined as between 30% and 60% of area median family income, and extremely low income, with incomes below 30% of area median family income.

    The numbers showed that from 2010 to 2020, household formation did exceed the number of homes available. However, there was a large surplus of housing produced in the previous decade. In fact, from 2000 to 2020, housing production exceeded the growth of households by 3.3 million units. The surplus from 2000 to 2010 more than offset the shortages from 2010 to 2020.
    The numbers also showed that nearly all metropolitan areas have sufficient units for owner occupancy. But nearly all have shortages of rental units affordable to the very low-income renter households.
    While the authors looked at housing markets across the nation, they also examined vacancy rates, or the difference between total and occupied units, to determine how many homes were available. National total vacancy rates were 9% in 2000 and 11.4% by 2010, which marked the end of the housing bubble and the Great Recession. By the end of 2020, the rate was 9.7%, with nearly 14 million vacant units.
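    As a rough check of scale, the 2020 figures quoted above imply the approximate size of the total housing stock. This is a back-of-envelope illustration of ours, not a calculation from the paper:

```python
# If roughly 14 million units were vacant at a 9.7% total vacancy rate
# in 2020, the implied total stock is vacant / rate.
vacant_units_2020 = 14_000_000
vacancy_rate_2020 = 0.097

total_stock = vacant_units_2020 / vacancy_rate_2020
occupied = total_stock - vacant_units_2020

print(f"implied total stock: {total_stock / 1e6:.0f} million units")
print(f"implied occupied:    {occupied / 1e6:.0f} million units")
```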
    “When looking at the number of housing units available, it becomes clear there is no overall shortage of housing units available. Of course, there are many factors that determine if a vacant unit is truly available; namely, whether it is physically habitable and how much it costs to purchase or rent,” McClure said. “There are also considerations over a family’s needs such as an adequate number of bedrooms or accessibility for individuals with disabilities, but the number of homes needed has not outpaced the number of homes available.”
    Not all housing markets are alike, and while there could be shortages in some, others could contain a surplus of available housing units. The study considered markets in all core-based statistical areas as defined by the Census Bureau. Metropolitan areas saw a nationwide surplus of 2.7 million more units than households in the 20-year study period, while micropolitan areas had a more modest surplus of about 300,000 units.
    Numbers of available housing units and people only tell part of the story. An individual family needs to be able to afford housing, whether they buy or rent. Shortages of any scale appear in the data only when considering renters, the authors wrote. McClure and Schwartz compared the number of available units in four submarkets of each core-based statistical area to the estimated number of units affordable to renters with incomes from 30% to 60% of the area median family income. Those rates are roughly equivalent to the federal poverty level and upper level of eligibility for various rental assistance programs. Only two metropolitan areas had shortages for very-low-income renters, and only two had surpluses available for extremely-low-income renters.
    Helping people afford the housing stock that is available would be more cost effective than expanding new home construction in the hope that additional supply would bring prices down, the authors wrote. Several federal programs have proven successful in helping renters and moderate-income buyers afford housing that would otherwise be out of reach.
    “Our nation’s affordability problems result more from low incomes confronting high housing prices rather than from housing shortages,” McClure said. “This condition suggests that we cannot build our way to housing affordability. We need to address price levels and income levels to help low-income households afford the housing that already exists, rather than increasing the supply in the hope that prices will subside.”


    AI shows how field crops develop

    Researchers at the University of Bonn have developed software that can simulate the growth of field crops. To do this, they fed thousands of photos from field experiments into a learning algorithm. This enabled the algorithm to learn how to visualize the future development of cultivated plants based on a single initial image. Using the images created during this process, parameters such as leaf area or yield can be estimated accurately. The results have been published in the journal Plant Methods.
    Which plants should I combine in what ratio to achieve the greatest possible yield? And how will my crop develop if I use manure instead of artificial fertilizers? In the future, farmers should increasingly be able to count on computer support when answering such questions.
    Researchers from the University of Bonn have now taken a crucial step forward on the path towards this goal: “We have developed software that uses drone photos to visualize the future development of the plants shown,” explains Lukas Drees from the Institute of Geodesy and Geoinformation at the University of Bonn. The early-career researcher is an employee in the PhenoRob Cluster of Excellence. This large-scale project, based at the University of Bonn, intends to drive forward the intelligent digitalization of agriculture to help farming become more environmentally friendly without causing harvest yields to suffer.
    A virtual glimpse into the future to aid decision-making
    The computer program now presented by Drees and his colleagues in the journal Plant Methods is an important building block. It should eventually make it possible to simulate certain decisions virtually — for instance, to assess how the use of pesticides or fertilizers will affect crop yield.
    For this to work, the program must be fed with drone photos from field experiments. “We took thousands of images over one growth period,” explains the doctoral researcher. “In this way, for example, we documented the development of cauliflower crops under certain conditions.” The researchers then trained a learning algorithm using these images. Afterwards, based on a single aerial image of an early stage of growth, this algorithm was able to generate artificial images showing the future development of the crop. The whole process is very accurate as long as the crop conditions are similar to those present when the training photos were taken. At present, however, the software does not take into account the effect of a sudden cold snap or steady rain lasting several days. It should learn in the future how growth is affected by influences such as these, as well as by an increased use of fertilizers, for example. This should enable it to predict the outcome of certain interventions by the farmer.
    “In addition, we used a second AI software that can estimate various parameters from plant photos, such as crop yield,” says Drees. “This also works with the generated images. It is thus possible to estimate quite precisely the subsequent size of the cauliflower heads at a very early stage in the growth period.”
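    The two-stage pipeline described above, generating a future image and then estimating parameters from it, can be caricatured with a tiny nearest-neighbour stand-in. The real system uses learned generative models on drone photos; here the “images” are two-number feature vectors and all data are invented for illustration:

```python
import math

# Toy stand-in for the two-stage pipeline: (1) predict a later "image"
# from an early one, (2) estimate a parameter (e.g. head size / yield)
# from the predicted image. All numbers below are synthetic.
training = [
    # (early-stage features, late-stage features, yield estimate)
    ([0.2, 0.1], [0.8, 0.6], 1.9),
    ([0.3, 0.3], [0.9, 0.9], 2.6),
    ([0.1, 0.2], [0.5, 0.7], 1.8),
]

def predict_late(early):
    """Stage 1: return the late-stage features of the closest training example."""
    return min(training, key=lambda ex: math.dist(early, ex[0]))[1]

def estimate_yield(late):
    """Stage 2: estimate yield from (predicted) late-stage features."""
    return min(training, key=lambda ex: math.dist(late, ex[1]))[2]

late = predict_late([0.25, 0.12])  # new early-stage observation
print(estimate_yield(late))
```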
    Focus on polycultures

    One area the researchers are focusing on is the use of polycultures. This refers to the sowing of different species in one field — such as beans and wheat. As plants have different requirements, they compete less with each other in a polyculture of this kind compared to a monoculture, where just one species is grown. This boosts yield. In addition, some species — beans are a good example of this — can bind nitrogen from the air and use it as a natural fertilizer. The other species, in this case wheat, also benefits from this.
    “Polycultures are also less susceptible to pests and other environmental influences,” explains Drees. “However, how well the whole thing works very much depends on the combined species and their mixing ratio.” When results from many different mixing experiments are fed into learning algorithms, it is possible to derive recommendations as to which plants are particularly compatible and in what ratio.
    Plant growth simulations on the basis of learning algorithms are a relatively new development. Process-based models have mostly been used for this purpose up to now. These — metaphorically speaking — have a fundamental understanding of what nutrients and environmental conditions certain plants need during their growth in order to thrive. “Our software, however, makes its statements solely based on the experience it has collected from the training images,” stresses Drees.
    Both approaches complement each other. If they were to be combined in an appropriate manner, it could significantly improve the quality of the forecasts. “This is also a point that we are investigating in our study,” says the doctoral researcher: “How can we use process- and image-based methods so they benefit from each other in the best possible way?”