More stories

  • Artificial Intelligence tool could reduce common drug side effects

    Research led by the University of Exeter and Kent and Medway NHS and Social Care Partnership Trust, published in Age and Ageing, assessed a new tool designed to calculate which medicines are most likely to cause adverse anticholinergic effects on the body and brain. These complications can occur with many prescription and over-the-counter drugs that affect the brain by blocking a key neurotransmitter called acetylcholine. Many medicines, including some bladder medications, antidepressants, and drugs for stomach conditions and Parkinson’s disease, have some degree of anticholinergic effect, and they are commonly taken by older people.
    Anticholinergic side effects include confusion, blurred vision, dizziness, falls and a decline in brain function. These effects may also be associated with increased mortality, and long-term use has been linked to a higher risk of dementia.
    Now, researchers have developed a tool that uses artificial intelligence to calculate the harmful effects of medicines. The team created a new online tool, the International Anticholinergic Cognitive Burden Tool (IACT), which uses natural language processing, an artificial intelligence methodology, together with chemical structure analysis to identify medications that have anticholinergic effects.
    The tool is the first to incorporate a machine learning technique, yielding an automatically updated score available through a website portal. Anticholinergic burden is assessed by assigning a score based on reported adverse events and on how closely the chemical structure of the drug being considered for prescription matches known anticholinergics, resulting in a more accurate and up-to-date scoring system than any previous one. Ultimately, after further research and modelling with real-world patient data, the tool could help support prescribing and reduce risks from common medicines.
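    The paper does not spell out the scoring pipeline, but the description above amounts to blending two signals: an adverse-event signal and a structural-similarity signal. The sketch below is a hypothetical illustration of that idea, not the published IACT method; the toy fingerprints, the Tanimoto helper, the equal weights and the mapping onto a 0-3 burden scale are all assumptions.
    ```python
    # Hypothetical sketch (not the published IACT pipeline): blend the share
    # of a drug's adverse-event reports that are anticholinergic-type events
    # with its structural similarity to known anticholinergics, then map the
    # result onto the familiar 0-3 burden scale.

    def tanimoto(a: set, b: set) -> float:
        """Tanimoto (Jaccard) similarity between two fingerprint key sets."""
        return len(a & b) / len(a | b) if a | b else 0.0

    def burden_score(fp, reports, reference_fps, w_events=0.5, w_structure=0.5):
        event_signal = reports["anticholinergic"] / max(reports["total"], 1)
        structure_signal = max(tanimoto(fp, ref) for ref in reference_fps)
        blended = w_events * event_signal + w_structure * structure_signal
        return round(3 * blended)  # 0 = no burden ... 3 = definite burden

    # Toy usage: one candidate drug against two known anticholinergics,
    # with fingerprints represented as sets of substructure keys.
    known = [{"amine", "ester", "ring6"}, {"amine", "ring6", "halide"}]
    candidate = {"amine", "ring6", "ether"}
    print(burden_score(candidate, {"anticholinergic": 120, "total": 400}, known))
    ```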
    Professor Chris Fox, at the University of Exeter, is one of the study authors. He said: “Use of medicines with anticholinergic effects can have significant harmful effects, for example falls and confusion, which are avoidable. We urgently need to reduce these harmful side effects, as they can lead to hospitalisation and death. This new tool provides a promising avenue towards a more tailored, personalised medicine approach, ensuring the right person gets a safe and effective treatment whilst avoiding unwanted anticholinergic effects.”
    The team surveyed 110 health professionals, including pharmacists and prescribing nurses. Of this group, 85 per cent said they would use a tool to assess risk of anticholinergic side effects, if available. The team also gathered usability feedback to help improve the tool further.
    Dr Saber Sami, at the University of East Anglia, said: “Our tool is the first to use innovative artificial intelligence technology in measures of anticholinergic burden — ultimately, once further research has been conducted the tool should support pharmacists and prescribing health professionals in finding the best treatment for patients.”
    Professor Ian Maidment, from Aston University, said: “I have been working in this area for over 20 years. Anti-cholinergic side-effects can be very debilitating for patients. We need better ways to assess these side-effects.”
    The research team includes collaborators at AKFA University Medical School, Uzbekistan, and the Universities of East Anglia, Aston, Kent and Aberdeen. They aim to continue developing the tool so that it can be deployed in day-to-day practice, a goal this study supports.
    Story Source:
    Materials provided by University of Exeter. Note: Content may be edited for style and length.

  • A coral pollution study unexpectedly helped explain Hurricane Maria’s fury

    Hurricane Maria struck the island of Puerto Rico early on September 20, 2017, with 250-kilometer-per-hour winds, torrential rains and a storm surge up to three meters high. In its wake: nearly 3,000 people dead, an almost yearlong power outage and over $90 billion in damages to homes, businesses and essential infrastructure, including roads and bridges.

    Geologist and diver Milton Carlo took shelter at his house in Cabo Rojo on the southwest corner of the island with his wife, daughter and infant grandson. He watched the raging winds of the Category 4 hurricane lift his neighbor’s SUV into the air, and remembers those hours as some of the worst of his life.

    For weeks, the rest of the world was in the dark about the full extent of the devastation, because Maria had destroyed the island’s main weather radar and almost all cell phone towers.

    Far away on the U.S. West Coast, in Santa Cruz, Calif., oceanographer Olivia Cheriton watched satellite radar images of Maria passing over the instruments she and her U.S. Geological Survey team had anchored a few kilometers southwest of Puerto Rico. The instruments, placed offshore from the seaside town of La Parguera, were there to track pollution circulating around some of the island’s endangered corals.

    More than half a year went by before she learned the improbable fate of those instruments: They had survived and had captured data revealing hurricane-related ocean dynamics that no scientist had ever recorded.

    The wind-driven coastal currents interacted with the seafloor in a way that prevented Maria from drawing cold water from the depths of the sea up to the surface. The sea surface stayed as warm as bathwater. Heat is a hurricane’s fuel source, so a warmer sea surface leads to a more intense storm. As Cheriton figured out later, the phenomenon she stumbled upon likely played a role in maintaining Maria’s Category 4 status as it raked Puerto Rico for eight hours.

    “There was absolutely no plan to capture the impact of a storm like Maria,” Cheriton says. “In fact, if we somehow could’ve known that a storm like that was going to occur, we wouldn’t have put hundreds of thousands of dollars’ worth of scientific instrumentation in the water.”

    A storm’s path is guided by readily observable, large-scale atmospheric features such as trade winds and high-pressure zones. Its intensity, on the other hand, is driven by weather events inside the hurricane and wave action deep below the ocean’s surface. The findings by Cheriton and colleagues, published May 2021 in Science Advances, help explain why hurricanes often get stronger before making landfall and can therefore help forecasters make more accurate predictions.

    Reef pollution

    Cheriton’s original research objective was to figure out how sea currents transport polluted sediments from Guánica Bay — where the Lajas Valley drains into the Caribbean Sea — to the pristine marine ecosystems 10 kilometers west in La Parguera Natural Reserve, famous for its bioluminescent waters.

    Endangered elkhorn and mountainous star corals, called “the poster children of Caribbean reef decline” by marine geologist Clark Sherman, live near shore in some of the world’s highest recorded concentrations of now-banned industrial chemicals. Those polychlorinated biphenyls, or PCBs, hinder coral reproduction, growth, feeding and defensive responses, says Sherman, of the University of Puerto Rico–Mayagüez.

    Elkhorn coral (left) and mountainous star coral (right) were once ubiquitous in the Caribbean. Their numbers have dropped greatly due to bleaching and disease. Pollution is partly to blame.  FROM LEFT: NICK HOBGOOD/WIKIMEDIA COMMONS (CC BY-SA 3.0); NOAA FISHERIES

    Half of corals in the Caribbean have died since monitoring began in the 1970s, and pollution is a major cause, according to an April 2020 study in Science Advances. Of particular interest to Cheriton, Sherman and their colleagues was whether the pollution had reached deepwater, or mesophotic, reefs farther offshore, which could be a refuge for coral species that were known to be dying in shallower areas.

    The main artery for this pollution is the Rio Loco — which translates to “Crazy River.” It spews a toxic runoff of eroded sediments from the Lajas Valley’s dirt roads and coffee plantations into Guánica Bay, which supports a vibrant fishing community. Other possible contributors to the pollution — oil spills, a fertilizer plant, sewage and now-defunct sugar mills — are the subject of investigations by public health researchers and the U.S. Environmental Protection Agency.

    In June 2017, the team convened in La Parguera to install underwater sensors to measure and track the currents in this threatened marine environment. From Sherman’s lab on a tiny islet overrun with iguanas the size of house cats, he and Cheriton, along with team leader and USGS research geologist Curt Storlazzi and USGS physical scientist Joshua Logan, launched a boat into choppy seas.

    Marine geologist Clark Sherman dives amid colonies of healthy great star corals, black corals, a large sea fan and a variety of sponges along the steep island shelf of southwest Puerto Rico. Sherman helped investigate whether pollution was reaching these deepwater reefs. E. TUOHY/UNIV. OF PUERTO RICO–MAYAGÜEZ

    At six sites near shore, Storlazzi, Sherman and Logan dove to the seafloor and used epoxy to anchor pressure gauges and batonlike current meters. Together the instruments measured hourly temperature, wave height and current speed. The team then moved farther offshore where the steep island shelf drops off at a 45-degree angle to a depth of 60 meters, but the heavy ocean chop scuttled their efforts to install instruments there.

    In June 2017, research geologist Curt Storlazzi (left) and physical scientist Joshua Logan (right) prepare to dive near Puerto Rico’s Guánica Bay to install instruments for monitoring currents suspected of delivering pollution to coral reefs. USGS

    For help working in the difficult conditions, Sherman enlisted two expert divers for a second attempt: Carlo, the geologist and diving safety officer, and marine scientist Evan Tuohy, both of the University of Puerto Rico–Mayagüez. The two were able to install the most important and largest piece, a hydroacoustic instrument comprising several drums fastened to a metal grid, which tracked the direction and speed of currents every minute using pulsating sound waves. A canister containing temperature and salinity sensors took readings every two minutes. Above this equipment, an electric thermometer extended to within 12 meters of the surface, registering temperature at five-meter vertical intervals every few seconds.

    The instruments installed by Storlazzi, Logan and others collected unexpected underwater ocean observations during Hurricane Maria. An acoustic Doppler current profiler (left) used pulsating sound waves to measure the direction and speed of currents at the shelf break and slope site about 12 kilometers offshore of La Parguera. A Marotte current meter (right) measured wave height, current speed and temperature at six spots close to shore. USGS

    Working in concert, the instruments gave a high-resolution, seafloor-to-surface snapshot of the ocean’s hydrodynamics on a near-continuous basis. The equipment had to sit level on the sloping seafloor so as not to skew the measurements and remain firmly in place. Little did the researchers know that the instruments would soon be battered by one of the most destructive storms in history.

    Becoming Maria

    The word hurricane derives from the Caribbean Taino people’s Huricán, god of evil. Some of the strongest of these Atlantic tropical cyclones begin where scorching winds from the Sahara clash with moist subtropical air over the island nation of Cape Verde off western Africa. The worst of these atmospheric disturbances create severe thunderstorms with giant cumulonimbus clouds that flatten out against the stratosphere. Propelled by the Earth’s rotation, they begin to circle counterclockwise around each other — a phenomenon known as the Coriolis effect.

    Weather conditions that summer had already spawned two monster hurricanes: Harvey and Irma. By late September, the extremely warm sea surface — 29° Celsius or hotter in some places — gave up its heat energy by way of evaporation into Maria’s rushing winds. All hurricanes begin as an area of low pressure, which in turn sucks in more wind, accelerating the rise of hot air, or convection. Countervailing winds known as shear can sometimes topple the cone of moist air spiraling upward. But that didn’t happen, so Maria continued to grow in size and intensity.

    Meteorologists hoped that Maria would lose force as it moved across the Caribbean, weakened by the wake of cooler water Irma had churned up two weeks earlier. Instead, Maria tracked south, steaming toward the eastern Caribbean island of Dominica. Within 15 hours of making landfall, its maximum sustained wind speed doubled, reaching a house-leveling 260 kilometers per hour. That doubling intensified the storm from a milder (still dangerous) Category 1 to a strong Category 5.

    NOAA’s computer forecasting models did not anticipate such rapid intensification. Irma had also raged with unforeseen intensity.

    After striking Dominica hard, Maria’s eyewall broke down, replaced by an outer band of whipping thunderstorms. This slightly weakened Maria to 250 kilometers per hour before it hit Puerto Rico, while expanding the diameter of the storm’s eyewall — the area of strong winds and heaviest precipitation — to 52 kilometers. That’s close to the width of the island.

    Hurricane Maria made landfall on Puerto Rico early in the morning on September 20, 2017, and cut across the island diagonally toward the northwest. Its eyewall generated maximum sustained winds of 250 kilometers per hour and spanned almost the width of the island. CIRA/NOAA

    It’s still not fully understood why Maria suddenly went berserk. Various theories point to the influence of hot towers — convective bursts of heat energy from thunderclouds that punch up into the stratosphere — or deep warm pools, buoyant freshwater eddies spilling out of the Amazon and Orinoco rivers into the Atlantic, where currents carry these pockets of hurricane-fueling heat to the Gulf of Mexico and the Caribbean Sea.

    But even though these smaller-scale events may have a big impact on intensity, they aren’t fully accounted for in weather models, says Hua Leighton, a scientist at the National Oceanic and Atmospheric Administration’s hurricane research division and the University of Miami’s Cooperative Institute for Marine and Atmospheric Studies. Leighton develops forecasting models and investigates rapid intensification of hurricanes.

    “We cannot measure everything in the atmosphere,” Leighton says.

    Without accurate data on all the factors that drive hurricane intensity, computer models can’t easily predict when the catalyzing events will occur, she says. Nor can models account for everything that happens inside the ocean during a hurricane. They don’t have the data.

    Positioning instruments just before a hurricane hits is a major challenge. But NOAA is making progress. It has launched a new generation of hurricane weather buoys in the western North Atlantic and remotely controlled surface sensors called Saildrones that examine the air-sea interface between hurricanes and the ocean (SN: 6/8/19, p. 24).

    Underwater, NOAA uses other drones, or gliders, to profile the vast areas regularly traversed by tropical storms. These gliders collected 13,200 temperature and salinity readings in 2020. By contrast, the instruments that the team set in Puerto Rico’s waters in 2017 collected over 250 million data points, including current velocity and direction — a rare and especially valuable glimpse of hurricane-induced ocean dynamics at a single location.

    A different view

    After the storm passed, Storlazzi was sure the hurricane had destroyed his instruments. They weren’t designed to take that kind of punishment. The devices generally work in much calmer conditions, not the massive swells generated by Maria, which could increase water pressure to a level that would almost certainly crush instrument sensors.

    But remarkably, the instruments were battered but not lost. Sherman, Carlo and Tuohy retrieved them after Maria passed and put them in crates awaiting the research group’s return.

    Milton Carlo (left) and Evan Tuohy (right), shown in an earlier deepwater dive, helped place the current-monitoring instruments at the hard-to-reach sites where hurricane data were collected. MIKE ECHEVARRIA

    When Storlazzi and USGS oceanographer Kurt Rosenberger pried open the instrument casings in January 2018, no water gushed out. Good sign. The electronics appeared intact. And the lithium batteries had powered the rapid-fire sampling enterprise for the entire six-month duration. The researchers quickly downloaded a flood of data, backed it up and started transmitting it to Cheriton, who began sending back plots and graphs of what the readings showed.

    Floodwaters from the massive rains brought by Maria had pushed a whole lot of polluted sediment to the reefs outside Guánica Bay, spiking PCB concentrations and threatening coral health. As of a few months after the storm, the pollution hadn’t reached the deeper reefs.

    Then the researchers realized that their data told another story: what happens underwater during a massive hurricane. They presumed that other researchers had previously captured a profile of the churning ocean depths beneath a hurricane at the edge of a tropical island.

    Remarkably, that was not the case.

    “Nobody’s even measured this, let alone reported it in any published literature,” Cheriton says. The team began to explore the hurricane data not knowing where it might lead.

    “What am I looking at here?” Cheriton kept asking herself as she plotted and analyzed temperature, current velocity and salinity values using computer algorithms. The temperature gradient that showed the ocean’s internal or underwater waves was different than anything she’d seen before.

    Oceanographer Olivia Cheriton realized that data on ocean currents told a new story about Hurricane Maria. O.M. CHERITON

    During the hurricane, the top 20 meters of the Caribbean Sea had consistently remained at or above 26° Celsius, a few degrees warmer than the layers beneath. But the surface waters should have been cooled if, as expected, Maria’s winds had acted like a big spoon, mixing the warm surface with cold water stirred up from the seafloor 50 to 80 meters below. Normally, the cooler surface temperature restricts the heat supply, weakening the hurricane. But the cold water wasn’t reaching the surface.

    To try to make sense of what she was seeing, Cheriton imagined herself inside the data, in a protective bubble on the seafloor with the instruments as Maria swept over. Storlazzi worked alongside her analyzing the data, but focused on the sediments circulating around the coral reefs.

    Cheriton was listening to “An Awesome Wave” by indie-pop band Alt-J and getting goosebumps while the data swirled before them. Drawing on instincts from her undergraduate astronomy training, she focused her mind’s eye on a constellation of data overhead and told Storlazzi to do the same.

    “Look up, Curt!” she said.

    Up at the crest of the island shelf, where the seafloor drops off, the current velocity data revealed a broad stream of water gushing from the shore at almost 1 meter per second, as if from a fire hose. Several hours before Maria arrived, the wind-driven current had reversed direction and was now moving an order of magnitude faster. The rushing surface water thus became a barrier, trapping the cold water beneath it.

    As a result, the surface stayed warm, increasing the force of the hurricane. The cooler layers below then started to pile up vertically into distinct layers, one on top of the other, beneath the gushing waters above.

    Cheriton calculated that, thanks to this fire hose phenomenon, the coastal waters’ contribution to Maria’s intensity in this area was on average 65 percent greater than it would otherwise have been.

    Oceanographer Travis Miles of Rutgers University in New Brunswick, N.J., who was not involved in the research, calls Cheriton and the team’s work a “frontier study” that draws researchers’ attention to near-shore processes. Miles can relate to Cheriton and her team’s accidental hurricane discovery from personal experience: When his water quality–sampling gliders wandered into Hurricane Irene’s path in 2011, they revealed that the ocean off the Jersey Shore had cooled in front of the storm. Irene’s onshore winds had induced seawater mixing across the broad continental shelf and lowered sea surface temperatures.

    The Puerto Rico data show that offshore winds over a steep island shelf produced the opposite effect and should help researchers better understand storm-induced mixing of coastal areas, says NOAA senior scientist Hyun-Sook Kim, who was not involved in the research. It can help with identifying deficiencies in the computer models she relies on when providing guidance to storm-tracking meteorologists at the National Hurricane Center in Miami and the Joint Typhoon Warning Center in Hawaii.

    And the unexpected findings also could help scientists get a better handle on coral reefs and the role they play in protecting coastlines. “The more we study the ocean, especially close to the coast,” Carlo says, “the more we can improve conditions for the coral and the people living on the island.”

  • Walking and slithering aren't as different as you think

    Abrahamic texts treat slithering as a special indignity visited on the wicked serpent, but evolution may draw a more continuous line through the motion of swimming microbes, wriggling worms, skittering spiders and walking horses.
    A new study found that all of these kinds of motion are well represented by a single mathematical model.
    “This didn’t come out of nowhere — this is from our real robot data,” said Dan Zhao, first author of the study in the Proceedings of the National Academy of Sciences and a recent Ph.D. graduate in mechanical engineering at the University of Michigan.
    “Even when the robot looks like it’s sliding, like its feet are slipping, its velocity is still proportional to how quickly it’s moving its body.”
    Unlike the dynamic motion of gliding birds and sharks and galloping horses — where speed is driven, at least in part, by momentum — every bit of speed for ants, centipedes, snakes and swimming microbes is driven by changing the shape of the body. This is known as kinematic motion.
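    As a rough illustration of that proportionality (not the paper’s actual model), kinematic motion means the distance covered per gait cycle is fixed by the shape trajectory, so average speed scales linearly with how fast the body cycles through its shapes; the stride gain below is an invented constant.
    ```python
    # Illustrative sketch: in kinematic locomotion, each gait cycle advances
    # the body a fixed distance, so speed is proportional to shape-change rate.

    STRIDE_GAIN = 0.02  # hypothetical meters traveled per gait cycle

    def kinematic_speed(shape_change_rate: float) -> float:
        """Average speed (m/s) given the gait frequency (cycles/s)."""
        return STRIDE_GAIN * shape_change_rate

    for hz in (0.5, 1.0, 2.0):  # doubling gait frequency doubles speed
        print(hz, kinematic_speed(hz))
    ```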
    The expanded understanding of kinematic motion could change the way roboticists think about programming many-limbed robots, opening new possibilities for walking planetary rovers, for instance.

  • Machine learning shows links between bacterial population growth and environment

    Microbial populations may be small but they are surprisingly complex, making interactions with their surrounding environment difficult to study. But now, researchers from Japan have discovered that machine learning can provide the tools to do just that. In a study published this month in eLife, researchers from the University of Tsukuba have revealed that machine learning can be applied to bacterial population growth to discover how it relates to variations in their environment.
    The dynamics of microbial populations are usually represented by growth curves. Typically, three parameters taken from these curves are used to evaluate how well microbial populations fit their environment: lag time, growth rate, and saturated population size (or carrying capacity). These three parameters are probably linked: trade-offs between growth rate and either lag time or population size have been observed within species, and correlated changes in saturated population size and growth rate have been seen among genetically diverse strains.
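    The three parameters are easiest to picture on a fitted curve. Below is a generic sketch, not the study’s own fitting procedure, that extracts them from a toy logistic growth curve; the lag-time formula is one common tangent-line approximation.
    ```python
    # Fit a logistic model to simulated growth data and read off the three
    # standard parameters: lag time, growth rate and carrying capacity.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, K, r, t_mid):
        # K = carrying capacity, r = max growth rate, t_mid = curve midpoint
        return K / (1.0 + np.exp(-r * (t - t_mid)))

    t = np.linspace(0, 24, 49)                   # hours
    od = logistic(t, K=1.2, r=0.6, t_mid=8.0)    # simulated optical density
    od += np.random.default_rng(0).normal(0, 0.01, t.size)  # measurement noise

    (K, r, t_mid), _ = curve_fit(logistic, t, od, p0=(1.0, 0.5, 6.0))
    lag = t_mid - 2.0 / r  # tangent at t_mid extrapolated back to zero density
    print(f"lag ~ {lag:.1f} h, growth rate ~ {r:.2f}/h, capacity ~ {K:.2f}")
    ```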
    “Two questions remained: are these three parameters affected by environmental diversity, and if so, how?” says senior author of the study, Professor Bei-Wen Ying. “To answer these, we used data-driven approaches to investigate the growth strategy of bacteria.”
    The researchers built a large dataset that reflected the dynamics of Escherichia coli populations under a wide variety of environmental conditions, using almost a thousand combinations of growth media composed from 44 chemical compounds under controlled lab conditions. They then used machine learning (ML) to analyze this big dataset for relationships between the growth parameters and the media combinations. ML algorithms build a model from sample data to make predictions or decisions without being explicitly programmed to do so.
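    A hedged sketch of what such an ML step can look like, with synthetic data standing in for the real growth measurements and a random forest standing in for the authors’ models:
    ```python
    # Learn a mapping from medium composition (44 compound concentrations)
    # to one growth parameter, then inspect which compounds matter most.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 1, size=(1000, 44))   # ~1,000 media x 44 compounds
    y = 0.8 * X[:, 0] + 0.3 * X[:, 5] + rng.normal(0, 0.05, 1000)  # toy rate

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("held-out R^2:", round(model.score(X_te, y_te), 2))

    # Feature importances hint at the "decision-making components": the
    # compounds with the most influence on the predicted parameter.
    print("top compounds:", np.argsort(model.feature_importances_)[::-1][:3])
    ```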
    The analysis revealed that for bacterial growth, the decision-making components were distinct among different growth phases, e.g., serine, sulfate, and glucose for growth delay (lag), growth rate, and maximum growth (saturation), respectively. The results of additional simulations and analyses showed that branched-chain amino acids likely act as ubiquitous coordinators for bacterial population growth conditions.
    “Our results also revealed a common and simple strategy of risk diversification in conditions where the bacteria experienced excess resources or starvation, which makes sense in both an evolutionary and ecological context,” says Professor Ying.
    The results of this study have revealed that exploring the world of microorganisms with data-driven approaches can provide new insights that were previously unattainable via traditional biological experiments. This research shows that the ML-assisted approach, although still an emerging technology that will need to be developed in terms of its biological reliability and accessibility, could open new avenues for applications in the life sciences, especially microbiology and ecology.
    The study was funded by the Japan Society for the Promotion of Science (grants 21K19815 and 19H03215).
    Story Source:
    Materials provided by University of Tsukuba. Note: Content may be edited for style and length.

  • Scientists develop model that adjusts videogame difficulty based on player emotions

    Appropriately balancing a videogame’s difficulty is essential to provide players with a pleasant experience. In a recent study, Korean scientists developed a novel approach for dynamic difficulty adjustment where the players’ emotions are estimated using in-game data, and the difficulty level is tweaked accordingly to maximize player satisfaction. Their efforts could contribute to balancing the difficulty of games and making them more appealing to all types of players.
    Difficulty is a tough aspect to balance in video games. Some people prefer videogames that present a challenge whereas others enjoy an easy experience. To make this process easier, most developers use ‘dynamic difficulty adjustment (DDA).’ The idea of DDA is to adjust the difficulty of a game in real time according to player performance. For example, if player performance exceeds the developer’s expectations for a given difficulty level, the game’s DDA agent can automatically raise the difficulty to increase the challenge presented to the player. Though useful, this strategy is limited in that only player performance is taken into account, not how much fun they are actually having.
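    A minimal sketch of that conventional, performance-only DDA loop (thresholds and difficulty scale invented for illustration):
    ```python
    # Nudge difficulty whenever observed performance leaves the band the
    # designers intended for the current level.
    def adjust_difficulty(level: int, win_rate: float,
                          target=(0.4, 0.6), lo=1, hi=10) -> int:
        if win_rate > target[1]:   # player doing better than intended
            return min(level + 1, hi)
        if win_rate < target[0]:   # player struggling
            return max(level - 1, lo)
        return level               # performance within the intended band

    print(adjust_difficulty(5, 0.75))  # -> 6
    ```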
    In a recent study published in Expert Systems With Applications, a research team from the Gwangju Institute of Science and Technology in Korea decided to put a twist on the DDA approach. Instead of focusing on the player’s performance, they developed DDA agents that adjusted the game’s difficulty to maximize one of four different aspects related to a player’s satisfaction: challenge, competence, flow, and valence. The DDA agents were trained via machine learning using data gathered from actual human players, who played a fighting game against various artificial intelligences (AIs) and then answered a questionnaire about their experience.
    Using an algorithm called Monte-Carlo tree search, each DDA agent employed actual game data and simulated data to tune the opposing AI’s fighting style in a way that maximized a specific emotion, or ‘affective state.’ “One advantage of our approach over other emotion-centered methods is that it does not rely on external sensors, such as electroencephalography,” comments Associate Professor Kyung-Joong Kim, who led the study. “Once trained, our model can estimate player states using in-game features only.”
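    The sketch below caricatures that loop under stated assumptions: estimate_flow stands in for the trained affect model, simulate_round stands in for game rollouts, and plain Monte-Carlo averaging replaces the full Monte-Carlo tree search used in the study.
    ```python
    # Pick the opponent-AI setting whose simulated rounds maximize an
    # estimated affective state (here, a toy "flow" score).
    import random

    def estimate_flow(features: dict) -> float:
        """Stand-in for the learned affect model (trained on questionnaires)."""
        # Toy rule: flow peaks when the match is close and the pace moderate.
        return 1.0 - abs(features["hp_diff"]) - 0.3 * abs(features["pace"] - 0.5)

    def simulate_round(ai_aggression: float) -> dict:
        """Stand-in for one simulated round against an AI of a given style."""
        hp_diff = random.gauss(ai_aggression - 0.5, 0.2)  # >0 means AI winning
        return {"hp_diff": hp_diff, "pace": ai_aggression}

    def pick_opponent_style(candidates, rollouts=200):
        def avg_flow(a):
            return sum(estimate_flow(simulate_round(a))
                       for _ in range(rollouts)) / rollouts
        return max(candidates, key=avg_flow)

    random.seed(0)
    print(pick_opponent_style([0.2, 0.4, 0.6, 0.8]))
    ```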
    The team verified — through an experiment with 20 volunteers — that the proposed DDA agents could produce AIs that improved the players’ overall experience, no matter their preference. This marks the first time that affective states are incorporated directly into DDA agents, which could be useful for commercial games. “Commercial game companies already have huge amounts of player data. They can exploit these data to model the players and solve various issues related to game balancing using our approach,” remarks Associate Professor Kim. Worth noting is that this technique also has potential for other fields that can be ‘gamified,’ such as healthcare, exercise, and education.
    This paper was made available online on June 3, 2022, and will be published in Volume 205 of the journal on November 1, 2022.
    Story Source:
    Materials provided by GIST (Gwangju Institute of Science and Technology). Note: Content may be edited for style and length.

  • Analyzing the potential of AlphaFold in drug discovery

    Over the past few decades, very few new antibiotics have been developed, largely because current methods for screening potential drugs are prohibitively expensive and time-consuming. One promising new strategy is to use computational models, which offer a potentially faster and cheaper way to identify new drugs.
    A new study from MIT reveals the potential and limitations of one such computational approach. Using protein structures generated by an artificial intelligence program called AlphaFold, the researchers explored whether existing models could accurately predict the interactions between bacterial proteins and antibacterial compounds. If so, then researchers could begin to use this type of modeling to do large-scale screens for new compounds that target previously untargeted proteins. This would enable the development of antibiotics with unprecedented mechanisms of action, a task essential to addressing the antibiotic resistance crisis.
    However, the researchers, led by James Collins, the Termeer Professor of Medical Engineering and Science in MIT’s Institute for Medical Engineering and Science (IMES) and Department of Biological Engineering, found that these existing models did not perform well for this purpose. In fact, their predictions performed little better than chance.
    “Breakthroughs such as AlphaFold are expanding the possibilities for in silico drug discovery efforts, but these developments need to be coupled with additional advances in other aspects of modeling that are part of drug discovery efforts,” Collins says. “Our study speaks to both the current abilities and the current limitations of computational platforms for drug discovery.”
    In their new study, the researchers were able to improve the performance of these types of models, known as molecular docking simulations, by applying machine-learning techniques to refine the results. However, more improvement will be necessary to fully take advantage of the protein structures provided by AlphaFold, the researchers say.
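    A hedged sketch of that rescoring idea: a classifier trained on known actives and inactives treats the raw docking score as just one feature among others. The data, features and model here are synthetic stand-ins, not the paper’s.
    ```python
    # Rescore docked compounds with a classifier instead of trusting the
    # raw docking score alone.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(2)
    n = 500
    docking = rng.normal(-7, 1.5, n)        # kcal/mol; more negative = better
    extras = rng.normal(0, 1, (n, 3))       # e.g., size/charge/logP proxies
    X = np.column_stack([docking, extras])
    y = (docking + extras[:, 0] + rng.normal(0, 1, n) < -7).astype(int)

    clf = GradientBoostingClassifier(random_state=0).fit(X[:400], y[:400])
    print("rescored hit probability, first held-out compound:",
          clf.predict_proba(X[400:401])[0, 1].round(2))
    ```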
    Collins is the senior author of the study, which appears today in the journal Molecular Systems Biology. MIT postdocs Felix Wong and Aarti Krishnan are the lead authors of the paper.

  • A novel approach to creating tailored odors and fragrances using machine learning

    The sense of smell is one of the basic senses of animal species. It is critical for finding food, sensing attraction, and detecting danger. Humans detect smells, or odorants, with olfactory receptors expressed in olfactory nerve cells. These olfactory impressions of odorants on nerve cells are associated with their molecular features and physicochemical properties, which makes it possible to tailor odors to create an intended odor impression. Current methods can predict olfactory impressions from the physicochemical features of odorants, but they cannot work backward to predict the sensing data needed to create a smell.
    To tackle this issue, scientists from Tokyo Institute of Technology (Tokyo Tech) employed the innovative strategy of solving the inverse problem. Instead of predicting the smell from molecular data, this method predicts molecular features based on the odor impression. This is achieved using standard mass spectrum data and machine learning (ML) models. “We used a machine-learning-based odor predictive model that we had previously developed to obtain the odor impression. Then we predicted the mass spectrum from odor impression inversely based on the previously developed forward model,” explains Professor Takamichi Nakamoto, who led the research effort at Tokyo Tech. The findings have been published in PLoS One.
    The mass spectrum of an odor mixture is obtained as a linear combination of the mass spectra of its single components. This simple method allows for the quick preparation of predicted spectra of odor mixtures and can also predict the required mixing ratio, an important part of the recipe for new odor preparation. “For example, we show which molecules give the mass spectrum of apple flavor with enhanced ‘fruit’ and ‘sweet’ impressions. Our analysis method shows that combinations of either 59 or 60 molecules give the same mass spectrum as the one obtained from the specified odor impression. With this information, and the correct mixing ratio needed for a certain impression, we could theoretically prepare the desired scent,” highlights Prof. Nakamoto.
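    Because the mixture spectrum is a non-negative linear combination, the mixing ratio can be recovered with non-negative least squares. A toy sketch with illustrative numbers rather than real spectra:
    ```python
    # Recover the mixing ratio that reproduces a target mass spectrum.
    import numpy as np
    from scipy.optimize import nnls

    # Columns: toy mass spectra of three single odorant components.
    components = np.array([
        [0.9, 0.1, 0.0],
        [0.1, 0.8, 0.2],
        [0.0, 0.1, 0.7],
        [0.3, 0.0, 0.1],
    ])

    true_ratio = np.array([0.5, 0.3, 0.2])
    target_spectrum = components @ true_ratio   # spectrum of the desired odor

    ratio, residual = nnls(components, target_spectrum)
    print("recovered mixing ratio:", ratio.round(2))  # ~ [0.5, 0.3, 0.2]
    ```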
    The novel method described in this study can provide highly accurate predictions of the physicochemical properties of odor mixtures, as well as the mixing ratios required to prepare them, thereby opening the door to endless tailor-made fragrances.
    Story Source:
    Materials provided by Tokyo Institute of Technology. Note: Content may be edited for style and length.

  • Robo-bug: A rechargeable, remote-control cyborg cockroach

    An international team led by researchers at the RIKEN Cluster for Pioneering Research (CPR) has engineered a system for creating remote-controlled cyborg cockroaches, equipped with a tiny wireless control module powered by a rechargeable battery attached to a solar cell. Despite these mechanical devices, ultrathin electronics and flexible materials allow the insects to move freely. These achievements, reported in the scientific journal npj Flexible Electronics on September 5, will help make the use of cyborg insects a practical reality.
    Researchers have been trying to design cyborg insects — part insect, part machine — to help inspect hazardous areas or monitor the environment. However, for the use of cyborg insects to be practical, handlers must be able to control them remotely for long periods of time. This requires wireless control of their leg segments, powered by a tiny rechargeable battery. Keeping the battery adequately charged is fundamental — nobody wants a suddenly out-of-control team of cyborg cockroaches roaming around. While it’s possible to build docking stations for recharging the battery, the need to return and recharge could disrupt time-sensitive missions. Therefore, the best solution is to include an on-board solar cell that can continuously ensure that the battery stays charged.
    All of this is easier said than done. Successfully integrating these devices into a cockroach, which has limited surface area, required the research team to develop a special backpack, ultrathin organic solar cell modules, and an adhesion system that keeps the machinery attached for long periods of time while still allowing natural movements.
    Led by Kenjiro Fukuda, RIKEN CPR, the team experimented with Madagascar cockroaches, which are approximately 6 cm long. They attached the wireless leg-control module and lithium polymer battery to the top of the insect on the thorax using a specially designed backpack, which was modeled after the body of a model cockroach. The backpack was 3D printed with an elastic polymer and conformed perfectly to the curved surface of the cockroach, allowing the rigid electronic device to be stably mounted on the thorax for more than a month.
    The ultrathin 0.004 mm thick organic solar cell module was mounted on the dorsal side of the abdomen. “The body-mounted ultrathin organic solar cell module achieves a power output of 17.2 mW, which is more than 50 times larger than the power output of current state-of-the-art energy harvesting devices on living insects,” according to Fukuda.
    The ultrathin and flexible organic solar cell, and how it was attached to the insect, proved necessary to ensure freedom of movement. After carefully examining natural cockroach movements, the researchers realized that the abdomen changes shape and portions of the exoskeleton overlap. To accommodate this, they interleaved adhesive and non-adhesive sections onto the films, which allowed them to bend but also stay attached. When thicker solar cell films were tested, or when the films were uniformly attached, the cockroaches took twice as long to run the same distance, and had difficulty righting themselves when on their backs.
    Once these components were integrated into the cockroaches, along with wires that stimulate the leg segments, the new cyborgs were tested. The battery was charged with pseudo-sunlight for 30 minutes, and animals were made to turn left and right using the wireless remote control.
    “Considering the deformation of the thorax and abdomen during basic locomotion, a hybrid electronic system of rigid and flexible elements in the thorax and ultrasoft devices in the abdomen appears to be an effective design for cyborg cockroaches,” says Fukuda. “Moreover, since abdominal deformation is not unique to cockroaches, our strategy can be adapted to other insects like beetles, or perhaps even flying insects like cicadas in the future.”
    Story Source:
    Materials provided by RIKEN. Note: Content may be edited for style and length.