More stories

  • 'Nanomagnetic' computing can provide low-energy AI

    Researchers have shown it is possible to perform artificial intelligence using tiny nanomagnets that interact like neurons in the brain.
    The new method, developed by a team led by Imperial College London researchers, could slash the energy cost of artificial intelligence (AI), which is currently doubling globally every 3.5 months.
    In a paper published today in Nature Nanotechnology, the international team have produced the first proof that networks of nanomagnets can be used to perform AI-like processing. The researchers showed nanomagnets can be used for ‘time-series prediction’ tasks, such as predicting and regulating insulin levels in diabetic patients.
    Artificial intelligence that uses ‘neural networks’ aims to replicate the way parts of the brain work, where neurons talk to each other to process and retain information. A lot of the maths used to power neural networks was originally invented by physicists to describe the way magnets interact, but at the time it was too difficult to use magnets directly as researchers didn’t know how to put data in and get information out.
    Instead, software run on traditional silicon-based computers was used to simulate the magnet interactions, in turn simulating the brain. Now, the team have been able to use the magnets themselves to process and store data — cutting out the middleman of the software simulation and potentially offering enormous energy savings.
    Nanomagnetic states
    Nanomagnets can occupy various ‘states’, depending on the direction of their magnetisation. Applying a magnetic field to a network of nanomagnets changes the states of the magnets depending both on the properties of the input field and on the states of the surrounding magnets.
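    Processing of this kind is usually framed as physical reservoir computing: the interacting magnets transform an input signal into a rich set of measurable states, and only a simple linear readout is trained on those measurements. The sketch below illustrates that readout step with a synthetic stand-in for the nanomagnet array; the random features, window length and ridge regression are illustrative assumptions, not details taken from the Nature Nanotechnology paper.
```python
# Readout step of a physical-reservoir-style predictor. The nanomagnet array is
# replaced by a synthetic stand-in: in practice the feature matrix would hold
# measured magnetic responses to the driving field.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Input signal to be predicted one step ahead (standing in for, say, a glucose trace).
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t) + 0.5 * np.sin(2.3 * t)

# Stand-in "reservoir": a fixed random nonlinear mixing of the recent input,
# playing the role of the measured nanomagnet states.
window, n_features = 5, 50
W_in = rng.normal(size=(n_features, window))
X = np.stack([u[i - window:i] for i in range(window, len(u) - 1)])  # input history
states = np.tanh(X @ W_in.T)                                        # "magnet states"
target = u[window:-1]                                               # next value of the signal

# Only this linear readout is trained; the "reservoir" itself is never adjusted.
split = 1500
readout = Ridge(alpha=1e-3).fit(states[:split], target[:split])
pred = readout.predict(states[split:])
print("test RMSE:", np.sqrt(np.mean((pred - target[split:]) ** 2)))
```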

  • How much does eating meat affect nations’ greenhouse gas emissions?

    The food we eat is responsible for an astounding one-third of global greenhouse gas emissions caused by human activities, according to two comprehensive studies published in 2021.

    “When people talk about food systems, they always think about the cow in the field,” says statistician Francesco Tubiello, lead author of one of the reports, appearing in last June’s Environmental Research Letters. True, cows are a major source of methane, which, like other greenhouse gases, traps heat in the atmosphere. But methane, carbon dioxide and other planet-warming gases are released from several other sources along the food production chain.

    Before 2021, scientists like Tubiello, of the Food and Agriculture Organization of the United Nations, were well aware that agriculture and related land use changes made up roughly 20 percent of the planet’s greenhouse gas emissions. Such land use changes include cutting down forests to make way for cattle grazing and pumping groundwater to flood fields for the sake of agriculture.

    But new modeling techniques used by Tubiello and colleagues, plus a study by a group at the European Commission that Tubiello collaborated with, brought to light another big driver of emissions: the food supply chain. All the steps that take food from the farm to our plates to the landfill — transportation, processing, cooking and food waste — push food-related emissions up from 20 percent to 33 percent.

    To slow climate change, the foods we eat deserve major attention, just like fossil fuel burning, says Amos Tai, an environmental scientist at the Chinese University of Hong Kong. The fuller picture of food-related emissions demonstrates that the world needs to make drastic changes to the food system if we are to reach international goals for reducing global warming.

    Change from developing countries

    Scientists have gained a clearer understanding of global human-related emissions in recent years through databases like EDGAR, or Emissions Database for Global Atmospheric Research, developed by the European Union. The database covers emissions-producing human activities in every country, from energy production to landfill waste, from 1970 to the present. EDGAR uses a unified methodology to calculate emissions for all economic sectors, says Monica Crippa, a scientific officer at the European Commission’s Joint Research Centre.

    Crippa and colleagues, with help from Tubiello, built a companion database of food system–related emissions called EDGAR-FOOD. Using that database, the researchers arrived at the same one-third estimate as Tubiello’s group.

    Crippa’s team’s calculations, reported in Nature Food in March 2021, split food system emissions into four broad categories: land (including both agriculture and related land use changes), energy (used for producing, processing, packaging and transporting goods), industry (including the production of chemicals used in farming and materials used to package food) and waste (from unused food).

    The land sector is the biggest culprit in food system emissions, Crippa says, accounting for about 70 percent of the global total. But the picture looks different from nation to nation. The United States and other developed countries rely on highly centralized megafarms for much of their food production, so the energy, industry and waste categories make up more than half of these countries’ food system emissions.

    In developing countries, agriculture and changing land use are far greater contributors. Emissions in historically less developed countries have also been rising in the last 30 years, as these countries have cut down wild areas to make way for industrial farming and started eating more meat, another major contributor to emissions with impacts across all four categories.

    As a result, agriculture and related landscape shifts have driven major increases in food system emissions among developing countries in recent decades, while emissions in developed countries have not grown.

    For instance, China’s food emissions shot up by almost 50 percent from 1990 to 2018, largely due to a rise in meat-eating, according to the EDGAR-FOOD database. In 1980, the average Chinese person ate about 30 grams of meat a day, Tai says. In 2010, the average person in China ate almost five times as much, or just under 150 grams of meat a day.

    Top-emitting economies

    In recent years, Crippa says, six economies, the top emitters, have been responsible for more than half of total global food emissions. These economies, in order, are China, Brazil, the United States, India, Indonesia and the European Union. The immense populations of China and India help drive their high numbers. Brazil and Indonesia make the list because large swaths of their rainforests have been cut down to make room for farming. When those trees come down, vast amounts of carbon flow into the atmosphere (SN: 7/3/21 & 7/17/21, p. 24).

    The United States and the European Union are on the list because of heavy meat consumption. In the United States, meat and other animal products contribute the vast majority of food-related emissions, says Richard Waite, a researcher at the World Resources Institute’s food program in Washington, D.C.

    Waste is also a huge issue in the United States: More than one-third of food produced never actually gets eaten, according to a 2021 report from the U.S. Environmental Protection Agency. When food goes uneaten, the resources used to produce, transport and package it are wasted. Plus, the uneaten food goes into landfills, which produce methane, carbon dioxide and other gases as the food decomposes.

    Meat consumption drives emissions

    Climate advocates who want to reduce food emissions often focus on meat consumption, as animal products lead to far greater emissions than plants. Animal production uses more land than plant production, and “meat production is heavily inefficient,” Tai says.

    “If we eat 100 calories of grain, like maize or soybeans, we get that 100 calories,” he explains. All the energy from the food is delivered directly to the person who eats it. But if that 100 calories’ worth of grain is instead fed to a cow or a pig, only about one-tenth of the energy ends up reaching the person who eats the animal once it is slaughtered and processed for food.
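
    A quick back-of-envelope calculation makes the same point; the one-tenth figure below is taken directly from the quote above.
```python
# Back-of-envelope version of the calorie-efficiency argument: feeding grain to
# livestock delivers only a fraction of the original calories to the eventual eater.
grain_calories = 100        # calories of maize or soybeans
feed_conversion = 0.10      # roughly one-tenth reaches the consumer via meat (per the quote)

print("eaten directly:", grain_calories, "calories")
print("eaten as meat: ", grain_calories * feed_conversion, "calories")
```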

    Methane production from “the cow in the field” is another part of meat’s climate toll: Cows release this gas via their manure, burps and flatulence. Methane traps more heat per ton emitted than carbon dioxide, Tubiello says. So emissions from cattle farms can have an outsize impact (SN: 11/28/15, p. 22). These livestock emissions account for about one-third of global methane emissions, according to a 2021 U.N. report.

    Shifting from meats to plants

    U.S. residents should consider how they can shift to what Brent Kim calls “plant-forward” diets. “Plant-forward doesn’t mean vegan. It means reducing animal product intake, and increasing the share of plant foods that are on the plate,” says Kim, program officer at the Johns Hopkins Center for a Livable Future.

    Kim and colleagues estimated food emissions by diet and food group for 140 countries and territories, using a modeling framework similar to EDGAR-FOOD. However, their framework includes only food production emissions (i.e., agriculture and land use), not processing, transportation and other pieces of the food system incorporated in EDGAR-FOOD.

    Producing the average U.S. resident’s diet generates more than 2,000 kilograms of greenhouse gas emissions per year, the researchers reported in 2020 in Global Environmental Change. The group measured emissions in terms of “CO2 equivalents,” a standardized unit allowing for direct comparisons between CO2 and other greenhouse gases like methane.

    Going meatless one day a week brings that figure down to about 1,600 kilograms of CO2 equivalents per year, per person. Going vegan — a diet without any meat, dairy or other animal products — cuts it by 87 percent, to under 300 kilograms. Even going two-thirds vegan offers a sizable drop, to 740 kilograms of CO2 equivalents.

    Kim’s modeling also offers a “low food chain” option, which brings emissions down to about 300 kilograms of CO2 equivalents per year, per person. Eating low on the food chain combines a mostly plant-based diet with animal products that come from more climate-friendly sources that do not disturb ecological systems. Examples include insects, smaller fish like sardines, and oysters and other mollusks.
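
    For comparison, the per-person figures quoted above can be tabulated directly. The vegan value below is derived from the quoted 87 percent reduction (consistent with “under 300”); the rest are the rounded numbers given in the article.
```python
# Per-person diet footprints quoted above (kg CO2-equivalent per year), with the
# average U.S. diet as baseline. The vegan figure is derived from the quoted 87%
# reduction; "low food chain" is approximate.
diets_kg_co2e = {
    "average U.S. diet":       2000,
    "meatless one day a week": 1600,
    "two-thirds vegan":         740,
    "low food chain":           300,
    "vegan":                    260,
}

baseline = diets_kg_co2e["average U.S. diet"]
for diet, kg in diets_kg_co2e.items():
    cut = 100 * (baseline - kg) / baseline
    print(f"{diet:24s} {kg:5d} kg CO2e/yr  ({cut:3.0f}% below baseline)")
```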

    Tai agrees that not everybody needs to become a vegetarian or vegan to save the planet, as meat can have important cultural and nutritional value. If you want to “start from the biggest polluter,” he says, focus on cutting beef consumption.

    But enough people need to make these changes to “send a signal back to the market” that consumers want more plant-based options, Tubiello says. Policy makers at the federal, state and local levels can also encourage climate-friendly farming practices, reduce food waste in government operations and take other actions to cut down the resources used in food production, Waite says.

    For example, the World Resources Institute, where Waite works, is part of an initiative called the Cool Food Pledge, in which companies, universities and city governments have signed on to reduce the climate impacts of the food they serve. The institutions agree to track the food they purchase every year to ensure they are progressing toward their goals, Waite says.

    Developed countries like the United States — which have been heavy meat consumers for decades — can have a big impact by changing food choices. Indeed, a paper published in Nature Food in January shows that if the populations of 54 high-income nations switched to a plant-focused diet, annual emissions from these countries’ agricultural production could drop by more than 60 percent.

  • Bye, bye, biopsy? Handheld device could painlessly identify skin cancers

    Skin biopsies are no fun: doctors carve away small lumps of tissue for laboratory testing, leaving patients with painful wounds that can take weeks to heal. That’s a price worth paying if it enables early cancer treatment. However, in recent years, aggressive diagnostic efforts have seen the number of biopsies grow around four times faster than the number of cancers detected, with about 30 benign lesions now biopsied for every case of skin cancer that’s found.
    Researchers at Stevens Institute of Technology are now developing a low-cost handheld device that could cut the rate of unnecessary biopsies in half and give dermatologists and other frontline physicians easy access to laboratory-grade cancer diagnostics. “We aren’t trying to get rid of biopsies,” said Negar Tavassolian, director of the Bio-Electromagnetics Laboratory at Stevens. “But we do want to give doctors additional tools and help them to make better decisions.”
    The team’s device uses millimeter-wave imaging — the same technology used in airport security scanners — to scan a patient’s skin. (In earlier work, Tavassolian and her team had to work with already biopsied skin for the device to detect if it was cancerous.)
    Healthy tissue reflects millimeter-wave rays differently than cancerous tissue, so it’s theoretically possible to spot cancers by monitoring contrasts in the rays reflected back from the skin. To bring that approach into clinical practice, the researchers used algorithms to fuse signals captured by multiple antennas into a single ultrahigh-bandwidth image, reducing noise and quickly capturing high-resolution images of even the tiniest mole or blemish.
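    Much of the benefit of fusing several antennas’ signals comes from simple averaging statistics: uncorrelated noise shrinks roughly as the square root of the number of measurements combined. The toy sketch below shows only that averaging principle, not the team’s actual image-reconstruction algorithm.
```python
# Why combining antennas helps: coherently averaging N noisy measurements of the
# same reflection suppresses uncorrelated noise by roughly sqrt(N).
import numpy as np

rng = np.random.default_rng(1)
true_reflection = 1.0 + 0.5j      # complex reflection coefficient of one skin patch
n_antennas = 16
noise = 0.3 * (rng.normal(size=n_antennas) + 1j * rng.normal(size=n_antennas))
measurements = true_reflection + noise

single = measurements[0]          # one antenna alone
fused = measurements.mean()       # all antennas averaged coherently

print("error, single antenna:   ", abs(single - true_reflection))
print("error, 16 antennas fused:", abs(fused - true_reflection))
```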
    Spearheaded by Amir Mirbeik Ph.D. ’18, the team used a tabletop version of their technology to examine 71 patients during real-world clinical visits, and found their methods could accurately distinguish benign and malignant lesions in just a few seconds. Using their device, Tavassolian and Mirbeik could identify cancerous tissue with 97% sensitivity and 98% specificity — a rate competitive with even the best hospital-grade diagnostic tools.
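    For readers unfamiliar with those metrics: sensitivity is the fraction of malignant lesions correctly flagged, and specificity is the fraction of benign lesions correctly cleared. The counts in the short example below are invented purely to reproduce the quoted percentages; only the definitions are standard.
```python
# Illustrative confusion-matrix counts chosen to match the quoted figures.
true_positives  = 97   # malignant lesions correctly flagged
false_negatives = 3    # malignant lesions missed
true_negatives  = 98   # benign lesions correctly cleared
false_positives = 2    # benign lesions wrongly flagged

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```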
    “There are other advanced imaging technologies that can detect skin cancers, but they’re big, expensive machines that aren’t available in the clinic,” said Tavassolian, whose work appears in the March 23 issue of Scientific Reports. “We’re creating a low-cost device that’s as small and as easy to use as a cellphone, so we can bring advanced diagnostics within reach for everyone.”
    Because the team’s technology delivers results in seconds, it could one day be used instead of a magnifying dermatoscope in routine checkups, giving extremely accurate results almost instantly. “That means doctors can integrate accurate diagnostics into routine checkups, and ultimately treat more patients,” said Tavassolian.
    Unlike the radiation used in many other imaging methods, millimeter-wave rays penetrate harmlessly about 2 mm into human skin, so the team’s imaging technology provides a clear 3D map of scanned lesions. Future improvements to the algorithm powering the device could significantly improve mapping of lesion margins, enabling more precise and less invasive biopsies of malignant lesions.
    The next step is to pack the team’s diagnostic kit onto an integrated circuit, a step that could soon allow functional handheld millimeter-wave diagnostic devices to be produced for as little as $100 apiece — a fraction of the cost of existing hospital-grade diagnostic equipment. The team is already working to commercialize its technology and hopes to start putting devices in clinicians’ hands within the next two years.
    “The path forward is clear, and we know what we need to do,” said Tavassolian. “After this proof of concept, we need to miniaturize our technology, bring the price down, and bring it to the market.”
    Story Source:
    Materials provided by Stevens Institute of Technology. Note: Content may be edited for style and length.

  • The quest for an ideal quantum bit

    New qubit platform could transform quantum information science and technology.
    You are no doubt viewing this article on a digital device whose basic unit of information is the bit, either 0 or 1. Scientists worldwide are racing to develop a new kind of computer based on quantum bits, or qubits.
    In a recent Nature paper, a team led by the U.S. Department of Energy’s (DOE) Argonne National Laboratory has announced the creation of a new qubit platform formed by freezing neon gas into a solid at very low temperatures, spraying electrons from a light bulb’s filament onto the solid, and trapping a single electron there. This system shows great promise to be developed into ideal building blocks for future quantum computers.
    To realize a useful quantum computer, the quality requirements for the qubits are extremely demanding. While there are various forms of qubits today, none of them is ideal.
    What would make an ideal qubit? It would have at least three sterling qualities, according to Dafei Jin, an Argonne scientist and the principal investigator of the project.
    It can remain in a simultaneous 0 and 1 state (think of Schrödinger’s cat) for a long time. Scientists call this a long ‘coherence’ time. Ideally, that time would be around a second, a length of time we can perceive on a home clock in our daily life.
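    In standard textbook notation (not specific to the electron-on-neon platform), that “simultaneous 0 and 1” state is a superposition of the two basis states, and coherence describes how long the superposition survives before noise destroys it:
```latex
% Generic qubit superposition; alpha and beta are complex amplitudes.
\[
  |\psi\rangle \;=\; \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^{2} + |\beta|^{2} = 1 .
\]
```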

  • Deep learning model to predict adverse drug-drug interactions

    Prescribing multiple drugs, known as polypharmacy, is often recommended for the treatment of complex diseases. However, upon ingestion, multiple drugs may interact in an undesirable manner, resulting in severe adverse effects or decreased clinical efficacy. Early detection of such drug-drug interactions (DDIs) is therefore essential to prevent patients from experiencing adverse effects.
    Currently, computational models and neural network-based algorithms examine prior records of known drug interactions and identify the structures and side effects they are associated with. These approaches assume that similar drugs have similar interactions and identify drug combinations associated with similar adverse effects.
    Although understanding the mechanisms of DDIs at a molecular level is essential to predict their undesirable effects, current models rely on structures and properties of drugs, with predictive range limited to previously observed interactions. They do not consider the effect of DDIs on genes and cell functionality.
    To address these limitations, Associate Professor Hojung Nam and Ph.D. candidate Eunyoung Kim from the Gwangju Institute of Science and Technology in South Korea developed a deep learning-based model to predict DDIs based on drug-induced gene expression signatures. These findings were published in the Journal of Cheminformatics on March 4, 2022.
    The DeSIDE-DDI model consists of two parts: a feature generation model and a DDI prediction model. The feature generation model predicts a drug’s effect on gene expression by considering both the structure and properties of the drug while the DDI prediction model predicts various side effects resulting from drug combinations.
    Describing the key features of this model, Prof. Nam explains, “Our model considers the effects of drugs on genes by utilizing gene expression data, providing an explanation for why a certain pair of drugs cause DDIs. It can predict DDIs for currently approved drugs as well as for novel compounds. This way, the threats of polypharmacy can be resolved before new drugs are made available to the public.”
    What’s more, since not all compounds have drug-treated gene expression signatures available, the model uses a pre-trained compound generation model to generate expected drug-treated gene expressions.
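    Schematically, the two-part design can be pictured as two networks chained together: the first maps a drug’s structural fingerprint to a predicted gene expression signature, and the second takes the signatures of a drug pair and outputs side-effect probabilities. The sketch below is a generic illustration of that idea only; the fingerprint length, gene count and layer sizes are hypothetical and are not the published DeSIDE-DDI architecture.
```python
# Generic two-stage sketch: structure -> expression signature -> DDI prediction.
# All dimensions and layers are hypothetical placeholders.
import torch
import torch.nn as nn

N_FP, N_GENES, N_SIDE_EFFECTS = 2048, 978, 100   # hypothetical sizes

feature_generator = nn.Sequential(               # drug fingerprint -> expression signature
    nn.Linear(N_FP, 512), nn.ReLU(),
    nn.Linear(512, N_GENES),
)

ddi_predictor = nn.Sequential(                   # pair of signatures -> side-effect probabilities
    nn.Linear(2 * N_GENES, 512), nn.ReLU(),
    nn.Linear(512, N_SIDE_EFFECTS), nn.Sigmoid(),
)

# Toy forward pass for one hypothetical drug pair.
fp_a, fp_b = torch.rand(1, N_FP), torch.rand(1, N_FP)
sig_a, sig_b = feature_generator(fp_a), feature_generator(fp_b)
risk = ddi_predictor(torch.cat([sig_a, sig_b], dim=1))
print(risk.shape)   # torch.Size([1, 100]): one probability per candidate side effect
```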
    Discussing its real-life applications, Prof. Nam remarks, “This model can discern potentially dangerous drug pairs, acting as a drug safety monitoring system. It can help researchers define the correct usage of the drug in the drug development phase.”
    A model with such potential will truly revolutionize how the safety of novel drugs is established in the future.
    Story Source:
    Materials provided by GIST (Gwangju Institute of Science and Technology). Note: Content may be edited for style and length. More

  • Taste of the future: Robot chef learns to 'taste as you go'

    A robot ‘chef’ has been trained to taste food at different stages of the chewing process to assess whether it’s sufficiently seasoned.
    Working in collaboration with domestic appliances manufacturer Beko, researchers from the University of Cambridge trained their robot chef to assess the saltiness of a dish at different stages of the chewing process, imitating a similar process in humans.
    Their results could be useful in the development of automated or semi-automated food preparation by helping robots to learn what tastes good and what doesn’t, making them better cooks.
    When we chew our food, we notice a change in texture and taste. For example, biting into a fresh tomato at the height of summer will release juices, and as we chew, mixing the food with saliva and digestive enzymes, our perception of the tomato’s flavour will change.
    The robot chef, which has already been trained to make omelettes based on human tasters’ feedback, tasted nine different variations of a simple dish of scrambled eggs and tomatoes at three different stages of the chewing process, and produced ‘taste maps’ of the different dishes.
    The researchers found that this ‘taste as you go’ approach significantly improved the robot’s ability to quickly and accurately assess the saltiness of the dish compared with other electronic tasting technologies, which test only a single homogenised sample. The results are reported in the journal Frontiers in Robotics and AI.
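    In outline, ‘taste as you go’ means pooling readings taken at several chewing stages into one feature vector before estimating saltiness, rather than relying on a single reading from a homogenised sample. The sketch below illustrates that pooling idea with simulated readings and an off-the-shelf regressor; the numbers and the model are assumptions for illustration, not the Cambridge team’s pipeline.
```python
# Pooling saltiness-related readings from three chewing stages versus using one
# reading only. Data are simulated; the regressor is a stand-in.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_dishes, n_stages = 9, 3                     # nine dishes, three chewing stages
true_salt = np.linspace(0.1, 1.0, n_dishes)   # ground-truth seasoning levels

# Each stage reveals the salt a little differently, plus sensor noise.
stage_gain = np.array([0.6, 0.9, 1.2])
readings = (true_salt[:, None] * stage_gain[None, :]
            + 0.05 * rng.normal(size=(n_dishes, n_stages)))

all_stages = LinearRegression().fit(readings, true_salt)        # pooled features
one_stage = LinearRegression().fit(readings[:, :1], true_salt)  # single reading

print("R^2, three chewing stages:", all_stages.score(readings, true_salt))
print("R^2, single reading:      ", one_stage.score(readings[:, :1], true_salt))
```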

  • New open-source software automates RNA analysis to speed up research and drug development

    Scientists at Scripps Research have unveiled a new software tool for studying RNA (ribonucleic acid) molecules, which have a host of critical roles in organisms. The open-source app, “Pytheas,” described May 3, 2022, in Nature Communications, speeds up the process of characterizing and quantifying RNAs in basic research and drug-development settings.
    The app is designed specifically to analyze RNA data generated through a method called mass spectrometry. “Mass spec” is commonly used to evaluate RNA molecules that are not simple chains of standard RNA nucleotides but are instead modified in some way. Among their demonstrations, the researchers showed that Pytheas can be used to swiftly identify and quantify modified RNA molecules like those in the current Pfizer and Moderna COVID-19 mRNA vaccines.
    “The analysis of RNA data from mass spectrometry has been a relatively laborious process, lacking the tools found in other areas of biological research, and so our aim with Pytheas is to bring the field into the 21st century,” says study senior author James Williamson, PhD, professor in the Department of Integrative Structural and Computational Biology, and vice president of Research and Academic Affairs at Scripps Research.
    The first authors of the study were Luigi D’Ascenzo, PhD, and Anna Popova, PhD, respectively a postdoctoral research associate and staff scientist in the Williamson lab during the study.
    RNA is chemically very similar to DNA, and RNA molecules in cells are heavily involved in the process of translating genes into proteins, as well as in fine-tuning gene activity. Additionally, RNA-based therapeutics — which include the Pfizer and Moderna vaccines — are viewed as a highly promising new class of medicines, capable in principle of hitting their biological targets more potently and selectively than traditional small-molecule drugs.
    A common tool for detecting RNA molecules that have chemical modifications is mass spectrometry, which can be used essentially to recognize the RNAs and their modifications based on their masses. Natural RNAs often have modifications that affect their functions, while RNAs used for vaccines and RNA-based drugs are almost always modified artificially to optimize their activity and reduce side effects. Up to now, methods for processing raw mass spectrometry data on modified RNAs have been relatively slow and manual — thus, very labor-intensive — in contrast to corresponding methods in the field of protein analysis, for example.
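    The core of mass-based identification is straightforward in outline: compute the theoretical mass of each candidate fragment, including any modified nucleotides, and match observed peaks within a small tolerance. The toy sketch below illustrates that matching step; the residue masses are approximate and the logic is a deliberate simplification, not Pytheas itself.
```python
# Match an observed mass-spec peak against candidate RNA fragments by mass.
# Approximate monoisotopic residue masses in daltons; 'mU' marks a methylated uridine.
RESIDUE_MASS = {"A": 329.05, "C": 305.04, "G": 345.05, "U": 306.03, "mU": 320.04}
WATER = 18.011

def theoretical_mass(seq):
    """Mass of a linear RNA fragment given as a list of residue codes."""
    return sum(RESIDUE_MASS[r] for r in seq) + WATER

candidates = {
    "unmodified": ["A", "U", "G", "C", "U"],
    "modified":   ["A", "mU", "G", "C", "mU"],   # both uridines methylated
}

observed_mass = 1637.24     # hypothetical peak from the spectrum
tolerance = 0.05            # daltons

for name, seq in candidates.items():
    mass = theoretical_mass(seq)
    match = abs(mass - observed_mass) <= tolerance
    print(f"{name:10s} theoretical {mass:8.2f} Da  match: {match}")
```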

  • Automated synthesis allows for discovery of unexpected charge transport behavior in organic molecules

    A cross-disciplinary UIUC team has demonstrated a major breakthrough in using automated synthesis to discover new molecules for organic electronics applications.
    The technology that enabled the discovery relies on an automated platform for rapid molecular synthesis at scale — which is a game-changer in the field of organic electronics and beyond. Using automated synthesis, the team was able to rapidly scan through a library of molecules with precisely defined structures, thereby uncovering, via single-molecule characterization experiments, a new mechanism for high conductance. The work was just reported in Nature Communications and is the first major result to emerge from the Molecule Maker Lab, which is located in the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign.
    The unexpectedly high conductance was uncovered in experiments led by Charles M. Schroeder, who is the James Economy Professor in materials science & engineering and a professor in chemical & biomolecular engineering. The project’s goal was to seek out new molecules with strong conductivity that might be suitable for use in molecular electronics or organic electronics applications. The team’s approach was to systematically append many different side chains to molecular backbones to understand how the side chains affected conductance.
    The first stage of the project consisted of synthesizing a large library of molecules to be characterized using single-molecule electronics experiments. If the synthesis had been done with conventional methods, it would have been a long, cumbersome process. That effort was avoided through use of the Molecule Maker Lab’s automated synthesis platform, which was designed to facilitate molecular discovery research that requires testing of large numbers of candidate molecules.
    Edward R. Jira, a Ph.D. student in chemical & biomolecular engineering who had a leading role in the project, explained the synthesis platform’s concept. “What’s really powerful… is that it leverages a building-block-based strategy where all of the chemical functionality that we’re interested in is pre-encoded in building blocks that are bench-stable, and you can have a large library of them sitting on a shelf,” he said. A single type of reaction is used repeatedly to couple the building blocks together as needed, and “because we have this diverse building block library that encodes a lot of different functionality, we can access a huge array of different structures for different applications.”
    As Schroeder put it, “Imagine snapping Legos together.”
    Co-author Martin D. Burke extended the Lego-brick analogy to explain why the synthesizer was so valuable to the experiments — and it wasn’t only because of the rapid production of the initial molecular library. “Because of the Lego-like approach for making these molecules, the team was able to understand why they are super-fast,” he explained. Once the surprisingly fast state was discovered, “using the ‘Legos,’ we could take the molecules apart piece by piece, and swap in different ‘Lego’ bricks — and thereby systematically understand the structure/function relationships that led to this ultrafast conductivity.”
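    In spirit, the building-block strategy defines the candidate library combinatorially before any chemistry is done: every backbone is paired with every side chain, and the same coupling reaction joins them each time. The sketch below enumerates such a library with placeholder names; the labels are illustrative and are not the building blocks used in the study.
```python
# Enumerate a backbone x side-chain candidate library. Names are placeholders.
from itertools import product

backbones   = ["backbone_thiophene", "backbone_phenylene", "backbone_fluorene"]
side_chains = ["H", "methyl", "methoxy", "cyano", "amino", "nitro"]

library = [f"{bb}--{sc}" for bb, sc in product(backbones, side_chains)]

print(f"{len(library)} candidate molecules, e.g.:")
for molecule in library[:5]:
    print(" ", molecule)
```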