More stories

  • 'Pulling back the curtain' to reveal a molecular key to The Wizard of Oz

    Many people and companies worry about sensitive data getting hacked, so encrypting files with digital keys has become more commonplace. Now, researchers reporting in ACS Central Science have developed a durable molecular encryption key from sequence-defined polymers that are built and deconstructed in a sequential way. They hid their molecular key in the ink of a letter, which was mailed and then used to decrypt a file with text from The Wonderful Wizard of Oz.
    Securely sharing data relies on encryption algorithms that jumble up the information and only reveal it when the correct code or digital encryption key is used. Researchers have been developing molecular strategies, including DNA chains and polymers, to durably store and transport encryption keys. Currently, nucleic acids store more information than polymers. The challenge with polymers is that when they get too long, storing more data with each additional monomer becomes less efficient, and figuring out the information they’re hiding with analytical instruments becomes extremely difficult. Recently, Eric Anslyn and colleagues developed a method to deconstruct polymers in a sequential way, allowing their structures to be determined more easily with liquid chromatography-mass spectrometry (LC/MS). So, Anslyn, James Ruether and others wanted to test the method on a mixture of unique polymers hidden in ink to see if the approach could be used to reveal a complex molecular encryption key.
    First, the researchers generated a 256-character-long binary key that could encrypt and decrypt text files when entered into an algorithm. Next, they encoded the key into polymer sequences of eight 10-monomer-long oligourethanes. Only the middle eight monomers held the key, and the two ends acted as placeholders for synthesis and decoding. The decoding placeholder was a unique, isotopically labeled “fingerprint” monomer in each sequence, indicating where each polymer’s encoded information fit in the order of the final digital key. Then the researchers mixed the eight polymers together and used a sequential depolymerization method and LC/MS to determine the original structures and the digital key. Finally, one group of the researchers combined the polymers with isopropanol, glycerol and soot to make an ink, which they used to write a letter that they mailed to other colleagues, who didn’t know the encoded information. These scientists extracted the ink from the paper and followed the same sequential analysis to successfully reconstruct the binary key. They entered the encryption key into the algorithm, revealing a plain text file of The Wonderful Wizard of Oz. The researchers say that their results demonstrate that molecular information encryption with sequence-defined polymer mixtures is durable enough for real-world applications, such as hiding secret messages in letters and plastic objects.
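    The workflow can be illustrated with a short sketch. The following Python snippet is a toy illustration rather than the authors’ chemistry or exact bit-to-monomer mapping: it assumes a hypothetical 16-letter monomer alphabet (4 bits per monomer), splits a 256-bit key across eight sequences of eight information-bearing monomers, tags each sequence with an index that stands in for the isotopically labeled “fingerprint” monomer, and shows that the key can be reassembled even after the sequences are shuffled, as the mixed polymers were.
    ```python
    import secrets
    import random

    # Hypothetical 16-letter monomer alphabet: each monomer carries 4 bits.
    ALPHABET = "ABCDEFGHIJKLMNOP"

    def encode_key(key_bits: str) -> list[tuple[int, str]]:
        """Split a 256-bit key into 8 sequences of 8 information monomers.

        Each sequence is tagged with its position index, standing in for the
        isotopically labeled 'fingerprint' monomer described in the study.
        """
        assert len(key_bits) == 256 and set(key_bits) <= {"0", "1"}
        sequences = []
        for i in range(8):                           # 8 polymers
            chunk = key_bits[i * 32:(i + 1) * 32]    # 32 bits per polymer
            monomers = "".join(
                ALPHABET[int(chunk[j:j + 4], 2)]     # 4 bits -> 1 monomer
                for j in range(0, 32, 4)
            )
            sequences.append((i, monomers))
        return sequences

    def decode_key(sequences: list[tuple[int, str]]) -> str:
        """Reassemble the key from (fingerprint index, sequence) pairs."""
        ordered = sorted(sequences)                  # fingerprint fixes the order
        return "".join(
            format(ALPHABET.index(m), "04b")
            for _, seq in ordered for m in seq
        )

    key = format(secrets.randbits(256), "0256b")
    mixture = encode_key(key)
    random.shuffle(mixture)                          # the polymers are mixed together
    assert decode_key(mixture) == key
    ```
    In the study, the reassembled binary key was then entered into the encryption algorithm to decrypt the text file; in this sketch, any standard 256-bit symmetric cipher could play that role.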
    The authors acknowledge funding from the Army Research Office, the Howard Hughes Medical Institute, the Keck Foundation and the Welch Regents Chair.
    Story Source:
    Materials provided by American Chemical Society. Note: Content may be edited for style and length.

  • Learning to fight infection

    Scientific advancements have often been held back by the need for high volumes of data, which can be costly, time-consuming, and sometimes difficult to collect. But there may be a solution to this problem when investigating how our bodies fight illness: a new machine learning method called “MotifBoost.” This approach can help interpret data from T-cell receptors (TCRs) to identify past infections by specific pathogens. By focusing on short sequences of amino acids within the TCRs, a research team achieved more accurate results with smaller datasets. This work may shed light on the way the human immune system recognizes germs, which may lead to improved health outcomes.
    The recent pandemic has highlighted the vital importance of the human body’s ability to fight back against novel threats. The adaptive immune system uses specialized cells, including T-cells, which prepare an array of diverse receptors that can recognize antigens specific to invading germs even before they arrive for the first time. Therefore, the diversity of the receptors is an important topic of investigation. However, the correspondence between receptors and the antigens they recognize is often difficult to determine experimentally, and current computational methods often fail if not provided with enough data.
    Now, scientists from the Institute of Industrial Science at The University of Tokyo have developed a new machine learning method that can predict a donor’s infection history from limited TCR data. “MotifBoost” focuses on very short segments, called k-mers, in each receptor. Although the protein motifs considered by scientists are usually much longer, the team found that extracting the frequency of each combination of three consecutive amino acids was highly effective. “Our machine learning methods trained on small-scale datasets can supplement conventional classification methods which only work on very large datasets,” first author Yotaro Katayama says. MotifBoost was inspired by the fact that different people usually produce similar TCRs when exposed to the same pathogen.
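    The k-mer featurization is straightforward to sketch. The snippet below is a minimal illustration, not the published MotifBoost implementation: it counts every run of three consecutive amino acids across a donor’s TCR sequences and normalizes the counts into a frequency vector of length 20^3 = 8,000.
    ```python
    from itertools import product

    import numpy as np

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"          # 20 standard amino acids
    KMER_INDEX = {
        "".join(k): i for i, k in enumerate(product(AMINO_ACIDS, repeat=3))
    }                                              # 20**3 = 8000 possible 3-mers

    def kmer_features(tcr_sequences: list[str]) -> np.ndarray:
        """Return the normalized 3-mer frequency vector for one donor's repertoire."""
        counts = np.zeros(len(KMER_INDEX))
        for seq in tcr_sequences:
            for j in range(len(seq) - 2):
                kmer = seq[j:j + 3]
                if kmer in KMER_INDEX:             # skip non-standard residues
                    counts[KMER_INDEX[kmer]] += 1
        total = counts.sum()
        return counts / total if total > 0 else counts
    ```
    Each donor is then represented by a single 8,000-dimensional vector, regardless of how many receptor sequences were measured.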
    First, the researchers employed an unsupervised learning approach, in which donors were automatically sorted based on patterns found in the data, and showed that the k-mer distributions separated donors into distinct clusters according to whether or not they had previously been infected with cytomegalovirus (CMV). Because unsupervised learning algorithms do not have information about which donors had been infected with CMV, this result indicated that the k-mer information is effective in capturing characteristics of a patient’s immune status. Then, the scientists used the k-mer distribution data for a supervised learning task, in which the algorithm was given the TCR data of each donor, along with labels indicating which donors were infected with a specific disease. The algorithm was then trained to predict the label for unseen samples, and the prediction performance was tested for CMV and HIV.
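    Both stages can be sketched with off-the-shelf estimators. The specific choices below (k-means for the unsupervised step, gradient boosting for the supervised step, and randomly generated placeholder data) are illustrative stand-ins, not necessarily the methods or data used in the paper.
    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Placeholder data: in practice X would hold one 3-mer frequency vector per
    # donor (see kmer_features above) and y the donor's CMV status.
    X = rng.random((120, 8000))
    X /= X.sum(axis=1, keepdims=True)
    y = rng.integers(0, 2, size=120)

    # Unsupervised step: do donors separate into clusters without seeing labels?
    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    # Supervised step: train a classifier on labeled donors and check how well
    # it predicts the label for held-out donors.
    clf = GradientBoostingClassifier(random_state=0)
    scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(np.bincount(clusters), scores.mean())
    ```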
    “We found that existing machine learning methods can suffer from learning instability and reduced accuracy when the number of samples drops below a certain critical size. In contrast, MotifBoost performed just as well on the large dataset, and still provided a good result on the small dataset,” says senior author Tetsuya J. Kobayashi. This research may lead to new tests for viral exposure and immune status based on T-cell composition.
    This research is published in Frontiers in Immunology as “Comparative study of repertoire classification methods reveals data efficiency of k-mer feature extraction.”
    Story Source:
    Materials provided by Institute of Industrial Science, The University of Tokyo. Note: Content may be edited for style and length.

  • Scientists reveal genetic architecture underlying alcohol, cigarette abuse

    Have you ever wondered why one person can smoke cigarettes for a year and easily quit, while another person will become addicted for life? Why can some people not stop themselves from abusing alcohol, while others can take it or leave it? One reason is a person’s genetic proclivity to abuse substances. UNC School of Medicine researchers led by Hyejung Won, PhD, are beginning to understand these underlying genetic differences. The more they learn, the better their chances of creating therapies to help the millions of people who struggle with addiction.
    Won, assistant professor of genetics and member of the UNC Neuroscience Center, and colleagues identified genes linked to cigarette smoking and drinking. The researchers found that these genes are over-represented in certain kinds of neurons — brain cells that trigger other cells to send chemical signals throughout the brain.
    The researchers, who published their work in the journal Molecular Psychiatry, also found that the genes underlying cigarette smoking were linked to the perception of pain and response to food, as well as the abuse of other drugs, such as cocaine. Other genes associated with alcohol use were linked to stress and learning, as well as abuse of other drugs, such as morphine.
    Given the lack of current treatment options for substance use disorder, the researchers also conducted analyses of a publicly available drug database to identify potential new treatments for substance abuse.
    “We found that antipsychotics and other mood stabilizers could potentially provide therapeutic relief for individuals struggling with substance abuse,” said Nancy Sey, graduate student in the Won lab and the first author of the paper. “And we’re confident our research provides a good foundation for research focused on creating better treatments to address drug dependency.”
    Parsing the Genome
    Long-term substance use and substance use disorders have been linked to many common diseases and conditions, such as lung cancer, liver disease, and mental illnesses. Yet, few treatment options are available, largely due to gaps in our understanding of the biological processes involved.

  • New model predicts how temperature affects life from quantum to classical scales

    Every biological process depends critically on temperature. It’s true of the very small, the very large, and every scale in-between, from molecules to ecosystems and across every environment.
    A general theory describing how life depends on temperature has been lacking — until now. In a paper published in the Proceedings of the National Academy of Sciences, researchers led by Jose Ignacio Arroyo, a Santa Fe Institute Postdoctoral Fellow, introduce a simple framework that rigorously predicts how temperature affects living things, at all scales.
    “It is very fundamental,” says SFI External Professor Pablo Marquet, an ecologist at the Pontificia Universidad Católica de Chile, in Santiago. Marquet, Arroyo’s Ph.D. thesis advisor, also worked on the model. “You can apply this to pretty much every process that is affected by temperature. We hope it will be a landmark contribution.”
    Marquet notes that such a theory could help researchers make accurate predictions in a range of areas, including biological responses to climate change, the spread of infectious diseases, and food production.
    Previous attempts to generalize the effects of temperature on biology have lacked the “big picture” implications built into the new model, says Marquet. Biologists and ecologists often use the Arrhenius equation, for example, to describe how temperature affects the rates of chemical reactions. That approach successfully approximates how temperature influences some biological processes, but it can’t fully account for many others, including metabolism and growth rate.
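    For reference, the Arrhenius equation expresses a reaction rate constant k in terms of absolute temperature T, a pre-exponential factor A, the activation energy E_a, and the gas constant R:
    ```latex
    k(T) = A \, e^{-E_a / (R T)}
    ```
    Because the exponential factor increases monotonically with temperature, this form on its own cannot describe responses that rise, peak, and then decline, which is one reason it struggles with processes such as metabolism and growth rate.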
    Arroyo initially set out to develop a general mathematical model to predict the behavior of any variable in biology. He quickly realized, however, that temperature was a kind of universal predictor and could guide the development of a new model. He started with a theory in chemistry that describes the kinetics of enzymes, but with a few additions and assumptions, he extended the model from the quantum-molecular level to larger, macroscopic scales.
    Importantly, the model combines three elements lacking in earlier attempts. First, like its counterpart in chemistry, it’s derived from first principles. Second, the heart of the model is a single, simple equation with only a few parameters. (Most existing models require a plethora of assumptions and parameters.) Third, “it’s universal in the sense that it can explain patterns and behaviors for any microorganisms or any taxa in any environment,” he says. All temperature responses for different processes, taxa, and scales collapse to the same general functional form.
    “I think that our ability to systematize temperature response has the potential to reveal novel unification in biological processes in order to resolve a variety of controversies,” says SFI Professor Chris Kempes who, along with SFI Professor Geoffrey West, helped the team bridge the quantum-to-classical scales.
    The PNAS paper describes predictions from the new model that align with empirical observations of diverse phenomena, including the metabolic rate of an insect, the relative germination of alfalfa, the growth rate of a bacterium, and the mortality rate of a fruit fly.
    In future publications, Arroyo says, the group plans to derive new predictions from this model — many of which were planned for the first publication. “The paper was just getting too big,” he says.
    Story Source:
    Materials provided by Santa Fe Institute. Note: Content may be edited for style and length.

  • Go with the flow: New findings about moving electricity could improve fusion devices

    Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have found that updating a mathematical model to include a physical property known as resistivity could lead to the improved design of doughnut-shaped fusion facilities known as tokamaks.
    “Resistivity is the property of any substance that inhibits the flow of electricity,” said PPPL physicist Nathaniel Ferraro, one of the collaborating researchers. “It’s kind of like the viscosity of a fluid, which inhibits things moving through it. For example, a stone will move more slowly through molasses than water, and more slowly through water than through air.”
    Scientists have discovered a new way that resistivity can cause instabilities in the plasma edge, where temperatures and pressures rise sharply. By incorporating resistivity into models that predict the behavior of plasma, a soup of electrons and atomic nuclei that makes up 99% of the visible universe, scientists can design systems for future fusion facilities that make the plasma more stable.
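    In magnetohydrodynamic models of a plasma, resistivity typically enters through Ohm’s law. A standard resistive form, shown here for context rather than as the exact set of equations used in this work, is:
    ```latex
    \mathbf{E} + \mathbf{v} \times \mathbf{B} = \eta \, \mathbf{J}
    ```
    where E is the electric field, v the plasma velocity, B the magnetic field, J the current density and η the resistivity; setting η = 0 recovers the ideal model that leaves out the effect described here.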
    “We want to use this knowledge to figure out how to develop a model that allows us to plug in certain plasma characteristics and predict whether the plasma will be stable before we actually do an experiment,” said Andreas Kleiner, a PPPL physicist who was the lead author of a paper reporting the results in Nuclear Fusion. “Basically, in this research, we saw that resistivity matters and our models ought to include it,” Kleiner said.
    Fusion, the power that drives the sun and stars, combines light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — and generates massive amounts of energy. Scientists seek to harness fusion on Earth for a virtually inexhaustible supply of power to generate electricity.
    Scientists want the plasma to be stable because instabilities can lead to plasma eruptions known as edge-localized modes (ELMs) that can damage internal components of the tokamak over time, requiring those components to be replaced more frequently. Future fusion reactors, however, will have to run for months at a time without stopping for repairs.

  • AI tech to automate process of denture design and enhance treatment efficiency without compromising accuracy

    Loss of permanent teeth is usually caused by dental diseases or trauma and is common in the global population, especially among the elderly due to aging and relatively poorer oral health.
    Failure to replace a missing tooth not only affects facial aesthetics and chewing function, but it may also lead to jawbone loss and shifting of teeth, which may cause malocclusion and bite irregularities that could have a significant impact on the health of the remaining teeth, gums, jaw muscles and jaw joints.
    Artificial teeth, also known as bridges and dentures, are prosthetic devices used to replace missing teeth. It is essential for the false teeth to resemble the patient’s original teeth so that the patient can retain his or her original appearance, chewing function, and oral and general health.
    Currently, the process of designing and creating dentures is highly time-consuming, as the existing computerised design process requires tedious manual inputs, collection of teeth occlusion information, and multiple denture fitting procedures due to the limited accuracy of existing technologies.
    Researchers from the Faculty of Dentistry at the University of Hong Kong (HKU) and the Department of Computer Science of Chu Hai College of Higher Education, collaborated to develop a new approach using artificial intelligence to automate the design of individualised dentures, in order to enhance the treatment efficiency and improve patient experience.
    The AI technology used in the process was based on a 3D Generative Adversarial Network (3D-GAN) algorithm and tested on 175 participants recruited at HKU. The study shows that the AI technology could reconstruct the shape of a natural healthy tooth and automate the process of false teeth design with high accuracy.
    “The 3D GAN algorithm was selected due to its superior performance on 3D object reconstruction compared to other AI algorithms. In the preliminary study, 3D GAN was able to rebuild similar shapes to the original teeth for 60% of the cases. It is expected to mature with more AI training data,” co-Investigator, Dr Reinhard Chau explained.
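    As a rough illustration of the 3D-GAN idea, the sketch below pairs a generator that maps a random latent vector to a voxelized tooth shape with a discriminator that scores whether a voxel grid looks real. The architecture, grid resolution and latent size are generic assumptions; the study’s actual network, input representation and training procedure are not described here.
    ```python
    import torch
    import torch.nn as nn

    class Generator(nn.Module):
        """Maps a latent vector to a 32x32x32 voxel occupancy grid."""
        def __init__(self, latent_dim: int = 128):
            super().__init__()
            self.net = nn.Sequential(
                nn.ConvTranspose3d(latent_dim, 128, 4, 1, 0), nn.BatchNorm3d(128), nn.ReLU(),  # 4^3
                nn.ConvTranspose3d(128, 64, 4, 2, 1), nn.BatchNorm3d(64), nn.ReLU(),           # 8^3
                nn.ConvTranspose3d(64, 32, 4, 2, 1), nn.BatchNorm3d(32), nn.ReLU(),            # 16^3
                nn.ConvTranspose3d(32, 1, 4, 2, 1), nn.Sigmoid(),                              # 32^3
            )

        def forward(self, z):
            return self.net(z.view(z.size(0), -1, 1, 1, 1))

    class Discriminator(nn.Module):
        """Scores how 'real' a voxel grid looks."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv3d(1, 32, 4, 2, 1), nn.LeakyReLU(0.2),    # 16^3
                nn.Conv3d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2),   # 8^3
                nn.Conv3d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2),  # 4^3
                nn.Conv3d(128, 1, 4, 1, 0),                      # single realism logit
            )

        def forward(self, x):
            return self.net(x).view(-1)

    z = torch.randn(2, 128)
    fake_teeth = Generator()(z)           # (2, 1, 32, 32, 32) voxel grids
    scores = Discriminator()(fake_teeth)  # one realism logit per sample
    ```
    In adversarial training, the discriminator is rewarded for separating real scans from generated grids while the generator is rewarded for fooling it, which pushes the generated tooth shapes toward the distribution of real dentition data.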
    The new approach only requires the digital model of a patient’s dentition to function. It can learn the features of an individual’s teeth from the rest of the dentition and generate a false tooth that looks like the missing tooth.
    “This will facilitate the treatment workflow for dentists in replacing a missing tooth, as the preparation and fitting process will require minimal time, and a patient will not need to stay at the clinic for long hours,” said Principal Investigator Dr Walter Lam.
    The study entitled “Artificial intelligence-designed single molar dental prostheses: A protocol of prospective experimental study” is published in the journal PLoS ONE. The preliminary results of the study were presented in the recent International Association of Dental Research (IADR) General Session. The study won the IADR Neal Garrett Clinical Research Prize and First runner-up in the 2022 IADR-SEA Hatton Award — Senior Category.
    Story Source:
    Materials provided by The University of Hong Kong. Note: Content may be edited for style and length.

  • Researcher uses graphene for same-time, same-position biomolecule isolation and sensing

    New research led by University of Massachusetts Amherst assistant professor Jinglei Ping has overcome a major challenge in isolating and detecting molecules at the same time and at the same location in a microdevice. The work, recently published in ACS Nano, demonstrates an important advance in using graphene for electrokinetic biosample processing and analysis and could allow lab-on-a-chip devices to become smaller and achieve results faster.
    The process of detecting biomolecules has been complicated and time consuming. “We usually first have to isolate them in a complex medium in a device and then send them to another device or another spot in the same device for detection,” says Ping, who is in the College of Engineering’s Mechanical and Industrial Engineering Department and is also affiliated with the university’s Institute of Applied Life Sciences. “Now we can isolate them and detect them at the same microscale spot in a microfluidic device at the same time — no one has ever demonstrated this before.”
    His lab achieved this advance by using graphene, a one-atom-thick honeycomb lattice of carbon atoms, as microelectrodes in a microfluidic device.
    “We found that, compared to typical inert-metal microelectrodes, the electrolysis stability for graphene microelectrodes is more than 1,000 times improved, making them ideal for high-performance electrokinetic analysis,” he says.
    Also, Ping added, since monolayer graphene is transparent, “we developed a three-dimensional multi-stream microfluidic strategy to microscopically detect the isolated molecules and calibrate the detection at the same time from a direction normal to the graphene microelectrodes.”
    The new approach developed in the work paves the way to the creation of lab-on-a-chip devices of maximal time and size efficiencies, Ping says. Also, the approach is not limited to analyzing biomolecules and can potentially be used to separate, detect and stimulate microorganisms such as cells and bacteria.
    Story Source:
    Materials provided by University of Massachusetts Amherst. Note: Content may be edited for style and length.

  • Robot dog learns to walk in one hour

    A newborn giraffe or foal must learn to walk on its legs as fast as possible to avoid predators. Animals are born with muscle coordination networks located in their spinal cord. However, learning the precise coordination of leg muscles and tendons takes some time. Initially, baby animals rely heavily on hard-wired spinal cord reflexes. While somewhat more basic, these motor control reflexes help the animal avoid falling and hurting itself during its first walking attempts. The more advanced and precise muscle control that follows must be practiced until, eventually, the nervous system is well adapted to the young animal’s leg muscles and tendons. No more uncontrolled stumbling — the young animal can now keep up with the adults.
    Researchers at the Max Planck Institute for Intelligent Systems (MPI-IS) in Stuttgart conducted a research study to find out how animals learn to walk and learn from stumbling. They built a four-legged, dog-sized robot that helped them figure out the details.
    “As engineers and roboticists, we sought the answer by building a robot that features reflexes just like an animal and learns from mistakes,” says Felix Ruppert, a former doctoral student in the Dynamic Locomotion research group at MPI-IS. “If an animal stumbles, is that a mistake? Not if it happens once. But if it stumbles frequently, it gives us a measure of how well the robot walks.”
    Felix Ruppert is first author of “Learning Plastic Matching of Robot Dynamics in Closed-loop Central Pattern Generators,” which will be published July 18, 2022 in the journal Nature Machine Intelligence.
    Learning algorithm optimizes virtual spinal cord
    After learning to walk in just one hour, Ruppert’s robot makes good use of its complex leg mechanics. A Bayesian optimization algorithm guides the learning: the measured foot sensor information is matched with target data from the modeled virtual spinal cord running as a program in the robot’s computer. The robot learns to walk by continuously comparing sent and expected sensor information, running reflex loops, and adapting its motor control patterns.
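    The learning loop can be sketched at a high level. The code below is only a schematic stand-in for the robot’s actual controller: it assumes a handful of hypothetical central-pattern-generator parameters (gait frequency, stride amplitude, reflex gain) and a hypothetical trial_mismatch function that would run a short walking trial and return the disagreement between the virtual spinal cord’s expected foot-contact signal and the measured foot sensor signal; Bayesian optimization then searches for parameters that minimize that mismatch.
    ```python
    import numpy as np
    from skopt import gp_minimize
    from skopt.space import Real

    # Hypothetical CPG parameters to tune; the real robot's parameter set differs.
    search_space = [
        Real(0.5, 3.0, name="gait_frequency_hz"),
        Real(0.1, 1.0, name="stride_amplitude"),
        Real(0.0, 2.0, name="reflex_gain"),
    ]

    def trial_mismatch(params):
        """Placeholder for one walking trial.

        On the real robot this would command the CPG with `params`, record the
        foot sensors, and return the error between expected and measured contact
        timing. Here a synthetic bowl-shaped function stands in so the sketch runs.
        """
        target = np.array([1.8, 0.6, 0.9])
        return float(np.sum((np.array(params) - target) ** 2))

    # Gaussian-process Bayesian optimization: propose parameters, run a trial,
    # update the surrogate model, repeat.
    result = gp_minimize(trial_mismatch, search_space, n_calls=30, random_state=0)
    print("best parameters:", result.x, "mismatch:", result.fun)
    ```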