More stories


    Research advances technology of AI assistance for anesthesiologists

    A new study by researchers at MIT and Massachusetts General Hospital suggests the day may be approaching when advanced artificial intelligence systems could assist anesthesiologists in the operating room.
    In a special edition of Artificial Intelligence in Medicine, the team of neuroscientists, engineers and physicians demonstrated a machine learning algorithm for continuously automating dosing of the anesthetic drug propofol. Using an application of deep reinforcement learning, in which the software’s neural networks simultaneously learned how its dosing choices maintain unconsciousness and how to critique the efficacy of its own actions, the algorithm outperformed more traditional software in sophisticated, physiology-based simulations of patients. It also closely matched the performance of real anesthesiologists when showing what it would do to maintain unconsciousness given recorded data from nine real surgeries.
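    The actor-critic loop described above can be sketched in miniature. This is an illustrative toy only: a linear policy and critic on a one-compartment drug model stand in for the study's deep neural networks and physiology-based patient simulators, and every constant and variable name here is an assumption, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy pharmacokinetics: one compartment with first-order elimination.
k_elim, dt, target = 0.3, 1.0, 2.0     # elimination rate, time step, target level
max_dose = 2.0                          # hard cap on the infusion rate

def pk_step(conc, dose):
    """Drug concentration decays and the chosen dose is added."""
    return conc + dt * (-k_elim * conc + dose)

theta = np.zeros(2)                     # actor (dosing policy) weights on [conc, 1]
w = np.zeros(2)                         # critic (state-value) weights on [conc, 1]
alpha_actor, alpha_critic, gamma, sigma = 0.002, 0.02, 0.95, 0.3

for episode in range(300):
    conc = 0.0
    for t in range(40):
        feats = np.array([conc, 1.0])
        mean_dose = theta @ feats
        noise = sigma * rng.standard_normal()
        dose = float(np.clip(mean_dose + noise, 0.0, max_dose))
        new_conc = pk_step(conc, dose)
        reward = -(new_conc - target) ** 2          # penalize deviation from target
        new_feats = np.array([new_conc, 1.0])
        # Critic judges the dosing choice via the temporal-difference error...
        td_error = reward + gamma * (w @ new_feats) - w @ feats
        w += alpha_critic * td_error * feats
        # ...and the actor adjusts its policy in the direction that error suggests.
        theta += alpha_actor * td_error * (noise / sigma**2) * feats
        conc = new_conc

# Roll out the learned deterministic policy.
conc = 0.0
for t in range(40):
    dose = float(np.clip(theta @ np.array([conc, 1.0]), 0.0, max_dose))
    conc = pk_step(conc, dose)
print(f"held concentration: {conc:.2f} (target {target})")
```

    The same division of labor appears in the study at much larger scale: one network proposes doses, the other critiques them against the unconsciousness target.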
    The algorithm’s advances make it more feasible for computers to maintain patient unconsciousness using no more drug than is needed, freeing anesthesiologists for their many other responsibilities in the operating room, including making sure patients remain immobile, experience no pain, stay physiologically stable, and receive adequate oxygen, said co-lead authors Gabe Schamberg and Marcus Badgeley.
    “One can think of our goal as being analogous to an airplane’s auto-pilot where the captain is always in the cockpit paying attention,” said Schamberg, a former MIT postdoc who is also the study’s corresponding author. “Anesthesiologists have to simultaneously monitor numerous aspects of a patient’s physiological state, and so it makes sense to automate those aspects of patient care that we understand well.”
    Senior author Emery N. Brown, a neuroscientist at The Picower Institute for Learning and Memory and Institute for Medical Engineering and Science at MIT and an anesthesiologist at MGH, said the algorithm’s potential to help optimize drug dosing could improve patient care.
    “Algorithms such as this one allow anesthesiologists to maintain more careful, near-continuous vigilance over the patient during general anesthesia,” said Brown, Edward Hood Taplin Professor of Computational Neuroscience and Health Sciences & Technology at MIT.


    Artificial intelligence system rapidly predicts how two proteins will attach

    Antibodies, small proteins produced by the immune system, can attach to specific parts of a virus to neutralize it. As scientists continue to battle SARS-CoV-2, the virus that causes Covid-19, one possible weapon is a synthetic antibody that binds with the virus’ spike proteins to prevent the virus from entering a human cell.
    To develop a successful synthetic antibody, researchers must understand exactly how that attachment will happen. Proteins, with lumpy 3D structures containing many folds, can stick together in millions of combinations, so finding the right protein complex among almost countless candidates is extremely time-consuming.
    To streamline the process, MIT researchers created a machine-learning model that can directly predict the complex that will form when two proteins bind together. Their technique is between 80 and 500 times faster than state-of-the-art software methods, and often predicts protein structures that are closer to actual structures that have been observed experimentally.
    This technique could help scientists better understand some biological processes that involve protein interactions, like DNA replication and repair; it could also speed up the process of developing new medicines.
    “Deep learning is very good at capturing interactions between different proteins that are otherwise difficult for chemists or biologists to write experimentally. Some of these interactions are very complicated, and people haven’t found good ways to express them. This deep-learning model can learn these types of interactions from data,” says Octavian-Eugen Ganea, a postdoc in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and co-lead author of the paper.
    Ganea’s co-lead author is Xinyuan Huang, a graduate student at ETH Zurich. MIT co-authors include Regina Barzilay, the School of Engineering Distinguished Professor for AI and Health in CSAIL, and Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering in CSAIL and a member of the Institute for Data, Systems, and Society. The research will be presented at the International Conference on Learning Representations.


    New super-conductors could take data beyond zeroes and ones

    Remember flip-phones? Our smartphones may one day look just as obsolete thanks to spintronics, an incipient field of research promising to revolutionize the way our electronic devices send and receive signals.
    In most current technologies, data is encoded as a zero or a one, depending on the number of electrons that reach a capacitor. With spintronics, data is also transferred according to the direction in which these electrons spin.
    In a new study appearing this week in the Proceedings of the National Academy of Sciences, a team of Duke University and Weizmann Institute researchers led by Michael Therien, professor of Chemistry at Duke, report a keystone achievement in the field: the development of a conducting system that controls the spin of electrons and transmits a spin current over long distances, without the need for the ultra-cold temperatures required by typical spin-conductors.
    “The structures we present here are exciting because they define new strategies to generate large magnitude spin currents at room temperature,” said Chih-Hung Ko, first author of the paper and recent Duke chemistry Ph.D.
    Electrons are like spinning tops. Spin-up electrons rotate clockwise, and spin-down electrons rotate counter-clockwise. Electrons with opposite spins can occupy the same volume, but electrons that spin in the same direction repel each other, like magnets of the same polarity.
    By controlling the way that electrons spin along a current, scientists can encode a new layer of information into an electric signal.
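    A back-of-the-envelope count shows why that extra layer matters (an illustration, not the paper's actual encoding scheme): adding an independent spin-up/spin-down degree of freedom doubles the number of distinguishable states per charge carrier.

```python
# Hypothetical state count: a carrier read out by charge alone distinguishes
# two states; combining charge with spin yields four, i.e. two bits per carrier.
charge_states = ["low", "high"]   # conventional zero/one readout
spin_states = ["up", "down"]      # spintronic degree of freedom
symbols = [(c, s) for c in charge_states for s in spin_states]
print(len(symbols))               # 4 combined states instead of 2
```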


    How Omicron escapes from antibodies

    A new study from MIT suggests that the dozens of mutations in the spike protein of the Omicron variant help it to evade all four of the classes of antibodies that can target the SARS-CoV-2 virus that causes Covid-19.
    This includes antibodies generated by vaccinated or previously infected people, as well as most of the monoclonal antibody treatments that have been developed, says Ram Sasisekharan, the Alfred H. Caspary Professor of Biological Engineering and Health Sciences and Technology (HST) at MIT.
    Using a computational approach that allowed them to determine how mutated amino acids of the viral spike protein influence nearby amino acids, the researchers were able to get a multidimensional view of how the virus evades antibodies. According to Sasisekharan, the traditional approach of examining only changes in the virus’ genetic sequence flattens the three-dimensional complexity of the spike protein’s surface and doesn’t capture the multidimensional character of the protein surfaces that antibodies attempt to bind.
    “It is important to get a more comprehensive picture of the many mutations seen in Omicron, especially in the context of the spike protein, given that the spike protein is vital for the virus’s function, and all the major vaccines are based on that protein,” he says. “There is a need for tools or approaches that can rapidly determine the impact of mutations in new virus variants of concern, especially for SARS-CoV-2.”
    Sasisekharan is the senior author of the study, which appears this week in Cell Reports Medicine. The lead author of the paper is MIT HST graduate student Nathaniel Miller. Technical associate Thomas Clark and research scientist Rahul Raman are also authors of the paper.
    Even though Omicron is able to evade most antibodies to some degree, vaccines still offer protection, Sasisekharan says.


    Researchers develop highly accurate modeling tool to predict COVID-19 risk

    As new coronavirus variants emerge and quickly spread around the globe, both the public and policymakers are faced with a quandary: maintaining a semblance of normality, while also minimizing infections. While digital contact tracing apps offered promise, the adoption rate has been low, due in part to privacy concerns.
    At USC, researchers are advocating for a new approach to predict the chance of infection from Covid-19: combining anonymized cellphone location data with mobility patterns — broad patterns of how people move from place to place.
    To produce “risk scores” for specific locations and times, the team used a large dataset of anonymous, real-world location signals from cell phones across the US in 2019 and 2020. The system shows a 50% improvement in accuracy compared to current systems, said the researchers.
    “Our results show that it is possible to predict and target specific areas that are high-risk, as opposed to putting all businesses under one umbrella. Such risk-targeted policies can be significantly more effective, both for controlling Covid-19 and economically,” said lead author Sepanta Zeighami, a computer science Ph.D. student advised by Professor Cyrus Shahabi.
    “It’s also unlikely that Covid-19 will be the last pandemic in human history, so if we want to avoid the chaos of 2020 and the tragic losses while keeping daily life as unaffected as possible when the next pandemic happens, we need such data-driven approaches.”
    To address privacy concerns, the mobility data comes in an aggregated format, allowing the researchers to see patterns without identifying individual users. The data is not being used for contact tracing or for identifying infected individuals or their destinations, said the researchers.


    The past’s extreme ocean heat waves are now the new normal

    Yesterday’s scorching ocean extremes are today’s new normal. A new analysis of surface ocean temperatures over the past 150 years reveals that in 2019, 57 percent of the ocean’s surface experienced temperatures rarely seen a century ago, researchers report February 1 in PLOS Climate.

    To provide context for the frequency and duration of modern extreme heat events, marine ecologists Kisei Tanaka, now at the National Oceanic and Atmospheric Administration in Honolulu, and Kyle Van Houtan, now at the Loggerhead Marinelife Center in Juno Beach, Fla., analyzed monthly sea-surface temperatures from 1870 through 2019, mapping where and when extreme heat events occurred decade to decade.

    Looking at monthly extremes rather than annual averages revealed new benchmarks in how the ocean is changing. More and more patches of water hit extreme temperatures over time, the team found. Then, in 2014, the entire ocean hit the “point of no return,” Van Houtan says. Beginning that year, at least half of the ocean’s surface waters saw temperatures hotter than the most extreme events from 1870 to 1919.

    Marine heat waves are defined as at least five days of unusually high temperatures for a patch of ocean. Heat waves wreak havoc on ocean ecosystems, leading to seabird starvation, coral bleaching, dying kelp forests, and migration of fish, whales and turtles in search of cooler waters (SN: 1/15/20; SN: 8/10/20).
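    The two measures described above can be sketched in a few lines: compare recent temperatures against each location's most extreme 1870–1919 values, and flag marine heat waves as runs of at least five unusually hot days. This toy uses synthetic data with an invented warming trend, not the observational records the authors analyzed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly sea-surface temperatures: 150 years x 100 grid cells,
# with a small warming trend plus noise (all numbers are illustrative).
years = np.arange(1870, 2020)
sst = 15 + 0.01 * (years - 1870)[:, None] + rng.normal(0, 0.5, (150, 100))

# Benchmark: each cell's hottest value in the 1870-1919 baseline era,
# then the fraction of cells exceeding that benchmark in 2019.
baseline_max = sst[years < 1920].max(axis=0)
frac_2019 = (sst[years == 2019][0] > baseline_max).mean()
print(f"fraction of cells above early-era extremes in 2019: {frac_2019:.2f}")

def heat_wave_days(daily_temps, threshold, min_run=5):
    """Return indices of days inside runs of >= min_run days above threshold."""
    hot = daily_temps > threshold
    runs, start = [], None
    for i, h in enumerate(list(hot) + [False]):   # sentinel closes a trailing run
        if h and start is None:
            start = i
        elif not h and start is not None:
            if i - start >= min_run:
                runs.extend(range(start, i))
            start = None
    return runs

daily = np.array([20, 21, 25, 25, 26, 25, 25, 20, 25, 25, 19])
print(heat_wave_days(daily, threshold=24))   # -> [2, 3, 4, 5, 6]: only the 5-day run
```

    The short two-day warm spell near the end of the series is correctly ignored; only the five-day run counts as a heat wave under the definition above.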

    In May 2021, NOAA announced that it was updating its “climate normals” — what the agency uses to put daily weather events in historical context — from the average 1981–2010 values to the higher 1991–2020 averages (SN: 5/26/21).

    This study emphasizes that ocean heat extremes are also now the norm, Van Houtan says. “Much of the public discussion now on climate change is about future events, and whether or not they might happen,” he says. “Extreme heat became common in our ocean in 2014. It’s a documented historical fact, not a future possibility.”



    The power of chaos: A robust and low-cost cryptosystem for the post-quantum era

    Fast algorithms on quantum computers could easily break many widely used cryptosystems, necessitating more innovative solutions for digital security. In a recent study, a team of scientists designed a stream cipher consisting of three cryptographic primitives based on independent mathematical models of chaos. The resulting cryptographic approach is robust to attacks from large-scale quantum computers and can be implemented on low-cost computers, paving the way to secure digital communications in the post-quantum era.
    While for most of us cryptographic systems are things that just run “under the hood,” they are an essential element in the world of digital communications. However, the upcoming rise of quantum computers could shake the field of cryptography to its core. Fast algorithms running on these machines could break some of the most widely used cryptosystems, rendering them vulnerable. Well aware of this looming threat, cryptography researchers worldwide are working on novel encryption methods that can withstand attacks from quantum computers.
    Chaos theory is actively being studied as a basis for post-quantum era cryptosystems. In mathematics, chaos is a property of certain dynamic systems that makes them extremely sensitive to initial conditions. While technically deterministic (non-random), these systems evolve in such complex ways that predicting their long-term state with incomplete information is practically impossible, since even small rounding errors in the initial conditions yield diverging results. This unique characteristic of chaotic systems can be leveraged to produce highly secure cryptographic systems, as a team of researchers from Ritsumeikan University, Japan, showed in a recent study published in IEEE Transactions on Circuits and Systems I.
    Led by Professor Takaya Miyano, the team developed an unprecedented stream cipher consisting of three cryptographic primitives based on independent mathematical models of chaos. The first primitive is a pseudorandom number generator based on the augmented Lorenz (AL) map. The pseudorandom numbers produced this way are used to create key streams for encrypting and decrypting messages. The secret keys themselves are shared via the second and perhaps most remarkable primitive: an innovative method for secret-key exchange.
    This novel strategy for exchanging secret keys specifying the AL map is based on the synchronization of two chaotic Lorenz oscillators, which can be independently and randomly initialized by the two communicating users, without either of them knowing the state of the other’s oscillator. To conceal the internal states of these oscillators, the communicating users (the sender and the receiver) mask the value of one of the variables of their oscillator by multiplying it with a locally generated random number. The masked value of the sender is then sent to the receiver and vice versa. After a short time, when these back-and-forth exchanges cause both oscillators to sync up almost perfectly to the same state in spite of the randomization of the variables, the users can mask and exchange secret keys and then locally unmask them with simple calculations.
    Finally, the third primitive is a hash function based on the logistic map (a chaotic equation of motion), which allows the sender to send a hash value and, in turn, allows the receiver to ensure that the received secret key is correct, i.e., the chaotic oscillators were synchronized properly.
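    To illustrate the overall keystream structure, here is a deliberately simplified and insecure toy: a stream cipher driven by the logistic map, the chaotic equation the paper uses for its hash primitive, standing in for the augmented Lorenz map generator. The synchronization-based key exchange is not reproduced here; the shared secret is simply the map's initial condition.

```python
def logistic_keystream(x0, n, r=3.99):
    """Generate n pseudorandom bytes by iterating the chaotic logistic map."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)              # chaotic iteration; x stays in (0, 1)
        out.append(int(x * 256) & 0xFF)  # crude byte extraction, for illustration
    return bytes(out)

def xor_cipher(data: bytes, key: float) -> bytes:
    """Encrypt or decrypt by XORing with the keystream (XOR is its own inverse)."""
    ks = logistic_keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"post-quantum hello"
secret = 0.123456789                     # shared secret: the map's initial condition
ct = xor_cipher(msg, secret)
assert xor_cipher(ct, secret) == msg     # the same key decrypts correctly
assert xor_cipher(ct, 0.5) != msg        # a different key yields garbage
```

    The extreme sensitivity to the initial condition is what a chaotic cipher exploits: even a tiny error in the key causes the keystreams to diverge. The real scheme's security rests on the AL map generator and the oscillator-synchronization key exchange, not on this toy construction.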
    The researchers showed that a stream cipher assembled from these three primitives is extremely secure and resistant to statistical attacks and eavesdropping, since it is mathematically impossible for an eavesdropper to synchronize their own oscillator with either the sender’s or the receiver’s. This is an unprecedented achievement, as Prof. Miyano states: “Most chaos-based cryptosystems can be broken by attacks using classical computers within a practically short time. In contrast, our methods, especially the one for secret-key exchange, appear to be robust against such attacks and, more importantly, even hard to break using quantum computers.”
    In addition to its security, the proposed key exchange method is applicable to existing block ciphers, such as the widely used Advanced Encryption Standard (AES). Moreover, the researchers could implement their chaos-based stream cipher on the Raspberry Pi 4, a small-scale computer, using Python 3.8. They even used it to securely transmit a famous painting by Johannes Vermeer between Kusatsu and Sendai, two places in Japan 600 km apart. “The implementation and running costs of our cryptosystem are remarkably low compared with those of quantum cryptography,” highlights Prof. Miyano, “Our work thus provides a cryptographic approach that guarantees the privacy of daily communications between people all over the world in the post-quantum era.”
    With the power of chaos-based cryptography at hand, we may not have much to worry about from the dark side of quantum computing.
    Story Source: Materials provided by Ritsumeikan University.


    On the spot drug delivery with light-controlled organic microswimmers

    Science fiction novelists couldn’t have come up with a crazier plot: light-driven microrobots streaming through blood or other fluids in our bodies, carrying drugs to cancer cells and dropping off the medication on the spot. What sounds like a far-fetched fantasy is, however, the short summary of a research project published in the journal Science Robotics. The microswimmers presented in the work have the potential to one day perform tasks in living organisms or biological environments that are not otherwise easily accessible. Looking even further ahead, the swimmers could perhaps one day help treat cancer or other diseases.
    In their paper “Light-driven carbon nitride microswimmers with propulsion in biological and ionic media and responsive on-demand drug delivery,” a team of scientists from the Max Planck Institute for Intelligent Systems (MPI-IS) and its neighboring institute, the Max Planck Institute for Solid State Research (MPI-FKF), demonstrate organic microparticles that can steer through biological fluids and diluted blood in an unprecedented way. Even in very salty liquids, the microswimmers can be propelled forward at high speed by visible light, either individually or as a swarm. Additionally, they are partially biocompatible and can take up and release cargo on demand. At MPI-IS, scientists from the Physical Intelligence Department led by Metin Sitti were involved, along with scientists from the Nanochemistry Department led by Bettina Lotsch at MPI-FKF.
    Designing and fabricating such highly advanced microswimmers had seemed impossible until now. Locomotion powered by light energy is hindered by the salts found in water and in the body, and overcoming this requires a sophisticated design that is difficult to scale up. Additionally, controlling the robots from the outside is challenging and often costly. Controlled cargo uptake and on-the-spot delivery is another major challenge in nanorobotics.
    The scientists used a porous two-dimensional carbon nitride (CNx) that can be synthesized from organic materials, for instance, urea. Like the solar cells of a photovoltaic panel, carbon nitride can absorb light which then provides the energy to propel the robot forward when light illuminates the particle surface.
    High ion tolerance
    “The use of light as the energy source of propulsion is very convenient when doing experiments in a petri dish or for applications directly under the skin,” says Filip Podjaski, a group leader in the Nanochemistry Department at MPI-FKF. “There is just one problem: even tiny concentrations of salts prohibit light-controlled motion. Salts are found in all biological liquids: in blood, cellular fluids, digestive fluids etc. However, we have shown that our CNx microswimmers function in all biological liquids — even when the concentration of salt ions is very high. This is only possible due to a favorable interplay of different factors: efficient light energy conversion as the driving force, as well as the porous structure of the nanoparticles, which allows ions to flow through them, reducing the resistance created by salt, so to speak. In addition, in this material, light favors the mobility of ions — making the particle even faster.”
    Having shown the swimmers are salt-tolerant, the team then tackled the challenge of using them as drug carriers. “This is also possible due to the material’s porosity,” explains Varun Sridhar, a postdoctoral researcher at MPI-IS and the first author of the publication. He and his team loaded the small pores of the swimmers with the anti-cancer drug Doxorubicin. “The particles adsorbed the drug like a sponge, up to unprecedentedly high amounts of 185% of the carrier mass, while staying stably bound to the carbon nitride — even longer than a month. We then showed that controlled release of the drug is possible in a fluid with an acidic pH level. In addition, we were able to illuminate the microswimmers and thus release the drug, regardless of a change in pH. And even when loaded to full capacity, the swimmer did not slow down significantly, which is great.”