More stories

  • New technique lets scientists create resistance-free electron channels

    An international research team led by Lawrence Berkeley National Laboratory (Berkeley Lab) has taken the first atomic-resolution images and demonstrated electrical control of a chiral interface state — an exotic quantum phenomenon that could help researchers advance quantum computing and energy-efficient electronics.
    The chiral interface state is a conducting channel that allows electrons to travel in only one direction, preventing them from being scattered backwards and causing energy-wasting electrical resistance. Researchers are working to better understand the properties of chiral interface states in real materials, but visualizing their spatial characteristics has proved exceptionally difficult.
    Now, for the first time, a research team at Berkeley Lab and UC Berkeley has captured atomic-resolution images that directly visualize a chiral interface state. The researchers also demonstrated on-demand creation of these resistance-free conducting channels in a 2D insulator.
    Their work, which was reported in the journal Nature Physics, is part of Berkeley Lab’s broader push to advance quantum computing and other quantum information system applications, including the design and synthesis of quantum materials to address pressing technological needs.
    “Previous experiments have demonstrated that chiral interface states exist, but no one has ever visualized them with such high resolution. Our work shows for the first time what these 1D states look like at the atomic scale, including how we can alter them — and even create them,” said first author Canxun Zhang, a former graduate student researcher in Berkeley Lab’s Materials Sciences Division and the Department of Physics at UC Berkeley. He is now a postdoctoral researcher at UC Santa Barbara.
    Chiral interface states can occur in certain types of 2D materials known as quantum anomalous Hall (QAH) insulators that are insulators in bulk but conduct electrons without resistance at one-dimensional “edges” — the physical boundaries of the material and interfaces with other materials.
    To prepare chiral interface states, the team worked at Berkeley Lab’s Molecular Foundry to fabricate a device called twisted monolayer-bilayer graphene, which is a stack of two atomically thin layers of graphene rotated precisely relative to one another, creating a moiré superlattice that exhibits the QAH effect.

    In subsequent experiments at the UC Berkeley Department of Physics, the researchers used a scanning tunneling microscope (STM) to detect different electronic states in the sample, allowing them to visualize the wavefunction of the chiral interface state. Other experiments showed that the chiral interface state can be moved across the sample by modulating the voltage on a gate electrode placed underneath the graphene layers. In a final demonstration of control, the researchers showed that a voltage pulse from the tip of an STM probe can “write” a chiral interface state into the sample, erase it, and even rewrite a new one where electrons flow in the opposite direction.
    The findings may help researchers build tunable networks of electron channels with promise for energy-efficient microelectronics and low-power magnetic memory devices in the future, and for quantum computation making use of the exotic electron behaviors in QAH insulators.
    The researchers intend to use their technique to study more exotic physics in related materials, such as anyons, a new type of quasiparticle that could enable a route to quantum computation.
    “Our results provide information that wasn’t possible before. There is still a long way to go, but this is a good first step,” Zhang said.
    The work was led by Michael Crommie, a senior faculty scientist in Berkeley Lab’s Materials Sciences Division and a physics professor at UC Berkeley.
    Tiancong Zhu, a former postdoctoral researcher in the Crommie group at Berkeley Lab and UC Berkeley, contributed as co-corresponding author and is now a physics professor at Purdue University.
    The Molecular Foundry is a DOE Office of Science user facility at Berkeley Lab.
    This work was supported by the DOE Office of Science. Additional funding was provided by the National Science Foundation.

  • Will the convergence of light and matter in Janus particles transcend performance limitations in the optical display industry?

    A research team consisting of Professor Kyoung-Duck Park and integrated PhD student Hyeongwoo Lee from the Department of Physics at Pohang University of Science and Technology (POSTECH) has pioneered an innovative technique in ultra-high-resolution spectroscopy. Their breakthrough marks the world’s first instance of electrically controlling polaritons — hybridized light-matter particles — at room temperature.
    Polaritons are “half-light half-matter” hybrid particles, having both the characteristics of photons — particles of light — and those of solid matter. Their unique characteristics exhibit properties distinct from both traditional photons and solid matter, unlocking the potential for next-generation materials, particularly in surpassing performance limitations of optical displays. Until now, the inability to electrically control polaritons at room temperature on a single particle level has hindered their commercial viability.
    The research team has devised a novel method called “electric-field tip-enhanced strong coupling spectroscopy,” enabling ultra-high-resolution electrically controlled spectroscopy. This new technique empowers the active manipulation of individual polariton particles at room temperature.
    This technique introduces a novel approach to measurement, integrating super-resolution microscopy previously invented by Prof. Kyoung-Duck Park’s team with ultra-precise electrical control. The resulting instrument not only facilitates stable generation of polaritons in a distinctive physical state called strong coupling at room temperature but also allows the color and brightness of the light emitted by the polariton particles to be manipulated with an electric field.
    Using polariton particles instead of quantum dots, the key materials of QLED televisions, offers a notable advantage: a single polariton particle can emit light in all colors with significantly enhanced brightness, eliminating the need for three distinct types of quantum dots to produce red, green, and blue light separately. Moreover, this property can be electrically controlled, much like conventional electronics. In terms of academic significance, the team has successfully established and experimentally validated the quantum-confined Stark effect in the strong coupling regime, shedding light on a longstanding mystery in polariton research.
    The team’s accomplishment holds profound significance as it marks a scientific breakthrough paving the path for the next generation of research aimed at creating diverse optoelectronic devices and optical components based on polariton technology. This breakthrough is poised to make a substantial contribution to industrial advancement, particularly in providing key source technology for the development of groundbreaking products within the optical display industry including ultra-bright and compact outdoor displays. Hyeongwoo Lee, the lead author of the paper, emphasized the research’s importance, stating that it represents “a significant discovery with the potential to drive advancements across numerous fields including next-generation optical sensors, optical communications, and quantum photonic devices.”
    The research utilized quantum dots fabricated by Professor Sohee Jeong’s team and Professor Jaehoon Lim’s team from Sungkyunkwan University. The theoretical model was crafted by Professor Alexander Efros of the Naval Research Laboratory while data analysis was conducted by Professor Markus Raschke’s team from the University of Colorado and Professor Matthew Pelton’s team from the University of Maryland. Yeonjeong Koo, Jinhyuk Bae, Mingu Kang, Taeyoung Moon, and Huitae Joo from POSTECH’s Physics Department carried out the measurement work.
    This research was recently published in Physical Review Letters, an international physics journal, and was conducted with support from the Samsung Future Technology Incubation Program.

  • How climate change will impact food production and financial institutions

    Researchers at the University of California San Diego School of Global Policy and Strategy have developed a new method to predict the financial impacts climate change will have on agriculture, which can help support food security and financial stability for countries increasingly prone to climate catastrophes.
    The study, published today in the Proceedings of the National Academy of Sciences, uses climate and agricultural data from Brazil. It finds that climate change has a cascading effect on farming, leading to increased loan defaults for one of the nation’s largest public sector banks. Over the next three decades, climate-driven loan defaults could increase by up to 7%, according to the study.
    The projections in the paper revealed that although temperatures are rising everywhere, there is substantial variation in what that looks like from region to region, which underscores the need to build distinct types of physical and financial resilience.
    For example, parts of northern Brazil are predicted to have more dramatic seasonal swings around 2050, with heavier rainfall in winter and drier summers, so policymakers should be thinking about the need for water storage by building dams and reservoirs as well as increasing groundwater storage capacity. Conversely, central Brazil may have fairly steady weather, but will have higher overall temperatures, pointing to a need for heat-resistant crops.
    The authors of the paper used a statistical approach pairing past climate data in Brazil with information on crop productivity, farm revenue and agricultural loan performance. They combined this data with climate simulations to predict future weather conditions and their impacts on farming and how those changes will affect financial institutions.
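    The paper’s actual econometric machinery is more sophisticated, but the two-stage logic can be sketched in a few lines. In this toy version (all variable names, functional forms, and numbers are invented for illustration), historical climate data are first related to crop yields and loan defaults, and climate projections are then pushed through the fitted chain:

```python
# Toy sketch of the two-stage approach: fit climate -> yield -> default
# relationships on historical data, then apply them to projected climate.
# All numbers and the linear forms are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500  # synthetic municipality-year observations

# Stage 1: historical relationships.
temp_anomaly = rng.normal(0.5, 0.4, n)   # deg C above baseline
rain_anomaly = rng.normal(0.0, 1.0, n)   # standardized rainfall shock
crop_yield = 3.0 - 0.8 * temp_anomaly + 0.3 * rain_anomaly + rng.normal(0, 0.2, n)
default_rate = 0.05 - 0.02 * (crop_yield - 3.0) + rng.normal(0, 0.005, n)

yield_model = LinearRegression().fit(
    np.column_stack([temp_anomaly, rain_anomaly]), crop_yield)
default_model = LinearRegression().fit(
    crop_yield.reshape(-1, 1), default_rate)

# Stage 2: push a warmer, drier climate projection through the fitted chain.
climate_2050 = np.column_stack(
    [rng.normal(1.5, 0.4, n), rng.normal(-0.3, 1.0, n)])
yield_2050 = yield_model.predict(climate_2050)
default_2050 = default_model.predict(yield_2050.reshape(-1, 1))

print(f"historical mean default rate: {default_rate.mean():.3f}")
print(f"projected mean default rate:  {default_2050.mean():.3f}")
```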
    “A difficulty in studying climate impacts on agriculture is that there are all sorts of adaptations happening all the time that aren’t easily observed, but are really important for understanding vulnerability and how risk is changing,” said coauthor Jennifer Burney, professor of environmental science at UC San Diego’s School of Global Policy and Strategy and Scripps Institution of Oceanography. “We were able to distinguish signals from different types of climate impacts and which ones led to this larger financial risk.”
    Systematic thinking about building resilience against climate change around the globe
    A key objective of the research is to support resilient food security under a changing climate, which requires understanding of when small climate shifts might have outsized impacts, spilling across regions or into other sectors through institutions like trade and banking.

    Understanding the systemic risk posed by climate change is especially helpful for policymakers and disaster relief agencies, as climate change has increasingly become a national security threat. To that end, the statistical approach developed in the study could be applied around the globe.
    “The technique we developed will help populations identify where they are most vulnerable, how climate change will hurt them the most economically and what institutions they should focus on to build resilience,” said study coauthor Craig McIntosh, professor of economics at the School of Global Policy and Strategy.
    For example, some governments in the Western Pacific region buy extra food on the global market in emerging El Niño years, when their own crop productivity suffers. The statistical approach used in the study could help governments around the world understand their own climate conditions and whether local, regional or international institutions will be best placed to address them.
    The research could be especially helpful with the development of the loss and damage fund established by the United Nations in 2022. The fund is designed to help compensate developing nations that have contributed the least to the climate crisis but have been facing the brunt of its devastating floods, drought and sea-level rise.
    “Our technique could help countries think about where the resilience returns would be highest for the money spent,” said Krislert Samphantharak, professor of economics at the School of Global Policy and Strategy. “This technique also helps to identify where international reinsurance might be needed.”
    The “Empirical Modeling of Agricultural Climate Risk” study was also coauthored by Bruno Lopez-Videla, who earned a Ph.D. in economics from UC San Diego in 2021, and Alexandre Gori Maia of the Universidade Estadual de Campinas in Brazil.

  • A pulse of innovation: AI at the service of heart research

    Understanding heart function and disease, as well as testing new drugs for heart conditions, has long been a complex and time-consuming task. A promising way to study disease and test new drugs is to use cellular and engineered tissue models in a dish, but existing methods to study heart cell contraction and calcium handling require a good deal of manual work, are prone to errors, and need expensive specialized equipment. There is a clear medical need for a more efficient, accurate, and accessible way to study heart function, and artificial intelligence (AI) and machine learning offer a path to one.
    BeatProfiler, a new tool to rapidly analyze heart cell function
    Researchers at Columbia Engineering unveiled a groundbreaking new tool today that addresses these challenges head-on. BeatProfiler is a comprehensive software package that automates the analysis of heart cell function from video data and is the first system to integrate the analysis of different heart function indicators, such as contractility, calcium handling, and force output, into one tool, speeding up the process significantly and reducing the chance for errors. BeatProfiler enabled the researchers not only to distinguish between different diseases and levels of their severity but also to rapidly and objectively test drugs that affect heart function. The study was published on April 8 in IEEE Open Journal of Engineering in Medicine and Biology.
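    The paper describes a far more elaborate pipeline, but the core trick of turning microscope video into a quantitative beat signal can be illustrated compactly. In the sketch below, everything (the synthetic frames, the frame-difference motion metric, the peak-detection thresholds) is an assumption for illustration, not BeatProfiler’s actual algorithm:

```python
# Sketch: collapse each video frame to a scalar motion value, then detect
# beats as peaks in that trace. Synthetic data stands in for real video.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(1)
fps, n_frames = 30.0, 300

# Synthetic stand-in for microscopy video: 64x64 frames whose brightness
# pulses at ~1 Hz, mimicking beating cardiomyocytes.
t = np.arange(n_frames) / fps
pulse = np.clip(np.sin(2 * np.pi * 1.0 * t), 0, None) ** 4
frames = rng.normal(100, 2, (n_frames, 64, 64)) + 20 * pulse[:, None, None]

# Frame-to-frame absolute difference as a crude contraction/motion signal.
motion = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

# Each beat shows up as a burst of motion; count bursts to get beat rate.
peaks, _ = find_peaks(motion, height=motion.mean(), distance=int(0.5 * fps))
bpm = len(peaks) / (n_frames / fps) * 60
print(f"detected {len(peaks)} beats (~{bpm:.0f} beats per minute)")
```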
    “This is truly a transformative tool,” said project leader Gordana Vunjak-Novakovic, University Professor and the Mikati Foundation Professor of Biomedical Engineering, Medical Sciences, and Dental Medicine at Columbia. “It’s fast, comprehensive, automated, and compatible with a broad range of computer platforms so it is easily accessible to investigators and clinicians.”
    Software is open-source
    The team, which included Barry Fine, assistant professor of medicine (in Cardiology) at Columbia University Irving Medical Center, elected not to file a patent application and is instead offering the AI software as open source, so it can be used directly — for free — by any lab. They believe this is important for disseminating the results of their research, as well as for getting feedback from users in academic, clinical, and commercial labs that can help the team further refine the software.
    The need to diagnose heart disease quickly and accurately
    This project was driven, like much of Vunjak-Novakovic’s research, by a clinical need to diagnose heart diseases more quickly and accurately. The project was several years in the making, with the team adding features piece by piece. The overarching goal was a tool that could better capture the function of the cardiac models the team was building to study cardiac diseases and assess the efficacy of potential therapeutics; more immediately, the researchers needed to assess the function of those models quickly, accurately, and in real time.

    As the lab was making more and more cardiac tissues through innovations such as milliPillar and multiorgan tissue models, the increased capabilities of the tissues required the researchers to develop a method to more rapidly quantify the function of cardiomyocytes (heart muscle cells) and tissues to enable studies exploring genetic cardiomyopathies, cosmic radiation, immune-mediated inflammation, and drug discovery.
    Collaborators in software development, machine learning, and more
    In the last year and a half, lead author Youngbin Kim and his coauthors developed a graphical user interface (GUI) on top of the code so that biomedical researchers with no coding expertise could easily analyze the data with just a few clicks. This brought together experts in software development (for the GUI), machine learning (for the computer vision technology and disease/drug classifiers), signal processing (for processing contractile and calcium signals), and engineering (for translating pillar deflection on the cardiac platform into mechanical force), along with lab members who gave user-experience feedback for improving the interface.
    The results
    The study showed that BeatProfiler could accurately analyze cardiomyocyte function, outperforming existing tools by being faster — up to 50 times in some cases — and more reliable. It detected subtle changes in engineered heart tissue force response that other tools might miss.
    “This level of analysis speed and versatility is unprecedented in cardiac research,” said Kim, a PhD candidate in Vunjak-Novakovic’s lab at Columbia Engineering. “Using machine learning, the functional measurements analyzed by BeatProfiler helped us to distinguish between diseased and healthy heart cells with high accuracy and even to classify different cardiac drugs based on how they affect the heart.”
    What’s next
    The team is working to expand BeatProfiler’s capabilities for new applications in heart research, including a full spectrum of diseases that affect the pumping of the heart, and drug development. To ensure that BeatProfiler can be applied to a wide variety of research questions, they are testing and validating its performance across additional in vitro cardiac models, including different engineered heart tissue models. They are also refining their machine-learning algorithm to extend and generalize its use to a variety of heart diseases and drug effect classification. The long-term goal is to adapt BeatProfiler to pharmaceutical settings to speed up the testing of hundreds of thousands of candidate drugs at once.

  • Engineers design soft and flexible ‘skeletons’ for muscle-powered robots

    Our muscles are nature’s perfect actuators — devices that turn energy into motion. For their size, muscle fibers are more powerful and precise than most synthetic actuators. They can even heal from damage and grow stronger with exercise.
    For these reasons, engineers are exploring ways to power robots with natural muscles. They’ve demonstrated a handful of “biohybrid” robots that use muscle-based actuators to power artificial skeletons that walk, swim, pump, and grip. But for every bot, there’s a very different build, and no general blueprint for how to get the most out of muscles for any given robot design.
    Now, MIT engineers have developed a spring-like device that could be used as a basic skeleton-like module for almost any muscle-bound bot. The new spring, or “flexure,” is designed to get the most work out of any attached muscle tissues. Like a leg press that’s fit with just the right amount of weight, the device maximizes the amount of movement that a muscle can naturally produce.
    The researchers found that when they fit a ring of muscle tissue onto the device, much like a rubber band stretched around two posts, the muscle pulled on the spring reliably and repeatedly, stretching it five times farther than previous device designs achieved.
    The team sees the flexure design as a new building block that can be combined with other flexures to build any configuration of artificial skeletons. Engineers can then fit the skeletons with muscle tissues to power their movements.
    “These flexures are like a skeleton that people can now use to turn muscle actuation into multiple degrees of freedom of motion in a very predictable way,” says Ritu Raman, the Brit and Alex d’Arbeloff Career Development Professor in Engineering Design at MIT. “We are giving roboticists a new set of rules to make powerful and precise muscle-powered robots that do interesting things.”
    Raman and her colleagues report the details of the new flexure design in a paper appearing in the journal Advanced Intelligent Systems. The study’s MIT co-authors include Naomi Lynch ’12, SM ’23; undergraduate Tara Sheehan; graduate students Nicolas Castro, Laura Rosado, and Brandon Rios; and professor of mechanical engineering Martin Culpepper.

    Muscle pull
    When left alone in a petri dish in favorable conditions, muscle tissue will contract on its own but in directions that are not entirely predictable or of much use.
    “If muscle is not attached to anything, it will move a lot, but with huge variability, where it’s just flailing around in liquid,” Raman says.
    To get a muscle to work like a mechanical actuator, engineers typically attach a band of muscle tissue between two small, flexible posts. As the muscle band naturally contracts, it can bend the posts and pull them together, producing some movement that would ideally power part of a robotic skeleton. But in these designs, muscles have produced limited movement, mainly because the tissues are so variable in how they contact the posts. Depending on where the muscles are placed on the posts, and how much of the muscle surface is touching the post, the muscles may succeed in pulling the posts together but at other times may wobble around in uncontrollable ways.
    Raman’s group looked to design a skeleton that focuses and maximizes a muscle’s contractions regardless of exactly where and how it is placed on a skeleton, to generate the most movement in a predictable, reliable way.
    “The question is: How do we design a skeleton that most efficiently uses the force the muscle is generating?” Raman says.

    The researchers first considered the multiple directions that a muscle can naturally move. They reasoned that if a muscle is to pull two posts together along a specific direction, the posts should be connected to a spring that only allows them to move in that direction when pulled.
    “We need a device that is very soft and flexible in one direction, and very stiff in all other directions, so that when a muscle contracts, all that force gets efficiently converted into motion in one direction,” Raman says.
    Soft flex
    As it turns out, Raman found many such devices in Professor Martin Culpepper’s lab. Culpepper’s group at MIT specializes in the design and fabrication of machine elements such as miniature actuators, bearings, and other mechanisms that can be built into machines and systems to enable ultraprecise movement, measurement, and control for a wide variety of applications. Among the group’s precision-machined elements are flexures — spring-like devices, often made from parallel beams, that can flex and stretch with nanometer precision.
    “Depending on how thin and far apart the beams are, you can change how stiff the spring appears to be,” Raman says.
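    For intuition, a standard textbook model (not necessarily the geometry used in this work) treats such a flexure as two parallel fixed-guided beams, each with Young’s modulus E, width w, thickness t, and length L:

```latex
% First-order in-plane stiffness of a two-beam parallel flexure.
% Each fixed-guided beam contributes 12 E I / L^3, with I = w t^3 / 12.
k = 2 \cdot \frac{12\,E\,I}{L^{3}} = \frac{2\,E\,w\,t^{3}}{L^{3}}
```

    The cubic dependence on beam thickness and length is what lets a designer tune the apparent stiffness over orders of magnitude with modest changes in geometry, which is the tunability Raman describes.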
    She and Culpepper teamed up to design a flexure specifically tailored with a configuration and stiffness to enable muscle tissue to naturally contract and maximally stretch the spring. The team designed the device’s configuration and dimensions based on numerous calculations they carried out to relate a muscle’s natural forces with a flexure’s stiffness and degree of movement.
    The flexure they ultimately designed has 1/100 the stiffness of muscle tissue itself. The device resembles a miniature, accordion-like structure whose corners are pinned to an underlying base by a small post, which sits near a neighboring post fitted directly onto the base. Raman then wrapped a band of muscle around the two corner posts (the team molded the bands from live muscle fibers grown from mouse cells) and measured how close the posts were pulled together as the muscle band contracted.
    The team found that the flexure’s configuration enabled the muscle band to contract mostly along the direction between the two posts. This focused contraction allowed the muscle to pull the posts much closer together — five times closer — compared with previous muscle actuator designs.
    “The flexure is a skeleton that we designed to be very soft and flexible in one direction, and very stiff in all other directions,” Raman says. “When the muscle contracts, all the force is converted into movement in that direction. It’s a huge magnification.”
    The team found they could use the device to precisely measure muscle performance and endurance. When they varied the frequency of muscle contractions (for instance, stimulating the bands to contract once versus four times per second), they observed that the muscles “grew tired” at higher frequencies, and didn’t generate as much pull.
    “Looking at how quickly our muscles get tired, and how we can exercise them to have high-endurance responses — this is what we can uncover with this platform,” Raman says.
    The researchers are now adapting and combining flexures to build precise, articulated, and reliable robots, powered by natural muscles.
    “An example of a robot we are trying to build in the future is a surgical robot that can perform minimally invasive procedures inside the body,” Raman says. “Technically, muscles can power robots of any size, but we are particularly excited in making small robots, as this is where biological actuators excel in terms of strength, efficiency, and adaptability.”

  • Researchers developed new method for detecting heart failure with a smartphone

    The new technology, which was created at the University of Turku and developed by the company CardioSignal, uses a smartphone to analyse heart movement and detect heart failure. The study involved five organisations from Finland and the United States.
    Heart failure is a condition affecting tens of millions of people worldwide, in which the heart is unable to perform its normal function of pumping blood to the body. It is a serious condition that develops as a result of a number of cardiovascular diseases and its symptoms may require repeated hospitalisation.
    Heart failure is challenging to diagnose because its symptoms, such as shortness of breath, abnormal fatigue on exertion, and swelling, can be caused by a number of conditions. There is no simple test available to detect it, and diagnosis relies on an examination by a doctor, blood tests, and sophisticated imaging, such as an ultrasound scan of the heart.
    Gyrocardiography is a non-invasive technique for measuring cardiac vibrations on the chest. The smartphone’s built-in motion sensors can detect and record these vibrations, including those that doctors cannot hear with a stethoscope. The method has been developed over the last 10 years by researchers at the University of Turku and CardioSignal.
    The researchers’ latest study on using smartphone motion sensors to detect heart failure was carried out at the Turku and Helsinki University Hospitals in Finland and Stanford University Hospital in the US. Approximately 1,000 people took part in the study, of whom around 200 were patients suffering from heart failure. The study compared the data provided by the motion sensors in the heart failure patients and patients without heart disease.
    “The results we obtained with this new method are promising and may in the future make it easier to detect heart failure,” says cardiologist Antti Saraste, one of the two main authors of the research article and Professor of Cardiovascular Medicine at the University of Turku, Finland.
    Precise detection uncovers heart failure
    The researchers found that heart failure is associated with typical changes in the motion sensor data collected by a smartphone. On the basis of this data, the researchers were able to identify the majority of patients with heart failure.

    The analysis of the movements detected by the gyroscope and accelerometer is so accurate that in the future it could provide healthcare professionals with a quick and easy way to detect heart failure.
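    CardioSignal’s algorithm itself is not spelled out in this article, but the generic shape of such a pipeline (extract summary features from the motion signals, then train a classifier) can be sketched as follows; the features, signal model, and data here are invented for illustration:

```python
# Generic motion-sensor screening sketch: simple spectral/energy features
# from a chest gyroscope recording, fed to a linear classifier.
# Toy data only; not CardioSignal's actual features or model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
fs, seconds, n_subjects = 200, 60, 400  # 200 Hz gyro, 60 s per subject

def features(sig):
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    return [
        sig.std(),                         # overall vibration energy
        freqs[spectrum[1:].argmax() + 1],  # dominant frequency (skip DC)
        spectrum[(freqs > 10) & (freqs < 50)].sum() / spectrum[1:].sum(),
    ]

X, y = [], []
t = np.arange(fs * seconds) / fs
for i in range(n_subjects):
    sick = i % 2
    # Toy signal model: heart failure weakens the beat and adds slow drift.
    sig = (1.0 - 0.4 * sick) * np.sin(2 * np.pi * 1.2 * t)
    sig += rng.normal(0, 0.5, t.size)
    if sick:
        sig += 0.5 * np.sin(2 * np.pi * 0.3 * t)
    X.append(features(sig))
    y.append(sick)

X_tr, X_te, y_tr, y_te = train_test_split(
    np.array(X), np.array(y), random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print(f"toy-data screening accuracy: {clf.score(X_te, y_te):.2f}")
```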
    “Primary healthcare has very limited tools for detecting heart failure. We can create completely new treatment options for remote monitoring of at-risk groups and for monitoring already diagnosed patients after hospitalisation,” says CardioSignal’s founding member and CEO, Cardiologist Juuso Blomster.
    As in several other European countries, heart failure affects around 1-2% of the population in Finland, but it is much more common in older adults, affecting around one in ten people over the age of 70. Detecting heart failure is important as effective treatment can help to alleviate its symptoms. Accurate diagnosis and timely access to treatment can also reduce healthcare costs, which are driven up by emergency room visits and hospital stays, especially during exacerbations.
    The joint research projects between CardioSignal and the University of Turku aim to promote people’s health and reduce healthcare costs through innovation, improved disease diagnostics, and prevention of serious complications.

  • Novel robust-optimal controllers based on fuzzy descriptor system

    Nonlinear systems have applications in many diverse fields, from robotics to economics. Unlike in linear systems, the output is not proportional to the input in such systems. A classic example is the motion of a pendulum. Due to the inherent nature of nonlinear systems, their mathematical modelling and, consequently, control is difficult. In this context, the Takagi-Sugeno (T-S) fuzzy system emerges as a highly effective tool. This system leverages fuzzy logic to map input and output values to approximate a nonlinear system as multiple linear systems, which are easier to model. Fuzzy logic is a form of mathematical logic in which, instead of requiring all statements to be true (1) or false (0), the truth values can be any value between 0 and 1. The T-S fuzzy system has thus served as the foundation for several nonlinear control methods, with the Parallel Distributed Compensator (PDC) method being the most prominent.
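    To make the idea concrete, here is a minimal sketch of a T-S representation of a pendulum using the standard sector-nonlinearity construction (the matrices and membership functions are the textbook ones, not necessarily the paper’s model):

```python
# T-S fuzzy sketch: blend two linear models of a pendulum so that the
# blend reproduces theta_ddot = (g/L) * sin(theta) exactly on [-pi/2, pi/2].
import numpy as np

g, L = 9.81, 1.0

# Rule 1: linearization near theta = 0, where sin(theta) ~ theta.
A1 = np.array([[0.0, 1.0], [g / L, 0.0]])
# Rule 2: linearization near theta = +/- pi/2, where sin(theta) ~ (2/pi)*theta.
A2 = np.array([[0.0, 1.0], [(2 / np.pi) * g / L, 0.0]])

def memberships(theta):
    """Fuzzy weights h1, h2 in [0, 1] with h1 + h2 = 1 (sector nonlinearity)."""
    if abs(theta) < 1e-9:
        return 1.0, 0.0
    h1 = (np.sin(theta) - (2 / np.pi) * theta) / (theta * (1 - 2 / np.pi))
    return h1, 1.0 - h1

# The fuzzy blend of the two linear models matches the nonlinear dynamics.
for theta in (0.1, 0.8, 1.5):
    h1, h2 = memberships(theta)
    A = h1 * A1 + h2 * A2
    x = np.array([theta, 0.0])  # state: [angle, angular velocity]
    print(f"theta={theta}: blended={A.dot(x)[1]:.4f}  "
          f"true={(g / L) * np.sin(theta):.4f}")
```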
    Furthermore, scientists have developed an enhanced version of this system, known as the fuzzy descriptor system (FDS). It combines the T-S fuzzy system with the powerful state-space representation, which describes a physical system in terms of state variables, input variables, and output variables. Despite extensive research, optimal control strategies in the context of T-S FDSs are still largely unexplored. Additionally, while robust control methods, which protect against disturbances, have been explored for T-S FDSs using methods like Linear Matrix Inequalities (LMIs), these methods introduce additional complexity and optimization challenges.
    To overcome these limitations, a group of researchers, led by Associate Professor Ngoc-Tam Bui from the Innovative Global Program of the College of Engineering at Shibaura Institute of Technology in Japan and including Thi-Van-Anh Nguyen, Quy-Thinh Dao, and Duc-Binh Pham, all from Hanoi University of Science and Technology, developed novel optimal and robust-optimal controllers based on the T-S fuzzy descriptor model. Their study was published in the journal Scientific Reports on March 07, 2024.
    To develop the controllers, the team first utilized the powerful Lyapunov stability theory to establish the stability conditions for the mathematical model of the FDS. However, these stability conditions cannot be used directly. As Dr. Bui explains, “The stability conditions for the FDS model make it difficult to solve using established mathematical tools. To make them more amenable, we systematically transformed them into LMI.” These modified conditions formed the basis for developing three controllers: the stability controller, which uses PDC to manage deviations; the optimal controller, which minimizes a cost function to obtain optimal control; and the robust-optimal controller, which combines the benefits of both.
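    For flavor, the baseline quadratic-stability condition for an ordinary (non-descriptor) T-S model takes the following LMI form; the descriptor-system conditions derived in the paper are more involved, but the transformation serves the same purpose of making them checkable by convex-optimization solvers:

```latex
% Classic T-S quadratic stability for \dot{x} = \sum_i h_i(z) A_i x:
% one common Lyapunov matrix P must satisfy one LMI per fuzzy rule.
\exists\, P = P^{\top} \succ 0 :\quad
A_i^{\top} P + P A_i \prec 0, \qquad i = 1, \dots, r
```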
    The researchers demonstrated the effectiveness of these controllers in controlling a rotary inverted pendulum, a challenging system comprising an inverted pendulum mounted on a rotating base. The problem is to keep the pendulum upright by controlling the rotation of the base. The researchers tested the performance of the controllers using distinct simulation scenarios. Simulations revealed that the stability controller effectively stabilized the system when the initial displacement angle was small, while with larger initial angles there were more oscillations and the settling time was higher. The high settling time was effectively addressed by the optimal controller, which reduced it from 13 seconds to 2 seconds, a more than six-fold reduction. Moreover, it also reduced the maximum amplitudes during oscillations.
    The robust-optimal controller was tested using two different scenarios. In the first case, the mass of the pendulum bar was changed, while in the second, white noise was introduced into the control input. Compared to the optimal controller, it performed the same in the first scenario. However, the controller was considerably better in the second scenario, showing no oscillations while the optimal controller showed clear oscillations. Notably, the robust-optimal controller showed the lowest error values.
    These results highlight the adaptability and potential of these controllers in practical scenarios. “The research findings hold promising implications for various real-life applications where stable control in dynamic and uncertain environments is paramount. Specifically, autonomous vehicles and industrial robots can achieve enhanced performance and adaptability using the proposed controllers,” remarks Dr. Bui. “Overall, our research opens avenues for advancing control strategies in various domains, ultimately contributing to more capable autonomous systems, making transportation safer, healthcare more effective, and manufacturing more efficient.”

  • Protecting art and passwords with biochemistry

    Security experts fear Q-Day, the day when quantum computers become so powerful that they can crack today’s passwords. Some experts estimate that this day will come within the next ten years. Password checks are based on cryptographic one-way functions, which calculate an output value from an input value. This makes it possible to check the validity of a password without transmitting the password itself: the one-way function converts the password into an output value that can then be used to check its validity in, say, online banking. What makes one-way functions special is that it’s impossible to use their output value to deduce the input value — in other words, the password. At least not with today’s resources. However, future quantum computers could make this kind of inverse calculation easier.
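    For context, this is how a conventional arithmetic one-way function is typically used for password checks today, the scheme the DNA approach is meant to future-proof (a salted SHA-256 is shown for brevity; production systems use dedicated password hashes such as Argon2 or bcrypt):

```python
# Conventional password check with a one-way hash: only (salt, digest) is
# stored, never the password; verification recomputes and compares.
import hashlib, hmac, os

def enroll(password: str):
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + password.encode()).digest()
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.sha256(salt + password.encode()).digest()
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = enroll("correct horse battery staple")
print(verify("correct horse battery staple", salt, digest))  # True
print(verify("wrong guess", salt, digest))                   # False
```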
    Researchers at ETH Zurich have now presented a cryptographic one-way function that works differently from today’s and will also be secure in the future. Rather than processing the data using arithmetic operations, it is stored as a sequence of nucleotides — the chemical building blocks of DNA.
    Based on true randomness
    “Our system is based on true randomness. The input and output values are physically linked, and it’s only possible to get from the input value to the output value, not the other way round,” explains Robert Grass, a professor in the Department of Chemistry and Applied Biosciences. “Since it’s a physical system and not a digital one, it can’t be decoded by an algorithm, not even by one that runs on a quantum computer,” adds Anne Lüscher, a doctoral student in Grass’s group. She is the lead author of the paper, which was published in the journal Nature Communications.
    The researchers’ new system can serve as a counterfeit-proof way of certifying the authenticity of valuable objects such as works of art. The technology could also be used to trace raw materials and industrial products.
    How it works
    The new biochemical one-way function is based on a pool of one hundred million different DNA molecules. Each of the molecules contains two segments featuring a random sequence of nucleotides: one segment for the input value and one for the output value. There are several hundred identical copies of each of these DNA molecules in the pool, and the pool can also be divided into several pools; these are identical because they contain the same random DNA molecules. The pools can be located in different places, or they can be built into objects.

    Anyone in possession of this DNA pool holds the security system’s lock. The polymerase chain reaction (PCR) can be used to test a key, or input value, which takes the form of a short sequence of nucleotides. During the PCR, this key searches the pool of hundreds of millions of DNA molecules for the molecule with the matching input value, and the PCR then amplifies the output value located on the same molecule. DNA sequencing is used to make the output value readable.
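    As a software analogy only (the sequence lengths and pool size below are invented, and the toy ignores the chemistry that makes the real system impossible to copy), the pool behaves like a table of random input-output pairs that can be queried in one direction by whoever physically holds it:

```python
# Toy model of the DNA pool: random, physically held (input -> output) pairs.
# Knowing an output reveals nothing about its input; answering a query
# requires possessing the table itself.
import secrets

BASES = "ACGT"
def random_seq(n: int) -> str:
    return "".join(secrets.choice(BASES) for _ in range(n))

# Build the pool: each "molecule" pairs a random input with a random output.
pool = {random_seq(20): random_seq(20) for _ in range(100_000)}

# A key holder queries with an input sequence (the PCR step selects the
# matching molecule); sequencing then reads out the paired output value.
key = next(iter(pool))
print("input :", key)
print("output:", pool[key])
```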
    At first glance, the principle seems complicated. “However, producing DNA molecules with built-in randomness is cheap and easy,” Grass says. The production costs for a DNA pool that can be divided up in this way are less than 1 Swiss franc. Using DNA sequencing to read out the output value is more time-consuming and expensive, but many biology laboratories already possess the necessary equipment.
    Securing valuable goods and supply chains
    ETH Zurich has applied for a patent on this new technology. The researchers now want to optimise and refine it to bring it to market. Because the method calls for specialised laboratory infrastructure, the scientists think the most likely applications for this form of password verification are currently highly sensitive goods or buildings with restricted access. This technology won’t be an option for the broader public to check passwords until DNA sequencing in particular becomes easier.
    A little more thought has already gone into the idea of using the technology for the forgery-proof certification of works of art. For instance, if there are ten copies of a picture, the artist can mark them all with the DNA pool — perhaps by mixing the DNA into the paint, spraying it onto the picture or applying it to a specific spot.
    If several owners later wish to have the authenticity of these artworks confirmed, they can get together, agree on a key (i.e. an input value) and carry out the DNA test. All the copies for which the test produces the same output value will have been proven genuine. The new technology could also be used to link crypto-assets such as NFTs, which exist only in the digital world, to an object and thus to the physical world.
    Furthermore, it would support counterfeit-proof tracking along supply chains of industrial goods or raw materials. “The aviation industry, for example, has to be able to provide complete proof that it uses only original components. Our technology can guarantee traceability,” Grass says. In addition, the method could be used to label the authenticity of original medicines or cosmetics.