More stories

  •

    New organ-on-a-chip model of human synovium could accelerate development of treatments for arthritis

    The synovium is a membrane-like structure that lines the knee joint and helps to keep the joint happy and healthy, mainly by producing and maintaining synovial fluid. Inflammation of this tissue is implicated in the onset and progression of arthritic diseases such as rheumatoid arthritis and osteoarthritis. Treatments that target the synovium are therefore promising for these diseases, but finding and testing them requires better laboratory models. We have developed an organ-on-a-chip based model of the human synovium, and its associated vasculature, to address this need.
    Researchers at Queen Mary University of London have developed a new organ-on-a-chip model of the human synovium, a membrane-like tissue that lines the joints. The model, published in the journal Biomedical Materials, could help researchers to better understand the mechanisms of arthritis and to develop new treatments for this group of debilitating diseases.
    In the UK, more than 10 million people live with a form of arthritis, which affects the joints and can cause pain, stiffness, and swelling. There is currently no cure for arthritis and the search for new therapeutics is limited by a lack of accurate models.
    The new synovium-on-a-chip model is a three-dimensional microfluidic device that contains human synovial cells and blood vessel cells. The device is subjected to mechanical loading, which mimics the forces applied to the synovium during joint movement.
    The developed synovium-on-a-chip model was able to mimic the behaviour of native human synovium, producing key synovial fluid components and responding to inflammation. This suggests that the new platform has immense potential to help researchers understand disease mechanisms and identify and test new therapies for arthritic diseases.
    “Our model is the first human, vascularised, synovium-on-a-chip model with applied mechanical loading and successfully replicates a number of key features of native synovium biology,” said Dr Timothy Hopkins, Versus Arthritis Foundation Fellow, joint lead author of the study. “The model was developed upon a commercially available platform (Emulate Inc.), that allows for widespread adoption without the need for specialist knowledge of chip fabrication. The vascularised synovium-on-a-chip can act as a foundational model for academic research, with which fundamental questions can be addressed, and complexity (further cell and tissue types) can be added. In addition, we envisage that our model could eventually form part of the drug discovery pipeline in an industrial setting. Some of these conversations have already commenced.”
    The researchers are currently using the synovium-on-a-chip model to study the disease mechanisms of arthritis and to develop stratified and personalized organ-on-a-chip models of human synovium and associated tissues.
    “We believe that our synovium-on-a-chip model, and related models of human joints currently under development in our lab, have the potential to transform pre-clinical testing, streamlining delivery of new therapeutics for treatment of arthritis,” said Prof. Martin Knight, Professor of Mechanobiology. “We are excited to share this model with the scientific community and to work with industry partners to bring new treatments to patients as quickly as possible.”

  •

    Self-correcting quantum computers within reach?

    Quantum computers promise to reach speeds and efficiencies impossible for even the fastest supercomputers of today. Yet the technology hasn’t seen much scale-up and commercialization largely due to its inability to self-correct. Quantum computers, unlike classical ones, cannot correct errors by copying encoded data over and over. Scientists had to find another way.
    Now, a new paper in Nature illustrates a Harvard quantum computing platform’s potential to solve the longstanding problem known as quantum error correction.
    Leading the Harvard team is quantum optics expert Mikhail Lukin, the Joshua and Beth Friedman University Professor in physics and co-director of the Harvard Quantum Initiative. The work reported in Nature was a collaboration among Harvard, MIT, and Boston-based QuEra Computing. Also involved was the group of Markus Greiner, the George Vasmer Leverett Professor of Physics.
    An effort spanning the last several years, the Harvard platform is built on an array of very cold, laser-trapped rubidium atoms. Each atom acts as a bit — or a “qubit” as it’s called in the quantum world — which can perform extremely fast calculations.
    The team’s chief innovation is configuring their “neutral atom array” to be able to dynamically change its layout by moving and connecting atoms — this is called “entangling” in physics parlance — mid-computation. Operations that entangle pairs of atoms, called two-qubit logic gates, are units of computing power.
    Running a complicated algorithm on a quantum computer requires many gates. However, these gate operations are notoriously error-prone, and a buildup of errors renders the algorithm useless.
    In the new paper, the team reports near-flawless performance of its two-qubit entangling gates with extremely low error rates. For the first time, they demonstrated the ability to entangle atoms with error rates below 0.5 percent. In terms of operation quality, this puts their technology’s performance on par with other leading types of quantum computing platforms, like superconducting qubits and trapped-ion qubits.
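The significance of a low per-gate error rate can be sketched with simple arithmetic (an illustration, not a calculation from the paper): if each two-qubit gate fails independently with probability p, a circuit of n gates succeeds only when every gate does.

```python
def success_probability(p: float, n: int) -> float:
    """Probability that all n independent gates execute without error."""
    return (1.0 - p) ** n

# A 0.5% per-gate error rate (the level reported in the paper) versus a
# hypothetical 2% rate, over a hypothetical 100-gate circuit:
low_error = success_probability(0.005, 100)   # about 0.61
high_error = success_probability(0.02, 100)   # about 0.13
print(round(low_error, 2), round(high_error, 2))
```

At the sub-0.5-percent level a 100-gate circuit still succeeds most of the time; at a few percent per gate it almost never does, which is why pushing down gate errors matters even before error correction enters the picture.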

  •

    New study unveils stretchable high-resolution user-interactive synesthesia displays for visual–acoustic encryption

    The future of human-machine interfaces is on the cusp of a revolution with the unveiling of a groundbreaking technology — a stretchable high-resolution multicolor synesthesia display that generates synchronized sound and light as input/output sources. A research team, led by Professor Moon Kee Choi in the Department of Materials Science and Engineering at UNIST, has succeeded in developing this cutting-edge display using transfer-printing techniques, propelling the field of multifunctional displays into new realms of possibility.
    Traditionally, multifunctional displays have been confined to visualizing mechanical and electrical signals as light. However, this pioneering stretchable synesthesia display shatters preconceived boundaries by offering unparalleled optical performance and precise sound pressure levels. Its inherent stretchability ensures seamless operation under both static and dynamic deformation, preserving the integrity of the sound relative to the input waveform.
    A key advantage of this groundbreaking technology is its potential to revolutionize wearable devices, mobile devices, and the Internet of Things (IoT) as the next generation of displays. By seamlessly generating sound and light simultaneously, the stretchable display delivers a distinctive user experience and unlocks untapped potential for advanced encryption and authentication.
    To demonstrate the capabilities of this synesthesia display, the research team presented two innovative applications. Firstly, they showcased visual-acoustic encryption, an advanced encryption method that combines visual and auditory cues. This breakthrough sets the stage for reinforced authentication systems that leverage the power of both sight and sound, elevating security to new heights.
    Secondly, the team introduced a multiplex quick response code that bridges multiple domains with a single device. This remarkable technology empowers users to interact with the display, ushering in a new era of seamless integration and user-friendly experiences.
    Professor Choi enthused, “The demand for next-generation displays is skyrocketing, and this stretchable high-resolution display that generates sound and light simultaneously overcomes the limitations of previous light-emitting devices. Our novel light-emission layer transfer technology, achieved through surface energy control, enables us to achieve remarkable patterns and maintain stability even under deformation.”
    The manufactured device boasts exceptional brightness and sound characteristics, maintaining its circular shape at a rate of over 95% across more than 5,000 deformation experiments. This unparalleled durability and versatility render the stretchable display ideal for a wide range of applications, including wearable speakers, double encryption devices, and multi-quick response code implementations.
    According to the research team, this remarkable advancement in display technology propels us one step closer to a future where multifunctional displays seamlessly integrate with our daily lives. As the demand for advanced human-machine interfaces continues to surge, the stretchable high-resolution multicolor synesthesia display offers a tantalizing glimpse into the limitless possibilities of tomorrow.

  •

    AI just got 100-fold more energy efficient

    Forget the cloud.
    Northwestern University engineers have developed a new nanoelectronic device that can perform accurate machine-learning classification tasks in the most energy-efficient manner yet. Using 100-fold less energy than current technologies, the device can crunch large amounts of data and perform artificial intelligence (AI) tasks in real time without beaming data to the cloud for analysis.
    With its tiny footprint, ultra-low power consumption and lack of lag time to receive analyses, the device is ideal for direct incorporation into wearable electronics (like smart watches and fitness trackers) for real-time data processing and near-instant diagnostics.
    To test the concept, engineers used the device to classify large amounts of information from publicly available electrocardiogram (ECG) datasets. Not only could the device efficiently and correctly identify an irregular heartbeat, it was also able to determine the arrhythmia subtype from among six different categories with nearly 95% accuracy.
    The research will be published on Oct. 12 in the journal Nature Electronics.
    “Today, most sensors collect data and then send it to the cloud, where the analysis occurs on energy-hungry servers before the results are finally sent back to the user,” said Northwestern’s Mark C. Hersam, the study’s senior author. “This approach is incredibly expensive, consumes significant energy and adds a time delay. Our device is so energy efficient that it can be deployed directly in wearable electronics for real-time detection and data processing, enabling more rapid intervention for health emergencies.”
    A nanotechnology expert, Hersam is Walter P. Murphy Professor of Materials Science and Engineering at Northwestern’s McCormick School of Engineering. He is also chair of the Department of Materials Science and Engineering, director of the Materials Research Science and Engineering Center and a member of the International Institute of Nanotechnology. Hersam co-led the research with Han Wang, a professor at the University of Southern California, and Vinod Sangwan, a research assistant professor at Northwestern.

  •

    New cyber algorithm shuts down malicious robotic attack

    Australian researchers have designed an algorithm that can intercept a man-in-the-middle (MitM) cyberattack on an unmanned military robot and shut it down in seconds.
    In an experiment using deep learning neural networks to simulate the behaviour of the human brain, artificial intelligence experts from Charles Sturt University and the University of South Australia (UniSA) trained the robot’s operating system to learn the signature of a MitM eavesdropping cyberattack. This is where attackers interrupt an existing conversation or data transfer.
    The algorithm, tested in real time on a replica of a United States army combat ground vehicle, was 99% successful in preventing a malicious attack, with false positive rates of less than 2% further demonstrating the system’s effectiveness.
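The two reported figures correspond to standard detection metrics. A minimal sketch, using hypothetical confusion-matrix counts chosen to match the reported rates (the study reports rates, not raw counts):

```python
# Hypothetical counts: out of 100 attack windows and 100 benign windows,
# 99 attacks are flagged and 1 benign window is misflagged.
def rates(tp: int, fn: int, fp: int, tn: int):
    detection = tp / (tp + fn)        # fraction of real attacks caught
    false_positive = fp / (fp + tn)   # fraction of benign traffic misflagged
    return detection, false_positive

detection, false_positive = rates(tp=99, fn=1, fp=1, tn=99)
print(detection, false_positive)  # 0.99 0.01
```

Both numbers matter: a detector that flags everything reaches 100% detection but is useless, so the low false positive rate is what makes the 99% figure meaningful.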
    The results have been published in IEEE Transactions on Dependable and Secure Computing.
    UniSA autonomous systems researcher, Professor Anthony Finn, says the proposed algorithm performs better than other recognition techniques used around the world to detect cyberattacks.
    Professor Finn and Dr Fendy Santoso from Charles Sturt Artificial Intelligence and Cyber Futures Institute collaborated with the US Army Futures Command to replicate a man-in-the-middle cyberattack on a GVT-BOT ground vehicle and trained its operating system to recognise an attack.
    “The robot operating system (ROS) is extremely susceptible to data breaches and electronic hijacking because it is so highly networked,” Prof Finn says.

  •

    Exploring parameter shift for quantum Fisher information

    Quantum computing uses quantum mechanics to process and store information in a way that is different from classical computers. While classical computers rely on bits like tiny switches that can be either 0 or 1, quantum computers use quantum bits (qubits). Qubits are unique because they can be in a mixture of 0 and 1 simultaneously — a state referred to as superposition. This unique property enables quantum computers to solve specific problems significantly faster than classical ones.
    In a recent publication in EPJ Quantum Technology, Le Bin Ho from Tohoku University’s Frontier Institute for Interdisciplinary Sciences has developed a technique called “Time-dependent Stochastic Parameter Shift” in the realm of quantum computing and quantum machine learning. This breakthrough method revolutionizes the estimation of gradients or derivatives of functions, a crucial step in many computational tasks.
    Typically, computing derivatives requires dissecting the function and calculating the rate of change over a small interval. But even classical computers cannot keep dividing indefinitely. In contrast, quantum computers can accomplish this task without having to discretize the function. This is possible because quantum computers operate in a realm known as “quantum space,” characterized by periodicity, which removes the need for endless subdivisions.
    One way to illustrate this concept is by comparing the sizes of two elementary schools on a map. To do this, one might print out maps of the schools and then cut them into smaller pieces. After cutting, these pieces can be arranged into a line, with their total length compared. However, the pieces may not form a perfect rectangle, leading to inaccuracies. An infinite subdivision would be required to minimize these errors, an impractical solution, even for classical computers.
    A more straightforward method involves weighing the paper pieces representing the two schools and comparing their weights. This method yields accurate results when the paper sizes are large enough to detect the mass difference. This bears resemblance to the parameter shift concept, though operating in different spaces that do not necessitate infinite intervals.
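The contrast can be sketched numerically. Under the standard assumption that an expectation value varies sinusoidally with its gate parameter (an illustration of the basic parameter-shift rule, not the time-dependent stochastic method from the paper), two evaluations shifted by a fixed amount give the exact derivative, while a classical finite difference over a small interval is only approximate:

```python
import math

def f(theta: float) -> float:
    """Stand-in for a sinusoidal expectation value: f(theta) = cos(theta)."""
    return math.cos(theta)

def parameter_shift(g, theta: float) -> float:
    """Exact derivative for sinusoidal g: (g(t + pi/2) - g(t - pi/2)) / 2."""
    return 0.5 * (g(theta + math.pi / 2) - g(theta - math.pi / 2))

def finite_difference(g, theta: float, h: float) -> float:
    """Classical central difference: approximate, biased by O(h^2)."""
    return (g(theta + h) - g(theta - h)) / (2 * h)

theta = 0.3
exact = -math.sin(theta)
shift_error = abs(parameter_shift(f, theta) - exact)       # float noise only
fd_error = abs(finite_difference(f, theta, 0.1) - exact)   # O(h^2) bias
print(shift_error, fd_error)
```

Shrinking h reduces the finite-difference bias but never eliminates it, whereas the shift rule needs no limit at all, which is the periodicity advantage described above.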
    “Our time-dependent stochastic method is applicable to the broader applications for higher-order derivatives and can be employed to compute the quantum Fisher information matrix (QFIM), a pivotal concept in quantum information theory and quantum metrology,” states Le.
    “QFIM is intricately linked to various disciplines, including quantum metrology, phase transitions, entanglement witness, Fubini-Study metric, and quantum speed limits, making it a fundamental quantity with various applications. Therefore, calculating QFIM on quantum computers can open doors to utilizing quantum computers across diverse fields such as cryptography, optimization, drug discovery, materials science, and beyond.”
    Le also showed how this method can be used in various applications, including quantum metrology with single and multiple magnetic fields and Hamiltonian tomography applied to intricate many-body systems. He also meticulously compared the new approach to the exact theoretical method and another approximation model called the Suzuki-Trotter. While the new method aligned closely with the theoretical approach, the Suzuki-Trotter approximation deviated from the true value. Enhancing the results of the Suzuki-Trotter approximation would necessitate an infinite subdivision of the Suzuki-Trotter steps.

  •

    A step towards AI-based precision medicine

    Artificial intelligence (AI) that finds patterns in complex biological data could eventually contribute to the development of individually tailored healthcare. Researchers at Linköping University, Sweden, have developed an AI-based method applicable to various medical and biological issues. Their models can, for instance, accurately estimate people’s chronological age and determine whether or not they have been smokers.
    There are many factors that can affect which out of all our genes are used at any given point in time. Smoking, dietary habits and environmental pollution are some such factors. This regulation of gene activity can be likened to a power switch determining which genes are switched on or off, without altering the actual genes, and is called epigenetics.
    Researchers at Linköping University have used data with epigenetic information from more than 75,000 human samples to train a large number of AI neural network models. They hope that such AI-based models could eventually be used in precision medicine to develop treatments and preventive strategies tailored to the individual. Their models are autoencoders, which self-organise the information and find interrelation patterns in the large amount of data.
    To test their model, the LiU researchers compared it with existing models. There are already existing models of the effects of smoking on the body, building on the fact that specific epigenetic changes reflect the effect of smoking on the functioning of the lungs. These traces remain in the DNA long after a person has quit smoking, and this type of model can identify whether someone is a current, former or never smoker. Other models can, based on epigenetic markers, estimate the chronological age of an individual, or group individuals according to whether they have a disease or are healthy.
    The LiU researchers trained their autoencoder and then used the result to answer three different queries: age determination, smoker status and diagnosing the disease systemic lupus erythematosus, SLE. Although the existing models rely on selected epigenetic markers known to be associated with the condition they aim to classify, the LiU researchers’ autoencoders turned out to function equally well or better.
    “Our models not only enable us to classify individuals based on their epigenetic data. We found that our models can identify previously known epigenetic markers used in other models, but also new markers associated with the condition we’re examining. One example of this is that our model for smoking identifies markers associated with respiratory diseases, such as lung cancer, and DNA damage,” says David Martínez, PhD student at Linköping University.
    The objective of the autoencoder models is to enable compression of extremely complex biological data into a representation of the most relevant characteristics and patterns in data.
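That compression idea can be illustrated in miniature, assuming nothing about the LiU models' actual architecture: a one-unit linear autoencoder, trained by gradient descent on toy data, squeezes two-dimensional points lying near a line into a single latent value and reconstructs them.

```python
import random

# Toy data: 2-D points near the line y = 2x, so one latent dimension suffices.
random.seed(0)
data = [(x, 2.0 * x + random.gauss(0, 0.05)) for x in [i / 10 for i in range(-10, 11)]]

w = [0.5, 0.1]   # encoder weights: 2-D input -> 1-D latent
v = [0.1, 0.5]   # decoder weights: 1-D latent -> 2-D reconstruction
lr = 0.01

def loss(data, w, v):
    """Mean squared reconstruction error over the dataset."""
    total = 0.0
    for x1, x2 in data:
        z = w[0] * x1 + w[1] * x2      # encode
        e1 = v[0] * z - x1             # reconstruction errors
        e2 = v[1] * z - x2
        total += e1 ** 2 + e2 ** 2
    return total / len(data)

initial = loss(data, w, v)
for _ in range(500):
    gw, gv = [0.0, 0.0], [0.0, 0.0]
    for x1, x2 in data:
        z = w[0] * x1 + w[1] * x2
        e1 = v[0] * z - x1
        e2 = v[1] * z - x2
        gv[0] += 2 * e1 * z            # d(loss)/d(decoder weights)
        gv[1] += 2 * e2 * z
        gz = 2 * e1 * v[0] + 2 * e2 * v[1]
        gw[0] += gz * x1               # chain rule back to the encoder
        gw[1] += gz * x2
    n = len(data)
    w = [w[i] - lr * gw[i] / n for i in range(2)]
    v = [v[i] - lr * gv[i] / n for i in range(2)]

final = loss(data, w, v)
print(final < initial)  # reconstruction error drops as the latent axis aligns
```

The real models work on vastly higher-dimensional epigenetic profiles, but the principle is the same: the bottleneck forces the network to keep only the most relevant patterns.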

  •

    New easy-to-use optical chip can self-configure to perform various functions

    Researchers have developed an easy-to-use optical chip that can configure itself to achieve various functions. The positive real-valued matrix computation they have achieved gives the chip the potential to be used in applications requiring optical neural networks. Optical neural networks can be used for a variety of data-heavy tasks such as image classification, gesture interpretation and speech recognition.
    Photonic integrated circuits that can be reconfigured after manufacturing to perform different functions have been developed previously. However, they tend to be difficult to configure because the user needs to understand the internal structure and principles of the chip and individually adjust its basic units.
    “Our new chip can be treated as a black box, meaning users don’t need to understand its internal structure to change its function,” said research team leader Jianji Dong from Huazhong University of Science and Technology in China. “They only need to set a training objective, and, with computer control, the chip will self-configure to achieve the desired functionality based on the input and output.”
    In the journal Optical Materials Express, the researchers describe their new chip, which is based on a network of waveguide-based optical components called Mach-Zehnder interferometers (MZIs) arranged in a quadrilateral pattern. The researchers showed that the chip can self-configure to perform optical routing, low-loss light energy splitting and the matrix computations used to create neural networks.
    “In the future, we anticipate the realization of larger-scale on-chip programmable waveguide networks,” said Dong. “With additional development, it may become possible to achieve optical functions comparable to those of field-programmable gate arrays (FPGAs) — electrical integrated circuits that can be reprogrammed to perform any desired application after they are manufactured.”
    Creating the programmable MZI network
    The on-chip quadrilateral MZI network is potentially useful for applications involving optical neural networks, which are created from networks of interconnected nodes. To use an optical neural network effectively, the network must be trained with known data to determine the weights between each pair of nodes — a task that involves matrix multiplication.
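The black-box self-configuration described above can be sketched with a hypothetical single-MZI model (a toy stand-in, not the authors' quadrilateral network): an MZI built from two 50:50 couplers around an internal phase shifter acts as a 2x2 transfer matrix, and sweeping the phase under computer control until a routing objective is met plays the role of training.

```python
import cmath
import math

def mzi(theta: float):
    """2x2 transfer matrix of one MZI with internal phase theta."""
    e = cmath.exp(1j * theta)
    return [[(e - 1) / 2, 1j * (e + 1) / 2],
            [1j * (e + 1) / 2, (1 - e) / 2]]

def objective(theta: float) -> float:
    """Training objective: power routed to output 0 from input 0."""
    return abs(mzi(theta)[0][0]) ** 2   # equals sin^2(theta / 2)

# Black-box configuration: the user only states the objective; a phase sweep
# (standing in for computer-controlled training) finds the best setting.
best_theta = max((i * 2 * math.pi / 1000 for i in range(1000)), key=objective)
print(abs(best_theta - math.pi) < 0.01, objective(best_theta) > 0.999)
```

A real chip optimizes many phases at once against measured outputs, but the user-facing idea is the same: state the objective, let the control loop set the internal parameters.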