More stories

  • Dengue detection smartphone tech shows new hope for low-cost diagnostics

    Accurate home testing could be used for a wider range of illnesses, as new research shows the capability of smartphone-powered tests for Dengue Fever.
    In a paper published today in PLOS Neglected Tropical Diseases, biomedical technology researchers from the University of Reading used a new diagnostic kit called Cygnus to detect Dengue Fever with significantly improved detection rates over lateral flow test kits.
    Working with academics and clinicians in Thailand, the team trialled the tests alongside already established alternatives and found the new tests showed 82% clinical sensitivity, beating lateral flow testing (74% sensitivity) and matching hospital-based lab diagnostics (83% sensitivity). At the same time, each device takes 10 measurements, allowing identification of which of the four dengue virus types caused the infection.
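    As a quick illustration of what these percentages mean: clinical sensitivity is the fraction of truly infected patients that a test flags as positive. The sketch below uses made-up case counts (a hypothetical cohort of 100 confirmed infections per method); only the percentages reported above are from the study.

```python
# Illustrative sensitivity calculation. The case counts are invented
# for a hypothetical 100-patient cohort; only the resulting
# percentages match the figures reported in the story.

def sensitivity(true_positives, false_negatives):
    """Sensitivity = TP / (TP + FN): the share of real cases detected."""
    return true_positives / (true_positives + false_negatives)

cygnus = sensitivity(82, 18)        # 0.82 -> reported 82%
lateral_flow = sensitivity(74, 26)  # 0.74 -> reported 74%
hospital_lab = sensitivity(83, 17)  # 0.83 -> reported 83%

print(cygnus, lateral_flow, hospital_lab)
```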
    Dr Sarah Needs, Postdoctoral Research Associate in Microfluidic Antimicrobial Resistance Testing at the University of Reading and lead author of the paper, said:
    “The paper shows exciting potential for the use of the microfluidic ‘lab on a strip’ tests that can be used in conjunction with a smartphone and are more powerful than lateral flow testing in this case. As well as being cheap to produce, the lab on a strip technology allows users to test many different targets at once in one single sample, so it could be useful to detect multiple diseases, not just one.”

  • Invisible helium atoms provide exquisitely sensitive test of fundamental theory

    Physicists at the Australian National University have developed the most sensitive method ever for measuring the potential energy of an atom (to within a hundredth of a decillionth of a joule, or 10⁻³⁵ joules), and used it to validate one of the most tested theories in physics: quantum electrodynamics (QED).
    The research, published this week in Science, relies on finding the colour of laser light at which a helium atom is invisible, and is an independent corroboration of previous methods used to test QED, which have involved measuring transitions from one atomic energy state to another.
    “This invisibility is only for a specific atom and a specific colour of light — so it couldn’t be used to make an invisibility cloak that Harry Potter would use to investigate dark corners at Hogwarts,” said lead author, Bryce Henson, a PhD student at ANU Research School of Physics.
    “But we were able to use it to investigate some dark corners of QED theory.”
    “We were hoping to catch QED out, because there have been some previous discrepancies between theory and experiments, but it passed with a pretty good mark.”
    Quantum Electrodynamics, or QED, was developed in the late 1940s and describes how light and matter interact, incorporating both quantum mechanics and Einstein’s special theory of relativity in a way that has remained successful for nearly eighty years.

  • Engineered crystals could help computers run on less power

    Computers may be growing smaller and more powerful, but they require a great deal of energy to operate. The total amount of energy the U.S. dedicates to computing has risen dramatically over the last decade and is quickly approaching that of other major sectors, like transportation.
    In a study published online this week in the journal Nature, University of California, Berkeley, engineers describe a major breakthrough in the design of a component of transistors — the tiny electrical switches that form the building blocks of computers — that could significantly reduce their energy consumption without sacrificing speed, size or performance. The component, called the gate oxide, plays a key role in switching the transistor on and off.
    “We have been able to show that our gate-oxide technology is better than commercially available transistors: What the trillion-dollar semiconductor industry can do today — we can essentially beat them,” said study senior author Sayeef Salahuddin, the TSMC Distinguished Professor of Electrical Engineering and Computer Sciences at UC Berkeley.
    This boost in efficiency is made possible by an effect called negative capacitance, which helps reduce the amount of voltage that is needed to store charge in a material. Salahuddin theoretically predicted the existence of negative capacitance in 2008 and first demonstrated the effect in a ferroelectric crystal in 2011.
    The new study shows how negative capacitance can be achieved in an engineered crystal composed of a layered stack of hafnium oxide and zirconium oxide, which is readily compatible with advanced silicon transistors. By incorporating the material into model transistors, the study demonstrates how the negative capacitance effect can significantly lower the amount of voltage required to control transistors, and as a result, the amount of energy consumed by a computer.
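    In the usual textbook picture of this effect (a sketch with illustrative values, not the paper's device parameters), the gate stack is modelled as a ferroelectric layer with negative capacitance in series with a conventional dielectric. Because series capacitors carry the same charge, a negative capacitance makes the internal node voltage exceed the applied gate voltage, so less external voltage is needed:

```python
# Series-capacitor sketch of negative capacitance. All values are
# hypothetical round numbers chosen for illustration.

def series_capacitance(c_fe, c_ox):
    """Effective capacitance of two capacitors in series."""
    return (c_fe * c_ox) / (c_fe + c_ox)

def voltage_amplification(c_fe, c_ox):
    """Fraction of the applied voltage appearing across C_ox
    (same charge Q on both layers: V_ox / V_total)."""
    return c_fe / (c_fe + c_ox)

c_ox = 1.0e-6   # F, hypothetical conventional dielectric
c_fe = -2.0e-6  # F, hypothetical negative (ferroelectric) capacitance

c_total = series_capacitance(c_fe, c_ox)   # 2.0e-6 F, larger than c_ox alone
gain = voltage_amplification(c_fe, c_ox)   # 2.0: internal voltage boost

print(c_total, gain)
```

    With an ordinary (positive) ferroelectric layer the series combination would be smaller than either capacitor; the negative sign is what flips this into an amplification.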
    “In the last 10 years, the energy used for computing has increased exponentially, already accounting for single-digit percentages of the world’s energy production, which grows only linearly, without an end in sight,” Salahuddin said. “Usually, when we are using our computers and our cell phones, we don’t think about how much energy we are using. But it is a huge amount, and it is only going to go up. Our goal is to reduce the energy needs of this basic building block of computing, because that brings down the energy needs for the entire system.”
    Bringing negative capacitance to real technology

  • Computerized, rolling DNA motors move molecular robotics to next level

    Chemists integrated computer functions into rolling DNA-based motors, opening a new realm of possibilities for miniature, molecular robots. Nature Nanotechnology published the development, the first DNA-based motors that combine computational power with the ability to burn fuel and move in an intentional direction.
    “One of our big innovations, beyond getting the DNA motors to perform logic computations, is finding a way to convert that information into a simple output signal — motion or no motion,” says Selma Piranej, an Emory University PhD candidate in chemistry, and first author of the paper. “This signal can be read by anyone holding a cell phone equipped with an inexpensive magnifying attachment.”
    “Selma’s breakthrough removes major roadblocks that stood in the way of making DNA computers useful and practical for a range of biomedical applications,” says Khalid Salaita, senior author of the paper and a professor of chemistry at Emory University. Salaita is also on the faculty of the Wallace H. Coulter Department of Biomedical Engineering, a joint program of Georgia Tech and Emory.
    The motors can sense chemical information in their environment, process that information, and then respond accordingly, mimicking some basic properties of living cells.
    “Previous DNA computers did not have directed motion built in,” Salaita says. “But to get more sophisticated operations, you need to combine both computation and directed motion. Our DNA computers are essentially autonomous robots with sensing capabilities that determine whether they move or not.”
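    As a toy model (ours, not the authors' chemistry): a motor programmed as an AND gate rolls only when both of the DNA input strands it recognises are present, producing exactly the binary motion / no-motion readout described above. The strand names are placeholders.

```python
# Toy AND-gate model of a DNA motor's logic: the motor "moves" only
# if every required input strand is sensed in its environment.
# Strand names are hypothetical placeholders.

def motor_output(inputs, required=("strand_A", "strand_B")):
    """Return 'motion' if all required input strands are present."""
    return "motion" if all(s in inputs for s in required) else "no motion"

print(motor_output({"strand_A", "strand_B"}))  # motion
print(motor_output({"strand_A"}))              # no motion
```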
    The motors can be programmed to respond to a specific pathogen or DNA sequence, making them a potential technology for medical testing and diagnostics.

  • Researchers engineer electrically tunable graphene devices to study rare physics

    An international team, co-led by researchers at The University of Manchester’s National Graphene Institute (NGI) in the UK and the Penn State College of Engineering in the US, has developed a tunable graphene-based platform that allows for fine control over the interaction between light and matter in the terahertz (THz) spectrum to reveal rare phenomena known as exceptional points. The team published their results today (8 April) in Science.
    The work could advance optoelectronic technologies that generate, control and sense light, and potentially communications as well, according to the researchers. They demonstrated a way to control THz waves, which exist at frequencies between those of microwaves and infrared waves. The feat could contribute to the development of ‘beyond-5G’ wireless technology for high-speed communication networks.
    Weak and strong interactions
    Light and matter can couple, interacting at different levels: weakly, where they might be correlated but do not change each other’s constituents; or strongly, where their interactions can fundamentally change the system. The ability to control how the coupling shifts from weak to strong and back again has been a major challenge to advancing optoelectronic devices — a challenge researchers have now solved.
    “We have demonstrated a new class of optoelectronic devices using concepts of topology — a branch of mathematics studying properties of geometric objects,” said co-corresponding author Coskun Kocabas, professor of 2D device materials at The University of Manchester. “Using exceptional point singularities, we show that topological concepts can be used to engineer optoelectronic devices that enable new ways to manipulate terahertz light.”
    Kocabas is also affiliated with the Henry Royce Institute for Advanced Materials, headquartered in Manchester.

  • Nanoparticle trapped between mirrors works as a quantum sensor

    Sensors are a pillar of the Internet of Things, providing the data to control all sorts of objects. Here, precision is essential, and this is where quantum technologies could make a difference. Researchers are now demonstrating how nanoparticles in tiny optical resonators can be transferred into the quantum regime and used as high-precision sensors.
    Advances in quantum physics offer new opportunities to significantly improve the precision of sensors and thus enable new technologies. A team led by Oriol Romero-Isart of the Institute of Quantum Optics and Quantum Information at the Austrian Academy of Sciences and the Department of Theoretical Physics at the University of Innsbruck, and a team led by Romain Quidant of ETH Zurich, are now proposing a new concept for a high-precision quantum sensor. The researchers suggest that the motional fluctuations of a nanoparticle trapped in a microscopic optical resonator could be reduced significantly below the zero-point motion, by exploiting the fast unstable dynamics of the system.
    Particle caught between mirrors
    Mechanical quantum squeezing reduces the uncertainty of motional fluctuations below the zero-point motion, and it has been experimentally demonstrated in the past with micromechanical resonators in the quantum regime. The researchers now propose a novel approach, especially tailored to levitated mechanical systems. “We demonstrate that a properly designed optical cavity can be used to rapidly and strongly squeeze the motion of a levitated nanoparticle,” says Katja Kustura of Oriol Romero-Isart’s team in Innsbruck. In an optical resonator, light is reflected between mirrors and interacts with the levitated nanoparticle. Such interactions can give rise to dynamical instabilities, which are often considered undesirable.
    The researchers now show how these instabilities can instead be used as a resource. “In the present work, we show how, by properly controlling these instabilities, the resulting unstable dynamics of a mechanical oscillator inside an optical cavity leads to mechanical squeezing,” Kustura says. The new protocol is robust in the presence of dissipation, making it particularly feasible in levitated optomechanics. In the paper, published in the journal Physical Review Letters, the researchers apply this approach to a silica nanoparticle coupled to a microcavity via coherent scattering. “This example shows that we can squeeze the particle by orders of magnitude below the zero-point motion, even if starting from an initial thermal state,” says Oriol Romero-Isart.
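    In standard quantum-optics notation (ours, not the paper's), squeezing the motion below the zero-point level means the position uncertainty of an oscillator of mass $m$ and frequency $\omega$ falls below its ground-state value:

```latex
\Delta x < x_{\mathrm{zp}} = \sqrt{\frac{\hbar}{2 m \omega}}
```

    Heisenberg's relation $\Delta x \, \Delta p \ge \hbar/2$ then forces the conjugate momentum uncertainty above its own zero-point value; squeezing trades precision in one quadrature for noise in the other.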
    The work provides a new use of optical cavities as mechanical quantum squeezers, and it suggests a viable new route in levitated optomechanics beyond the quantum ground state cooling. Micro-resonators thus offer an interesting new platform for the design of quantum sensors, which could be used, for example, in satellite missions, self-driving cars, and in seismology. The research in Innsbruck and Zurich was financially supported by the European Union.
    Story Source:
    Materials provided by University of Innsbruck.

  • Blockchain offers a solution to post-Brexit border digitization to build supply chain trust, research shows

    As a result of the UK leaving the European Union, logistics firms have faced additional friction at UK borders. Consequently, there have been calls for automated digital borders, but few such systems exist. Surrey researchers have now discovered that a blockchain-based platform can improve supply chain efficiency and trust development at the border.
    Blockchain is a system in which a record of transactions, originally those made in bitcoin or another cryptocurrency, is maintained across several computers linked in a peer-to-peer network. The blockchain-based platform studied in this case is known as the RFIT platform, a pilot blockchain implementation that links data together and ensures that the data is unalterable. This end-to-end visibility of unchangeable data helps to build trust between supply partners.
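    The tamper-evidence property works, in general terms, like the minimal sketch below (an illustration of hash chaining, not the RFIT platform's actual design): each block stores a hash of the previous block, so editing any earlier record invalidates every hash that follows.

```python
# Minimal hash-chain sketch of blockchain tamper evidence.
# Illustrative only; record contents are hypothetical.

import hashlib

def block_hash(prev_hash, data):
    """Hash that binds a record to everything before it."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "genesis"
    for data in records:
        prev = block_hash(prev, data)
        chain.append({"data": data, "hash": prev})
    return chain

def verify(chain):
    """Recompute every hash; any edited record breaks the chain."""
    prev = "genesis"
    for block in chain:
        if block_hash(prev, block["data"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["shipment docs", "customs declaration"])
print(verify(chain))            # True: untouched chain verifies
chain[0]["data"] = "edited docs"
print(verify(chain))            # False: tampering is evident
```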
    Professor of Digital Transformation at the University of Surrey and co-author of the study, Glenn Parry, said:
    “Since the UK’s withdrawal from the EU Customs Union, businesses have faced increased paperwork, border delays and higher costs. A digitally managed border system that identifies trusted shipments appears an obvious solution, but we needed to define what trust actually means and how a digital system can help.
    “Supply chain participants have long recognised the importance of trust in business relationships. A lack of trust is the primary reason companies cite when supply chain relationships break down, which is especially true at customs borders. Current supply chain friction at UK borders is replicated across the world. Delay is caused by a lack of trust in goods flows, and hence a need to inspect.”
    Surrey academics stressed that the introduction of this platform does not remove the need for trust and trust-building processes in established buyer-supplier relationships. It’s crucial that blockchain platform providers continue to build a position of trust with all participants.
    In the case of the import of wine from Australia to the UK, researchers found that the RFIT platform can employ a blockchain layer to make documentation unalterable. The platform facilitates the building of trust across the supply chain by providing a single source of validated data and increasing visibility. Reduced data asymmetry between border agencies and suppliers improves accuracy, timeliness, and integrity.
    Through its 2025 UK Border Strategy, the UK Government is seeking to establish technology leadership in reducing friction in cross-border supply chains.
    Visiting Fellow at Surrey and co-author of the study, Mike Brookbanks, said:
    “The broader findings from the case study are influencing the UK Government on how to address the current challenges with supply chains at UK customs borders. We hope our work will also influence the Government’s current focus on trust ecosystems, as part of the single trade window (STW) initiative. We truly believe that the use of this innovative digital technology will form the Government’s first step in developing a utility trade platform, encouraging broader digitisation of our borders.”
    Story Source:
    Materials provided by University of Surrey.

  • AI predicts if — and when — someone will have cardiac arrest

    A new artificial intelligence-based approach can predict, significantly more accurately than a doctor, if and when a patient could die of cardiac arrest. The technology, built on raw images of patients’ diseased hearts and patient backgrounds, stands to revolutionize clinical decision making and increase survival from sudden and lethal cardiac arrhythmias, one of medicine’s deadliest and most puzzling conditions.
    The work, led by Johns Hopkins University researchers, is detailed today in Nature Cardiovascular Research.
    “Sudden cardiac death caused by arrhythmia accounts for as many as 20 percent of all deaths worldwide and we know little about why it’s happening or how to tell who’s at risk,” said senior author Natalia Trayanova, the Murray B. Sachs Professor of Biomedical Engineering and Medicine. “There are patients who may be at low risk of sudden cardiac death getting defibrillators that they might not need and then there are high-risk patients that aren’t getting the treatment they need and could die in the prime of their life. What our algorithm can do is determine who is at risk for cardiac death and when it will occur, allowing doctors to decide exactly what needs to be done.”
    The team is the first to use neural networks to build a personalized survival assessment for each patient with heart disease. These risk measures predict, with high accuracy, the chance of sudden cardiac death over the next 10 years and when it is most likely to happen.
    The deep learning technology is called Survival Study of Cardiac Arrhythmia Risk (SSCAR). The name alludes to cardiac scarring caused by heart disease that often results in lethal arrhythmias, and the key to the algorithm’s predictions.
    The team used contrast-enhanced cardiac images that visualize scar distribution from hundreds of real patients with cardiac scarring at Johns Hopkins Hospital to train an algorithm to detect patterns and relationships not visible to the naked eye. Current clinical cardiac image analysis extracts only simple scar features like volume and mass, severely underutilizing what this work demonstrates to be critical data.