More stories

  •

    New transistor could cut 5% from world’s digital energy budget

    A new spin on one of the 20th century’s smallest but grandest inventions, the transistor, could help feed the world’s ever-growing appetite for digital memory while slicing up to 5% of the energy from its power-hungry diet.
    Following years of innovations from the University of Nebraska-Lincoln’s Christian Binek and University at Buffalo’s Jonathan Bird and Keke He, the physicists recently teamed up to craft the first magneto-electric transistor.
    Along with curbing the energy consumption of any microelectronics that incorporate it, the team’s design could reduce the number of transistors needed to store certain data by as much as 75%, leading to smaller devices, said Nebraska physicist Peter Dowben. It could also lend those microelectronics steel-trap memory that remembers exactly where its users leave off, even after being shut down or abruptly losing power.
    “The implications of this most recent demonstration are profound,” said Dowben, who co-authored a recent paper on the work that graced the cover of the journal Advanced Materials.
    Many millions of transistors line the surface of every modern integrated circuit, or microchip, which itself is manufactured in staggering numbers — roughly 1 trillion in 2020 alone — from the industry-favorite semiconducting material, silicon. By regulating the flow of electric current within a microchip, the tiny transistor effectively acts as a nanoscopic on-off switch that’s essential to writing, reading and storing data as the 1s and 0s of digital technology.
    But silicon-based microchips are nearing their practical limits, Dowben said. Those limits have the semiconductor industry investigating and funding every promising alternative it can.

  •

    Innovative technology will use smart sensors to ensure vaccine safety

    A new study from Tel Aviv University enables vaccine developers, for the first time, to determine vaccine safety via smart sensors that measure objective physiological parameters. According to the researchers, most clinical trials testing the safety of new vaccines, including COVID-19 vaccines, rely on participants’ subjective reports, which can lead to biased results. In contrast, objective physiological data, obtained through sensors attached to the body, is clear and unambiguous.
    The study was led by Dr. Yftach Gepner of the Department of Epidemiology and Preventive Medicine at TAU’s Sackler Faculty of Medicine, together with Dr. Dan Yamin and Dr. Erez Shmueli from TAU’s Fleischman Faculty of Engineering. The paper was published in Communications Medicine, a journal from the Nature portfolio.
    Dr. Gepner: “In most methods used today, clinical trials designed to evaluate the safety of a new drug or vaccine employ self-report questionnaires, asking participants how they feel before and after receiving the treatment. This is clearly a totally subjective report. Even when Pfizer and Moderna developed their vaccines for the new COVID-19 virus, they used self-reports to prove their safety.”
    In the current study, researchers from Tel Aviv University demonstrated that smart sensors can be used to test new vaccines. The study was conducted while many Israelis were receiving their second dose of the COVID-19 vaccine. The researchers equipped volunteers with innovative, FDA-approved sensors developed by the Israeli company Biobeat. Attached to their chests, these sensors measured physiological reactions from one day before to three days after receiving the vaccine. The sensors monitored 13 physiological parameters, such as heart rate, breathing rate, saturation (blood oxygen levels), heartbeat volume, temperature, cardiac output, and blood pressure.
    The surprising results: a significant discrepancy was found between subjective self-reports about side effects and actual measurements. That is, in nearly all objective measures, significant changes were identified after vaccination, even for subjects who reported having no reaction at all.
    In addition, the study found that side effects escalate over the first 48 hours, and then parameters return to the level measured before vaccination. In other words: a direct assessment of the vaccine’s safety identified physiological reactions during the first 48 hours, with levels restabilizing afterwards.
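    As a rough illustration of how such a baseline-versus-post-vaccination comparison might work (a hedged sketch only, not the study’s actual analysis pipeline; the column names and the two-standard-deviation threshold are assumptions), consider:

```python
# Hedged sketch of a baseline-vs-post-vaccination comparison using wearable
# sensor readings. Column names, data layout and the 2-SD threshold are
# assumptions for illustration, not the Tel Aviv University study's pipeline.
import pandas as pd

def flag_objective_changes(readings: pd.DataFrame, threshold_sd: float = 2.0) -> pd.DataFrame:
    """readings has columns ['subject', 'hours_from_dose', 'parameter', 'value'],
    spanning one day before to three days after vaccination."""
    # Per-subject, per-parameter baseline from the pre-vaccination day.
    baseline = (readings[readings.hours_from_dose < 0]
                .groupby(["subject", "parameter"])["value"]
                .agg(["mean", "std"])
                .rename(columns={"mean": "base_mean", "std": "base_sd"}))
    post = readings[readings.hours_from_dose >= 0].join(
        baseline, on=["subject", "parameter"])
    # A post-dose reading counts as a change if it deviates from the subject's
    # own baseline by more than `threshold_sd` standard deviations.
    post["changed"] = (post["value"] - post["base_mean"]).abs() > threshold_sd * post["base_sd"]
    return post.groupby(["subject", "parameter"])["changed"].any().reset_index()
```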
    “The message from our study is clear,” says Dr. Gepner. “In 2022 the time has come to conduct continual, sensitive, objective testing of the safety of new vaccines and therapies. There is no reason to rely on self-reports or wait for the occurrence of rare side effects like myocarditis, an inflammation of the heart muscle, which occurs in one of 10,000 patients. Preliminary signs that predict such conditions can be detected with advanced sensors, identifying normal vs. extreme alterations in physiological parameters and any risk of inflammation. Today trial participants are invited to the clinic for blood pressure testing, but often their blood pressure rises just because the situation is stressful. Continual monitoring at home solves these problems with simple, convenient, inexpensive, and accurate means. This is the kind of medicine we should strive for in 2022.”
    Story Source:
    Materials provided by Tel Aviv University.

  •

    Trainee teachers made sharper assessments about learning difficulties after receiving feedback from AI

    A trial in which trainee teachers learning to identify pupils with potential learning difficulties had their work ‘marked’ by artificial intelligence has found that the approach significantly improved their reasoning.
    The study, with 178 trainee teachers in Germany, was carried out by a research team led by academics at the University of Cambridge and Ludwig-Maximilians-Universität München (LMU Munich). It provides some of the first evidence that artificial intelligence (AI) could enhance teachers’ ‘diagnostic reasoning’: the ability to collect and assess evidence about a pupil, and draw appropriate conclusions so they can be given tailored support.
    During the trial, trainees were asked to assess six fictionalised ‘simulated’ pupils with potential learning difficulties. They were given examples of their schoolwork, as well as other information such as behaviour records and transcriptions of conversations with parents. They then had to decide whether or not each pupil had learning difficulties such as dyslexia or Attention Deficit Hyperactivity Disorder (ADHD), and explain their reasoning.
    Immediately after submitting their answers, half of the trainees received a prototype ‘expert solution’, written in advance by a qualified professional, to compare with their own. This is typical of the practice material student teachers usually receive outside taught classes. The others received AI-generated feedback, which highlighted the correct parts of their solution and flagged aspects they might have improved.
    After completing the six preparatory exercises, the trainees then took two similar follow-up tests — this time without any feedback. The tests were scored by the researchers, who assessed both their ‘diagnostic accuracy’ (whether the trainees had correctly identified cases of dyslexia or ADHD), and their diagnostic reasoning: how well they had used the available evidence to make this judgement.
    The average score for diagnostic reasoning among trainees who had received AI feedback during the six preliminary exercises was an estimated 10 percentage points higher than that of trainees who had worked with the pre-written expert solutions.
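    For intuition, the headline result amounts to a difference in group means; the study’s own statistical analysis is not described here, so the sketch below, with placeholder scores and a plain two-sample t-test, is purely illustrative:

```python
# Hedged sketch: compare mean diagnostic-reasoning scores between the group
# that received AI-generated feedback and the group that received pre-written
# expert solutions. The scores below are placeholders, not the study's data.
from scipy import stats

ai_feedback_scores = [72, 65, 80, 77, 70, 68, 74]       # illustrative values
expert_solution_scores = [61, 58, 71, 66, 63, 69, 60]   # illustrative values

diff = (sum(ai_feedback_scores) / len(ai_feedback_scores)
        - sum(expert_solution_scores) / len(expert_solution_scores))
t_stat, p_value = stats.ttest_ind(ai_feedback_scores, expert_solution_scores)
print(f"Mean difference: {diff:.1f} percentage points (p = {p_value:.3f})")
```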

  •

    From computer to benchtop: Researchers find clues to new mechanisms for coronaviruses infections

    A group of bat viruses related to SARS-CoV-2 can also infect human cells but uses a different and unknown entryway.
    While researchers are still homing in on how these viruses infect cells, the findings could help in the development of new vaccines that prevent coronaviruses from causing another pandemic.
    Publishing in the journal eBioMedicine, a team of Washington State University researchers used a computational approach based on network science to distinguish a group of coronaviruses that can infect human cells from those that can’t. The researchers then confirmed their computational results in the laboratory, showing that a specific cluster of viruses can infect both human and bat cells.
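    The paper’s exact network-science method is not spelled out here; as a hedged sketch of the general idea, one might connect viruses whose receptor-binding sequences are highly similar and read candidate groups off the resulting graph (the similarity measure and threshold below are assumptions for illustration):

```python
# Hedged sketch of a network-based grouping of coronaviruses -- not the WSU
# team's actual method. Viruses are linked when their aligned receptor-binding
# sequences exceed a similarity threshold; connected components form clusters.
import networkx as nx

def identity_fraction(a: str, b: str) -> float:
    """Fraction of identical positions between two pre-aligned sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cluster_viruses(sequences: dict[str, str], threshold: float = 0.8) -> list[set[str]]:
    """sequences maps virus name -> aligned receptor-binding-domain sequence."""
    graph = nx.Graph()
    graph.add_nodes_from(sequences)
    names = list(sequences)
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            if identity_fraction(sequences[u], sequences[v]) >= threshold:
                graph.add_edge(u, v)
    # The cluster containing a known human-infecting virus (e.g. SARS-CoV-2)
    # flags related viruses worth testing in the laboratory.
    return [set(c) for c in nx.connected_components(graph)]
```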
    “What we find with these viruses is that they’re able to get into the cells through another mechanism or receptor, and that has a lot of implications for how, and if, they would be able to infect us,” said Michael Letko, co-senior author and assistant professor in the Paul Allen School of Global Health.
    Cross-species transmission of coronaviruses poses a serious threat to global health. While numerous coronaviruses have been discovered in wildlife, researchers haven’t been able to predict which pose the greatest threat to humans and are left scrambling to develop vaccines after viruses spill over.
    “As we encroach more and more on places where there are human and animal interactions, it’s quite likely that there will be many viruses that will need to be examined,” said Shira Broschat, professor in the School of Electrical Engineering and Computer Science, also co-senior author on the paper.

  •

    Toward high-powered telecommunication systems

    For all the recent advances in integrated lithium niobate photonic circuits — from frequency combs to frequency converters and modulators — one big component has remained frustratingly difficult to integrate: lasers.
    Long haul telecommunication networks, data center optical interconnects, and microwave photonic systems all rely on lasers to generate an optical carrier used in data transmission. In most cases, lasers are stand-alone devices, external to the modulators, making the whole system more expensive and less stable and scalable.
    Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) in collaboration with industry partners at Freedom Photonics and HyperLight Corporation, have developed the first fully integrated high-power laser on a lithium niobate chip, paving the way for high-powered telecommunication systems, fully integrated spectrometers, optical remote sensing, and efficient frequency conversion for quantum networks, among other applications.
    “Integrated lithium niobate photonics is a promising platform for the development of high-performance chip-scale optical systems, but getting a laser onto a lithium niobate chip has proved to be one of the biggest design challenges,” said Marko Loncar, the Tiantsai Lin Professor of Electrical Engineering and Applied Physics at SEAS and senior author of the study. “In this research, we used all the nano-fabrication tricks and techniques learned from previous developments in integrated lithium niobate photonics to overcome those challenges and achieve the goal of integrating a high-powered laser on a thin-film lithium niobate platform.”
    The research is published in the journal Optica.
    Loncar and his team used small but powerful distributed feedback lasers for their integrated chip. On chip, the lasers sit in small wells or trenches etched into the lithium niobate and deliver up to 60 milliwatts of optical power in the waveguides fabricated in the same platform. The researchers combined the laser with a 50 gigahertz electro-optic modulator in lithium niobate to build a high-power transmitter.
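    For context, chip-scale optical power is often quoted in dBm; converting the reported 60 milliwatts is a one-line calculation (the formula is standard, the framing below is ours):

```python
# Convert the reported on-chip laser power to dBm (decibels relative to 1 mW).
from math import log10

power_mw = 60.0
power_dbm = 10 * log10(power_mw / 1.0)
print(f"{power_mw:.0f} mW ≈ {power_dbm:.1f} dBm")  # about 17.8 dBm
```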
    “Integrating high-performance plug-and-play lasers would significantly reduce the cost, complexity, and power consumption of future communication systems,” said Amirhassan Shams-Ansari, a graduate student at SEAS and first author of the study. “It’s a building block that can be integrated into larger optical systems for a range of applications, in sensing, lidar, and data telecommunications.”
    By combining thin-film lithium niobate devices with high-power lasers using an industry-friendly process, this research represents a key step towards large-scale, low-cost, and high-performance transmitter arrays and optical networks. Next, the team aims to increase the laser’s power and scalability for even more applications.
    Harvard’s Office of Technology Development has protected the intellectual property arising from the Loncar Lab’s innovations in lithium niobate systems. Loncar is a cofounder of HyperLight Corporation, a startup which was launched to commercialize integrated photonic chips based on certain innovations developed in his lab.
    The research was co-authored by Dylan Renaud, Rebecca Cheng, Linbo Shao, Di Zhu, and Mengjie Yu from SEAS; Hannah R. Grant and Leif Johansson from Freedom Photonics; and Lingyan He and Mian Zhang from HyperLight Corporation. It was supported by the Defense Advanced Research Projects Agency under grant HR0011-20-C-0137 and the Air Force Office of Scientific Research under grant FA9550-19-1-0376.

  •

    Dengue detection smartphone tech shows new hope for low-cost diagnostics

    Accurate home testing could be used for a wider range of illnesses, as new research shows the capability of smartphone-powered tests for Dengue Fever.
    In a paper published in PLOS Neglected Tropical Diseases today, biomedical technology researchers from the University of Reading used a new diagnostic kit called Cygnus to detect Dengue Fever, achieving significantly better detection rates than lateral flow test kits.
    Working with academics and clinicians in Thailand, the team trialled the tests alongside already established alternatives and found the new tests showed 82% clinical sensitivity, beating lateral flow testing (74% sensitivity) and matching hospital-based lab diagnostics (83% sensitivity). At the same time, these devices make 10 measurements, allowing researchers to identify which of the four dengue virus types caused the infection.
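    Clinical sensitivity here is simply the fraction of true dengue cases a test detects; a minimal sketch of that calculation, with made-up case counts rather than the trial’s data:

```python
# Hedged sketch: clinical sensitivity = true positives / (true positives + false negatives).
# The counts below are placeholders chosen to reproduce the reported percentages,
# not figures from the Reading/Thailand trial.
def sensitivity(true_positives: int, false_negatives: int) -> float:
    return true_positives / (true_positives + false_negatives)

print(f"Cygnus:        {sensitivity(82, 18):.0%}")  # 82%
print(f"Lateral flow:  {sensitivity(74, 26):.0%}")  # 74%
print(f"Hospital lab:  {sensitivity(83, 17):.0%}")  # 83%
```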
    Dr Sarah Needs, Postdoctoral Research Associate in Microfluidic Antimicrobial Resistance Testing at the University of Reading, is lead author of the paper.
    Dr Needs said:
    “The paper shows exciting potential for the use of microfluidic ‘lab on a strip’ tests that can be used in conjunction with a smartphone and are more powerful than LFT testing in this case. As well as being cheap to produce, the lab on a strip technology allows users to test many different targets at once in a single sample, so it could be useful to detect multiple diseases, not just one.”

  •

    Invisible helium atoms provide exquisitely sensitive test of fundamental theory

    Physicists at the Australian National University have developed the most sensitive method ever for measuring the potential energy of an atom (within a hundredth of a decillionth of a joule — or 10⁻³⁵ joules), and used it to validate one of the most tested theories in physics — quantum electrodynamics (QED).
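    The bracketed figure follows directly from the wording, taking “decillionth” on the short scale (a decillion is 10³³):

```latex
% A hundredth of a decillionth of a joule, with a short-scale decillion = 10^{33}:
\frac{1}{100} \times 10^{-33}\,\mathrm{J} \;=\; 10^{-2} \times 10^{-33}\,\mathrm{J} \;=\; 10^{-35}\,\mathrm{J}
```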
    The research, published this week in Science, relies on finding the colour of laser light at which a helium atom is invisible, and is an independent corroboration of previous methods used to test QED, which have involved measuring transitions from one atomic energy state to another.
    “This invisibility is only for a specific atom and a specific colour of light — so it couldn’t be used to make an invisibility cloak that Harry Potter would use to investigate dark corners at Hogwarts,” said lead author, Bryce Henson, a PhD student at ANU Research School of Physics.
    “But we were able to use it to investigate some dark corners of QED theory.”
    “We were hoping to catch QED out, because there have been some previous discrepancies between theory and experiments, but it passed with a pretty good mark.”
    Quantum Electrodynamics, or QED, was developed in the late 1940s and describes how light and matter interact, incorporating both quantum mechanics and Einstein’s special theory of relativity in a way that has remained successful for nearly eighty years.

  •

    Engineered crystals could help computers run on less power

    Computers may be growing smaller and more powerful, but they require a great deal of energy to operate. The total amount of energy the U.S. dedicates to computing has risen dramatically over the last decade and is quickly approaching that of other major sectors, like transportation.
    In a study published online this week in the journal Nature, University of California, Berkeley, engineers describe a major breakthrough in the design of a component of transistors — the tiny electrical switches that form the building blocks of computers — that could significantly reduce their energy consumption without sacrificing speed, size or performance. The component, called the gate oxide, plays a key role in switching the transistor on and off.
    “We have been able to show that our gate-oxide technology is better than commercially available transistors: What the trillion-dollar semiconductor industry can do today — we can essentially beat them,” said study senior author Sayeef Salahuddin, the TSMC Distinguished Professor of Electrical Engineering and Computer Sciences at UC Berkeley.
    This boost in efficiency is made possible by an effect called negative capacitance, which helps reduce the amount of voltage that is needed to store charge in a material. Salahuddin theoretically predicted the existence of negative capacitance in 2008 and first demonstrated the effect in a ferroelectric crystal in 2011.
    The new study shows how negative capacitance can be achieved in an engineered crystal composed of a layered stack of hafnium oxide and zirconium oxide, which is readily compatible with advanced silicon transistors. By incorporating the material into model transistors, the study demonstrates how the negative capacitance effect can significantly lower the amount of voltage required to control transistors, and as a result, the amount of energy consumed by a computer.
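    The link between gate voltage and energy can be made concrete with the textbook capacitor-switching estimate E ≈ ½CV²; the capacitance and voltages below are round illustrative numbers, not values from the paper:

```python
# Hedged sketch: model the transistor gate as a capacitor, so the energy per
# switching event is E = 0.5 * C * V^2. Halving the required voltage therefore
# cuts the switching energy by a factor of four. Numbers are illustrative only.
def switching_energy_joules(capacitance_farads: float, voltage_volts: float) -> float:
    return 0.5 * capacitance_farads * voltage_volts ** 2

gate_capacitance = 1e-15  # ~1 femtofarad, a rough order of magnitude for a small gate
for voltage in (1.0, 0.5):
    energy = switching_energy_joules(gate_capacitance, voltage)
    print(f"V = {voltage:.1f} V -> E ≈ {energy:.2e} J per switching event")
```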
    “In the last 10 years, the energy used for computing has increased exponentially, already accounting for single digit percentages of the world’s energy production, which grows only linearly, without an end in sight,” Salahuddin said. “Usually, when we are using our computers and our cell phones, we don’t think about how much energy we are using. But it is a huge amount, and it is only going to go up. Our goal is to reduce the energy needs of this basic building block of computing, because that brings down the energy needs for the entire system.”