More stories

  • Trainee teachers made sharper assessments about learning difficulties after receiving feedback from AI

    A trial in which trainee teachers learning to identify pupils with potential learning difficulties had their work ‘marked’ by artificial intelligence has found that the approach significantly improved their reasoning.
    The study, with 178 trainee teachers in Germany, was carried out by a research team led by academics at the University of Cambridge and Ludwig-Maximilians-Universität München (LMU Munich). It provides some of the first evidence that artificial intelligence (AI) could enhance teachers’ ‘diagnostic reasoning’: the ability to collect and assess evidence about a pupil, and draw appropriate conclusions so they can be given tailored support.
    During the trial, trainees were asked to assess six fictionalised ‘simulated’ pupils with potential learning difficulties. They were given examples of their schoolwork, as well as other information such as behaviour records and transcriptions of conversations with parents. They then had to decide whether or not each pupil had learning difficulties such as dyslexia or Attention Deficit Hyperactivity Disorder (ADHD), and explain their reasoning.
    Immediately after submitting their answers, half of the trainees received a prototype ‘expert solution’, written in advance by a qualified professional, to compare with their own. This is typical of the practice material student teachers usually receive outside taught classes. The others received AI-generated feedback, which highlighted the correct parts of their solution and flagged aspects they might have improved.
    After completing the six preparatory exercises, the trainees then took two similar follow-up tests — this time without any feedback. The tests were scored by the researchers, who assessed both their ‘diagnostic accuracy’ (whether the trainees had correctly identified cases of dyslexia or ADHD), and their diagnostic reasoning: how well they had used the available evidence to make this judgement.
    The average score for diagnostic reasoning among trainees who had received AI feedback during the six preliminary exercises was an estimated 10 percentage points higher than that of trainees who had worked with the pre-written expert solutions.

  • From computer to benchtop: Researchers find clues to new mechanisms for coronavirus infections

    A group of bat viruses related to SARS-CoV-2 can also infect human cells but uses a different, as-yet-unknown entryway.
    While researchers are still homing in on how these viruses infect cells, the findings could help in the development of new vaccines that prevent coronaviruses from causing another pandemic.
    Publishing in the journal eBioMedicine, a team of Washington State University researchers used a computational approach based on network science to distinguish a group of coronaviruses that can infect human cells from those that can’t. The researchers then confirmed their computational results in the laboratory, showing that a specific cluster of viruses can infect both human and bat cells.
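    To give a flavour of the network-science idea, here is a minimal sketch: viruses become nodes, strong pairwise sequence similarity becomes edges, and the resulting clusters stand in for groups predicted to share an entry mechanism. The similarity scores and the 0.5 threshold below are invented for illustration and are not taken from the paper.

        import networkx as nx

        # Invented pairwise similarity scores between receptor-binding regions;
        # real values would come from sequence alignment.
        similarities = {
            ("SARS-CoV-2", "RaTG13"): 0.96,
            ("SARS-CoV-2", "BANAL-52"): 0.97,
            ("RaTG13", "BANAL-52"): 0.95,
            ("SARS-CoV-2", "MERS-CoV"): 0.35,
            ("RaTG13", "MERS-CoV"): 0.34,
            ("BANAL-52", "MERS-CoV"): 0.33,
        }

        G = nx.Graph()
        for (a, b), score in similarities.items():
            G.add_node(a)
            G.add_node(b)
            if score >= 0.5:  # keep only strong-similarity edges
                G.add_edge(a, b, weight=score)

        # Connected clusters are candidate groups sharing an entry mechanism.
        print(list(nx.connected_components(G)))
        # e.g. [{'SARS-CoV-2', 'RaTG13', 'BANAL-52'}, {'MERS-CoV'}]
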
    “What we find with these viruses is that they’re able to get into the cells through another mechanism or receptor, and that has a lot of implications for how, and if, they would be able to infect us,” said Michael Letko, co-senior author and assistant professor in the Paul Allen School of Global Health.
    Cross-species transmission of coronaviruses poses a serious threat to global health. While numerous coronaviruses have been discovered in wildlife, researchers haven’t been able to predict which pose the greatest threat to humans and are left scrambling to develop vaccines after viruses spill over.
    “As we encroach more and more on places where there are human and animal interactions, it’s quite likely that there will be many viruses that will need to be examined,” said Shira Broschat, professor in the School of Electrical Engineering and Computer Science, also co-senior author on the paper.

  • Toward high-powered telecommunication systems

    For all the recent advances in integrated lithium niobate photonic circuits — from frequency combs to frequency converters and modulators — one big component has remained frustratingly difficult to integrate: lasers.
    Long-haul telecommunication networks, data center optical interconnects, and microwave photonic systems all rely on lasers to generate an optical carrier for data transmission. In most cases, lasers are stand-alone devices, external to the modulators, making the whole system more expensive, less stable, and less scalable.
    Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) in collaboration with industry partners at Freedom Photonics and HyperLight Corporation, have developed the first fully integrated high-power laser on a lithium niobate chip, paving the way for high-powered telecommunication systems, fully integrated spectrometers, optical remote sensing, and efficient frequency conversion for quantum networks, among other applications.
    “Integrated lithium niobate photonics is a promising platform for the development of high-performance chip-scale optical systems, but getting a laser onto a lithium niobate chip has proved to be one of the biggest design challenges,” said Marko Loncar, the Tiantsai Lin Professor of Electrical Engineering and Applied Physics at SEAS and senior author of the study. “In this research, we used all the nano-fabrication tricks and techniques learned from previous developments in integrated lithium niobate photonics to overcome those challenges and achieve the goal of integrating a high-powered laser on a thin-film lithium niobate platform.”
    The research is published in the journal Optica.
    Loncar and his team used small but powerful distributed feedback lasers for their integrated chip. On chip, the lasers sit in small wells or trenches etched into the lithium niobate and deliver up to 60 milliwatts of optical power in the waveguides fabricated in the same platform. The researchers combined the laser with a 50 gigahertz electro-optic modulator in lithium niobate to build a high-power transmitter.
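    For context, optical power budgets are usually quoted in dBm rather than milliwatts. A quick conversion (our arithmetic, not the paper’s) puts the 60 milliwatt on-chip figure at roughly +17.8 dBm:

        import math

        def mw_to_dbm(power_mw: float) -> float:
            """Convert optical power from milliwatts to dBm (decibels re 1 mW)."""
            return 10 * math.log10(power_mw)

        print(f"{mw_to_dbm(60):.1f} dBm")  # 60 mW -> ~17.8 dBm
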
    “Integrating high-performance plug-and-play lasers would significantly reduce the cost, complexity, and power consumption of future communication systems,” said Amirhassan Shams-Ansari, a graduate student at SEAS and first author of the study. “It’s a building block that can be integrated into larger optical systems for a range of applications, in sensing, lidar, and data telecommunications.”
    By combining thin-film lithium niobate devices with high-power lasers using an industry-friendly process, this research represents a key step towards large-scale, low-cost, and high-performance transmitter arrays and optical networks. Next, the team aims to increase the laser’s power and scalability for even more applications.
    Harvard’s Office of Technology Development has protected the intellectual property arising from the Loncar Lab’s innovations in lithium niobate systems. Loncar is a cofounder of HyperLight Corporation, a startup which was launched to commercialize integrated photonic chips based on certain innovations developed in his lab.
    The research was co-authored by Dylan Renaud, Rebecca Cheng, Linbo Shao, Di Zhu, and Mengjie Yu from SEAS; Hannah R. Grant and Leif Johansson from Freedom Photonics; and Lingyan He and Mian Zhang from HyperLight Corporation. It was supported by the Defense Advanced Research Projects Agency under grant HR0011-20-C-0137 and the Air Force Office of Scientific Research under grant FA9550-19-1-0376.

  • Dengue detection smartphone tech shows new hope for low-cost diagnostics

    Accurate home testing could be used for a wider range of illnesses, as new research shows the capability of smartphone-powered tests for Dengue Fever.
    In a paper published today in PLOS Neglected Tropical Diseases, biomedical technology researchers from the University of Reading used a new diagnostic kit called Cygnus that detects Dengue Fever at significantly higher rates than lateral flow testing kits.
    Working with academics and clinicians in Thailand, the team trialled the tests alongside already-established alternatives and found the new tests showed 82% clinical sensitivity, beating lateral flow testing (74% sensitivity) and matching hospital-based lab diagnostics (83% sensitivity). Each device also takes 10 measurements, allowing the researchers to identify which of the four dengue virus types caused the infection.
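    Clinical sensitivity is simply the fraction of true cases a test catches. The sketch below shows the calculation; the patient counts are invented so that the fractions reproduce the percentages quoted above, and the real confusion matrices are in the paper.

        def sensitivity(true_positives: int, false_negatives: int) -> float:
            """Clinical sensitivity = TP / (TP + FN)."""
            return true_positives / (true_positives + false_negatives)

        # Invented counts chosen to reproduce the reported percentages.
        print(f"Cygnus:       {sensitivity(82, 18):.0%}")
        print(f"Lateral flow: {sensitivity(74, 26):.0%}")
        print(f"Hospital lab: {sensitivity(83, 17):.0%}")
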
    Dr Sarah Needs, Postdoctoral Research Associate in Microfluidic Antimicrobial Resistance Testing at the University of Reading, is lead author of the paper.
    Dr Needs said:
    “The paper shows exciting potential for the use of the microfluidic ‘lab on a strip’ tests that can be used in conjunction with a smartphone and are more powerful than LFT testing in this case. As well as being cheap to produce, the lab on a strip technology allows users to test many different targets at once in one single sample, so it could be useful to detect multiple diseases, not just one.”

  • 50 years ago, the future of solar energy looked bright

    Farming the sun’s energy – Science News, April 8, 1972

    More and more scientists and engineers are beginning to believe that solar conversion will account for a significant portion of the world’s future power needs.… What has changed the atmosphere lately is … the possibility of putting together large-scale units, solar-energy “farms” that would compete with power stations in the megawatt range and higher.

    Update

    Solar energy production in the United States ramped up as solar panels became cheaper to manufacture and more efficient at generating electricity (SN: 3/1/08, p. 133). Since the first U.S. solar power plant opened in 1982, thousands more have been built, bringing the country’s solar capacity today to more than 100 gigawatts. In 2021, solar energy made up nearly 3 percent of the electricity produced in the United States. And the future is looking bright: Solar energy and storage are projected to account for more than 60 percent of the U.S. power grid’s new generating capacity from 2022 through 2023, according to the U.S. Energy Information Administration.

  • Invisible helium atoms provide exquisitely sensitive test of fundamental theory

    Physicists at the Australian National University have developed the most sensitive method ever for measuring the potential energy of an atom (within a hundredth of a decillionth of a joule — or 10⁻³⁵ joules), and used it to validate one of the most tested theories in physics — quantum electrodynamics (QED).
    The research, published this week in Science, relies on finding the colour of laser light at which a helium atom is invisible, and is an independent corroboration of previous methods used to test QED, which have involved measuring transitions from one atomic energy state to another.
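    To put that energy sensitivity into the frequency units spectroscopists usually quote, one can divide by Planck’s constant (E = hν). This back-of-envelope conversion is ours, not the paper’s:

        PLANCK_H = 6.62607015e-34      # Planck constant, joule-seconds

        delta_E = 1e-35                # stated energy sensitivity, joules
        delta_nu = delta_E / PLANCK_H  # equivalent frequency shift, hertz
        print(f"{delta_nu:.3f} Hz")    # ~0.015 Hz
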
    “This invisibility is only for a specific atom and a specific colour of light — so it couldn’t be used to make an invisibility cloak that Harry Potter would use to investigate dark corners at Hogwarts,” said lead author, Bryce Henson, a PhD student at ANU Research School of Physics.
    “But we were able to use it to investigate some dark corners of QED theory.”
    “We were hoping to catch QED out, because there have been some previous discrepancies between theory and experiments, but it passed with a pretty good mark.”
    Quantum electrodynamics, or QED, was developed in the late 1940s and describes how light and matter interact, incorporating both quantum mechanics and Einstein’s special theory of relativity in a way that has remained successful for nearly eighty years.

  • Engineered crystals could help computers run on less power

    Computers may be growing smaller and more powerful, but they require a great deal of energy to operate. The total amount of energy the U.S. dedicates to computing has risen dramatically over the last decade and is quickly approaching that of other major sectors, like transportation.
    In a study published online this week in the journal Nature, University of California, Berkeley, engineers describe a major breakthrough in the design of a component of transistors — the tiny electrical switches that form the building blocks of computers — that could significantly reduce their energy consumption without sacrificing speed, size or performance. The component, called the gate oxide, plays a key role in switching the transistor on and off.
    “We have been able to show that our gate-oxide technology is better than commercially available transistors: What the trillion-dollar semiconductor industry can do today — we can essentially beat them,” said study senior author Sayeef Salahuddin, the TSMC Distinguished Professor of Electrical Engineering and Computer Sciences at UC Berkeley.
    This boost in efficiency is made possible by an effect called negative capacitance, which helps reduce the amount of voltage that is needed to store charge in a material. Salahuddin theoretically predicted the existence of negative capacitance in 2008 and first demonstrated the effect in a ferroelectric crystal in 2011.
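    A rough way to see why negative capacitance lowers the operating voltage: in a transistor, the gate oxide and the channel form a capacitive divider, and the steepness of switching scales with the factor m = 1 + C_channel/C_oxide. A negative oxide capacitance makes m less than 1, so less gate voltage is needed per decade of current. The sketch below uses invented capacitance values purely to illustrate the formula; it is not a model of the Berkeley devices.

        import math

        KT_Q = 0.0259  # thermal voltage at room temperature, volts

        def subthreshold_swing(c_oxide: float, c_channel: float = 1.0) -> float:
            """Gate voltage needed per tenfold change in current, in mV/decade."""
            m = 1 + c_channel / c_oxide  # capacitive-divider body factor
            return m * KT_Q * math.log(10) * 1000

        print(f"ordinary oxide (C_ox = +2):   {subthreshold_swing(2.0):.0f} mV/dec")   # ~89
        print(f"negative-C oxide (C_ox = -3): {subthreshold_swing(-3.0):.0f} mV/dec")  # ~40
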
    The new study shows how negative capacitance can be achieved in an engineered crystal composed of a layered stack of hafnium oxide and zirconium oxide, which is readily compatible with advanced silicon transistors. By incorporating the material into model transistors, the study demonstrates how the negative capacitance effect can significantly lower the amount of voltage required to control transistors, and as a result, the amount of energy consumed by a computer.
    “In the last 10 years, the energy used for computing has increased exponentially, already accounting for single digit percentages of the world’s energy production, which grows only linearly, without an end in sight,” Salahuddin said. “Usually, when we are using our computers and our cell phones, we don’t think about how much energy we are using. But it is a huge amount, and it is only going to go up. Our goal is to reduce the energy needs of this basic building block of computing, because that brings down the energy needs for the entire system.”

  • Computerized, rolling DNA motors move molecular robotics to next level

    Chemists have integrated computer functions into rolling DNA-based motors, opening a new realm of possibilities for miniature molecular robots. Nature Nanotechnology published the development: the first DNA-based motors that combine computational power with the ability to burn fuel and move in an intentional direction.
    “One of our big innovations, beyond getting the DNA motors to perform logic computations, is finding a way to convert that information into a simple output signal — motion or no motion,” says Selma Piranej, an Emory University PhD candidate in chemistry, and first author of the paper. “This signal can be read by anyone holding a cell phone equipped with an inexpensive magnifying attachment.”
    “Selma’s breakthrough removes major roadblocks that stood in the way of making DNA computers useful and practical for a range of biomedical applications,” says Khalid Salaita, senior author of the paper and a professor of chemistry at Emory University. Salaita is also on the faculty of the Wallace H. Coulter Department of Biomedical Engineering, a joint program of Georgia Tech and Emory.
    The motors can sense chemical information in their environment, process that information, and then respond accordingly, mimicking some basic properties of living cells.
    “Previous DNA computers did not have directed motion built in,” Salaita says. “But to get more sophisticated operations, you need to combine both computation and directed motion. Our DNA computers are essentially autonomous robots with sensing capabilities that determine whether they move or not.”
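    As a toy model of that “compute, then move or don’t move” behaviour, the sketch below treats the presence or absence of two input DNA strands as Boolean inputs and the motor’s motion as the gate’s output. It illustrates the logic only, not the chemistry reported in the paper.

        def motor_moves(strand_a_present: bool, strand_b_present: bool,
                        gate: str = "AND") -> bool:
            """Decide whether the motor rolls, given which input strands are present."""
            if gate == "AND":
                return strand_a_present and strand_b_present
            if gate == "OR":
                return strand_a_present or strand_b_present
            raise ValueError(f"unsupported gate: {gate}")

        for a in (False, True):
            for b in (False, True):
                state = "moves" if motor_moves(a, b) else "stays put"
                print(f"A={a!s:<5} B={b!s:<5} -> motor {state}")
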
    The motors can be programmed to respond to a specific pathogen or DNA sequence, making them a potential technology for medical testing and diagnostics.