More stories

  • Teaching photonic chips to 'learn'

    A multi-institution research team has developed an optical chip that can train machine learning hardware.
    Machine learning applications have skyrocketed to a $165 billion annual market, according to a recent report from McKinsey. But before a machine can perform intelligent tasks, such as recognizing the details of an image, it must be trained. Training modern-day artificial intelligence (AI) systems like Tesla’s autopilot costs several million dollars in electric power consumption and requires supercomputer-like infrastructure. This surging AI “appetite” leaves an ever-widening gap between computer hardware and demand for AI.
    Photonic integrated circuits, or simply optical chips, have emerged as a possible solution to deliver higher computing performance, as measured in tera-operations per second per watt (TOPS/W). However, though they have demonstrated improved core operations in machine intelligence used for data classification, photonic chips have yet to improve the actual front-end learning and machine-training process.
    Machine learning is a two-step procedure: first, data is used to train the system, and then other data is used to test the performance of the AI system. Thus far, photonic chips have only demonstrated an ability to classify and infer information from data. In a new paper, a team of researchers from the George Washington University, Queen's University, the University of British Columbia and Princeton University set out to speed up the training step itself. After one training step, the team observed the error and reconfigured the hardware for a second training cycle, followed by additional training cycles, until a sufficient AI performance was reached (e.g., the system is able to correctly label objects appearing in a movie).
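    To make that observe-the-error-then-reconfigure loop concrete, here is a minimal sketch in Python. It is purely illustrative: the paper does not describe its chip interface, so the optical forward pass is replaced by a software stand-in (`photonic_forward`), and the data, learning rate, and stopping tolerance are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def photonic_forward(x, w):
        # Software stand-in for the on-chip optical forward pass.
        return x @ w

    # Toy training data generated from a hidden linear rule (assumed).
    X = rng.normal(size=(200, 4))
    w_true = np.array([1.0, -2.0, 0.5, 0.0])
    y = X @ w_true + 0.05 * rng.normal(size=200)

    w = rng.normal(size=4)             # initial hardware configuration
    for cycle in range(100):           # repeated training cycles
        out = photonic_forward(X, w)   # run the chip, observe the output
        err = np.mean((out - y) ** 2)  # measure the error for this cycle
        if err < 0.01:                 # sufficient performance reached
            break
        grad = 2 * X.T @ (out - y) / len(y)
        w -= 0.1 * grad                # reconfigure the hardware, train again

    print(f"stopped after {cycle + 1} cycles, error {err:.4f}")
    ```

    On real hardware the error measurement and reconfiguration are physical operations rather than a software gradient step, but the cycle structure is the same.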
    This added AI capability is part of a larger effort around photonic tensor cores and other electronic-photonic application-specific integrated circuits (ASICs) that leverage photonic chip manufacturing for machine learning and AI applications.
    “This novel hardware will speed up the training of machine learning systems and harness the best of what both photonics and electronic chips have to offer. It is a major leap forward for AI hardware acceleration. These are the kinds of advancements we need in the semiconductor industry as underscored by the recently passed CHIPS Act.”
    -Volker Sorger, Professor of Electrical and Computer Engineering at the George Washington University and founder of the start-up company Optelligence.
    “The training of AI systems consumes a significant amount of energy and has a sizable carbon footprint. For example, training a single AI transformer emits about five times as much CO2 as a gasoline car does over its lifetime. Our training on photonic chips will help to reduce this overhead.”
    -Bhavin Shastri, Assistant Professor, Department of Physics, Queen's University.
    Story Source:
    Materials provided by George Washington University.

  • Quantum algorithms save time in the calculation of electron dynamics

    Quantum computers promise significantly shorter computing times for complex problems, but there are still only a few quantum computers worldwide, each with a limited number of so-called qubits. However, quantum computer algorithms can already run on conventional servers that simulate a quantum computer. A team at HZB has succeeded in calculating the electron orbitals and their dynamic development, using the example of a small molecule after excitation by a laser pulse. In principle, the method is also suitable for investigating larger molecules that cannot be calculated using conventional methods.
    “These quantum computer algorithms were originally developed in a completely different context. We used them here for the first time to calculate electron densities of molecules, in particular also their dynamic evolution after excitation by a light pulse,” says Annika Bande, who heads a group on theoretical chemistry at HZB. Together with Fabian Langkabel, who is doing his doctorate with Bande, she has now shown in a study how well this works.
    Error-free quantum computer
    “We developed an algorithm for a fictitious, completely error-free quantum computer and ran it on a classical server simulating a quantum computer of ten qubits,” says Fabian Langkabel. The scientists limited their study to smaller molecules in order to perform the calculations without a real quantum computer and to compare them with conventional calculations.
    Faster computation
    Indeed, the quantum algorithms produced the expected results. In contrast to conventional calculations, however, the quantum algorithms are also suitable for calculating significantly larger molecules with future quantum computers: “This has to do with the calculation times. They increase with the number of atoms that make up the molecule,” says Langkabel. While the computing time of conventional methods multiplies with each additional atom, that is, it grows exponentially with the size of the molecule, this is not the case for quantum algorithms, which makes them much faster.
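    The difference is easy to see numerically. The toy comparison below uses made-up growth laws (exponential for the conventional method, as "multiplies with each additional atom" implies, and an assumed polynomial cost for the quantum algorithm); the constants are illustrative, not figures from the HZB study.

    ```python
    # Illustrative scaling only; growth laws and constants are assumptions,
    # not numbers from the study.
    for atoms in (2, 4, 8, 16, 32):
        classical = 2 ** atoms  # runtime multiplies with each added atom
        quantum = atoms ** 3    # assumed polynomial cost for the quantum algorithm
        print(f"{atoms:2d} atoms: conventional ~ {classical:>13,}  quantum ~ {quantum:>6,}")
    ```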
    Photocatalysis, light reception and more
    The study thus shows a new way to calculate electron densities and their “response” to excitations with light in advance, with very high spatial and temporal resolution. This makes it possible, for example, to simulate and understand ultrafast decay processes, which are also crucial in quantum computers made of so-called quantum dots. Predictions about the physical or chemical behaviour of molecules are also possible, for example during the absorption of light and the subsequent transfer of electrical charges. This could facilitate the development of photocatalysts for producing green hydrogen with sunlight, or help explain processes in the light-sensitive receptor molecules of the eye.
    Story Source:
    Materials provided by Helmholtz-Zentrum Berlin für Materialien und Energie.

  • Glass-like shells of diatoms help turn light into energy in dim conditions

    A new study has revealed how the glass-like shells of diatoms help these microscopic organisms perform photosynthesis in dim conditions. A better understanding of how these phytoplankton harvest and interact with light could lead to improved solar cells, sensing devices and optical components.
    “The computational model and toolkit we developed could pave the way toward mass-manufacturable, sustainable optical devices and more efficient light harvesting tools that are based on diatom shells,” said research team member Santiago Bernal from McGill University in Canada. “This could be used for biomimetic devices for sensing, new telecommunications technologies or affordable ways to make clean energy.”
    Diatoms are single-celled organisms found in most bodies of water. Their shells are covered in holes that respond to light differently depending on their size, spacing and configuration. In the journal Optical Materials Express, the researchers, led by McGill University’s David V. Plant and Mark Andrews, report the first optical study of an entire diatom shell. They analyzed how different sections of the shell, or frustule, respond to sunlight and how this response is connected to photosynthesis.
    “Based on our findings, we estimate that the frustule can contribute a 9.83 percent boost to photosynthesis, especially during transitions from high to low sunlight,” said Yannick D’Mello, first author of the paper. “Our model is the first to explain the optical behavior of the entire frustule. So, it contributes to the hypothesis that the frustule enhances photosynthesis in diatoms.”
    Combining microscopy and simulation
    Diatoms have evolved over millions of years to survive in any aquatic environment. This includes their shell, which is composed of many regions that work together to harvest sunlight. To study the optical response of diatom frustules, the researchers combined computer optical simulations with several microscopy techniques.

  • Self-organization: What robotics can learn from amoebae

    LMU researchers have developed a new model to describe how biological or technical systems form complex structures without external guidance.
    Amoebae are single-celled organisms. By means of self-organization, they can form complex structures purely through local interactions: if they have plenty of food, they disperse evenly through a culture medium. But if food becomes scarce, they emit the messenger molecule cyclic adenosine monophosphate (cAMP). This chemical signal induces the amoebae to gather in one place and form a multicellular aggregation; the result is a fruiting body.
    “The phenomenon is well known,” says Prof. Erwin Frey from LMU’s Faculty of Physics. “Before now, however, no research group has investigated how information processing, at a general level, affects the aggregation of systems of agents when individual agents — in our case, amoebae — are self-propelled.” More knowledge about these mechanisms would also be valuable for translating them to artificial technical systems, adds Frey.
    Together with other researchers, Frey describes in Nature Communications how active systems that process information in their environment can be harnessed for technological or biological applications. The aim is not to understand every detail of the communication between individual agents, but to capture the specific structures formed through self-organization. This applies to amoebae, and also to certain kinds of robots. The research was undertaken in collaboration with Prof. Igor Aronson during his stay at LMU as a Humboldt Research Award winner.
    From biological mechanism to technological application
    Background: The term “active matter” refers to biological or technical systems from which larger structures are formed by means of self-organization. Such processes are based upon exclusively local interactions between identical, self-propelled units, such as amoebae or indeed robots.
    Inspired by biological systems, Frey and his co-authors propose a new model in which self-propelled agents communicate with each other. These agents recognize chemical, biological, or physical signals at a local level and make individual decisions using their internal machinery; these local decisions result in collective self-organization, giving rise to larger structures that can span multiple length scales.
    The new paradigm of communicating active matter forms the basis of the study: local decisions in response to a signal, together with the transmission of information, lead to collectively controlled self-organization.
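    A minimal numerical sketch of this paradigm (an illustration of communicating active matter in general, not the authors' actual model): self-propelled agents deposit a cAMP-like signal on a grid, the signal diffuses and decays, and each agent biases its random walk up the local signal gradient. All parameters below are assumptions chosen for the demonstration; these few local rules are enough to make the agents aggregate.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N, L, steps = 200, 60, 400
    pos = rng.uniform(0, L, size=(N, 2))   # agent positions on a periodic grid
    field = np.zeros((L, L))               # cAMP-like signal field

    for t in range(steps):
        ix = pos[:, 0].astype(int) % L
        iy = pos[:, 1].astype(int) % L
        np.add.at(field, (ix, iy), 1.0)    # each agent deposits signal locally
        # The signal diffuses to neighbouring cells and slowly decays.
        lap = (np.roll(field, 1, 0) + np.roll(field, -1, 0)
               + np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4 * field)
        field = 0.95 * (field + 0.2 * lap)
        # Each agent takes a noisy step biased up the local signal gradient.
        gx = np.roll(field, -1, 0) - np.roll(field, 1, 0)
        gy = np.roll(field, -1, 1) - np.roll(field, 1, 1)
        grad = np.stack([gx[ix, iy], gy[ix, iy]], axis=1)
        grad /= np.linalg.norm(grad, axis=1, keepdims=True) + 1e-9
        pos = (pos + 0.5 * grad + 0.3 * rng.normal(size=(N, 2))) % L

    # Aggregation shows up as the signal concentrating into a few peaks.
    print("signal concentration (max/mean):", field.max() / field.mean())
    ```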
    Frey sees a possible application of the new model in soft robots — which is to say, robots that are made of soft materials. Such robots are suitable, for example, for performing tasks in human bodies. They can communicate with other soft robots via electromagnetic waves for purposes such as administering drugs at specific sites in the body. The new model can help nanotechnologists design such robot systems by describing the collective properties of robot swarms.
    “It’s sufficient to roughly understand how individual agents communicate with each other; self-organization takes care of the rest,” says Frey. “This is a paradigm shift specifically in robotics, where researchers are attempting to do precisely the opposite — they want to obtain extremely high levels of control.” But that does not always succeed. “Our proposal, by contrast, is to exploit the capacity for self-organization.”
    Story Source:
    Materials provided by Ludwig-Maximilians-Universität München.

  • The interplay between epidemics, prevention information, and mass media

    When an epidemic strikes, more than just infections spread. As cases mount, information about the disease, how to spot it, and how to prevent it propagates rapidly among people in affected areas as well. Relatively little is known, however, about the interplay between the course of epidemics and this diffusion of information to the public.
    A pair of researchers developed a model that examines epidemics through two lenses — the spread of disease and the spread of information — to understand how reliable information can be better disseminated during these events. In Chaos, by AIP Publishing, Xifen Wu and Haibo Bao report that their two-layered model can predict the effects of mass media and infection-prevention information on the epidemic threshold.
    “In recent years, epidemics spread all over the world together with preventive information. And the mass media affected the people’s attitudes toward epidemic prevention,” said Bao. “Our aim is to find how these factors influence the epidemic propagation and provide certain guidance for epidemic prevention and control.”
    To tackle their question, the researchers’ model examines the interactions between two layers. The first is the transmission of the disease itself, propagated through physical contact between people. The second occupies the information space of social networks, where different voices share the do’s and don’ts of infection prevention, called positive and negative information, respectively.
    The model provides a set of equations that can be used to calculate the epidemic threshold using a technique called microscopic Markov chains.
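    For readers unfamiliar with the technique, here is a minimal single-layer version (the paper couples a second, information-spreading layer and recovery time delays, which this sketch omits): in the discrete-time microscopic Markov chain for SIS dynamics, each node i carries an infection probability p_i, and the epidemic threshold is set by the leading eigenvalue of the contact network's adjacency matrix. The network and parameters below are assumptions for the demonstration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Random undirected contact network (disease layer only; assumed).
    n = 100
    A = (rng.random((n, n)) < 0.05).astype(float)
    A = np.triu(A, 1)
    A = A + A.T

    mu = 0.3                                   # recovery probability per time step
    beta_c = mu / np.linalg.eigvalsh(A).max()  # epidemic threshold: mu / lambda_max

    def mmca_prevalence(beta, steps=400):
        # Iterate p_i(t+1) = (1 - p_i)(1 - q_i) + (1 - mu) p_i, where
        # q_i = prod_j (1 - beta * A_ij * p_j) is the probability that
        # node i is NOT infected by any neighbour this step.
        p = np.full(n, 0.1)
        for _ in range(steps):
            q = np.prod(1.0 - beta * A * p, axis=1)
            p = (1.0 - p) * (1.0 - q) + (1.0 - mu) * p
        return p.mean()

    # Below the threshold the infection dies out; above it, it persists.
    for beta in (0.5 * beta_c, 2.0 * beta_c):
        print(f"beta = {beta:.3f}: stationary prevalence ~ {mmca_prevalence(beta):.3f}")
    ```

    Raising the recovery rate mu, which is what effective prevention information and mass media do in the paper's model, raises the threshold proportionally and makes outbreaks harder to sustain.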
    Central to this calculation is the time delay between becoming infected and recovering. They found that the longer it takes patients to recover from an infection, the lower the effective recovery rate, and the easier it is for a disease to break out.
    Disseminating effective prevention practices and using mass media, however, can increase the epidemic threshold, making it more difficult for the infection to spread. They simulate this by reducing the time delays related to recovery, which boosts recovery rates.
    “The major challenge in our work is how to analyze the impact of positive information, negative information, and the mass media on the recovery rate and epidemic prevalence at the same time,” Bao said. “What surprised us the most is that it is not always possible to improve the recovery rate by increasing the communication rate of mass media.”
    Bao hopes the work inspires others to use high-level mathematics to tackle such cross-disciplinary questions. They next look to analyze the impact of population mobility and vaccination.
    Story Source:
    Materials provided by American Institute of Physics.

  • Microlaser chip adds new dimensions to quantum communication

    Researchers at Penn Engineering have created a chip that outstrips the security and robustness of existing quantum communications hardware. Their technology communicates in “qudits,” doubling the quantum information space of any previous on-chip laser.
    Liang Feng, Professor in the Departments of Materials Science and Engineering (MSE) and Electrical Systems and Engineering (ESE), along with MSE postdoctoral fellow Zhifeng Zhang and ESE Ph.D. student Haoqi Zhao, debuted the technology in a recent study published in Nature. The group worked in collaboration with scientists from the Polytechnic University of Milan, the Institute for Cross-Disciplinary Physics and Complex Systems, Duke University and the City University of New York (CUNY).
    Bits, Qubits and Qudits
    While non-quantum chips store, transmit and compute data using bits, state-of-the-art quantum devices use qubits. Bits can be 1s or 0s, while qubits are units of quantum information capable of being both 1 and 0 at the same time. In quantum mechanics, this state of simultaneity is called “superposition.”
    A quantum unit in a superposition of more than two levels is called a qudit, to signal these additional dimensions.
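    As a numerical illustration of the dimensionality argument (a toy sketch, not the chip's physics): a pure state of a d-level system is a normalized complex vector of length d, and each transmitted qudit can carry log2(d) classical bits, so a four-level qudit doubles the per-symbol information space of a qubit.

    ```python
    import numpy as np

    def random_state(d, seed=0):
        # A random pure state of a d-level system: a normalized complex vector.
        rng = np.random.default_rng(seed)
        v = rng.normal(size=d) + 1j * rng.normal(size=d)
        return v / np.linalg.norm(v)

    qubit = random_state(2)   # superposition of |0> and |1>
    qudit = random_state(4)   # superposition of |0>, |1>, |2>, |3>

    for name, state in (("qubit", qubit), ("qudit", qudit)):
        d = len(state)
        print(f"{name}: d = {d}, carries log2(d) = {np.log2(d):.0f} bit(s) per symbol,"
              f" norm = {np.linalg.norm(state):.3f}")
    ```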
    “In classical communications,” says Feng, “a laser can emit a pulse coded as either 1 or 0. These pulses can easily be cloned by an interceptor looking to steal information and are therefore not very secure. In quantum communications with qubits, the pulse can have any superposition state between 1 and 0. Superposition makes it so a quantum pulse cannot be copied. Unlike algorithmic encryption, which blocks hackers using complex math, quantum cryptography is a physical system that keeps information secure.”
    Qubits, however, aren’t perfect. With only two levels of superposition, qubits have limited storage space and low tolerance for interference.

  • 'Brain-like computing' at molecular level is possible

    A breakthrough discovery at the University of Limerick in Ireland has revealed for the first time that unconventional brain-like computing at the tiniest scale of atoms and molecules is possible.
    Researchers at University of Limerick’s Bernal Institute worked with an international team of scientists to create a new type of organic material that learns from its past behaviour.
    The discovery of the ‘dynamic molecular switch’ that emulates synaptic behaviour is revealed in a new study in the international journal Nature Materials.
    The study was led by Damien Thompson, Professor of Molecular Modelling in UL’s Department of Physics and Director of SSPC, the UL-hosted Science Foundation Ireland Research Centre for Pharmaceuticals, together with Christian Nijhuis at the Centre for Molecules and Brain-Inspired Nano Systems in University of Twente and Enrique del Barco from University of Central Florida.
    Working during lockdowns, the team developed a two-nanometre-thick layer of molecules, 50,000 times thinner than a strand of hair, that remembers its history as electrons pass through it.
    Professor Thompson explained that the “switching probability and the values of the on/off states continually change in the molecular material, which provides a disruptive new alternative to conventional silicon-based digital switches that can only ever be either on or off.”
    The newly discovered dynamic organic switch displays all the mathematical logic functions necessary for deep learning, successfully emulating Pavlovian ‘call and response’ synaptic brain-like behaviour.
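    The Pavlovian 'call and response' behaviour can be illustrated with a toy model (purely illustrative, not the authors' molecular physics): a switch whose coupling to a neutral stimulus strengthens whenever that stimulus coincides with a reinforcing one, until the neutral stimulus alone triggers the output. The weights, threshold, and update rule below are all assumptions.

    ```python
    # Toy Pavlovian conditioning with a history-dependent switch weight.
    # The real device encodes this history in its molecular states.
    w = 0.1            # coupling of the neutral stimulus (the "bell")
    threshold = 0.5    # the switch conducts when total drive exceeds this

    def responds(bell, food, w):
        return w * bell + 1.0 * food > threshold  # food alone always triggers

    # Pairing phase: whenever bell and food coincide and the switch fires,
    # the bell's coupling is strengthened (Hebbian-style update).
    for trial in range(10):
        if responds(bell=1, food=1, w=w):
            w = min(1.0, w + 0.1)

    print("bell alone now triggers a response:", responds(bell=1, food=0, w=w))
    ```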

  • A possible game changer for next generation microelectronics

    Tiny magnetic whirlpools could transform memory storage in high performance computers.
    Magnets generate invisible fields that attract certain materials. A common example is fridge magnets. Far more important to our everyday lives, magnets also can store data in computers. Exploiting the direction of the magnetic field (say, up or down), microscopic bar magnets each can store one bit of memory as a zero or a one — the language of computers.
    Scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory want to replace the bar magnets with tiny magnetic vortices. These vortices, called skyrmions, measure just billionths of a meter and form in certain magnetic materials. They could one day usher in a new generation of microelectronics for memory storage in high performance computers.
    “The bar magnets in computer memory are like shoelaces tied with a single knot; it takes almost no energy to undo them,” said Arthur McCray, a Northwestern University graduate student working in Argonne’s Materials Science Division (MSD). And any bar magnet that malfunctions due to a disruption will affect the others.
    “By contrast, skyrmions are like shoelaces tied with a double knot. No matter how hard you pull on a strand, the shoelaces remain tied.” The skyrmions are thus extremely stable to any disruption. Another important feature is that scientists can control their behavior by changing the temperature or applying an electric current.
    Scientists have much to learn about skyrmion behavior under different conditions. To study them, the Argonne-led team developed an artificial intelligence (AI) program that works with a high-power electron microscope at the Center for Nanoscale Materials (CNM), a DOE Office of Science user facility at Argonne. The microscope can visualize skyrmions in samples at very low temperatures.