More stories


    A new chip for decoding data transmissions demonstrates record-breaking energy efficiency

    Imagine using an online banking app to deposit money into your account. Like all information sent over the internet, those communications could be corrupted by noise that inserts errors into the data.
    To overcome this problem, senders encode data before they are transmitted, and then a receiver uses a decoding algorithm to correct errors and recover the original message. In some instances, data are received with reliability information that helps the decoder figure out which parts of a transmission are likely errors.
    Researchers at MIT and elsewhere have developed a decoder chip that employs a new statistical model to use this reliability information in a way that is much simpler and faster than conventional techniques.
Their chip uses a universal decoding algorithm the team previously developed, which can unravel any error-correcting code. Typically, decoding hardware can process only one particular type of code. This new, universal decoder chip has broken the record for energy-efficient decoding, performing between 10 and 100 times better than other hardware.
    This advance could enable mobile devices with fewer chips, since they would no longer need separate hardware for multiple codes. This would reduce the amount of material needed for fabrication, cutting costs and improving sustainability. By making the decoding process less energy intensive, the chip could also improve device performance and lengthen battery life. It could be especially useful for demanding applications like augmented and virtual reality and 5G networks.
    “This is the first time anyone has broken below the 1 picojoule-per-bit barrier for decoding. That is roughly the same amount of energy you need to transmit a bit inside the system. It had been a big symbolic threshold, but it also changes the balance in the receiver of what might be the most pressing part from an energy perspective — we can move that away from the decoder to other elements,” says Muriel Médard, the School of Science NEC Professor of Software Science and Engineering, a professor in the Department of Electrical Engineering and Computer Science, and a co-author of a paper presenting the new chip.

Médard’s co-authors include lead author Arslan Riaz, a graduate student at Boston University (BU); Rabia Tugce Yazicigil, assistant professor of electrical and computer engineering at BU; and Ken R. Duffy, then director of the Hamilton Institute at Maynooth University and now a professor at Northeastern University, as well as others from MIT, BU, and Maynooth University. The work is being presented at the International Solid-State Circuits Conference.
    Smarter sorting
    Digital data are transmitted over a network in the form of bits (0s and 1s). A sender encodes data by adding an error-correcting code, which is a redundant string of 0s and 1s that can be viewed as a hash. Information about this hash is held in a specific code book. A decoding algorithm at the receiver, designed for this particular code, uses its code book and the hash structure to retrieve the original information, which may have been jumbled by noise. Since each algorithm is code-specific, and most require dedicated hardware, a device would need many chips to decode different codes.
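To make the code-book idea concrete, here is a toy sketch in Python (invented for illustration; it is not one of the codes the chip targets): a single appended parity bit plays the role of the redundant structure, and a membership test plays the role of the code-book check. A single parity bit can only detect errors, not pinpoint them, but it gives the decoding loop below something concrete to call.

```python
def encode(message_bits):
    # Toy "code": append one parity bit so that every valid code word
    # has an even number of 1s. Real error-correcting codes add far more
    # structured redundancy, but the principle is the same.
    return message_bits + [sum(message_bits) % 2]

def is_codeword(word):
    # Code-book membership test: a word is valid iff its parity checks.
    return sum(word) % 2 == 0
```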
    The researchers previously demonstrated GRAND (Guessing Random Additive Noise Decoding), a universal decoding algorithm that can crack any code. GRAND works by guessing the noise that affected the transmission, subtracting that noise pattern from the received data, and then checking what remains in a code book. It guesses a series of noise patterns in the order they are likely to occur.
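A minimal sketch of that guessing loop, under simplifying assumptions (a binary symmetric channel, so noise patterns with fewer bit flips are more likely, and the toy `is_codeword` test above standing in for any code book):

```python
import itertools

def grand_decode(received, is_codeword, max_queries=10**6):
    """Guess noise patterns from most to least likely, remove each
    guess from the received word, and stop at the first result that
    passes the code-book membership test."""
    n = len(received)
    queries = 0
    # On a binary symmetric channel, fewer flips are likelier, so
    # enumerate noise patterns in order of Hamming weight.
    for weight in range(n + 1):
        for flips in itertools.combinations(range(n), weight):
            candidate = list(received)
            for i in flips:
                candidate[i] ^= 1  # subtract the guessed noise
            queries += 1
            if is_codeword(candidate):
                return candidate, queries
            if queries >= max_queries:
                return None, queries  # give up: declare an erasure
    return None, queries
```

Because the loop stops at the first hit, lightly corrupted transmissions — the common case — terminate after very few guesses, which is the property the hardware exploits.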
    Data are often received with reliability information, also called soft information, that helps a decoder figure out which pieces are errors. The new decoding chip, called ORBGRAND (Ordered Reliability Bits GRAND), uses this reliability information to sort data based on how likely each bit is to be an error.

But it isn’t as simple as ordering single bits. While the most unreliable bit might be the likeliest error, perhaps the third and fourth most unreliable bits together are as likely to be an error as the seventh most unreliable bit. ORBGRAND uses a new statistical model that can sort bits in this fashion, recognizing that several bits together may be as likely to be in error as one bit alone.
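One way to sketch such a schedule (an assumed simplification for illustration; the chip's internals are more involved): pre-sort the bits from least to most reliable, give the least reliable bit rank 1, and generate flip sets in increasing total weight, the sum of the ranks of the flipped bits. Flipping ranks 3 and 4 (weight 7) is then scheduled alongside flipping rank 7 alone — exactly the situation described above.

```python
def orbgrand_flip_sets(n, max_weight=None):
    """Yield sets of bit ranks to flip (rank 1 = least reliable bit),
    in increasing total weight = sum of the flipped ranks."""
    if max_weight is None:
        max_weight = n * (n + 1) // 2  # weight of flipping all n bits
    for w in range(max_weight + 1):
        yield from _distinct_rank_sets(w, n)

def _distinct_rank_sets(w, n, smallest=1):
    """All sets of distinct ranks in smallest..n that sum to w."""
    if w == 0:
        yield ()
        return
    for r in range(smallest, min(w, n) + 1):
        for rest in _distinct_rank_sets(w - r, n, r + 1):
            yield (r,) + rest

# First few flip sets for a 7-bit word: (), (1,), (2,), (1,2), (3,), ...
# and at weight 7 both (7,) and (3,4) appear, as in the example above.
```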
    “If your car isn’t working, soft information might tell you that it is probably the battery. But if it isn’t the battery alone, maybe it is the battery and the alternator together that are causing the problem. This is how a rational person would troubleshoot — you’d say that it could actually be these two things together before going down the list to something that is much less likely,” Médard says.
This approach is much more efficient than that of traditional decoders, which instead examine the code structure and have performance that is generally designed for the worst case.
    “With a traditional decoder, you’d pull out the blueprint of the car and examine each and every piece. You’ll find the problem, but it will take you a long time and you’ll get very frustrated,” Médard explains.
    ORBGRAND stops sorting as soon as a code word is found, which is often very soon. The chip also employs parallelization, generating and testing multiple noise patterns simultaneously so it finds the code word faster. Because the decoder stops working once it finds the code word, its energy consumption stays low even though it runs multiple processes simultaneously.
    Record-breaking efficiency
When the researchers compared their chip with other hardware, ORBGRAND decoded with maximum accuracy while consuming only 0.76 picojoules of energy per bit, breaking the previous performance record. ORBGRAND consumes between 10 and 100 times less energy than other devices.
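For scale, a quick back-of-the-envelope conversion (the 1 gigabit-per-second rate here is an illustrative assumption, not a figure from the paper): at that throughput, decoding would draw

$$P = E_\text{bit} \times R = 0.76\ \text{pJ/bit} \times 10^{9}\ \text{bit/s} \approx 0.76\ \text{mW},$$

a tiny fraction of a mobile device's power budget.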
    One of the biggest challenges of developing the new chip came from this reduced energy consumption, Médard says. With ORBGRAND, generating noise sequences is now so energy-efficient that other processes the researchers hadn’t focused on before, like checking the code word in a code book, consume most of the effort.
    “Now, this checking process, which is like turning on the car to see if it works, is the hardest part. So, we need to find more efficient ways to do that,” she says.
    The team is also exploring ways to change the modulation of transmissions so they can take advantage of the improved efficiency of the ORBGRAND chip. They also plan to see how their technique could be utilized to more efficiently manage multiple transmissions that overlap.
The research is funded, in part, by the U.S. Defense Advanced Research Projects Agency (DARPA) and Science Foundation Ireland.


    MoBIE enables modern microscopy with massive data sets

High-resolution microscopy techniques, such as electron microscopy or super-resolution microscopy, produce huge amounts of data. The visualization, analysis and dissemination of such large imaging data sets pose significant challenges. Now, these tasks can be carried out using MoBIE, which stands for Multimodal Big Image Data Exploration, a new user-friendly, freely available tool developed by researchers from the University of Göttingen and EMBL Heidelberg. This means that researchers, such as biologists who rely on high-resolution microscopy techniques, can incorporate multiple data sets to study the processes of life at the very smallest scales. Their method has now been published in Nature Methods.



    Let there be (controlled) light

In the very near future, quantum computers are expected to revolutionize the way we compute, with new approaches to database searches, AI systems, simulations and more. But achieving such novel quantum technology applications requires photonic integrated circuits that can effectively control photonic quantum states — the so-called qubits. Physicists from the Helmholtz-Zentrum Dresden-Rossendorf (HZDR), TU Dresden and Leibniz-Institut für Kristallzüchtung (IKZ) have made a breakthrough in this effort: for the first time, they demonstrated the controlled creation of single-photon emitters in silicon at the nanoscale, as they report in Nature Communications.
Photonic integrated circuits, or PICs for short, utilize particles of light — photons — rather than the electrons that run in electronic integrated circuits. The main difference between the two: a photonic integrated circuit provides functions for information signals imposed on optical wavelengths, typically in the near-infrared spectrum. “Actually, these PICs with many integrated photonic components are able to generate, route, process and detect light on a single chip,” says Dr. Georgy Astakhov, Head of Quantum Technologies at HZDR’s Institute of Ion Beam Physics and Materials Research. He adds: “This modality is poised to play a key role in upcoming future technology, such as quantum computing. And PICs will lead the way.”
Until now, quantum photonics experiments were notorious for their massive use of “bulk optics” distributed across the optical table and occupying the entire lab. Now, photonic chips are radically changing this landscape. Their miniaturization, stability and suitability for mass production might turn them into the workhorse of modern-day quantum photonics.
    From random to control mode
    Monolithic integration of single-photon sources in a controllable way would give a resource-efficient route to implement millions of photonic qubits in PICs. To run quantum computation protocols, these photons must be indistinguishable. With this, industrial-scale photonic quantum processor production would become feasible.
    However, the currently established fabrication method stands in the way of the compatibility of this promising concept with today’s semiconductor technology.
In a first attempt, reported about two years ago, the researchers were already able to generate single photons on a silicon wafer, but only in a random, non-scalable way. Since then, they have come far. “Now, we show how focused ion beams from liquid metal alloy ion sources are used to place single-photon emitters at desired positions on the wafer while obtaining a high creation yield and high spectral quality,” says physicist Dr. Nico Klingner.
    Furthermore, the scientists at HZDR subjected the same single-photon emitters to a rigorous material testing program: After several cooling-down and warming-up cycles, they did not observe any degradation of their optical properties. These findings meet the preconditions required for mass production later on.
    To translate this achievement into a widespread technology, and allow for wafer-scale engineering of individual photon emitters on the atomic scale compatible with established foundry manufacturing, the team implemented broad-beam implantation in a commercial implanter through a lithographically defined mask. “This work really allowed us to take advantage of the state-of-the-art silicon processing cleanroom and electron beam lithography machines at the Nano Fabrication facility Rossendorf,” explains Dr. Ciarán Fowley, Cleanroom group leader and Head of Nanofabrication and Analysis.
    Using both methods, the team can create dozens of telecom single-photon emitters at predefined locations with a spatial accuracy of about 50 nm. They emit in the strategically important telecommunication O-band and exhibit stable operation over days under continuous-wave excitation.
The scientists are convinced that the realization of controllable fabrication of single-photon emitters in silicon makes them a highly promising candidate for photonic quantum technologies, with a fabrication pathway compatible with very large-scale integration. These single-photon emitters are now technologically ready for production in semiconductor fabs and incorporation into the existing telecommunication infrastructure.


    Theory can sort order from chaos in complex quantum systems

    It’s not easy to make sense of quantum-scale motion, but a new mathematical theory developed by scientists at Rice University and Oxford University could help — and may provide insight into improving a variety of computing, electrochemical and biological systems.
    The theory developed by Rice theorist Peter Wolynes and Oxford theoretical chemist David Logan gives a simple prediction for the threshold at which large quantum systems switch from orderly motion like a clock to random, erratic motion like asteroids moving around in the early solar system. Using a computational analysis of a photosynthesis model, collaborators at the University of Illinois Urbana-Champaign showed that the theory can predict the nature of the motions in a chlorophyll molecule when it absorbs energy from sunlight.
    The theory applies to any sufficiently complex quantum system and may give insights into building better quantum computers. It could also, for instance, help design features of next-generation solar cells or perhaps make batteries last longer.
    The study is published this week in the Proceedings of the National Academy of Sciences.
    Nothing is ever completely still on the molecular level, especially when quantum physics plays a role. A water droplet gleaming on a leaf may look motionless, but inside, over a sextillion molecules are vibrating nonstop. Hydrogen and oxygen atoms and the subatomic particles within them — the nuclei and electrons — constantly move and interact.
    “In thinking about the motions of individual molecules at quantum scale, there is often this comparison to the way we think of the solar system,” Wolynes said. “You learn that there are eight planets in our solar system, each one with a well-defined orbit. But in fact, the orbits interact with each other. Nevertheless, the orbits are very predictable. You can go to a planetarium, and they’ll show you what the sky looked like 2,000 years ago. A lot of the motions of the atoms in molecules are exactly that regular or clocklike.”
    When Wolynes and Logan first posed the question of predicting the regularity or randomness of quantum motion, they tested their math against observations of vibrational motions in individual molecules.

“You only have to know two things about a molecule to be able to analyze its quantum motion patterns,” Wolynes said. “First, you need to know the vibrational frequencies of its particles — that’s to say, the frequencies at which the vibrations, which are like the orbits, occur — and, second, how these vibrations nonlinearly interact with each other. These anharmonic interactions depend mostly on the mass of atoms. For organic molecules, you can predict how strongly those vibrational orbits would interact with one another.”
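In textbook notation (a generic form chosen here for illustration, not necessarily the paper's), those two ingredients are the harmonic frequencies $\omega_i$ and anharmonic couplings such as $\phi_{ijk}$ in a vibrational Hamiltonian:

$$H = \sum_i \hbar\omega_i\left(n_i + \tfrac{1}{2}\right) + \sum_{i \leq j \leq k} \phi_{ijk}\, q_i\, q_j\, q_k + \cdots,$$

where $q_i$ are the vibrational coordinates and $n_i$ the corresponding quantum numbers; the harmonic terms give the clocklike “orbits,” while the cubic and higher terms couple them.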
    Things are more complicated when the molecules also dramatically change structure, for instance as a result of a chemical reaction.
    “As soon as we start looking at molecules that chemically react or rearrange their structure, we know that there’s at least some element of unpredictability or randomness in the process because, even in classical terms, the reaction either happens, or it doesn’t happen,” Wolynes said. “When we try to understand how chemical changes occur, there’s this question: Is the overall motion more clocklike or is it more irregular?”
    Aside from their nonstop vibrations, which happen without light, electrons can have quantum-level interactions that sometimes lead to a more dramatic turn.
    “Because they’re very light, electrons normally move thousands of times faster than the centers of the atoms, the nuclei,” he said. “So though they are constantly moving, the electrons’ orbits smoothly adjust to what the nuclei do. But every now and again, the nuclei come to a place where the electronic energies will almost be equal whether the excitation is on one molecule or on the other. That’s what’s called a surface crossing. At that point, the excitation has a chance to jump from one electronic level to another.”
    Predicting at which point the transfer of energy that takes place during photosynthesis turns from orderly motion to randomness or dissipation would take a significant amount of time and effort by direct computation.

    “It is very nice that we have a very simple formula that determines when this happens,” said Martin Gruebele, a chemist at the University of Illinois Urbana-Champaign and co-author on the study who is a part of the joint Rice-Illinois Center for Adapting Flaws into Features (CAFF) funded by the National Science Foundation. “That’s something we just didn’t have before and figuring it out required very lengthy calculations.”
    The Logan-Wolynes theory opens up a wide array of scientific inquiry ranging from the theoretical exploration of the fundamentals of quantum mechanics to practical applications.
    “The Logan-Wolynes theory did pretty well in terms of telling you at roughly what energy input you’d get a change in quantum-system behavior,” Wolynes said. “But one of the interesting things that the large-scale computations of (co-author Chenghao) Zhang and Gruebele found is that there are these exceptions that stand out from all the possible orbiting patterns you might have. Occasionally there’s a few stragglers where simple motions persist for long times and don’t seem to get randomized. One of the questions we’re going to pursue in the future is how much that persistent regularity is actually influencing processes like photosynthesis.
    “Another direction that is being pursued at Rice where this theory can help is the problem of making a quantum computer that behaves as much as possible in a clocklike way,” he said. “You don’t want your computers to be randomly changing information. The larger and more sophisticated you make a computer, the likelier it is that you’ll run into some kind of randomization effects.”
    Gruebele and collaborators at Illinois also plan to use these ideas in other scientific contexts. “One of our goals, for instance, is to design better human-built light-harvesting molecules that might consist of carbon dots that can transfer the energy to their periphery where it can be harvested,” Gruebele said.
    Wolynes is Rice’s Bullard-Welch Foundation Professor of Science and a professor of chemistry, of biochemistry and cell biology, of physics and astronomy and of materials science and nanoengineering and co-director of its Center for Theoretical Biological Physics (CTBP), which is funded by the National Science Foundation. Logan is the Coulson Professor of Theoretical Chemistry at Oxford. Gruebele is the James R. Eiszner Endowed Chair in Chemistry and Zhang is a graduate student in physics at the University of Illinois Urbana-Champaign.
The James R. Eiszner Chair in Chemistry and the Physics Department at Illinois, the Bullard-Welch Chair at Rice (C-0016) and the National Science Foundation (PHY-2019745) supported the research.


    The quantum twisting microscope: A new lens on quantum materials

    One of the striking aspects of the quantum world is that a particle, say, an electron, is also a wave, meaning that it exists in many places at the same time. In a new study, reported today in Nature, researchers from the Weizmann Institute of Science make use of this property to develop a new type of tool — the quantum twisting microscope (QTM) — that can create novel quantum materials while simultaneously gazing into the most fundamental quantum nature of their electrons. The study’s findings may be used to create electronic materials with unprecedented functionalities.
    The QTM involves the “twisting,” or rotating, of two atomically-thin layers of material with respect to one another. In recent years, such twisting has become a major source of discoveries. It began with the discovery that placing two layers of graphene, one-atom-thick crystalline sheets of carbon, one atop the other with a slight relative twist angle, leads to a “sandwich” with unexpected new properties. The twist angle turned out to be the most critical parameter for controlling the behavior of electrons: Changing it by merely one-tenth of a degree could transform the material from an exotic superconductor into an unconventional insulator. But critical as it is, this parameter is also the hardest to control in experiments. By and large, twisting two layers to a new angle requires building a new “sandwich” from scratch, a process that is very long and tedious.
    “Our original motivation was to solve this problem by building a machine that could continuously twist any two materials with respect to one another, readily producing an infinite range of novel materials,” says team leader Prof. Shahal Ilani of Weizmann’s Condensed Matter Physics Department. “However, while building this machine, we discovered that it can also be turned into a very powerful microscope, capable of seeing quantum electronic waves in ways that were unimaginable before.”
    Creating a quantum picture
    Pictures have long played a central role in scientific discovery. Light microscopes and telescopes routinely provide images that allow scientists to gain a deeper understanding of biological and astrophysical systems. Taking pictures of electrons inside materials, on the other hand, has for many years been notoriously hard, owing to the small dimensions involved. This was transformed some 40 years ago with the invention of the scanning tunneling microscope, which earned its developers the 1986 Nobel Prize in Physics. This microscope uses an atomically sharp needle to scan the surface of a material, measuring the electric current and gradually building an image of the distribution of electrons in the sample.
    “Many different scanning probes have been developed since this invention, each measuring a different electronic property, but all of them measure these properties at one location at a time. So, they mostly see electrons as particles, and can only indirectly learn about their wave nature,” explains Prof. Ady Stern from the Weizmann Institute, who took part in the study along with three other theoretical physicists from the same department: Profs. Binghai Yan, Yuval Oreg and Erez Berg. “As it turned out, the tool that we have built can visualize the quantum electronic waves directly, giving us a way to unravel the quantum dances they perform inside the material,” Stern says.

    Spotting an electron in several places at once
    “The trick for seeing quantum waves is to spot the same electron in different locations at the same time,” says Alon Inbar, a lead author on the paper. “The measurement is conceptually similar to the famous two-slit experiment, which was used a century ago to prove for the first time that electrons in quantum mechanics have a wave nature,” adds Dr. John Birkbeck, another lead author. “The only difference is that we perform such an experiment at the tip of our scanning microscope.”
    To achieve this, the researchers replaced the atomically sharp tip of the scanning tunneling microscope with a tip that contains a flat layer of a quantum material, such as a single layer of graphene. When this layer is brought into contact with the surface of the sample of interest, it forms a two-dimensional interface across which electrons can tunnel at many different locations. Quantum mechanically, they tunnel in all locations simultaneously, and the tunneling events at different locations interfere with each other. This interference allows an electron to tunnel only if its wave functions on both sides of the interface match exactly. “To see a quantum electron, we have to be gentle,” says Ilani. “If we don’t ask it the rude question ‘Where are you?’ but instead provide it with multiple routes to cross into our detector without us knowing where it actually crossed, we allow it to preserve its fragile wave-like nature.”
    Twist and tunnel
    Generally, the electronic waves in the tip and the sample propagate in different directions and therefore do not match. The QTM uses its twisting capability to find the angle at which matching occurs: By continuously twisting the tip with respect to the sample, the tool causes their corresponding wave functions to also twist with respect to one another. Once these wave functions match on both sides of the interface, tunneling can occur. The twisting therefore allows the QTM to map how the electronic wave function depends on momentum, similarly to the way lateral translations of the tip enable the mapping of its dependence on position. Merely knowing at which angles electrons cross the interface supplies the researchers with a great deal of information about the probed material. In this manner they can learn about the collective organization of electrons within the sample, their speed, energy distribution, patterns of interference and even the interactions of different waves with one another.
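Schematically (a standard momentum-resolved tunneling picture, assumed here as an illustration rather than quoted from the paper), the current at twist angle $\theta$ probes how well the two layers' electronic states overlap once the tip's momenta are rotated by $\theta$:

$$I(\theta) \propto \sum_{\mathbf{k}} \int d\omega\; |t_{\mathbf{k}}|^2\, A_{\mathrm{tip}}(R_\theta\mathbf{k}, \omega)\, A_{\mathrm{sample}}(\mathbf{k}, \omega)\, \left[f_{\mathrm{tip}}(\omega) - f_{\mathrm{sample}}(\omega)\right],$$

where $A$ are the spectral functions of tip and sample, $R_\theta$ is a rotation by the twist angle, and $f$ are the occupation functions. The current peaks when the rotated tip dispersion lines up with the sample's — the matching condition the QTM scans for.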
    A new twist on quantum materials
    “Our microscope will give scientists a new kind of ‘lens’ for observing and measuring the properties of quantum materials,” says Jiewen Xiao, another lead author.
    The Weizmann team has already applied their microscope to studying the properties of several key quantum materials at room temperature and is now gearing up toward doing new experiments at temperatures of a few kelvins, where some of the most exciting quantum mechanical effects are known to take place.
Peering so deeply into the quantum world can help reveal fundamental truths about nature. In the future, it might also have a tremendous effect on emerging technologies. The QTM will provide researchers with access to an unprecedented spectrum of new quantum interfaces, as well as new “eyes” for discovering quantum phenomena within them.


    Reducing social media use significantly improves body image in teens, young adults

    Teens and young adults who reduced their social media use by 50% for just a few weeks saw significant improvement in how they felt about both their weight and their overall appearance compared with peers who maintained consistent levels of social media use, according to research published by the American Psychological Association.
“Adolescence is a vulnerable period for the development of body image issues, eating disorders and mental illness,” said lead author Gary Goldfield, PhD, of Children’s Hospital of Eastern Ontario Research Institute. “Youth are spending, on average, between six and eight hours per day on screens, much of it on social media. Social media can expose users to hundreds or even thousands of images and photos every day, including those of celebrities and fashion or fitness models, which we know leads to an internalization of beauty ideals that are unattainable for almost everyone, resulting in greater dissatisfaction with body weight and shape.”
    However, much of the psychological research on social media, body image and mental health is correlational, according to Goldfield, so it is uncertain whether people with body image and mental health issues spend more time on social media or if social media use leads to greater body image and mental health issues.
    To better understand the causal effects of reducing social media use on body image, Goldfield and his colleagues previously conducted a pilot study with 38 undergraduate students with elevated levels of anxiety and/or depression. Some of the participants were asked to limit their social media use to no more than 60 minutes per day, while others were allowed unrestricted access. Compared with participants who had unlimited access, participants who restricted their use showed improvements in how they regarded their overall appearance (but not their weight) after three weeks. Due to the small sample size, though, the researchers were unable to conduct a meaningful analysis of the effect of gender.
    The current experiment, involving 220 undergraduate students aged 17-25 (76% female, 23% male, 1% other) and published in the journal Psychology of Popular Media, sought to expand the pilot study and address the gender limitation. In order to qualify, participants had to be regular social media users (at least two hours per day on their smartphones) and exhibit symptoms of depression or anxiety.
For the first week of the experiment, all participants were instructed to use their social media as they normally would. Social media use was measured using a screen-time tracking program to which participants provided a daily screenshot. After the first week, half the participants were instructed to reduce their social media use to no more than 60 minutes per day. At the start of the experiment, participants also responded to a series of statements about their overall appearance (e.g., “I’m pretty happy about the way I look”) and weight (e.g., “I am satisfied with my weight”) on a 5-point scale, with 1 indicating “never” and 5 “always.” Participants completed a similar questionnaire at the end of the experiment.
    For the next three weeks, participants who were instructed to restrict their social media use reduced it by approximately 50% to an average of 78 minutes per day versus the control group, which averaged 188 minutes of social media use per day.
    Participants who reduced their social media use had a significant improvement in how they regarded both their overall appearance and body weight after the three-week intervention, compared with the control group, who saw no significant change. Gender did not appear to make any difference in the effects.
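For readers curious what such a comparison looks like in practice, here is a hypothetical sketch (all numbers are synthetic, invented purely for illustration — they are not the study's data) of testing whether pre-to-post change differs between a restricted group and a control group:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic 5-point appearance-esteem scores for two groups of 110
# (invented for illustration; not data from the study).
pre_restricted = rng.normal(2.8, 0.6, 110)
post_restricted = pre_restricted + rng.normal(0.4, 0.3, 110)  # assumed gain
pre_control = rng.normal(2.8, 0.6, 110)
post_control = pre_control + rng.normal(0.0, 0.3, 110)        # assumed no change

# Compare change scores between groups; an independent-samples t-test
# is a simple stand-in for the repeated-measures analysis a study
# like this would typically report.
t, p = stats.ttest_ind(post_restricted - pre_restricted,
                       post_control - pre_control)
print(f"t = {t:.2f}, p = {p:.3g}")
```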
    “Our brief, four-week intervention using screen-time trackers showed that reducing social media use yielded significant improvements in appearance and weight esteem in distressed youth with heavy social media use,” said Goldfield. “Reducing social media use is a feasible method of producing a short-term positive effect on body image among a vulnerable population of users and should be evaluated as a potential component in the treatment of body-image-related disturbances.”
While the current study was conducted as a proof of concept, Goldfield and his colleagues are in the process of conducting a larger study to see if reduction in social media use can be maintained for longer periods and whether that reduction can lead to even greater psychological benefits.