More stories

  • Pythagoras was wrong: there are no universal musical harmonies, new study finds

    The tone and tuning of musical instruments have the power to manipulate our appreciation of harmony, new research shows. The findings challenge centuries of Western music theory and encourage greater experimentation with instruments from different cultures.
    According to the Ancient Greek philosopher Pythagoras, ‘consonance’ — a pleasant-sounding combination of notes — is produced by special relationships between simple numbers such as 3 and 4. More recently, scholars have tried to find psychological explanations, but these ‘integer ratios’ are still credited with making a chord sound beautiful, and deviation from them is thought to make music ‘dissonant’, unpleasant sounding.
    But researchers from Cambridge University, Princeton and the Max Planck Institute for Empirical Aesthetics have now discovered two key ways in which Pythagoras was wrong.
    Their study, published in Nature Communications, shows that in normal listening contexts, we do not actually prefer chords to be perfectly in these mathematical ratios.
    “We prefer slight amounts of deviation. We like a little imperfection because this gives life to the sounds, and that is attractive to us,” said co-author, Dr Peter Harrison, from Cambridge University’s Faculty of Music and Director of its Centre for Music and Science.
    The researchers also found that the role played by these mathematical relationships disappears when you consider certain musical instruments that are less familiar to Western musicians, audiences and scholars. These instruments tend to be bells, gongs, types of xylophones and other kinds of pitched percussion instruments. In particular, they studied the ‘bonang’, an instrument from the Javanese gamelan built from a collection of small gongs.
    “When we use instruments like the bonang, Pythagoras’s special numbers go out the window and we encounter entirely new patterns of consonance and dissonance,” Dr Harrison said.

    “The shape of some percussion instruments means that when you hit them, and they resonate, their frequency components don’t respect those traditional mathematical relationships. That’s when we find interesting things happening.”
    “Western research has focused so much on familiar orchestral instruments, but other musical cultures use instruments that, because of their shape and physics, are what we would call ‘inharmonic’.”
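    As a rough illustration of what ‘inharmonic’ means here, the sketch below (Python, using approximate textbook partial ratios for an idealized string and an idealized free bar; these are not the study’s own stimuli) checks how many overtones of two notes a perfect fifth (3:2) apart coincide. For the harmonic string spectrum some partials line up; for the bar-like spectrum essentially none do.

```python
import numpy as np

# Hypothetical spectra: an idealized string has harmonic partials (integer multiples
# of the fundamental); an idealized uniform free bar -- a rough stand-in for bell- or
# gong-like instruments -- has inharmonic partials (approximate textbook ratios).
f0 = 220.0                                                 # arbitrary fundamental in Hz
spectra = {
    "string (harmonic)": f0 * np.arange(1, 9),             # 1, 2, 3, ... times f0
    "bar (inharmonic)":  f0 * np.array([1.0, 2.76, 5.40, 8.93]),
}

for name, lower in spectra.items():
    upper = 1.5 * lower                                    # the same instrument a 3:2 "fifth" higher
    # count partials of the upper note that nearly coincide (within 1%) with the lower note's
    matches = sum(any(abs(u - p) / p < 0.01 for p in lower) for u in upper)
    print(f"{name}: {matches} of {len(upper)} upper-note partials align with the lower note")
```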
    The researchers created an online laboratory in which over 4,000 people from the US and South Korea participated in 23 behavioural experiments. Participants were played chords and invited to give each a numeric pleasantness rating or to use a slider to adjust particular notes in a chord to make it sound more pleasant. The experiments produced over 235,000 human judgments.
    The experiments explored musical chords from different perspectives. Some zoomed in on particular musical intervals and asked participants to judge whether they preferred them perfectly tuned, slightly sharp or slightly flat. The researchers were surprised to find a significant preference for slight imperfection, or ‘inharmonicity’. Other experiments explored harmony perception with Western and non-Western musical instruments, including the bonang.
    Instinctive appreciation of new kinds of harmony
    The researchers found that the bonang’s consonances mapped neatly onto the particular musical scale used in the Indonesian culture from which it comes. These consonances cannot be replicated on a Western piano, for instance, because they would fall between the cracks of the scale traditionally used.

    “Our findings challenge the traditional idea that harmony can only be one way, that chords have to reflect these mathematical relationships. We show that there are many more kinds of harmony out there, and that there are good reasons why other cultures developed them,” Dr Harrison said.
    Importantly, the study suggests that its participants — not trained musicians and unfamiliar with Javanese music — were able to appreciate the new consonances of the bonang’s tones instinctively.
    “Music creation is all about exploring the creative possibilities of a given set of qualities, for example, finding out what kinds of melodies you can play on a flute, or what kinds of sounds you can make with your mouth,” Harrison said.
    “Our findings suggest that if you use different instruments, you can unlock a whole new harmonic language that people intuitively appreciate; they don’t need to study it to appreciate it. A lot of experimental music in the last 100 years of Western classical music has been quite hard for listeners because it involves highly abstract structures that are hard to enjoy. In contrast, psychological findings like ours can help stimulate new music that listeners intuitively enjoy.”
    Exciting opportunities for musicians and producers
    Dr Harrison hopes that the research will encourage musicians to try out unfamiliar instruments and see if they offer new harmonies and open up new creative possibilities.
    “Quite a lot of pop music now tries to marry Western harmony with local melodies from the Middle East, India, and other parts of the world. That can be more or less successful, but one problem is that notes can sound dissonant if you play them with Western instruments.
    “Musicians and producers might be able to make that marriage work better if they took account of our findings and considered changing the ‘timbre’, the tone quality, by using specially chosen real or synthesised instruments. Then they really might get the best of both worlds: harmony and local scale systems.”
    Harrison and his collaborators are exploring different kinds of instruments and planning follow-up studies with a broader range of cultures. In particular, they would like to gain insights from musicians who use ‘inharmonic’ instruments to understand whether they have internalised concepts of harmony different from those of the Western participants in this study.

  • Maths: Smart learning software helps children during lockdowns — and beyond

    Intelligent tutoring systems for maths problems helped pupils maintain or even increase their performance during the coronavirus pandemic. This is the conclusion of a new study led by the Martin Luther University Halle-Wittenberg (MLU) and Loughborough University in the UK. As part of their work, the researchers analysed data from five million exercises done by around 2,700 pupils in Germany over a period of five years. The study found that lower-performing children in particular benefit if they use the software regularly. The paper was published in the journal Computers and Education Open.
    Intelligent tutoring systems are digital learning platforms that children can use to complete maths problems. “The advantage of those rapid learning aids is that pupils receive immediate feedback after they submit their solution. If a solution is incorrect, the system will provide further information about the pupil’s mistake. If certain errors are repeated, the system recognises a deficit and provides further problem sets that address the issue,” explains Assistant Professor Dr Markus Spitzer, a psychologist at MLU. Teachers could also use the software to discover possible knowledge gaps in their classes and adapt their lessons accordingly.
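    To make that feedback loop concrete, here is a minimal Python sketch of how such a system could respond. It is purely illustrative: the diagnosis rule, threshold and messages are invented and are not Bettermarks’ actual logic. The loop checks each answer, guesses the likely mistake, and assigns extra practice once the same error type repeats.

```python
from collections import Counter

def diagnose(problem, answer):
    """Hypothetical error diagnosis: map a wrong answer to an error category."""
    a, b = problem["operands"]
    if problem["op"] == "+" and answer == a - b:
        return "mixing up + and -"
    return "a calculation slip"

class TutoringSession:
    """Minimal adaptive loop: immediate feedback, extra practice on repeated errors."""
    def __init__(self, threshold=3):
        self.errors = Counter()
        self.threshold = threshold          # assumed number of repeats before intervening

    def submit(self, problem, answer):
        if answer == problem["solution"]:
            return "Correct!"
        kind = diagnose(problem, answer)
        self.errors[kind] += 1
        feedback = f"Not quite: this looks like {kind}."
        if self.errors[kind] >= self.threshold:
            feedback += " Assigning a follow-up problem set on this skill."
        return feedback

# A pupil makes the same mistake three times: the third response adds targeted practice.
session = TutoringSession()
problem = {"operands": (7, 5), "op": "+", "solution": 12}
for _ in range(3):
    print(session.submit(problem, 2))       # 7 - 5 = 2: subtracting instead of adding
```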
    For the new study, Spitzer and his colleague Professor Korbinian Moeller from Loughborough University used data from “Bettermarks,” a large commercial provider of such tutoring systems in Germany. The team analysed the performance of pupils before, during and after the first two coronavirus lockdowns. Their analysis included data from around 2,700 children who solved more than five million problems. The data was collected between January 2017 and the end of May 2021. “This longer timeframe allowed us to observe the pupils’ performance trajectories over several years and analyse them in a wider context,” says Spitzer.
    The students’ performance was shown to remain constant throughout the period. “The fact that their performance didn’t drop during the lockdowns is a win in and of itself. But our analysis also shows that lower-performing children even managed to narrow the gap between themselves and higher achieving pupils,” Spitzer concludes.
    According to the psychologist, intelligent tutoring systems are a useful addition to conventional maths lessons. “The use of tutoring systems varies greatly from state to state. However, our study suggests that their use should be expanded across the board,” explains Spitzer. The systems could also help during future school closures, for example in the event of extreme weather conditions, transport strikes or similar events.

  • Visual prosthesis simulator offers a glimpse into the future

    In collaboration with their colleagues at the Donders Institute, researchers at the Netherlands Institute for Neuroscience have developed a simulator that generates artificial visual percepts for research into visual prostheses. This open-source tool is available to researchers and offers anyone interested an insight into the technology’s future application.
    Blindness affects approximately forty million people worldwide and is expected to become increasingly common in the coming years. Patients with a damaged visual system can be broadly divided into two groups: those in whom the damage is located in front of or in the photoreceptors of the retina; and those in whom the damage is further along in the visual system. Various retinal prostheses have been developed for the first group of patients in recent years and clinical tests are underway. The problems for the second group are more difficult to tackle.
    A potential solution for these patients is to stimulate the cerebral cortex. By implanting electrodes in the brain’s visual cortex and stimulating the surrounding tissue with weak electrical currents, tiny points of light known as ‘phosphenes’ can be generated. This prosthesis converts camera input into electrical stimulation of the cerebral cortex. In doing so, it bypasses part of the affected visual system and thus allows some form of vision. You could compare it with a matrix sign along the highway, where individual lights form a combined image.
    How we can ensure that such an implant can actually be used to navigate the street or read texts remains an important question. Maureen van der Grinten and Antonia Lozano, from Pieter Roelfsema’s group, along with colleagues from the Donders Institute, are members of a large European consortium. This consortium is working on a prosthesis that focuses on the visual cerebral cortex. Maureen van der Grinten emphasizes: “At the moment there is a discrepancy between the number of electrodes we can implant in people and the functionalities we would like to test. The hardware is simply not there yet. To bridge this gap, the process is often imitated through a simulation.”
    Simulated Phosphene Vision
    “Instead of waiting until blind people have received implants, we’re trying to simulate the situation based on the knowledge we have. We can use that as a basis to see how many points of light people need to find a door, for example. We call this ‘simulated phosphene vision’. So far this has only been tested with simple shapes: 200 light points that are neatly oriented, rectangular pixels of equal size on a screen. People can test this with VR glasses, which is very useful, but does not correspond to the actual vision of blind people with a prosthesis.”
    “To make our simulation more realistic, we collected a whole load of literature, created and validated models and looked at the extent to which the results correspond to the effects that people reported. It turns out that the dots vary greatly in shape and size depending on the parameters used in the stimulation. You can imagine that if you increase the current, the stimulation in the brain will spread further, hit more neurons and therefore provide a larger bright spot. The location of the electrode also determines the size of the dots. By influencing the various parameters, we looked at how this actually changes what people see.”
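    A very reduced sketch of the idea in Python (not the consortium’s published simulator; the grid size, brightness threshold and current-to-size relation are assumptions): sample a camera image at a grid of electrode positions and render each sample as a Gaussian light spot whose size grows with the stimulation current.

```python
import numpy as np

def simulate_phosphenes(image, grid=(20, 20), current_ua=50.0, size_per_ua=0.02):
    """image: 2D array with values in [0, 1]; returns a phosphene-style rendering."""
    h, w = image.shape
    out = np.zeros((h, w))
    sigma = 1.0 + size_per_ua * current_ua               # assumed: spot size grows with current
    yy, xx = np.mgrid[0:h, 0:w]
    for cy in np.linspace(0, h - 1, grid[0]):            # one "electrode" per grid point
        for cx in np.linspace(0, w - 1, grid[1]):
            brightness = image[int(cy), int(cx)]         # camera input sampled at this electrode
            if brightness > 0.1:                         # weak input: no phosphene elicited
                out += brightness * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(out, 0, 1)

# A bright doorway-like bar seen through 100 versus 1600 simulated phosphenes.
scene = np.zeros((100, 100))
scene[20:90, 45:60] = 1.0
coarse = simulate_phosphenes(scene, grid=(10, 10))       # roughly today's electrode counts
fine = simulate_phosphenes(scene, grid=(40, 40))         # a future, denser implant
print(coarse.shape, fine.shape)
```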
    Publicly Accessible
    “The simulator is currently being used for research in Nijmegen, where they are investigating the impact of eye movements. With this article we hope to offer other researchers the opportunity to use our simulation as well. We would like to emphasize that the simulator is publicly accessible to everyone, with the flexibility to make adjustments where necessary. It is even possible to optimize the simulation using AI, which can assist you in identifying the necessary stimulation for a specific image.”
    “We are now also using the simulator to give people an idea of where this research could go and what to expect when the first treatments are carried out in a few years. Using VR glasses we can simulate the current situation with 100 electrodes, which also highlights how limited vision through a prosthesis is: they may be able to find a door, but won’t have the ability to recognize facial expressions. Alternatively, we can show a situation with tens of thousands of electrodes and what that will bring us when this technology is developed far enough.”

  • Researchers use Hawk supercomputer and lean into imperfection to improve solar cell efficiency

    Since the turn of the century, Germany has made major strides in solar energy production. In 2000, the country generated less than one percent of its electricity with solar power, but by 2022, that figure had risen to roughly 11 percent. A combination of lucrative subsidies for homeowners and technological advances to bring down the costs of solar panels helped drive this growth.
    With global conflicts making oil and natural gas markets less reliable, solar power stands to play an even larger role in helping meet Germany’s energy needs in the years to come. While solar technology has come a long way in the last quarter century, the solar cells in contemporary solar panels still only operate at about 22 percent efficiency on average.
    In the interest of improving solar cell efficiency, a research team led by Prof. Wolf Gero Schmidt at the University of Paderborn has been using high-performance computing (HPC) resources at the High-Performance Computing Center Stuttgart (HLRS) to study how these cells convert light to electricity. Recently, the team has been using HLRS’s Hawk supercomputer to determine how designing certain strategic impurities in solar cells could improve performance.
    “Our motivation on this is two-fold: at our institute in Paderborn, we have been working for quite some time on a methodology to describe microscopically the dynamics of optically excited materials, and we have published a number of pioneering papers about that topic in recent years,” Schmidt said. “But recently, we got a question from collaborators at the Helmholtz Zentrum Berlin who were asking us to help them understand at a fundamental level how these cells work, so we decided to use our method and see what we could do.”
    Recently, the team used Hawk to simulate how excitons — a pairing of an optically excited electron and the electron “hole” it leaves behind — can be controlled and moved within solar cells so more energy is captured. In its research, the team made a surprising discovery: it found that certain defects in the system, introduced strategically, would improve exciton transfer rather than impede it. The team published its results in Physical Review Letters.
    Designing solar cells for more efficient energy conversion
    Most solar cells, much like many modern electronics, are primarily made of silicon. After oxygen, it is the second most abundant chemical element in the Earth’s crust. Silicon makes up around 15 percent of our planet by mass, including 25.8 percent of the Earth’s crust. The basic material for climate-friendly energy production is therefore abundant and available almost everywhere.

    However, this material does have certain drawbacks for capturing solar radiation and converting it into electricity. In traditional, silicon-based solar cells, light particles, called photons, transfer their energy to available electrons in the solar cell. The cell then uses those excited electrons to create an electrical current.
    The problem? High-energy photons provide far more energy than what can be transformed into electricity by silicon. Violet light photons, for instance, have about three electron volts (eV) of energy, but silicon is only able to convert about 1.1 eV of that energy into electricity. The rest of the energy is lost as heat, which both wastes energy that could have been captured and reduces solar cell performance and durability.
    In recent years, scientists have started to look for ways to reroute or otherwise capture some of that excess energy. While several methods are being investigated, Schmidt’s team has focused on using a molecule-thin layer of tetracene, an organic semiconductor material, as the top layer of a solar cell.
    Unlike silicon, when tetracene receives a high-energy photon, it splits the resulting exciton into two lower-energy excitations in a process known as singlet fission. By placing a carefully designed interface layer between tetracene and silicon, the resulting low-energy excitons can be transferred from tetracene into silicon, where most of their energy can be converted into electricity.
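    A back-of-the-envelope illustration of that energy bookkeeping, using the figures quoted above (a roughly 3 eV violet photon and roughly 1.1 eV usable by silicon); the even split and lossless transfer are idealizations for illustration, not the team’s simulation results:

```python
SILICON_USABLE_EV = 1.1       # roughly what silicon converts per excitation (figure from the article)
photon_ev = 3.0               # roughly a violet photon (figure from the article)

# Conventional silicon cell: one excitation per photon, the excess becomes heat.
direct = min(photon_ev, SILICON_USABLE_EV)
heat = photon_ev - direct

# Idealized singlet fission: the excitation is split into two lower-energy excitations,
# each handed to silicon with no further losses (a simplification).
per_excitation = photon_ev / 2
with_fission = 2 * min(per_excitation, SILICON_USABLE_EV)

print(f"direct:        {direct:.1f} eV used, {heat:.1f} eV lost as heat")
print(f"with fission:  {with_fission:.1f} eV used")
```

    For this single high-energy photon the idealized gain is large; averaged over the whole solar spectrum, the closing quote of this article puts the realistic ceiling at a factor of about 1.4.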
    Utility in imperfection
    Whether using tetracene or another material to augment traditional solar cells, researchers have focused on trying to design the perfect interface between constituent parts of a solar cell to provide the best-possible conditions for exciton transfer.

    Schmidt and his team use ab initio molecular dynamics (AIMD) simulations to study how particles interact and move within a solar cell. With access to Hawk, the team is able to do computationally expensive calculations to observe how several hundred atoms and their electrons interact with one another. The team uses AIMD simulations to advance time at femtosecond intervals to understand how electrons interact with electron holes and other atoms in the system. Much like other researchers, the team sought to use its computational method to identify imperfections in the system and look for ways to improve on it.
    In search of the perfect interface, they found a surprise: that an imperfect interface might be better for exciton transfer. In an atomic system, atoms that are not fully saturated, meaning they are not completely bonded to other atoms, have so-called “dangling bonds.” Researchers normally assume dangling bonds lead to inefficiencies in electronic interfaces, but in its AIMD simulations, the team found that silicon dangling bonds actually fostered additional exciton transfer across the interface.
    “Defect always implies that there is some unwanted thing in a system, but that is not really true in our case,” said Prof. Uwe Gerstmann, a University of Paderborn professor and collaborator on the project. “In semiconductor physics, we have already strategically used defects that we call donors or acceptors, which help us build diodes and transistors. So strategically, defects can certainly help us build up new kinds of technologies.”
    Dr. Marvin Krenz, a postdoctoral researcher at the University of Paderborn and lead author on the team’s paper, pointed out the contradiction in the team’s findings compared to the current state of solar cell research. “It is an interesting point for us that the current direction of the research was going toward designing ever-more perfect interfaces and to remove defects at all costs. Our paper might be interesting for the larger research community because it points out a different way to go when it comes to designing these systems,” he said.
    Armed with this new insight, the team now plans to use its future computing power to design interfaces that are perfectly imperfect, so to speak. Knowing that silicon dangling bonds can help foster this exciton transfer, the team wants to use AIMD to reliably design an interface with improved exciton transfer. For the team, the goal is not to design the perfect solar cell overnight, but to continue to make subsequent generations of solar technology better.
    “I feel confident that we will continue to gradually improve solar cell efficiency over time,” Schmidt said. “Over the last few decades, we have seen an average annual increase in efficiency of around 1% across the various solar cell architectures. Work such as that carried out here suggests that further increases can be expected in the future. In principle, an increase in efficiency by a factor of 1.4 is possible through the consistent utilization of singlet fission.”

  • Resurrecting niobium for quantum science

    For years, niobium was considered an underperformer when it came to superconducting qubits. Now scientists supported by Q-NEXT have found a way to engineer a high-performing niobium-based qubit and so take advantage of niobium’s superior qualities.
    When it comes to quantum technology, niobium is making a comeback.
    For the past 15 years, niobium has been sitting on the bench after experiencing a few mediocre at-bats as a core qubit material.
    Qubits are the fundamental components of quantum devices. One qubit type relies on superconductivity to process information.
    Touted for its superior qualities as a superconductor, niobium was always a promising candidate for quantum technologies. But scientists found niobium difficult to engineer as a core qubit component, and so it was relegated to the second string on Team Superconducting Qubit.
    Now, a group led by Stanford University’s David Schuster has demonstrated a way to create niobium-based qubits that rival the state-of-the-art for their class.
    “This was a promising first foray, having resurrected niobium junctions. … With niobium-based qubits’ broad operational reach, we open up a whole new set of capabilities for future quantum technologies.” — David Schuster, Stanford University
    “We’ve shown that niobium is relevant again, expanding the possibilities of what we can do with qubits,” said Alexander Anferov of the University of Chicago’s Physical Science division, one of the lead scientists of the result.

    The team’s work is published in Physical Review Applied and was supported in part by Q-NEXT, a U.S. Department of Energy (DOE) National Quantum Information Science Research Center led by DOE’s Argonne National Laboratory.
    By harnessing niobium’s standout features, scientists will be able to expand the capabilities of quantum computers, networks and sensors. These quantum technologies draw on quantum physics to process information in ways that outclass their traditional counterparts and are expected to improve areas as varied as medicine, finance and communication.
    The niobium advantage
    When it comes to superconducting qubits, aluminum has ruled the roost. Aluminum-based superconducting qubits can store information for a relatively long time before the data inevitably disintegrates. These longer coherence times mean more time for processing information.
    The longest coherence times for an aluminum-based superconducting qubit are a few hundred millionths of a second. By contrast, in recent years, the best niobium-based qubits yielded coherence times that are 100 times shorter — a few hundred billionths of a second.
    Despite that short qubit lifetime, niobium held attractions. A niobium-based qubit can operate at higher temperatures than its aluminum counterpart and so would require less cooling. It can also operate across an eight-times-greater frequency range and a massive 18,000-times-wider magnetic field range compared to aluminum-based qubits, expanding the menu of uses for the superconducting-qubit family.

    In one respect, there was no contest between the two materials: Niobium’s operating range trounced aluminum’s. But for years, the short coherence time made the niobium-based qubit a nonstarter.
    “No one really made that many qubits out of niobium junctions because they were limited by their coherence,” Anferov said. “But our group wanted to make a qubit that could work at higher temperatures and a greater frequency range — at 1 K and 100 gigahertz. And for both of those properties, aluminum is not sufficient. We needed something else.”
    So, the team had another look at niobium.
    Losing the lossiness
    Specifically, they had a look at the niobium Josephson junction. The Josephson junction is the information-processing heart of the superconducting qubit.
    In classical information processing, data comes in bits that are either 0s or 1s. In quantum information processing, a qubit is a mixture of 0 and 1. The superconducting qubit’s information “lives” as a mixture of 0 and 1 inside the junction. The longer the junction can sustain the information in that mixed state, the better the junction and the better the qubit.
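    As standard background (not specific to this study), that “mixture” is a quantum superposition, usually written as:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
```

    where the coefficients set how much of 0 and 1 the state contains; the coherence time measures how long the junction preserves such a state before noise destroys it.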
    The Josephson junction is structured like a sandwich, consisting of a layer of nonconducting material squeezed between two layers of superconducting metal. A conductor is a material that provides easy passage for electrical current. A superconductor kicks it up a notch: It carries electrical current with zero resistance. Electromagnetic energy flows between the junction’s outer layers in the mixed quantum state.
    The typical, trusty aluminum Josephson junction is made of two layers of aluminum and a middle layer of aluminum oxide. A typical niobium junction is made of two layers of niobium and a middle layer of niobium oxide.
    Schuster’s group found that the junction’s niobium oxide layer sapped the energy required to sustain quantum states. They also identified the niobium junctions’ supporting architecture as a big source of energy loss, causing the qubit’s quantum state to fizzle out.
    The team’s breakthrough involved both a new junction arrangement and a new fabrication technique.
    The new arrangement called on a familiar friend: aluminum. The design did away with the energy-sucking niobium oxide. And instead of two distinct materials, it used three. The result was a low-loss, trilayer junction — niobium, aluminum, aluminum oxide, aluminum, niobium.
    “We did this best-of-both-worlds approach,” Anferov said. “The thin layer of aluminum can inherit the superconducting properties of the niobium nearby. This way, we can use the proven chemical properties of aluminum and still have the superconducting properties of niobium.”
    The group’s fabrication technique involved removing scaffolding that supported the niobium junction in previous schemes. They found a way to maintain the junction’s structure while getting rid of the loss-inducing, extraneous material that hampered coherence in previous designs.
    “It turns out just getting rid of the garbage helped,” Anferov said.
    A new qubit is born
    After incorporating their new junction into superconducting qubits, the Schuster group achieved a coherence time of 62 millionths of a second, 150 times longer than its best-performing niobium predecessors. The qubits also exhibited a quality factor — an index of how well a qubit stores energy — of 2.57 × 10⁵, a 100-fold improvement over previous niobium-based qubits and competitive with aluminum-based qubit quality factors.
    “We’ve made this junction that still has the nice properties of niobium, and we’ve improved the loss properties of the junction,” Anferov said. “We can directly outperform any aluminum qubit because aluminum is an inferior material in many ways. I now have a qubit that doesn’t die at higher temperatures, which is the big kicker.”
    The results will likely elevate niobium’s place in the lineup of superconducting qubit materials.

  • What math tells us about social dilemmas

    Human coexistence depends on cooperation. Individuals have different motivations and reasons to collaborate, resulting in social dilemmas, such as the well-known prisoner’s dilemma. Scientists from the Chatterjee group at the Institute of Science and Technology Austria (ISTA) now present a new mathematical principle that helps to understand the cooperation of individuals with different characteristics. The results, published in PNAS, can be applied to economics or behavioral studies.
    A group of neighbors shares a driveway. Following a heavy snowstorm, the entire driveway is covered in snow, requiring clearance for daily activities. The neighbors have to collaborate. If they all put on their down jackets, grab their snow shovels, and start digging, the road will be free in a very short amount of time. If only one or a few of them take the initiative, the task becomes more time-consuming and labor-intensive. Assuming nobody does it, the driveway will stay covered in snow. How can the neighbors overcome this dilemma and cooperate in their shared interests?
    Scientists in the Chatterjee group at the Institute of Science and Technology Austria (ISTA) deal with cooperative questions like that on a regular basis. They use game theory to lay the mathematical foundation for decision-making in such social dilemmas. The group’s latest publication delves into the interactions between different types of individuals in a public goods game. Their new model, published in PNAS, explores how resources should be allocated for the best overall well-being and how cooperation can be maintained.
    The game of public goods
    For decades, the public goods game has been a proven method to model social dilemmas. In this setting, participants decide how much of their own resources they wish to contribute for the benefit of the entire group. Most existing studies considered homogeneous individuals, assuming that they do not differ in their motivations and other characteristics. “In the real world, that’s not always the case,” says Krishnendu Chatterjee. To account for this, Valentin Hübner, a PhD student, Christian Hilbe, and Maria Kleshina, both former members of the Chatterjee group, started modeling settings with diverse individuals. A recent analysis of social dilemmas among unequals, published in 2019, laid the foundation for their work, which now presents a more general model, even allowing multi-player interaction.
    “The public good in our game can be anything, such as environmental protection or combating climate change, to which everybody can contribute,” Hübner explains. The players have different levels of skills. In public goods games, skills typically refer to productivity. “It’s the ability to contribute to a particular task,” Hübner continues. Resources, technically called endowment or wealth, on the other hand, refer to the actual things that participants contribute to the common good.
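    A generic public goods game with heterogeneous players can be sketched in a few lines (illustrative only; the synergy factor and payoff form below are the textbook version, not necessarily the exact model in the PNAS paper): each player keeps whatever endowment they do not contribute, contributions are scaled by the player’s productivity, pooled, multiplied by a factor r, and shared equally.

```python
import numpy as np

def payoffs(endowments, productivities, contributions, r=1.6):
    """Payoffs in a linear public goods game with skill-weighted contributions."""
    e = np.asarray(endowments, float)
    p = np.asarray(productivities, float)
    c = np.asarray(contributions, float)
    public_good = r * np.sum(p * c)          # what the group jointly produces
    share = public_good / len(e)             # shared equally among all players
    return e - c + share                     # keep the uncontributed endowment, plus your share

# Two neighbours with equal endowments but different shovelling skill.
e, p = [10.0, 10.0], [1.0, 2.0]
print(payoffs(e, p, contributions=[10.0, 10.0]))   # both cooperate: [24. 24.]
print(payoffs(e, p, contributions=[0.0, 10.0]))    # one free-rides:  [26. 16.]
```

    The free-rider ends up better off than the cooperator, which is exactly the dilemma; the study asks how endowments should be distributed across players of different skill so that cooperation nevertheless survives and overall welfare stays high.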
    In the snowy driveway scenario, the neighbors vary significantly in their available resources and in their abilities to use them. Solving the problem requires them to cooperate. But what role does their inequality play in such a dilemma?

    The two sides of inequality
    Hübner’s new model provides answers to this question. Intuitively, it proposes that for diverse individuals to sustain cooperation, a more equal distribution of resources is necessary. Surprisingly, more equality does not lead to maximum general welfare. To reach this, the resources should be allocated to more skilled individuals, resulting in a slightly uneven distribution. “Efficiency benefits from unequal endowment, while robustness always benefits from equal endowment,” says Hübner. Put simply, for accomplishing a task, resources should be distributed almost evenly. Yet, if efficiency is the goal, resources should be in the hands of those more willing to participate — but only to a certain extent.
    What is more important — cooperation efficiency or stability? The scientists’ further simulations of learning processes suggest that individuals balance the trade-off between these two things. Whether this is also the case in the real world remains to be seen. Numerous interpersonal nuances also contribute to these dynamics, including aspects like reciprocity, morality, and ethical issues, among others.
    Hübner’s model solely focuses on cooperation from a mathematical standpoint. Yet, due to its generality, it can be applied to any social dilemma with diverse individuals, like climate change, for instance. Testing the model in the real world and applying it to society are very interesting experimental directions. “I’m quite sure that there will be behavioral experiments benefiting from our work in the future,” says Chatterjee. The study could potentially also be interesting for economics, where the new model’s principles can help to better inform economic systems and policy recommendations.

  • Reimagining electron microscopy: Bringing high-end resolution to lower-cost microscopes

    Researchers at the University of Illinois at Urbana-Champaign have shown for the first time that expensive aberration-corrected microscopes are no longer required to achieve record-breaking microscopic resolution.
    The field of microscopy is in the middle of a great revolution. Since the 1800s and the invention of the compound light microscope, there have only been a few major jumps in resolution to see different length scales: from bacteria and cells, to viruses and proteins, and even down to single atoms. Generally, as resolution has been making these incredible jumps, so has the price of the microscopes used to achieve that resolution. Such hefty price tags severely limit the accessibility of these instruments. The current jump in resolution comes from a new technique called electron ptychography — a method that uses computation to boost the resolution of electron microscopes — which has taken the field by storm in the last 5-6 years.
    Researchers at the University of Illinois Urbana-Champaign have demonstrated record-breaking resolution using electron ptychography on “conventional” transmission electron microscopes (conventional meaning without expensive aberration correctors). This breaks the trend of increasing microscope price with increasing resolution. They were able to achieve deep sub-angstrom spatial resolution down to 0.44 angstrom (one angstrom is one ten-billionth of a meter), which exceeds the resolution of aberration-corrected tools and rivals their highest ptychographic resolutions.
    “For the last 90-100 years, our field has thought that the way to do great microscopy is to make better and better microscopes,” says materials science & engineering professor Pinshane Huang, who led this work. “The most exciting thing about our research is that we’re showing that you don’t need a cutting-edge microscope to make this work. We can take a ‘conventional’ microscope and do the same thing, using ptychography, and it’s just as good! This is amazing because there can be a multi-million-dollar difference in cost between the two setups.”
    This research, co-first authored by former MatSE UIUC postdoctoral researcher Kayla Nguyen, former MatSE UIUC graduate student Chia-Hao Lee and Argonne National Laboratory staff scientist Yi Jiang, was recently published in the journal Science.
    Before ptychography, the highest resolution electron microscopes used a technology called aberration-correction to allow scientists to see individual atoms. Rather than using a beam of light to probe a sample, electron microscopes use a beam of electrons, focused by electromagnets. Electrons have wavelengths thousands of times smaller than visible light, which allows electron microscopes to resolve objects that are many times smaller than can be seen with optical microscopes. Scientists use these microscopes to decode the structures of objects ranging from the spike protein on the COVID-19 virus to the arrangements of atoms in graphene and, more generally, to peer inside matter to understand its atomic structure, composition and bonding.
    However, one of the challenges of using beams of electrons is focusing that beam. “It’s impossible to make a perfect lens for electrons,” Huang says. “What people have been doing to compensate is making ‘bad’ lenses, and then putting aberration correctors after them, which are a series of ‘bad’ lenses that are ‘bad’ in opposite ways. Summed together, they make ‘okay’ lenses, and that’s been the gold standard for how we image at the atomic scale for at least 20 years.”
    In optics, an aberration is any way that a lens deviates from a perfect lens. For example, human eyes can have several types of aberrations such as near- and far-sightedness (eyes can’t focus at all distances) and astigmatism (an irregular curvature of the eye that causes blurred vision). Lee explains, “For electromagnetic lenses, the way to focus these electrons is through an electromagnetic field. But we don’t have a great way of controlling the shape and the strength of the electromagnetic field, which puts a very strong limitation on how precisely we can focus these electrons.” In aberration-corrected microscopy, the current cutting-edge technology, an extra stack of lenses corrects the aberrations from the regular lenses, changing the shape of the beam before it hits the sample. Those extra aberration-correcting lenses are where significant costs are added to the microscope.

    While it is impossible to make a perfect lens, the goal of the last 100 years has been to continuously make better lenses to minimize aberrations. But Huang says, “What’s exciting about ptychography is that you don’t have to make better and better lenses. What we can do instead is use computers.”
    Rather than using a stack of lens optics to remove aberrations, ptychography removes them computationally. With a new generation of detectors, called hybrid pixel detectors, that cost a few hundred thousand dollars (compared to aberration-corrected microscopes that cost up to $7 million) and computer algorithms, this method can double, triple or even quadruple the resolution of what a microscope can achieve with its physical lenses. Huang and her team have shown that their approach quadruples the resolution of conventional transmission electron microscopes. Further, nearly any scanning transmission electron microscope can now be adapted to achieve state-of-the-art resolution at a fraction of the cost.
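    The flavour of such a reconstruction can be sketched with the widely used ePIE update rule (a minimal, illustrative toy on synthetic data with a known probe, not the authors’ pipeline): at each scan position the computed diffraction pattern has its magnitude replaced by the measured one, and the difference is fed back into the object estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, step = 64, 16, 4                                   # object size, probe size, scan step (pixels)

# Synthetic ground truth: a phase object scanned by a Gaussian probe.
obj_true = np.exp(1j * rng.uniform(0, 1, (N, N)))
yy, xx = np.mgrid[:P, :P]
probe = np.exp(-(((yy - P / 2) ** 2 + (xx - P / 2) ** 2) / (2 * 3.0 ** 2)))

positions = [(y, x) for y in range(0, N - P, step) for x in range(0, N - P, step)]
patch = lambda a, pos: a[pos[0]:pos[0] + P, pos[1]:pos[1] + P]

# "Measured" data: only diffraction magnitudes are recorded (the phase is lost).
data = {pos: np.abs(np.fft.fft2(patch(obj_true, pos) * probe)) for pos in positions}

def residual(o):
    r = [np.mean(np.abs(np.abs(np.fft.fft2(patch(o, pos) * probe)) - data[pos])) for pos in positions]
    return float(np.mean(r))

obj = np.ones((N, N), complex)                           # flat initial guess
print("residual before:", round(residual(obj), 4))
alpha = 1.0
for _ in range(50):
    for pos in positions:
        o = patch(obj, pos)
        psi = o * probe                                  # exit-wave estimate
        Psi = np.fft.fft2(psi)
        Psi = data[pos] * np.exp(1j * np.angle(Psi))     # enforce the measured magnitude
        psi_new = np.fft.ifft2(Psi)
        y, x = pos                                       # ePIE object update
        obj[y:y + P, x:x + P] += alpha * np.conj(probe) / np.max(np.abs(probe) ** 2) * (psi_new - psi)
print("residual after: ", round(residual(obj), 4))
```

    In practice the probe, including its aberrations, is refined alongside the object, which is how the lens imperfections end up being corrected in the computer rather than in hardware.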
    While this approach is game-changing, Huang notes that ptychography is still a challenging technique that requires a lot of computation power. It can take hours to get a single reconstruction to reach the best resolution. But, as with many other technologies, computation advances quite rapidly and gets cheaper, faster and easier to use.
    “We brought a cutting-edge technique, electron ptychography, to conventional transmission electron microscopes to show for the first time that a ‘mediocre’ microscope can do just as well as the most expensive microscopes on the market,” Huang says. “This is significant for the hundreds of institutions across the country and across the world who previously couldn’t afford the cutting edge. Now, all they need is a detector, some computers and electron ptychography. And once you do that, you can see the atomic world with much more detail than anyone imagined even 10 years ago. This represents a huge paradigm shift.”

  • Robots, monitoring and healthy ecosystems could halve pesticide use without hurting productivity

    Smarter crop farming that combats weeds, insect pests and plant diseases by integrating modern technologies like AI-based monitoring, robotics, and next-generation biotechnology with healthy and resilient agricultural ecosystems: that is the vision of One Crop Health, a new research collaboration based at the University of Copenhagen that aims to reduce pesticide use by developing sustainable agriculture for the future.
    In keeping with the age-old saying that prevention is better than cure, the researchers leading the recently launched One Crop Health project envision more sustainable approaches awaiting the farmers of the future without compromising their productivity.
    Backed by DKK 60 million (€8.05M) from Novo Nordisk, researchers from the University of Copenhagen will gather knowledge over the next six years, in collaboration with researchers from Aarhus University and Rothamsted Research in the UK, to develop smarter agriculture that is both sustainable and able to produce enough food for the world’s growing population.
    For many years, growers have relied on pesticides to control the diseases, pests and weeds that cause the loss of one-third of crop yields worldwide. At the same time, estimates show that we will need 60% more food than today by 2050.
    However, pesticides also threaten health, nature and biodiversity, and there is increasing pressure from society and politicians to limit their use. Recently the desire for healthier and more sustainable crops resulted in EU policy plans for a 50% reduction in pesticides by 2030. These plans have now been put on hold after being challenged by farmers concerned that pesticide reduction could make farming unviable by hurting productivity.
    So, can growers halve their use of pesticides without the profession becoming unproductive and the world running out of food? This is the central question that One Crop Health seeks to answer.
    Making pesticide reduction profitable for farmers
    “The political controversy, farmer and industry concern, and the uncertainty over regulation clearly show that more research is needed to enable this transition. Most farmers actually want to use fewer pesticides, but there is a need for research that demonstrates how this is possible whilst maintaining healthy, high-yielding, and profitable crops. The One Crop Health project aims to bridge the gap by developing research that places the drive to reduce pesticide use on a scientific foundation, ultimately helping farmers make the rational decision to reduce their pesticide use,” says Professor Paul Neve from the Department of Plant and Environmental Sciences.

    According to the project’s lead researcher, a viable transition is possible, though, by using a more holistic approach to farming.
    “Generally the focus needs to move away from solving individual problems by way of a few blunt tools like pesticides. Oftentimes, this approach creates new problems elsewhere, which then also need to be addressed. Planting fields densely to avoid weeds is another example of this lack of oversight, where the misguided solution ends up creating the optimal conditions for fungi and diseases instead. We need to get better at understanding entire ecosystems and then make use of all of the modern tools available,” says Professor Paul Neve.
    “If we can create healthy ecosystems that will reduce the numbers of pests, weed and diseases, it will simply reduce the need for spraying. We can largely replace the remaining need with other tools, for example, AI-based monitoring and modelling can help to inform where and when pests need to be controlled and new solutions such as bio-pesticides can be used to achieve that,” he says.
    Natural defenses have been replaced with chemicals
    The resilience that results from healthy ecosystems can reduce challenges that are currently solved by means like pesticides.
    “Today’s crops are the result of thousands of years of efforts to make crop production more efficient. In the process, many crops have lost their natural defences against pests, weeds and diseases,” explains Paul Neve.

    According to the researcher, tomorrow’s growers can get better at protecting their plants by looking at how things are done in nature. Here, helpful microorganisms like bacteria and fungi protect against diseases, and healthy crops are more competitive against weeds. At the same time, pests that threaten crops will be better controlled by their natural enemies.
    “If you think of the field as an entire ecosystem that needs to thrive (hence the name One Crop Health), we believe you get a preventive overall effect. At the same time, modern knowledge and technology can, for example, change the basic need for pesticides. Whereas entire fields are sprayed today, drone surveillance will allow us to target only the areas where weeds threaten the crop, or not to spray at all, and let robots do the work instead,” says the professor.
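    A toy example of the kind of saving targeted spraying can give (entirely illustrative; the weed map, threshold and dose below are made up and are not project data): spray only grid cells where a drone-derived weed-cover map exceeds a threshold, and compare the total dose with blanket spraying.

```python
import numpy as np

rng = np.random.default_rng(1)
weed_cover = rng.random((50, 50)) ** 3        # synthetic drone map: mostly low cover, a few hotspots
THRESHOLD = 0.5                               # assumed action threshold for spot treatment
DOSE_PER_CELL = 1.0                           # arbitrary pesticide units per treated cell

blanket = weed_cover.size * DOSE_PER_CELL
spot = np.count_nonzero(weed_cover > THRESHOLD) * DOSE_PER_CELL

print(f"blanket spraying: {blanket:.0f} units")
print(f"spot spraying:    {spot:.0f} units ({100 * (1 - spot / blanket):.0f}% less)")
```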
    100 farmers to help researchers
    “Part of the project is about working with farmers, using their fields to discover smart solutions, so future agriculture can be based on the best possible knowledge,” says Paul Neve.
    In collaboration with 100 farms, distributed equally across Denmark and England, the researchers will begin by collecting data from scratch.
    “We begin by asking farmers about their current challenges, how they will try to solve them and about what works and what doesn’t. To this knowledge bank, we will add our professional knowledge of ecosystems, modern technologies and methods, and then the goal when the six years have passed is to be able to give a lot of knowledge and concrete methods back to farmers,” says Paul Neve.
    Drone data and models will predict field needs
    The final prong in the effort to achieve a holistic understanding comes from broad knowledge and the modern technologies that provide access to it.
    In an interdisciplinary collaboration with the Department of Computer Science at the University of Copenhagen, a group of computer scientists will translate information from surveillance with modern technologies such as drones, and on that basis design so-called digital twins of farmers’ fields.
    “They will provide a kind of model that can predict how fields will behave, the needs that are just around the corner and how different solutions will affect the fields,” says Paul Neve.
    In the second half of the six-year project, 11 Ph.D. students will develop knowledge about specific solutions within their respective focus areas.
    “When the six years have passed, it is important for us that we have integrated data and new tools in a holistic way, providing farmers with concrete methods that they can use to solve their challenges in everyday life more sustainably without compromising productivity,” Paul Neve concludes.