More stories

  • Quantum dots boost perovskite solar cell efficiency and scalability

    Perovskites are hybrid compounds made from metal halides and organic constituents. They show great potential in a range of applications, e.g. LEDs, lasers, and photodetectors, but their biggest promise lies in solar cells, where they are poised to take over the market from their silicon counterparts.
    One of the obstacles facing the commercialization of perovskite solar cells is that their power-conversion efficiency and operational stability drop as they scale up, making it a challenge to maintain high performance in a complete solar cell.
    The problem lies partly with the cell’s electron-transport layer, which ensures that the electrons produced when the cell absorbs light are transferred efficiently to the device’s electrode. In perovskite solar cells, the electron-transport layer is made of mesoporous titanium dioxide, which has low electron mobility and is also susceptible to adverse photocatalytic reactions under ultraviolet light.
    In a new publication in Science, scientists led by Professor Michael Grätzel at EPFL and Dr Dong Suk Kim at the Korea Institute of Energy Research report an innovative way to raise the performance of perovskite solar cells and maintain it at a high level even at large scales. The key idea was to replace the electron-transport layer with a thin layer of quantum dots.
    Quantum dots are nanometer-sized particles that act as semiconductors and emit light of specific wavelengths (colors) when they are illuminated. Their unique optical properties make them ideal for use in a variety of optical applications, including photovoltaic devices.
    The scientists replaced the titanium dioxide electron-transport layer of their perovskite cells with a thin layer of polyacrylic acid-stabilized tin(IV) oxide quantum dots, and found that it enhanced the devices’ light-capturing capacity while also suppressing nonradiative recombination, an efficiency-sapping phenomenon that sometimes takes place at the interface between the electron-transport layer and the perovskite layer itself.
    With the quantum dot layer, perovskite solar cells of 0.08 square centimeters attained a record power-conversion efficiency of 25.7% (certified 25.4%) and high operational stability, while also making scale-up easier. When the surface area of the solar cells was increased to 1, 20, and 64 square centimeters, the power-conversion efficiency measured 23.3%, 21.7%, and 20.6%, respectively.
    Other contributors: Ulsan National Institute of Science and Technology, University of Ulsan, Zurich University of Applied Sciences, and Uppsala University.
    Story Source:
    Materials provided by Ecole Polytechnique Fédérale de Lausanne. Original written by Nik Papageorgiou.

  • Advancing materials science with the help of biology and a dash of dish soap

    Compounds that form tiny crystals hold secrets that could advance renewable energy generation and semiconductor development. Revealing the arrangement of their atoms has already allowed for breakthroughs in materials science and solar cells. However, existing techniques for determining these structures can damage sensitive microcrystals.
    Now scientists have a new tool in their tool belts: a system for investigating microcrystals by the thousands with ultrafast pulses from an X-ray free-electron laser (XFEL), which can collect structural information before damage sets in. This approach, developed over the past decade to study proteins and other large biological molecules at the Department of Energy’s SLAC National Accelerator Laboratory, has now been applied for the first time to small molecules that are of interest to chemistry and materials science.
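    Crystallography, serial or otherwise, ultimately rests on diffraction: X-rays scattered by planes of atoms interfere constructively only at angles satisfying Bragg's condition, which is what allows a pattern of spots to be inverted into atomic positions. The relation below is the standard textbook one, included here only as background, not as a result of the new work.

        n \lambda = 2 d \sin\theta

    Here \lambda is the X-ray wavelength, d the spacing between atomic planes, and \theta the scattering angle; in this serial approach, such reflections are collected one ultrafast snapshot at a time from thousands of microcrystals and merged into a single structure.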
    Researchers from the University of Connecticut, SLAC, DOE’s Lawrence Berkeley National Laboratory and other institutions developed the new process, called small molecule serial femtosecond X-ray crystallography or smSFX, to determine the structures of three compounds that form microcrystal powders, including two that were previously unknown. The experiments took place at SLAC’s Linac Coherent Light Source (LCLS) XFEL and the SACLA XFEL in Japan.
    The new approach is likely to have a big impact since it should be “broadly applicable across XFEL and synchrotron radiation facilities equipped for serial crystallography,” the research team wrote in a paper published today in Nature.
    Disentangling metal compounds
    Researchers used the method to determine the structures of two metal-organic materials, thiorene and tethrene, for the first time. Both are potential candidates for use in next-generation field effect transistors, energy storage devices, and solar cells and panels. Mapping thiorene and tethrene allowed researchers to better understand why some other metal-organic materials glow bright blue under ultraviolet light, which the scientists compared to Frodo’s magical sword, Sting, in The Lord of the Rings.

  • Researchers simulate behavior of living 'minimal cell' in three dimensions

    Scientists report that they have built a living “minimal cell” with a genome stripped down to its barest essentials — and a computer model of the cell that mirrors its behavior. By refining and testing their model, the scientists say they are developing a system for predicting how changes to the genomes, living conditions or physical characteristics of live cells will alter how they function.
    They report their findings in the journal Cell.
    Minimal cells have pared-down genomes that carry the genes necessary to replicate their DNA, grow, divide and perform most of the other functions that define life, said Zaida (Zan) Luthey-Schulten, a chemistry professor at the University of Illinois Urbana-Champaign who led the work with graduate student Zane Thornburg. “What’s new here is that we developed a three-dimensional, fully dynamic kinetic model of a living minimal cell that mimics what goes on in the actual cell,” Luthey-Schulten said.
    The simulation maps out the precise location and chemical characteristics of thousands of cellular components in 3D space at an atomic scale. It tracks how long it takes for these molecules to diffuse through the cell and encounter one another, what kinds of chemical reactions occur when they do, and how much energy is required for each step.
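    The published model is vastly more detailed, but the basic bookkeeping of such a spatial kinetic simulation, particles diffusing through a volume, encountering one another, and reacting, can be illustrated with a minimal sketch. All species names, rates, and lattice dimensions below are illustrative assumptions, not values from the study.

        # Minimal sketch of a spatial kinetic simulation: particles of two species
        # diffuse on a small 3D lattice and may react (A + B -> C) when they meet.
        # Species, rates, and lattice size are illustrative, not from the study.
        import numpy as np

        rng = np.random.default_rng(0)
        L = 10                 # lattice sites per dimension (illustrative)
        n_particles = 200
        steps = 500
        p_react = 0.1          # reaction probability on co-location (illustrative)

        pos = rng.integers(0, L, size=(n_particles, 3))   # particle positions
        species = rng.integers(0, 2, size=n_particles)    # 0 = A, 1 = B
        alive = np.ones(n_particles, dtype=bool)
        products = 0
        moves = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                          [0, -1, 0], [0, 0, 1], [0, 0, -1]])

        for _ in range(steps):
            # Diffusion: every particle takes one random-walk step (periodic box).
            pos = (pos + moves[rng.integers(0, 6, size=n_particles)]) % L
            # Reaction: an A and a B sharing a lattice site may combine into C.
            idx = np.flatnonzero(alive)
            key = pos[idx, 0] * L * L + pos[idx, 1] * L + pos[idx, 2]
            order = np.argsort(key)
            for a, b in zip(order[:-1], order[1:]):
                i, j = idx[a], idx[b]
                if (key[a] == key[b] and alive[i] and alive[j]
                        and species[i] != species[j] and rng.random() < p_react):
                    alive[i] = alive[j] = False
                    products += 1

        print(f"products formed: {products}, free particles left: {alive.sum()}")

    A real whole-cell model replaces these toy rules with measured diffusion coefficients and reaction rates for each molecular species, which is what makes the full simulation so demanding.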
    To build the minimal cell, scientists at the J. Craig Venter Institute in La Jolla, California, turned to the simplest living cells — the mycoplasmas, a genus of bacteria that parasitize other organisms. In previous studies, the JCVI team built a synthetic genome missing as many nonessential genes as possible and grew the cell in an environment enriched with all the nutrients and factors needed to sustain it. For the new study, the team added back a few genes to improve the cell’s viability. This cell is simpler than any naturally occurring cell, making it easier to model on a computer.
    Simulating something as enormous and complex as a living cell relies on data from decades of research, Luthey-Schulten said. To build the computer model, she and her colleagues at Illinois had to account for the physical and chemical characteristics of the cell’s DNA; lipids; amino acids; and gene-transcription, translation and protein-building machinery. They also had to model how each component diffused through the cell, keeping track of the energy required for each step in the cell’s life cycle. NVIDIA graphics processing units were used to perform the simulations.
    “We built a computer model based on what we knew about the minimal cell, and then we ran simulations,” Thornburg said. “And we checked to see if our simulated cell was behaving like the real thing.”
    The simulations gave the researchers insight into how the actual cell “balances the demands of its metabolism, genetic processes and growth,” Luthey-Schulten said. For example, the model revealed that the cell used the bulk of its energy to import essential ions and molecules across its cell membrane. This makes sense, Luthey-Schulten said, because mycoplasmas get most of what they need to survive from other organisms.
    The simulations also allowed Thornburg to calculate the natural lifespan of messenger RNAs, the genetic blueprints for building proteins. They also revealed a relationship between the rate at which lipids and membrane proteins were synthesized and changes in membrane surface area and cell volume.
    “We simulated all of the chemical reactions inside a minimal cell — from its birth until the time it divides two hours later,” Thornburg said. “From this, we get a model that tells us about how the cell behaves and how we can complexify it to change its behavior.”
    “We developed a three-dimensional, fully dynamic kinetic model of a living minimal cell,” Luthey-Schulten said. “Our model opens a window on the inner workings of the cell, showing us how all of the components interact and change in response to internal and external cues. This model — and other, more sophisticated models to come — will help us better understand the fundamental principles of life.”

  • Towards quantum simulation of false vacuum decay

    Phase transitions are everywhere, ranging from water boiling to snowflakes melting, and from magnetic transitions in solids to cosmological phase transitions in the early universe. Particularly intriguing are quantum phase transitions that occur at temperatures close to absolute zero and are driven by quantum rather than thermal fluctuations.
    Researchers at the University of Cambridge studied the properties of quantum phases and their transitions using ultracold atoms in an optical lattice potential (formed by a set of standing-wave lasers). Typically, the transition from a Mott insulator (MI) to a superfluid (SF), which is governed by the interplay between the atom-atom interactions and the hopping of atoms, is a continuous transition, in which the system changes smoothly as it crosses the phase transition point.
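    The competition between on-site atom-atom interactions and hopping between lattice sites is conventionally captured by the Bose-Hubbard model. The Hamiltonian below is the standard textbook form, given for orientation rather than as the exact model used in the study:

        \hat{H} = -J \sum_{\langle i,j \rangle} \hat{b}_i^{\dagger} \hat{b}_j + \frac{U}{2} \sum_i \hat{n}_i (\hat{n}_i - 1) - \mu \sum_i \hat{n}_i

    where J is the hopping amplitude, U the on-site interaction strength, and \mu the chemical potential; for U much larger than J the ground state is a Mott insulator, while for J much larger than U it is a superfluid.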
    However, many phase transitions are discontinuous, such as water freezing to ice, or the transition thought to have triggered the inflation period in the early universe. These are called ‘first-order transitions’ and for instance allow both phases to coexist — just like ice blocks in a glass of water — and can lead to hysteresis and metastability, where a system remains stuck in its original phase (the false vacuum) even though the ground state has changed.
    By resonantly shaking the position of the lattice potential, the researchers could couple or “mix” the first two bands of the lattice. For the right parameters, this can excite the atoms from the lowest band into the first excited band, where they form a new superfluid in which the atoms appear at the edge of the band. Crucially, the transition from the original Mott insulator in the lowest band to the resulting staggered superfluid in the excited band can be first-order (discontinuous), because the non-staggered order in the Mott insulator is incompatible with the staggered order of this superfluid — so the system has to choose one. The researchers could directly observe the metastability and hysteresis associated with this first-order transition by monitoring how fast, if at all, one phase changed into the other. The findings are published in the journal Nature Physics.
    “We realised a very flexible platform where phase transitions could be tuned from continuous to discontinuous by changing the shaking strength. This demonstration opens up new opportunities for exploring the role of quantum fluctuations in first-order phase transitions, for instance, the false vacuum decay in the early universe,” said first author Dr Bo Song from Cambridge’s Cavendish Laboratory. “It is really fascinating that we are on the road to cracking the mystery of the hot and dense early universe using such a cold and tiny atomic ensemble.”
    “We are excited to enhance the scope of quantum simulators from condensed matter settings towards potential simulations of the early universe. While there clearly is a long way still to go, this work is an important first step,” added Professor Ulrich Schneider, who led the research at the Cavendish Laboratory. “This work also provides a testbed for exploring the spontaneous formation of spatial structures when a strongly interacting quantum system undergoes a discontinuous transition.”
    “The underlying physics involves ideas that have a long history at the Cavendish, from Nevill Mott (on correlations) to Pyotr Kapitsa (on superfluidity), and even using shaking to effect dynamical control in a manner explained by Kapitsa but put to use in a way he would never have envisaged,” explained Professor Nigel Cooper, also from the Cavendish.
    The research is funded in part by the European Research Council (ERC), the UK Engineering and Physical Sciences Research Council (EPSRC), and the Simons Foundation.
    Story Source:
    Materials provided by University of Cambridge.

  • Towards compact quantum computers thanks to topology

    Researchers at PSI have compared the electron distribution below the oxide layer of two semiconductors. The investigation is part of an effort to develop particularly stable quantum bits and thus, in turn, particularly efficient quantum computers. They have now published their latest research, which is supported in part by Microsoft, in the scientific journal Advanced Quantum Technologies.
    By now, the future of computing is inconceivable without quantum computers. For the most part, these are still in the research phase. They hold the promise of speeding up certain calculations and simulations by orders of magnitude compared to classical computers.
    Quantum bits, or qubits for short, form the basis of quantum computers. So-called topological quantum bits are a novel type that might prove to be superior. To find out how these could be created, an international team of researchers has carried out measurements at the Swiss Light Source SLS at PSI.
    More stable quantum bits
    “Computer bits that follow the laws of quantum mechanics can be achieved in different ways,” explains Niels Schröter, one of the study’s authors. He was a researcher at PSI until April 2021, when he moved to the Max Planck Institute of Microstructure Physics in Halle, Germany. “Most types of qubits unfortunately lose their information quickly; you could say they are forgetful qubits.” There is a technical solution to this: Each qubit is backed up with a system of additional qubits that correct any errors that occur. But this means that the total number of qubits needed for an operational quantum computer quickly rises into the millions.
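    The “millions” follow from simple bookkeeping. If each error-corrected logical qubit consumes on the order of a thousand physical qubits, a commonly cited surface-code overhead rather than a figure from this article, then an algorithm needing a few thousand logical qubits already requires physical qubits in the millions:

        N_physical ≈ N_logical × overhead ≈ 4,000 × 1,000 = 4,000,000

    Qubits that are intrinsically protected against errors would shrink this overhead dramatically, which is the motivation described next.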
    “Microsoft’s approach, which we are now collaborating on, is quite different,” Schröter continues. “We want to help create a new kind of qubit that is immune to leakage of information. This would allow us to use just a few qubits to achieve a slim, functioning quantum computer.”
    The researchers hope to obtain such immunity with so-called topological quantum bits. These would be something completely new that no research group has yet been able to create.

  • A new approach to a $1 million mathematical enigma

    Numbers like π, e and φ often turn up in unexpected places in science and mathematics. Pascal’s triangle and the Fibonacci sequence also seem inexplicably widespread in nature. Then there’s the Riemann zeta function, a deceptively straightforward function that has perplexed mathematicians since the 19th century. The most famous quandary, the Riemann hypothesis, is perhaps the greatest unsolved question in mathematics, with the Clay Mathematics Institute offering a $1 million prize for a correct proof.
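    For orientation, the function in question is defined for complex s with real part greater than 1 by the series (and equivalent Euler product over the primes)

        \zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^{s}} = \prod_{p\ \text{prime}} \frac{1}{1 - p^{-s}}, \qquad \operatorname{Re}(s) > 1,

    and is extended to the rest of the complex plane by analytic continuation. The Riemann hypothesis asserts that every nontrivial zero of \zeta(s) lies on the critical line \operatorname{Re}(s) = 1/2. Both statements are standard and are included here only as background.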
    UC Santa Barbara physicist Grant Remmen believes he has a new approach for exploring the quirks of the zeta function. He has found an analogue that translates many of the function’s important properties into quantum field theory. This means that researchers can now leverage the tools from this field of physics to investigate the enigmatic and oddly ubiquitous zeta function. His work could even lead to a proof of the Riemann hypothesis. Remmen lays out his approach in the journal Physical Review Letters.
    “The Riemann zeta function is this famous and mysterious mathematical function that comes up in number theory all over the place,” said Remmen, a postdoctoral scholar at UCSB’s Kavli Institute for Theoretical Physics. “It’s been studied for over 150 years.”
    An outside perspective
    Remmen generally doesn’t work on cracking the biggest questions in mathematics. He’s usually preoccupied with chipping away at the biggest questions in physics. As the fundamental physics fellow at UC Santa Barbara, he normally devotes his attention to topics like particle physics, quantum gravity, string theory and black holes. “In modern high-energy theory, the physics of the largest scales and smallest scales both hold the deepest mysteries,” he remarked.
    One of his specialties is quantum field theory, which he describes as a “triumph of 20th century physics.” Most people have heard of quantum mechanics (subatomic particles, uncertainty, etc.) and special relativity (time dilation, E=mc², and so forth). “But with quantum field theory, physicists figured out how to combine special relativity and quantum mechanics into a description of how particles moving at or near the speed of light behave,” he explained.

  • New simulations can improve avalanche forecasting

    Computer simulations of snow cover can accurately forecast avalanche hazard, according to a new international study involving researchers from Simon Fraser University.
    Currently, avalanche forecasts in Canada are made by experienced professionals who rely on data from local weather stations and on-the-ground observations from ski and backcountry ski operators, avalanche control workers for transportation and industry, and volunteers who manually test the snowpack.
    But simulated snow cover models developed by a team of researchers are able to detect and track weak layers of snow and identify avalanche hazard in a completely different way — and can provide forecasters with another reliable tool when local data is insufficient or unavailable, according to a new study published in the journal Cold Regions Science and Technology.
    “As far as natural hazards go, avalanches are still one of the leading causes of fatalities in Canada,” says Simon Horton, a post-doctoral fellow with the SFU Centre for Natural Hazards Research and a forecaster with Avalanche Canada. “We’ve had these complex models that simulate the layers in the snowpack for a few decades now and they’re getting more and more accurate, but it’s been difficult to find out how to apply that to actual decision-making and improving safety.”
    Researchers took 16 years’ worth of daily meteorological, snow cover and avalanche data from two sites in Canada (Whistler and Rogers Pass, both in British Columbia) and Weissfluhjoch in Davos, Switzerland and ran computer simulations that could classify different avalanche situations.
    The simulations could determine avalanche risk, for either natural or artificial release, for problem types such as new snow, wind slab, persistent weak layers and wet snow conditions.
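    The study’s actual classification scheme is not spelled out here, but the general idea, mapping properties of simulated snowpack layers to one of these problem types, can be sketched as a simple rule-based function. The field names, thresholds, and rules below are illustrative assumptions only, not the rules used by the researchers.

        # Illustrative sketch only: assign a coarse avalanche problem type to a
        # simulated snow profile. Field names, thresholds, and rules are
        # assumptions for illustration, not the study's classification scheme.
        from dataclasses import dataclass
        from typing import List, Optional

        @dataclass
        class SnowLayer:
            age_days: float        # time since the layer was deposited
            wetness: float         # liquid water content, 0-1
            wind_deposited: bool   # flagged by a wind-transport module
            is_weak: bool          # flagged as a structurally weak layer

        def classify_problem(layers: List[SnowLayer]) -> Optional[str]:
            """Return a coarse avalanche problem type for a simulated profile,
            assuming layers are ordered from the surface downward."""
            top = layers[0]
            if top.wetness > 0.03:
                return "wet snow"
            if top.age_days < 1.0:
                return "wind slab" if top.wind_deposited else "new snow"
            if any(l.is_weak and l.age_days > 7.0 for l in layers[1:]):
                return "persistent weak layer"
            return None  # no dominant problem identified

        profile = [SnowLayer(0.5, 0.0, True, False), SnowLayer(12.0, 0.0, False, True)]
        print(classify_problem(profile))   # -> "wind slab"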
    “In the avalanche forecasting world, describing avalanche problems — the common scenarios that you might expect to find — is a good way for forecasters to describe avalanche hazard and communicate it to the public, so they know what kind of conditions to expect when they head out,” says Horton. “So that information is already available, except those are all done through expert assessment based on what they know from available field observations. In a lot of situations, there’s a fair bit of uncertainty about the human assessment of what these types of avalanche problems will be.
    “That’s where having more automated tools that can help predict potential hazards can help forecasters better prepare an accurate, precise forecast.”
    The results of the study showed the modelling was consistent with the real observed frequencies of avalanches over those 16 years and that the approach has potential to support avalanche forecasting in the future.
    Researchers also believe the modelling might be useful to study the future impacts of climate change on snow instability.
    Story Source:
    Materials provided by Simon Fraser University.

  • Scientists achieve key elements for fault-tolerant quantum computation in silicon spin qubits

    Researchers from RIKEN and QuTech — a collaboration between TU Delft and the Netherlands Organisation for Applied Scientific Research (TNO) — have achieved a key milestone toward the development of a fault-tolerant quantum computer. They were able to demonstrate a two-qubit gate fidelity of 99.5 percent — higher than the 99 percent considered to be the threshold for building fault-tolerant computers — using electron spin qubits in silicon, which are promising for large-scale quantum computers as the nanofabrication technology for building them already exists. This study was published in Nature.
    The world is currently in a race to develop large-scale quantum computers that could vastly outperform classical computers in certain areas. However, these efforts have been hindered by a number of factors, including in particular the problem of decoherence, or noise generated in the qubits. The problem becomes more serious as the number of qubits grows, hampering efforts to scale up. To achieve a large-scale computer that could be used for practical applications, it is believed that a two-qubit gate fidelity of at least 99 percent is required to implement the surface code for error correction. This threshold has been reached in certain types of computers, using qubits based on superconducting circuits, trapped ions, and nitrogen-vacancy centers in diamond, but these are hard to scale up to the millions of qubits required for practical quantum computation with error correction.
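    The reason this particular fidelity threshold matters is the logic of the surface code: once the physical error rate p per operation falls below the code’s threshold p_th (of order 1 percent), the logical error rate is suppressed roughly exponentially with the code distance d. The scaling below is a commonly quoted rule of thumb, not a result from this paper:

        p_logical ≈ A (p / p_th)^((d+1)/2)

    A physical error rate comfortably below threshold therefore lets engineers trade more physical qubits (a larger code distance d) for an arbitrarily small logical error rate, whereas above threshold adding qubits does not help.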
    To address these problems, the group decided to experiment with a quantum dot structure that was nanofabricated on a strained silicon/silicon germanium quantum well substrate, using a controlled-NOT (CNOT) gate. In previous experiments, the gate fidelity was limited due to slow gate speed. To improve the gate speed, they carefully designed the device and tuned it by applying different voltages to the gate electrodes. This combined an established fast single-spin rotation technique using micromagnets with large two-qubit coupling. The result was a gate speed that was 10 times better than previous attempts. Interestingly, although it had been thought that increasing gate speed would always lead to better fidelity, they found that there was a limit beyond which increasing the speed actually made the fidelity worse.
    In the course of the experiments, they discovered that a property called the Rabi frequency — a marker of how the qubits change states in response to an oscillating field — is key to the performance of the system, and they found a range of frequencies for which the single-qubit gate fidelity was 99.8 percent and the two-qubit gate fidelity was 99.5 percent, clearing the required threshold.
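    The Rabi frequency here is the standard quantity from driven two-level dynamics, included for context: under a resonant oscillating drive of Rabi frequency \Omega, the probability that a spin qubit has flipped after time t oscillates as

        P_flip(t) = sin^2(\Omega t / 2),

    so \Omega sets the gate speed (a full spin flip, a \pi rotation, takes t_\pi = \pi / \Omega). Driving faster shortens the gate but, as the team found, can introduce other errors.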
    Through this, they demonstrated that they could achieve universal operations, meaning that all of the basic operations that make up quantum computation, namely single-qubit operations and a two-qubit operation, could be performed at gate fidelities above the error-correction threshold.
    To test the capability of the new system, the researchers implemented a two-qubit Deutsch-Jozsa algorithm and the Grover search algorithm. Both algorithms output correct results with a high fidelity of 96%-97%, demonstrating that silicon quantum computers can perform quantum calculations with high accuracy.
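    As a reminder of what such a two-qubit demonstration involves, the sketch below simulates the textbook two-qubit Grover search with plain linear algebra, with the marked state |11> and the gate sequence taken from the standard construction; it is not the circuit run on the silicon device, only an illustration of the algorithm the researchers implemented.

        # Textbook two-qubit Grover search, simulated with plain linear algebra.
        # The marked state |11> and gate sequence follow the standard construction;
        # this is not the circuit run on the silicon device in the study.
        import numpy as np

        H1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # single-qubit Hadamard
        H2 = np.kron(H1, H1)                            # Hadamard on both qubits
        oracle = np.diag([1, 1, 1, -1])                 # phase-flips |11> (a CZ gate)
        flip00 = np.diag([-1, 1, 1, 1])                 # phase-flips |00>
        diffusion = H2 @ flip00 @ H2                    # diffusion operator (up to a global phase)

        state = np.zeros(4)
        state[0] = 1.0                                  # start in |00>
        state = H2 @ state                              # uniform superposition
        state = diffusion @ (oracle @ state)            # one Grover iteration

        probs = np.abs(state) ** 2
        print({f"{i:02b}": round(float(p), 3) for i, p in enumerate(probs)})
        # For two qubits a single iteration already finds the marked state |11>
        # with probability 1 (up to rounding).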
    Akito Noiri, the first author of the study, says, “We are very happy to have achieved a high-fidelity universal quantum gate set, one of the key challenges for silicon quantum computers.”
    Seigo Tarucha, leader of the research groups, said, “The presented result makes spin qubits, for the first time, competitive against superconducting circuits and ion traps in terms of universal quantum control performance. This study demonstrates that silicon quantum computers are promising candidates, along with superconductivity and ion traps, for research and development toward the realization of large-scale quantum computers.”
    Story Source:
    Materials provided by RIKEN.