More stories

  • Resurrecting niobium for quantum science

    For years, niobium was considered an underperformer when it came to superconducting qubits. Now scientists supported by Q-NEXT have found a way to engineer a high-performing niobium-based qubit and so take advantage of niobium’s superior qualities.
    When it comes to quantum technology, niobium is making a comeback.
    For the past 15 years, niobium has been sitting on the bench after experiencing a few mediocre at-bats as a core qubit material.
    Qubits are the fundamental components of quantum devices. One qubit type relies on superconductivity to process information.
    Touted for its superior qualities as a superconductor, niobium was always a promising candidate for quantum technologies. But scientists found niobium difficult to engineer as a core qubit component, and so it was relegated to the second string on Team Superconducting Qubit.
    Now, a group led by Stanford University’s David Schuster has demonstrated a way to create niobium-based qubits that rival the state-of-the-art for their class.
    “This was a promising first foray, having resurrected niobium junctions. … With niobium-based qubits’ broad operational reach, we open up a whole new set of capabilities for future quantum technologies.” — David Schuster, Stanford University
    “We’ve shown that niobium is relevant again, expanding the possibilities of what we can do with qubits,” said Alexander Anferov of the University of Chicago’s Physical Sciences Division, one of the lead scientists of the result.

    The team’s work is published in Physical Review Applied and was supported in part by Q-NEXT, a U.S. Department of Energy (DOE) National Quantum Information Science Research Center led by DOE’s Argonne National Laboratory.
    By harnessing niobium’s standout features, scientists will be able to expand the capabilities of quantum computers, networks and sensors. These quantum technologies draw on quantum physics to process information in ways that outclass their traditional counterparts and are expected to improve areas as varied as medicine, finance and communication.
    The niobium advantage
    When it comes to superconducting qubits, aluminum has ruled the roost. Aluminum-based superconducting qubits can store information for a relatively long time before the data inevitably disintegrates. These longer coherence times mean more time for processing information.
    The longest coherence times for an aluminum-based superconducting qubit are a few hundred millionths of a second. By contrast, in recent years, the best niobium-based qubits yielded coherence times that are 100 times shorter — a few hundred billionths of a second.
    Despite that short qubit lifetime, niobium held attractions. A niobium-based qubit can operate at higher temperatures than its aluminum counterpart and so would require less cooling. It can also operate across an eight-times-greater frequency range and a massive 18,000-times-wider magnetic field range compared to aluminum-based qubits, expanding the menu of uses for the superconducting-qubit family.

    In one respect, there was no contest between the two materials: Niobium’s operating range trounced aluminum’s. But for years, the short coherence time made the niobium-based qubit a nonstarter.
    “No one really made that many qubits out of niobium junctions because they were limited by their coherence,” Anferov said. “But our group wanted to make a qubit that could work at higher temperatures and a greater frequency range — at 1 K and 100 gigahertz. And for both of those properties, aluminum is not sufficient. We needed something else.”
    So, the team had another look at niobium.
    Losing the lossiness
    Specifically, they had a look at the niobium Josephson junction. The Josephson junction is the information-processing heart of the superconducting qubit.
    In classical information processing, data comes in bits that are either 0s or 1s. In quantum information processing, a qubit is a superposition of 0 and 1. The superconducting qubit’s information “lives” as that blend of 0 and 1 inside the junction. The longer the junction can sustain the information in that state, the better the junction and the better the qubit.
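    In standard textbook notation (background physics, not a result of this study), such a qubit state is written

    $$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

    and the coherence time is roughly how long the weights $\alpha$ and $\beta$ survive before noise scrambles them.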
    The Josephson junction is structured like a sandwich, consisting of a layer of nonconducting material squeezed between two layers of superconducting metal. A conductor is a material that provides easy passage for electrical current. A superconductor kicks it up a notch: It carries electrical current with zero resistance. Electromagnetic energy flows between the junction’s outer layers in the mixed quantum state.
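    For background, the junction is valuable precisely because it is a nonlinear yet dissipation-free circuit element. Its behaviour is captured by the standard Josephson relations (textbook physics, not specific to this paper):

    $$I = I_c \sin\varphi, \qquad V = \frac{\hbar}{2e}\,\frac{d\varphi}{dt},$$

    where $\varphi$ is the superconducting phase difference across the junction and $I_c$ is its critical current. That nonlinearity is what lets the circuit’s two lowest energy levels be addressed as a qubit.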
    The typical, trusty aluminum Josephson junction is made of two layers of aluminum and a middle layer of aluminum oxide. A typical niobium junction is made of two layers of niobium and a middle layer of niobium oxide.
    Schuster’s group found that the junction’s niobium oxide layer sapped the energy required to sustain quantum states. They also identified the niobium junctions’ supporting architecture as a big source of energy loss, causing the qubit’s quantum state to fizzle out.
    The team’s breakthrough involved both a new junction arrangement and a new fabrication technique.
    The new arrangement called on a familiar friend: aluminum. The design did away with the energy-sucking niobium oxide. And instead of two distinct materials, it used three. The result was a low-loss, trilayer junction — niobium, aluminum, aluminum oxide, aluminum, niobium.
    “We did this best-of-both-worlds approach,” Anferov said. “The thin layer of aluminum can inherit the superconducting properties of the niobium nearby. This way, we can use the proven chemical properties of aluminum and still have the superconducting properties of niobium.”
    The group’s fabrication technique involved removing scaffolding that supported the niobium junction in previous schemes. They found a way to maintain the junction’s structure while getting rid of the loss-inducing, extraneous material that hampered coherence in previous designs.
    “It turns out just getting rid of the garbage helped,” Anferov said.
    A new qubit is born
    After incorporating their new junction into superconducting qubits, the Schuster group achieved a coherence time of 62 millionths of a second, 150 times longer than its best-performing niobium predecessors. The qubits also exhibited a quality factor (an index of how well a qubit stores energy) of 2.57 × 10⁵, a 100-fold improvement over previous niobium-based qubits and competitive with aluminum-based qubit quality factors.
    “We’ve made this junction that still has the nice properties of niobium, and we’ve improved the loss properties of the junction,” Anferov said. “We can directly outperform any aluminum qubit because aluminum is an inferior material in many ways. I now have a qubit that doesn’t die at higher temperatures, which is the big kicker.”
    The results will likely elevate niobium’s place in the lineup of superconducting qubit materials.

  • What math tells us about social dilemmas

    Human coexistence depends on cooperation. Individuals have different motivations and reasons to collaborate, resulting in social dilemmas, such as the well-known prisoner’s dilemma. Scientists from the Chatterjee group at the Institute of Science and Technology Austria (ISTA) now present a new mathematical principle that helps to understand the cooperation of individuals with different characteristics. The results, published in PNAS, can be applied to economics or behavioral studies.
    A group of neighbors shares a driveway. Following a heavy snowstorm, the entire driveway is covered in snow, requiring clearance for daily activities. The neighbors have to collaborate. If they all put on their down jackets, grab their snow shovels, and start digging, the road will be free in a very short amount of time. If only one or a few of them take the initiative, the task becomes more time-consuming and labor-intensive. Assuming nobody does it, the driveway will stay covered in snow. How can the neighbors overcome this dilemma and cooperate in their shared interests?
    Scientists in the Chatterjee group at the Institute of Science and Technology Austria (ISTA) deal with cooperative questions like that on a regular basis. They use game theory to lay the mathematical foundation for decision-making in such social dilemmas. The group’s latest publication delves into the interactions between different types of individuals in a public goods game. Their new model, published in PNAS, explores how resources should be allocated for the best overall well-being and how cooperation can be maintained.
    The game of public goods
    For decades, the public goods game has been a proven method to model social dilemmas. In this setting, participants decide how much of their own resources they wish to contribute for the benefit of the entire group. Most existing studies considered homogeneous individuals, assuming that they do not differ in their motivations and other characteristics. “In the real world, that’s not always the case,” says Krishnendu Chatterjee. To account for this, Valentin Hübner, a PhD student, Christian Hilbe, and Maria Kleshina, both former members of the Chatterjee group, started modeling settings with diverse individuals. An analysis of social dilemmas among unequals, published in 2019, laid the foundation for their work, which now presents a more general model that even allows multi-player interaction.
    “The public good in our game can be anything, such as environmental protection or combating climate change, to which everybody can contribute,” Hübner explains. The players have different levels of skills. In public goods games, skills typically refer to productivity. “It’s the ability to contribute to a particular task,” Hübner continues. Resources, technically called endowment or wealth, on the other hand, refer to the actual things that participants contribute to the common good.
    In the snowy driveway scenario, the neighbors vary significantly in their available resources and in their abilities to use them. Solving the problem requires them to cooperate. But what role does their inequality play in such a dilemma?

    The two sides of inequality
    Hübner’s new model provides answers to this question. Intuitively, it proposes that for diverse individuals to sustain cooperation, a more equal distribution of resources is necessary. Surprisingly, though, more equality does not lead to maximum general welfare. To reach that, resources should be allocated to more skilled individuals, resulting in a slightly uneven distribution. “Efficiency benefits from unequal endowment, while robustness always benefits from equal endowment,” says Hübner. Put simply, to keep cooperation stable, resources should be distributed almost evenly. Yet if efficiency is the goal, resources should be in the hands of those more willing to participate, but only to a certain extent.
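    As a rough illustration of the efficiency half of that trade-off, here is a minimal Python sketch of a public goods game with heterogeneous skills. The productivities, budget, and synergy factor are invented for illustration; this is not the paper’s actual model:

    ```python
    import numpy as np

    skills = np.array([0.6, 0.9, 1.2, 1.5])   # hypothetical productivities of 4 players
    budget = 4.0                               # total endowment available to distribute
    r = 1.6                                    # synergy factor of the public good

    def group_benefit(endowments):
        # full cooperation: everyone contributes their whole endowment, weighted
        # by skill; the pool is multiplied by r and shared by the group
        return r * np.sum(skills * endowments)

    equal = np.full(4, budget / 4)             # egalitarian split
    skewed = budget * skills / skills.sum()    # endowment proportional to skill

    print(f"group benefit, equal split : {group_benefit(equal):.2f}")
    print(f"group benefit, skill-biased: {group_benefit(skewed):.2f}")
    ```

    The skill-biased split produces more total benefit, matching the efficiency finding; the robustness half of the result (whether full cooperation remains stable once players can defect) is what pushes back toward equal endowments.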
    What is more important — cooperation efficiency or stability? The scientists’ further simulations of learning processes suggest that individuals balance the trade-off between these two things. Whether this is also the case in the real world remains to be seen. Numerous interpersonal nuances also contribute to these dynamics, including aspects like reciprocity, morality, and ethical issues, among others.
    Hübner’s model solely focuses on cooperation from a mathematical standpoint. Yet, due to its generality, it can be applied to any social dilemma with diverse individuals, like climate change, for instance. Testing the model in the real world and applying it to society are very interesting experimental directions. “I’m quite sure that there will be behavioral experiments benefiting from our work in the future,” says Chatterjee. The study could potentially also be interesting for economics, where the new model’s principles can help to better inform economic systems and policy recommendations.

  • Reimagining electron microscopy: Bringing high-end resolution to lower-cost microscopes

    Researchers at the University of Illinois Urbana-Champaign have shown for the first time that expensive aberration-corrected microscopes are no longer required to achieve record-breaking microscopic resolution.
    The field of microscopy is in the middle of a great revolution. Since the 1800s and the invention of the compound light microscope, there have only been a few major jumps in resolution to see different length scales: from bacteria and cells, to viruses and proteins, and even down to single atoms. Generally, as resolution has made these incredible jumps, so has the price of the microscopes used to achieve that resolution. Such hefty price tags severely limit the accessibility of these instruments. The current jump in resolution comes from a new technique called electron ptychography, a method that uses computation to boost the resolution of electron microscopes, which has taken the field by storm in the last five to six years.
    Researchers at the University of Illinois Urbana-Champaign have demonstrated record-breaking resolution using electron ptychography on “conventional” transmission electron microscopes (conventional meaning without expensive aberration correctors). This breaks the trend of increasing microscope price with increasing resolution. They were able to achieve deep sub-angstrom spatial resolution down to 0.44 angstrom (one angstrom is one ten-billionth of a meter), which exceeds the resolution of aberration-corrected tools and rivals their highest ptychographic resolutions.
    “For the last 90-100 years, our field has thought that the way to do great microscopy is to make better and better microscopes,” says materials science & engineering professor Pinshane Huang, who led this work. “The most exciting thing about our research is that we’re showing that you don’t need a cutting-edge microscope to make this work. We can take a ‘conventional’ microscope and do the same thing, using ptychography, and it’s just as good! This is amazing because there can be a multi-million-dollar difference in cost between the two setups.”
    This research, co-first authored by former MatSE UIUC postdoctoral researcher Kayla Nguyen, former MatSE UIUC graduate student Chia-Hao Lee and Argonne National Laboratory staff scientist Yi Jiang, was recently published in the journal Science.
    Before ptychography, the highest resolution electron microscopes used a technology called aberration-correction to allow scientists to see individual atoms. Rather than using a beam of light to probe a sample, electron microscopes use a beam of electrons, focused by electromagnets. Electrons have wavelengths thousands of times smaller than visible light, which allows electron microscopes to resolve objects that are many times smaller than can be seen with optical microscopes. Scientists use these microscopes to decode the structures of objects ranging from the spike protein on the COVID-19 virus to the arrangements of atoms in graphene and, more generally, to peer inside matter to understand its atomic structure, composition and bonding.
    However, one of the challenges of using beams of electrons is focusing that beam. “It’s impossible to make a perfect lens for electrons,” Huang says. “What people have been doing to compensate is making ‘bad’ lenses, and then putting aberration correctors after them, which are a series of ‘bad’ lenses that are ‘bad’ in opposite ways. Summed together, they make ‘okay’ lenses, and that’s been the gold standard for how we image at the atomic scale for at least 20 years.”
    In optics, an aberration is any way that a lens deviates from a perfect lens. For example, human eyes can have several types of aberrations such as short- and near-sightedness (eyes can’t focus at all distances) and astigmatism (curvature of the eyeball that causes blurred vision). Lee explains, “For electromagnetic lenses, the way to focus these electrons is through an electromagnetic field. But we don’t have a great way of controlling the shape and the strength of the electromagnetic field, which puts a very strong limitation on how precise we can be focusing these electrons.” In aberration-corrected microscopy, the current cutting-edge technology, an extra stack of lenses corrects the aberrations of the regular lenses by changing the shape of the beam before it hits the sample. Those extra aberration-correcting lenses are where significant cost is added to the microscope.

    While it is impossible to make a perfect lens, the goal of the last 100 years has been to continuously make better lenses to minimize aberrations. But Huang says, “What’s exciting about ptychography is that you don’t have to make better and better lenses. What we can do instead is use computers.”
    Rather than using a stack of lens optics to remove aberrations, ptychography removes them computationally. With a new generation of detectors, called hybrid pixel detectors, that cost a few hundred thousand dollars (compared to aberration-corrected microscopes that cost up to $7 million) and computer algorithms, this method can double, triple or even quadruple the resolution of what a microscope can achieve with its physical lenses. Huang and her team have shown that their approach quadruples the resolution of conventional transmission electron microscopes. Further, nearly any scanning transmission electron microscope can now be adapted to achieve state-of-the-art resolution at a fraction of the cost.
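    To make the idea concrete, below is a minimal Python sketch of an ePIE-style ptychographic loop (the widely used extended ptychographic iterative engine) running on simulated data. It illustrates the principle only; it is not the authors’ reconstruction code, and every size and parameter here is invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, M = 64, 24                      # object size, probe (patch) size
    yy, xx = np.mgrid[:M, :M]
    probe = (((yy - M / 2) ** 2 + (xx - M / 2) ** 2) < (M / 3) ** 2).astype(complex)

    # Invented "specimen": a pure-phase object (unit amplitude, random phase)
    obj_true = np.exp(1j * 0.5 * rng.standard_normal((N, N)))

    # Overlapping scan grid: a step much smaller than the probe is essential
    step = 6
    positions = [(y, x) for y in range(0, N - M, step) for x in range(0, N - M, step)]

    # "Measure" only far-field diffraction intensities, as a detector would
    data = [np.abs(np.fft.fft2(probe * obj_true[y:y + M, x:x + M])) ** 2
            for (y, x) in positions]

    # ePIE-style reconstruction from intensities alone, starting from a flat guess
    obj = np.ones((N, N), complex)
    for sweep in range(50):
        for (y, x), I in zip(positions, data):
            patch = obj[y:y + M, x:x + M]
            psi = probe * patch
            Psi = np.fft.fft2(psi)
            # keep the computed phase, enforce the measured modulus
            Psi = np.sqrt(I) * np.exp(1j * np.angle(Psi))
            dpsi = np.fft.ifft2(Psi) - psi
            obj[y:y + M, x:x + M] = patch + np.conj(probe) * dpsi / (np.abs(probe) ** 2).max()

    # In the well-scanned interior the recovered amplitude should approach 1
    err = np.abs(np.abs(obj[M:N - M, M:N - M]) - 1).mean()
    print(f"mean amplitude error after 50 sweeps: {err:.3f}")
    ```

    The essential point sits in the modulus-replacement line: the measured diffraction intensities constrain the computed far field, and repeated overlapping updates recover the specimen without any corrective lens.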
    While this approach is game-changing, Huang notes that ptychography is still a challenging technique that requires a lot of computation power. It can take hours to get a single reconstruction to reach the best resolution. But, as with many other technologies, computation advances quite rapidly and gets cheaper, faster and easier to use.
    “We brought a cutting-edge technique, electron ptychography, to conventional transmission electron microscopes to show for the first time that a ‘mediocre’ microscope can do just as well as the most expensive microscopes on the market,” Huang says. “This is significant for the hundreds of institutions across the country and across the world who previously couldn’t afford the cutting edge. Now, all they need is a detector, some computers and electron ptychography. And once you do that, you can see the atomic world with much more detail than anyone imagined even 10 years ago. This represents a huge paradigm shift.”

  • Robots, monitoring and healthy ecosystems could halve pesticide use without hurting productivity

    Smarter crop farming that combats weeds, insect pests and plant diseases by integrating modern technologies like AI-based monitoring, robotics and next-generation biotechnology with healthy, resilient agricultural ecosystems: One Crop Health, a new research collaboration based at the University of Copenhagen, aims to reduce the use of pesticides by developing a sustainable agriculture for the future.
    In keeping with the age-old saying that prevention is better than cure, the researchers leading the recently launched One Crop Health project envision more sustainable approaches for the farmers of the future that do not compromise productivity.
    Backed by DKK 60 million (€8.05M) from Novo Nordisk, researchers from the University of Copenhagen will gather knowledge over the next six years to develop smarter agriculture that is both sustainable and able to produce enough food for the world’s growing population, in collaboration with researchers from Aarhus University and Rothamsted Research in the UK.
    For many years, growers have relied on pesticides to control disease, pests and weeds, which lead to worldwide losses of one-third of crop yield. At the same time, estimates show that we will need 60% more food than today by 2050.
    However, pesticides also threaten health, nature and biodiversity, and there is increasing pressure from society and politicians to limit their use. Recently the desire for healthier and more sustainable crops resulted in EU policy plans for a 50% reduction in pesticides by 2030. These plans have now been set on pause after being challenged by farmers concerned that pesticide reduction could make farming unviable by hurting productivity.
    So, can growers halve their use of pesticides without the profession becoming unproductive and the world running out of food? This is the central question that One Crop Health seeks to answer.
    Making pesticide reduction profitable for farmers
    “The political controversy, farmer and industry concern, and the uncertainty over regulation clearly show that more research is needed to enable this transition. Most farmers actually want to use fewer pesticides, but there is a need for research that demonstrates how this is possible whilst maintaining healthy, high-yielding and profitable crops. The One Crop Health project aims to bridge the gap by developing research that places the drive to reduce pesticide use on a scientific foundation, which will ultimately help farmers make the rational decision to reduce pesticide use,” says Professor Paul Neve from the Department of Plant and Environmental Sciences.

    According to the project’s lead researcher, though, a viable transition is possible using a more holistic approach to farming.
    “Generally the focus needs to move away from solving individual problems by way of a few blunt tools like pesticides. Oftentimes, this approach creates new problems elsewhere, which then also need to be addressed. Planting fields densely to avoid weeds is another example of this lack of oversight, where the misguided solution ends up creating the optimal conditions for fungi and diseases instead. We need to get better at understanding entire ecosystems and then make use of all of the modern tools available,” says Professor Paul Neve.
    “If we can create healthy ecosystems that will reduce the numbers of pests, weed and diseases, it will simply reduce the need for spraying. We can largely replace the remaining need with other tools, for example, AI-based monitoring and modelling can help to inform where and when pests need to be controlled and new solutions such as bio-pesticides can be used to achieve that,” he says.
    Natural defenses have been replaced with chemicals
    The resilience that comes with healthy ecosystems can reduce the challenges that are currently addressed with pesticides.
    “Today’s crops are the result of thousands of years of efforts to make crop production more efficient. In the process, many crops have lost their natural defences against pests, weeds and diseases,” explains Paul Neve.

    According to the researcher, tomorrow’s growers can get better at protecting their plants by looking at how things are done in nature. Here, helpful microorganisms like bacteria and fungi protect against diseases, and healthy crops are more competitive against weeds. At the same time, pests that threaten crops will be better controlled by their natural enemies.
    “If you think of the field as an entire ecosystem that needs to thrive (hence the name One Crop Health), we believe you get a preventive overall effect. At the same time, modern knowledge and technology can, for example, change the basic need for pesticides. Whereas entire fields are sprayed today, drone surveillance will allow us to target only where weeds are a threat to the crop, or not at all, and let robots do the work instead,” says the professor.
    100 farmers to help researchers
    “Part of the project is about working with farmers, using their fields to discover smart solutions, so future agriculture can be based on the best possible knowledge,” says Paul Neve.
    In collaboration with 100 farms, distributed equally across Denmark and England, the researchers will begin by collecting data from scratch.
    “We begin by asking farmers about their current challenges, how they will try to solve them and about what works and what doesn’t. To this knowledge bank, we will add our professional knowledge of ecosystems, modern technologies and methods, and then the goal when the six years have passed is to be able to give a lot of knowledge and concrete methods back to farmers,” says Paul Neve.
    Drone data and models will predict field needs
    The final prong in the effort to achieve a holistic understanding comes from broad knowledge and the modern technologies that provide access to it.
    In an interdisciplinary collaboration with the Department of Computer Science at University of Copenhagen, a group of computer scientists will translate information from surveillance with modern technologies such as drones, and on that basis design so-called digital twins of farmers’ fields.
    “They will provide a kind of model that can predict how fields will behave, the needs that are just around the corner and how different solutions will affect the fields,” says Paul Neve.
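    The article does not describe how these digital twins are built, but as a loose illustration of the idea, a field model can carry per-cell state forward in time and flag only the patches that need intervention. The Python sketch below is entirely hypothetical: the logistic growth model, the rate, and the treatment threshold are invented for illustration, not taken from the project:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical drone survey: weed-cover fraction per 20 x 20 grid cell,
    # skewed so that most of the field is nearly clean
    cover = rng.random((20, 20)) ** 3 * 0.3

    def forecast(c, days, r=0.08):
        """Project weed cover forward with logistic growth (illustrative rate r)."""
        c = np.maximum(c, 1e-6)
        return 1.0 / (1.0 + (1.0 - c) / c * np.exp(-r * days))

    predicted = forecast(cover, days=14)
    treat = predicted > 0.25      # spot-treat only cells forecast to exceed 25% cover

    print(f"cells flagged for treatment: {treat.sum()} of {treat.size} "
          f"({100 * treat.mean():.0f}% of the field)")
    ```

    Instead of spraying the whole field, a robot or drone would then visit only the flagged cells, which is the kind of targeted intervention the project describes.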
    In the second half of the six-year project, 11 Ph.D. students will develop knowledge about specific solutions within their respective focus areas.
    “When the six years have passed, it is important for us that we have integrated data and new tools in a holistic way, providing farmers with concrete methods that they can use to solve their challenges in everyday life more sustainably without compromising productivity,” Paul Neve concludes.

  • A novel method for easy and quick fabrication of biomimetic robots with life-like movement

    Ultraviolet-laser processing is a promising technique for developing intricate microstructures, enabling complex alignment of muscle cells, required for building life-like biohybrid actuators, as shown by Tokyo Tech researchers. Compared to traditional complex methods, this innovative technique enables easy and quick fabrication of microstructures with intricate patterns for achieving different muscle cell arrangements, paving the way for biohybrid actuators capable of complex, flexible movements.
    Biomimetic robots, which mimic the movements and biological functions of living organisms, are a fascinating area of research that can not only lead to more efficient robots but also serve as a platform for understanding muscle biology. Among these, biohybrid actuators, made up of soft materials and muscular cells that can replicate the forces of actual muscles, have the potential to achieve life-like movements and functions, including self-healing, high efficiency, and high power-to-weight ratio, which have been difficult for traditional bulky robots that require heavy energy sources. One way to achieve these life-like movements is to arrange muscle cells in biohybrid actuators in an anisotropic manner. This involves aligning them in a specific pattern where they are oriented in different directions, like what is found in living organisms. While previous studies have reported biohybrid actuators with significant movement using this technique, they have mostly focused on anisotropically aligning muscle cells in a straight line, resulting in only simple motions, as opposed to the complex movement of native muscle tissues such as twisting, bending, and shrinking. Real muscle tissues have a complex arrangement of muscle cells, including curved and helical patterns.
    Creating such complex arrangements requires the formation of curved microgrooves (MGs) on a substrate, which then serve as the guide for aligning muscle cells in the required patterns. Fabrication of complex MGs has been achieved by methods such as photolithography, wavy micrography and micro-contact printing. However, these methods involve multiple intricate steps and are not suitable for rapid fabrication.
    To address this, a team of researchers from Tokyo Institute of Technology (Tokyo Tech) in Japan, led by Associate Professor Toshinori Fujie from the School of Life Science and Technology, has developed an ultraviolet (UV) laser-processing technique for fabricating complex microstructures. “Based on our previous prototypes, we hypothesized that biohybrid actuators using an SBS (hard rubber) thin film with arbitrary anisotropic MGs fabricated by a UV laser processing can control cellular alignment in an arbitrarily anisotropic direction to reproduce more life-like flexible movements,” explains Dr. Fujie. Their study has been published in the journal Biofabrication.
    The novel technique involves forming curved MGs on a polyimide film through UV-laser processing; these are then transcribed onto a thin film made of SBS. Next, skeletal muscle cells called myotubes, found in living organisms, are aligned using the MGs to achieve an anisotropic curved muscle pattern. The researchers used this method to develop two different biohybrid actuators: one tethered to the glass substrate and the other untethered. Upon electrical stimulation, both actuators deformed through a twisting-like motion. Interestingly, when untethered, the biohybrid actuator transformed into a 3D free-standing structure, due to the curved alignment of myotubes resembling a native sphincter.
    “These results signify that compared to traditional methods, UV-laser processing is a quicker and easier method for the fabrication of tunable MG patterns. This method raises intriguing opportunities for achieving more life-like biohybrid actuators through guided alignment of myotubes,” remarks Dr. Fujie, emphasizing the potential of this innovative technique.
    Overall, this study demonstrates the potential of UV-laser processing for the fabrication of different anisotropic muscle tissue patterns, paving the way for more life-like biohybrid actuators capable of complex, flexible movements.

  • Scientists closer to solving mysteries of universe after measuring gravity in quantum world

    Scientists are a step closer to unravelling the mysterious forces of the universe after working out how to measure gravity on a microscopic level.
    Experts have never fully understood how the force first described by Isaac Newton works in the tiny quantum world.
    Even Einstein was baffled by quantum gravity and, in his theory of general relativity, said there is no realistic experiment which could show a quantum version of gravity.
    But now physicists at the University of Southampton, working with scientists in Europe, have successfully detected a weak gravitational pull on a tiny particle using a new technique.
    They claim it could pave the way to finding the elusive quantum gravity theory.
    The experiment, published in the journal Science Advances, used levitating magnets to detect gravity on microscopic particles — small enough to border on the quantum realm.
    Lead author Tim Fuchs, from the University of Southampton, said the results could help experts find the missing puzzle piece in our picture of reality.

    He added: “For a century, scientists have tried and failed to understand how gravity and quantum mechanics work together.
    “Now we have successfully measured gravitational signals at the smallest mass ever recorded, it means we are one step closer to finally realising how it works in tandem.
    “From here we will start scaling the source down using this technique until we reach the quantum world on both sides.
    “By understanding quantum gravity, we could solve some of the mysteries of our universe — like how it began, what happens inside black holes, or uniting all forces into one big theory.”
    The rules of the quantum realm are still not fully understood by science — but it is believed that particles and forces at a microscopic scale interact differently than regular-sized objects.
    Academics from Southampton conducted the experiment with scientists at Leiden University in the Netherlands and the Institute for Photonics and Nanotechnologies in Italy, with funding from the EU Horizon Europe EIC Pathfinder grant (QuCoM).

    Their study used a sophisticated setup involving superconducting devices, known as traps, with magnetic fields, sensitive detectors and advanced vibration isolation.
    It measured a weak pull of just 30 attonewtons (aN) on a tiny particle with a mass of 0.43 milligrams, by levitating it at a temperature a hundredth of a degree above absolute zero (about minus 273 degrees Celsius).
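    For a sense of scale, Newton's law, F = Gm₁m₂/r², shows how feeble gravity is at these masses. The short Python sketch below uses illustrative numbers only; the experiment's actual source mass and geometry differ:

    ```python
    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    m1 = 0.43e-6           # the levitated particle: 0.43 mg in kilograms
    m2 = 1.0               # a hypothetical 1 kg source mass nearby

    for r in (0.1, 0.3, 0.5):                # assumed separations in metres
        F = G * m1 * m2 / r**2
        print(f"r = {r} m: F = {F:.2e} N  (~{F * 1e18:,.0f} aN)")
    ```

    Even a full kilogram parked ten centimetres away tugs on the particle with only a few thousand attonewtons, which is why resolving a 30 aN signal requires superconducting traps, extreme cold and heavy vibration isolation.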
    The results open the door for future experiments between even smaller objects and forces, said Hendrik Ulbricht, Professor of Physics, also at the University of Southampton.
    He added: “We are pushing the boundaries of science that could lead to new discoveries about gravity and the quantum world.
    “Our new technique that uses extremely cold temperatures and devices to isolate vibration of the particle will likely prove the way forward for measuring quantum gravity.
    “Unravelling these mysteries will help us unlock more secrets about the universe’s very fabric, from the tiniest particles to the grandest cosmic structures.”

  • Measuring the properties of light: Scientists realize new method for determining quantum states

    Scientists at Paderborn University have used a new method to determine the characteristics of optical, i.e. light-based, quantum states. For the first time, they are using certain photon detectors — devices that can detect individual light particles — for so-called homodyne detection. The ability to characterise optical quantum states makes the method an essential tool for quantum information processing. Precise knowledge of the characteristics is important for use in quantum computers, for example. The results have now been published in the specialist journal Optica Quantum.
    “Homodyne detection is a method frequently used in quantum optics to investigate the wave-like nature of optical quantum states,” explains Timon Schapeler from the Paderborn “Mesoscopic Quantum Optics” working group at the Department of Physics. Together with Dr Maximilian Protte, he has used the method to investigate the so-called continuous variables of optical quantum states. This involves the variable properties of light waves. These can be, for example, the amplitude or phase, i.e. the oscillation behaviour of waves, which are important for the targeted manipulation of light, among other things.
    For the first time, the physicists have used superconducting nanowire single photon detectors for the measurements — currently the fastest devices for photon counting. With their special experimental setup, the two scientists have shown that a homodyne detector with superconducting single photon detectors has a linear response to the input photon flux. Translated, this means that the measured signal is proportional to the input signal.
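    As a toy illustration of what that linearity means, here is a minimal Python sketch of balanced homodyne detection with photon-counting detectors. The signal amplitude, local-oscillator strength and Poisson click model are all assumptions for illustration, not the paper's setup:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    alpha = 0.8                          # assumed coherent signal amplitude
    beta = 30.0                          # much stronger local oscillator (LO)

    for th in np.linspace(0, 2 * np.pi, 8):
        # 50:50 beamsplitter mixes the signal with the phase-shifted LO
        bright = abs(alpha + beta * np.exp(1j * th)) ** 2 / 2
        dark = abs(alpha - beta * np.exp(1j * th)) ** 2 / 2
        # photon-counting detectors: Poisson-distributed clicks at each output
        n1 = rng.poisson(bright, 5000)
        n2 = rng.poisson(dark, 5000)
        quad = (n1 - n2).mean() / (2 * beta)   # difference signal, rescaled
        print(f"LO phase {th:4.2f} rad: measured quadrature {quad:+.2f}")
    ```

    The difference count traces out α·cos θ, the field quadrature of the signal, and it scales in proportion to the input amplitude — the kind of linear response the Paderborn team verified for their superconducting-nanowire detectors.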
    “In principle, the integration of superconducting single-photon detectors brings many advantages in the area of continuous variables, not least the intrinsic phase stability. These systems also have almost 100 per cent on-chip detection efficiency. This means that no particles are lost during detection. Our results could enable the development of highly efficient homodyne detectors with single-photon sensitive detectors,” says Schapeler.
    Working with continuous variables of light opens up new and exciting possibilities in quantum information processing beyond qubits, the usual computing units of quantum computers.

  • Mixed-dimensional transistors enable high-performance multifunctional electronic devices

    Downscaling of electronic devices, such as transistors, has reached a plateau, posing challenges for semiconductor fabrication. However, a research team led by materials scientists from City University of Hong Kong (CityUHK) recently discovered a new strategy for developing highly versatile electronics with outstanding performance, using transistors made of mixed-dimensional nanowires and nanoflakes. This innovation paves the way for simplified chip circuit design, offering versatility and low power dissipation in future electronics.
    In recent decades, as the continuous scaling of transistors and integrated circuits has started to reach physical and economic limits, fabricating semiconductor devices in a controllable and cost-effective manner has become challenging. Further scaling of transistor size increases current leakage and thus power dissipation. Complex wiring networks also have an adverse impact on power consumption.
    Multivalued logic (MVL) has emerged as a promising technology for overcoming increasing power consumption. It transcends the limitations of conventional binary logic systems by greatly reducing the number of transistor components and their interconnections, enabling higher information density and lower power dissipation. Significant efforts have been devoted to constructing various multivalued logic devices, including anti-ambipolar transistors (AAT).
    Anti-ambipolar devices are a class of transistors in which positive (hole) and negative (electron) charge carriers transport concurrently within the semiconducting channel. However, existing AAT-based devices predominantly use 2D or organic materials, which are unstable for large-scale semiconductor device integration. Also, their frequency characteristics and energy efficiency have rarely been explored.
    To address these limitations, a research team led by Professor Johnny Ho, Associate Vice-President (Enterprise) and Associate Head in the Department of Materials Science and Engineering at CityUHK, embarked on research to develop anti-ambipolar device-based circuits with higher information density and fewer interconnections, and explore their frequency characteristics.
    The team created an advanced chemical vapour-deposition technique to create a novel, mixed-dimensional hetero-transistor, which combines the unique properties of high-quality GaAsSb nanowires and MoS2 nanoflakes.
    The new anti-ambipolar transistors had exceptional performance. Owing to the strong interfacial coupling and band-structure alignment properties of the mixed-dimensional GaAsSb/MoS2 junction, the hetero-transistor has prominent anti-ambipolar transfer characteristics with the flipping of transconductance.

    The flipping of transconductance doubles the frequency of the response to the input analog signal, greatly reducing the number of devices required compared to conventional frequency multipliers in CMOS technology.
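    The doubling follows directly from the bell-shaped transfer curve: biased at its peak, both half-cycles of an input sine wave push the current the same way, so the output oscillates at twice the input frequency. The minimal Python sketch below makes this visible, with an invented Gaussian transfer curve standing in for the measured GaAsSb/MoS2 characteristic:

    ```python
    import numpy as np

    t = np.linspace(0, 1e-3, 10000)            # 1 ms window
    f_in = 1e3                                  # 1 kHz input tone
    vin = 0.2 * np.sin(2 * np.pi * f_in * t)    # small gate swing around the peak

    def drain_current(vg, I0=1.0, sigma=0.3):
        # invented bell-shaped (anti-ambipolar) transfer curve: current peaks at
        # vg = 0 and falls off on both sides, so transconductance changes sign
        return I0 * np.exp(-(vg / sigma) ** 2)

    iout = drain_current(vin)
    spectrum = np.abs(np.fft.rfft(iout - iout.mean()))
    freqs = np.fft.rfftfreq(t.size, t[1] - t[0])
    print(f"dominant output tone: {freqs[spectrum.argmax()]:.0f} Hz")   # ~2000 Hz
    ```

    A single device biased this way replaces the multi-transistor frequency-doubling stages used in conventional CMOS circuits.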
    “Our mixed-dimensional, anti-ambipolar transistors can implement multi-valued logic circuits and frequency multipliers simultaneously, making this the first of its kind in the field of anti-ambipolar transistor applications,” said Professor Ho.
    The multi-valued logic characteristics simplify the complicated wiring networks and reduce chip power dissipation. The shrinking of device dimensionality, together with the downscaled junction region, render the device fast and energy efficient, resulting in high-performance digital and analog circuits.
    “Our findings show that mixed-dimensional anti-ambipolar devices enable chip circuit design with high information storage density and information processing capacity,” said Professor Ho. “So far, most researchers in the semiconductor industry have focused on device miniaturization to keep Moore’s law rolling. But the advent of the anti-ambipolar device shows a comparative advantage over the existing binary logic-based technology. The technology developed in this research represents a big step towards next-generation multifunctional integrated circuits and telecommunications technologies.”
    The research also opens the possibility of further simplifying complex integrated circuit designs to improve performance.
    The mixed-dimensional anti-ambipolar device’s transconductance-flipping feature has shown the possibility of versatile applications in digital and analog signal processing, including ternary logic inverters, advanced optoelectronics and frequency-doubling circuits. “The new device structure heralds the potential of a technological revolution in future versatile electronics,” added Professor Ho.