More stories

  •

    Quantum information theory: Quantum complexity grows linearly for an exponentially long time

    Physicists have long known about the huge chasm between quantum physics and the theory of gravity. In recent decades, however, theoretical physics has provided plausible conjectures to bridge this gap and to describe the behaviour of complex quantum many-body systems, such as black holes and wormholes. Now, a theory group at Freie Universität Berlin and HZB, together with Harvard University, USA, has proven a mathematical conjecture about the behaviour of complexity in such systems, increasing the viability of this bridge. The work is published in Nature Physics.
    “We have found a surprisingly simple solution to an important problem in physics,” says Prof. Jens Eisert, a theoretical physicist at Freie Universität Berlin and HZB. “Our results provide a solid basis for understanding the physical properties of chaotic quantum systems, from black holes to complex many-body systems,” Eisert adds.
    Using only pen and paper, i.e. purely analytically, the Berlin physicists Jonas Haferkamp, Philippe Faist, Naga Kothakonda and Jens Eisert, together with Nicole Yunger Halpern (Harvard, now Maryland), have succeeded in proving a conjecture that has major implications for complex quantum many-body systems. “This plays a role, for example, when you want to describe the volume of black holes or even wormholes,” explains Jonas Haferkamp, PhD student in the team of Eisert and first author of the paper.
    Complex quantum many-body systems can be reconstructed by circuits of so-called quantum bits. The question, however, is: how many elementary operations are needed to prepare the desired state? On the surface, it seems that this minimum number of operations — the complexity of the system — is always growing. Physicists Adam Brown and Leonard Susskind from Stanford University formulated this intuition as a mathematical conjecture: the quantum complexity of a many-particle system should first grow linearly for astronomically long times and then — for even longer — remain in a state of maximum complexity. Their conjecture was motivated by the behaviour of theoretical wormholes, whose volume seems to grow linearly for an eternally long time. In fact, it is further conjectured that complexity and the volume of wormholes are one and the same quantity from two different perspectives. “This redundancy in description is also called the holographic principle and is an important approach to unifying quantum theory and gravity. Brown and Susskind’s conjecture on the growth of complexity can be seen as a plausibility check for ideas around the holographic principle,” explains Haferkamp.
    The group has now shown that the quantum complexity of random circuits indeed increases linearly with time until it saturates at a point in time that is exponential in the system size. Such random circuits are a powerful model for the dynamics of many-body systems. The difficulty in proving the conjecture arises from the fact that it can hardly be ruled out that there are “shortcuts,” i.e. random circuits with much lower complexity than expected. “Our proof is a surprising combination of methods from geometry and those from quantum information theory. This new approach makes it possible to solve the conjecture for the vast majority of systems without having to tackle the notoriously difficult problem for individual states,” says Haferkamp.
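    The conjectured profile can be sketched with a toy model (an illustration of the statement, not of the paper's proof): complexity grows roughly linearly with the number of applied gates until it saturates at a value exponential in the number of qubits.

```python
# Toy illustration of the Brown-Susskind growth profile (a sketch of the
# statement, not the paper's proof): complexity rises linearly with the
# number of gates, then saturates at a value exponential in the qubit count.

def conjectured_complexity(gates, n_qubits):
    """Schematic complexity after `gates` random gates on `n_qubits` qubits."""
    saturation = 4 ** n_qubits  # exponential scale (illustrative choice)
    return min(gates, saturation)

n = 3
profile = [conjectured_complexity(t, n) for t in range(0, 200, 40)]
print(profile)  # linear ramp, then flat at 4**3 = 64
```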
    “The work in Nature Physics is a nice highlight of my PhD,” adds the young physicist, who will take up a position at Harvard University at the end of the year. As a postdoc, he can continue his research there, preferably in the classic way with pen and paper and in exchange with the best minds in theoretical physics.
    Story Source:
    Materials provided by Helmholtz-Zentrum Berlin für Materialien und Energie. Note: Content may be edited for style and length.

  •

    Chaos theory provides hints for controlling the weather

    Under a project led by the RIKEN Center for Computational Science, researchers have used computer simulations to show that weather phenomena such as sudden downpours could potentially be modified by making small adjustments to certain variables in the weather system. They did this by taking advantage of a system known as a “butterfly attractor” in chaos theory, where a system can have one of two states — like the wings of a butterfly — and switches back and forth between the two states depending on small changes in certain conditions.
    While weather predictions have reached levels of high accuracy thanks to methods such as supercomputer-based simulations and data assimilation, where observational data is incorporated into simulations, scientists have long hoped to be able to control the weather. Research in this area has intensified due to climate change, which has led to more extreme weather events such as torrential rain and storms.
    There are methods at present for weather modification, but they have had limited success. Seeding the atmosphere to induce rain has been demonstrated, but it is only possible when the atmosphere is already in a state where it might rain. Geoengineering projects have been envisioned, but have not been carried out due to concerns about what unpredicted long-term effects they might have.
    As a promising approach, researchers from the RIKEN team have looked to chaos theory to create realistic possibilities for mitigating weather events such as torrential rain. Specifically, they have focused on a phenomenon known as a butterfly attractor, proposed by mathematician and meteorologist Edward Lorenz, one of the founders of modern chaos theory. Essentially, this refers to a system that can adopt one of two orbits that look like the wings of a butterfly, and that can switch between the orbits randomly based on small fluctuations in the system.
    To perform the work, the RIKEN team ran one weather simulation to serve as the control, representing “nature” itself, and then ran other simulations with small variations in a number of variables describing convection (how heat moves through the system). They discovered that small changes in several of the variables together could steer the system into a chosen state after a certain amount of time had elapsed.
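    The sensitivity the RIKEN approach aims to exploit can be seen in a minimal sketch of Lorenz's system, assuming the classic parameters and simple Euler integration: two trajectories that start almost identically soon diverge completely.

```python
# Minimal sketch of Lorenz's butterfly attractor (assumptions: the classic
# parameters sigma=10, rho=28, beta=8/3, and simple Euler integration).
# Two nearly identical starting states diverge, which is the sensitivity
# a weather-control scheme would exploit with small nudges.

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run(state, steps=6000):
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = run((1.0, 1.0, 1.0))
b = run((1.0, 1.0, 1.0 + 1e-8))  # a one-part-in-100-million perturbation
separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(separation)  # far larger than the tiny initial difference
```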
    According to Takemasa Miyoshi of the RIKEN Center for Computational Science, who led the team, “This opens the path to research into the controllability of weather and could lead to weather control technology. If realized, this research could help us prevent and mitigate extreme weather events, such as torrential rains and typhoons, whose risks are increasing with climate change.”
    “We have built a new theory and methodology for studying the controllability of weather,” he continues. “Based on the observing system simulation experiments used in previous predictability studies, we were able to design an experiment to investigate predictability based on the assumption that the true values (nature) cannot be changed, but rather that we can change the idea of what can be changed (the object to be controlled).”
    Looking to the future, he says, “In this case we used an ideal low-dimensional model to develop a new theory, and in the future we plan to use actual weather models to study the possible controllability of weather.”
    The work, published in Nonlinear Processes in Geophysics, was done as part of the Moonshot R&D Millennia program, contributing to the new Moonshot goal #8.
    Story Source:
    Materials provided by RIKEN. Note: Content may be edited for style and length.

  •

    Design of protein binders from target structure alone

    A team of scientists has created a powerful new method for generating protein drugs. Using computers, they designed molecules that can target important proteins in the body, such as the insulin receptor, as well as vulnerable proteins on the surface of viruses. This solves a long-standing challenge in drug development and may lead to new treatments for cancer, diabetes, infection, inflammation, and beyond.
    The research, appearing March 24 in the journal Nature, was led by scientists in the laboratory of David Baker, professor of biochemistry at the University of Washington School of Medicine and a recipient of the 2021 Breakthrough Prize in Life Sciences.
    “The ability to generate new proteins that bind tightly and specifically to any molecular target that you want is a paradigm shift in drug development and molecular biology more broadly,” said Baker.
    Antibodies are today’s most common protein-based drugs. They typically function by binding to a specific molecular target, which then becomes either activated or deactivated. Antibodies can treat a wide range of health disorders, including COVID-19 and cancer, but generating new ones is challenging. Antibodies can also be costly to manufacture.
    A team led by two postdoctoral scholars in the Baker lab, Longxing Cao and Brian Coventry, combined recent advances in the field of computational protein design to arrive at a strategy for creating new proteins that bind molecular targets in a manner similar to antibodies. They developed software that can scan a target molecule, identify potential binding sites, generate proteins targeting those sites, and then screen from millions of candidate binding proteins to identify those most likely to function.
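    The screening funnel described above (generate many candidates, score each, keep the best) can be caricatured in a few lines. Every name below is an illustrative placeholder, not the Baker lab's actual software or scoring model.

```python
# Caricature of the screening funnel: generate many candidate binders,
# score each, keep the best. All names here are illustrative placeholders,
# not the Baker lab's actual software.
import random

random.seed(0)

def predicted_binding_score(candidate):
    # Stand-in for a physics-based binding score (lower energy is better).
    return -candidate["energy"]

# Pretend pool of designed candidates with random interface energies.
candidates = [{"id": i, "energy": random.uniform(-30.0, 0.0)}
              for i in range(100_000)]

def screen(pool, score, keep=10):
    """Keep the `keep` best-scoring candidates."""
    return sorted(pool, key=score, reverse=True)[:keep]

best = screen(candidates, predicted_binding_score)
print(len(best), "top candidates kept out of", len(candidates))
```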
    The team used the new software to generate high-affinity binding proteins against 12 distinct molecular targets. These targets include important cellular receptors such as TrkA, EGFR, Tie2, and the insulin receptor, as well as proteins on the surface of the influenza virus and SARS-CoV-2 (the virus that causes COVID-19).
    “When it comes to creating new drugs, there are easy targets and there are hard targets,” said Cao, who is now an assistant professor at Westlake University. “In this paper, we show that even very hard targets are amenable to this approach. We were able to make binding proteins to some targets that had no known binding partners or antibodies.”
    In total, the team produced over half a million candidate binding proteins for the 12 selected molecular targets. Data collected on this large pool of candidate binding proteins was used to improve the overall method.
    “We look forward to seeing how these molecules might be used in a clinical context, and more importantly how this new method of designing protein drugs might lead to even more promising compounds in the future,” said Coventry.
    The research team included scientists from the University of Washington School of Medicine, Yale University School of Medicine, Stanford University School of Medicine, Ghent University, The Scripps Research Institute, and the National Cancer Institute, among other institutions.
    This work was supported in part by The Audacious Project at the Institute for Protein Design, Open Philanthropy Project, National Institutes of Health (HHSN272201700059C, R01AI140245, R01AI150855, R01AG063845), Defense Advanced Research Projects Agency (HR0011835403 contract FA8750-17-C-0219), Defense Threat Reduction Agency (HDTRA1-16-C-0029), Schmidt Futures, Gates Ventures, Donald and Jo Anne Petersen Endowment, and an Azure computing gift for COVID-19 research provided by Microsoft.

  •

    Innovative AI technology aids personalized care for diabetes patients needing complex drug treatment

    Hitachi, Ltd., University of Utah Health, and Regenstrief Institute, Inc. today announced the development of an AI method to improve care for patients with type 2 diabetes mellitus who need complex treatment. One in 10 adults worldwide have been diagnosed with type 2 diabetes, but a smaller number require multiple medications to control blood glucose levels and avoid serious complications, such as loss of vision and kidney disease.
    For this smaller group of patients, physicians may have limited clinical decision-making experience or evidence-based guidance for choosing drug combinations. One solution is to pool data from a larger number of patients to support the development of general principles that can guide decision-making. Combining patient data from multiple healthcare institutions, however, requires deep expertise in artificial intelligence (AI) and wide-ranging experience in developing machine learning models using sensitive and complex healthcare data.
    Hitachi, U of U Health, and Regenstrief researchers partnered to develop and test a new AI method that analyzed electronic health record data across Utah and Indiana and learned generalizable treatment patterns of type 2 diabetes patients with similar characteristics. Those patterns can now be used to help determine an optimal drug regimen for a specific patient.
    Some of the results of this study are published in the peer-reviewed medical journal, Journal of Biomedical Informatics, in the article, “Predicting pharmacotherapeutic outcomes for type 2 diabetes: An evaluation of three approaches to leveraging electronic health record data from multiple sources.”
    Hitachi had been working with U of U Health for several years on development of a pharmacotherapy selection system for diabetes treatment. However, the system was not always able to accurately predict more complex and less prevalent treatment patterns because it did not have enough data. In addition, it was not easy to use data from multiple facilities, as it was necessary to account for differences in patient disease states and therapeutic drugs prescribed among facilities and regions. To address these challenges, the project partnered with Regenstrief to enrich the data it was working with.
    The new AI method initially groups patients with similar disease states and then analyzes their treatment patterns and clinical outcomes. It then matches the patient of interest to the disease state groups and predicts the range of potential outcomes for the patient depending on various treatment options. The researchers evaluated how well the method worked in predicting successful outcomes given drug regimens administered to patients with diabetes in Utah and Indiana. The algorithm was able to support medication selection for more than 83 percent of patients, even when two or more medications were used together.
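    The group-then-predict idea can be sketched as follows. This is an illustrative toy, not the published algorithm; the features, centroids, and success rates below are made up.

```python
# Illustrative toy (not the published algorithm): group patients by feature
# similarity, then read a new patient's likely outcome off the matched group.
# The features, centroids, and success rates below are made up.

def nearest_group(patient, centroids):
    """Index of the closest group centroid (squared Euclidean distance)."""
    dists = [sum((a - b) ** 2 for a, b in zip(patient, c)) for c in centroids]
    return dists.index(min(dists))

# Hypothetical groups: (HbA1c %, BMI) centroids and the observed success
# rate of a given drug regimen within each group.
centroids = [(7.0, 24.0), (9.5, 32.0)]
success_rate = [0.80, 0.55]

new_patient = (9.2, 31.0)
group = nearest_group(new_patient, centroids)
print(group, success_rate[group])  # matched to the closer group
```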
    In the future, the research team expects to help patients with diabetes who require complex treatment in checking the efficacy of various drug combinations and then, with their doctors, deciding on a treatment plan that is right for them. This will lead not only to better management of diabetes but also to increased patient engagement, compliance, and quality of life.
    The three parties will continue to evaluate and improve the effectiveness of the new AI method and contribute to future patient care through further research in healthcare informatics.
    Hitachi will accelerate these efforts, including the practical application of this technology, through collaboration between its healthcare and IT business divisions and its R&D group. GlobalLogic Inc., a Hitachi Group company and leader in digital engineering that is promoting healthcare-related projects in the U.S., will also deepen its collaboration in this field. Through these efforts, the entire Hitachi Group will contribute to the health and safety of people.
    Story Source:
    Materials provided by Regenstrief Institute. Note: Content may be edited for style and length.

  •

    Quantum physics sets a speed limit to electronics

    Semiconductor electronics is getting faster and faster — but at some point, physics no longer permits any increase. The speed can definitely not be increased beyond one petahertz (one million gigahertz), even if the material is excited in an optimal way with laser pulses.
    How fast can electronics be? When computer chips work with ever shorter signals and time intervals, at some point they come up against physical limits. The quantum-mechanical processes that enable the generation of electric current in a semiconductor material take a certain amount of time. This puts a limit to the speed of signal generation and signal transmission.
    TU Wien (Vienna), TU Graz and the Max Planck Institute of Quantum Optics in Garching have now been able to explore these limits: The speed can definitely not be increased beyond one petahertz (one million gigahertz), even if the material is excited in an optimal way with laser pulses. This result has now been published in the scientific journal Nature Communications.
    Fields and currents
    Electric current and light (i.e. electromagnetic fields) are always interlinked. This is also the case in microelectronics: In microchips, electricity is controlled with the help of electromagnetic fields. For example, an electric field can be applied to a transistor, and depending on whether the field is switched on or off, the transistor either allows electrical current to flow or blocks it. In this way, an electromagnetic field is converted into an electrical signal.
    In order to test the limits of this conversion of electromagnetic fields to current, laser pulses — the fastest, most precise electromagnetic fields available — are used, rather than transistors.
    “Materials are studied that initially do not conduct electricity at all,” explains Prof. Joachim Burgdörfer from the Institute for Theoretical Physics at TU Wien. “These are hit by an ultra-short laser pulse with a wavelength in the extreme UV range. This laser pulse shifts the electrons into a higher energy level, so that they can suddenly move freely. That way, the laser pulse turns the material into an electrical conductor for a short period of time.” As soon as there are freely moving charge carriers in the material, they can be moved in a certain direction by a second, slightly longer laser pulse. This creates an electric current that can then be detected with electrodes on both sides of the material.
    These processes happen extremely fast, on a time scale of atto- or femtoseconds. “For a long time, such processes were considered instantaneous,” says Prof. Christoph Lemell (TU Wien). “Today, however, we have the necessary technology to study the time evolution of these ultrafast processes in detail.” The crucial question is: How fast does the material react to the laser? How long does the signal generation take and how long does one have to wait until the material can be exposed to the next signal? The experiments were carried out in Garching and Graz, while the theoretical work and complex computer simulations were done at TU Wien.
    Time or energy — but not both
    The experiment leads to a classic uncertainty dilemma, as often occurs in quantum physics: in order to increase the speed, extremely short UV laser pulses are needed, so that free charge carriers are created very quickly. However, using extremely short pulses implies that the amount of energy which is transferred to the electrons is not precisely defined. The electrons can absorb very different energies. “We can tell exactly at which point in time the free charge carriers are created, but not in which energy state they are,” says Christoph Lemell. “Solids have different energy bands, and with short laser pulses many of them are inevitably populated by free charge carriers at the same time.”
    Depending on how much energy they carry, the electrons react quite differently to the electric field. If their exact energy is unknown, it is no longer possible to control them precisely, and the current signal that is produced is distorted — especially at high laser intensities.
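    A back-of-envelope estimate (not the paper's calculation) shows the scale of the dilemma: by the time-energy uncertainty relation, a pulse of about one femtosecond, the time scale of one petahertz operation, carries an energy spread comparable to typical semiconductor band gaps.

```python
# Back-of-envelope estimate (not the paper's calculation): the time-energy
# uncertainty relation Delta_E ~ hbar / Delta_t for a ~1 fs pulse, the time
# scale of one petahertz operation.

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
EV = 1.602176634e-19     # joules per electron-volt

pulse_duration = 1e-15                     # ~1 femtosecond
delta_E_eV = HBAR / pulse_duration / EV    # energy spread in eV

print(round(delta_E_eV, 2))  # ~0.66 eV, the order of typical band gaps
```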
    “It turns out that about one petahertz is an upper limit for controlled optoelectronic processes,” says Joachim Burgdörfer. Of course, this does not mean that it is possible to produce computer chips with a clock frequency of just below one petahertz. Realistic technical upper limits are most likely considerably lower. Even though the laws of nature determining the ultimate speed limits of optoelectronics cannot be outsmarted, they can now be analyzed and understood with sophisticated new methods.

  •

    Simply printing high-performance perovskite-based transistors

    The printing press has contributed immensely to the advancement of humankind, elevating politics, the economy, and culture. Today, printing goes beyond books and documents and is expanding its influence into the realm of cutting-edge technology. Most notably, high-performance components in various smart devices have been successfully printed and have attracted much attention. Now, a technology to print perovskite-based devices — considered a challenge until now — has been proposed.
    A POSTECH research team led by Professor Yong-Young Noh and Ph.D. candidates Ao Liu and Huihui Zhu (Department of Chemical Engineering), in collaboration with Professor Myung-Gil Kim (School of Advanced Materials Science and Engineering) of Sungkyunkwan University, has improved the performance of a p-type semiconductor transistor using an inorganic metal halide perovskite. One of the biggest advantages of the new technology is that it enables solution-processed perovskite transistors to be simply printed as semiconductor-like circuits.
    Perovskite-based transistors control the current by combining p-type semiconductors, in which positively charged holes carry the current, with n-type semiconductors, in which electrons do. Compared to n-type semiconductors, which have been actively studied so far, fabricating high-performance p-type semiconductors has been a challenge.
    Many researchers have tried to utilize perovskite in the p-type semiconductor for its excellent electrical conductivity, but its poor electrical performance and reproducibility have hindered commercialization.
    To overcome this issue, the researchers used the modified inorganic metal halide caesium tin triiodide (CsSnI3) to develop the p-type perovskite semiconductor and fabricated a high-performance transistor based on it. The transistor exhibits a high hole mobility of 50 cm²V⁻¹s⁻¹ or more and an on/off current ratio of more than 10⁸, the highest performance recorded among the perovskite semiconductor transistors developed so far.
    By making the material into a solution, the researchers succeeded in simply printing the p-type semiconductor transistor as if printing a document. This method is not only convenient but also cost-effective, which can lead to the commercialization of perovskite devices in the future.
    “The newly developed semiconductor material and transistor can be widely applicable as logic circuits in high-end displays and in wearable electronic devices, and also be used in stacked electronic circuits and optoelectronic devices by stacking them vertically with silicon semiconductors,” explained Professor Yong-Young Noh on the significance of the study.
    This study was conducted with the support from the Mid-Career Researcher Program, Creative Materials Discovery Program, Next-generation Intelligence-Type Semiconductor Development Program, and the Basic Research Lab Program of the National Research Foundation of Korea, and from Samsung Display Corporation.
    Story Source:
    Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.

  •

    Keeping the light from fading

    Scientists from Nara Institute of Science and Technology created a new approach to compensate for variations in illumination while scanning cathedral stained-glass windows. This work may be applied to other objects of cultural significance to help capture their colors in the most lifelike way.
    It’s hard to think of a more inspirational experience than watching the sun slowly set through historic stained-glass windows, such as those found in the cathedrals of Europe. While the changing light levels over time may be breathtaking, they also make high-resolution scans of the windows more challenging. That is, if the scanning process requires minutes or even hours to complete, variations in the natural illumination can lead to inconsistent results.
    Now, a team of researchers led by Nara Institute of Science and Technology has developed a new calibration method to help compensate for changes in the sun’s illumination over the course of the scan. “It can take hours to capture thousands of spectral channels pixel by pixel. Thus, the measurement can be significantly affected by the perturbations in natural light,” first author Takuya Funatomi says.
    The researchers set out to capture hyperspectral images of the famous stained-glass windows of Amiens Cathedral in France. With some window panels dating back to the 13th century, the location has been designated a UNESCO World Heritage Site. A whisk-broom scanner was used to acquire the hyperspectral images. This kind of sensor uses a movable mirror to slowly scan across an object, measuring each pixel one at a time as its light, with the sky in the background, is reflected onto a single detector. When applied to outdoor cultural heritage sites, however, temporal illumination variations become an issue because of the lengthy measurement time. Hyperspectral scanning is not limited to the wavelengths of light that are visible to humans. For this research, the team used a spectrometer that recorded more than 2,000 channels over a spectrum ranging from about 200 nm to 1100 nm, which includes ultraviolet, visible, and infrared light.
    An extra single-column scan was added to help calibrate the images. Using matrix methods, the researchers could then remove the variations in temporal illumination. This gave much more accurate results than simply normalizing the total brightness, because each color might be affected differently by the changing light. “Our method provides a new modality for the digital preservation of large cultural assets,” senior author Yasuhiro Mukaigawa says. The method can be easily adapted to other situations in which outdoor scanning has to occur over long time periods.
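    One simple way such a correction can work is to divide each column by an illumination estimate taken at the same moment. This toy sketch uses made-up numbers and a per-column reference of known reflectance, a simplification of the study's matrix-based calibration:

```python
# Toy sketch (made-up numbers, a simplification of the study's matrix-based
# calibration): each image column is scanned at a different time under a
# different illumination; a reference of known reflectance measured at each
# time lets that illumination be estimated and divided out.

true_cols = [[0.2, 0.5, 0.9], [0.1, 0.4, 0.8]]  # true reflectances, 2 columns
illum = [1.0, 0.7]                              # sunlight factor per scan time
ref_true = 0.5                                  # known reference reflectance

# What the scanner records: reflectance times the illumination at scan time.
measured = [[illum[t] * v for v in col] for t, col in enumerate(true_cols)]
ref_meas = [illum[t] * ref_true for t in range(len(illum))]

# Estimate the illumination from the reference, then correct each column.
est_illum = [r / ref_true for r in ref_meas]
corrected = [[v / est_illum[t] for v in col] for t, col in enumerate(measured)]
print(corrected)  # close to true_cols despite the changing illumination
```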
    Story Source:
    Materials provided by Nara Institute of Science and Technology. Note: Content may be edited for style and length.

  •

    'Hot' spin quantum bits in silicon transistors

    Quantum bits (qubits) are the smallest units of information in a quantum computer. Currently, one of the biggest challenges in developing this kind of powerful computer is scalability. A research group at the University of Basel, working with the IBM Research Laboratory in Rüschlikon, has made a breakthrough in this area.
    Quantum computers promise unprecedented computing power, but to date prototypes have been based on just a handful of computing units. Exploiting the potential of this new generation of computers requires combining large quantities of qubits.
    It is a scalability problem which once affected classic computers, as well; in that case it was solved with transistors integrated into silicon chips. The research team led by Dr. Andreas Kuhlmann and Professor Dominik Zumbühl from the University of Basel has now come up with silicon-based qubits that are very similar in design to classic silicon transistors. The researchers published their findings in the journal Nature Electronics.
    Building on classic silicon technology
    In classic computers, the solution to the scalability problem lay in silicon chips, which today include billions of “fin field-effect transistors” (FinFETs). These FinFETs are small enough for quantum applications; at very low temperatures near absolute zero (0 kelvin or -273.15 degrees Celsius), a single electron with a negative charge or a “hole” with a positive charge can act as a spin qubit. Spin qubits store quantum information in the two states spin-up (intrinsic angular momentum up) and spin-down (intrinsic angular momentum down).
    The qubits developed by Kuhlmann’s team are based on FinFET architecture and use holes as spin qubits. In contrast with electron spin, hole spin in silicon nanostructures can be directly manipulated with fast electrical signals.
    Potential for higher operating temperatures
    Another major obstacle to scalability is temperature; previous qubit systems typically had to operate at an extremely low range of about 0.1 kelvin. Controlling each qubit requires additional measuring lines to connect the control electronics at room temperature to the qubits in the cryostat — a cooling unit which generates extremely low temperatures. The number of these measuring lines is limited because each line produces heat. This inevitably creates a bottleneck in the wiring, which in turn sets a limit to scaling.
    Circumventing this “wiring bottleneck” is one of the main goals of Kuhlmann’s research group, and requires measurement and control electronics to be built directly into the cooling unit. “However, integrating these electronics requires qubit operation at temperatures above 1 kelvin, with the cooling power of the cryostats increasing sharply to compensate for the heat dissipation of the control electronics,” explains Dr. Leon Camenzind of the Department of Physics at the University of Basel. Doctoral student Simon Geyer, who shares lead authorship of the study with Camenzind, adds, “We have overcome the 4 kelvin-mark with our qubits, reaching the boiling point of liquid helium. Here we can achieve much greater cooling power, which allows for integration of state-of-the-art cryogenic control technology.”
    Close to industry standards
    Working with proven technology such as FinFET architecture to build a quantum computer offers the potential for scaling up to very large numbers of qubits. “Our approach of building on existing silicon technology puts us close to industry practice,” says Kuhlmann. The samples were created at the Binnig and Rohrer Nanotechnology Center at the IBM Research Zurich laboratory in Rüschlikon, a partner of the NCCR SPIN, which is based at the University of Basel and counts the research team as a member.
    Story Source:
    Materials provided by the University of Basel. Note: Content may be edited for style and length.