More stories

  • Novel thermometer can accelerate quantum computer development

    Researchers at Chalmers University of Technology, Gothenburg, Sweden, have developed a novel type of thermometer that can simply and quickly measure temperatures during quantum calculations with extremely high accuracy. The breakthrough provides a valuable benchmarking tool for quantum computing — and opens the door to experiments in the exciting field of quantum thermodynamics.
    Key components in quantum computers are the coaxial cables and waveguides — structures that guide waveforms and act as the vital connection between the quantum processor and the classical electronics that control it. Microwave pulses travel along the waveguides to the quantum processor and are cooled to extremely low temperatures along the way. The waveguide also attenuates and filters the pulses, enabling the extremely sensitive quantum computer to work with stable quantum states.
    In order to have maximum control over this mechanism, the researchers need to be sure that the waveguides are not carrying noise from the thermal motion of electrons on top of the pulses they send. In other words, they have to measure the temperature of the electromagnetic fields at the cold end of the microwave waveguides — the point where the controlling pulses are delivered to the computer’s qubits. Working at the lowest possible temperature minimises the risk of introducing errors in the qubits.
    Until now, researchers have only been able to measure this temperature indirectly and with a relatively long delay. Now, with the Chalmers researchers’ novel thermometer, very low temperatures can be measured directly at the receiving end of the waveguide — very accurately and with extremely high time resolution.
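None of these numbers appear in the article, but the reason millikelvin-scale thermometry matters can be sketched with textbook Bose-Einstein statistics: the mean thermal photon number in a microwave mode falls off steeply as the cold end of the waveguide approaches 10 millikelvin. A minimal Python sketch (the 6 GHz mode frequency is an assumed, typical value for superconducting qubit control, not a figure from the article):

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
KB = 1.380649e-23    # Boltzmann constant, J/K

def thermal_photons(freq_hz: float, temp_k: float) -> float:
    """Mean Bose-Einstein occupation of a mode: 1 / (exp(hf / kT) - 1)."""
    return 1.0 / math.expm1(H * freq_hz / (KB * temp_k))

# Cooling a 6 GHz line from 100 mK to 10 mK suppresses the residual
# thermal photon population by many orders of magnitude.
n_100mk = thermal_photons(6e9, 0.100)
n_10mk = thermal_photons(6e9, 0.010)
```

Hence even small temperature errors at the cold end translate into large relative errors in the residual noise-photon estimate, which is what makes a fast, direct thermometer valuable.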
    “Our thermometer is a superconducting circuit, directly connected to the end of the waveguide being measured. It is relatively simple — and probably the world’s fastest and most sensitive thermometer for this particular purpose at the millikelvin scale,” says Simone Gasparinetti, Assistant Professor at the Quantum Technology Laboratory, Chalmers University of Technology.
    Important for measuring quantum computer performance
    The researchers at the Wallenberg Centre for Quantum Technology, WACQT, aim to build a quantum computer — based on superconducting circuits — with at least 100 well-functioning qubits performing correct calculations by 2030. This requires a processor operating temperature close to absolute zero, ideally down to 10 millikelvin. The new thermometer gives the researchers an important tool for measuring how good their systems are and what shortcomings exist — a necessary step in refining the technology and achieving their goal.

  • Machine learning shows potential to enhance quantum information transfer

    Army-funded researchers demonstrated a machine learning approach that corrects quantum information in systems composed of photons, improving the outlook for deploying quantum sensing and quantum communications technologies on the battlefield.
    When photons are used as carriers of quantum information to transmit data, that information is often distorted by environmental fluctuations that destroy the fragile quantum states needed to preserve it.
    Researchers from Louisiana State University exploited a type of machine learning to correct for information distortion in quantum systems composed of photons. In a paper published in Advanced Quantum Technologies, the team demonstrated that machine learning techniques using the self-learning and self-evolving features of artificial neural networks can help correct distorted information. These results outperformed traditional protocols that rely on conventional adaptive optics.
    “We are still in the fairly early stages of understanding the potential for machine learning techniques to play a role in quantum information science,” said Dr. Sara Gamble, program manager at the Army Research Office, an element of U.S. Army Combat Capabilities Development Command, known as DEVCOM, Army Research Laboratory. “The team’s result is an exciting step forward in developing this understanding, and it has the potential to ultimately enhance the Army’s sensing and communication capabilities on the battlefield.”
    For this research, the team used a type of neural network to correct for distorted spatial modes of light at the single-photon level.
    “The random phase distortion is one of the biggest challenges in using spatial modes of light in a wide variety of quantum technologies, such as quantum communication, quantum cryptography, and quantum sensing,” said Narayan Bhusal, doctoral candidate at LSU. “Our method is remarkably effective and time-efficient compared to conventional techniques. This is an exciting development for the future of free-space quantum technologies.”
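The LSU team's actual network is not described in this summary, so the following is only a hedged toy illustration of the underlying idea — learning to recognize which mode a noise-distorted measurement came from. It shrinks the problem to a single-layer softmax classifier on synthetic feature vectors; every size, seed, and noise parameter here is an assumption, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the real task: each "spatial mode" is reduced to a
# 4-dimensional feature vector, and turbulence is modelled as additive
# Gaussian noise around a per-mode center.
N_MODES, N_FEATURES, PER_MODE = 3, 4, 200
centers = 3.0 * rng.normal(size=(N_MODES, N_FEATURES))
X = np.vstack([c + rng.normal(scale=0.5, size=(PER_MODE, N_FEATURES))
               for c in centers])
y = np.repeat(np.arange(N_MODES), PER_MODE)

# Single-layer softmax "network" trained by plain gradient descent.
W = np.zeros((N_FEATURES, N_MODES))
b = np.zeros(N_MODES)
onehot = np.eye(N_MODES)[y]
for _ in range(500):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    grad = probs - onehot                          # dL/dlogits for cross-entropy
    W -= 0.1 * (X.T @ grad) / len(X)
    b -= 0.1 * grad.mean(axis=0)

accuracy = float(((X @ W + b).argmax(axis=1) == y).mean())
```

The real system works at the single-photon level with far richer mode structure; the point of the sketch is only that a trained model can undo a stochastic distortion that a fixed, hand-tuned correction cannot.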
    According to the research team, this smart quantum technology demonstrates the possibility of encoding multiple bits of information in a single photon in realistic communication protocols affected by atmospheric turbulence.
    “Our technique has enormous implications for optical communication and quantum cryptography,” said Omar Magaña Loaiza, assistant professor of physics at LSU. “We are now exploring paths to implement our machine learning scheme in the Louisiana Optical Network Initiative to make it smart, more secure, and quantum.”
    Story Source:
    Materials provided by U.S. Army Research Laboratory. Note: Content may be edited for style and length.

  • Researchers' algorithm designs soft robots that sense

    There are some tasks that traditional robots — the rigid and metallic kind — simply aren’t cut out for. Soft-bodied robots, on the other hand, may be able to interact with people more safely or slip into tight spaces with ease. But for robots to reliably complete their programmed duties, they need to know the whereabouts of all their body parts. That’s a tall task for a soft robot that can deform in a virtually infinite number of ways.
    MIT researchers have developed an algorithm to help engineers design soft robots that collect more useful information about their surroundings. The deep-learning algorithm suggests an optimized placement of sensors within the robot’s body, allowing it to better interact with its environment and complete assigned tasks. The advance is a step toward the automation of robot design. “The system not only learns a given task, but also how to best design the robot to solve that task,” says Alexander Amini. “Sensor placement is a very difficult problem to solve. So, having this solution is extremely exciting.”
    The research will be presented at April’s IEEE International Conference on Soft Robotics and will be published in the journal IEEE Robotics and Automation Letters. Co-lead authors are Amini and Andrew Spielberg, both PhD students in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Other co-authors include MIT PhD student Lillian Chin and professors Wojciech Matusik and Daniela Rus.
    Creating soft robots that complete real-world tasks has been a long-running challenge in robotics. Their rigid counterparts have a built-in advantage: a limited range of motion. Rigid robots’ finite array of joints and limbs usually makes for manageable calculations by the algorithms that control mapping and motion planning. Soft robots are not so tractable.
    Soft-bodied robots are flexible and pliant — they generally feel more like a bouncy ball than a bowling ball. “The main problem with soft robots is that they are infinitely dimensional,” says Spielberg. “Any point on a soft-bodied robot can, in theory, deform in any way possible.” That makes it tough to design a soft robot that can map the location of its body parts. Past efforts have used an external camera to chart the robot’s position and feed that information back into the robot’s control program. But the researchers wanted to create a soft robot untethered from external aid.
    “You can’t put an infinite number of sensors on the robot itself,” says Spielberg. “So, the question is: How many sensors do you have, and where do you put those sensors in order to get the most bang for your buck?” The team turned to deep learning for an answer.
    The researchers developed a novel neural network architecture that both optimizes sensor placement and learns to efficiently complete tasks. First, the researchers divided the robot’s body into regions called “particles.” Each particle’s rate of strain was provided as an input to the neural network. Through a process of trial and error, the network “learns” the most efficient sequence of movements to complete tasks, like gripping objects of different sizes. At the same time, the network keeps track of which particles are used most often, and it culls the lesser-used particles from the set of inputs for the network’s subsequent trials.
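The culling step described above can be caricatured in a few lines. This sketch is not the MIT architecture: it simply assumes that the magnitude of a trained network's first-layer weights is a usable proxy for how much each particle's strain input is relied upon, and keeps the top-scoring particles as candidate sensor sites:

```python
import numpy as np

def rank_particles(first_layer_weights: np.ndarray, keep: int) -> np.ndarray:
    """Return indices of the `keep` input particles with the largest
    first-layer weight norm -- a crude proxy for how heavily the network
    uses each particle's strain signal."""
    importance = np.linalg.norm(first_layer_weights, axis=1)
    return np.argsort(importance)[::-1][:keep]

# Hypothetical 6-particle robot with an 8-unit hidden layer: particles
# 1 and 4 dominate the (invented) trained weights, so they survive the
# cull and become the suggested sensor locations.
rng = np.random.default_rng(0)
weights = rng.normal(scale=0.1, size=(6, 8))
weights[1] *= 20.0
weights[4] *= 20.0
kept = rank_particles(weights, keep=2)
```

In the actual system the ranking and the task learning are coupled (particles are culled between trials while the network retrains), which is what lets placement and control co-evolve.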
    By optimizing the most important particles, the network also suggests where sensors should be placed on the robot to ensure efficient performance. For example, in a simulated robot with a grasping hand, the algorithm might suggest that sensors be concentrated in and around the fingers, where precisely controlled interactions with the environment are vital to the robot’s ability to manipulate objects. While that may seem obvious, it turns out the algorithm vastly outperformed humans’ intuition on where to site the sensors.
    The researchers pitted their algorithm against a series of expert predictions. For three different soft robot layouts, the team asked roboticists to manually select where sensors should be placed to enable the efficient completion of tasks like grasping various objects. Then they ran simulations comparing the human-sensorized robots to the algorithm-sensorized robots. And the results weren’t close. “Our model vastly outperformed humans for each task, even though I looked at some of the robot bodies and felt very confident on where the sensors should go,” says Amini. “It turns out there are a lot more subtleties in this problem than we initially expected.”
    Spielberg says their work could help to automate the process of robot design. In addition to developing algorithms to control a robot’s movements, “we also need to think about how we’re going to sensorize these robots, and how that will interplay with other components of that system,” he says. And better sensor placement could have industrial applications, especially where robots are used for fine tasks like gripping. “That’s something where you need a very robust, well-optimized sense of touch,” says Spielberg. “So, there’s potential for immediate impact.”
    “Automating the design of sensorized soft robots is an important step toward rapidly creating intelligent tools that help people with physical tasks,” says Rus. “The sensors are an important aspect of the process, as they enable the soft robot to ‘see’ and understand the world and its relationship with the world.”
    This research was funded, in part, by the National Science Foundation and the Fannie and John Hertz Foundation.

  • Tunable smart materials

    Researchers developed a system of self-assembling polymer microparticles with adjustable concentrations of two types of attached residues. They found that tuning the concentration of each type allowed them to control the aggregation and resulting shape of the clusters. This work may lead to advances in ‘smart’ materials, including sensors and damage-resistant surfaces.
    Scientists from the Graduate School of Science at Osaka University created superabsorbent polymer (SAP) microparticles that self-assemble into structures that can be modified by adjusting the proportion of particle type. This research may lead to new tunable biomimetic “smart materials” that can sense and respond to specific chemicals.
    Biological molecules in living organisms have a remarkable ability to form self-assembled structures when triggered by an external molecule. This has led scientists to try to create other “smart materials” that respond to their environment. Now, a team of researchers at Osaka University has come up with a tunable system involving poly(sodium acrylate) microparticles that can have one of two types of chemical groups attached. The adjustable parameters x and y refer to the molar percent of microparticles with β-cyclodextrin (βCD) and adamantyl (Ad) residues, respectively.
    “We found that the macroscopic shape of assemblies formed by microparticles was dependent on the residue content,” co-senior author Akihito Hashidzume says. In order for assemblies to form, x needed to be at least 22.3; however, the shape of assemblies could be controlled by varying y. As the value of y increased, the clusters became more and more elongated. The team hypothesized that at higher values of y, small clusters could form early and stick together, leading to elongated aggregates. Conversely, when y was small, clusters would only stick together after many collisions, resulting in more spherical aggregates. This provides a way to tune the shape of the resulting clusters. The team measured the aggregates under a microscope and used statistical analysis to determine the shapes of the assemblies.
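The elongated-versus-spherical distinction above can be quantified with a standard shape statistic (the paper's exact analysis is not given here, so this is an illustrative choice): the aspect ratio of a cluster, computed from the eigenvalues of the coordinate covariance matrix. Synthetic 2-D point clouds stand in for measured aggregates:

```python
import numpy as np

def aspect_ratio(points: np.ndarray) -> float:
    """Elongation of a 2-D particle cluster: square root of the ratio of
    the covariance matrix's principal eigenvalues (1.0 = isotropic)."""
    cov = np.cov(points.T)
    evals = np.sort(np.linalg.eigvalsh(cov))
    return float(np.sqrt(evals[1] / evals[0]))

rng = np.random.default_rng(1)
round_cluster = rng.normal(size=(500, 2))                      # spherical aggregate
elongated = rng.normal(size=(500, 2)) * np.array([5.0, 1.0])   # stretched 5x along one axis
```

A spherical cluster scores near 1, while a chain-like aggregate scores well above it, so a histogram of this ratio across many imaged clusters separates the two growth regimes.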
    “On the basis of these findings, we hope to help reveal the origin of the diverse shape of living organisms, which are macroscopic assemblies controlled by molecular recognition,” co-senior author Akira Harada says. This research may also lead to the development of new smart sensors that can form clusters large enough to be seen with the naked eye.
    Story Source:
    Materials provided by Osaka University.

  • Its curvature foreshadows the next financial bubble

    An international team of interdisciplinary researchers has identified mathematical metrics to characterize the fragility of financial markets. Their paper “Network geometry and market instability” sheds light on the higher-order architecture of financial systems and allows analysts to identify systemic risks like market bubbles or crashes.
    With the recent rush of small investors into so-called meme stocks and reemerging interest in cryptocurrencies, talk of market instability, rising volatility, and bursting bubbles is surging. However, “traditional economic theories cannot foresee events like the US subprime mortgage collapse of 2007,” according to study author Areejit Samal. He and his colleagues from more than ten mathematics, physics, economics, and complex-systems institutions around the globe have made a great stride in characterizing stock market instability.
    Their paper abstracts the complexity of the financial market into a network of stocks and employs geometry-inspired network measures to gauge market fragility and financial dynamics. They analyzed and contrasted the stock market networks for the US S&P 500 and the Japanese Nikkei 225 indices over a 32-year period (1985-2016) and, for the first time, were able to show that several discrete Ricci curvatures are excellent indicators of market instabilities. The work was recently published in the journal Royal Society Open Science and allows analysts to distinguish between ‘business-as-usual’ periods and times of fragility like bubbles or market crashes.
    The network created by connecting stocks with highly correlated prices and trading volumes forms the structural basis of their work. The researchers then employ four discrete curvatures, developed by Jürgen Jost, director of the Max Planck Institute for Mathematics in the Sciences, and his coworkers, to study the changes in the structure of stock market networks over time. Comparisons with other market stability metrics have shown that these four notions of curvature serve as generic indicators of market instability.
    One curvature candidate, the Forman-Ricci curvature (FRE), has a particularly high correlation with traditional financial indicators and can accurately capture market fear (volatility) and fragility (risk). Their study confirms that in normal trading periods the market is very fragmented, whereas in times of bubbles and impending market crashes correlations between stocks become more uniform and highly interconnected. The FRE is sensitive to both sector-driven and global market fluctuations; whereas common indicators like returns remain inconspicuous, network curvatures expose these dynamics and reach extreme values during a bubble. Thus, the FRE can capture the interdependencies within and between sectors that facilitate the spreading of perturbations and increase the danger of market crashes.
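For an unweighted graph, the simplest combinatorial form of the Forman-Ricci curvature of an edge (u, v) is F = 4 − deg(u) − deg(v), so edges in densely interconnected regions score strongly negative. The published analysis uses weighted correlation networks and richer curvature notions, so the following is a simplified, self-contained illustration of why uniform high correlation (a bubble-like regime) drives curvature down:

```python
from collections import defaultdict

def forman_curvature(edges):
    """Combinatorial Forman-Ricci curvature of each edge in an unweighted
    graph: F(u, v) = 4 - deg(u) - deg(v). More negative values indicate a
    densely connected (fragile) neighbourhood."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    return {(u, v): 4 - degree[u] - degree[v] for u, v in edges}

# Toy correlation networks: a dense 4-clique (uniform, bubble-like
# correlations) versus a sparse chain (fragmented, business-as-usual).
clique = [(a, b) for a in "ABCD" for b in "ABCD" if a < b]
chain = [("P", "Q"), ("Q", "R"), ("R", "S")]
clique_curv = forman_curvature(clique)
chain_curv = forman_curvature(chain)
```

In the 4-clique every edge has curvature −2, while the chain's edges stay at 0 or above — a miniature version of the signal the FRE extracts from real market networks.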
    Max Planck Institute for Mathematics in the Sciences director Jürgen Jost summarizes the challenge of analyzing market fragility: “There are no easy definitions of a market crash or bubble, and merely monitoring established market indices or log-returns does not suffice, but our methodology offers a powerful tool for continuously scanning market risk and thus the health of the financial system.” The insights gained by this study can help decision-makers better understand systemic risk and identify tipping points, which can potentially forecast coming financial crises or possibly even help avoid them altogether.
    Story Source:
    Materials provided by Max-Planck-Gesellschaft.

  • Expressing some doubts about android faces

    Researchers from the Graduate School of Engineering and Symbiotic Intelligent Systems Research Center at Osaka University used motion capture cameras to compare the expressions of android and human faces. They found that the mechanical facial movements of the robots, especially in the upper regions, did not fully reproduce the curved flow lines seen in the faces of actual people. This research may lead to more lifelike and expressive artificial faces.
    The field of robotics has advanced a great deal over the past decades. However, while current androids can appear very humanlike at first, their active facial expressions may still be unnatural and slightly unsettling to us. The exact reasons for this effect have been difficult to pinpoint. Now, a research team at Osaka University has used motion capture technology to monitor the facial expressions of five android faces and compared the results with actual human facial expressions. This was accomplished with six infrared cameras that monitored reflection markers at 120 frames per second and allowed the motions to be represented as three-dimensional displacement vectors.
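The displacement-vector representation described above is straightforward to reproduce. A hedged sketch with invented marker coordinates (the study's real data are reflective-marker trajectories from six infrared cameras at 120 frames per second; the three-marker "eyebrow" here is purely illustrative):

```python
import numpy as np

FRAME_RATE = 120  # capture rate used in the study, frames per second

def displacement_vectors(neutral: np.ndarray, deformed: np.ndarray) -> np.ndarray:
    """Per-marker 3-D displacement between a neutral-face frame and a
    frame captured mid-expression (both arrays: n_markers x 3, in mm)."""
    return deformed - neutral

# Hypothetical three markers along an eyebrow, lifted 2 mm upward and
# pushed 0.5 mm outward during a raised-brow expression.
neutral = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [20.0, 0.0, 0.0]])
raised = neutral + np.array([[0.0, 2.0, 0.5]] * 3)
disp = displacement_vectors(neutral, raised)
magnitudes = np.linalg.norm(disp, axis=1)
```

Comparing the direction fields of such vectors across the face is what reveals the straight flow lines of androids versus the curved flow lines of human faces.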
    “Advanced artificial systems can be difficult to design because the numerous components have complex interactions with each other. The appearance of an android face can experience surface deformations that are hard to control,” study first author Hisashi Ishihara says. These deformations can be due to interactions between components such as the soft skin sheet and the skull-shaped structure, as well as the mechanical actuators.
    The first difference the team found between the androids and adult males was in their flow lines, especially the eye and forehead areas. These lines tended to be almost straight for the androids but were curved for the human adult males. Another major difference was with the skin surface undulation patterns in the upper part of the face.
    “Redesigning the face of androids so that the skin flow pattern resembles that of humans may reduce the discomfort induced by the androids and improve their emotional communication performance,” senior author Minoru Asada says. “Future work may help give the android faces the same level of expressiveness as humans have. Each robot may even have its own individual ‘personality’ that will help people feel more comfortable.”
    Story Source:
    Materials provided by Osaka University.

  • People affected by COVID-19 are being nicer to machines

    People are not very nice to machines. The disdain goes beyond the slot machine that emptied your wallet, a dispenser that failed to deliver a Coke or a navigation system that took you on an unwanted detour.
    Yet USC researchers report that people affected by COVID-19 are showing more goodwill — to humans and to human-like autonomous machines.
    “The new discovery here is that when people are distracted by something distressing, they treat machines socially like they would treat other people. We found greater faith in technology due to the pandemic and a closing of the gap between humans and machines,” said Jonathan Gratch, senior author of the study and director for virtual humans research at the USC Institute for Creative Technologies.
    The findings, which appeared recently in the journal iScience, come from researchers at USC, George Mason University and the U.S. Department of Defense.
    The scientists noted that, in general, people mostly dispense with the social norms of human interaction and treat machines differently. The behavior holds even as machines become more humanlike; think Alexa, the persona in your vehicle navigation system, or other virtual assistants. This is because human default behavior is often driven by heuristic thinking — the snap judgments people use to navigate complex daily interactions.
    In studying human-machine interactions, the researchers noted that people impacted by COVID-19 also displayed more altruism both toward other people and to machines.

  • A promising breakthrough for a better design of electronic materials

    Finding the best materials for tomorrow’s electronics is the goal of Professor Emanuele Orgiu of the Institut national de la recherche scientifique (INRS). Among the materials Professor Orgiu is interested in, some are made of molecules that can conduct electricity. He has demonstrated the role played by molecular vibrations in electron conduction in crystals of such materials. This finding is important for applications of these molecular materials in electronics, energy, and information storage. The study, conducted in collaboration with a team from the INRS and the University of Strasbourg (France), was published in the journal Advanced Materials.
    The scientists set out to observe the relationship between the structure of materials and their ability to conduct electricity. To this end, they measured the speed of propagation of electrons in crystals formed by these molecules. In their study, the authors compared two perylene diimide derivatives, semiconducting molecules of interest because of their use in flexible devices, smart clothing, and foldable electronics. The two compounds in the study have similar chemical structures but feature very different conduction properties.
    With the goal of determining what caused this difference, the research group was able to establish that the different molecular vibrations composing the material were responsible for the different electrical behaviour observed in devices. “For a current to flow through a material, electrons must ‘hop’ from one molecule to the neighbouring one. Depending on the level of ‘movement’ of the molecules, which depends on the amplitude and energy of the related vibrations (called phonons), the electrons can move more or less easily through the material,” explains Professor Orgiu, whose research team is the first to demonstrate which vibrations have the greatest influence on electron flows.
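The article does not name a specific transport model, but the "hopping" picture Professor Orgiu describes is commonly quantified with Marcus theory, in which the reorganization energy λ encodes how strongly molecular vibrations couple to the moving charge. A sketch with purely illustrative parameters (not values from the study):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
KB = 1.380649e-23        # Boltzmann constant, J/K
EV = 1.602176634e-19     # joules per electronvolt

def marcus_hopping_rate(transfer_integral_ev: float,
                        reorg_energy_ev: float,
                        temp_k: float = 300.0) -> float:
    """Marcus charge-hopping rate between neighbouring molecules at zero
    driving force: k = (2*pi/hbar) * t^2 * (4*pi*lambda*kB*T)^(-1/2)
                      * exp(-lambda / (4*kB*T))."""
    t = transfer_integral_ev * EV
    lam = reorg_energy_ev * EV
    kt = KB * temp_k
    return (2 * math.pi / HBAR) * t * t \
        * math.exp(-lam / (4 * kt)) / math.sqrt(4 * math.pi * lam * kt)

# With the same electronic coupling, stronger vibration-charge coupling
# (larger reorganization energy) slows the hop.
k_weak_coupling = marcus_hopping_rate(0.05, 0.2)
k_strong_coupling = marcus_hopping_rate(0.05, 0.4)
```

This is the sense in which knowing "which vibrations matter" becomes a design formula: chemistry that stiffens or decouples the troublesome modes lowers λ and speeds up transport.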
    An Ad Hoc Molecular Design to Make Electrons Travel Faster
    This breakthrough paves the way for the development of even more efficient materials for electronics. “By knowing what type of vibrations allows charges to move more easily, we are providing chemists with a formula for synthesizing the right materials, rather than going in blindly,” explains Marc-Antoine Stoeckel. This research opens up new applications that could not be envisaged with silicon, the most widely used material in electronics, including computers.
    Professor Orgiu collaborated with INRS Professor Luca Razzari to measure the vibrations of the molecules. The two researchers are now working on a new spectroscopic technique that would enable them to visualize the vibrations when electrons are present. This will allow them to see if charges affect molecular vibrations.
    Story Source:
    Materials provided by Institut national de la recherche scientifique – INRS. Original written by Audrey-Maude Vézina.